Glossary for the Academic Excellence Indicator System, 2002-03 Report


Accountability Rating: The 2002-03 school year provides a transition from the accountability rating system that used Texas Assessment of Academic Skills (TAAS) results and annual dropout rates to the new accountability rating system that uses Texas Assessment of Knowledge and Skills (TAKS) results and longitudinal completion rates. Because state statute requires annual district performance ratings (Exemplary, Recognized, Academically Acceptable, and Academically Unacceptable), district 2002 accountability ratings have been carried forward to 2003. For a more complete explanation of the 2002 Accountability System, please refer to the 2001-02 AEIS Glossary and the 2002 Accountability Manual. For information on the future accountability system, please refer to the 2003 Accountability Plan.

Accountability Subset: Also known as the October subset, this refers to the group of students whose performance on the state-mandated test would normally be used to determine a school or district's accountability rating. The performance reported in the 2002-03 AEIS reports is based on this subset.

TAKS Participation, included in the AEIS report, shows what percent of a district or school's test takers made up the Accountability Subset. Also see Mobile Subset and TAKS Participation.

Adopted Tax Rate (calendar year 2002) (District Profile only): This is the locally adopted tax rate set for the 2002 calendar year. The total adopted rate is composed of a maintenance and operation rate (M&O) and a debt service rate (sometimes referred to as the Interest and Sinking fund rate). Rates are expressed per $100 of taxable value. Taxes based on this rate were to be paid by taxpayers in early 2003. The state value shown for the adopted tax rates is the simple average of all the district rates. (Source: Texas Comptroller of Public Accounts, July 2003)

Advanced Courses: This indicator is based on a count of students who complete and receive credit for at least one advanced course in grades 9-12. Advanced courses include dual enrollment courses, which are courses for which a student receives both high school and college credit. The conditions for enrolling in and receiving credit for a college course are described in Texas Administrative Code §74.25:

To be eligible to enroll and be awarded credit toward state graduation requirements, a student must have the approval of the high school principal or other school official designated by the school district. The course for which credit is awarded must provide advanced academic instruction beyond, or in greater depth than, the essential knowledge and skills for the equivalent high school course.

Appendix C lists all courses identified as advanced, with the exception of courses designated only as dual enrollment. Dual enrollment courses are not shown, as the courses vary from campus to campus and could potentially include a large proportion of all high school courses.

Course completion information is reported by districts through the Public Education Information Management System (PEIMS) after the close of the school year. The values, expressed as a percent, are calculated as follows:

number of students in grades 9-12 who received credit for at least one advanced course in 2001-02
divided by
number of students in grades 9-12 who received credit for at least one course in 2001-02

Special education students are included in the results shown for the campus or district and the individual student groups. For purposes of comparison, advanced course completion rates are also shown for the prior year (2000-01). See also Appendix C: List of Advanced Courses. (Source: PEIMS, June 2002, June 2001)
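
To make the calculation concrete, the following is a minimal Python sketch using hypothetical per-student records; the field names (grade, earned_any_credit, earned_advanced_credit) are invented for illustration and are not PEIMS data element names.

    # Hypothetical records: one entry per student, with course-credit flags.
    students = [
        {"grade": 9,  "earned_any_credit": True,  "earned_advanced_credit": False},
        {"grade": 10, "earned_any_credit": True,  "earned_advanced_credit": True},
        {"grade": 11, "earned_any_credit": True,  "earned_advanced_credit": False},
        {"grade": 12, "earned_any_credit": False, "earned_advanced_credit": False},
    ]

    # Denominator: grade 9-12 students who received credit for at least one course.
    denominator = sum(1 for s in students if 9 <= s["grade"] <= 12 and s["earned_any_credit"])
    # Numerator: grade 9-12 students who received credit for at least one advanced course.
    numerator = sum(1 for s in students if 9 <= s["grade"] <= 12 and s["earned_advanced_credit"])

    if denominator:
        print(f"Advanced course completion rate: {100.0 * numerator / denominator:.1f}%")  # 33.3%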

Advanced Placement Examinations: See AP/IB Results.

Annual Dropout Rate: The annual dropout rate is the count of official dropouts summed across all grades (7-12) divided by the number of students summed across all grades (7-12). It is calculated as follows:

number of students who dropped out at any time during the school year
divided by
number of students who were in attendance at any time during the school year

Annual dropout rates are shown for 2001-02 and 2000-01.

Note that a cumulative count of students is used in the denominator as well as the numerator. This method for calculating the dropout rate neutralizes the effects of mobility by including in the denominator every student who was enrolled at the school at any time during the school year. If the student dropped out, the student was counted as a dropout for the district last attended (as well as for the campus where the student was enrolled in that district). See also Dropout and Leaver Record. (Source: PEIMS, Oct. 2002, June 2002, Oct. 2001, and June 2001)
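
As an illustration of the cumulative counting described above, here is a minimal Python sketch over hypothetical attendance records; the field names are invented for the example.

    # Hypothetical records: one entry per student in attendance at any time during the year.
    records = [
        {"student_id": "A", "grade": 9,  "dropped_out": False},
        {"student_id": "B", "grade": 10, "dropped_out": True},
        {"student_id": "C", "grade": 12, "dropped_out": False},
        {"student_id": "D", "grade": 7,  "dropped_out": False},
    ]

    # Both numerator and denominator are cumulative counts across grades 7-12, so every
    # student who attended at any time during the year appears in the denominator.
    in_scope = [r for r in records if 7 <= r["grade"] <= 12]
    dropouts = sum(1 for r in in_scope if r["dropped_out"])

    print(f"Annual dropout rate: {100.0 * dropouts / len(in_scope):.1f}%")  # 25.0% here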

AP/IB Results: These refer to the results of the College Board Advanced Placement (AP) examinations and the International Baccalaureate (IB) examinations taken by Texas public school students in a given school year. High school students may take these examinations, ideally upon completion of AP or IB courses, and may receive advanced placement or credit, or both, upon entering college. Generally, colleges will award credit or advanced placement for scores of 3, 4, or 5 on AP examinations and scores of 4, 5, 6, or 7 on IB examinations. Requirements vary by college and by subject tested.

Three values are calculated for this indicator:

(1) The percent of students in grades 11 and 12 taking at least one AP or IB examination:

number of 11th and 12th grade students taking at least one AP or IB examination
divided by
number of 11th and 12th grade students

(2) The percent of scores at or above the criterion score (3 on AP or 4 on IB):

number of grade 11 and 12 AP & IB examination scores at or above criterion
divided by
number of grade 11 and 12 AP & IB examination scores

(3) The percent of examinees with at least one AP or IB score at or above the criterion score:

number of grade 11 and 12 AP or IB examinees who scored at or above criterion
divided by
number of grade 11 and 12 AP or IB examinees

The denominator of equation (1) does not include 11th and 12th grade students served in special education; however, all students who took at least one AP or IB examination are included in the numerator. The performance of special education students is included in both the numerator and denominator of the other equations.

(Sources: Educational Testing Service, a College Board contractor, Aug. 2002, Aug. 2001; The International Baccalaureate Organization, Aug. 2002, Aug. 2001; and PEIMS, Oct. 2002, Oct. 2001)
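
The three measures can be sketched as follows; this is an illustration only, with invented student records, and the special education handling mirrors the note above (excluded from the denominator of measure (1), included everywhere else).

    # Hypothetical grade 11-12 students and their AP/IB examination scores.
    students = [
        {"id": "A", "special_ed": False, "ap_scores": [3, 2], "ib_scores": []},
        {"id": "B", "special_ed": False, "ap_scores": [],     "ib_scores": [5]},
        {"id": "C", "special_ed": True,  "ap_scores": [4],    "ib_scores": []},
        {"id": "D", "special_ed": False, "ap_scores": [],     "ib_scores": []},
    ]
    AP_CRITERION, IB_CRITERION = 3, 4
    examinees = [s for s in students if s["ap_scores"] or s["ib_scores"]]

    # (1) Percent taking at least one examination: special education students are
    #     excluded from the denominator but counted in the numerator if they tested.
    pct_taking = 100.0 * len(examinees) / sum(1 for s in students if not s["special_ed"])

    # (2) Percent of scores at or above the criterion score.
    scores = ([(sc, AP_CRITERION) for s in examinees for sc in s["ap_scores"]]
              + [(sc, IB_CRITERION) for s in examinees for sc in s["ib_scores"]])
    pct_scores = 100.0 * sum(1 for sc, cut in scores if sc >= cut) / len(scores)

    # (3) Percent of examinees with at least one score at or above the criterion score.
    met = sum(1 for s in examinees
              if any(sc >= AP_CRITERION for sc in s["ap_scores"])
              or any(sc >= IB_CRITERION for sc in s["ib_scores"]))
    pct_examinees = 100.0 * met / len(examinees)

    print(pct_taking, pct_scores, pct_examinees)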

ARD: This refers to the Admission, Review, and Dismissal committee that determines the individual education plan for every student in special education. See also Special Education and TAKS Participation.

Attendance Rate: Attendance rates reported in AEIS are based on student attendance for the entire school year. Attendance is calculated as follows:

total number of days students were present in 2001-02
divided by
total number of days students were in membership in 2001-02

Attendance rates are shown for 2000-01 and 2001-02. Only students in grades 1-12 are included in the calculations. (Source: PEIMS, June 2002, June 2001)

Auxiliary Staff (District Profile only): This shows the Full-Time Equivalent (FTE) count of staff reported without a role but with a PEIMS employment and payroll record. Counts of auxiliary staff are expressed as a percent of total staff. For auxiliary staff, the FTE is simply the value of the percent of day worked expressed as a fraction. (Source: PEIMS, Oct. 2002)

Average Actual Salaries (regular duties only): For each professional staff type, the total salary is divided by the total FTE count of staff who receive that salary. The total actual salary amount is pay for regular duties only and does not include supplemental payments for coaching, band and orchestra assignments, and club sponsorships. (Source: PEIMS, Oct. 2002)

Average Teacher Salary by Years of Experience (regular duties only): Total pay for teachers within each experience group is divided by the total teacher FTE for the group. The total actual salary amount is pay for regular duties only and does not include supplements. (Source: PEIMS, Oct. 2002)

Average Years Experience of Teachers: Weighted averages are obtained by multiplying each teacher's FTE count by years of experience. These amounts are summed for all teachers and divided by the total teacher FTE count, resulting in the averages shown. Average years experience refers to the total number of (completed) years of professional experience for the individual, while average years experience with a district refers to tenure, i.e., the number of years employed in the reporting district, whether or not there has been any interruption in service. (Source: PEIMS, Oct. 2002)
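
A minimal sketch of the FTE-weighted average, using invented teacher records (the field names are not PEIMS element names):

    # Hypothetical teacher records.
    teachers = [
        {"fte": 1.0, "years_experience": 12, "years_in_district": 5},
        {"fte": 0.5, "years_experience": 3,  "years_in_district": 3},
        {"fte": 1.0, "years_experience": 0,  "years_in_district": 0},
    ]
    total_fte = sum(t["fte"] for t in teachers)

    # Multiply each teacher's FTE by years of experience, sum, and divide by total FTE.
    avg_experience = sum(t["fte"] * t["years_experience"] for t in teachers) / total_fte
    avg_tenure = sum(t["fte"] * t["years_in_district"] for t in teachers) / total_fte

    print(f"Average years of experience: {avg_experience:.1f}")   # 5.4
    print(f"Average years with district: {avg_tenure:.1f}")       # 2.6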

Budgeted Instructional Operating Expenditures by Program: These are budgeted instructional operating expenditures categorized by the individual program for which they were budgeted: Bilingual/ESL Education (Bilingual and Special Language Programs); Career and Technology Education; Compensatory Education (Accelerated and Title I Part A); Gifted and Talented Education; Regular Education (Basic Educational Services); Special Education (Services to Students with Disabilities); and Other (Alternative Education, Disciplinary Alternative Education, Athletics and Related Activities, and Undistributed). Percentages are expressed per total instructional operating expenditures. Instructional operating expenditures include those activities that deal directly with the instruction of pupils (functions 11, 95). Instructional Leadership expenditures (function 21) are not included. See also Appendix B. (Source: PEIMS, Oct. 2002)

Campus Group: Each school (also referred to as campus) has a unique comparison group of 40 other public schools (from anywhere in the state) that closely match that school on six characteristics. (Note that only schools that would normally carry a rating of Exemplary, Recognized, Acceptable, or Low-Performing are included in comparison groups.)

The demographic characteristics used to construct the campus comparison groups include those defined in statute as well as others found to be statistically related to performance. They are the percentages of African American, Hispanic, and White students, economically disadvantaged students, limited English proficient students, and mobile students.

All schools are first grouped by type (elementary, middle, secondary, or multi-level). Then the group is determined on the basis of the most predominant features at the target school. In the attached example (Appendix D), the target school (Sample H S) has 7.6% African American, 36.8% Hispanic, 53.9% White, 28.2% economically disadvantaged, 10.7% limited English proficient, and 23.7% mobile students. Of these features, the most predominant (i.e., the largest) is the percent of White students, followed by the percent of Hispanic students, the percent of economically disadvantaged students, the percent of mobile students, the percent of limited English proficient students, and finally, the percent of African American students. The following steps illustrate the group identification process:

Step 1: 100 secondary campuses having percentages closest to 53.9% White students are identified;

Step 2: 10 schools from the initial group of 100 are eliminated on the basis of being most distant from the value of 36.8% Hispanic;

Step 3: 10 of the remaining 90 schools that are most distant from 28.2% economically disadvantaged students are eliminated;

Step 4: 10 of the remaining 80 schools that are most distant from 23.7% mobile students are eliminated;

Step 5: 10 of the remaining 70 schools that are most distant from 10.7% limited English proficient students are eliminated;

Step 6: 10 of the remaining 60 schools that are most distant from 7.6% African American students are eliminated; and

Step 7: 10 of the remaining 50 schools that are most distant from 7.6% African American and/or 28.2% economically disadvantaged students are eliminated. (This last reduction step is based on the least predominant characteristics among the four student groups evaluated in the accountability system: African American, Hispanic, White, and economically disadvantaged.)

The final group size is 40 schools. This methodology creates a unique comparison group for every campus. Please note the following:

In the performance section of a campus AEIS report, the value given in the Campus Group column is the median of the values from the 40-school group for that campus. (The median is defined as that point in the distribution of values, above and below which one-half of the values fall.) In the profile section of the report, the value given in the Campus Group column is the mean, or average value. If a report contains question marks (?) in the Campus Group column, this means there were too few schools in the comparison group (specifically, fewer than 25 schools) to have confidence in the median values. Such small numbers are considered too unstable to provide an adequate comparison group value.
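
The steps above amount to a successive-elimination procedure. The following Python sketch is a simplified illustration only, not TEA's actual program: it orders the six characteristics from most to least predominant at the target campus, keeps the 100 same-type campuses closest on the first characteristic, and then drops the 10 most distant campuses on each remaining characteristic in turn. Step 7 is approximated here by a summed distance on the two least predominant of the four accountability student groups; the actual handling of ties and of unrated campuses may differ.

    # Simplified sketch; campus dictionaries and field names are hypothetical.
    CHARACTERISTICS = ["white", "hispanic", "afr_amer", "econ_disadv", "lep", "mobile"]
    ACCOUNTABILITY_GROUPS = ["afr_amer", "hispanic", "white", "econ_disadv"]

    def campus_group(target, campuses):
        # Only campuses of the same school type are considered.
        pool = [c for c in campuses if c["type"] == target["type"] and c is not target]

        # Order the six characteristics from most to least predominant at the target.
        order = sorted(CHARACTERISTICS, key=lambda ch: target[ch], reverse=True)

        # Step 1: keep the 100 campuses closest on the most predominant characteristic.
        pool = sorted(pool, key=lambda c: abs(c[order[0]] - target[order[0]]))[:100]

        # Steps 2-6: drop the 10 most distant campuses on each remaining characteristic.
        for ch in order[1:]:
            pool = sorted(pool, key=lambda c: abs(c[ch] - target[ch]))[:len(pool) - 10]

        # Step 7 (approximation): drop 10 more using a summed distance on the two least
        # predominant of the four accountability student groups.
        least_two = sorted(ACCOUNTABILITY_GROUPS, key=lambda ch: target[ch])[:2]
        pool = sorted(pool, key=lambda c: sum(abs(c[ch] - target[ch]) for ch in least_two))
        return pool[:len(pool) - 10]   # 40 campuses remain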

Class Size Averages by Grade and Subject: These values show the average class size for elementary classes (by grade) and for secondary classes (by selected subject). Classes identified as serving regular, compensatory/remedial, gifted and talented, career and technology, and honors students are included in these averages. Fine arts classes, classes not designated as "regular," and classes where the number of students served is reported to be zero are excluded. Districts do not report actual class size averages. A TEA-developed methodology is applied to the teacher class schedule (responsibility) information reported by districts. TEA-computed class size averages are shown on both the campus and district reports.

The methodology differs depending on whether the class is elementary or secondary due to differences in reporting practices for these two types of teacher schedules. For secondary classes, each unique combination of teacher and class time is counted as a class. Averages are determined by summing the number of students served (in a given subject at the campus) and dividing by the calculated count of classes.

For elementary classes, the number of records reported for each grade is considered. For example, a teacher teaching a variety of subjects to the same group of fourth graders all day should have only one record indicating the total number of fourth grade students served. However, an elementary teacher who teaches a single subject to five different sections of fourth graders each day will have five separate records reported, each with a unique count of students served. Average class sizes are calculated by summing all the students served (in a given grade at the campus) and dividing by the sum of the teacher FTE counts for those records. So, for example, a full-time mathematics teacher with five sections of fourth graders, with 20 different students in each, would have an average of 100/5 or 20 students.
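
The two averaging rules might be sketched as follows; the record layouts and FTE values are invented for illustration and do not represent actual PEIMS responsibility records.

    # Hypothetical teacher-schedule (responsibility) records.
    secondary_records = [   # one record per unique teacher/class-time combination
        {"subject": "English", "students": 22},
        {"subject": "English", "students": 26},
    ]
    elementary_records = [  # one record per teacher per grade, with a teacher FTE value
        {"grade": 4, "students": 20, "teacher_fte": 1.0},
        {"grade": 4, "students": 22, "teacher_fte": 1.0},
    ]

    # Secondary: total students served divided by the count of classes.
    sec_avg = sum(r["students"] for r in secondary_records) / len(secondary_records)

    # Elementary: total students served divided by the sum of the teacher FTE counts.
    elem_avg = (sum(r["students"] for r in elementary_records)
                / sum(r["teacher_fte"] for r in elementary_records))

    print(f"Secondary English average class size: {sec_avg:.1f}")   # 24.0
    print(f"Grade 4 average class size: {elem_avg:.1f}")            # 21.0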

College Admissions Tests: See SAT/ACT Results.

Commended Performance: See TAKS Commended Performance.

Comparable Improvement: There are no Comparable Improvement reports for 2002-03. This measure was based on comparing two years of results on the former state-mandated examination, the TAAS, to determine the growth of student performance. A new comparable improvement measure will be developed for the TAKS. For more information on Comparable Improvement, refer to the 2001-02 AEIS Glossary.

Completion/Student Status Rate: These longitudinal rates show the status of the students expected to graduate with the class of 2002 who first attended ninth grade in the 1998-99 school year. This group of students is known as the 1998-99 cohort, and their progress was tracked over four years using the data provided to TEA by districts. Any student who transferred into the 1998-99 cohort over that period is added to it, and any student who transferred out of the 1998-99 cohort is subtracted from it.

This indicator is reported for districts as well as for high schools that have had continuous enrollment in grades 9-12 since at least the 1998-99 school year. The four final outcomes are:

(1) Percent Graduated: Based on the 1998-99 cohort, this shows the percentage who received their high school diploma on time or earlier, that is, by the end of the 2001-02 school year. It is calculated as follows:

number of students from the cohort who received a high school diploma by the end of 2001-02
divided by
number of students in the 1998-99 cohort

(2) Percent Received GED: Based on the 1998-99 cohort, this shows the percentage who received a General Educational Development certificate before March 1, 2003. It is calculated as follows:

number of students from the cohort who received a GED
divided by
number of students in the 1998-99 cohort

(3) Percent Continued High School: Based on the 1998-99 cohort, this shows the percentage still enrolled as students for the 2002-03 school year. It is calculated as follows:

number of students from the cohort who were enrolled for the 2002-03 school year
divided by
number of students in the 1998-99 cohort

(4) Percent Dropped Out: Based on the 1998-99 cohort, this shows the percentage who dropped out and did not return by the fall of the 2002-03 school year. It is calculated as follows:

number of students from the cohort who dropped out before the fall of the 2002-03 school year
divided by
number of students in the 1998-99 cohort

These four outcomes account for every student in the cohort, so the four percentages sum to 100% (the total may differ slightly from 100% due to rounding).

For purposes of comparison, the completion/student status rate for the class of 2001 is also provided. For further information on these rates, see the report Secondary School Completion and Dropouts in Texas Public Schools 2001-02. (Source: PEIMS, Oct. 2002, June 2002, Oct. 2001, June 2001, Oct. 2000, June 2000, Oct. 1999, June 1999, Oct. 1998, June 1998, June 1997, June 1996)
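
A minimal sketch of how the four outcomes partition a cohort, using an invented status list:

    # Hypothetical final statuses for a small 1998-99 cohort; codes are illustrative only.
    cohort = ["graduated", "graduated", "graduated", "ged", "continued", "dropped_out"]

    n = len(cohort)
    rates = {status: 100.0 * cohort.count(status) / n
             for status in ("graduated", "ged", "continued", "dropped_out")}

    print(rates)                         # each student falls into exactly one outcome...
    print(round(sum(rates.values())))    # ...so the four rates sum to 100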

Criterion Score: This refers to the benchmark scores used for the SAT and ACT college admissions tests and for the AP and IB examinations. For college admissions tests, the criterion scores are at least 24 on the ACT (composite) and at least 1110 on the SAT (total). For AP and IB examinations, the criterion scores are at least 3 on AP tests and at least 4 on IB tests. See also SAT/ACT Results and AP/IB Results.

Data Quality (from District Profile Section): The AEIS reports show the percent of errors a district made in two key data submissions: 1) the PID Error rate, and 2) the Underreported Student percent.

(1) The Person Identification Database (PID) system ensures that each time information is collected for a student, the identifying information matches other data collections for that student. This allows student data to be linked (for example, enrollment records collected in October to attendance records collected in June) and matched across years. It also helps maintain student confidentiality by assigning an ID that does not divulge the student's identifying information.

During the data submission process each district has the ability to run PID Discrepancy Reports that show any PID errors found. The district then has time to correct the errors before its submission is finalized. While the PID error rate has declined significantly over the years, any amount of error has a detrimental effect on the calculation of longitudinal measures such as the four-year dropout rate and the high school completion rate. The AEIS reports show the student PID error rate for PEIMS Submission 1 (Fall 2002).

The rate is calculated as follows:

number of student PID errors found in PEIMS submission 1 (fall 2002)
divided by
number of student records in PEIMS submission 1 (fall 2002)

(2) Underreported students are 7th-12th graders who were enrolled at any time during the prior year but whom the district has not accounted for in the current year. In other words, they were reported neither as returning to school nor as leavers. Leaver reasons include having graduated or received a GED, having died, having dropped out, or having transferred to another school. (For a more complete definition of leavers, see Leaver Record.)

The rate is calculated as follows:

Underreported Students
divided by
Returning Students + Leavers (incl. Overreported Leavers) + Underreported Students

Note that in some cases districts overreport students as leavers. This means that a district reports a student as having left the district even though there is no record of the student having attended in the district during the prior year. These students are not removed from the formula above.

Under the accountability rating system, there have been consequences for districts that exceeded certain thresholds for this measure. An underreported rate greater than 10.0% or a number over 1,000 is noted with a double asterisk (**) on the AEIS report. Districts with five or fewer underreported students and a rate greater than 10.0% are not considered to be over the threshold. In the past, any district that exceeded this threshold could not be rated Exemplary or Recognized. (Source: PEIMS, Oct. 2002, June 2002, Oct. 2001)
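
The underreported student rate and the reporting threshold described above can be sketched as follows, with invented counts:

    # Hypothetical counts for one district.
    returning_students = 4200
    leavers = 310            # includes overreported leavers, which stay in the denominator
    underreported = 55       # prior-year 7th-12th graders not accounted for this year

    rate = 100.0 * underreported / (returning_students + leavers + underreported)
    print(f"Underreported student rate: {rate:.1f}%")   # about 1.2%

    # Double-asterisk threshold: rate above 10.0% or more than 1,000 students,
    # except when five or fewer students are underreported.
    flagged = underreported > 1000 or (rate > 10.0 and underreported > 5)
    print("Over threshold:", flagged)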

Distinguished Achievement Program: See RHSP/DAP Graduates.

Dropout: A student is identified as a dropout if he or she is absent without an approved excuse or documented transfer and does not return to school by the fall of the following year, or if he or she completes the school year but fails to re-enroll the following school year.

Dropout counts are obtained from PEIMS records. Districts report the status of all students who were enrolled in grades 7 - 12 in the district during the prior school year in one of two ways: as being currently in school (Enrollment record) or as having left school (Leaver record). The Leaver record provides 30 possible reasons for leaving school. Generally, a school leaver can be put into one of four categories:

(1) The student graduated or received a GED.

(2) The student died.

(3) The student left school with appropriate documentation of continuing education elsewhere.

(4) The student left school for other reasons.

Many students coded with reasons under the fourth category are considered dropouts. However, before the dropout rate is finalized, a statewide reconciliation system is run in which information about reported leavers is merged with statewide enrollment and attendance records, graduation records, and GED records. Students who are found in these files, indicating they were incorrectly reported as dropouts, are excluded from the dropout rate for the school and district. Students not found in those files are considered official dropouts. See also Annual Dropout Rate.

(Source: PEIMS, Oct. 2002, June 2002, Oct. 2001, June 2001, Oct. 2000, and June 2000; and General Educational Development Information File)

Dropout Rate: See Annual Dropout Rate.

Economically Disadvantaged: The percent of economically disadvantaged students is calculated as the sum of the students coded as eligible for free or reduced-price lunch or eligible for other public assistance, divided by the total number of students:

number of students coded as eligible for free or reduced-price lunch or other public assistance
divided by
total number of students

See also Campus Group and Total Students. (Source: PEIMS, Oct. 2002, Oct. 2001; and TEA Student Assessment Division)

Educational Aides: Educational aides are staff who are reported with a role of 033 (Educational Aide) or 035 (Interpreter). These aides are referred to as paraprofessional staff. The FTE counts of educational aides are expressed as a percent of the total staff FTE. (Source: PEIMS, Oct. 2002)

End-of-Course Examination: This indicator is not shown on the 2002-03 AEIS reports because students are no longer required to take an End-of-Course examination after completing Biology, Algebra I, English II, or U.S. History classes as part of the state assessment program.

Enrollment: See Total Students.

Ethnic Distribution: Students are reported as White, African American, Hispanic, Asian/Pacific Islander, and Native American. In the profile section, both counts and percentages of the total number of students are shown. (Source: PEIMS, Oct. 2002, Oct. 2001; Educational Testing Service; American College Testing Program; and TEA Student Assessment Division)

Exclusions: These are counts of individuals who serve public school students, but are not included in the FTE totals for any of the other employee statistics. There are two types of these entries: individuals participating in a shared services arrangement and individuals on contract with the district to provide instructional services.

Shared Services Arrangement (SSA) Staff (District Profile only) work in schools located in districts other than their employing district, or their assigned organization shows a code of 751, indicating that they are employed by the fiscal agent of an SSA. Only the portion of a person's total FTE amount associated with the school in another district (or with the 751 organization code) is counted as SSA. SSA staff are grouped into three categories: Professional Staff (which includes teachers, administrators, and professional support); Educational Aides; and Auxiliary Staff. Note that SSA Auxiliary Staff are identified by the type of fund from which they are paid.

Contracted Instructional Staff (District and Campus Profile) refers to counts of instructors for whom the district has entered into a contractual agreement with some outside organization. Through the contract, the outside organization has committed to supplying instructional staff for the district. They are never employees of the reporting school district. (Source: PEIMS, Oct. 2002)

FTE: Full-Time Equivalent.

Fund Balance Information (from District Profile Section): The amount of undesignated, unreserved fund balance that existed at the end of the 2001-02 school year is reported for each district.

The unreserved fund balance is not legally restricted and has two components: designated and undesignated. The designated component requires local board action to earmark the balance for bona fide purposes that will be fulfilled within a reasonable period of time. The undesignated component is available to finance monthly operating expenditures.

The amount reported in the AEIS report is the undesignated component, calculated as the difference between the total unreserved fund balance and the designated unreserved fund balance. This balance amount is expressed as a percent of the total budgeted expenditures (for the general fund) for the current year (2002-03) as specified in statute. (Source: Financial Audit Report, Dec. 2002)

Gold Performance Acknowledgment: Gold Performance Acknowledgment (GPA) is a system of recognition for high performance on measures beyond the base indicators used to assign accountability ratings. GPA appeared for the first time in August 2002 and replaced the former system of Additional Acknowledgments. Unlike district accountability ratings, acknowledgments awarded to districts in 2002 were not carried forward and do not appear on the 2002-03 AEIS reports.

Graduates (Class of 2002): In the profile section, this is the total number of graduates (including summer graduates) for the 2001-02 school year, as reported by districts in the fall of 2002. The value includes 12th graders who graduated as well as graduates from other grades. Students in special education who graduate are included in the totals, and are also reported as a separate group. Counts of students graduating under the recommended high school or distinguished achievement programs are also shown.

Students graduating with the class of 2002 could be coded with one of the following types:

Counts of graduates are calculated slightly differently for three of the indicators on the performance section of the AEIS report:

See also Completion/Student Status Rate and RHSP/DAP Graduates. (Source: PEIMS, Oct. 2002)

International Baccalaureate (IB): See AP/IB Results.

Leaver Record: In the fall of each year, districts report all 7th through 12th grade students who were enrolled or in attendance at any point during the prior year but who did not re-enroll the following fall. This group of "leavers" includes students such as those who graduated or received a GED, moved to another district, state, or country, died, or dropped out. This information is sent to TEA in Submission 1 of the annual PEIMS data collection.

After the data submission process is complete, PEIMS and several other statewide databases are searched to determine if any of the leaver records can appropriately be excluded from consideration as dropouts for accountability ratings purposes. Students' leaver records are excluded from the district and campus list of dropouts if the students:

See also Data Quality. (Source: PEIMS, Oct. 2002, June 2002, Oct. 2001, June 2001, Oct. 2000, and June 2000; General Educational Development Information File; Secondary School Completion and Dropouts in Texas Public Schools, 2001-02, Texas Education Agency)

Limited English Proficient (LEP): These are students identified as limited English proficient by the Language Proficiency Assessment Committee (LPAC) according to criteria established in the Texas Administrative Code. Not all pupils identified as LEP receive bilingual or English as a second language instruction, although most do. Percentages are calculated by dividing the number of LEP pupils by the total number of students in the school or district.

This year Section I of the AEIS reports (Performance Section) includes a column showing the performance of LEP students on the different indicators.

See also Campus Group, TAKS Participation. (Source: PEIMS, Oct. 2002)

Longitudinal Dropout Rate: See Completion/Student Status Rate.

Met Standard: This refers to the TAKS passing standard. For a detailed explanation, see TAKS Panel Recommendation.

Mobile Subset: This refers to the group of TAKS test takers whose performance is normally excluded when determining a school or district's accountability rating. Students may take the test but be excluded for accountability ratings purposes if they were not enrolled in that district on the last Friday in the previous October. Note that this calculation is different from that used to determine Mobility (below). See also Accountability Subset, TAKS Participation, and Appendix E.

Mobility (from Campus Profile Section): A student is considered to be mobile if he or she has been in membership at the school for less than 83% of the school year (i.e., has missed six or more weeks at a particular school). The mobility rate is calculated as follows:

number of mobile students in 2001-02
divided by
number of students who were in attendance at any time during the 2001-02 school year

This rate is calculated at the campus level. The mobility rate shown in the profile section of campus reports under the "district" column is based on the count of mobile students identified at the campus level. That is, the district mobility rate reflects school-to-school mobility, within the same district or from outside the district. See also Campus Group. (Source: PEIMS, June 2002)
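
A minimal sketch of the campus-level calculation, with invented membership counts (a 180-day school year is assumed here purely for illustration):

    # Hypothetical attendance records; a student is mobile if in membership
    # for less than 83% of the school year.
    DAYS_IN_SCHOOL_YEAR = 180
    students = [
        {"id": "A", "days_in_membership": 175},
        {"id": "B", "days_in_membership": 120},   # mobile
        {"id": "C", "days_in_membership": 180},
    ]

    mobile = sum(1 for s in students
                 if s["days_in_membership"] / DAYS_IN_SCHOOL_YEAR < 0.83)

    # Denominator: every student in attendance at any time during the year.
    print(f"Mobility rate: {100.0 * mobile / len(students):.1f}%")   # 33.3%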

n/a: This indicates data that are not available or are not applicable.

Number of Students per Teacher: This shows the total number of students divided by the total teacher FTE count. (Source: PEIMS, Oct. 2002)

Paired Schools: For accountability purposes, schools that reported enrollment but did not have grades in which the state-mandated test was given (e.g. K-2 schools) were paired with schools with which they had a "feeder" relationship to determine accountability ratings. Prior to 2003, districts were asked each year to supply the pairing relationship for the schools before receiving test results. Because schools will not receive an accountability rating this year, no schools will be paired.

Panel Recommendation: See TAKS Panel Recommendation.

Permits by Type (from District Profile Section): This indicates the number of permits issued by permit type. Individuals may be issued more than one permit; for that reason only counts are shown, not percentages. Permit types are

(Source: PEIMS, Oct. 2002)

Per Pupil Expenditures: This value shows budgeted expenditures for groups of functions divided by the total number of students in the district or school. Note that the number shown is not the amount actually spent per pupil, but rather a per-pupil average of the total budget. Per pupil expenditures are shown for total expenditures and for various groupings of operating categories. See also Total Operating Expenditures by Function for definitions of each functional group, and Total Campus Budget by Function for definitions of each functional group shown on the campus report.

In the "per pupil" sections on both the district and campus reports, instructional leadership is combined with the instruction category in order to comply with legislative mandates that instructional costs per pupil and administrative costs per pupil be reported. Please note that when comparing averages for school-level expenditures, the state and district averages include all types of schools. For example, a high school's per pupil expenditures may not be comparable to the state average because the state value includes elementary and middle schools, which typically have lower per pupil expenditures than high schools. Other variables that may affect comparisons are the experience level of teachers and administrators, the types of instructional programs offered, and the student characteristics. See also Appendix B. (Source: PEIMS, Oct. 2002)

Professional Staff: This is a full-time equivalent (FTE) count of teachers, professional support staff, campus administrators, and, on the district profile, central administrators. Staff are grouped according to the PEIMS roles reported. Each type of professional staff is shown as a percentage of the total staff FTE. See also Appendix A. (Source: PEIMS, Oct. 2002)

Reading Proficiency Tests in English (RPTE): See RPTE Change.

Recommended High School Program: See RHSP/DAP Graduates.

Retention Rates by Grade: The retention rate, reported in the profile section, shows the percent of students in Texas public schools who enrolled in 2002-03 in the same grade they attended during the last reported six-week period of the prior year (2001-02). It is calculated as follows:

number of students not advanced to the next grade
divided by
number of students advanced to the next grade + number of students not advanced to the next grade

Note that all special education retention rates are calculated and reported separately from the rates of non-special education students because local retention practices appear to differ greatly between these two populations of students.

The AEIS report shows retention rates only for grades K-8. Retention rates for all grades may be found in Grade-Level Retention in Texas Public Schools, 2001-02, available from TEA. (Source: PEIMS, Oct. 2002, June 2002)
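
A minimal sketch of the grade 3 calculation, using invented year-over-year grade records:

    # Hypothetical records: grade in the last six-week period of 2001-02 and grade in 2002-03.
    students = [
        {"grade_2001_02": 3, "grade_2002_03": 4},
        {"grade_2001_02": 3, "grade_2002_03": 3},   # retained
        {"grade_2001_02": 3, "grade_2002_03": 4},
        {"grade_2001_02": 3, "grade_2002_03": 4},
    ]

    retained = sum(1 for s in students if s["grade_2002_03"] == s["grade_2001_02"])
    advanced = sum(1 for s in students if s["grade_2002_03"] > s["grade_2001_02"])

    print(f"Grade 3 retention rate: {100.0 * retained / (advanced + retained):.1f}%")  # 25.0%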

Revenues by Source (District Profile only): Budgeted revenues for groups of object categories are expressed as a percent of total revenue. The amounts appearing as revenue in any of the categories shown are the amounts that were budgeted by districts in the general fund (fund 199, including state food services, and fund 420 for charters), the National School Breakfast and Lunch Program (240, 701), and the debt service funds (599). The groups of object categories are:

The Special Revenue Funds (including Shared Services Arrangements) and the Capital Projects Funds are not reported to the TEA by districts and so do not appear here. See also Appendix B. (Source: PEIMS, Oct. 2002)

RHSP/DAP Graduates: This indicator shows the percent of graduates who were reported as having satisfied the course requirements for the Texas State Board of Education Recommended High School Program or Distinguished Achievement Program. It is calculated as follows:

number of graduates reported with graduation codes for
Recommended High School Program or Distinguished Achievement Program
divided by
number of graduates

See also Graduates. (Source: PEIMS, Oct. 2002, Oct. 2001)

RPTE Change (Reading Proficiency Tests in English): These tests are designed to measure annual growth in the English reading proficiency of second language learners and are used along with English and Spanish TAKS to provide a comprehensive assessment system for limited English proficient (LEP) students.

The RPTE is constructed with items from each of three levels of proficiency: Beginning, Intermediate, and Advanced. LEP students in grades 3-12 are required to take the RPTE until they achieve advanced proficiency. Once they achieve a rating of Advanced, they take the TAKS (English or Spanish) in subsequent years.

For students who were at the Beginning or Intermediate level in 2002, the AEIS report shows the percent who scored at each of the three levels in 2003. Students included in the measure are those who:

Prior year RPTE Change is also shown. (Source: TEA Student Assessment Division)

SAT/ACT Results: These include the College Board's SAT and ACT, Inc.'s ACT Assessment. Both testing companies annually provide the agency with testing information on the most recent test participation and performance of graduating seniors from all Texas public schools. Only one record is sent per student. If a student takes an ACT or SAT test more than once, the agency receives the record for the most recent examination taken.

Three values are calculated for this indicator:

(1) The percent of graduates who took either college admissions test:

number of graduates who took either the SAT or the ACT
divided by
number of graduates

(2) The percent of examinees who scored at or above the criterion score on either test (1110 on the SAT, or 24 on the ACT):

number of examinees who scored at or above criterion
divided by
number of examinees

(3) The average (mean) score for each (SAT total and ACT composite), calculated as follows:

total score for all students who took the SAT
divided by
number of students who took the SAT

and

total score for all students who took the ACT
divided by
number of students who took the ACT

Note that "graduates" in the denominator of equation (1) does not include special education graduates; however, special education graduates who took either the SAT or ACT are included in the numerator. (See Graduates.) For purposes of year-to-year comparison, results are reported for graduating seniors in the class of 2002 and the class of 2001.

(Source: Educational Testing Service, a College Board contractor (SAT) Sept. 2002, Oct. 2001; ACT, Inc. (ACT) Oct. 2002, Oct. 2001; and PEIMS, Oct. 2002, Oct. 2001)
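
The three values might be computed as in the sketch below; the graduate records are invented, and the special education handling follows the note above.

    # Hypothetical graduate records.
    graduates = [
        {"special_ed": False, "sat_total": 1150, "act_composite": None},
        {"special_ed": False, "sat_total": None, "act_composite": 22},
        {"special_ed": True,  "sat_total": 1010, "act_composite": None},
        {"special_ed": False, "sat_total": None, "act_composite": None},
    ]
    SAT_CRITERION, ACT_CRITERION = 1110, 24
    examinees = [g for g in graduates
                 if g["sat_total"] is not None or g["act_composite"] is not None]

    # (1) Percent tested: special education graduates are excluded from the denominator
    #     but counted in the numerator if they tested.
    pct_tested = 100.0 * len(examinees) / sum(1 for g in graduates if not g["special_ed"])

    # (2) Percent of examinees at or above the criterion score on either test.
    at_or_above = sum(1 for g in examinees
                      if (g["sat_total"] or 0) >= SAT_CRITERION
                      or (g["act_composite"] or 0) >= ACT_CRITERION)
    pct_at_or_above = 100.0 * at_or_above / len(examinees)

    # (3) Mean scores, computed separately for each test.
    sat = [g["sat_total"] for g in graduates if g["sat_total"] is not None]
    act = [g["act_composite"] for g in graduates if g["act_composite"] is not None]
    print(pct_tested, pct_at_or_above, sum(sat) / len(sat), sum(act) / len(act))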

School Type: For purposes of creating the Campus Groups, schools are placed into one of four classifications based on the lowest and highest grades in which students are enrolled (in membership) at the school: elementary, middle (including junior high school), secondary, and elementary/secondary (K-12), also referred to as multi-level.

SDAA: See State-Developed Alternative Assessment.

SEM: See Standard Error of Measurement.

Special Education: This refers to the population served by programs for students with disabilities. Assessment decisions for students in special education programs are made by their Admission, Review, and Dismissal (ARD) committee, which is made up of the student's parent(s) or guardian, teacher, administrator, and other concerned parties. A student in special education may take the TAKS or SDAA tests or be exempted from one or all of them.

Other indicators that include the performance of students in special education are: advanced course completion, attendance rate, dropout rate, completion rate, recommended high school program, TAAS cumulative pass rate, and TAAS/TASP equivalency. Information that would allow the separation of performance of special education students on college admissions tests and on Advanced Placement and International Baccalaureate examinations is not available. Note that in the profile section of the report, retention rates are shown separately for special education and non-special education students. See also State-Developed Alternative Assessment and TAKS Participation. (Source: PEIMS, Oct. 2002, Oct. 2001, and TEA Student Assessment Division)

Special Education Compliance Status: The Texas Education Agency is required to report the special education compliance status (SpECS) of each district and charter in the state on the AEIS reports. Districts and charters may receive a status of:

For a description of each status, refer to Appendix F. If you have questions about this item, contact the Division of Accountability Development and Support at (512) 463-9716. (Source: Division of Accountability Development and Support)

Standard Error of Measurement (SEM): A way to understand the standard error of measurement as it relates to tests is the following:

If a single student were to take the same test repeatedly (with no new learning taking place between testings and no memory of questions), the standard deviation of his/her repeated test scores is denoted as the standard error of measurement.

The TAKS transition plan implemented by the State Board of Education uses the standard error of measurement to phase in the passing standards over three years. For a complete explanation of the plan, see TAKS Panel Recommendation. (Source: Student Assessment Division)

Standardized Local Tax Base (comptroller valuation) (District Profile only): The Comptroller conducts a study each year that uniformly evaluates the property values within school district boundaries. Locally assessed values may vary from the Comptroller's study values. The values certified by the Comptroller's Property Tax Division (Comptroller Valuation) are standardized in that they are deemed to be comparable across the state. Note that the values shown are final for tax year 2002. This is not the property value used for school funding calculations.

Value per Pupil: school district property value, or Standardized Local Tax Base, divided by the total number of students. This per pupil figure is one definition of "wealth." Note that the values shown are final for tax year 2002.

Value by Category: shows aggregates of individual property tax categories expressed as a percent of the Comptroller's property value before the exemptions are applied. Thus, the sum of the category values will exceed the value used for per pupil calculations. Note that the values shown are final for tax year 2002. (Source: Texas Comptroller of Public Accounts, July 2003)

State-Developed Alternative Assessment (SDAA): This test assesses special education students in Grades 3-8 who are receiving instruction in the Texas Essential Knowledge and Skills (TEKS) but for whom TAKS is an inappropriate measure of their academic progress. SDAA tests are given in the areas of reading, writing, and mathematics. Students are assessed at their appropriate instructional levels, as determined by their Admission, Review, and Dismissal (ARD) committees. The SDAA is administered on the same schedule as TAKS and is designed to measure annual growth based on appropriate expectations for each student as decided by the student's ARD committee. The AEIS report shows the percent of students tested who met the ARD committee's expectations for their 2003 performance.

Results are calculated as follows:

number of SDAA test takers who met their 2003 ARD expectations on all tests taken
divided by
number of SDAA test takers

Note that state statute does not permit reporting SDAA results by grade level or subject area, therefore:

This indicator includes only those students who were part of the Accountability Subset. For purposes of comparison the 2002 SDAA values have been recomputed to include only those students in the Accountability Subset.

See also Accountability Subset, and TAKS Participation. (Source: Student Assessment Division)

Student Enrollment by Program: Students are identified as served in programs for Special Education, Career and Technology Education, Bilingual/ESL Education, or Gifted and Talented Education. The percentages do not sum to 100, as a student may be enrolled in more than one of these programs. (Source: PEIMS, Oct. 2002)

Student Success Initiative: In 1999, as part of the mandate for the new TAKS tests, the Texas Legislature included new grade advancement testing requirements. Beginning in 2003, students in 3rd grade must pass the reading portion of the TAKS in order to be promoted to the 4th grade. Students are given three opportunities to pass the TAKS reading test: in March, April, and July. The AEIS report shows two measures on this new indicator:

(1) Students Requiring Accelerated Instruction. Grade 3 students who did not pass the TAKS reading test during the first administration (March) must be provided accelerated instruction in preparation for the second administration in April:

number of eligible students who did not meet the standard
divided by
number of eligible students

The number of eligible students is calculated from the test answer documents and includes all students who were tested, students who should have been tested but were absent, and students whose answer documents are invalid for some reason. (The count of eligible students does not include students who have a special education or LEP exemption.)

(2) TAKS Second Administration Met Standard: The percent of students who took and passed the second administration (in April) of the grade 3 TAKS reading test:

number of students who passed 2nd administration of TAKS reading
divided by
number of students who took 2nd administration of TAKS reading

The measures include results from both the English and Spanish versions of the TAKS grade 3 reading test. Additional Student Success Initiative measures will be developed as data become available.

Other grade advancement testing requirements will be phased in for grades 5 and 8 in reading and mathematics. For more information on the Student Success Initiative, go to the website for TEA's Student Assessment Division.
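
The two measures above can be sketched with invented counts as follows (the counts are not from any actual campus):

    # Hypothetical grade 3 TAKS reading counts.
    eligible_first_admin = 110   # tested, absent, or invalid document; exemptions excluded
    failed_first_admin = 18      # did not meet the standard in the March administration

    took_second_admin = 17
    passed_second_admin = 11

    pct_requiring_acceleration = 100.0 * failed_first_admin / eligible_first_admin
    pct_passed_second = 100.0 * passed_second_admin / took_second_admin

    print(f"Students requiring accelerated instruction: {pct_requiring_acceleration:.1f}%")
    print(f"Second administration met standard: {pct_passed_second:.1f}%")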

Students by Grade: Percentages are calculated by dividing the number of students in each grade by the total number of students. (Source: PEIMS, Oct. 2002)

Students with Disciplinary Placements: Counts and percents of students placed in alternative education programs under Chapter 37 of the Texas Education Code (Discipline; Law and Order) are shown (for the 2001-02 school year) in the AEIS reports. Disciplinary placement counts are obtained from PEIMS records. Districts report the disciplinary actions taken toward students who are removed from the classroom for at least one day. Although students can have multiple removals throughout the year, this measure counts students only once and includes only those whose removal results in a placement in a disciplinary alternative education program or juvenile justice alternative education program. It is calculated as follows:

number of students with one or more disciplinary placements
divided by
number of students who were in attendance at any time during the school year

(Source: PEIMS, June 2002)

TAAS (Texas Assessment of Academic Skills): The TAAS was the state-mandated assessment of student performance given to Texas public school students from 1990 through 2002. In 2003 the TAKS (Texas Assessment of Knowledge and Skills) was administered for the first time. For more information about the TAAS, see the 2001-02 AEIS Glossary.

TAAS Exit-level Cumulative Pass Rate (from District Performance Section): The TAAS cumulative pass rate for the class of 2003 shows the percent of students who first took the TAAS exit-level test in spring 2001 and eventually passed all TAAS tests taken (in the same district) by spring 2003. This measure is intended to show the relative success of districts in their efforts to help all their students pass the exit-level TAAS, which remains a graduation requirement for Texas public school students through 2003-04. (Students in the class of 2005 will be required to pass the exit-level TAKS test.)

Test takers included in the TAAS Exit-level Cumulative Pass Rate for the class of 2003:

Test takers NOT included in the TAAS Exit-level Cumulative Pass Rate:

The information is available by sex and ethnicity but not by economic status or LEP. The performance of special education students is included in all the values and is not reported separately. Results of this indicator are also shown for the class of 2002. (Source: TEA Student Assessment Division)

TAAS/TASP Equivalency: This indicator shows the percent of graduates from the class of 2002 who did well enough on the exit-level TAAS to have a 75% likelihood of passing the Texas Academic Skills Program (TASP) test. To be counted for this indicator a student must have achieved a TLI of X-81 or higher on the TAAS reading test, a TLI of X-77 or higher on the TAAS mathematics test, and a scale score of 1540 or higher on the TAAS writing test.

Test takers included in the TAAS/TASP Equivalency:

Test takers NOT included in the TAAS/TASP Equivalency:

Results of the TAAS/TASP Equivalency are also shown for the class of 2001. Note that the Accountability Subset does not apply to this indicator.

See also TASP and Graduates. (Source: TEA Student Assessment Division; PEIMS, Oct. 2002, and Oct. 2001)

TAKS (Texas Assessment of Knowledge and Skills): The Texas Assessment of Knowledge and Skills (TAKS) is a comprehensive testing program for public school students in grades 3-11. TAKS replaces the Texas Assessment of Academic Skills (TAAS) and is designed to measure the extent to which a student has learned, understands, and is able to apply the important concepts and skills expected at each tested grade level.

Students are tested during the spring semester of each school year in various subjects. The grades and subjects shown on the AEIS reports (for the first administration of the test only):

Every TAKS test is directly linked to the Texas Essential Knowledge and Skills (TEKS) curriculum. The TEKS is the state-mandated curriculum for Texas public school students. Essential knowledge and skills taught at each grade build upon the material learned in previous grades.

For 2003, the AEIS report shows percent passing TAKS in several ways. Below are some definitions:

"Sum of all grades tested" refers to the grades tested at the particular school. For example, the percent passing for reading in an elementary school with a grade span of K-5 is calculated as follows:

number of students who passed the reading test in grades 3, 4, & 5
divided by
number of students who took the reading test in grades 3, 4, & 5

See the definition for TAKS Panel Recommendation for additional information. For a complete list of standards for each grade and subject see Appendix G.

Other important information:

See also TAKS Participation. (Source: TEA Student Assessment Division)

TAKS Commended Performance: This refers to the highest performance level set by the State Board of Education on the TAKS. Students who achieve Commended Performance have performed at a level that was considerably above the state passing standard and have shown a thorough understanding of the knowledge and skills at the grade level tested. Unlike the Met Standard level, there is no phase-in period for this standard. For more information see TAKS and TAKS Panel Recommendation. Also see Appendix G for a complete list of standards for each grade and subject.

TAKS Met Standard: This refers to the TAKS passing standard. For a detailed explanation, see TAKS Panel Recommendation below.

TAKS Panel Recommendation: This refers to the passing standard for the new TAKS test. In November 2002, the State Board of Education adopted two performance standards for the TAKS: Met Standard (i.e. passing) and Commended Performance (i.e. high performance). The Board adopted these standards based on recommendations from approximately 350 educators and citizens who served on TAKS standard-setting panels. Because the new TAKS is much more challenging than its predecessor, the Texas Assessment of Academic Skills (TAAS), the Board agreed to a transition plan to phase in Met Standard over several years. (Commended Performance has no phase-in period.)

The transition plan uses the standard error of measurement (SEM) to phase in the panel's recommended passing standards over three years. For 2003, the standard is set at two SEM below the panel recommendation. For 2004, for grades 3 through 10*, the passing standard will be one SEM below the panel recommendation, and the passing standards will be fully implemented in 2005 (for grades 3 through 10). In general, this phase-in means that in the first year, students need to correctly answer three to six fewer questions than when the test is fully implemented.

For example, in 2003 third-grade students were required to answer 20 of 36 questions correctly on the English reading exam to meet the passing standard. In 2004, students must answer 22 of 36 questions correctly on that exam. In 2005, when the plan is fully implemented, students will be required to answer 24 of 36 questions correctly.

* There is a one-year delayed phase-in for the grade 11, exit-level TAKS. This is because the grade 10 tests have been built to be predictors of performance on the grade 11 tests. Therefore, the standards in place when students take the grade 10 TAKS must be extended to grade 11 so that for both years those students are required to meet the same passing standard. TAKS Met Standard for the exit-level exam in 2004 will be two SEM below panel recommendation; in 2005 the standard will be one SEM below; and in 2006 it will be at the panel recommendation.

Note that even at the initial phase-in level, the TAKS is a more challenging test for students than the TAAS. Also, unlike the TAAS test, the number of questions a student must answer correctly to pass the TAKS varies by subject and grade.

This year's AEIS reports show the percent of students passing the TAKS at the passing standard for 2002-03 (two SEM below the panel recommendation), by grade and subject. Later in the report, TAKS performance is shown, summed across grades, at three different passing standards: two SEM below the panel recommendation, one SEM below the panel recommendation, and at the panel recommendation. Commended Performance is also shown.

For a complete list of standards for each grade and subject see Appendix G. See also TAKS.
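
As a worked illustration of the phase-in, the sketch below applies the three grade 3 reading cut scores quoted above (20, 22, and 24 correct out of 36) to a small set of invented raw scores:

    # Invented raw scores on a 36-question grade 3 reading test.
    raw_scores = [19, 21, 23, 25, 30, 34]
    cut_scores = {
        "2 SEM below panel recommendation (2003)": 20,
        "1 SEM below panel recommendation (2004)": 22,
        "Panel recommendation (2005)": 24,
    }

    for label, cut in cut_scores.items():
        met = sum(1 for score in raw_scores if score >= cut)
        print(f"{label}: {100.0 * met / len(raw_scores):.1f}% met standard")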

TAKS Participation: Every student enrolled in a Texas public school in grades 3-11 must be given the opportunity to take the TAKS (Texas Assessment of Knowledge and Skills) or the SDAA (State-Developed Alternative Assessment). Although it is the intention to test every student in these grades, there are circumstances under which some students are not tested. Additionally, the performance of some tested students is not reported. The reasons for exclusion are as follows:

The percentages are based as much as possible on the total number of students enrolled at the time of testing. Districts are required to submit a TAKS or SDAA answer document for every student enrolled in grades 3 through 11. Students who take subject tests from different assessments (for example TAKS mathematics and SDAA reading) will have multiple answer documents. The methodology used to create TAKS Participation eliminates, as much as possible, duplicate counts of students resulting from multiple answer documents. Appendix E provides a description for each component of TAKS Participation. (Source: Division of Student Assessment)

TAKS Passing Standard: See TAKS Panel Recommendation.

TASP: The Texas Academic Skills Program measures reading, writing, and mathematics proficiency. It is required of all persons entering Texas public institutions of higher education for the first time. The TASP is administered by the Texas Higher Education Coordinating Board.

Teachers by Ethnicity and Sex: These are counts of teacher FTEs by the major ethnic groups and by sex. Counts are also expressed as a percent of the total teacher FTEs. (Source: PEIMS, Oct. 2002)

Teachers by Highest Degree Held (District Profile only): This shows the distribution of degrees attained by teachers in the district. The FTE counts of teachers with no degree, bachelor's, master's, and doctorate degrees are expressed as a percent of the total teacher FTEs. (Source: PEIMS, Oct. 2002)

Teachers by Program (population served): Teacher FTE counts are categorized by the type of student populations served. Regular education, special education, compensatory education, career and technology education, bilingual/ESL education, gifted and talented education, and miscellaneous other populations served are shown. Teacher FTE values are allocated across population types for teachers who serve multiple population types. Counts are expressed as a percent of total teacher FTEs. (Source: PEIMS, Oct. 2002)

Teachers by Years of Experience (District Profile only): This is the FTE count of teachers with years of professional experience that fall into the ranges shown. Experience in these categories is the total years of experience for the individual, not years of experience in the reporting district or campus. Teacher counts within each range of experience are expressed as a percent of total teacher FTEs. A beginning teacher is a teacher reported with zero years of experience. (Source: PEIMS, Oct. 2002)

TLI (Texas Learning Index): The TLI was used to measure improvement on the former state-administered (TAAS) test. Improvement measures on the TAKS will be developed in 2004 when two years of test results are available.

Total Campus Budget by Function (Campus Profile only): Operating expenditures, by function, are expressed as a percent of the total campus operating budget. Function codes appear in parentheses.

See also Appendix B. (Source: PEIMS, Oct. 2002)

Total Exclusions (District Profile only): These expenditure amounts are omitted from the other financial information presented, in order to provide a more equalized financial picture. Function codes are shown in parentheses following each item.

See also Appendix B. (Source: PEIMS, Oct. 2002)

Total Expenditures by Object (District Profile only): Total budgeted expenditures are grouped into operating and non-operating categories by object of expense. The operating categories are:

The non-operating categories are:

The Special Revenue Funds (including Shared Services Arrangements) and the Capital Projects Funds are not reported to the TEA by districts and so do not appear here. (Source: PEIMS, Oct. 2002)

Total Expenditures for Athletic Programs (District Profile only): Budgeted expenditures for the costs of competitive athletic activities such as football, basketball, golf, swimming, baseball, etc. (program intent code 91). This includes costs associated with coaching as well as sponsors for drill team, cheerleaders, or any other organized activity to support athletics. However, this program intent code does not include expenditures associated with the costs of band. (Source: PEIMS, Oct. 2002)

Total Expenditures for Community Services (District Profile only): Budgeted expenditures for activities or purposes other than regular public education. These are activities relating to the whole community, such as the operation of a school library, swimming pool, and playgrounds for the public (function 61). (Source: PEIMS, Oct. 2002)

Total Operating Expenditures by Function (District Profile only): Operating expenditures by function are expressed as a percent of total operating expenditures. Function codes appear in parentheses.

(Source: PEIMS, Oct. 2002)

Total Revenues (District Profile only): The total for all revenues budgeted in the General Fund (199, including state food services, and fund 420 for charters), the National School Breakfast and Lunch Program (240, 701), and the Debt Service Funds (599). Total Revenues per Pupil is total revenue divided by the total number of students. The Special Revenue Funds (including Shared Services Arrangements) and the Capital Projects Funds are not reported to the TEA by districts and so do not appear here. (Source: PEIMS, Oct. 2002)
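
Expressed as a formula (following the definition above):

\[
\text{Total Revenues per Pupil} = \frac{\text{total budgeted revenues in the funds listed above}}{\text{total number of students}}
\]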

Total Staff: Total staff includes professional staff (teachers, professional support, administrators), educational aides, and (on the district profile) auxiliary staff. Minority staff is the sum of the FTE counts for all non-white staff groups (African American, Hispanic, Asian/Pacific Islander, and Native American). This FTE count is expressed as a percent of the total staff FTE. (Source: PEIMS, Oct. 2002)

Total Students: This is the total number of public school students who were reported in membership on October 26, 2002, at any grade, from early childhood education through grade 12. Membership is a slightly different number from enrollment, because it does not include those students who are served in the district for less than two hours per day. For example, the count of Total Students excludes students who attend a nonpublic school but receive some services, such as speech therapy, for less than two hours per day from their local public school district. (Source: PEIMS, Oct. 2002)

Turnover Rate for Teachers (District Profile only): This shows the total FTE count of teachers who were employed as teachers in the district in the fall of 2001-02 but were not employed as teachers in the district in the fall of 2002-03, divided by the total teacher FTE count for the fall of 2001-02. Social Security numbers of reported teachers from the two fall snapshots are compared to develop this information. Staff who remain employed in the district, but not as teachers, are counted as teacher turnover. (Source: PEIMS, Oct. 2002, Oct. 2001)
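
The rate defined above can be written as:

\[
\text{Teacher Turnover Rate} = \frac{\text{teacher FTEs from fall 2001-02 not employed as teachers in the district in fall 2002-03}}{\text{total teacher FTEs in fall 2001-02}}
\]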


(Click here to download this explanation as a printable PDF.)


Appendix A, Appendix B

(Click here to download Appendix A & B as a printable PDF.)


Appendix C

Advanced Academic Courses

2002-03 Academic Excellence Indicator System

English Language Arts

A3220100 English Language And Composition
A3220200 English Literature And Composition
A3220300 International English Language
I3220300 English III
I3220400 English IV
03221100 Research/Technical Writing
03221200 Creative/Imaginative Writing
03221800 Independent Study In English
03231000 Independent Study/Journalism
03240400 Oral Interpretation III
03240800 Debate III
03241100 Public Speaking III
03241200 Independent Study/Speech
03221600 Humanities
03221500 Literary Genres
03231902 Advanced Broadcast Journalism

Mathematics

A3100101 Calculus AB
A3100102 Calculus BC
A3100200 AP Statistics
I3100100 Mathematical Methods
I3100200 Mathematical Studies
I3100300 Mathematics Higher Level
I3100400 Advanced Mathematics Subsidiary Level
03101100 Pre-Calculus
03102500 Independent Study in Mathematics (1st time)
03102501 Independent Study in Mathematics (2nd time)

Computer Science

A3580100 Computer Science I
A3580200 Computer Science II
I3580200 Computer Science I
I3580300 Computer Science II
03580200 Computer Science I
03580300 Computer Science II

Science

A3010200 Biology
A3020000 Environmental Science
A3040000 Chemistry
A3050001 Physics B
A3050002 Physics C
I3010200 Biology
I3020000 Environmental Systems
I3040001 Chemistry I
I3040002 Chemistry II
I3050001 Physics I
I3050002 Physics II

Social Studies/History

A3310100 Micro Economics
A3310200 Macro Economics
A3330100 United States Government And Politics
A3330200 Comparative Government And Politics
A3340100 United States History
A3340200 European History
A3350100 Psychology
A3360100 Human Geography
A3370100 World History
I3301100 History, Standard Level
I3301200 History: Africa, Higher Level
I3301300 History: Americas, Higher Level
I3301400 History: East And Southeast Asia, Higher Level
I3301500 History: Europe, Higher Level
I3302100 Geography, Standard Level
I3302200 Geography, Higher Level
I3303100 Economics, Standard Level
I3303200 Economics, Higher Level
I3304100 Psychology, Standard Level
I3304200 Psychology, Higher Level
I3366010 Philosophy
I3000100 Theory Of Knowledge
03310301 Economics Advanced Studies
03380001 Social Studies Advanced Studies

Fine Arts

A3150200 Music Theory
A3500100 History Of Art
A3500300 Art/Drawing
I3250200 Music SL
I3250300 Music HL
I3600100 Art/Design HL
I3600200 Art/Design SL-A
I3600300 Art/Design SL-B
I3750200 Theatre Arts SL
I3750300 Theatre Arts HL
03150400 Music IV Band
03150800 Music IV Orchestra
03151200 Music IV Choir
03151600 Music IV Jazz Band
03152000 Music IV Instrumental Ensemble
03152400 Music IV Vocal Ensemble
03250400 Theatre IV
03251000 Theatre Production IV
03251200 Technical Theatre IV
03502300 Art IV Drawing
03502400 Art IV Painting
03502500 Art IV Printmaking
03502600 Art IV Fibers
03502700 Art IV Ceramics
03502800 Art IV Sculpture
03502900 Art IV Jewelry
03503100 Art IV Photography
03503200 Art IV Graphic Design
03503500 Art IV Electronic Media
03830400 Dance IV

Advanced Languages (Modern or Classical)

A3410100 French IV Language
A3410200 French V Literature
A3420100 German IV Language
A3430100 Latin IV (Vergil)
A3430200 Latin V (Catullus-Horace)
A3440100 Spanish IV Language
A3440200 Spanish V Literature
I3120400 Japanese IV
I3120500 Japanese V
I3410400 French IV
I3410500 French V
I3420400 German IV
I3420500 German V
I3430400 Latin IV
I3430500 Latin V
I3440400 Spanish IV
I3440500 Spanish V
I3450400 Russian IV
I3450500 Russian V
I3480400 Hebrew IV
I3480500 Hebrew V
I3490400 Chinese IV
I3490500 Chinese V
I3996000 Other Foreign Language IV
I3996100 Other Foreign Language V
03110400 Arabic IV
03110500 Arabic V
03110600 Arabic VI
03110777 Arabic VII
03120400 Japanese IV
03120500 Japanese V
03120600 Japanese VI
03120777 Japanese VII
03400400 Italian IV
03400500 Italian V
03400600 Italian VI
03400777 Italian VII
03410400 French IV
03410500 French V
03410600 French VI
03410700 French VII
03420400 German IV
03420500 German V
03420600 German VI
03420700 German VII
03430400 Latin IV
03430500 Latin V
03430600 Latin VI
03430777 Latin VII
03440400 Spanish IV
03440500 Spanish V
03440600 Spanish VI
03440700 Spanish VII
03450400 Russian IV
03450500 Russian V
03450600 Russian VI
03450777 Russian VII
03460400 Czech IV
03460500 Czech V
03460600 Czech VI
03460777 Czech VII
03470400 Portuguese IV
03470500 Portuguese V
03470600 Portuguese VI
03470777 Portuguese VII
03480400 Hebrew IV
03480500 Hebrew V
03480600 Hebrew VI
03480777 Hebrew VII
03490400 Chinese IV
03490500 Chinese V
03490600 Chinese VI
03490777 Chinese VII
03980400 American Sign Language IV
03980500 American Sign Language V
03980600 American Sign Language VI
03980700 American Sign Language VII
03996000 Other Foreign Language IV
03996100 Other Foreign Language V
03996200 Other Foreign Language VI
03996300 Other Foreign Language VII

 


Appendix D

(Click here to download Appendix D as a printable PDF.)


Appendix E

(Click here to download Appendix E as a printable PDF.)


Appendix F

Special Education Compliance Status 2003

The Texas Education Code (TEC) requires the Texas Education Agency to determine the special education compliance status (SpECS) of each district and charter in the state on an annual basis. This document explains the methodology the Agency has established for determining the 2003 SpECS of each district and charter. It is important to note that the 2003 SpECS of each district and charter will be based upon information available to the Agency as of July 1, 2003.

The eight SpECS categories for 2003 are defined as follows.

1. Desk Audit: Compliant

This category is assigned to a district or charter if the district or charter does not meet the criteria for any of the following seven categories of SpECS.

2. Desk Audit: Self-Evaluation Pending

a. The district or charter is selected to participate in a modified self-evaluation or CSESER (Comprehensive Special Education Self-Evaluation Review) during the 2003-04 school year based on the Data Analysis System (DAS); or

b. The district or charter participated in a CSESER during the 2002-03 school year, and the Agency has not completed its review of the results of that CSESER as of July 1, 2003.

3. Desk Audit: Site Visit Pending

a. The district or charter is selected to receive a District Effectiveness & Compliance (DEC) on-site visit during the 2003-04 school year based on:

(1) DAS; or

(2) Information obtained from complaints/due process hearings filed with the Agency concerning special education; or

b. The district or charter received a DEC visit during the 2002-03 school year (based on DAS or information obtained from complaints/due process hearings filed with the Agency concerning special education) and the Agency has not finalized the written DEC report relating to such visit as of July 1, 2003.

4. Site-Visit/CSESER: Compliant

a. The district or charter received a DEC visit during the 2002-03 school year and the written report of the visit contained no special education citations; or the district or charter received a DEC visit during the 2001-02 school year, the written report of that visit contained no special education citations, and the district or charter received a 2002 SpECS of Desk Audit: Site Visit Pending because the Agency had not completed and mailed the written DEC report relating to such visit as of June 28, 2002; or

b. The district or charter participated in a CSESER during the 2002-03 school year, and the results of the review of the CSESER have confirmed that no further action is necessary.

5. Site-Visit/CSESER: Corrective Action Compliant

a. The district or charter was involved in the implementation of corrective actions during the 2002-03 school year (based on special education compliance citations noted during one or more on-site monitoring visits conducted by the Agency), and the Agency issued written findings on or before July 1, 2003, that the corrective actions were sufficient to bring the district or charter into compliance with federal and state laws relating to special education; or

b. The district or charter was involved in the implementation of corrective actions during the 2002-03 school year (based on special education compliance citations resulting from a CSESER completed by the district or charter), and the Agency issued written findings on or before July 1, 2003, that the corrective actions were sufficient to bring the district or charter into compliance with federal and state laws relating to special education.

6. Site-Visit/CSESER: Corrective Actions Pending

a. The district or charter was involved in the implementation of corrective actions during the 2002-03 school year (based on special education compliance citations noted during one or more on-site monitoring visits conducted by the Agency), and the corrective actions were under review by the Agency as of July 1, 2003; or

b. The district or charter was involved in the implementation of corrective actions during the 2002-03 school year (based on special education compliance citations resulting from a CSESER completed by the district or charter), and the corrective actions were under review by the Agency as of July 1, 2003.

7. Site-Visit/CSESER: Corrective Actions Unresolved

a. The district or charter was involved in the implementation of corrective actions during the 2002-03 school year (based on special education compliance citations noted during one or more on-site monitoring visits conducted by the Agency), and the Agency has notified the district or charter that the corrective actions are unacceptable or insufficient to bring the district or charter into compliance with federal and state laws relating to special education; or

b. The district or charter was involved in the implementation of corrective actions during the 2002-03 school year (based on special education compliance citations resulting from a CSESER completed by the district or charter), and the Agency has notified the district or charter that the corrective actions are unacceptable or insufficient to bring the district or charter into compliance with federal and state laws relating to special education.

8. Sanctions Imposed

This is the SpECS assigned to each district and charter for which one or more of the sanctions or interventions authorized by state law or rule have been imposed by the Agency (and have not been removed as of July 1, 2003) as a result of issues or concerns relating to the district's or charter's special education program.

If you have questions about the Special Education Compliance Status, please contact the Division of Accountability Development and Support at (512) 463-9716. For a more detailed explanation see the website http://www.tea.state.tx.us/account.eval/specs2003.html. (Source: Division of Accountability Development and Support)


Appendix G

Spring 2003 TAKS Reading (English) Performance Standards

| Grade | Standard | Total Points Possible | Number Correct | Percent Correct |
|---|---|---|---|---|
| Grade 3 | Panel Recommendation | 36 | 24 | 66.7% |
| | One SEM Below | | 22 | 61.1% |
| | Two SEM Below [2003 Standard] | | 20[1] | 55.6% |
| | Commended Performance | | 34[1] | 94.4% |
| Grade 4 | Panel Recommendation | 40 | 27 | 67.5% |
| | One SEM Below | | 25 | 62.5% |
| | Two SEM Below [2003 Standard] | | 23 | 57.5% |
| | Commended Performance | | 38 | 95.0% |
| Grade 5 | Panel Recommendation | 42 | 29 | 69.0% |
| | One SEM Below | | 27 | 64.3% |
| | Two SEM Below [2003 Standard] | | 25 | 59.5% |
| | Commended Performance | | 39 | 92.9% |
| Grade 6 | Panel Recommendation | 42 | 27 | 64.3% |
| | One SEM Below | | 24 | 57.1% |
| | Two SEM Below [2003 Standard] | | 21 | 50.0% |
| | Commended Performance | | 38 | 90.5% |
| Grade 7 | Panel Recommendation | 48 | 33 | 68.8% |
| | One SEM Below | | 30 | 62.5% |
| | Two SEM Below [2003 Standard] | | 27 | 56.3% |
| | Commended Performance | | 45 | 93.8% |
| Grade 8 | Panel Recommendation | 48 | 34 | 70.8% |
| | One SEM Below | | 31 | 64.6% |
| | Two SEM Below [2003 Standard] | | 28 | 58.3% |
| | Commended Performance | | 45 | 93.8% |
| Grade 9 | Panel Recommendation | 42 | 29 | 69.0% |
| | One SEM Below | | 27 | 64.3% |
| | Two SEM Below [2003 Standard] | | 25 | 59.5% |
| | Commended Performance | | 37 | 88.1% |

Spring 2003 TAKS English Language Arts Performance Standards [2]

| Grade | Standard | Total Points Possible | Number Correct | Percent Correct |
|---|---|---|---|---|
| Grade 10 | Panel Recommendation | 73 | 47 | 64.4% |
| | One SEM Below | | 44 | 60.3% |
| | Two SEM Below [2003 Standard] | | 41 | 56.2% |
| | Commended Performance | | 64 | 87.7% |
| Grade 11 | Panel Recommendation | 73 | 43 | 58.9% |
| | One SEM Below | | 40 | 54.8% |
| | Two SEM Below [2003 Standard] | | 37 | 50.7% |
| | Commended Performance | | 63 | 86.3% |

Spring 2003 TAKS Reading (Spanish) Performance Standards

| Grade | Standard | Total Points Possible | Number Correct | Percent Correct |
|---|---|---|---|---|
| Grade 3 | Panel Recommendation | 36 | 23 | 63.9% |
| | One SEM Below | | 21 | 58.3% |
| | Two SEM Below [2003 Standard] | | 19[1] | 52.8% |
| | Commended Performance | | 33[1] | 91.7% |
| Grade 4 | Panel Recommendation | 40 | 25 | 62.5% |
| | One SEM Below | | 22 | 55.0% |
| | Two SEM Below [2003 Standard] | | 19 | 47.5% |
| | Commended Performance | | 36 | 90.0% |
| Grade 5 | Panel Recommendation | 42 | 27 | 64.3% |
| | One SEM Below | | 24 | 57.1% |
| | Two SEM Below [2003 Standard] | | 21 | 50.0% |
| | Commended Performance | | 37 | 88.1% |
| Grade 6 | Panel Recommendation | 42 | 23 | 54.8% |
| | One SEM Below | | 20 | 47.6% |
| | Two SEM Below [2003 Standard] | | 17 | 40.5% |
| | Commended Performance | | 35 | 83.3% |

Spring 2003 TAKS Writing (English) Performance Standards [3]

| Grade | Standard | Total Points Possible | Number Correct | Percent Correct |
|---|---|---|---|---|
| Grade 4 | Panel Recommendation | 32 | 22 | 68.8% |
| | One SEM Below | | 20 | 62.5% |
| | Two SEM Below [2003 Standard] | | 18 | 56.3% |
| | Commended Performance | | 30 | 93.8% |
| Grade 7 | Panel Recommendation | 44 | 28 | 63.6% |
| | One SEM Below | | 26 | 59.1% |
| | Two SEM Below [2003 Standard] | | 24 | 54.5% |
| | Commended Performance | | 40 | 90.9% |

Spring 2003 TAKS Writing (Spanish) Performance Standards [3]

| Grade | Standard | Total Points Possible | Number Correct | Percent Correct |
|---|---|---|---|---|
| Grade 4 | Panel Recommendation | 32 | 20 | 62.5% |
| | One SEM Below | | 18 | 56.3% |
| | Two SEM Below [2003 Standard] | | 16 | 50.0% |
| | Commended Performance | | 28 | 87.5% |


Spring 2003 TAKS Mathematics (English) Performance Standards

| Grade | Standard | Total Points Possible | Number Correct | Percent Correct |
|---|---|---|---|---|
| Grade 3 | Panel Recommendation | 40 | 27 | 67.5% |
| | One SEM Below | | 24 | 60.0% |
| | Two SEM Below [2003 Standard] | | 21 | 52.5% |
| | Commended Performance | | 37 | 92.5% |
| Grade 4 | Panel Recommendation | 42 | 28 | 66.7% |
| | One SEM Below | | 25 | 59.5% |
| | Two SEM Below [2003 Standard] | | 22 | 52.4% |
| | Commended Performance | | 39 | 92.9% |
| Grade 5 | Panel Recommendation | 44 | 30 | 68.2% |
| | One SEM Below | | 27 | 61.4% |
| | Two SEM Below [2003 Standard] | | 24 | 54.5% |
| | Commended Performance | | 40 | 90.9% |
| Grade 6 | Panel Recommendation | 46 | 29 | 63.0% |
| | One SEM Below | | 26 | 56.5% |
| | Two SEM Below [2003 Standard] | | 23 | 50.0% |
| | Commended Performance | | 41 | 89.1% |
| Grade 7 | Panel Recommendation | 48 | 28 | 58.3% |
| | One SEM Below | | 25 | 52.1% |
| | Two SEM Below [2003 Standard] | | 22 | 45.8% |
| | Commended Performance | | 44 | 91.7% |
| Grade 8 | Panel Recommendation | 50 | 30 | 60.0% |
| | One SEM Below | | 27 | 54.0% |
| | Two SEM Below [2003 Standard] | | 24 | 48.0% |
| | Commended Performance | | 45 | 90.0% |
| Grade 9 | Panel Recommendation | 52 | 31 | 59.6% |
| | One SEM Below | | 28 | 53.8% |
| | Two SEM Below [2003 Standard] | | 25 | 48.1% |
| | Commended Performance | | 45 | 86.5% |
| Grade 10 | Panel Recommendation | 56 | 33 | 58.9% |
| | One SEM Below | | 29 | 51.8% |
| | Two SEM Below [2003 Standard] | | 25 | 44.6% |
| | Commended Performance | | 51 | 91.1% |
| Grade 11 | Panel Recommendation | 60 | 33 | 55.0% |
| | One SEM Below | | 29 | 48.3% |
| | Two SEM Below [2003 Standard] | | 25 | 41.7% |
| | Commended Performance | | 54 | 90.0% |

Spring 2003 TAKS Mathematics (Spanish) Performance Standards

| Grade | Standard | Total Points Possible | Number Correct | Percent Correct |
|---|---|---|---|---|
| Grade 3 | Panel Recommendation | 40 | 27 | 67.5% |
| | One SEM Below | | 24 | 60.0% |
| | Two SEM Below [2003 Standard] | | 21 | 52.5% |
| | Commended Performance | | 37 | 92.5% |
| Grade 4 | Panel Recommendation | 42 | 28 | 66.7% |
| | One SEM Below | | 25 | 59.5% |
| | Two SEM Below [2003 Standard] | | 22 | 52.4% |
| | Commended Performance | | 37 | 88.1% |
| Grade 5 | Panel Recommendation | 44 | 30 | 68.2% |
| | One SEM Below | | 27 | 61.4% |
| | Two SEM Below [2003 Standard] | | 24 | 54.5% |
| | Commended Performance | | 39 | 88.6% |
| Grade 6 | Panel Recommendation | 46 | 29 | 63.0% |
| | One SEM Below | | 26 | 56.5% |
| | Two SEM Below [2003 Standard] | | 23 | 50.0% |
| | Commended Performance | | 40 | 87.0% |

Spring 2003 TAKS Social Studies Performance Standards

| Grade | Standard | Total Points Possible | Number Correct | Percent Correct |
|---|---|---|---|---|
| Grade 8 | Panel Recommendation | 48 | 25 | 52.1% |
| | One SEM Below | | 22 | 45.8% |
| | Two SEM Below [2003 Standard] | | 19 | 39.6% |
| | Commended Performance | | 42 | 87.5% |
| Grade 10 | Panel Recommendation | 50 | 29 | 58.0% |
| | One SEM Below | | 26 | 52.0% |
| | Two SEM Below [2003 Standard] | | 23 | 46.0% |
| | Commended Performance | | 45 | 90.0% |
| Grade 11 | Panel Recommendation | 55 | 28 | 50.9% |
| | One SEM Below | | 25 | 45.5% |
| | Two SEM Below [2003 Standard] | | 22 | 40.0% |
| | Commended Performance | | 49 | 89.1% |

Spring 2003 TAKS Science (English) Performance Standards

| Grade | Standard | Total Points Possible | Number Correct | Percent Correct |
|---|---|---|---|---|
| Grade 5 | Panel Recommendation | 40 | 30 | 75.0% |
| | One SEM Below | | 27 | 67.5% |
| | Two SEM Below [2003 Standard] | | 24 | 60.0% |
| | Commended Performance | | 37 | 92.5% |
| Grade 10 | Panel Recommendation | 55 | 35 | 63.6% |
| | One SEM Below | | 31 | 56.4% |
| | Two SEM Below [2003 Standard] | | 27 | 49.1% |
| | Commended Performance | | 50 | 90.9% |
| Grade 11 | Panel Recommendation | 55 | 30 | 54.5% |
| | One SEM Below | | 27 | 49.1% |
| | Two SEM Below [2003 Standard] | | 24 | 43.6% |
| | Commended Performance | | 50 | 90.9% |

Spring 2003 TAKS Science (Spanish) Performance Standards

| Grade | Standard | Total Points Possible | Number Correct | Percent Correct |
|---|---|---|---|---|
| Grade 5 | Panel Recommendation | 40 | 30 | 75.0% |
| | One SEM Below | | 27 | 67.5% |
| | Two SEM Below [2003 Standard] | | 24 | 60.0% |
| | Commended Performance | | 37 | 92.5% |

The numbers and percents shown in these tables are based on the first administration of the spring 2003 TAKS. They should not be used to anticipate the exact number or percent correct required to meet a standard or achieve Commended Performance on future test administrations, because the required numbers may differ slightly from those shown above to ensure that equivalent standards are maintained for each TAKS administration.
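
As a reading aid for these tables, the percent-correct column is simply the number correct divided by the total points possible. For example, the grade 3 reading (English) panel recommendation requires 24 of 36 items:

\[
\frac{24}{36} \approx 0.667 = 66.7\%.
\]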


[1] March 2003 Grade 3 Reading TAKS standards.
[2] An essay rating of 2 or higher is required for Met Standard on the English Language Arts tests.
[3] An essay rating of 2 or higher is required for Met Standard and an essay rating of 3 or higher is required for Commended Performance on the grades 4 and 7 writing tests.
