Adding It All Up

What does it really mean to pass the TAAS?

Even with the generally buoyant, election-stoked mood about public education in Texas, you don't have to look far to find the bad news. No farther than the comfortable north Houston living room of Larry and Stephanie Johnson on a Wednesday night.

Ever since Stephanie's teenage son, Stecil, died in a car wreck in 1997, 30 or so of his friends have gathered each week to talk about their lives. Most sit in a big circle, casually flopping against one another. One young woman has brought along her daughter, a toddler who leans back on the couch and falls asleep. Crisp athletic gear and Afros rival preppy, tucked-in T-shirts and fades. The mood of the group has changed over time, lightening from the nihilistic adolescent depression after Stecil's death, when his friends were convinced they would share his fate.

The topic of the night is education. High school senior Stayve Thomas, an outgoing rapper who says he considers it his responsibility to make school "entertaining for [his] peers," has brought in a friend who dropped out with only four months to go. Stayve's goal is to enlist the members of STECIL (the name was converted into an acronym for "Strengthened Through Education -- Community Inspired Leaders") to help him convince Komesha Mitchell to go back to school. If she does, he says, he'll marry her.

Neither his promise nor the conversation appears to sway Komesha, who has enrolled in GED classes and says, "School just wasn't for me." For some of these mostly black teens, the question about high school comes down to this: Do they go, or do they drop out? Only 50 percent of African-Americans in the state who enter ninth grade get a diploma four years later, and that number hasn't changed as TAAS scores have, to use HISD's term, "soared."

The group talks about favorite teachers and not-so-favorites, such as the coach whose idea of a history class was showing films every day. One student, a football player headed for a scholarship at Michigan State, mentions almost off-handedly that he was permitted to cheat on tests because as an athlete he had "pull" with teachers and administrators. He expects college to be easy, he says, because if you're a football player "all you have to do is be in class."

Those who have already made it to college have a different story: Their first semester came as a shock. All of them felt ill prepared for upper-level work. Asked if they felt high school administrators cared for them, one says, "They care about us passing TAAS and getting out of there."

Another chimes in, "They care about us passing TAAS so they can look a little good."

Across the state, the headlines are so regular as to form a refrain: TAAS scores are up; TAAS scores are up; TAAS scores are up. The Texas Assessment of Academic Skills, which tests primarily reading, math and writing, is the way Texas measures its education system. It is given in grades three through eight and again in high school, comes in English and Spanish versions, and sometimes -- to the learning disabled -- is even administered orally. A school's passing rate on the TAAS is the primary way parents, districts and the state can tell how well the school is doing; it's the main factor in a school's rating of "exemplary," "recognized," "acceptable" or "low-performing." Since 1994, statewide passing rates have risen from 60 to 84 percent in math, 76 to 87 percent in reading and 79 to 87 percent in writing. The Houston Independent School District's scores have increased even more than the state's.

The state spends more than $20 million a year to give the TAAS test, which is considered the linchpin of the state's accountability system. A growing number of states are struggling to implement good accountability systems, which have several components: clear curriculum standards; school report cards or ratings; sanctions for poor performance; and assessment, a way to measure whether schools are meeting standards. Texas's accountability system is considered advanced compared to those of most states and rates well in independent reviews.

Yet not much scrutiny has been given to the assessment component, the TAAS itself. In fact, with all the hoopla over the passing rate, the test is almost overlooked -- not surprising, considering that the passing rate is an easy-to-grasp number, while getting at the heart of the TAAS means plowing through countless studies, stats and technical data.

What does it mean to pass the TAAS? In practical terms, except for the fact that a student must pass it to get his high school diploma, nothing. It doesn't mean a student can get into college. It doesn't mean a student can do high school math. All it means, if outside assessments are to be believed, is that a student can pass a pretty easy test.

"Fred poured an eight-ounce glass of juice from a full quart pitcher. How many ounces of juice were left in the pitcher?"

This TAAS question requires only simple subtraction, but if a student doesn't know how many ounces are in a quart, he might be stymied. No problem. Just flip back a few pages and find the weights and measurements conversion chart, which makes things a little easier, although the student still has to convert the quart into ounces before subtracting. But by what grade should a student know how to answer this question?
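For the record -- and assuming the chart's standard conversion of 32 fluid ounces to the quart -- the whole problem reduces to one line:

\[
1\ \text{quart} = 32\ \text{ounces}, \qquad 32 - 8 = 24\ \text{ounces left in the pitcher}.
\]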

The state thinks students should master subtraction, with the help of a conversion chart, by the time they take the exit-level TAAS test, the one kids have to pass before they graduate from high school. This is the same test that asks students to estimate the length of a pencil: Is it 19 millimeters, 19 centimeters or 1.9 meters? It's the same test that Texas Education Agency Commissioner Mike Moses uses to refute charges that the TAAS test is too easy -- "Let's take the exit-level math test," Moses says. "I think you will be surprised."

The test walks students through more challenging questions, such as "Which ordered pair is the point of intersection of the lines y = 2x + 1 and 2y = x - 4?" On an algebra test, a student might have to graph those equations. On the TAAS, the lines are already graphed. All the student has to do is look at the graph and count over two and down three, to where the lines intersect. Still, there are more difficult problems: One asks students to find the lateral surface area of a cylinder. Most people might find that one tough, but once again the problem isn't as daunting as it could be; the necessary formulas are in the front of the booklet.
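For comparison, here is the short bit of algebra that an algebra class -- but not the TAAS -- would actually demand:

\[
2(2x + 1) = x - 4 \;\Rightarrow\; 4x + 2 = x - 4 \;\Rightarrow\; x = -2, \qquad y = 2(-2) + 1 = -3,
\]

so the lines meet at (-2, -3), the same point a TAAS taker can read straight off the printed graph by counting over two and down three. The cylinder problem, likewise, comes down to plugging numbers into the lateral surface area formula, A = 2πrh, one of the formulas supplied in the front of the booklet.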

If these questions don't seem like high school-level material, it's because they're not. Although TEA officials insist the exit-level math test queries students on "early high school" skills, according to the agency's own specifications it covers only material through the eighth-grade curriculum.

By contrast, a 12-year-old in Japan has to answer questions like this: "Jenny wanted to purchase two dozen pencils and a pen. Those items cost $8.45, and she did not have enough money. So she decided to purchase eight fewer pencils and paid $6.05. How much was a pen?"
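That problem is a small system of two equations. Writing p for the price of a pencil and q for the price of the pen, the work runs:

\[
24p + q = 8.45, \quad 16p + q = 6.05 \;\Rightarrow\; 8p = 2.40 \;\Rightarrow\; p = 0.30, \quad q = 8.45 - 24(0.30) = 1.25.
\]

The pen costs $1.25 -- two unknowns and two equations, expected of a Japanese 12-year-old.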

Concerned about studies showing that American students can't compete internationally, the American Federation of Teachers (AFT) looked at five American eighth-grade math tests: three widely used commercial tests and two state tests, including the TAAS. According to a panel of teachers and education experts, the TAAS had the lowest expectations for students, with all but two questions rated "easy" on a scale of easy, medium and hard (the other two were "medium"). Though none of the other tests rated well, the TAAS is the only one of the five that is all multiple choice.

Another report, this one funded by the Texas Education Agency and written by Temple Independent School District Director of Math and Science Kathleen Coburn, notes that 71 percent of questions on the exit-level TAAS cover material from the fifth-, sixth- and seventh-grade levels. Coburn says a generation of downgraded expectations in math has created chronically underprepared teachers. "Now nobody even knows what mathematics are. They think it's just computation. ...It just shows we've got a huge conceptual problem. There's something rotten. Something's very wrong."

Even more damning for the TAAS, a recent, more thorough independent review found that the reading portion of the test for fourth, eighth and tenth grades had gotten easier over time -- which, if true, would render the state's rising scores meaningless. Harvard researcher Sandra Stotsky noted that the total number of words, the length of the passages and the number of unusual words (words not found on a standard list of 3,000 common words) had declined dramatically since 1995. That year, Stotsky says, the difficulty of the passages was evenly distributed above and below the fourth-grade level. In 1998, however, there were more passages below grade level than at grade level, and none above.

Over that period, the fourth-grade passing rate rose from 79 to 89 percent.

A similar study poked holes in the math portion of the TAAS. Although the TEA says firmly that all TAAS questions are "on grade level," researchers Paul Clopton, Wayne Bishop and David Klein said questions on the high school TAAS ranged from third to seventh grade, with most falling at the sixth-grade level.

The researchers said while the math test stayed at the same level of difficulty from year to year, it "focused on raising achievement only to a minimal level," a level "not consistent with the high expectations for mathematics achievement that are being called for from one end of the country to another."

For a school to be rated "acceptable," only 45 percent of its students have to pass the TAAS, though that number increases by 5 percentage points each year.

"By [the TEA's] own standards," says George Scott, president of the watchdog group that commissioned the two studies, "thousands and thousands of kids don't measure up. This is a vicious cycle of distortion that makes people believe these 'acceptable,' these 'recognized,' these 'exemplary' tags really mean something."

Every government institution has a critic like George Scott. As president of the Tax Research Association, he's adversarial, he's impolitic, he splits hairs and tracks inconsistency with an eagle eye. He's fed up, persistent, and he demands data and explanations. The game Scott plays with the Texas Education Agency is like schoolyard tetherball: They spin data one way; he spins it the other.

Defenders of the state's accountability system call Scott -- off the record -- an obsessed, publicity-hungry man whose idea of constructive criticism is to call a press conference and release his latest study. Scott, who resigned from the Commissioner's Accountability Advisory Group in a huff after the agency tried to block him from getting some data due to pending litigation, maintains that the public has the right to know exactly what the much-trumpeted gains in education really mean.

John Stevens, executive director of the Texas Business and Education Coalition, hotly defends the education agency from attackers like Scott. "The people doing this are not charlatans and manipulators. I know they are insulted by accusations that they are simply manipulating the system and these do not reflect real gains," Stevens says. "These are good and honorable professionals who are doing a good job for Texas. ...I think it's unfair for [Commissioner Mike Moses's] leadership and their work to be denigrated by people whom I believe have an ideological and political agenda to discredit public schools. It's not fair."

The state's defenders are quick to accuse critics of being anti-public-school, a blanket defense that rebuffs earnest critics and snipers alike. In the TEA worldview, it seems, critics either 1) are out to dismantle the free world, beginning with public education, or 2) have no idea what they're talking about.

For his part, Scott angrily insists he's always been on the side of public education. In a 1996 letter to Moses, Scott warned of his fear that public education's greatest risk was being "cannibalized" by school vouchers. Scott's allies include pro-public-school, if critical, minority leaders such as Roy Malonson, chairman of the Acres Homes Chamber of Commerce. Scott has weighed in on virtually every school issue, from bonds to administrative spending to TAAS.

Scott's aggressive stance gets more media attention than it does results from the education establishment. Newspaper editorials laud Scott's efforts, while educators sniff at his "methodology."

For example, last school year the Houston Independent School District began administering the Stanford, a commercial standardized test, in an effort to complement the data provided by the TAAS. Scott and the Tax Research Association saw this as an opportunity to get an outside measure of how well the state accountability system was working. He looked to see whether kids at schools rated "exemplary" had scored at or above grade level on the Stanford. He found that 38 to 83 percent, depending on the grade, had not. Schools rated "recognized" or "acceptable" fared even worse.

It seems like a fairly simple comparison, but HISD spokesman Terry Abbott complained to the Houston Chronicle that Scott's report was "not fair," and the district asked a group of University of Houston sociologists to take Scott's TRA study apart. They released a report saying Scott had made "inappropriate methodological assumptions," but they skirted the issue of his actual findings.

The researchers looked at each student who took both tests and concluded that one test was a pretty good predictor of performance on the other, which would seem to actually lend support to Scott's comparison. But the researchers did not say how well kids passing the TAAS actually performed on the Stanford.

"They were just saying, 'Nasty George, he shouldn't be saying our schools aren't as good as our schools say they are,' " says independent testing contractor Carl Shaw, who has served as director of testing for the Houston and Fort Bend independent school districts and on numerous Texas Education Agency committees on assessment.

As for the TRA studies, the TEA released a written rejoinder so poorly argued that the agency appears to have willfully misunderstood them. The agency called the researchers' statements that test questions are below grade level "inherently opinion," despite the fact that students' actual performance on the questions tracked the researchers' grade-level ratings; in other words, more students correctly answered questions the researchers said were at the third-grade level than questions they said were at the sixth-grade level.

Calling the studies' authors "novices" in the testing field and arguing that "since none of these hired reviewers are from Texas, they cannot be familiar with the daily workings of the statewide assessment program," the unsigned TEA rejoinder accuses TRA's researchers of offering "inaccurate, unsubstantiated and controversial opinions regarding the quality of the TAAS."

But the authors' resumes don't bring the word "novice" to mind. Stotsky, who wrote the reading study, has helped develop the Massachusetts English language assessment, has evaluated education programs for NASA and has ranked state reading standards for the Fordham Foundation's national survey. As for the math study's authors, David Klein and Wayne Bishop are both math professors at California State University, and both have served on numerous California education committees. Paul Clopton, a biomedical research statistician, founded the math education advocacy group Mathematically Correct (of which Klein and Bishop are members) and serves on the panel that is developing California's statewide math test.

In an attempt to prove that the researchers have some menacing "agenda," the TEA quoted from the Mathematically Correct web site, saying the organization is "devoted to the concerns raised by parents and scientists about the invasion of our schools ... and the need to restore basic skills to math education."

The agency's implication is clear: Mathematically Correct is an anti-public-school organization concerned about government control of what children learn. Yet log on to the web site, and it's easy to see what the agency left out: The organization is mostly concerned about "the invasion of our schools by the New-New Math," referring to a trend toward what some consider to be the dumbing-down of math curricula.

As for the reading study, the education agency again quibbles over methodology. But it doesn't argue with Stotsky's finding that the reading passages have gotten easier; instead, the agency says, the test's difficulty has stayed the same because the test questions have gotten harder.

Asked if she found any valid information in the two Tax Research Association studies, TEA's Associate Commissioner for Curriculum, Assessment and Technology Ann Smisko replied, "No, not really. The analysis is not as we would have analyzed the test."

The agency is not the only defender of the TAAS. Although educators initially fought against accountability, many have come on board rather than miss the train altogether. As the screws tighten -- this year, for example, the state will include special ed and Spanish TAAS scores in the accountability ratings -- educators don't want to see the TAAS toughen up, especially since they're still struggling to get kids to pass. "There is nothing wrong with that test," says Houston Federation of Teachers President Gayle Fallon. "It's a minimum-skills test."

Confronted by critiques of the TAAS, the Texas Education Agency reacts not by reexamining the test, but by circling the wagons. Smisko says she has not seen the American Federation of Teachers study of the TAAS math test. Told that it rated TAAS the easiest of five tests, she chuckles and says, "That's the opinion of a panel. We have teacher committees that look at every single item. ...We have more than 6,000 educators participating in the item development process."

The TAAS has been sold to the public as a criterion-referenced test, which means it tests a defined set of skills and knowledge that the Texas Board of Education has decided it wants kids to have. Kids don't get a percentile ranking telling them how well they did compared to everyone else who took the test, as they would on a national standardized test such as the Stanford, because that's not the point.

However, most people don't realize that the TAAS is not based solely on what kids ought to be able to do; it's calibrated to what they already can do. Each potential TAAS question is field-tested, and the ones that are too hard, too easy or biased (if significantly more boys get it right than girls, for example) are thrown out. According to Temple ISD's Coburn, the actual test is constructed so that if the kids who answered the field questions were taking it, a certain percentage of them would pass: 75 percent on the reading test and 60 percent on the math test. Which is why a statewide overall passing rate of 72 percent should come as no surprise.

The practice of basing tests on how well students do is widely accepted by testing experts, and it makes a certain amount of sense, particularly if, as in Texas, the questions are based on the state standards to begin with. But the practice is not without its critics, who say it tethers student achievement to old levels. Matthew Gandal, director of standards and assessment at Achieve Inc., a Cambridge-based nonprofit organization that reviews state accountability systems, says that while states must avoid making tests so difficult that few can pass, "it does seem counterintuitive at a time when we're trying to raise our standards higher than they've ever been before -- to a level that kids in other countries have reached -- to strip our tests of questions that kids can't answer correctly simply because kids can't answer them correctly."

Alicia Ruffin is a serious young black woman on the cusp of 20, though she still wears the sassy overalls of a schoolkid. Ruffin says she made A's and B's when she attended Klein Forest High and passed the TAAS test with ease.

But when she started at the University of Houston two semesters ago, everything got harder. Because she didn't do well enough on the Texas Academic Skills Program, the test students have to pass before entering a state college or university, Ruffin had to take a remedial math course. Her GPA dropped to a 2.6, she says.

"I was really shocked," says Ruffin. "I kind of felt like I wasn't college material. I didn't know if I was smart enough."

Ruffin's situation is a perfect example of the difference between what people think the TAAS indicates and what it actually says. Using one of his favorite theatrical expressions, Scott says, "You can pass the TAAS test and still need a Hubble Telescope to even be able to see college."

Although a 70 -- out of 92 -- is a passing score, you have to score an 80 before the Texas Education Agency says you have even a 75 percent chance of passing the public university entrance exam. You have to score an 85 to be exempt from taking the entrance exam.

The fact that the passing bar is set so low does a particular disservice to minority students like Ruffin, because the focus is on how many pass, rather than how many score high. Part of the accountability system's job is to help close "the equity gap," the chronic disparity in scores between whites and the economically advantaged on the one hand, and everybody else on the other.

Blacks and Hispanics are gaining on whites, if you look at what percentage are passing. But if you look at what percentage of students are scoring above 85 on the TAAS, according to charts Scott created using TEA data, the gains for blacks and Hispanics haven't been as large: Forty-two percent of whites scored above 85 in 1998, compared with 20 percent of Hispanics and 15 percent of blacks. "Yes, more students are passing the TAAS test," Scott says. "But let's look at closure of the equity gap in a more meaningful way."

After the now-famous Edgewood case that forced the state to equalize funds to rich and poor districts, the Texas Legislature defined seven public education goals, one of which was that "the achievement gap between educationally disadvantaged students and other populations will be closed." That was in 1984. By the time the state defined its educational goals for the new millennium, "Goals 2000," the standard had changed; now the equity gap would simply "decrease." This is one of those subtle details that Scott picks up on. It's evidence, he says, that the state is all but forsaking the bottom rung of students.

The Texas Education Agency proudly touts the narrowing of the equity gap, but has it looked at the disparities at the top achievement levels? Apparently officials there didn't see what Scott sees. "We've taken a look at those numbers," Smisko says. "I do not recall that there's a wider gap."

The TAAS's lack of rigor and the low number of students who make top marks have serious financial and personal ramifications down the line, particularly when students try to get advanced degrees. Stevens, executive director of the Texas Business and Education Coalition, says that while the TAAS paints a good portrait of elementary kids, it loses students by the time they get to high school, a problem that could be fixed by Moses's proposal to add tests in ninth and 11th grade.

Stevens points out that Texas's flagship universities such as the University of Texas at Austin and Texas A&M can't keep up with their cousins in other states when it comes to retaining students and granting diplomas. "[The graduation rate is] dramatically lower," Stevens says. "Now that's something that ought to concern us."

In the two-year budget cycle ending in 1999, the state's public colleges and universities spent $172 million on remedial education for people who could not pass the entrance exam, says State Higher Education Coordinating Board spokesman Ray Grasshoff. That's nearly four and a half times the $38.6 million spent in the two-year budget cycle ending in 1989. Remediation is far more expensive, and less effective, than early intervention, which is why Governor Bush's education plan emphasizes reading in the early grades. Of the class that entered Texas colleges in 1989, according to a study by conservative public policy analyst Jeff Judson, 53.6 percent required remediation and, of those, only 4.9 percent had received a degree after six years. Grasshoff says the proportion of students needing remedial help (some of whom are returning adult students) has hovered around 50 percent, noting that the number of hours of remediation the state provides is no longer increasing dramatically.

Judson also notes that the remediation problem is not unique to Texas; nationwide, 90 percent of students entering urban community colleges require remediation. This failure to prepare students for the wider world translates into figures such as this one cited by Judson: According to the National Center for Education Statistics, one in three San Antonians isn't literate enough to fill out a job application.

As TAAS scores rise, so does public opinion. Texas education, apparently, has climbed out of the dry hole it drilled itself into during the '80s. Commissioner Moses lauds teachers and students for pulling through the dark times. There's still a long way to go, of course, but the TEA takes pride in what it has wrought.

Yet since the beginning of the TAAS in the early '90s, educators have been looking for some indication -- other than the word of the Texas Education Agency -- that rising TAAS scores mean better-educated Texans. The proof is equivocal at best. On some national tests, Texas's scores have even gone down.

Pat Porter, the TEA's deputy director of assessment, says there are two independent validations of Texas's gains: the National Assessment of Educational Progress (NAEP, pronounced "nape"), a group of tests that track a statistical sampling of students in 48 participating states (in 1990 only 40 were participating); and the commercial norm-referenced tests that the state gives to a sample of students every few years.

Porter points with pride to a November 1998 report, in which the National Education Goals Panel examined Texas and North Carolina, the two states with the largest average gain on the NAEP reading and math tests between 1990 and 1996. The report applauds both states for their highly developed accountability systems, both of which, the report says, were the result of sustained involvement by the states' business communities.

Texas's minority students in particular made an impressive showing on the NAEP math tests: The state is No. 1 among states with similar demographics, including New York and California. And although Texas ranks 20th overall in eighth-grade math, its black and Hispanic students are sixth and ninth, respectively.

But what of subjects the TAAS doesn't cover? In science, which is tested only on the eighth-grade TAAS and is not included in state accountability ratings, the NAEP scores plummet: Texas ranks 28th of 30 states, and its Hispanic students rank 21st of the 24 states that provided a representative Hispanic sample, lending some credence to critics who say subjects tested by the TAAS are emphasized to the exclusion of others.

NAEP reading and math figures lose some of their luster under close examination. First of all, the test sample is small, about 2,000 to 3,000 students per test. Second, national scores over time haven't improved all that much. Between 1992 and 1996, Texas fourth-graders jumped 11 points in math, on a 500-point scale. That's about the equivalent of two points on a 100-point scale. The gain bumped Texas from 19th to sixth in the country.

In fourth-grade reading, Texas's average score on the NAEP actually dropped one point, from 213 to 212 points out of 500, between 1992 and 1994, the only two years that test was given.

"You have to break it down," says the Texas Business and Education Coalition's Stevens, explaining that the largest gains on the NAEP can be seen by ethnic group. "The overall result is lower because Texas is getting much more diverse." In other words, since blacks and Hispanics score lower on average than whites and Asians, a growing population of blacks and Hispanics means a lower overall average, even if the scores for those particular ethnic groups go up. Stevens's contention carries in math, but not in reading: Black and Hispanic scores in reading went down between 1992 and 1994, a period which covers only the very beginning of the accountability system.

As for the commercial standardized test Texas gives, there's not much comparative data, since the Legislature has frequently changed its mind about which test should be given. For 1995 and 1996, the state sampled 12,500 students in each grade (three through eight and ten) and used the seventh edition of the Metropolitan Achievement Test (MAT7). In 2000, the state will give the MAT7 and will continue to do so every three years, unless the Legislature changes the law again.

In reading, Texas's average percentile ranking on the MAT7 lingers slightly below the national average, and between 1995 and 1996 most grades gained one to two percentile points.

In math, the news was not as good: Although scores hovered above the national average, most grades' MAT7 percentile rankings declined by one to four points, yet TAAS passing rates went up about five points.

So on the NAEP, math is up and reading is down. On the MAT7, reading is up and math is down. On the TAAS, both math and reading are up.

Is that proof that the TAAS is valid? The TEA's Pat Porter says the TAAS is more sensitive to instruction than norm-referenced tests and children are more highly motivated to perform well on it. Gains on the TAAS, she says, "are validated by our gains on the NAEP, and our students are at the national average on a norm-referenced test."

Shaw says the fact that the TAAS is calibrated to instruction makes it easier to beat. "Every time you give a test it loses part of its validity, because what you're trying to do is see what the curriculum is doing in this area without influence of the test. And everything a teacher learns about the test, they teach more directly to the content ... So we don't know from the TAAS itself how effective our curriculum is."

To Shaw, celebrating the state's gains on the TAAS is a way of sidestepping the point: "Why is anybody missing any of these questions?" Shaw asks. "We've gotten very little bang for our buck on TAAS improvement. We ought to be hanging our head in shame."

Non-TAAS indicators of educational health aren't so hot. While eighth-grade passing rates on the TAAS went from 68 to 83 percent between 1996 and 1998, a gain of 15 points, the percentage of students passing the end-of-course algebra test (usually taken in eighth or ninth grade) increased only 11 points, to a still-dismal 39 percent -- and that on a test that combines prealgebra with algebra.

And while the state says the dropout rate is 9.1 percent, the so-called "on-time graduation rate" tells a different story: About 60 percent of the students who enroll in ninth grade get diplomas four years later, and that number declined slightly from 1995 to 1997. For blacks and Hispanics, the figures are worse: Less than half get diplomas after four years. And because so many students -- 21 percent -- disappear after ninth grade, they never even take the exit-level TAAS.

The one organization dedicated to helping states evaluate their assessments hasn't attracted much attention from Texas. Achieve Inc., founded by state governors after the 1996 National Education Summit, was set up to provide comparability and continuity among states in the growing national trend toward high-stakes accountability systems and testing. The organization can conduct a "rigorous review" of any state's accountability system and helps states compare their tests to others nationally and internationally.

But Achieve's Director of Standards and Assessment Matthew Gandal says that, although there's "been some interest" from Texas, the state hasn't sent anyone to either of Achieve's annual meetings (out of 25 states invited, 21 are participating), nor has it asked for a one-on-one review.

"From our perspective, Texas is the model of the nation for standards and accountability," says Linda Edwards, a spokesperson for Governor Bush. "And Achieve is working to get other states to do what we've already been doing in Texas. So, in a sense, Texas is a model for the efforts of Achieve."

Yet Stevens says he hopes Texas will avail itself of Achieve's ability to do comparative analysis in the future, adding that it troubled him when Texas didn't look at what other states had done before coming up with its own curriculum standards (Texas standards rank third in the nation, according to the Fordham Foundation, and got a grade of "B").

Despite the agency's squeamish reaction to the American Federation of Teachers report, George Scott's studies and other outside reviews of the TAAS, Smisko says she'll take another look at Achieve. "Obviously, with our state progressing so well in testing and kids' performance, we believe we need to move on ... We welcome an objective analysis," she says, putting an emphasis on "objective."

The TAAS, which was originally intended to measure student performance, became a tool for measuring the success of educators and schools when the Legislature passed the education reform bill in 1993. Like any government-mandated test, it has always been, as Governor Bush put it ever so mildly when he recently announced changes to his TAAS-based plan to end social promotion, "controversial." The National Association for the Advancement of Colored People has called the test racially biased, and the Mexican-American Legal Defense Fund is suing the state for the same reason. Parents have complained that the test is inappropriate, immoral and undermines parental authority. In short, it's a miracle the test exists at all.

But since the oil crash of the late '80s, when it became painfully obvious that the state's economy would have to diversify to survive, companies that do business in Texas have taken a serious interest in the state's education system. Organizations such as the reform-minded Texas Business and Education Coalition, founded by large companies in 1989, and the Governor's Business Council, a group of 100 CEOs started by former governor Ann Richards, have been the major architects of the accountability system.

These groups have the leverage to change the system -- TBEC initiated Moses's proposal to add ninth- and 11th-grade tests, Stevens says -- and they're not satisfied with it yet.

In fact, the changes Stevens has on his agenda echo what George Scott has been saying for years. Some are already in motion: This year the test scores of special education students and of those who take the TAAS in Spanish will be counted as part of the accountability system, which is supposed to end the dubious -- if legal -- practice some schools have of exempting large numbers of students by designating them special ed or not proficient in English. By the end of next year, the TAAS will be brought in line with Texas's new curriculum, the Texas Essential Knowledge and Skills, which schools began using this year and which experts say is richer and more rigorous than the previous standards. And Moses wants to include science and social studies on the new 11th-grade test.

Furthermore, Stevens says teachers and schools might be more careful about the test's validity -- in other words, less likely to cheat -- if they had scores back soon enough to use them diagnostically in their own classes instead of just as a way to determine school ratings.

"Essentially, this thing is working, in its broadest way," says Stevens, "and the flaws and the problems need to be kept in perspective. I guess that's the concern I have with some of the critics. Their sense of proportion is wrong."

Yet Stevens sounds for all the world like George Scott when he says he'd like to see the number of students that score 85 or higher on the TAAS made part of the accountability system, particularly for schools with exemplary ratings. "The highest accountability rating shouldn't just be based on how many kids perform above minimum requirements on the test," Stevens says. " 'Exemplary' ought to mean that."

Roy Malonson, an ally of Scott's, chairman of the Acres Homes Chamber of Commerce and publisher of African-American News & Issues, says planned improvements for the TAAS should have come much earlier. "They're ending a lot of things after our report," says Malonson. "We've been hammering them for ten years."

Malonson says the large-business interests that have been the architects of the accountability system have very different concerns from small-businessmen like him. Since those businesses can afford to train their employees, Malonson says, they don't care if the accountability system is a "dog-and-pony show" that attracts development but doesn't produce fundamentally better results (a particularly interesting criticism, since Scott's organization also is funded by major companies).

"I depend directly on public education," Malonson says. "I can get the cream of the crop that doesn't even know how to answer the door or answer the phone."

But Stevens says large businesses have just as much interest in seeing the system work, saying that the demands on today's workers require critical thinking skills. "We need educated workers, not just trained workers," he says.

Where Malonson and Scott see foot-dragging, Stevens and consultants such as Darvin Winick, who advises the Governor's Business Council, see strategic implementation.

Businesses realize, they say, that you can't change something as large and decentralized as public education all at once. Bringing the exit-level test in line with real-world expectations and ending exemptions are measures their organizations support, but all in due time. In the end, the issue is not so much how the system should change, but how fast it can.

They point to the new curriculum as evidence that the system is progressing. But even as the accountability system improves, the TAAS might still be left in the dust. The new test, according to TEA's Director of Assessment Keith Cruse, won't be any harder than the old one, since the tests are calibrated for difficulty from year to year.

Experience has shown that teachers are more likely to teach what's on the test than what's on the curriculum; if the TAAS emphasizes one kind of problem-solving skill, so will teachers. Some schools reportedly abandon all subjects not covered by the TAAS, and new material is pushed aside in favor of one more round of pretest practice. Instead of learning algebra and geometry, high school students review eighth-grade math. Teachers go to special TAAS training sessions; administrators check practice tests, buy test strategy books and spend their summers coming up with plan after plan designed specifically to raise scores. Bonuses and careers hang on the outcome of the TAAS -- not students' careers, but educators' careers.

And that, perhaps, is the shameful part: not so much that children can't pass the test (although that's distressing enough), but that so much effort, time, energy and money is poured into preparing for a test so easy that when a student does succeed on the TAAS, it means hardly anything at all.

E-mail Shaila Dewan at shaila_dewan@houstonpress.com.
