"It's like the chicken or the egg," says UH's Rothstein. "If we knew what the technology was, then we might be able to make the laws better. But until the industry has an idea what's going to be required by the law, they are moving cautiously on deciding what technology to use."
Toward that end, the government and health care industry are working together to come up with a set of requirements that will guide the development of security systems. The goal is to create a way for the industry to evaluate the level of security offered by a particular technology.
Ron Ross is the director of the National Information Assurance Partnership, which is leading the government's half of the joint effort with the health care industry. Ross says the "real killer" is trying to develop computer products that can communicate with each other while controlling who has access to what information. The problem, he says, is that no one has been able to agree how much protection should be applied to medical information.
"If I'm protecting my nuclear launch codes, then I'm going to have stronger algorithms, encryption, longer key codes, what have you, than if I just wanted to have some privacy for my e-mail messages," Ross says. "Typically you use just enough security that's appropriate for the information. How much security is appropriate for medical records? I'm not sure we can answer that question yet. The technology is out there. It's just a question of selecting the right stuff."
While more than 400 companies involved in health care are participating in the joint government-industry effort, not everyone is looking forward to the end result. One analysis, by Blue Cross and Blue Shield, predicts that implementing security measures to meet the government's requirements could cost the health care industry more than $40 billion.
Tom Gilligan, director of the American Federation of Health Care Transactions, a lobbying group, says the health care industry has processed billions of electronic transactions in the last 15 years and has amassed a "track record that's hard to improve upon." He says fears that a technological breakdown would lead to a violation of patient confidentiality are "unreasonable."
"Technology is no more guilty than a pet rock," Gilligan says. "Every horror story that's been told, every privacy violation that's occurred, every record that's been published somewhere it shouldn't have been, was caused by an individual who had legitimate access, then abused it. It's a people problem, and no amount of technology applied to this issue is going to solve that."
What's certain, though, is that medical records do end up in the wrong hands, and they are often used to hurt or embarrass someone.
In 1992 an anonymous source released U.S. Representative Nydia Velázquez's psychiatric records to the New York Daily News on the eve of an election. A Maryland banker who sat on a state health commission obtained confidential information from a cancer registry, cross-referenced it with a list of people who had borrowed money from his bank, then called in the loans. The names of 4,000 Florida AIDS patients were leaked to the media, even though they were stored in a computer in a locked room accessible by only three people.
While such incidents are extremely rare, given the massive amount of electronic health information that is becoming available, no amount of legislation or technological wizardry will make everyone feel secure. After all, does it matter who or what is to blame if it happens to you?
"It's creepy," says Karen Shore of the National Coalition of Mental Health Professionals and Consumers. "Even if there is no harm to the patient, it makes people feel creepy, and why are there people so insensitive that they set up a system that makes people feel creepy, that they're being violated?"
There is no easy answer, and it may be too late to hope for one. That's one reason why, when Ron Ross speaks to groups about computer security, he brings along two props: a copy of the Constitution and a floppy disk. He says they illustrate the collision course between what is and what should be.
"We're going to have to learn to deal in a digital world with all the things we've come to know and love over the last 200 years," he says.
E-mail Brian Wallstin at firstname.lastname@example.org.