Vol. 12 • Issue 4 • Page 14
The Nuts And Bolts of Quality Control in the Cancer Registry
Most of us don’t think about nuts and bolts. But if you’ve ever purchased a piece of equipment with a vital fitting missing, you might begin to appreciate the quality assurance work that goes into making sure every bolt is in place and every nut is secured. Just as the structural engineer checks the integrity of a skyscraper, so must the cancer registrar safeguard the accuracy and structural integrity of the data used for resource assessment in cancer programs, vital statistics and medical research funding.
Fittings that Don’t Fit
“For data to be effective and usable, it has to be reliable and it has to be consistent in the collection method and in the coding,” explained Ilona Kay Gebhard, CTR, program manager for education and training at the North American Association of Central Cancer Registries (NAACCR). But exactly what is unreliable or inconsistent data?
It’s tempting to label data as “good” or “bad,” but at the Florida Cancer Data System at the University of Miami, both Joy Houlahan, CTR, and Steven Peace, CTR, caution those who make such a generalization.
“We don’t like the term ‘bad data’ because so often what amounts to failed edits or appears to be discrepant data is not necessarily ‘bad data,’” said Peace. “It might be inconsistent with the standards or it might look a little funny, but it’s not necessarily incorrect.” As he explained, data can be incorrect, inconsistent or nonstandard. However, “The most dangerous kind of data is data that’s missing from an abstract,” he noted. So how does the quality team know where to look for the stripped nuts and loose bolts that render data unreliable?
Time to Get Out the Wrenches
As with any project, quality begins at the ground level. For example, if something has been coded wrong, a software program may flag it. “Usually when the data goes through the edits, an edit remark comes up and tells you something is wrong,” Houlahan remarked. There are also online edit checks in most software packages that can be run before the case leaves the hospital, according to Peace. And from this point, “When a case leaves the hospital, it comes to a central cancer registry and is subject to a more extensive set of edits,” he said.
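The edit remarks Houlahan and Peace describe come from automated rules run against each abstract. A minimal sketch of how such edit checks work is below; the field names, code lists and rules are hypothetical illustrations, not actual NAACCR or SEER standard edits:

```python
# Minimal sketch of registry-style edit checks (hypothetical field
# names and rules, not an actual standards-mandated edit set).
def run_edits(abstract):
    """Return a list of edit remarks for one abstract record."""
    remarks = []
    # Single-field edit: value must come from an allowed code list.
    if abstract.get("sex") not in {"1", "2", "9"}:
        remarks.append("sex: code not in allowed list")
    # Missing-data edit: an absent item is the most dangerous problem.
    if not abstract.get("primary_site"):
        remarks.append("primary_site: missing from abstract")
    # Inter-field edit: diagnosis date cannot precede birth date.
    dob, dx = abstract.get("date_of_birth"), abstract.get("date_of_dx")
    if dob and dx and dx < dob:
        remarks.append("date_of_dx: earlier than date_of_birth")
    return remarks

case = {"sex": "7", "date_of_birth": "1950-03-01", "date_of_dx": "1949-01-01"}
for remark in run_edits(case):
    print(remark)
```

A hospital package would run rules like these before the case leaves the hospital; the central registry then applies a larger rule set of the same shape.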
Calling in the Structural Engineers
Gebhard knows a thing or two about these larger edit sets. For 11 years, she did quality control for the Surveillance, Epidemiology and End Results (SEER) program. “The two major audits you can do are reabstracting and case finding,” Gebhard explained. “Reabstracting is where you randomly select charts of a particular site or sites and you visit the hospitals and you re-abstract all the information for that particular case.” The auditor’s work is then compared against the abstracting and coding recorded for that case in the central registry’s master data file.
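The comparison step of a reabstracting audit can be sketched as a field-by-field match between the auditor’s fresh abstract and the master record, yielding an agreement rate. The field names and values below are illustrative only:

```python
# Sketch of the comparison step in a reabstracting audit: the
# auditor's fresh abstract is checked field by field against the
# central registry's master record (field names are illustrative).
def compare_abstracts(master, reabstracted, fields):
    """Return (agreement_rate, list of discrepant fields)."""
    discrepant = [f for f in fields if master.get(f) != reabstracted.get(f)]
    rate = 1 - len(discrepant) / len(fields)
    return rate, discrepant

fields = ["primary_site", "histology", "stage", "first_course_rx"]
master = {"primary_site": "C18.9", "histology": "8140",
          "stage": "2", "first_course_rx": "surgery"}
audit = {"primary_site": "C18.9", "histology": "8140",
         "stage": "3", "first_course_rx": "surgery"}
rate, diffs = compare_abstracts(master, audit, fields)
print(rate, diffs)  # 0.75 ['stage']
```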
In case finding, however, “The process involves visiting every hospital department that diagnoses or treats cancer. The auditor reviews all records or logs and records anyone with a cancer diagnosis. The list is then compared to the central registry master case-list and discrepancies are followed up. This ensures that the registry is complete in the case ascertainment,” said Gebhard.
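The casefinding comparison Gebhard describes amounts to a set difference: any patient found in a hospital department log but absent from the central master case-list needs follow-up. A sketch, with made-up department names and patient IDs:

```python
# Sketch of a casefinding audit: compare cancer diagnoses found in
# hospital department logs against the central registry's master
# case list; anything in the logs but not the registry needs follow-up.
def casefinding_audit(department_logs, master_case_list):
    """Return patient IDs found in hospital logs but missing centrally."""
    found_in_hospital = set().union(*department_logs.values())
    return sorted(found_in_hospital - set(master_case_list))

logs = {
    "pathology": {"P001", "P002", "P003"},
    "radiation_oncology": {"P002", "P004"},
}
master = ["P001", "P002", "P003"]
print(casefinding_audit(logs, master))  # ['P004']
```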
Needless to say, in any situation where the big guns are brought in to look for mistakes, resentment can develop from those whose work is being questioned. As a result, “We get a lot of negative feedback from the hospitals and registrars for just about everything we do,” Houlahan shared. “They think the central registry is just piling them up with work, but it’s really not the case. We’re just making sure our data is as precise as we can get it.”
As Peace sees it, “It’s their data too, and if they’re doing an annual report and need some data, they can ask us for it.” Although he is quick to point out that hospital registrars take a great deal of time and effort to abstract the cases in the highest quality manner possible, Peace’s concern lies in the fact that quality control activities, on the whole, don’t comprise the day-to-day activities in a typical hospital registry. And so, “When we come along and ask to verify or validate data or question the quality of their work in any way, we’re often perceived as criticizing their work,” he said. “Unless we become part of their routine, we’re always seen as an outsider.”
Quality Control on the Hospital Level
“Hospital registrars can also do reabstracting and recoding audits on each other if there’s more than one person in the registry,” volunteered Gebhard. “They can exchange their abstracts and see whether, working from the same record, the other person arrives at the same codes as the original abstractor.” Another suggestion Gebhard has for implementing quality control on the hospital level is getting in the habit of running reports. With reports, “Registrars can make sure their case finding is complete,” she noted. Finally, Gebhard pointed out, “If they are accredited through the Commission on Cancer (COC) of the American College of Surgeons (ACS), they are required to have at least 10 percent of their cases reviewed by a physician.”
Houlahan thinks using a physician as a resource is a good way to confirm that what’s recorded is really what the physician meant to convey. To illustrate, she relayed the story of a pathologist who was using a term differently than a registrar’s normal interpretation. As a result, a tumor in the colon was coded as a stomach tumor. The registrar abstracting the case would continually receive edit failures, and as Houlahan noted, “If you accumulate a number of these edit failures, they can count against you.” In this case, a little clarification from the doctor could have gone a long way.
But there’s another argument for involving doctors in the process. “We found instances where medical oncologists and surgeons don’t speak quite the same language as pathologists,” said Peace. “By bringing them all into the quality control mechanism, you provide an opportunity for open dialogue between all of these different people to make sure you’re saying the same thing and coding and interpreting things in a similar manner.”
Lastly, both Houlahan and Gebhard pointed toward education initiatives, whether it’s verifying something in a book or attending a seminar. “I stress the importance of using the manual over relying on memory,” said Houlahan. And Gebhard pointed out, “The rules governing abstracting and coding change often, so continuing education is important.” But as she also stressed, “Experience is a great educator.”
“The way we look at it, quality control begins with the abstractor,” said Peace. “It’s a responsibility of everybody throughout the system including the abstractor, the physician, the registry manager and the people who use the data.”
Quality Control from The Vendor’s Perspective
All the quality and education initiatives in the world won’t help if the staff isn’t qualified, according to Linda Case, RHIT, CTR, director of quality improvement (QI) and technical support at Precyse Solutions LLC in King of Prussia, PA. “When we conduct our hiring process we have pre-employment exams that test the quality of the registrars we are hiring,” she said. But beyond guaranteeing a competent staff, an internal QI program is in place for Case’s organization.
“Each month a QI review is conducted on approximately 15 abstracts on all colleagues. We’re looking at 22 data fields to include tumor, staging and treatment information,” said Case. “In addition, we review site, histology and staging text fields to ensure accuracy and completeness.”
Karen Phillips, registry product specialist for IMPAC Medical Systems in Mountain View, CA, has her own checklist of things the “time-impaired” registrar can do. Among her suggestions:
•For visibility, try placing the abstract in the patient’s chart.
•Recode systemic errors when a study requires that data item.
•Pick six to 10 items to review when adding follow-up data.
•Beware of imprecise codes: 8, 9, 0, NA, blank, NOS.
•Keep a file of problem cases and questions.
•Check site percentages against state and national data.
•Document, document, document!
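The site-percentage check on Phillips’ list can be sketched as comparing the registry’s primary-site distribution against a state or national reference and flagging large deviations. The percentages and the five-point tolerance below are made-up illustrations, not real statistics:

```python
# Sketch of the "check site percentages" checklist item: compare this
# registry's primary-site distribution against a reference (state or
# national) distribution and flag large deviations. All numbers are
# hypothetical.
def flag_site_deviations(local_pct, reference_pct, tolerance=5.0):
    """Return sites whose local share differs from the reference by more
    than `tolerance` percentage points, as {site: (local, reference)}."""
    return {site: (local_pct[site], reference_pct.get(site, 0.0))
            for site in local_pct
            if abs(local_pct[site] - reference_pct.get(site, 0.0)) > tolerance}

local = {"breast": 28.0, "lung": 9.0, "colon": 11.0}
state = {"breast": 26.5, "lung": 16.0, "colon": 10.0}
print(flag_site_deviations(local, state))  # {'lung': (9.0, 16.0)}
```

A lung share far below the state figure, as here, would suggest incomplete casefinding rather than a genuinely low incidence.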
Out of Sight, Out of Mind
No, we don’t often think of the nuts and bolts that keep the wheels on our vehicles or hold up the bridges we drive across. Likewise, the hospital administrators, government agencies, researchers and others who rely on the finished data give little consideration to QI in the registry. The QI onus rests on the registrars.
“Data quality is extremely important for cancer registries because so many people use our data in so many different ways,” Peace stressed. “We don’t really know how people are using it once we’ve collected it. It could be used for medical research, marketing cancer programs or even needs assessments for resource allocation for machines or services.” But ultimately, as Peace stated, “They rely on our data very heavily and make data-driven decisions based on it, so quality is very important to everybody.”
Linda Gross is an assistant editor at ADVANCE.
Used Data Don’t Lie, So Use Your Data!
Quite simply, data quality control is just a way to assure that our data are useful. Do they meet the needs of the people who are going to use them? Right away that implies that we have to set some standards, that is, anticipate what our data will be used for. The fundamental purpose of registries is to document the kinds of cancers that are occurring and the effectiveness of the treatments delivered, in order to lengthen disease-free intervals and improve long-term survival. Ultimately our data should lead to improving the quality of patient care. As Mark Twain once said, “There are three kinds of lies: lies, damned lies and statistics.” Quality control is a way to assure that our data don’t lie.
The Commission on Cancer (COC) of the American College of Surgeons (ACS) has for many years defined the quality control process in cancer registries. Quality control must be monitored by the hospital cancer committee and must be documented. A number of requirements are mandated:
•A physician advisor must be appointed to the registry.
•Procedures must address casefinding, abstracting and staging, timeliness and reporting.
•Physicians must review a random sample of 10 percent of cases.
•Edit checks must be in place.
•Procedures to correct errors must be documented.
Even more important than meeting requirements for approval by the ACS is that registries use the data. A cancer registrar who generates several reports a day is likely to be a registrar who has high quality data.
So what does it take to put an effective quality control program in place? You’re probably already doing more than you think. Every time you do a study, you are doing quality control. Every time you analyze data, say, “Gee, that looks odd,” and then go back, investigate and correct the data, that’s quality control. The most effective quality control occurs as we go along.
The best quality control effort in the world is simply to use your data. Use your data even if your registry is new, even when it points out your mistakes, even when the reports haven’t been requested. Don’t wait to get a request. Don’t wait to get permission. All that’s required is to document your efforts, improve your accuracy and concentrate on the data elements that are most important.
Cancer registry software vendors can make this process much easier by offering pre-defined reports (the more the better) and a friendly ad hoc report writer. Extensive edit checks that are updated frequently to meet state and national requirements, along with convenient error correction, are essential to the efficient operation of any hospital-based or central registry.
Karen Phillips is a registry product specialist for IMPAC Medical Systems in Mountain View, CA.