Avoiding Common Deficiencies in CoC Compliance



Of the 36 standards, five seem to cause more problems than others.

It takes meeting and exceeding 36 standards of care. It takes planning, studying and executing quality improvement (QI) initiatives year after year. It takes a rallying effort from the cancer committee, cancer registrars, medical oncologists, pathologists and hospital administration. It takes resources, time and money. And in the end, it’s all voluntary.

But becoming an American College of Surgeons (ACoS) Commission on Cancer (CoC)-approved cancer program means giving your cancer patients the one thing they seek most in a time of uneasiness and uncertainty: comfort in knowing your program offers high-level, quality cancer care, said Lisa D. Landvogt, administrator of approvals and standards at the CoC. And no one can put a price tag on that.

Join the Elite

The number of CoC-approved cancer programs in the United States and Puerto Rico currently exceeds 1,400, and although that represents only about 25 percent of all hospitals, an estimated 80 percent of all newly diagnosed cancer patients are treated in these facilities, according to the CoC.

To join this elite group and have your program listed on the American Cancer Society’s Web site (www.cancer.org), your program needs to meet 36 standards, all fully detailed in the CoC’s Cancer Program Standards 2004 Revised Edition.

Of those, five standards trip up cancer programs more often than the others. While reading, keep in mind there are three main triggers for a deficiency, said Loretta Lausin, CTR, executive director of CHAMPS Oncology Data Services and a CoC-trained independent consultant: the process (something did not happen); the documentation (it happened, but was not documented); and omissions (it happened and was documented, but key details are missing).

Standards 2.1 and 2.3: Cancer Leadership

To meet these standards, the hospital or medical staff bylaws or policies have to clearly delineate the responsibilities of the cancer committee and exactly to whom the committee reports, explained Joyce L. Jones, CTR, the COO of MLT Medical Coding Inc. and a CoC-trained independent consultant. This is an example where the process might be in place, but the documentation needs additional support to show it.

“Everyone has to report to somebody, but a lot of facilities don’t necessarily show that easily—for example, who the cancer committee reports to when it identifies a problem that needs to be addressed beyond its scope,” Jones stated.

The programs that do this well, do it very well, she added. “They have a job description for the cancer committee, which lists all their responsibilities, authority and accountabilities, and also a job description for each of the four coordinators,” Jones explained.

Sample job descriptions can be found in the CoC’s Best Practices Repository (www.facs.org/cancer/coc/bestpractices.html), including “Organizational Structure” under Standard 2.1 and “Coordinator Job Descriptions” under Standard 2.3.

Standard 3.7: Annual Call for Data

The CoC saw the percentage of facilities with a deficiency in this standard improve significantly in recent years, from 35 percent in 2006 to 16 percent in 2007, according to Landvogt. “This is an excellent improvement in the message the National Cancer Data Base (NCDB) and surveyors are sending to folks,” Landvogt added. “The special reports the NCDB is creating—eQuip, CP3R—have really opened the eyes of the cancer programs to how important these clean submissions are.”

Landvogt advised running edits through your software prior to submitting the data to the NCDB to catch errors and avoid a deficiency to begin with. “It’s a combination of teamwork between cancer programs, software vendors and the CoC,” she said. “It’s always going to be a winning situation when you can get all of those entities together.”

To help programs meet this standard and others, the CoC has initiated online education tools for cancer programs in its Online Education Center at www.facs.org/cancer/webcast, including a Web cast called “Understanding National Cancer Data Base Submission Requirements and Edits.”

Standard 4.6: CAP Protocols

Nearly 30 percent of facilities saw a deficiency in this standard in 2007, Landvogt said, and according to Jones the problem lies in an assumption. “The misconception a lot of facilities go by is that their lab is College of American Pathologists (CAP)-certified, so they have to be fulfilling that standard,” Jones said. “But that doesn’t necessarily mean they’re following these checklists, because CAP checklists are not currently one of the requirements for a CAP-accredited lab.”

Those programs that are doing the checklists but aren’t meeting the standard might find it’s due to omissions—namely, negative findings are missing.

“If the margins are not positive, [the pathologists] might think they don’t have to document it,” Jones said. “But each of the required data items on the checklist needs to be documented in the path report—whether it’s negative, positive, or, if they can’t assess it, they need to say, ‘This cannot be assessed.’ That would mean it’s compliant because it’s documented. If they just didn’t say anything, it would be deficient.”

Jones suggested making sure surgery is tagging the specimen thoroughly enough that the pathologist can identify which quadrant of the breast the tumor is in. “It’s a cooperative effort to make sure the specimens are oriented correctly for the pathologist, so he or she knows what is the top, bottom, etc.,” she said. Landvogt also noted that programs using synoptic reporting are seeing more positive outcomes related to this standard, and she encourages discussing this with your pathologists to see if it’s achievable.

You can download templates of the checklist requirements needed in a pathology report from the CoC Web site under Standard 4.6 in its Best Practices Repository.

Standards 8.1 and 8.2: Quality Improvement

The trouble here is that programs either don’t understand what it takes to comply with these standards, or they are meeting the standards but don’t have the documentation to show it, Jones explained.

Complying with Standard 8.1 is twofold. It starts with the cancer committee raising a QI initiative at the outset, identifying the data that need to be collected and monitored, and reviewing the final results of the data analysis with a focus on whether the initiative directly affected cancer care. Without this cause and effect, the CoC may not be able to see that the QI result was a clear goal of the cancer committee, and it will be harder to mark as compliant. “Sometimes we don’t necessarily see that it was brought to the committee until after the final analysis is done and presented,” Jones confirmed.

The second part is making sure this cause-and-effect relationship is shown, or documented clearly in the cancer committee minutes. “Without proper documentation, the CoC does not have the information they need, and as a result, may not give you the rating you truly deserve,” Lausin said.

This applies across all standards, and it hinges on the detail of the cancer committee minutes—often, it’s simply a matter of educating the note-takers on exactly what needs to be included, Jones explained.

“Rather than saying, ‘This topic was discussed,’ they need to say, ‘What final decision was made on that topic?’ The outcome,” Jones said. “And they need to document it clearly so it can be followed in the minutes of the cancer committee.”

The CoC provides a sample “Quality Improvement Plan” under Standard 8.1 in its Best Practices Repository, as well as “Cancer Committee Minute” templates to model after under Standard 2.2.

One Word

The key word to this whole process, Landvogt said, is multidisciplinary. “It’s been multidisciplinary from day one,” she said. “No one person should carry a standard—a deficiency or a commendation. The whole team should take the credit or take the appropriate action for change if necessary.

“The registrar is kind of the hub in the wheel,” Landvogt continued, “but everyone on the cancer committee needs to be involved—the medical oncologists, pathologists, administrators, nursing, cancer registry and rehab services. All of the people who touch the cancer patient in one way or another need to celebrate the successes and solve the problems together. Everyone needs to be wrapped around the same goal of ‘quality patient care close to home,’ which is our motto.”

References:

1. The American College of Surgeons Commission on Cancer. Cancer Program Standards 2004 Revised Edition, March 2006. www.facs.org/cancer/coc/cocprogramstandards.pdf

2. The American College of Surgeons Commission on Cancer. Standards Update 2008. www.facs.org/cancer/coc/standardsupdate2008.pdf

Ainsley Maloney is an assistant editor with ADVANCE.

Essential Resources

• The American College of Surgeons (ACoS) Web site: www.facs.org; to get directly to the Commission on Cancer (CoC) Web site, visit www.facs.org/cancerprogram

• Best Practices Repository: the CoC has compiled various examples and templates from programs that have met the standards extremely well for others to model after, to be downloaded for free at www.facs.org/cancer/coc/bestpractices.html

• Deficiency Resolution Guide: programs that are approved by the CoC with a contingency are given 1 year to correct any deficiencies before receiving their certificate. The requirements for successful resolution can be found at www.facs.org/cancer/coc/deficiencyresdoc.pdf

• Online Education Center: the CoC provides a series of fee-based Web casts with audio, slides and a written transcript that cancer programs can use to receive education and training on the CoC requirements, at www.facs.org/cancer/webcast

Standard 4.3: In Evolution

Tracking down physicians is never fun. In recent years, Standard 4.3 has haunted cancer registrars because of the significant amount of time it took to ensure that physicians record a complete American Joint Committee on Cancer (AJCC) stage for each case.

The American College of Surgeons (ACoS) Commission on Cancer (CoC) listened, and is re-evaluating Standard 4.3 over the course of the year. Because of this, for all surveys performed in 2008, the cancer committee must discuss with the surveyor the processes it has developed to reach the goal of acquiring pre-treatment clinical staging, but will not be held accountable for a certain percentage. Read: all programs surveyed in 2008 will be automatically compliant with Standard 4.3 unless they choose to be evaluated for a commendation.

Here’s to 2008 being your 1-year Get Out of Standard 4.3 Card.

Source: The CoC Standards Update 2008, www.facs.org/cancer/coc/standardsupdate2008.pdf

—By Ainsley Maloney

Hire a Consultant

If your cancer program is concerned about deficiencies and serious about obtaining Commission on Cancer (CoC) approval, consider hiring a consultant to guide you.

One option is CoC-trained independent consultants, an elite group of 28 consultants across the country who were selected by the CoC some 4 years ago and who undergo rigorous annual training at CoC headquarters. “We did recruitment a few years back and kept this core group for consistency while we were tweaking some of the standards and survey processes,” said Lisa D. Landvogt, administrator of approvals and standards at the CoC. To see the list of consultants, visit www.facs.org/cancer/ctrconsultant.html.

The reasons to hire them are plenty. For one, the 36 standards aren’t brief—they take up 108 pages in the Cancer Program Standards 2004 Revised Edition. A consultant can help you navigate each standard’s meaning and detail the exact documentation you need to meet it. “The consultants know, and if they need clarification, they know exactly where to go to find it,” said Loretta Lausin, CTR, executive director of CHAMPS Oncology Data Services and a CoC-trained independent consultant.

A CoC consultant can review your program quarterly, run mock surveys on site or review your documentation remotely; with this kind of concurrent consulting, rather than a last-minute review, programs can find and fix mistakes early on.

“A lot of times things are really happening, it’s just that everyone is knee-deep in doing all these great things for their program that they forget to document it,” said Lausin. “That’s really a shame. It’s actually more cost effective to bring in a consultant early on rather than jeopardizing accreditation.”

—By Ainsley Maloney
