A Critical Quality Consideration – Which Oncology Clinical Certifications Matter?

Promoting Quality to Improve Outcomes and Cut Costs

Which Oncology Clinical Certifications Matter?

By Wes Chapman

May 23, 2013


Professional medical societies offer certification programs that recognize the medical practices at particular locations as meeting the highest standards of care. In oncology, these certifications are provided by a variety of medical societies, including: 1) the American College of Surgeons Commission on Cancer (ACS CoC), 2) the American Society of Clinical Oncology (ASCO), 3) the American College of Radiology (ACR), 4) the American College of Radiation Oncology (ACRO), and 5) the American Society for Radiation Oncology (ASTRO). We have seen these certifications increasingly used to determine eligibility for quality-based incentive plans, and we have taken a systematic look at how they compare on a series of objective quality criteria. We rated each certification on each criterion on a scale of 1 (best) to 5 (worst), then compiled the totals into a ranking of 1 to 5 stars on a scale running from 10, the best possible total, to 50, the worst.


On this basis, CoC and QOPI both received 5 stars, with scores of 14 and 20 respectively. ACR and ACRO each received 4 stars, with scores of 26 and 23 respectively. ASTRO's certification program holds much promise but is still in formation; it received 3 stars based on the work done to date.
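The scoring arithmetic above can be sketched in a few lines of code. This is a minimal illustration, not the article's actual methodology: the assumption of ten per-criterion scores and the star cut-offs are hypothetical choices, picked only so that the totals reported above (14, 20, 23, 26, 33) land on their stated star ratings.

```python
# Illustrative sketch of the rating methodology described above.
# The ten-criterion assumption and the star thresholds are hypothetical;
# the article does not publish its exact cut-offs.

def total_score(criterion_scores):
    """Sum ten per-criterion scores, each 1 (best) to 5 (worst)."""
    assert len(criterion_scores) == 10
    assert all(1 <= s <= 5 for s in criterion_scores)
    return sum(criterion_scores)  # 10 (best) .. 50 (worst)

def stars(total):
    """Map a 10-50 total (lower is better) to a 1-5 star rating.

    Thresholds are assumed, chosen so the totals reported in this
    article map to their stated stars (14, 20 -> 5; 23, 26 -> 4;
    33 -> 3)."""
    if total <= 20:
        return 5
    if total <= 28:
        return 4
    if total <= 36:
        return 3
    if total <= 44:
        return 2
    return 1

print(stars(14), stars(20), stars(26), stars(23), stars(33))  # 5 5 4 4 3
```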

The Solution for Payers: Promote Quality Certification

Private health insurance companies are in a unique position to speed the acceptance and reach of quality certifications by promoting their use within their provider networks. This is particularly important in oncology, where outcome data is difficult to assemble and interpret.

Some of the largest payer organizations in the U.S. – including Aetna, BCBS, Humana, and United – have already endorsed the Quality Oncology Practice Initiative Certification Program (QOPI) of the American Society of Clinical Oncology (ASCO). We have seen certifications of clinical programs at provider sites used by payers in evaluating provider organizations for participation in new alignment structures including private ACOs. Based on payer reliance on these certification programs, we have undertaken a systematic look at how they differ and their potential value in the evaluation of clinical care.

Which Standards Are Worth It?

The certification bodies considered in this paper are non-profit oncology specialty societies that focus on education for their members, along with research into evidence-based treatment guidelines. The criteria were selected as representative of best practices in quality systems in general, and medical quality systems in particular. Criteria for the institution:

  • Quality/certification a key component of mission statement
  • Specialty area expertise, including education and research
  • Large membership with national reach
  • Solid reputation; respected in the medical community
  • Non-profit and non-lobbying

Criteria for the standards:

  • Wide-ranging within the specialty, covering many aspects of care
  • Evidence-based
  • Include clinical pathways
  • Updated frequently

Criteria for the certification process:

  • Audited frequently; annually is preferred
  • Requires documented and timely response to nonconformities
  • Encourages continuous improvement
  • Transparent; requires public reporting of patient outcomes

Oncology Certification Programs to Consider

To date, no oncology certification program meets our full list of criteria; however, the criteria serve as guidelines for choosing from what is currently available, and for determining how those programs can be improved. The table below allows a comparison of the major certification programs in oncology. The analysis following the table points out the shortcomings and advantages of each certification program and rates them on a scale of 1 to 5.

The table compares these oncology certification programs:

  • CoC: From the American College of Surgeons Commission on Cancer (ACS CoC)
  • QOPI: From the American Society of Clinical Oncology (ASCO)
  • ACR: From the American College of Radiology (ACR)
  • ACRO: From the American College of Radiation Oncology (ACRO)
  • ASTRO: From the American Society for Radiation Oncology

Note that, for a number of years, ASTRO and ACR jointly certified radiation oncology practices through the ACR program. In October 2012, ASTRO withdrew from the partnership in order to create its own certification program. The ASTRO standards are still in development.

[Table: Comparison of the major oncology certification programs against the criteria above]

Analysis of the Table

American College of Surgeons Commission on Cancer (CoC) Accreditation

CoC Accreditation with commendation: Score = 14


Weaknesses:

  • Does not require public reporting of outcomes to the general public, except with commendation.
  • Evidence-based measures are highly specific and by no means comprehensive. The CoC has a distinct advantage with its huge database of cancer cases, the NCDB, so it is hard to believe that it cannot put together a more comprehensive set of quality measures.
  • Its definition of “continuous improvement” is somewhat arbitrary, but CoC requires its awardees to execute at least two continuous improvement efforts every year.
  • Several safety measures are outsourced. In the case of radiation oncology, this is commendable, because other organizations hold much more expertise in these areas. Inclusion of evidence-based processes would be desirable.


Strengths:

  • Strict requirements for Cancer Program Practice Profile Reports (CP3R) measure performance.
  • Data quality requirements, beyond chart abstractions.
  • Data requirements for diagnostic purposes and treatment planning.
  • Requires a multi-specialty tumor board. This tumor board leads the different programs required by accreditation, including prevention, screening, and continuous improvement programs.

QOPI: Score = 20


Weaknesses:

  • Data quality requirements are looser than most: an audit is performed upon accreditation or reaccreditation, but participants are not necessarily audited every year.
  • Data auditing requires chart abstraction, which is cumbersome and expensive. In a world where electronic medical record systems are common, other forms of reporting could be accepted.
  • Could benefit from including data quality requirements.
  • No public reporting of outcomes.


Strengths:

  • QOPI has strict performance and quality-of-care requirements.
  • Has a well-defined set of safety measures in a separate standard, to ensure safe patient care.
  • Well-respected, comprehensive set of evidence-based quality measures.
  • QOPI raises the bar for cancer center performance (the required level was 72.62% in 2011 and is 75% today) instead of leaving it up to each individual cancer center.

ACR: Score = 26


Weaknesses:

  • Clinical pathways and processes are not specified.
  • Safety standards are loosely defined compared with ACRO’s.
  • Does not include a lower bound on minimum acceptable process improvement.
  • Neither public reporting nor collection of outcomes data.
  • Could benefit from including data quality requirements.
  • Not entirely clear about what performance will be measured or how; the criteria for scoring charts are not stated.


Strengths:

  • ACR’s guidelines can do tremendous good.
  • Requires compliance with safety guidelines established by ACR and ASTRO.

ACRO: Score = 23


Weaknesses:

  • Does not require public reporting of outcomes data.
  • Could benefit from including data quality requirements.
  • Does not formally incorporate evidence-based guidelines, although charts are reviewed with an eye toward NCCN’s evidence-based pathways.


Strengths:

  • Establishes a process-based approach and defines an evidence-based process for delivering safe radiation oncology therapy.
  • Comprehensive set of safety requirements that include processes, staffing, and other areas relevant to clinical quality.
  • Process-based continuous improvement requirement.
  • Very clear regarding requirements and criteria for chart reviews.
  • Reminiscent of an ISO 9001 quality management system, but adapted for a radiation oncology clinic.

ASTRO: Score = 33


Weaknesses:

  • The standard is not yet complete.
  • It is unclear whether continuous improvement will be a requirement.


Strengths:

  • Will include a very comprehensive set of safety measures, perhaps the most comprehensive of any accreditation.
  • Will include public reporting of outcomes data, the first radiation oncology certification to require this.
  • Performance standards will derive from evidence-based guidelines.


Quality Rankings Tabular Analysis

[Table: Tabular summary of the quality rankings]

Rating the Certifications

We have rated the certifications on a scale of 1 to 5 and also make recommendations for their areas of application. (Obviously, a provider that doesn’t perform radiation therapy or diagnostics should not be expected to have certification in that area.)

ACRO: 4 Stars – Recommended for radiation oncology clinics and cancer centers that offer radiation therapy.

CoC with commendation: 5 Stars – Recommended for cancer centers.

QOPI: 4 Stars – Recommended for medical oncology clinics and cancer centers.

CoC (without commendation): NR – We recommend CoC with commendation first, or else CoC combined with QOPI.

ACR: 4 Stars – We recommend ACRO first, but would recommend ACR over nothing at all.

ASTRO: 3 Stars – This seems to be a promising, comprehensive certification, but we must defer a recommendation until the standard is published.

Note that any of these certifications is preferable to no quality certifications at all. In the case of certifications not listed here, payers can use the criteria and table headings to compare and rate them before deciding whether to promote them among their network providers.


All of the certifications reviewed offer worthwhile measures for medical practice, and they are increasingly moving in the direction of the best quality management systems, such as ISO 9001, which require best-practice adherence, continuous improvement, and outcome analysis to test effectiveness. Clearly, data access and data quality loom as major issues for all medical quality systems, and the gigantic investment in EMRs has not yet provided real-time access to meaningful data sufficient to power any of these certifications.

Written by Wes Chapman
