Ask Us Anything: IME Supporter Edition (Part 2)

This is a continuation of our special AUA: IME Supporter Edition from earlier this week. Click here to view Part 1.

As a reminder, if you have an issue (professional or personal) you would like help with, click here to submit your question(s). We like offering advice and people seem to enjoy hearing our opinions (we won’t comment as to whether our advice is any good).

Now, on to the questions!

Q: In terms of the number of participants, are you seeking the highest number of participants or evidence that it is the right target audience?

Each organization views this differently; however, I would venture that the majority of us are seeking evidence that the right audience participated in the education. High numbers can catch one’s attention, but our medical teams are getting more savvy. Depending on the disease state, a high number of promised learners will instantly draw skepticism and dilute the value of what we are trying to convey. When medical starts asking questions about who participated in the program and the only information we can share is a large number, not evidence that it was our target audience, we’ve lost their interest and any chance of them trusting the value of the data we’re sharing.

Q: Trying to determine what each grantor requests for outcomes/impact is challenging. Whatever happened to the Outcomes Standardization Project (OSP)?

The OSP has been effective in establishing consistent definitions for terms commonly used across our industry. From what I have observed, many providers have adopted these OSP definitions, and when reviewing grant submissions, I do appreciate when groups apply them. If a provider chooses not to adopt OSP definitions, I would expect them to clearly explain how they are defining clinicians and the phases of engagement within their program.

All that said, expecting full alignment in outcomes reporting across companies is likely unrealistic (note from Derek: Yup). Supporters have emphasized for several years that outcomes reports are critical for IME teams, but there is also increasing diversity in how programs are designed and evaluated. As technology continues to influence both society and education, CME programs have evolved and innovated in ways that generate more varied and unique datasets. Because of this, providers should move beyond focusing primarily on large participation numbers or basic pre- and post-test metrics and instead think more strategically about how to demonstrate program impact. The goal should be to communicate outcomes in a way that clearly conveys the value of the program to stakeholders who may not have a background in CME, while highlighting insights that will resonate with industry colleagues.

Q: Regarding multi-support, if it takes more than a year to get sufficient funding even to meet the contingency plan, do you prefer the provider keep seeking funding or would you like the funding returned?

Communication is vital in multi-funder situations. First, I think the contingency plan should always include what can be accomplished with the amount of funding you requested from a single supporter. Then, once you receive approval of funding from a supporter, regular monthly updates on other support decisions are helpful. As the proposed start date approaches, there should be a discussion of whether the committed supporter wishes to move forward with just their support, wait for further decisions, or request that the funding be returned. There are very few programs that I would be willing to wait a year for. And to my fellow supporter colleagues, I’d be interested in hearing why it is taking more than a year to make a decision.

Q: Where do the number of learners and the cost per learner rank on the list of things to look for when reviewing a proposal?

Cost per learner is not something I prioritize. What matters more is reaching the right audience and the audience-generation methodology applied. If the provider is leveraging lists or distribution partners, I will dig into that. Can they deliver the right audience? What mix of disciplines can I expect (e.g., if I’m expecting physicians, is the program going to give me mostly nurses or pharmacists instead)? When I see large numbers, I question the authenticity of those numbers, and I try to dig into the demographics more. If the report touts large numbers, and drilling down into my specific audience reveals that only 10% of participants were my target learners, I will get frustrated. But the CME provider can redeem itself by performing a deeper analysis or segmentation of that small target audience. What is most important is seeing the data and the impact on the audience the supporter is interested in reaching.

Q: Is there ever any concern (Legal? Internal?) that RFPs might be seen as guiding content because of the detail provided?

This is absolutely something our Ethics & Compliance and Legal departments are concerned about, and it is why they review all RFPs before they can be posted. Some companies are more conservative than others, and like most things in IME, “guidance” and “influence” are open to interpretation. This is why providers may feel that some RFPs don’t really say anything about what the supporter wants to see. In these cases, the internal compliance folks likely take a broad view of what constitutes influence on content, which keeps the information in RFPs at a very high level.

Ask Us Anything: IME Supporter Edition (Part 1)

It’s time for the March edition of Ask Us Anything, and we are doing things a little differently this time around. Last month at the 2026 Alliance Annual Conference, I moderated a session called A Method to Our Madness: The Strategy Behind Grant Review with a panel of four other grant supporters: Amanda Kaczerski, Annette Schwind, Brigitte Azzi, and Riaz Baxamusa.

During the session, I gave the audience the opportunity to text in questions for the panel, which seemed to be a big hit. Maybe too big of a hit, as we barely got through a quarter of the questions that came in. At the end of the session, I mentioned that we would try to answer more of the submitted questions and maybe turn the answers into a CMEpalooza blog post. Fast forward a month, and here we are with not one but two blog posts in follow-up to that session. Today is Part 1. I’ll post Part 2 on Thursday.

A huge thank you to the panel for taking extra time to go through the questions and provide thoughtful responses. We tried to get to as many of the questions as we could, but I can’t promise every single one has been answered. Also, at the request of the panel, I am keeping their responses anonymous, which allowed everyone to feel a little more comfortable responding.

Lastly, a reminder that all comments below are our own and do not necessarily reflect the position of our individual employers. Examples are for illustrative purposes only, intended to facilitate discussion around grant processes and strategy. None of the information provided implies a positive funding decision from any organization.

As a reminder, if you have an issue (professional or personal) you would like help with, click here to submit your question(s). This can be about anything CME related (or not CME related). Maybe you want some help deciding what to wear on the opening day of baseball season — we can help with that.

Now, on to the questions!

Q: What percentage of available funding is dedicated to digital versus face-to-face?

In my experience, there is not a percentage of funds allocated to a specific delivery format. Our portfolio of supported activities has shifted toward hybrid and digital formats over the past few years, primarily because of scalability and accessibility. That said, we still see strong value in face-to-face programs when the design benefits from interaction, like case-based workshops.

Q: What percentage of your budget is dedicated to unsolicited proposals? What is the approval rate for unsolicited proposals?

We review both RFP responses and unsolicited requests. Unsolicited proposals make up a meaningful portion of submissions, but approval rates tend to be lower simply because they aren’t always aligned to defined gaps. For instance, we may proactively issue an RFP around a specific emerging clinical challenge, while unsolicited proposals might bring forward innovative formats or audiences we hadn’t identified.

Q: Would you rather have a lower budgeted program with lower audience numbers or a higher budgeted program with higher anticipated reach?

Reach alone isn’t the deciding factor. What we’re really looking for is the relationship between the educational need, the audience, and the potential impact on clinical decision-making. Sometimes, a highly targeted program reaching a smaller group of specialists can be very impactful. Other times, a larger initiative addressing a widespread practice gap makes more sense. It really depends on the problem the education is trying to solve. For example, a small tumor board program addressing complex biomarker interpretation might reach fewer clinicians but still be highly valuable.

Q: When it’s a multi-support request, do you ever communicate directly with the other potential grantors?

No, we don’t communicate directly with other potential supporters about specific grant requests (though more general discussions about congresses or overall budget availability do occur). Each company independently evaluates proposals based on its own processes and compliance frameworks. What we typically look for in multi-support requests is transparency around the total program budget and the anticipated funding mix so we can understand the scope of the activity.

Q: How do you navigate the ACCME SCS boundary regarding not directing evaluation of CME? How do you get what you want in outcomes assessment without directing it?

That’s an important boundary. As commercial supporters, we cannot direct the educational design or evaluation methods. What we can do is clearly describe the educational gaps and learning objectives we hope will be addressed in an RFP or grant description. Providers then independently determine the best educational and outcomes approach to address those gaps. For instance, we may identify a gap in guideline-aligned testing or sequencing, but the provider determines whether that is evaluated through competence questions, performance measures, or another method. We can give generic templates and guidelines for how we want the data to be delivered (e.g., N numbers, verbatim questions on slides, evaluation by demographic).

Q: If you’re not asking for something (e.g., faculty names or a past outcomes report), does that mean you don’t want to see it?

If something is not requested in the application requirements, it is not necessary for the review process. That said, concise information that helps explain the educational strategy, such as past outcomes summaries or faculty expertise, can strengthen a proposal if it is directly relevant and not excessive. For example, a summary showing how a previous program influenced clinician confidence or decision-making can help demonstrate credibility and experience supporting that type of education.

Q: Are scalable contingency plans viewed as diminishing the value of the main proposal goals?

Not at all! A contingency plan helps us understand what can be accomplished if full funding isn’t achieved for whatever reason. Scalability enables us to make the most of limited funding budgets and round out our overall education portfolio in alignment with our strategy. It also may give us an opportunity to try out a particular educational approach with a lower initial investment. Having this information in the proposal itself lets us see what can be accomplished at different funding amounts during initial review and avoids the delays of requesting additional information.

Q: For a medical association applying for a grant, do you have a preference for a plan that calls for engagement of a MEC (medical education company) to design and develop the activity versus internal design and development?

I would encourage organizations to be realistic about their capabilities in light of the educational intervention they are proposing, and to partner when doing so enhances those capabilities. Grant funding is competitive, and if partnering with other organizations, whether a MEC or another type of group, helps you provide the level of excellence required, then do it! As supporters, we are constantly pushed to demonstrate the value of the IME we support, and our timelines are often bounded by fiscal years. So, if partnering can help you do more in-depth outcomes measurement, report more frequently on progress, or provide project management support for your understaffed education department, it may be worth the time and effort to establish the partnership. My experience with medical associations is that they often have excellent ideas, but their ability to implement them in a timely manner and measure the impact of the education is limited compared with MECs.

***DATE CHANGE*** CMEpalooza Spring Changed to April 29!

Top 5 Reasons to Move the Date of CMEpalooza Spring from Wednesday, April 22 to Wednesday, April 29

5. The Philadelphia Phillies have a day game against the despicable New York Mets on April 22, and Scott and Derek have to be available for in-person booing

4. Scott realized Nick at Nite is running an all-day ALF and Silver Spoons marathon that cannot be missed on April 22

3. Derek bought a super challenging jigsaw puzzle and really needs the time to focus

2. April 22 is National Jelly Bean Day, and no one should ever schedule a conference on National Jelly Bean Day

1. Derek has to attend another conference on April 22 and—because he is a big doofus—forgot that it is the same day as CMEpalooza Spring until he went to book his travel two days ago

[very very very long sigh]

Yes, we are moving the date of CMEpalooza Spring to Wednesday, April 29. I repeat, we are moving the date of CMEpalooza Spring to Wednesday, April 29.

Yes, it is because I am a big doofus and forgot I have a conflict.

Yes, you can make fun of me for it.

Yes, Scott is annoyed with me but has been a good sport about it.

Yes, our faculty and panelists have been amazing in accommodating the date change. The agenda remains intact.

Yes, it is a minor miracle that it took until our 23rd CMEpalooza for this to happen. A small pat on our backs for that.

Once again, we are moving the date of CMEpalooza Spring to Wednesday, April 29. Update your calendars now. Regularly scheduled blog posts to follow.