It’s time for the March edition of Ask Us Anything, and we are doing things a little differently this time around. Last month at the 2026 Alliance Annual Conference, I moderated a session called A Method to Our Madness: The Strategy Behind Grant Review with a panel of four other grant supporters: Amanda Kaczerski, Annette Schwind, Brigitte Azzi, and Riaz Baxamusa.
During the session, I gave the audience the opportunity to text in questions for the panel, which seemed to be a big hit. Maybe too big of a hit, as we barely got through a quarter of the questions that came in. At the end of the session, I mentioned that we would try to answer more of the submitted questions and maybe turn it into a CMEpalooza blog post. Fast forward a month, and here we are, with not one but two blog posts in follow-up to that session. Today is Part 1. I'll post Part 2 on Thursday.
A huge thank you to the panel for taking extra time to go through the questions and provide thoughtful responses. We tried to get to as many of the questions as we could, but I can’t promise every single one of them has been answered. Also, at the request of the panel, I am keeping their responses anonymous. That allowed everyone to feel a little more comfortable with responding.
Lastly, a reminder that all comments below are our own and do not necessarily reflect the position of our individual employers. Examples are for illustrative purposes only, intended to facilitate discussion around grant processes and strategy. None of the information provided implies a positive funding decision from any organization.
As a reminder, if you have an issue (professional or personal) you would like help with, click here to submit your question(s). This can be about anything CME related (or not CME related). Maybe you want some help deciding what to wear on the opening day of baseball season — we can help with that.
Now, on to the questions!
Q: What percentage of available funding is dedicated to digital versus face-to-face?
In my experience, there is not a percentage of funds allocated to a specific delivery format. Our portfolio of supported activities has shifted toward hybrid and digital formats over the past few years, primarily because of scalability and accessibility. That said, we still see strong value in face-to-face programs when the design benefits from interaction, like case-based workshops.
Q: What percentage of your budget is dedicated to unsolicited proposals? What is the approval rate for unsolicited proposals?
We review both RFP and unsolicited requests. Unsolicited proposals make up a meaningful portion of submissions, but approval rates tend to be lower simply because they aren’t always aligned to defined gaps. For instance, we may proactively issue an RFP around a specific emerging clinical challenge, while unsolicited proposals might bring forward innovative formats or audiences we hadn’t identified.
Q: Would you rather have a lower budgeted program with lower audience numbers or a higher budgeted program with higher anticipated reach?
Reach alone isn’t the deciding factor. What we’re really looking for is the relationship between the educational need, the audience, and the potential impact on clinical decision-making. Sometimes, a highly targeted program reaching a smaller group of specialists can be very impactful. Other times, a larger initiative addressing a widespread practice gap makes more sense. It really depends on the problem the education is trying to solve. For example, a small tumor board program addressing complex biomarker interpretation might reach fewer clinicians but still be highly valuable.
Q: When it’s a multi-support request, do you ever communicate directly with the other potential grantors?
No, we don’t communicate directly with other potential supporters about specific grant requests (though there may be more general discussions about congresses or overall budget availability). Each company independently evaluates proposals based on its own processes and compliance frameworks. What we typically look for in multi-support requests is transparency around the total program budget and the anticipated funding mix so we can understand the scope of the activity.
Q: How do you navigate the ACCME SCS boundary regarding not directing evaluation of CME? How do you get what you want in outcomes assessment without directing it?
That’s an important boundary. As commercial supporters, we cannot direct the educational design or evaluation methods. What we can do is clearly describe the educational gaps and learning objectives we hope will be addressed in an RFP or grant description. Providers then independently determine the best educational and outcomes approach to address those gaps. For instance, we may identify a gap in guideline-aligned testing or sequencing, but the provider determines whether that is evaluated through competence questions, performance measures, or another method. We can provide generic templates and guidelines for how we want the data to be delivered (e.g., N numbers, verbatim questions on slides, evaluation by demographic, etc.).
Q: If you’re not asking for something (e.g., faculty names or past outcomes reports), does that mean you don’t want to see it?
If something is not requested in the application requirements, it is not necessary for the review process. That said, concise information that helps explain the educational strategy, like past outcomes summaries or faculty expertise, can strengthen a proposal if it is directly relevant and not excessive. For example, a summary showing how a previous program influenced clinician confidence or decision making can help demonstrate credibility and experience supporting that type of education.
Q: Are scalable contingency plans viewed as diminishing the value of the main proposal goals?
Not at all! This helps us understand what can be accomplished if full funding isn’t achieved for whatever reason. Scalability enables us to make the most of limited funding budgets and round out our overall education portfolio in alignment with our strategy. It also may give us an opportunity to try out a particular educational approach with a lower initial investment. Having this information in the proposal itself helps us understand what can be accomplished at different funding amounts during initial review and avoids the delays of requesting additional information.
Q: For a medical association applying for a grant, do you have a preference for a plan that calls for engagement of a MEC to design and develop the activity vs internal design and development?
I would encourage organizations to be realistic about their capabilities in light of the educational intervention they are proposing and to partner when doing so enhances those capabilities. Grant funding is competitive, and if partnering with other organizations, whether a MEC or another type of organization, helps you provide the level of excellence required, then do it! As supporters, we are constantly pushed to demonstrate the value of the IME we support, and our timelines are often bounded by fiscal years. So, if partnering can help you do more in-depth outcomes measurement, report more frequently on progress, or provide project management support for your understaffed education department, it may be worth the time and effort to establish the partnership. My experience with medical associations is that they often have excellent ideas, but their ability to implement them in a timely manner and measure the impact of the education is limited when compared to MECs.

Until recently, there was one minor exception — pasta. In my late 20s, I had bought one of those pasta rollers (it looks exactly like this photo) to make my own noodles and, well, it was a pain. My kitchen was too small, the process was too laborious, and the results kinda sucked. So, for many, many years, the pasta machine stayed tucked away in the basement of our new house, despite its much more functional kitchen and my improving cooking skills. Dried pasta from the store was, well, it was fine. It was a shortcut that I didn’t mind taking because it saved me from a task that I really didn’t enjoy.