How to Evaluate Neurodiversity Training Vendors: A Buyer's Checklist
Most HR teams evaluating neurodiversity training vendors over-weight curriculum content and under-weight what determines whether the training actually produces measurable change: who delivers it, how outcomes are measured, what happens after the training event, and whether the vendor will tell you they're not the right fit when they're not. The seven evaluation criteria below cover both the content and the structural questions. Bring them to every pitch.
The seven evaluation criteria
Each criterion has a question to ask in the pitch and an answer pattern that should pass. Bring this list — and the specific phrasings — to every vendor evaluation conversation.
1. Lead-trainer track record
Does the vendor's lead trainer have direct coaching or hiring experience with the audience the training addresses?
The question: "How many neurodivergent professionals or managers has your lead trainer directly worked with in the past five years?"
Pass pattern: A specific number with context — "Debra Solomon has personally coached or trained over 500 managers and individual professionals across 40-plus organizations over 30 years." Or the equivalent for whichever vendor you're evaluating. Marketing ranges ("thousands of professionals impacted") without specificity are a weaker signal.
2. Outcome measurement plan
What does the vendor measure post-training, and how do they report it?
The question: "What three metrics will we be able to see at 30, 90, and 365 days, and what does success look like at each milestone?"
Pass pattern: Specific metrics, with leading and lagging indicators named. The four metrics from our ROI framework — 90-day retention, comparative offer rate at structured-interview, manager confidence post-disclosure, and cost-curve trajectory — are a useful benchmark for what a serious answer looks like. Vendors who can't name specific metrics, or who name only completion rates, are signaling that they don't measure outcomes themselves.
3. Sourcing model
Is the vendor training-led, staffing-led, or hybrid?
The question: "If I want to build internal capacity rather than depend on your placements, can you describe what year 3 looks like?"
Pass pattern: An honest answer that names the trade-off. Staffing-led vendors should acknowledge that their per-placement model recurs and that internal capacity isn't their primary deliverable. Training-led vendors should acknowledge that their model is slower to first-hire than staffing-led alternatives. Vendors who claim to do both equally well — without naming the trade-off — are signaling that they haven't thought about it.
4. Honest scope
Will the vendor tell you when you're not their right customer?
The question: "Describe a scenario where you'd recommend we don't work with you."
Pass pattern: A specific scenario with reasons. The strongest answer names a customer type the vendor genuinely doesn't serve well — by scale, by industry, by time horizon, by what's being asked of the program. Vendors who say "we work with everyone" or "we can adapt to anything" are signaling either incompetence or dishonesty; both are dealbreakers.
5. Post-training reinforcement model
What happens after the workshop ends?
The question: "Walk me through the 90 days after your last training session — what's your support and what's ours?"
Pass pattern: An explicit handoff plan with named deliverables. Peer debrief facilitation, manager forums, refresher cycles, scenario practice on real situations. Vendors who answer this with "we have a content library you can access" are signaling that they don't run reinforcement — they run training events. The library is an artifact; reinforcement is a process.
6. Compliance and legal grounding
Does the curriculum cite primary sources (ADA, EEOC, JAN) accurately?
The question: "Can you walk me through where in your curriculum you reference the ADA's interactive process, and what your trainers say managers should do when an accommodation request lands?"
Pass pattern: A coherent legal frame that names ADA.gov, EEOC guidance, and the Job Accommodation Network as primary sources. Vendors who get the legal mechanics wrong — confusing "disclosure" with "accommodation request," for example — train your managers to handle real situations badly. The legal frame is the floor, not a feature.
7. Voice match for your audience
Will the training read as credible to your specific audience?
The question: "Can I see a 5-minute clip of your lead trainer delivering a manager-facing session?"
Pass pattern: Yes, on request, with a recent clip. Trainers who lose your managers in the first five minutes lose the rest of the curriculum too. Voice match is harder to evaluate than the structural criteria above, but it's predictive of completion rates and engagement scores. If a vendor refuses to share a sample clip, treat that as a red flag.
Questions to ask in every pitch
Five questions that are diagnostic regardless of vendor type. They surface the difference between vendors who have thought about your situation and vendors who deliver the same pitch to everyone.
"What does your weakest engagement look like, and what happened?" Honest vendors have weak engagements. They can describe them. The weakest engagement description tells you more about the vendor than the strongest one does.
"Who would you describe as your toughest current customer, and what makes them tough?" Vendors describing customer difficulty without specifics are dodging. Vendors describing it with specifics are demonstrating the operational awareness you want in a partner.
"What's the most important question I haven't asked you yet?" The vendor's answer to this question is signal-dense. A serious answer reveals what the vendor thinks differentiates them. A non-answer reveals that they don't know.
"What happens if our pilot cohort produces worse results than expected at 90 days?" Vendors should have a remediation playbook. If the answer is "we'd extend the engagement," that's the right shape. If the answer is "that doesn't happen," they're signaling they don't measure outcomes seriously.
"If we cancel the program after 6 months, what do we keep?" Strong vendors leave behind something durable: trained internal trainers, documented process, manager-readiness rubrics. Weak vendors leave behind a content library that nobody opens after the engagement ends.
Vendor scorecard
A scoring template for the seven criteria. Copy this structure, set your weightings before evaluations begin, and use it to compare vendors after each pitch.
| Criterion | Weight (1–5) | Vendor A score (1–5) | Vendor B score (1–5) | Notes |
|---|---|---|---|---|
| 1. Lead-trainer track record | — | — | — | Specific number? Direct experience? |
| 2. Outcome measurement plan | — | — | — | Specific metrics at 30/90/365? Leading + lagging? |
| 3. Sourcing model honesty | — | — | — | Named the trade-off? |
| 4. Honest scope | — | — | — | Specific scenario where they're not the fit? |
| 5. Post-training reinforcement | — | — | — | Explicit handoff plan? Or content library? |
| 6. Compliance grounding | — | — | — | Cites ADA / EEOC / JAN correctly? |
| 7. Voice match | — | — | — | Sample clip shared? Audience credibility? |
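The weighted comparison the scorecard supports can be sketched in a few lines. The weights and scores below are hypothetical placeholders — set your own weightings before evaluations begin, as the scorecard instructions say:

```python
# Hypothetical scorecard math: weighted average on the 1-5 scale.
# All weights and scores below are illustrative, not recommendations.
CRITERIA = [
    "Lead-trainer track record",
    "Outcome measurement plan",
    "Sourcing model honesty",
    "Honest scope",
    "Post-training reinforcement",
    "Compliance grounding",
    "Voice match",
]

def weighted_total(weights: dict[str, int], scores: dict[str, int]) -> float:
    """Weighted average: sum(weight * score) / sum(weights)."""
    return sum(weights[c] * scores[c] for c in CRITERIA) / sum(weights.values())

# Weight the non-negotiables up (outcome measurement, compliance, honest scope).
weights = {c: 3 for c in CRITERIA}
for dealbreaker in ("Outcome measurement plan", "Compliance grounding", "Honest scope"):
    weights[dealbreaker] = 5

vendor_a = {c: 4 for c in CRITERIA}  # solid across the board
vendor_b = dict(vendor_a, **{"Outcome measurement plan": 2, "Compliance grounding": 2})

print(f"Vendor A: {weighted_total(weights, vendor_a):.2f}")
print(f"Vendor B: {weighted_total(weights, vendor_b):.2f}")
```

The point of computing the comparison this way, rather than eyeballing the table, is that a vendor who fails a heavily weighted criterion drops visibly even when they score well everywhere else.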
Red flags
Five patterns that should disqualify a vendor regardless of how strong other parts of the pitch are.
- Refusal to share trainer credentials. If the vendor's lead trainer's background isn't on the website, isn't shared on request, or comes back as "our team includes industry experts" without specifics, treat the trainer as unverified. The training is only as good as who's delivering it.
- Measurement claims with no baseline. "We improve retention by X%" is meaningless without "compared to what." Strong vendors describe the baseline. Weak vendors quote percentages.
- "We work with everyone" answer to the honest-scope question. Besides being a sales tic, the answer is dishonest. No vendor serves every customer well. The vendor who can't name their wrong-fit customer is the wrong fit for you.
- Curriculum that doesn't cite primary legal sources. Training that doesn't reference ADA.gov, EEOC, and JAN is teaching managers to handle real-world situations on vibes. The legal floor is non-negotiable.
- Post-training plan that's a content library. "We have a learning portal you can access after the training" is not reinforcement. It's an artifact your managers won't open. The reinforcement plan should be a process, not a database.
Frequently asked questions
How long should vendor evaluation take?
Four to eight weeks from RFP to selection for enterprise programs; two to three weeks for mid-market. Compressing the timeline below four weeks for enterprise typically produces a selection that's based on pitch quality rather than program quality — and the gap between those two predictors is where most regrettable vendor selections happen.
Should we evaluate vendors with a paid pilot or just a pitch?
A paid pilot is the better evaluation — but only after you've scored the pitch on the seven criteria above. Running a paid pilot with a vendor who would have scored badly on the criteria is expensive learning. Use the pitch evaluation to narrow to two or three candidates, then run a paid pilot only with the finalist or two who pass the structural criteria.
How do we compare vendors when their published prices and packages don't match up?
Normalize to per-manager-trained cost over a 3-year horizon, with the staffing-margin and per-placement components broken out separately. Two vendors that look similar on year-1 cost can produce very different 3-year cost curves depending on whether the per-hire margin recurs. The TCO framework on the ROI page covers this normalization in detail.
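The normalization described above can be sketched with placeholder numbers. Every figure here is hypothetical — the fees, hire counts, and manager counts are assumptions chosen only to show why the per-hire margin recurring changes the 3-year curve:

```python
# Illustrative 3-year TCO normalization; all dollar figures are hypothetical.
def three_year_cost(yearly_fees: list[float], per_placement_fee: float,
                    hires_per_year: int) -> float:
    """Total program cost over the listed years, with the per-placement
    (staffing-margin) component broken out as a separate term."""
    return sum(yearly_fees) + per_placement_fee * hires_per_year * len(yearly_fees)

MANAGERS_TRAINED = 100
HIRES_PER_YEAR = 10

# Training-led: heavy year-1 build, cheap refresh years, no placement margin.
training_led = three_year_cost([120_000, 20_000, 20_000], 0, HIRES_PER_YEAR)
# Staffing-led: lower flat fee, but the per-hire margin recurs every year.
staffing_led = three_year_cost([30_000, 30_000, 30_000], 9_000, HIRES_PER_YEAR)

# Both look like roughly $120k in year 1; the 3-year per-manager cost diverges.
print(f"Training-led: ${training_led / MANAGERS_TRAINED:,.0f} per manager trained")
print(f"Staffing-led: ${staffing_led / MANAGERS_TRAINED:,.0f} per manager trained")
```

With these assumed numbers, the two vendors are indistinguishable on year-1 spend; only the 3-year view, with the recurring placement margin broken out, separates them.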
What if every vendor we evaluate fails one of the criteria?
Weight the criteria. Some are dealbreakers — outcome measurement plan, compliance grounding, and honest scope are non-negotiable in our view. Others are trade-offs: voice match and lead-trainer track record can sometimes be addressed by pairing with internal SMEs. Have your weighting set before the evaluations begin; setting it after risks rationalizing the candidate you already preferred.
Should we share the scorecard with the vendors?
Yes for the criteria, not for the weightings. Sharing the seven criteria with vendors before their pitch produces better pitches — they prepare specifically for the questions you care about, and the gap between vendors widens on the criteria that matter most. Sharing your weighting risks vendors gaming the criteria they know you weight highest at the expense of the ones they know you don't.
External sources we cite and trust
Primary sources for the legal and structural grounding the seven criteria reference.
- ADA.gov — Employment — Americans with Disabilities Act, the interactive process, and employer obligations.
- EEOC — Disability Discrimination — Equal Employment Opportunity Commission guidance on hiring, accommodation, and complaint procedures.
- Job Accommodation Network (JAN) — accommodation database and employer-side guidance.
- Harvard Business Review — Neurodiversity as a Competitive Advantage — Austin and Pisano's foundational piece on the workplace case.
For the ROI framework that contextualizes the measurement criterion, see the ROI of neurodiversity training. For the interview-redesign side of the program, see the inclusive interviewing guide. For the upstream procurement question this checklist assumes you've answered — whether to buy from any vendor or build internally — see the three honest questions before build vs. buy.