Most organizations treat L&D as a production function — someone hands you a topic, you build a course, you ship it. I build something different: development infrastructure that connects learning to performance, embeds into the flow of work, and gives leaders the visibility to actually manage capability. Here's how that happens.
"L&D done right isn't a department — it's an organizational capability. The goal isn't training completion. It's a workforce that performs differently on Monday morning than it did on Friday."
— The question behind every program I build

Most learning problems aren't learning problems. Before building anything, I run an organizational diagnostic — mapping performance gaps, root causes, and whether development is actually the right lever. Training that doesn't address the real constraint is expensive noise.
Programs that run parallel to work don't stick. I design development into team rhythms — embedded in onboarding, manager 1:1s, sprint cadences, and performance conversations — so learning isn't a separate event but a continuous byproduct of getting work done.
Completion rates are a reporting metric, not a performance metric. I build programs with KPIs tied to business outcomes from the start — time-to-contribution, error reduction, retention, revenue impact — and design the measurement architecture alongside the content.
Sustainable capability doesn't live in the L&D function — it lives in managers. I invest heavily in building manager capability as a development multiplier: coaching skills, feedback frameworks, and development conversations that scale impact beyond any program I could build alone.
Whether it's a 30-day onboarding program or an enterprise-wide capability transformation, the same underlying discipline applies. Scope and speed vary — the rigor doesn't.
Stakeholder interviews, performance data review, and gap analysis before any design begins. I'm looking for the delta between current and required capability — and whether development is the right intervention or if the root cause is process, tooling, or management. The output is a capability map and a clear program brief.
Program structure, modality mix, sequencing, and KPI framework — all aligned to business outcomes and built to scale. I bring a formal roadmap to senior leadership before a single piece of content is produced: here's the strategy, the investment, the timeline, and what success looks like at 30, 90, and 180 days.
Content development, LMS configuration, facilitation design, and manager enablement — built in parallel, not sequentially. I use AI-enhanced production workflows to compress timelines without sacrificing quality, and I design for the real operational environment, not an idealized one.
Launch is not the finish line. I instrument programs with learning analytics, performance data integrations, and regular stakeholder reviews. Every program generates a feedback loop that informs the next iteration — and I build that iteration cadence into the program contract from day one.
The scope of work spans strategy through execution — which means I can own a full capability transformation or plug into a specific need at any layer.
Program Leadership
Organizational Development
Design & Development
AI & Innovation
Every sector has its own constraints, politics, and stakes. I've operated across enough of them to adapt fast and skip the learning curve that slows other leaders down.
HHS, VA, Army — compliance, AI adoption, IACET standards, zero-fail environments
Twitch / Amazon — engineering education, DevOps onboarding, 500-person distributed org
Terminix / Frontdoor — 10k+ employee LMS transformation, frontline capability at scale
Ally Financial, USAA — regulated content, rigorous compliance, banking-specific LMS
Vyond — global certification programs, product proficiency, user education at scale
ELB Learning — bespoke enterprise development, award-winning simulations, P&G, Coca-Cola
Post-acquisition capability alignment, culture integration, rapid onboarding at scale
SGT E-5, 31S SATCOM — two overseas tours, TS/SCI, zero-defect operational training
Philosophy and process only mean something when they produce results. The case studies show what this approach looks like when applied to real organizations with real constraints — and what changed because of it.