“Hello, it’s me. I was wondering if after all these years you’d like to meet.” — Adele
That line was everywhere in 2016. So was the problem I was staring at inside Ticketmaster.
The live event industry was running hot. Our tech infrastructure and our workforce were struggling to keep up. Two camps had formed. The first: engineers with battle scars from prior outages who cared deeply about protecting the core systems that kept everything working. The second: people who saw the industry accelerating and knew we had to move faster to stay relevant. Both camps were right. They almost never worked together.
Major events were producing website crashes, long queues, and rapid sell-outs. Consumer frustration was public and loud. Animosity was building internally. Blame moved in both directions. We trained both groups separately, using different curricula and different consultants. Training costs were astronomical. The needle moved a little.
Then we changed something. We introduced an internal program focused on community and culture first, knowledge second. A week-long curriculum, with one-hour courses taught by our own in-house experts. We put people from both camps in the same room. When you graduated, you earned access to work in both worlds.
What happened wasn’t just better knowledge transfer. It was empathy. People from the protect-the-core camp started to understand the pressure on the other side. The move-fast camp started to understand why people who had lived through prior failures were so careful. The two groups had been solving the same problem from opposite ends of a wall that years of separate training had never knocked down. Putting them in a room together knocked it down.
We didn’t solve it with better content. We solved it by changing who was in the room.
Here’s what I keep coming back to after that experience:
You can have exactly the right information, delivered clearly, by a credible source, to people who signed up voluntarily, and still change essentially nothing.
This week, I found a data point that puts a number on that problem. And it should make anyone building online courses genuinely uncomfortable.
The Number Nobody Wants to Talk About
The average online course has a completion rate of 10 to 15 percent.
One in ten students who enroll actually finishes. Maybe two.
The industry has spent years treating this as a motivation problem. People are busy. Attention spans are short. If they wanted it badly enough, they’d finish. And so course creators respond the way you’d expect: they make better videos, tighten up the structure, add bonus modules, and rewrite the welcome sequence. The content gets more polished. The completion rate stays flat.
Because this isn’t a content problem.
It’s a culture problem.
What the Data Actually Says
When courses are built around a structured community (real accountability loops, cohort windows, peer engagement, shared progress), completion rates don’t improve by a few points. They transform.
- Courses with integrated community: 70 percent completion or higher.
- Cohort-based formats with peer accountability: 85 to 90 percent.
Even peer engagement features alone, with no other changes, triple completion rates from around 15 percent to 45 percent or more.
That’s not a content upgrade. That’s a different product.
And here’s the business case alongside the education case: courses bundled with community access generate 4.5 times more revenue than standalone courses. The culture that works for completion also works for the business.
Why Community Changes the Math
In a solo-learner course, quitting is invisible. You stop showing up and no one notices. There’s no cost to stopping. The rational choice, when Tuesday night arrives and something else is competing for your attention, is usually to defer. Deferral becomes abandonment.
Add community, even a small structured group of strangers, and the math changes. Your absence is noticed. You told someone last week what you were working on. There’s a check-in tomorrow. The cost of quitting is no longer zero.
That small social pressure, operating at scale across a cohort, is what moves completion rates from 12 percent to 80 percent. Not inspiration. Not better video production. Accountability culture.
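To make the gap concrete, here is a quick back-of-the-envelope sketch using the rates cited in this piece. The 1,000-student enrollment figure is hypothetical, purely for illustration:

```python
# Completers per 1,000 enrollees under the completion rates cited above.
# The enrollment number is illustrative, not from any real course.
ENROLLED = 1_000

completion_rates = {
    "self-paced solo course": 0.12,            # 10-15% industry average
    "community-integrated course": 0.70,       # 70%+ with structured community
    "cohort with peer accountability": 0.875,  # 85-90% cohort formats
}

for course_format, rate in completion_rates.items():
    finishers = int(ENROLLED * rate)
    print(f"{course_format}: {finishers} of {ENROLLED} finish")
```

Same content, same instructor, same students: the solo format graduates about 120 people, the cohort format graduates roughly 875. That delta is structural, not motivational.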
The practical elements that work:
✓ Fixed cohort windows: Start and end on specific dates. Evergreen enrollment kills momentum. When everyone starts together and moves through together, peer norms create sustained motion.
✓ Weekly synchronous touchpoints: Even 20 minutes of live time per week dramatically reduces dropout. The scheduled moment creates a commitment point that async content never does.
✓ Peer accountability pairs: More effective than instructor encouragement. People will do more for a peer they don’t want to let down than for an authority figure they’ve never met in person.
✓ Visible shared progress: Not just individual dashboards. Progress the whole cohort can see shifts norms about what “normal” participation looks like.
None of this requires expensive technology. It’s mostly structural.
This Week’s Finds
Story 1
The corporate training world ran your experiment already
TalentLMS 2026 Workplace Learning Report ↗
The TalentLMS 2026 Workplace Learning Report tracked completion rates across enterprise L&D programs by delivery format. Self-paced e-learning: 3%. Cohort-based programs with fixed windows, peer accountability, and structured check-ins: 90%. These are not consumer course numbers. These are corporate training programs with mandatory participation, HR tracking, and budget accountability behind them. And the self-paced format still failed at a 97% rate. The format, not the content, is doing the work.
Story 2
The tech stack became the obstacle
Newzenler Creator Tech Stack Report 2026 ↗
The Great Consolidation is reshaping the creator tool landscape. Over 80 mergers and acquisitions hit the course and community platform space in 2025 alone, as platforms raced to bundle what used to require 10+ separate tools. The pattern is repeating across the industry: creators who started with best-of-breed stacks are abandoning them for integrated platforms. Not because the individual tools got worse, but because the coordination overhead was quietly eating their margin and momentum. The stack was the bottleneck.
Story 3
Reddit is now #2 in Google search visibility. Your community content should be there.
ATAK Interactive ↗
Reddit ranks behind only Wikipedia in Google’s overall search visibility — and it’s not a fluke. Google’s Perspectives filter, which surfaces real community discussions in search results, now pulls heavily from Reddit threads. Posts with 50+ upvotes in active subreddits consistently appear on page one for high-intent queries. A community that generates real discussion isn’t just retention infrastructure. It’s organic SEO infrastructure.
Story 4
Facebook is charging more to reach people who care less
Buffer State of Social Media Engagement 2026 ↗
Buffer’s 2026 State of Social report found Facebook engagement has dropped roughly 36 percent over two years. Average engagement for business pages is around 0.15 percent of followers. The shift: the burden of performance in paid social has moved almost entirely to creative quality. Targeting advantages are mostly gone. What wins now is a system for generating and testing creative at speed.
Story 5
The AI productivity number everyone keeps quoting is wrong
MartechView ↗
“AI increases productivity by 44 percent” has become the standard claim in every vendor deck and boardroom conversation. The actual number from Duke University’s CMO Survey is 8.6 percent. That’s still a meaningful improvement. But 8.6 percent and 44 percent describe very different expectations. The gap between those numbers is where a lot of wasted budget lives right now.
The Problem with Volunteering
A Harvard Business School study found that new hires with mentors outperformed those without and stayed longer. But when participation is voluntary, the people who would benefit most are the least likely to sign up.
The same pattern shows up in online courses, coaching programs, and committees: anywhere a valuable resource exists but self-selection controls access. If you’re building any kind of learning program and relying on motivated self-selection to fill it, that’s a design flaw, not a participant failure.
Everyone in online education is trying to solve for better content when the actual bottleneck is better culture. The platforms that sell “community features” as an upsell to their course product have it backwards. Community isn’t the premium add-on. It’s the foundation. The content is what happens inside the community.
IMG’s position: we’re building systems designed to change behavior, not libraries designed to deliver information. The completion rate data isn’t an indictment of student motivation. It’s a design scorecard. If your students aren’t finishing, your system isn’t finished.