Activation rate experiments work when they cut friction or surface value before the user has to commit. This article catalogues 12 experiments run at real B2B SaaS companies, with the hypothesis, the change shipped, the disclosed lift number, and the public source for each. The companies: Canva, Routific, Appcues, Calendly, MYOB, Dropbox, Sked Social, Attention Insight, Loom, Slack, Notion, and MedNet. Lifts range from +10% to 2.5x. Use these as templates for your own backlog, not as guarantees -- activation is product- and persona-specific.
What experiments improve SaaS activation rate?
The highest-leverage activation experiments fall into five patterns: personalize onboarding by intent or persona, strip the signup form to the minimum, replace empty states with templates or sample data, add an onboarding checklist with a progress bar, and defer the paywall or signup until after the aha moment. Across 12 documented case studies below, lifts range from +10% (Canva intent-based onboarding) to 2.5x (Appcues single-CTA welcome) to a 3x trial-to-paid lift (Sked Social checklist).
The average B2B SaaS activation rate is 34-37% per Userpilot's 2024 benchmark report, with top-quartile products above 50%. A 25% relative improvement in activation correlates with about a 34% lift in MRR over 12 months per ProductLed. That's why activation is the highest-leverage step in the funnel: every dollar of paid acquisition compounds against your activation rate.
Before you copy any of these, run the experiment against your actual baseline. Personalization that works for Canva's prosumer audience may not work for an enterprise security product. The pattern is replicable; the lift is not.
1. Canva: Intent-based onboarding interstitial (+10% activation)
Hypothesis: Personalizing onboarding by what users intend to make will route them to relevant templates and lift activation.
Change shipped: Canva growth manager Xingyi Ho added an interstitial in the poster onboarding flow asking users to pick from six poster types (Event, Retail, Music, Fundraising, Holiday, Advertising). Each selection routed to a different first-run experience with thematically appropriate images and templates.
Lift: +10% activation on the posters product, translating to tens of thousands of additional activated users per month and hundreds of thousands of dollars in annual revenue per Appcues' Canva case study.
Why it worked: Canva serves dozens of use cases, and a generic poster onboarding under-performed for every persona. Routing by intent lets each user see the version of Canva that solves their problem.
After shipping, the team rolled the same intent-routing pattern across Canva for Work and the homepage onboarding.
2. Routific: Embedded product tour for a complex workflow (+20% lift, then 70%+ activation)
Hypothesis: A guided product tour will collapse the 75-minute time-to-value for a route-optimization product and lift activation.
Change shipped: Routific built an in-app product tour that walked new users through importing stops, configuring vehicles, and generating their first optimized route. The tour borrowed visual conventions from Google Maps so users could pattern-match.
Lift: An immediate 20% lift over historical activation averages, then sustained iteration pushed activation past 70% with average time-to-activate down 60% per ProductLed's Routific case study.
Why it worked: Routific's product is genuinely complex (multi-stop logistics is hard) and the empty workspace was intimidating. The tour gave users a structured path through their first successful run, which is the activation moment for the product.
The lesson: complex products with high TTV benefit from explicit guidance. Simple products often don't.
3. Appcues: Single-CTA welcome modal (2.5x activation in 15 minutes)
Hypothesis: The biggest dropoff is between welcome completion and creating a flow, and choice paralysis is causing it.
Change shipped: Appcues replaced their multi-option post-welcome screen with a single redirect into the flow builder. Total design effort: 15 minutes. Baseline activation: 12.6%.
Lift: 2.5x activation, lifting baseline from 12.6% to roughly 31% per Appcues' own case study.
Why it worked: Multi-option welcome screens force users to decide what to do next before they understand what the product is. A single CTA removes that decision and pulls users into the action that creates value (building a flow).
This is the cheapest experiment on the list -- 15 minutes of design and a deploy -- and the highest-multiple lift. Run this one first if your activation funnel has a multi-option welcome.
4. Calendly: Goal-capture survey + personalized lifecycle emails (+16% activation)
Hypothesis: Lifecycle emails tied to a user's stated goal at signup will out-perform a generic drip.
Change shipped: Calendly deployed a Sprig in-app survey at signup capturing each user's immediate goal (e.g. one-on-one client meetings, team interviews, sales calls). They then triggered branched email sequences based on the answer.
Lift: Highly personalized emails increased user activation rates by over 16% per Calendly's engineering blog.
Why it worked: Emails tied to a goal the user just stated are relevant by construction, so they out-pull a generic drip. A 16% lift on millions of monthly signups is also a large absolute number. Calendly runs 18-20 concurrent A/B tests at any time, and goal-driven personalization is one of their most replicated patterns.
The survey itself gathered thousands of qualitative responses in under six hours, which also informed the product roadmap, not just the email program.
5. MYOB: Persona-branched onboarding (+21% activation)
Hypothesis: Sole traders and accountants need different first-run experiences from the same accounting product.
Change shipped: MYOB used Appcues to branch onboarding flows by self-identified user role. Sole traders saw an invoicing-first path; accountants saw a client-list import path.
Lift: +21% new-user activation per Appcues' MYOB customer story.
Why it worked: Accounting software has wildly different first-job-to-be-done by persona. A sole trader wants to invoice their first client. An accountant wants to import their existing client base. Forcing both through the same flow under-serves both.
The pattern generalizes: any SaaS product where the first job differs by role should branch onboarding by role. The cost is one persona-selection screen and N onboarding paths.
6. Dropbox: Strip the signup form to the minimum (3-5% per field removed)
Hypothesis: Every required field costs conversion, and the marginal cost compounds.
Change shipped: Dropbox iteratively tested signup form length and quantified that every additional required field cost roughly 3-5% in conversion. They eventually shipped a two-field signup (email + password) and used OAuth for Google sign-in.
Lift: A two-field approach increased their conversion rate by over 40% versus the longer baseline, per the case study summarized in Userpilot's onboarding research. Their explainer video on the homepage added another 10% on signups.
Why it worked: Cognitive load is a conversion variable. Each form field is a decision and a typing tax. Defer everything that isn't required for first-run (full name, company, role, phone) into in-app profile completion after the aha moment.
This applies to every SaaS signup form. Audit yours and ask whether each field actually needs to be there before activation.
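To see why the per-field cost compounds rather than adds, model each extra required field as a multiplicative hit on conversion. The numbers below are illustrative (a hypothetical 40% two-field baseline and a 4% per-field cost, the midpoint of Dropbox's disclosed 3-5% range), not figures from the case study:

```python
# Illustrative: per-field conversion cost compounds multiplicatively.
base_conversion = 0.40   # hypothetical conversion with a two-field form
per_field_cost = 0.04    # midpoint of the ~3-5% per-field range

for extra_fields in range(0, 7):
    conv = base_conversion * (1 - per_field_cost) ** extra_fields
    print(f"{extra_fields} extra fields: {conv:.1%} conversion")
```

Six extra fields drops a 40% baseline to roughly 31%, which is why auditing every field before activation is worth the meeting.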
7. Sked Social: Onboarding checklist with progress bar (3x trial-to-paid conversion)
Hypothesis: An endowed-progress checklist will pull users to first value and increase paid conversion.
Change shipped: Sked Social added an onboarding checklist with a progress bar showing endowed progress (the first step pre-checked). Each step pulled the user toward the activation moment of scheduling their first social post.
Lift: Trial-to-paid conversion tripled per ProductLed's analysis.
Why it worked: The endowed-progress effect is a documented behavioral pattern (Nunes & Drèze, 2006): people are more motivated to complete a task when progress is visible and they're already partway through. The pre-checked first step exploits this.
Replicate by listing the 4-7 actions that lead to your activation moment, showing them as a checklist with a progress bar, and pre-checking 'Sign up' so users land at 1/N rather than 0/N.
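The endowed-progress logic is simple enough to sketch. This is a minimal, hypothetical state model (the class name, step labels, and methods are assumptions, not Sked Social's implementation): the first step is pre-checked at construction, so the user lands at 1/N.

```python
from dataclasses import dataclass, field

@dataclass
class OnboardingChecklist:
    """Endowed-progress checklist: the first step ships pre-completed,
    so new users see 1/N progress instead of 0/N (Nunes & Dreze, 2006)."""
    steps: list                              # ordered steps toward activation
    completed: set = field(default_factory=set)

    def __post_init__(self):
        # Endowed progress: pre-check the step the user already did (signing up).
        self.completed.add(self.steps[0])

    def complete(self, step: str) -> None:
        if step in self.steps:
            self.completed.add(step)

    def progress(self) -> str:
        return f"{len(self.completed)}/{len(self.steps)}"

checklist = OnboardingChecklist(
    steps=["Sign up", "Connect account", "Create first post", "Schedule post"]
)
print(checklist.progress())      # lands at 1/4, not 0/4
checklist.complete("Connect account")
print(checklist.progress())      # 2/4
```

The only behavioral trick is in `__post_init__`: everything else is an ordinary progress bar.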
8. Attention Insight: Interactive walkthrough + checklist (+47% feature use, +83% engagement)
Hypothesis: Walkthroughs beat empty states for technical first-runs that require uploading user content.
Change shipped: Attention Insight implemented an interactive walkthrough alongside an onboarding checklist for their AI heatmap product. The walkthrough showed users how to upload a design and generate their first heatmap.
Lift: 47% relative increase in users creating heatmap analyses and 83% increase in engagement with the 'Areas of Interest' feature per Userpilot's case study.
Why it worked: Heatmap generation requires uploading a design file, which is a non-trivial first action. Without guidance, users either uploaded the wrong file type or didn't upload at all. The walkthrough gave them a clear path through their first successful generation.
Apply this pattern to any product where the activation moment requires the user to bring their own data (uploads, integrations, imports).
9. Loom: Record before signup wall (5 free recordings before friction)
Hypothesis: Forcing signup before the aha moment kills demand for a try-before-buy product.
Change shipped: Loom let unauthenticated guests record up to five videos before the signup wall. The recording itself is the aha moment, so users hit value before any friction.
Lift: Specific lift numbers aren't public, but the pattern is documented across Loom's developer docs and Atlassian's product strategy. Users who have already experienced value are up to 5x more likely to convert per paywall research summarized by Adapty.
Why it worked: Loom creates a viral artifact (the video). Each unauthenticated recording is shared with a recipient who lands on Loom and becomes a candidate signup. The product itself drives acquisition, so removing the signup wall amplifies the loop.
This only works for products with viral artifacts (Calendly links, Figma files, Loom videos). For products without a shareable output, signup-first is usually correct.
10. Slack: Workspace setup before team invite (documented PLG pattern)
Hypothesis: Asking users to set up the workspace before inviting teammates builds personal commitment first.
Change shipped: Slack's signup flow asks for the team name first, then 'What will you mainly use Slack for?', then walks the user through a default channel setup before prompting for invites. The use-case question routes onboarding to relevant suggested integrations and channels.
Lift: Slack hasn't published the specific A/B numbers, but the pattern is documented across multiple onboarding teardowns including Userpilot and UserGuiding. The activation moment for Slack is sending the first message in a channel, and users who set up the workspace personally are more likely to invite the right teammates.
Why it worked: Multi-player products activate at the team level, but the individual signup must commit personally before they ask others to join. Slack's order (workspace -> use case -> channels -> invite) builds escalating commitment.
If your product is multi-player, audit whether you ask for invites before the user has personally invested in setup.
11. Notion: Pre-built templates instead of empty workspace (documented PLG pattern)
Hypothesis: Empty docs scare new users, and templates show them what good looks like.
Change shipped: Notion replaced its blank-workspace default with a template gallery: 50+ pre-built templates for meeting notes, project plans, team wikis, and personal pages. New users land inside something that already shows the product's pattern.
Lift: Notion uses Statsig to run hundreds of experiments per quarter, including iterations on onboarding question order. Specific template-gallery lift isn't public, but Nielsen Norman Group research cited by Eleken shows 30-45% improvement in task completion when empty states include contextual guidance and examples.
Why it worked: Notion's 'blank canvas' is its strength and its weakness. New users couldn't visualize what to do with infinite flexibility. Templates compress that decision into 'pick one of these and modify'.
Replace your product's empty states with templates or sample data. If sample data, label it clearly so users know to replace it.
12. MedNet: Simplified setup instructions (+40% activation)
Hypothesis: Setup instructions are too dense to complete in one session, and pacing them will lift activation.
Change shipped: MedNet rewrote its setup instructions for clarity and paced the steps across multiple sessions instead of cramming them into one.
Lift: Activation increased by 40% while average session duration dropped from 8 minutes to 3 minutes per case study summary in SaaSFactor's onboarding research.
Why it worked: Counter-intuitively, shorter sessions can mean higher activation. If your setup is dense and users are bouncing before completing, breaking it into shorter sessions with email reminders to return is often better than asking for 30 minutes of attention upfront.
The lesson: time-on-task isn't always a positive signal. If users are spending 8 minutes on setup but not activating, the setup is too long, not the engagement too low.
How do you pick which activation experiment to run first?
Run the cheapest experiment that targets your biggest funnel dropoff first. The five-step process:
- Instrument the funnel. Use Mixpanel, Amplitude, or PostHog to chart signup -> profile complete -> first action -> activation moment -> retention. Find the largest absolute dropoff.
- Generate hypotheses against that step. If the dropoff is signup-to-first-action, the candidates are: empty state, missing guidance, wrong default destination, choice paralysis, friction in the first action.
- Estimate effort and expected lift per hypothesis. A single-CTA redirect is 15 minutes of design (Appcues). A persona-branched onboarding is 2-4 weeks of build (MYOB).
- Run the cheapest, highest-expected-lift experiment first. Sample-size requirements depend on your signup volume and the effect you can detect.
- Ship the winner, archive the loser, write up both. Even losing experiments compound your team's knowledge.
Most teams over-build experiments. Of the 12 above, four shipped in under a week (Appcues 15 min, Dropbox field removal, Sked checklist, MedNet copy edits). The cheap ones are usually the right ones.
What metrics should you track for activation experiments?
Track three layers of metrics: the activation event, the proxy behaviors that lead to it, and the downstream retention signal.
- Activation event: a single binary -- did the user reach the moment that correlates with retention? For Slack it's sending the first message. For Loom it's recording the first video. For Calendly it's scheduling the first meeting.
- Proxy behaviors: the 3-5 actions on the path to activation. For each: completion rate, time to complete, dropoff to next step.
- Retention signal: do activated users from the variant retain better than activated users from the control? A variant that lifts activation but tanks retention isn't a win.
Guardrail metrics matter. A welcome flow that lifts activation by 30% but cuts retention by 10 percentage points is a loss. Mixpanel and Statsig both let you set guardrails on the experiment so you don't ship a regression.
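The three-layer readout reduces to a small decision function. This is a hedged sketch, not Mixpanel's or Statsig's API: the dict shape, thresholds, and counts are all hypothetical, and a real readout would also check statistical significance before shipping.

```python
# Hypothetical experiment readout: activation lift vs. a retention guardrail.
def evaluate(control: dict, variant: dict,
             min_activation_lift: float = 0.0,
             max_retention_drop_pp: float = 2.0) -> str:
    """control/variant hold 'signups', 'activated', 'retained' counts."""
    act_c = control["activated"] / control["signups"]
    act_v = variant["activated"] / variant["signups"]
    # Retention is conditioned on activation: do activated users stick around?
    ret_c = control["retained"] / control["activated"]
    ret_v = variant["retained"] / variant["activated"]

    lift = (act_v - act_c) / act_c              # relative activation lift
    retention_drop_pp = (ret_c - ret_v) * 100   # guardrail, in percentage points

    if retention_drop_pp > max_retention_drop_pp:
        return "guardrail breach: do not ship"
    if lift > min_activation_lift:
        return "ship"
    return "no effect: archive"

control = {"signups": 5000, "activated": 630, "retained": 410}
variant = {"signups": 5000, "activated": 820, "retained": 525}
print(evaluate(control, variant))
```

With these illustrative numbers the variant lifts activation about 30% while retention of activated users drops only about one point, so it clears the guardrail; swap the retained count down and the same lift gets vetoed.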
For a deeper definition of activation versus aha moment versus first value, see our growth metrics FAQ cheatsheet.
| Company | Hypothesis | Change | Lift |
|---|---|---|---|
| Canva | Personalizing onboarding by intent will surface relevant templates faster | Added a 'What are you making?' interstitial routing to themed templates | +10% poster activation |
| Routific | A guided tour will collapse 75-min time-to-value for a complex workflow | Built an embedded product tour leveraging Google Maps mental models | 20% lift, then 70%+ activation |
| Appcues | Choice paralysis after the welcome modal is killing activation | Replaced the multi-option screen with a single CTA into the flow builder | 2.5x activation (12.6% to ~31%) |
| Calendly | Onboarding emails tied to stated user goals will out-perform generic drips | Triggered personalized lifecycle emails from a goal-capture survey | +16% activation |
| MYOB | Different personas need different first-run paths | Branched onboarding by user role (sole trader vs. accountant) | +21% new-user activation |
| Dropbox | Every required signup field costs conversion | Stripped signup down to email + password (no name, company, etc.) | 3-5% per field removed; +40% with two-field form |
| Sked Social | An endowed-progress checklist will pull users to first value | Added an onboarding checklist with progress bar | 3x trial-to-paid conversion |
| Attention Insight | Walkthroughs beat empty states for technical first-runs | Interactive walkthrough + onboarding checklist | +47% heatmap creation, +83% feature engagement |
| Loom | Forcing signup before the aha moment kills demand | Allowed up to 5 recordings before the signup wall | Documented PLG pattern (Atlassian) |
| Slack | Setting up the workspace before inviting builds commitment | Workspace name + use-case Q before any team invite | Documented PLG pattern |
| Notion | Empty docs scare new users; templates show 'what good looks like' | Replaced blank workspace with 50+ pre-built templates | Documented PLG pattern |
| MedNet | Setup instructions are too dense to complete in one session | Simplified copy + paced setup over multiple sessions | +40% activation, session time 8 min to 3 min |