T352900: Thank You page experiment: Final analysis of English Wikipedia experiment

Closed, Resolved · Public

Description

User story:

As an English Wikimedian, I want to understand the impact of the Thank You page experiment on English Wikipedia, so that I can decide whether we should continue this experiment.

Documentation:

https://www.mediawiki.org/wiki/Extension:GrowthExperiments/Technical_documentation/Campaigns/Creation_of_customized_landing_pages

Background:

Previous similar task: T331495: Quick Analysis: Thank You Pages: custom account creation pages for sv, it, ja, fr, nl
Previous research: Newcomer Experience Pilot Project- Thank You Pages and Thank You Banners
Previous analysis of this campaign: T352116: Thank You page experiment: First week analysis of English Wikipedia experiment

Acceptance Criteria:

Review accounts created with the campaign parameters typage-6C-en-2023 and typage-6C-IAD-2023

Provide the following metrics:

  • Page views
  • Unique visitors
  • Registrations
  • Registration %
  • Activations
  • Activation %

In this case we're analyzing the effects of these campaigns for different time periods.

  • For the typage-6C-en-2023 campaign, we're analyzing the entire period from deployment on 2023-11-28 until its end on 2024-01-01. Because this is a longer campaign, we use data for whole days rather than restricting it by timestamp, which makes the data gathering process simpler and more efficient.
  • For the typage-6C-IAD-2023 campaign, we're analyzing it from deployment on 2023-12-23 12:00 until its end on 2024-01-01 12:00 (a rough sketch of both analysis windows follows this list).
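As a rough illustration of how these two windows might be applied when pulling the data, here is a minimal Python sketch. The DataFrame shape, the `campaign` and `dt` column names, and the handling of the whole-day boundaries (e.g. whether 2024-01-01 itself is included) are assumptions for illustration, not the actual data pipeline:

```python
import pandas as pd

# Hypothetical analysis windows for the two campaigns (dates from the task description).
# typage-6C-en-2023 is restricted to whole days; typage-6C-IAD-2023 to exact timestamps.
# Treating the end as exclusive is an assumption made for this sketch.
WINDOWS = {
    "typage-6C-en-2023": ("2023-11-28 00:00", "2024-01-01 00:00"),
    "typage-6C-IAD-2023": ("2023-12-23 12:00", "2024-01-01 12:00"),
}

def filter_campaign(events: pd.DataFrame, campaign: str) -> pd.DataFrame:
    """Restrict `events` to one campaign's analysis window.

    Assumes `events` has a string `campaign` column and a datetime `dt` column;
    both column names are illustrative only.
    """
    start, end = (pd.Timestamp(t) for t in WINDOWS[campaign])
    in_window = (events["dt"] >= start) & (events["dt"] < end)
    return events.loc[(events["campaign"] == campaign) & in_window]
```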

Event Timeline

KStoller-WMF moved this task from Inbox to Blocked on the Growth-Team board.

Thanks! Moving to blocked until the banner campaign is over.

I've collected data and calculated the statistics for both of the campaigns using the date/time ranges listed in the task description.

typage-6C-en-2023 Campaign

For the typage-6C-en-2023 campaign we get the following statistics (see the Definitions section below for metric definitions):

| Platform | Page views | Unique visitors | Registrations | Registration rate | Activations | Activation rate |
|---|---|---|---|---|---|---|
| Desktop | 6,045 | 5,203 | 1,912 | 36.7% | 219 | 11.5% |
| Mobile web | 7,665 | 6,802 | 2,486 | 36.5% | 222 | 8.9% |
| Totals | 13,710 | 12,005 | 4,398 | 36.6% | 441 | 10.0% |
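As a quick sanity check on this table, the rates can be reproduced with simple division. The formulas below (registration rate = registrations / unique visitors, activation rate = activations / registrations) are inferred from the reported numbers rather than stated explicitly in the task:

```python
# Figures from the typage-6C-en-2023 table above: (unique visitors, registrations, activations).
rows = {
    "Desktop": (5203, 1912, 219),
    "Mobile web": (6802, 2486, 222),
    "Totals": (12005, 4398, 441),
}

for platform, (visitors, registrations, activations) in rows.items():
    registration_rate = registrations / visitors    # e.g. 1,912 / 5,203 = 36.7% for Desktop
    activation_rate = activations / registrations   # e.g. 219 / 1,912 = 11.5% for Desktop
    print(f"{platform}: registration {registration_rate:.1%}, activation {activation_rate:.1%}")
```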

These numbers are very similar to those calculated after one week as posted in T352116#9387844. In these final statistics we have a slightly higher desktop registration rate (36.7% compared to 36.0%) and a slightly lower mobile web registration rate (36.5% versus 37.9%). These differences may reflect changes in user behaviour over the holiday period, e.g. desktop users who registered during the later stages of the campaign may have been more likely to be at home on vacation rather than at work.

I've also calculated the revert rate of the edits made by these users, as well as what proportion of their edits came through the Suggested Edits module on the Newcomer Homepage.

When it comes to the revert rate across all edits, keeping in mind that contribution amounts vary greatly between users, it is 8.9% out of <900 edits (we're not reporting specific numbers per our Data publication guidelines). The rate varies substantially by platform: on desktop it's 6.8% out of <450 edits, while on mobile web it's 11.0% out of <450 edits. Compared to the one-week statistics, these are about 2 percentage points higher across the board. That being said, these revert rates are still much lower than those of typical newcomers.

The low revert rate might be a result of the high proportion of Suggested Edits these newcomers make. Overall the proportion is 61.8% out of <900 edits. The rate is lower on desktop (53.6% out of <450 edits) than on mobile web (70.0% out of <450 edits). These proportions are very similar to what we saw one week after deployment, indicating that the rate has been stable across the campaign.
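For context on how these proportions might be produced, here is a minimal sketch. The `is_reverted` and `via_suggested_edits` column names are hypothetical, and the bucketing of edit counts into "<N" upper bounds is a rough approximation of the Data publication guidelines, not the exact rule:

```python
import pandas as pd

def edit_quality_summary(edits: pd.DataFrame, bucket: int = 50) -> dict:
    """Summarise revert rate and Suggested Edits proportion for a set of edits.

    Assumes boolean columns `is_reverted` and `via_suggested_edits` (hypothetical
    names). The raw edit count is only reported as an upper bound ("<N") rounded
    up to the next multiple of `bucket`, so exact counts are never published.
    """
    n = len(edits)
    upper_bound = (n // bucket + 1) * bucket  # strictly greater than n
    return {
        "edits": f"<{upper_bound}",
        "revert_rate": f"{edits['is_reverted'].mean():.1%}",
        "suggested_edits_rate": f"{edits['via_suggested_edits'].mean():.1%}",
    }
```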

typage-6C-IAD-2023 Campaign

The typage-6C-IAD-2023 campaign was deployed towards the end of the year, meaning this is the first set of statistics gathered for that campaign. It also uses a banner, which means that we are likely to see very different behaviour patterns. We also ran Thank You page- and banner-based campaigns in 2021/22 and observed similar differences in patterns there.

| Platform | Page views | Unique visitors | Registrations | Registration rate | Activations | Activation rate |
|---|---|---|---|---|---|---|
| Desktop | 10,344 | 8,372 | 251 | 3.0% | 43 | 17.1% |
| Mobile web | 104,010 | 96,910 | 554 | 0.6% | 73 | 13.2% |
| Totals | 114,354 | 105,282 | 805 | 0.8% | 116 | 14.4% |

The registration rate on the desktop platform (3.0%) is comparable to the overall registration rate of our previous banner-based campaign (3.7%), while the mobile web rate is a lot lower. I don't have a hypothesis for why the mobile web rate is so much lower, although in previous campaigns we've often ended up asking whether there's a mismatch in user expectations. When it comes to activation rates, the desktop rate (17.1%) is in line with the overall rate of the previous campaign (16.8%). The mobile web rate comes in somewhat lower (13.2%), but I don't think that's a substantial difference.

We've also investigated the revert rates and Suggested Edits rates for these users. As before, we're not reporting specific numbers. The revert rate for these newcomers is on par with or slightly higher than that of the average newcomer. Overall it's 28.6% out of <300 edits, and it's slightly lower on desktop (28.3% out of <100 edits) than on mobile web (28.8% out of <200 edits). This might be because these users have specific edits in mind when they visit Wikipedia, since their Suggested Edits rate is much lower than that of the Thank You page campaign (61.8% as reported above). Overall, the Suggested Edits rate is 44.4%, slightly higher on desktop (44.6%) and somewhat lower on mobile web (42.9%).

Definitions
  • Page views are gathered using the webrequest dataset, with agent_type set to user.
  • Unique visitors are calculated by hashing the concatenation of the visitor's IP address and User-Agent strings, aggregated on a daily basis (a sketch of this approach follows the list).
  • Activation means making an edit within 24 hours of registration that is subsequently not reverted within 48 hours.
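For illustration, here is a minimal sketch of the unique-visitors calculation described above. The hash function (SHA-256) and the input shape are assumptions; only the general approach of hashing the concatenated IP address and User-Agent and aggregating per day comes from the definition:

```python
import hashlib

def visitor_fingerprint(ip: str, user_agent: str) -> str:
    """Hash the concatenation of IP address and User-Agent into an opaque ID.

    SHA-256 is an assumption; the definition above only says the two strings
    are concatenated and hashed.
    """
    return hashlib.sha256((ip + user_agent).encode("utf-8")).hexdigest()

def daily_unique_visitors(requests):
    """Count distinct fingerprints per day.

    `requests` is an iterable of (date, ip, user_agent) tuples; this shape is
    illustrative, not the actual webrequest schema.
    """
    per_day = {}
    for date, ip, ua in requests:
        per_day.setdefault(date, set()).add(visitor_fingerprint(ip, ua))
    return {date: len(fingerprints) for date, fingerprints in per_day.items()}
```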

Thank you @nettrom_WMF! I've added a summary of this analysis here:
https://www.mediawiki.org/wiki/Growth/Newcomer_experience_projects#Scaling_the_new_donor_Thank_you_page_to_English_Wikipedia

Please review and feel free to add to this summary.

I believe we can consider this task resolved. Thanks!