
namespaceDupes.php doesn't have limit on write queries
Closed, Resolved, Public

Description

Currently all of s5 has been lagging because of a transaction from namespaceDupes.php that has been running for an hour (and counting). This was caused by T350431: Run maintenance scripts on Serbian projects (a namespace change that affected many pages, some of which are heavily linked).

The delete and update queries of namespaceDupes.php don't have any limits on them.

The query:

DELETE /* NamespaceDupes::checkLinkTable  */ FROM `pagelinks` WHERE (pl_from = '927158' AND pl_names

(on srwiki)

Event Timeline

Change 971332 had a related patch set uploaded (by Thcipriani; author: Thcipriani):

[mediawiki/core@master] Disable namespaceDupes.php for now

https://gerrit.wikimedia.org/r/971332

Change 971287 had a related patch set uploaded (by Thcipriani; author: Thcipriani):

[mediawiki/core@wmf/1.42.0-wmf.3] Disable namespaceDupes.php for now

https://gerrit.wikimedia.org/r/971287

while ( true ) {
			$res = $dbw->select(
				…
				[
					'ORDER BY' => [ $titleField, $fromField ],
					'LIMIT' => $batchSize

This is where it appears to select in batches.

				$dbw->update( $table,

The update() and delete() calls here are likewise batched.

There appear to be no begin/commitTransaction/startAtomic calls here, so the CLI default applies: each of these statements is committed directly rather than, e.g., buffered in an ever-growing transaction.

			$this->waitForReplication();

Here it waits for replication between each batch iteration.

From what I can tell, the only unlimited part is that it updates or deletes all the links belonging to a single page ID in a single commit, regardless of how many links that page has. As far as I know, this is the same way any links update works in MediaWiki core, and it is hard to split up further without creating atomicity problems for web-server code.

Maybe the batch size is too large? Given that each page can carry many links, the default batch size effectively gets amplified 10-100x in terms of impact (e.g. a batch of 500 pages could mean 5,000-50,000 link rows), so this should probably have a much smaller default batch size.
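Putting those fragments together, the loop as described looks roughly like the sketch below. This is a simplified illustration of the pattern, not the actual NamespaceDupes::checkLinkTable() code; $table, $conds, $newFields, $titleField, $fromField and $batchSize are placeholders, and $dbw is a primary-database handle inside a Maintenance script.

while ( true ) {
	// Sketch only, not the actual NamespaceDupes code.
	$res = $dbw->select(
		$table,
		[ $fromField, $titleField ],
		$conds,
		__METHOD__,
		[
			'ORDER BY' => [ $titleField, $fromField ],
			'LIMIT' => $batchSize,
		]
	);
	if ( !$res->numRows() ) {
		break;
	}
	foreach ( $res as $row ) {
		// One write per selected row. Every link row belonging to that page
		// is rewritten by this single statement, however many there are, and
		// with no begin()/startAtomic() around it the CLI default is to
		// auto-commit each statement as soon as it finishes.
		$dbw->update(
			$table,
			$newFields,
			[ $fromField => $row->$fromField, $titleField => $row->$titleField ],
			__METHOD__
		);
	}
	// Replication lag is only checked between batches, not within one.
	$this->waitForReplication();
}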

EXPLAIN of the SELECT version of the query (P53135) gives this:

+------+-------------+-----------+-------+----------------------+---------+---------+------+------+-------------+
| id   | select_type | table     | type  | possible_keys        | key     | key_len | ref  | rows | Extra       |
+------+-------------+-----------+-------+----------------------+---------+---------+------+------+-------------+
|    1 | SIMPLE      | pagelinks | range | PRIMARY,pl_namespace | PRIMARY | 265     | NULL | 500  | Using where |
+------+-------------+-----------+-------+----------------------+---------+---------+------+------+-------------+

It may be that 500 different conditions combined into one query could trigger a full table scan (the plan uses a range access on the index instead of exact matches).
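For illustration only (an assumed shape inferred from the query fragment and the EXPLAIN above, not the exact code), the batched delete presumably ORs together one primary-key tuple per selected row, something like:

	// Assumed shape of the batched DELETE, for illustration only;
	// $res is the batch of rows selected above.
	$conds = [];
	foreach ( $res as $row ) {
		$conds[] = $dbw->makeList( [
			'pl_from' => $row->pl_from,
			'pl_namespace' => $row->pl_namespace,
			'pl_title' => $row->pl_title,
		], LIST_AND );
	}
	// A single DELETE carrying up to $batchSize OR'ed tuple conditions; the
	// optimizer plans this as a "range" read on the PRIMARY key rather than
	// individual point lookups, matching the EXPLAIN output above.
	$dbw->delete( 'pagelinks', $dbw->makeList( $conds, LIST_OR ), __METHOD__ );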

Change 971332 merged by jenkins-bot:

[mediawiki/core@master] Disable namespaceDupes.php for now

https://gerrit.wikimedia.org/r/971332

Change 971287 merged by jenkins-bot:

[mediawiki/core@wmf/1.42.0-wmf.3] Disable namespaceDupes.php for now

https://gerrit.wikimedia.org/r/971287

Mentioned in SAL (#wikimedia-operations) [2023-11-03T01:13:05Z] <thcipriani@deploy2002> Started scap: Backport for [[gerrit:971287|Disable namespaceDupes.php for now (T350443)]]

Mentioned in SAL (#wikimedia-operations) [2023-11-03T01:14:25Z] <thcipriani@deploy2002> thcipriani: Backport for [[gerrit:971287|Disable namespaceDupes.php for now (T350443)]] synced to the testservers (https://wikitech.wikimedia.org/wiki/Mwdebug)

Mentioned in SAL (#wikimedia-operations) [2023-11-03T01:23:34Z] <thcipriani@deploy2002> Finished scap: Backport for [[gerrit:971287|Disable namespaceDupes.php for now (T350443)]] (duration: 10m 29s)

Change 971288 had a related patch set uploaded (by Zoranzoki21; author: Zoranzoki21):

[mediawiki/core@master] Revert "Disable namespaceDupes.php for now"

https://gerrit.wikimedia.org/r/971288

Change 971288 abandoned by Zoranzoki21:

[mediawiki/core@master] Revert "Disable namespaceDupes.php for now"

Reason:

https://gerrit.wikimedia.org/r/971288

Kizule triaged this task as Medium priority. Nov 6 2023, 9:53 PM

@MSantos could you please explain what the dashboard move means exactly? Does it mean that namespaceDupes.php will remain disabled? If so, does that mean there is no way to deploy the change from T350739? Thanks.

> @MSantos could you please explain what the dashboard move means exactly? Does it mean that namespaceDupes.php will remain disabled? If so, does that mean there is no way to deploy the change from T350739? Thanks.

@Strainu thanks for raising this question.

The short version of the discussion is: this is a domain that currently doesn't have explicit owners and needs to be re-evaluated by the Product Function in MediaWiki Engineering. Unfortunately, that means we can't resource this at the moment and tasks will remain blocked, but we are working to find a solution for this situation as soon as possible. Sorry for the inconvenience.

Change 975365 had a related patch set uploaded (by Umherirrender; author: Umherirrender):

[mediawiki/core@master] maintenance: Reduce delete attempts in namespaceDupes.php

https://gerrit.wikimedia.org/r/975365

Change 975417 had a related patch set uploaded (by Umherirrender; author: Umherirrender):

[mediawiki/core@master] maintenance: Batch update for _from_namespace in namespaceDupes.php

https://gerrit.wikimedia.org/r/975417

Change 975419 had a related patch set uploaded (by Umherirrender; author: Umherirrender):

[mediawiki/core@master] maintenance: Batch update for rev_page in namespaceDupes.php

https://gerrit.wikimedia.org/r/975419

Change 975419 merged by jenkins-bot:

[mediawiki/core@master] maintenance: Batch update for rev_page in namespaceDupes.php

https://gerrit.wikimedia.org/r/975419

Change 975365 merged by jenkins-bot:

[mediawiki/core@master] maintenance: Reduce delete attempts in namespaceDupes.php

https://gerrit.wikimedia.org/r/975365

Change 975417 merged by jenkins-bot:

[mediawiki/core@master] maintenance: Batch update for _from_namespace in namespaceDupes.php

https://gerrit.wikimedia.org/r/975417

Change 971288 restored by Zoranzoki21:

[mediawiki/core@master] Revert "Disable namespaceDupes.php for now"

https://gerrit.wikimedia.org/r/971288

Change 976305 had a related patch set uploaded (by Umherirrender; author: Umherirrender):

[mediawiki/core@master] maintenance: Batch delete for key conflicts in namespaceDupes.php

https://gerrit.wikimedia.org/r/976305

Change 976305 merged by jenkins-bot:

[mediawiki/core@master] maintenance: Batch delete for key conflicts in namespaceDupes.php

https://gerrit.wikimedia.org/r/976305
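The general idea behind these batching changes, sketched below as an assumption rather than as the actual contents of the Gerrit patches, is to split the affected rows into chunks and issue one bounded write per chunk, waiting for replication in between. For example, updating the denormalized pl_from_namespace column for a set of moved pages could look like:

	// Illustration of the batching idea only, not the patch contents.
	// $pageIds and $newNamespace are placeholders.
	foreach ( array_chunk( $pageIds, $this->getBatchSize() ?: 100 ) as $chunk ) {
		$dbw->update(
			'pagelinks',
			[ 'pl_from_namespace' => $newNamespace ],
			[ 'pl_from' => $chunk ],
			__METHOD__
		);
		// Let the replicas catch up before touching the next chunk.
		$this->waitForReplication();
	}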

Ladsgroup assigned this task to Umherirrender.

I have re-enabled the script by +2ing gerrit:971288 but I highly recommend running some tests in beta cluster and other places before running in production and running it with even lower batch size than default in prod (there is no rush and the script doesn't take months to run).

I think that makes this ticket resolved now.

> I have re-enabled the script by +2ing gerrit:971288 but I highly recommend running some tests in beta cluster and other places before running in production and running it with even lower batch size than default in prod (there is no rush and the script doesn't take months to run).
>
> I think that makes this ticket resolved now.

Sorry, but I don't see +2 there. ;)

Change 971288 merged by jenkins-bot:

[mediawiki/core@master] Revert "Disable namespaceDupes.php for now"

https://gerrit.wikimedia.org/r/971288

> I have re-enabled the script by +2ing gerrit:971288 but I highly recommend running some tests in beta cluster and other places before running in production and running it with even lower batch size than default in prod (there is no rush and the script doesn't take months to run).
>
> I think that makes this ticket resolved now.

Can we test it on some smaller Serbian projects (I mean Wiktionary etc.) for a start, to make some progress on T350431: Run maintenance scripts on Serbian projects?

@Ladsgroup are there any remaining steps so that bugs blocked on this one can be processed?

> I think that makes this ticket resolved now.

As a deployer who ran into this issue just now, I disagree: it shouldn't be closed while it remains impossible to run the maintenance script in production.

As far as I can tell, all the fixes / improvements to the maintenance script made it into wmf.7; can we backport the revert?

>> I think that makes this ticket resolved now.

> As a deployer who ran into this issue just now, I disagree: it shouldn't be closed while it remains impossible to run the maintenance script in production.
>
> As far as I can tell, all the fixes / improvements to the maintenance script made it into wmf.7; can we backport the revert?

Although the revert has landed, I don't think it's approved to run yet as a regular maintenance script (it needs confirmation first in Beta Cluster and then production, from what Amir says).

>> I think that makes this ticket resolved now.

> As a deployer who ran into this issue just now, I disagree: it shouldn't be closed while it remains impossible to run the maintenance script in production.

I'm in the middle of a partial outage so I don't want to argue much, but the title of this ticket is that the maintenance script doesn't have a limit on deletion. That is fixed. You're talking about another issue ("I can't run the maintenance script in production") which was blocked by this one, but they are not the same.

> As far as I can tell, all the fixes / improvements to the maintenance script made it into wmf.7; can we backport the revert?

I don't care. Just test first before doing anything in production.

Two weeks later, there seems to be no progress on the tasks that depend on that script working, so I don't really care much about potayto-potahto discussions - we need changes to various namespaces throughout the projects.

I'm asking everyone on the subscriber list: what are the remaining steps to unblock this situation, and who owns them? Thanks.

Anoop closed this task as Resolved. Edited Dec 18 2023, 8:46 AM

As noted on T350431, namespaceDupes.php has been run successfully on srwikinews, so reopening this task is not necessary; please create a new task for any issues other than those mentioned here.