Wikipedia:Bot requests/Archive 57
Revision as of 13:16, 4 November 2013
This is an archive of past discussions on Wikipedia:Bot requests. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current main page. |
Archive 50 | ← | Archive 55 | Archive 56 | Archive 57 | Archive 58 | Archive 59 | Archive 60 |
MySQL expert needed
The Wikipedia 1.0 project needs someone with experience working with large (en.wikipedia-size) MySQL databases. If interested, contact me or the WP 1.0 Editorial Team.— Wolfgang42 (talk) 02:19, 20 October 2013 (UTC)
- Um, what specifically do you need help with? You'll probably get more people who can help if they know exactly what you need. Legoktm (talk) 02:24, 20 October 2013 (UTC)
- Writing various queries to pull information out of the database. The problem is that the en.wikipedia database is very large, and I don't know enough about MySQL to be able to work with a dataset where queries can take weeks to complete. — Wolfgang42 (talk) 18:02, 20 October 2013 (UTC)
- If queries take that long something is wrong with the database configuration. More often than not you need to add and/or use indexes. If you provide the table structure and index list queries shouldn't take more than a few hours at most. Werieth (talk) 18:08, 20 October 2013 (UTC)
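Werieth's advice about indexes can be sketched with SQLite standing in for MySQL (the table, column, and index names here are invented for illustration): before an index exists, the lookup is a full table scan; after CREATE INDEX, the same lookup searches the index instead. SQLite's EXPLAIN QUERY PLAN is a rough analogue of MySQL's EXPLAIN.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Toy table standing in for a large wiki table; names are illustrative only.
cur.execute("CREATE TABLE page (page_id INTEGER, page_title TEXT)")
cur.executemany("INSERT INTO page VALUES (?, ?)",
                [(i, "Title_%d" % i) for i in range(1000)])

def plan(sql):
    # EXPLAIN QUERY PLAN rows are (id, parent, notused, detail);
    # the detail column describes each scan/search step.
    return [row[3] for row in cur.execute("EXPLAIN QUERY PLAN " + sql)]

query = "SELECT page_id FROM page WHERE page_title = 'Title_42'"

before = plan(query)  # no index yet: the plan is a scan of the whole table

cur.execute("CREATE INDEX idx_page_title ON page (page_title)")
after = plan(query)   # same query now searches via the new index

print(before)
print(after)
```

On a table of enwiki's size, the difference between those two plans is the difference between hours and milliseconds for a point lookup.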
- Can you paste the queries? If you're running on labs, there are different databases like revision_userindex which would be faster if you need an index upon user. Legoktm (talk) 21:10, 20 October 2013 (UTC)
- The query (being run on the Labs database) is:
SELECT page_title,
IF ( rd_from = page_id,
rd_title,
/*ELSE*/IF (pl_from = page_id,
pl_title,
/*ELSE*/
NULL -- Can't happen, due to WHERE clause below
))
FROM page, redirect, pagelinks
WHERE (rd_from = page_id OR pl_from = page_id)
AND page_is_redirect = 1
AND page_namespace = 0 /* main */
ORDER BY page_id ASC;
— Wolfgang42 (talk) 22:53, 23 October 2013 (UTC)
- The results from the second column seem odd, mixing varbinary & int data, and the OR in the WHERE clause doesn't help with the performance. What exactly are you wanting to get from the database? -- WOSlinker (talk) 23:39, 23 October 2013 (UTC)
- You're right—I pasted an older version of the code; I've fixed it to be the title both times. (My mistake for not checking that I had the latest copy in version control.) This query is a direct translation of an agglomeration of perl, bash, and C code which was parsing the SQL dumps directly. What it's trying to do is find redirect targets by looking in the redirect table, and falling back to the pagelinks table if that fails.
- I would suspect that the 3-way join isn't helping performance any either, but unfortunately it seems to be needed. If there's a better way to do this, I'd love to see it. — Wolfgang42 (talk) 02:30, 24 October 2013 (UTC)
Try this and see if it works any better. -- WOSlinker (talk) 06:00, 24 October 2013 (UTC)
SELECT page_title, COALESCE(rd_title, pl_title)
FROM page
LEFT JOIN redirect ON page_id = rd_from
LEFT JOIN pagelinks ON page_id = pl_from
WHERE page_is_redirect = 1
AND page_namespace = 0 /* main */
ORDER BY page_id ASC;
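To see the fallback behaviour of the rewritten query, here is a small self-contained sketch using SQLite in place of MySQL (the tables contain only the columns the query touches, and the rows and titles are invented): COALESCE returns rd_title when the redirect table has a matching row, and falls back to pl_title otherwise.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Minimal stand-ins for the MediaWiki page, redirect, and pagelinks tables.
cur.executescript("""
CREATE TABLE page (page_id INTEGER, page_title TEXT,
                   page_is_redirect INTEGER, page_namespace INTEGER);
CREATE TABLE redirect (rd_from INTEGER, rd_title TEXT);
CREATE TABLE pagelinks (pl_from INTEGER, pl_title TEXT);

-- Redirect 1 has a row in redirect; redirect 2 only has a pagelinks row.
INSERT INTO page VALUES (1, 'Foo', 1, 0), (2, 'Bar', 1, 0);
INSERT INTO redirect VALUES (1, 'Target_A');
INSERT INTO pagelinks VALUES (2, 'Target_B');
""")

rows = cur.execute("""
    SELECT page_title, COALESCE(rd_title, pl_title)
    FROM page
    LEFT JOIN redirect ON page_id = rd_from
    LEFT JOIN pagelinks ON page_id = pl_from
    WHERE page_is_redirect = 1
      AND page_namespace = 0
    ORDER BY page_id ASC
""").fetchall()

print(rows)  # [('Foo', 'Target_A'), ('Bar', 'Target_B')]
```

The LEFT JOIN form also drops the OR from the WHERE clause, which is presumably what lets the optimizer drive each join from the rd_from and pl_from indexes instead of scanning.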
- Putting the EXPLAIN keyword in front of the query will return the execution plan, indexes used, etc. --Bamyers99 (talk) 19:40, 24 October 2013 (UTC)
Request for a bot for WikiProject Military history article reviews per quarter
G'day, WPMILHIST has a quarterly awards system for editors who complete reviews (positive or negative) of articles that fall within the WikiProject. So far we (the project coordinators) have done this tallying manually, which is pretty labour-intensive. We have recently included GA reviews, and are having difficulty identifying negative GA reviews using the standard tools. We were wondering if someone could build a bot that could tally all FA, FL, A-Class, Peer and GA reviews of articles that fall within WikiProject Military history? In terms of frequency, we usually tally the points and hand out awards in the first week after the end of each quarter (the first weeks of January, April, July and October), but it would also be useful to be able to run the bot on demand if possible. Regards, Peacemaker67 (send... over) 23:36, 13 October 2013 (UTC)
- If this comes up, lemme know, 'kay? Maybe some of the other projects might be able to use it as well. :) John Carter (talk) 23:43, 13 October 2013 (UTC)
- Hi, could someone clarify if I am in the wrong place (ie is this not a bot thing)? Thanks, Peacemaker67 (send... over) 03:09, 18 October 2013 (UTC)
- G'day all, is this something a bot could do? Peacemaker67 (send... over) 19:59, 25 October 2013 (UTC)
New REFBot - feedback on user talkpages
20:11, 17 October 2013 > message A reference problem
- 23:20, 17 October 2013 Thanks for consulting. Cheers,...
05:19, 18 October 2013 > message A reference problem
- 02:57, 21 October 2013 Reply - I did some page splits. I split List of Princeton University people (United States Congress, Supreme Court, Continental Congress and Constitutional Convention) from List of Princeton University people (government). The unlinked references at the former more than likely come from the latter. I have fixed two, and hope to fix the rest over time. Thank you for bringing this to my attention.
11:19, 22 October 2013 > message A reference problem Two replies
- 11:28, 22 October 2013 Not me Squire!
- 11:28, 22 October 2013 OOPS, was me -fixed!
Bot to download and reupload images to resolve AU legal concerns
The discussion is at Wikipedia talk:WikiProject Australian Roads/Shields, but in summary: there are sets of images transferred from Commons to here as {{PD-ineligible-USonly}}. The user who moved the files (downloaded them from Commons, then uploaded them here) wants to remove his involvement due to potential legal issues in Australia. Under existing policy, revdel, oversight, and office actions are not appropriate. It was suggested that a bot could upload the same files under a different name and nominate the old ones for deletion per WP:CSD#F1. - Evad37 (talk) 06:42, 26 October 2013 (UTC)