<code>arc diff</code>.
=== Who would benefit ===
The tag helps developers and product managers maintain a better overview of the state of tasks on workboards, for example [https://phabricator.wikimedia.org/tag/ui-standardization-kanban #ui-standardization-kanban].
Currently you have to add the tag manually; see <code>arc diff</code>.
=== Endorsements (T150510) ===
#
=== Support (T150510) ===
#
== MediaWiki.org: Generate infoboxes from extension.json in git ==
{{tracked|T155029}}{{../buttons|task=T155029|title=MediaWiki.org: Generate infoboxes from extension.json in git}}
=== Problem ===
There are some widely used information templates on mediawiki.org which display software information that could easily be extracted from the source code, but those templates are currently created and maintained manually. This is inefficient and ineffective: it requires a significant amount of manual work which is often not done, or not updated when the code changes. Examples: the [https://www.mediawiki.org/wiki/Template:Extension extension], [https://www.mediawiki.org/wiki/Template:ExtensionInstall extension install], [https://www.mediawiki.org/wiki/Template:MediaWikiHook hook] and [https://www.mediawiki.org/wiki/Template:SettingSummary config variable] templates.
=== Who would benefit ===
* Developers who no longer need to spend time creating or updating hook/config variable pages whenever they change the code
* Extension maintainers (especially people who routinely change others' extensions) who don't need to track things like required MW version in two different places
* Developers and site admins who will gain access to correct and complete documentation
* Possibly site admins running older MW versions because this would allow a much more user-friendly presentation of MW compatibility information
=== Proposed solution ===
# find a place to store structured mw.org infobox data - there are several possible approaches; see [https://phabricator.wikimedia.org/T155024 T155024: Store structured data needed for MediaWiki documentation] for discussion
# write a bot to extract the data from git (<code>extension.json</code>, <code>hooks.txt</code>, <code>DefaultSettings.php</code> phpdoc) and upload it
# fetch the data in the infoboxes
# for hook/configvar pages, make the bot create them automatically with stub content when they don't exist
# run a bot with some regular frequency (e.g. weekly) to update the pages accordingly
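The extraction step is mostly a matter of mapping manifest fields to template parameters. A minimal Python sketch, assuming a few real <code>extension.json</code> fields; the infobox parameter names on the right are invented for illustration, not the actual Template:Extension parameters:

```python
import json

# Map extension.json manifest fields to (hypothetical) infobox
# template parameters. The right-hand names are illustrative only.
FIELD_MAP = {
    "name": "name",
    "author": "author",
    "url": "url",
    "descriptionmsg": "description",
    "license-name": "license",
}

def manifest_to_infobox(manifest_json):
    """Turn an extension.json manifest into infobox template wikitext."""
    manifest = json.loads(manifest_json)
    params = []
    for src, dest in FIELD_MAP.items():
        if src in manifest:
            params.append("|%s = %s" % (dest, manifest[src]))
    return "{{Extension\n%s\n}}" % "\n".join(params)

example = '{"name": "Example", "author": "Jane Doe", "license-name": "GPL-2.0+"}'
print(manifest_to_infobox(example))
```

The bot would run this over each repository's manifest and upload the result (or the underlying structured data, depending on the outcome of T155024).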
=== Endorsements (T155029) ===
#
=== Support (T155029) ===
#
== Structured data side channel for wikitext ==
{{tracked|T156876}}{{../buttons|task=T156876|title=Structured data side channel for wikitext}}
The problem of passing structured data from wikitext to external applications comes up in a wide variety of contexts, and a garden of ugly workarounds has grown around it, usually consisting of encoding the data in the HTML rendered from wikitext in some way, then external applications parsing it out and restoring the structure. Examples include [https://phabricator.wikimedia.org/tag/CommonsMetadata #CommonsMetadata], the various services (#mcs, all kinds of Tool Labs tools) exposing mainpage/featured content (article/picture of the day, anniversaries, in the news etc), article maintenance / warning templates, infoboxes, using Wiktionary for word translation.
Eventually these issues should be handled by separating wikitext and structured data (e.g. with [https://phabricator.wikimedia.org/T107595 T107595: [RFC] Multi-Content Revisions]) but that's a huge project and will take a while. A quick win that would be possible right now, and would make life easier for developers mining structured data from wikitext (and for editors maintaining the wikitext), would be to create a side channel where wikitext code can output structured data (via a dedicated parser function / Lua method) in a simple hierarchic key-value format. The data could be exposed by the parser and the parse API, and eventually morph into a virtual MCR slot.
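The side channel itself could be little more than a nested key-value store that page code appends to during the parse. A rough Python sketch of the data model only; the class and method names are made up for illustration:

```python
# Hypothetical data model for the structured-data side channel:
# wikitext (via a parser function or Lua method) sets dotted keys,
# and the parser exposes the accumulated tree alongside the HTML.
class SideChannel:
    def __init__(self):
        self.data = {}

    def set(self, dotted_key, value):
        """Set e.g. 'potd.caption.en' as nested dicts."""
        node = self.data
        *path, leaf = dotted_key.split(".")
        for part in path:
            node = node.setdefault(part, {})
        node[leaf] = value

sc = SideChannel()
sc.set("potd.file", "Example.jpg")
sc.set("potd.caption.en", "An example picture")
print(sc.data)
# {'potd': {'file': 'Example.jpg', 'caption': {'en': 'An example picture'}}}
```

An external consumer would then read this tree from the parse API instead of scraping the rendered HTML.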
=== Endorsements (T156876) ===
#
=== Support (T156876) ===
#
== Choose a recommended IDE for MediaWiki and maintain a plugin for it ==
{{tracked|T156873}}{{../buttons|task=T156873|title=Choose a recommended IDE for MediaWiki and maintain a plugin for it}}
Good IDE integration is convenient for everyone but especially helpful to new contributors who are not experienced coders: they have to learn a thousand new things, from code review and distributed version control workflows to security best practices, and if we can avoid adding "learn how to tweak your IDE configuration" to that pile, we can make the learning curve significantly smoother.
A well-integrated IDE would
* ensure that the right coding conventions are followed
* do some of the CI checks in a much more user-friendly way (banana, autoloading etc.)
* provide docs / typing / code completion / clickthrough navigation for systems which IDEs cannot figure out by default (e.g. hooks, global variables, extension-provided services, ResourceLoader modules)
* maybe show docs/help from mediawiki.org
* maybe warn when some MediaWiki best practices are not used (e.g. extension with PHP endpoint)
This does not mean that MediaWiki would be optimized to work with one IDE to the detriment of others, but it's nice to have a default.
PHPStorm integration already seems to have [https://github.com/reedy/phpstorm-plugin-mediawiki some momentum] behind it, but which IDE we focus on is secondary to agreeing to focus on a single one.
=== Endorsements (T156873) ===
#
=== Support (T156873) ===
#
== Showcase how the separation of concerns should work between MediaWiki API and web ==
{{tracked|T156872}}{{../buttons|task=T156872|title=Showcase how the separation of concerns should work between MediaWiki API and web}}
MediaWiki API modules and special pages contain lots of business logic, often duplicated between the two in similar-but-not-quite-identical ways. The business logic in these pages also tends to be inaccessible internally (so MediaWiki code that wants to access the functionality does horrible things like instantiating a <code>SpecialPage</code> object or making <code>FauxRequest</code> calls to the API). Everyone agrees the current situation sucks; no one seems to be sure what exactly the right way would look like, so newly written code does not necessarily end up in better shape.
We should pick some special pages and API modules (probably two of each, since the answer will look very different for something that does paged queries versus everything else), refactor them, and turn them into a showcase that can be used as guidance for future work.
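The target shape is roughly: one internal service holding the business logic, with the API module and the special page both reduced to thin presentation layers. Sketched in Python purely for illustration; the class names and the "block list" example are invented:

```python
# Business logic lives in exactly one internal service...
class BlockListService:
    def __init__(self, store):
        self.store = store

    def get_blocks(self, limit=50):
        """The single source of truth for 'list active blocks'."""
        return sorted(self.store, key=lambda b: b["expiry"])[:limit]

# ...and both entry points are thin wrappers over it.
class ApiQueryBlocks:
    def __init__(self, service):
        self.service = service

    def execute(self):
        return {"query": {"blocks": self.service.get_blocks()}}

class SpecialBlockList:
    def __init__(self, service):
        self.service = service

    def render(self):
        return "\n".join("* %s" % b["target"] for b in self.service.get_blocks())

store = [{"target": "1.2.3.4", "expiry": 2}, {"target": "Vandal", "expiry": 1}]
service = BlockListService(store)
print(ApiQueryBlocks(service).execute())
print(SpecialBlockList(service).render())
```

Internal callers would then use the service directly, with no <code>FauxRequest</code> or special-page instantiation needed.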
=== Endorsements (T156872) ===
#
=== Support (T156872) ===
#
== Core should be aware of the domain it is running on and render mobile domains where necessary ==
{{tracked|T156847}}{{../buttons|task=T156847|title=Core should be aware of the domain it is running on and render mobile domains where necessary}}
Whenever a link to a desktop URL is rendered on a mobile page, clicking it can take you to the desktop site (or cause you to go through an unnecessary redirect loop).
This impacts Flow, Echo, and MobileFrontend (languages) and causes lots of development pain (see child tasks).
A call to <code>getFullURL</code> or <code>getFullRequestURL</code> should be aware of the domain it is running on and give the correct result.
Essentially this means doing mobile site detection in core.
[I hope this doesn't turn into a bikeshed of responsive sites vs separate desktop/mobile sites. Both are useful and up to a sysadmin to decide and we should support them both.]
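A sketch of what domain-aware URL generation could look like; the host-mapping rule shown is the Wikimedia <code>.m.</code> subdomain convention, and the function names are invented, not real core methods:

```python
def to_mobile_host(host):
    """Insert the 'm' subdomain Wikimedia-style: en.wikipedia.org -> en.m.wikipedia.org."""
    parts = host.split(".")
    if "m" in parts:
        return host  # already a mobile host
    return ".".join(parts[:1] + ["m"] + parts[1:])

def get_full_url(title, host, on_mobile_domain):
    """Hypothetical domain-aware URL builder: keep the reader on the
    mobile domain when the current request came in on it."""
    if on_mobile_domain:
        host = to_mobile_host(host)
    return "https://%s/wiki/%s" % (host, title)

print(get_full_url("Example", "en.wikipedia.org", on_mobile_domain=True))
```

A sysadmin-configured mapping (rather than a hard-coded rule) would keep this compatible with both responsive and separate-domain setups.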
=== Endorsements (T156847) ===
#
=== Support (T156847) ===
#
== Introduce and document a minimum rights hierarchy ==
{{tracked|T156789}}{{../buttons|task=T156789|title=Introduce and document a minimum rights hierarchy}}
=== Problem ===
There seems to be an undocumented and inconsistent hierarchy of rights. For example, if I cannot view a page, I cannot edit it. But: if I can't edit, can I still move? In particular, the API has no concept of a hierarchy of rights. This leads to potential security issues in read-protected MediaWiki installations.
=== Who would benefit ===
Extension developers and MediaWiki maintainers will have a clearer-cut security model.
=== Proposed solution ===
We should add a minimum hierarchy in our rights, such as read > edit > other actions, similar to the way we have a hierarchy in the user groups: * > user > other groups. If one cannot read, they cannot do anything else. If one cannot edit, they cannot do any other modifications. I know this is too simplistic, so we need to sketch out a proper hierarchy. The hierarchy should also be documented.
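The hierarchy could be expressed as a simple implication chain checked before any action-specific right. A minimal sketch of the simplistic read > edit > other chain from above; the action names and table are illustrative, not a proposed final hierarchy:

```python
# Minimal sketch of a rights hierarchy: each action implies a
# chain of prerequisite rights that must all be held.
PREREQUISITES = {
    "read": [],
    "edit": ["read"],
    "move": ["read", "edit"],
    "delete": ["read", "edit"],
}

def user_can(user_rights, action):
    """True only if the user holds the action right AND everything it implies."""
    needed = PREREQUISITES.get(action, ["read"]) + [action]
    return all(right in user_rights for right in needed)

print(user_can({"read", "move"}, "move"))          # move without edit is denied
print(user_can({"read", "edit", "move"}, "move"))  # full chain is allowed
```

With such a table in place, the API and the web UI would consult the same chain instead of each checking individual rights ad hoc.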
=== Endorsements (T156789) ===
#
=== Support (T156789) ===
#
== Improve support for read access restriction / access control ==
{{tracked|T156788}}{{../buttons|task=T156788|title=Improve support for read access restriction / access control}}
=== Problem ===
There are a lot of extensions using the <code>userCan</code> hook for access control. Yet there are still parts of core where <code>userCan</code> is not considered. This is true in particular for read access. For example, as far as I know, QueryPages do not consider read access. Quite often the fix is as simple as adding a <code>userCan</code> hook call. I'm not proposing to make MediaWiki read access bulletproof, but to fix the most obvious read access holes.
=== Who would benefit ===
Extension developers who need to implement access control for their wikis
=== Proposed solution ===
We can use this list as a basis: https://www.mediawiki.org/wiki/Security_issues_with_authorization_extensions . It needs to be updated to the current state of MediaWiki. Then the open questions / issues can be addressed in the code. Ideally, at the end we have a positive list of which pages / actions consider read access.
=== Endorsements (T156788) ===
#
=== Support (T156788) ===
#
== Add examples of the three security review processes ==
{{tracked|T156757}}{{../buttons|task=T156757|title=Add examples of the three security review processes}}
=== Problem ===
[https://www.mediawiki.org/wiki/Wikimedia_Security_Team/Security_reviews Wikimedia Security Team/Security reviews] lists three separate processes which are either recommended or required in order to get an extension deployed on Wikimedia servers.
For non-WMF developers it can be unclear what is needed for each review step and how the Security team expects the information to be presented.
=== Who would benefit ===
* Anyone unfamiliar with the current review practices, likely meaning extension developers outside of the WMF.
* The security team: Clearer expectations/instructions should hopefully help streamline new external review requests.
=== Proposed solution ===
Identify one or a few good real-life examples for each process (or at least the latter two) to promote as case studies.
=== Endorsements (T156757) ===
#
=== Support (T156757) ===
#
== Add a maintenance script for complete cache reset ==
{{tracked|T156695}}{{../buttons|task=T156695|title=Add a maintenance script for complete cache reset}}
=== Problem ===
There seems to be no way to reset all caches programmatically. One example is the ResourceLoader minification cache. In extension development and when updating MediaWiki installations, I have run into issues with improper cache invalidation. A complete manual reset of all caches helped; a script for this would be very helpful.
=== Who would benefit ===
Developers and MediaWiki maintainers
=== Proposed solution ===
Create a maintenance script which invalidates all caches, preferably with the option to selectively invalidate some types of caches, e.g. ResourceLoader minification.
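Internally, such a script could be a registry of invalidation callbacks keyed by cache type, with the command line choosing all of them or a subset. A sketch of that shape only; the cache type names and callbacks are placeholders, not actual MediaWiki cache layers:

```python
# Illustrative registry of cache-invalidation callbacks; the type
# names are placeholders, not real MediaWiki cache names.
cleared = []

RESETTERS = {
    "parser": lambda: cleared.append("parser"),
    "resourceloader": lambda: cleared.append("resourceloader"),
    "localisation": lambda: cleared.append("localisation"),
}

def reset_caches(types=None):
    """Reset all registered caches, or only the selected types."""
    for name in (types or RESETTERS):
        RESETTERS[name]()  # unknown type names raise KeyError on purpose
    return list(cleared)

reset_caches(["resourceloader"])
print(cleared)
```

Extensions could register their own resetters in the same table, so the script stays complete as new cache layers appear.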
=== Endorsements (T156695) ===
#
=== Support (T156695) ===
#
== Complete documentation about different types of caching for extension developers ==
{{tracked|T156693}}{{../buttons|task=T156693|title=Complete documentation about different types of caching for extension developers}}
=== Problem ===
As an extension developer, I have often run into issues where my code would not be executed or updated due to some cache hit in an earlier part of the code. Documentation is spread out across the MediaWiki documentation, and incomplete.
=== Who would benefit ===
Extension developers who write code that dynamically changes page content and need to know when and how to invalidate the caches.
=== Proposed solution ===
A good starting point is https://www.mediawiki.org/wiki/Manual:Caching. We need to complete the information missing for some cache types. A rough description of which caches are used when in the execution order would be extremely helpful (maybe in relation to some central hooks). Also, some information about how these caches can be invalidated, within the code and in general, should be added.
=== Endorsements (T156693) ===
#
=== Support (T156693) ===
#
== Fix or replace Module:Assemble multilingual message ==
{{tracked|T156674}}{{../buttons|task=T156674|title=Fix or replace Module:Assemble multilingual message}}
=== Problem ===
The process for updating users about technical changes is far more complex than it should be. If I have a technical newsletter (Tech News, specific newsletters for teams or products) and want to reach out in several languages, this is the process to send it out:
* Generate the text on Meta using a Lua module that combines all languages and adds a switch. [https://meta.wikimedia.org/w/index.php?title=Tech/News/Sandbox&oldid=16250936 See example].
* Realize you had too many languages; the module can't handle more than 20.
* Generate the first 20.
* Generate the next batch.
* Carefully insert the new languages.
* Find the duplication of the source text (typically English) and remove one of them.
* Test it.
* Find a mistake.
* Fix it on Meta.
* Redo the entire process.
* Finally send it out.
=== Who would benefit ===
Anyone keeping users updated about technical changes: upcoming, things they can give feedback on, or new changes.
=== Proposed solution ===
At least make sure we can send out messages in more than 20 languages without inviting the mistakes that come with manual insertion prior to MassMessaging all the wikis.
[https://meta.wikimedia.org/wiki/Module:Assemble_multilingual_message Module:Assemble multilingual message]
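Until the module itself is fixed, the 20-language limit is essentially a batching problem, and the manual "generate 20, generate the next batch, carefully merge" steps above are doing the chunking by hand. A tool could do it directly:

```python
# The manual "generate 20, generate the next batch, merge" workflow
# is just list chunking; a script or module could do it directly.
MODULE_LIMIT = 20  # current limit of Module:Assemble multilingual message

def batches(languages, size=MODULE_LIMIT):
    """Split a language list into module-sized batches."""
    return [languages[i:i + size] for i in range(0, len(languages), size)]

langs = ["lang%d" % i for i in range(45)]
print([len(b) for b in batches(langs)])  # [20, 20, 5]
```

The assembled batches would then be concatenated once, with the source-language text included exactly once, removing the error-prone manual merge step.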
=== Endorsements (T156674) ===
#
=== Support (T156674) ===
#
== Improve LTS support of extensions ==
{{tracked|T156640}}{{../buttons|task=T156640|title=Improve LTS support of extensions}}
=== Problem ===
MediaWiki has an LTS release, which receives security updates. Unfortunately most extensions are developed in the current <code>master</code> or maybe the latest <code>wmf*</code> branch. That means that downloading an extension from the <code>REL1_27</code> branch (e.g. by using Special:ExtensionDistributor) leaves the user with an outdated and potentially insecure codebase.
=== Who would benefit ===
Everyone who wants to use the LTS version of MediaWiki and also be sure that extensions that work with this version don't have security issues or severe bugs
=== Proposed solution ===
At least the bigger WMF extensions like "WikiEditor", "VisualEditor/Parsoid", "FlaggedRevs", "CategoryTree", ... should get bugfix and security cherry-picks/backports from current development into <code>REL1_27</code>.
Of course most volunteer, WMF-independent extension developers won't have the time or desire to do something like this, but maybe we can find a way to help them, or at least make them more aware of the LTS release.
=== Endorsements (T156640) ===
#
=== Support (T156640) ===
#
== Review and update the Examples extension ==
{{tracked|T156568}}{{../buttons|task=T156568|title=Review and update the Examples extension}}
Other than ContentAction, which was touched during GCI, the code in the Examples extension hasn't really been touched for a few years. It would be nice to give it a spring clean, updating code to follow modern coding practices for MediaWiki extensions.
=== Endorsements (T156568) ===
#
=== Support (T156568) ===
#
== Document extensions' MediaWiki version compatibility better ==
{{tracked|T156500}}{{../buttons|task=T156500|title=Document extensions' MediaWiki version compatibility better}}
=== Problem ===
Extensions are (supposed to be) documented on a page on MediaWiki. This documentation includes [https://www.mediawiki.org/wiki/Template:Extension Template:Extension] to provide a summary of important information about the extension, including the "required version of MediaWiki".
This version field is inadequate: does specifying "1.19+" mean that the extension's master branch is intended to be compatible with MediaWiki 1.19, or that the extension has release branches going back to REL1_19 while the extension's master branch is only intended to be compatible with MediaWiki core's master branch, or perhaps some middle ground?
Some extensions have solved this by including detailed text in the existing "version" parameter, such as "1.26+ . Flow master is only supported with core's master; otherwise, use matching branches (e.g. Flow REL1_26 with core REL1_26, or matching WMF release branches).", but this is currently done inconsistently and infrequently.
=== Who would benefit ===
Developers making updates to extensions for a core change in compliance with the new [https://www.mediawiki.org/wiki/Deprecation deprecation policy] will no longer have to guess at whether backwards compatibility with older versions of MediaWiki core should be maintained.
Developers of extensions that don't maintain compatibility with old versions of MediaWiki won't have unnecessary and sometimes complex backwards-compatibility code added.
Developers of extensions that do maintain compatibility with old versions of MediaWiki won't have to -1 or -2 as many changes that break backwards compatibility.
=== Proposed solution ===
# Update Template:Extension with parameters to indicate the MediaWiki versions for which release branches exist versus the MediaWiki versions the extension's master branch maintains compatibility with.
# Update existing extension pages to use these parameters, which may involve determining the master-compatibility policy where it's not already specified.
=== Endorsements (T156500) ===
#
=== Support (T156500) ===
#
== Create a developer documentation special interest group ==
{{tracked|T156301}}{{../buttons|task=T156301|title=Create a developer documentation special interest group}}
'''Problem'''
Files are normally delivered directly from <code>$wgUploadDirectory</code>, assuming that the location is accessible from the web. On private wikis this might not be true. Instead of direct delivery by the webserver, <code>img_auth.php</code> might be invoked. Unfortunately the current implementation sometimes makes it difficult or impossible to deliver files that are not part of the local file repo.
=== Who would benefit ===
Any extension developer who wants to create files (images, PDFs, JNLP, XML, flat files, ...) dynamically and deliver them through a script entry point.
=== Proposed solution ===
* <code>img_auth.php</code> could be improved (see also [https://phabricator.wikimedia.org/T153174 T153174: img_auth.php: Improve extendability])
* A complete new entry point could be created
* Some mechanism besides filerepo/filebackend could be created that allows better management (save/cache, delete/invalidate, find) of files created by extensions
=== Endorsements (T156233) ===
#
=== Support (T156233) ===
#
== Provide an easy to use HTML mail system ==
{{tracked|T156231}}{{../buttons|task=T156231|title=Provide an easy to use HTML mail system}}
=== Problem ===
By default MediaWiki sends only old-fashioned plain text mails. With Extension:Echo there is at least a way to deliver some kind of HTML mail, but it is still very hard to customize the layout and design of those mails (you need to override/extend classes).
=== Who would benefit ===
All developers who need to customize notification mails. This is especially needed in business contexts.
=== Proposed solution ===
There could be a directory that contains an HTML template and all required resources, e.g. <code>/resources/mail/default</code> containing <code>body.html</code>, <code>header.jpg</code>, <code>styles.css</code>, ...
Also the use of a library like SwiftMailer or PHPMailer should be considered.
There have been some approaches in the past, and I believe something like this is already planned for Outreachy: [https://phabricator.wikimedia.org/T15303 T15303: Implement HTML e-mail support in MediaWiki].
=== Endorsements (T156231) ===
#
=== Support (T156231) ===
#
== Display changes in source code in red and green, like in Gerrit ==
{{tracked|T156048}}{{../buttons|task=T156048|title=Display changes in source code in red and green, like in Gerrit}}
=== Problem ===
When we edit Lua or JavaScript code, we don't see what we have changed, and many times that requires a lot of concentration and memory.
=== Who would benefit ===
All Lua or JavaScript coders.
=== Proposed solution ===
When we edit Lua or JavaScript code, highlight differences like in Gerrit (the - and + parts of lines).
Do this when we explicitly ask for differences, but also as an option on any change from the previous version.
=== Description of the task ===
The way Gerrit displays changes in source code seems better than most others.
In its code panels, for old and new code, Gerrit colorizes each changed character in red or green.
We could use this for other source code: JavaScript for gadgets and user scripts, Lua modules, and templates.
=== Some choices to make ===
Gerrit uses '''two modes'''; which one should we use?
In the '''mixed''' mode, Gerrit mixes the old and new code lines, identifies them with - and + signs, and highlights each changed character in red or green.
In the '''columns''' mode, Gerrit puts the old code in the left column and the new code in the right column, and highlights each changed character in red or green.
These styles also have to coexist with the '''usual syntax highlighting''' of each language.
'''Where to use''' these display modes?
Perhaps also in wikitext for wiki content?
Perhaps also when we display a revision diff ("Difference between revisions")?
Even in VisualEditor?
'''When to use''' these display modes?
Each time we edit a script, a module, or a template and then click "Show changes".
Also when we click "Preview" to see the effect on a test page?
Begin by offering these modes for code panels, and later extend them to other places?
'''How to activate''' these modes?
When the user chooses them for each code panel?
When the user chooses them in their preferences?
=== Endorsements (T156048) ===
#
=== Support (T156048) ===
#
== Make it easier to create an OOUI theme ==
{{tracked|T155562}}{{../buttons|task=T155562|title=Make it easier to create an OOUI theme}}
To create a theme currently: get the oojs/ui repo, copy-paste the src/themes/blank directory to your own name, list the new theme in Gruntfile.js and maybe a few other places (I could find them all if you're serious about doing this), and start writing the CSS.
I want a more modular approach, where themes are definable in any extension and consist of an array of variables in JSON or LESS or something, which OOUI then assembles into the CSS/JS itself:
* A set of colours for different things (borders, backgrounds, text, different types of buttons, etc)
* General properties - how much spacing in general, whether to use any shadows at all, whether buttons should have gradients or be flat, whether to override fonts, that sort of thing
* Specific properties for all the types of things (buttons, dialogs, inputs, etc) such as rounded corners, gradients, shadows, fonts, padding - none, or specific values (what sort of gradient, what sort of shadow, how much padding)
* ...
* Profit?
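The variables-to-CSS step could be as simple as string substitution over a theme definition. A toy sketch of the idea; the variable names and the output rule mimic no real OOUI variables or build step:

```python
# Hypothetical theme definition: flat variables that a build step
# expands into CSS. The names are invented for illustration.
theme = {
    "color-progressive": "#36c",
    "border-radius": "2px",
    "button-shadow": "none",
}

def compile_theme(theme):
    """Assemble a trivial button rule from theme variables."""
    return (
        ".oo-ui-button {\n"
        "  background: %(color-progressive)s;\n"
        "  border-radius: %(border-radius)s;\n"
        "  box-shadow: %(button-shadow)s;\n"
        "}" % theme
    )

print(compile_theme(theme))
```

A real implementation would presumably feed the variables into the existing LESS build rather than generating CSS strings directly, but the theme author's input could stay this small.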
=== Endorsements (T155562) ===
#
=== Support (T155562) ===
#
== VE support for skins should be done by adding appropriate anchors/ids/styles to the skins, and not by editing VE itself ==
{{tracked|T155554}}{{../buttons|task=T155554|title=VE support for skins should be done by adding appropriate anchors/ids/styles to the skins, and not by editing VE itself}}
Apparently the only way to make a skin work with VE is to edit VE itself. This makes it very hard to make new skins compatible with VE, despite more and more third-party projects desiring such compatibility.
It should be the other way around - compatibility should be in the skin, not VE. The skin should be edited to meet VE's expectations (maybe have appropriate ids or whatever on things where it should be showing up around the content, or a js snippet specifying how it is supposed to show up, or whatever?), and modified and styled appropriately to look right with it.
And how to do this should be documented somewhere so we can actually, well, do it.
=== Endorsements (T155554) ===
#
=== Support (T155554) ===
#
== Make Monolog the default debug processing layer and deprecate wfDebug* and LegacyLogger ==
{{tracked|T155552}}{{../buttons|task=T155552|title=Make Monolog the default debug processing layer and deprecate wfDebug* and LegacyLogger}}
=== Problem ===
The [https://github.com/php-fig/fig-standards/blob/master/accepted/PSR-3-logger-interface.md PSR3] logging interface has been introduced in MediaWiki to support [https://www.mediawiki.org/wiki/Manual:Structured_logging structured logging], but no coordinated effort has been made to deprecate the use of <code>wfDebug()</code>, <code>wfDebugLog()</code>, <code>wfLogDBError()</code>, and <code>wfErrorLog()</code>. Several bugs are open in the [https://phabricator.wikimedia.org/tag/mediawiki-debug-logger #mediawiki-debug-logger] project about the lack of parity between debug log usability on the Wikimedia Foundation production cluster and a typical development environment or external deployment of MediaWiki. These are directly related to [https://phabricator.wikimedia.org/p/bd808 bd808] taking the structured logging project to a point where it is useful for the WMF, but not pushing that usability further for other MediaWiki deployments.
=== Who would benefit ===
* MediaWiki site operators who want better insight into their operational issues
* MediaWiki developers who don't want to think about choosing between two largely compatible but very different debug logging layers
=== Proposed solution ===
* Replace all usage of <code>wfDebug*</code> in MediaWiki core with direct PSR3 usage.
* Add [https://github.com/Seldaek/monolog Monolog] as a core dependency and the default debug logging solution.
* Make configuring Monolog easier by adding helpers in the <code>MediaWiki\Logger\Monolog</code> namespace.
* Remove <code>MediaWiki\Logger\LegacyLogger</code> from core. (It could be made a library if there are people who really love it and want to keep maintaining a homegrown debug log formatting and routing layer.)
* Deprecate <code>wfDebug()</code>, <code>wfDebugLog()</code>, <code>wfLogDBError()</code>, and <code>wfErrorLog()</code>.
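The pattern being proposed - named channels plus structured context travelling with the event, with formatting left to handlers - is the same design Python's stdlib <code>logging</code> uses, so it can illustrate the PSR-3/Monolog shape without any PHP:

```python
import logging

# PSR-3-style usage illustrated with Python's logging module:
# a named channel plus structured context, instead of
# wfDebugLog()-style pre-formatted strings.
class ListHandler(logging.Handler):
    """Collects log records in a list (stands in for a Monolog handler)."""
    def __init__(self):
        super().__init__()
        self.records = []

    def emit(self, record):
        self.records.append(record)

logger = logging.getLogger("exception")  # channel name, like a PSR-3 logger
logger.setLevel(logging.DEBUG)
handler = ListHandler()
logger.addHandler(handler)

# Structured context travels with the event; formatting and routing
# are the handler's job, as with Monolog processors/formatters.
logger.debug("rendering failed", extra={"page": "Main_Page"})

rec = handler.records[0]
print(rec.name, rec.page)
```

In the Monolog world the <code>ListHandler</code> role is played by configurable handlers and formatters, which is exactly the flexibility <code>wfDebug*</code> lacks.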
=== See also ===
* [https://phabricator.wikimedia.org/T88649 T88649: Convert MediaWiki logging to direct use of PSR-3]
* [https://phabricator.wikimedia.org/T128224 T128224: It is difficult to determine in which log bucket a logging call will end up]
* [https://phabricator.wikimedia.org/T92585 T92585: MediaWiki logging/debugging documentation is out of date]
* [https://phabricator.wikimedia.org/T108650 T108650: $wgMWLoggerDefaultSpi requires special usage, needs to be fixed or documented]
* [https://phabricator.wikimedia.org/T142313 T142313: Add global logging context]
* [https://phabricator.wikimedia.org/T114532 T114532: Monolog logging has no way to interact with MWDebug::debugMsg()]
=== Endorsements (T155552) ===
#
=== Support (T155552) ===
#
== Get rid of UTF-8 encoded as latin-1 ==
{{tracked|T155529}}{{../buttons|task=T155529|title=Get rid of UTF-8 encoded as latin-1}}
=== Problem ===
(copy and paste from https://www.mediawiki.org/wiki/Toolserver:Code_snippets#Fix_UTF-8_encoded_as_latin-1 )
Versions of MySQL before 5 had neither binary nor UTF-8 support (or at least nobody cared). A common workaround was to encode the text as a UTF-8 byte sequence and give that to MySQL, which read the bytes in as if they were latin-1 characters. This worked alright until MySQL and VARCHAR became more Unicode aware. Now, depending on settings, it may convert the latin-1 bytes to Unicode code points and then encode to your specified encoding. So <code>è</code> (U+00E8, UTF-8: <code>c3 a8</code>) becomes <code>Ã¨</code> (U+00C3 U+00A8, UTF-8: <code>c3 83 c2 a8</code>); the equivalent Python transformation is <code>u'è'.encode('utf-8').decode('latin').encode('utf-8')</code>. This double encoding is the reason why JOINs will fail with the newer VARBINARY fields used everywhere by MediaWiki now.
This is quite complicated and annoying to work with, and a source of bugs.
=== Who would benefit ===
Anyone who uses database access on Tool Labs to run queries.
=== Proposed solution ===
Convert the remaining legacy fields and tables so we don't have to do tricks like <code>CONVERT( CONVERT( CONVERT( your_column USING latin1) USING binary) USING utf8)</code> any more.
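The double encoding described above, and the repair that the conversion would make unnecessary, can be checked directly with a Python round trip (Python 3 here, so no <code>u''</code> prefix):

```python
# Reproduce the double encoding described above, then reverse it.
good = "\u00e8"                                    # 'è', U+00E8
mojibake = good.encode("utf-8").decode("latin-1")  # 'Ã¨' (U+00C3 U+00A8)
double = mojibake.encode("utf-8")                  # bytes c3 83 c2 a8

# The fix is the inverse trip: decode UTF-8, re-encode latin-1, decode UTF-8.
repaired = double.decode("utf-8").encode("latin-1").decode("utf-8")
print(mojibake, repaired)
```

This is the same transformation the nested <code>CONVERT(...)</code> trick performs on the SQL side.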
=== Endorsements (T155529) ===
#
=== Support (T155529) ===
#
== Improve documentation of OOjs UI ==
{{tracked|T155473}}{{../buttons|task=T155473|title=Improve documentation of OOjs UI}}
=== Problem ===
Extension developers and even core MediaWiki developers want to, or should want to, use a standardized design. Even if we don't like the standard itself, by using a standard we can then inherit improvements to the standard with little to no additional work.
OOjs UI is the standard. However, it is difficult to work with, owing to poor documentation. This discourages its use and adoption as a standard.
Compare this to other frameworks such as Semantic UI that make it very easy to use its design. You can use its JavaScript widgets for specialized features, or just use the CSS to get the look and feel if you are dealing with static HTML.
=== Who would benefit ===
People developing new extensions and those working on improvements to MediaWiki core and existing extensions. The Wikimedia Foundation would benefit from less time wasted on figuring out this system.
=== Proposed solution ===
Better documentation that makes it easier for developers to integrate OOjs UI.
=== Endorsements (T155473) ===
#
=== Support (T155473) ===
#
== Remove QUnit CompletenessTest ==
{{tracked|T155194}}{{../buttons|task=T155194|title=Remove QUnit CompletenessTest}}
The CompletenessTest was my attempt at getting a basic code coverage report using run-time inspection instead of static instrumentation.
It was never fully developed, remained somewhat unstable, isn't used by Jenkins or otherwise enabled or encouraged, and its results are not publishable either (it only works locally, on the Special:JavaScriptTest HTML view).
The "export" feature for Special:JavaScriptTest introduced in 2014 for Karma and TestSwarm (ba50b3255) lacked support for loading the CompletenessTest. And when the regular "skinned" mode was deprecated and eventually removed last year in 0f9e4ca0f, it essentially stopped being used, as far as I can see.
From a Git-wide search I see that various Wikibase repositories still have references to it, so I won't remove it just yet. But it'd be good to know for sure if and how it's being used there. There's no rush behind its removal, but if it's not being used, I'd rather we remove it from core.
[https://phabricator.wikimedia.org/F5278216 Image]
=== Endorsements (T155194) ===
#
=== Support (T155194) ===
#
== Organize a MediaWiki Documentation Day (similar to the Gerrit Cleanup Day) ==
{{tracked|T126500}}{{../buttons|task=T126500|title=Organize a MediaWiki Documentation Day (similar to the Gerrit Cleanup Day)}}
It would be awesome if we devoted 1 entire day to doing nothing but documenting MediaWiki. There are countless classes and functions in MediaWiki core and its extensions that have no documentation. There is also a serious lack of high level documentation for developers. The problem is that documentation is often an after-thought and rarely gets focused attention from developers.
Some suggested projects that could be part of MediaWiki Documentation Day:
* Tackle some of the blocking tasks at [https://phabricator.wikimedia.org/T2001 T2001: Documentation is out of date, incomplete (tracking)] or [https://phabricator.wikimedia.org/tag/tracking #tracking].
* Add in-code function descriptions for important functions that don't have them.
* Add in-code class descriptions to classes that don't have them.
* Create documentation pages on MediaWiki.org for important classes in core that don't have them. See [https://www.mediawiki.org/wiki/Manual:User.php Manual:User.php] for an example of a page that does exist. High-level documentation, like [https://www.mediawiki.org/wiki/Manual:Title.php#Title_structure Manual:Title.php#Title_structure] is especially useful.
* Clean up our [https://www.mediawiki.org/wiki/Category:Outdated_pages outdated documentation] on MediaWiki.org.
* Create README files for all the extensions that don't have them and make sure that all configuration variables are documented there.
* Create some high-level documentation on how to write new API modules.
'''See also:''' the surviving tail of the OOUI dialog example (dialog text: "A simple dialog window. Press 'Esc' to close.", using <code>div</code>s as the source of the dialogue):
```
	this.$body.append( this.content.$element );
};

MyDialog.prototype.getBodyHeight = function () {
	return this.content.$element.outerHeight( true );
};

var myDialog = new MyDialog( { size: 'medium' } );
var windowManager = new OO.ui.WindowManager();
$( 'body' ).append( windowManager.$element );
windowManager.addWindows( [ myDialog ] );
windowManager.openWindow( myDialog );
```
=== Endorsements (T96041) ===
#
=== Support (T96041) ===
#
== ApiQueryImageInfo is crufty, needs rewrite ==
{{tracked|T89971}}{{../buttons|task=T89971|title=ApiQueryImageInfo is crufty, needs rewrite}}
The code is a mess, the limit semantics make no sense, and we have several other options that don't really fit non-images.
The best thing to do here is probably to just write a prop=fileinfo module from scratch so we don't have to worry about backwards compatibility, and then deprecate prop=imageinfo.
Current plans:
* Right now, iilimit specifies the maximum number of revisions to return per file, which is inconsistent with the rest of the API and isn't particularly sane. For fileinfo, filimit will limit the number of file-info objects returned per result, and a separate "fioldversions" property (default 0; values: an integer or 'all') will specify the maximum number of revisions to return per file.
* fistart/fiend may result in the info for the current revision not being returned.
* iiprops has three different metadata properties. There really should be only one, and if possible it should be key-value pairs rather than a list of objects with key and value properties.
** Metadata needs to be separately continuable, see [https://phabricator.wikimedia.org/T86611 T86611: API does not fail gracefully when data is too large].
* Figure out something sane to replace iiurlwidth/iiurlheight/iiurlparam. Maybe multi-valued fiparams?
* prop=stashimageinfo is very odd: it's a prop module, but it doesn't use any titles. It would make more sense for prop=fileinfo to take a fifilekeys parameter instead of having a whole separate module for this.
* prop=videoinfo really isn't needed either. Instead we should make it possible for extensions to add additional info to the fileinfo response.
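From a client's perspective, the proposed semantics could look like the sketch below. The fi* parameter names come from the plan above; none of them exist in the live API yet, so this is purely illustrative.

```python
def fileinfo_params(titles, limit=10, old_versions=0, props=('timestamp', 'url')):
    """Build action=query parameters for the proposed prop=fileinfo module.

    Unlike iilimit (max revisions per file), the proposed filimit caps the
    number of file-info objects per result, while fioldversions separately
    controls how many old revisions to include per file (0, an integer,
    or 'all').
    """
    return {
        'action': 'query',
        'titles': '|'.join(titles),
        'prop': 'fileinfo',
        'filimit': limit,
        'fioldversions': old_versions,
        'fiprop': '|'.join(props),
    }
```

Splitting the "how many files" and "how many revisions per file" knobs is the point of the redesign; with iilimit the two concerns are conflated.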
=== Endorsements (T89971) ===
#
=== Support (T89971) ===
#
== Expose php warnings in mediawiki-config more visibly ==
{{tracked|T87447}}{{../buttons|task=T87447|title=Expose php warnings in mediawiki-config more visibly}}
https://gerrit.wikimedia.org/r/#/c/185667/ would probably not have happened if we had noticed the PHP warnings. Changes to mediawiki-config are difficult to test locally, so spelling mistakes are hard to spot.
Most links in https://wikitech.wikimedia.org/wiki/How_to_deploy_code#Test_and_monitor_your_live_code seem to contain no data or are broken, and they don't include PHP warnings anyway, only fatals and exceptions.
=== Endorsements (T87447) ===
#
=== Support (T87447) ===
#
== Implement addition of un-redirected pages to Special:NewPages and Special:NewPagesFeed ==
{{tracked|T92621}}{{../buttons|task=T92621|title=Implement addition of un-redirected pages to Special:NewPages and Special:NewPagesFeed}}
Per consensus [https://en.wikipedia.org/wiki/Wikipedia:Village_pump_(proposals)/Archive_117#Proposed_technical_change:_show_pages_expanded_from_redirects_on_Special:NewPages_and_Special:NewPagesFeed here], en-wiki's [https://en.wikipedia.org/wiki/User:Samwalton9 Samwalton9] activated [https://en.wikipedia.org/wiki/Special:AbuseFilter/342 the relevant abuse filter] on March 7th. After monitoring the [https://en.wikipedia.org/w/index.php?title=Special:AbuseLog&wpSearchFilter=342 filter log] for a week, we are satisfied that the behavior is ready to be implemented on Special:NewPages and Special:NewPagesFeed, with two caveats:
# Edits that ''revert'' a redirect to a prior state with content should ''not'' appear on these patrol pages.
# Pages where a redirect has been restored should disappear from the patrol pages, by analogy with new pages that have been deleted.
It's our understanding that this behavior can be implemented in the PageCuration extension. Thanks!
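The two caveats can be sketched as a filtering rule over a page's revision history. This is a hypothetical model for illustration, not the actual extension code.

```python
def should_list_on_newpages(revisions):
    """Decide whether a page expanded from a redirect belongs on the
    new-pages patrol lists, per the two caveats above.

    revisions: oldest-first list of dicts with an 'is_redirect' bool.
    Hypothetical model only.
    """
    if not revisions:
        return False
    if revisions[-1]['is_redirect']:
        # Caveat 2: the redirect has been restored, so the page should
        # disappear from the patrol pages.
        return False
    # Caveat 1: if the page had non-redirect content before it became a
    # redirect, un-redirecting it is a revert to a prior state, not a
    # genuinely new page.
    first_redirect = next(
        (i for i, rev in enumerate(revisions) if rev['is_redirect']), None)
    if first_redirect is not None and first_redirect > 0:
        return False
    return True
```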
=== Endorsements (T92621) ===
#
=== Support (T92621) ===
#
== CodeEditor: Migrate from Ace to CodeMirror ==
{{tracked|T50826}}{{../buttons|task=T50826|title=CodeEditor: Migrate from Ace to CodeMirror}}
Has more features, is lighter(?) and has much better support for Unicode / BiDi text rendering (see http://codemirror.net/demo/bidi.html).
Plus Brion likes it :)
--------------------------
'''Version''': unspecified
'''Severity''': enhancement
=== Endorsements (T50826) ===
#
=== Support (T50826) ===
#
== Relocate CI generated docs and coverage reports ==
{{tracked|T137890}}{{../buttons|task=T137890|title=Relocate CI generated docs and coverage reports}}
Part of / blocker of integration-publisher.
* A Jenkins job publish-on-gallium executes on gallium to rsync the docs from the publisher instance.
We would need an instance to host the material with PHP5 (some docs need it), most probably outside the labs support hosts, on an isolated network, and some intermediary system for the Nodepool building instances to push to.
* Doc and coverage reports are still generated on Nodepool instances
* Building instance push to a publisher system
** might reuse integration-publisher
* Jenkins (or another system) runs a task that fetches from the publisher system to the doc.wikimedia.org document root.
We need to find a target host to migrate the documentation to. It would run Apache / PHP5 and serve the content generated by the various code repositories that have doc/coverage enabled.
== Flow originating from webhost ==
{| class="wikitable"
! Proto !! source host !! source IP !! dest network !! dest Host !! dest IP !! dest port !! description
|-
| TCP || webhost || ??? || labs project || publisher || 10.68.16.255 || 873 || Jenkins job doing an rsync from the webhost to fetch material
|-
|}
== Flow going to webhost ==
{| class="wikitable"
! Proto !! source network !! source host !! source IP !! dest Host !! dest port !! description
|-
| TCP || labs support hosts || scandium || 10.64.4.12 || webhost || 22 || Jenkins master ssh connection
|-
| TCP || production || misc varnish || - || webhost || 80 || Misc cache to Apache serving doc.wm.o
|-
|}
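The rsync step in the first flow table might be assembled like this. Host names, the module name, and the destination path are illustrative, taken only from the table above.

```python
def rsync_command(module='doc', source_host='publisher', dest='/srv/docroot'):
    """Assemble the rsync invocation the Jenkins job would run on the
    webhost to fetch published material from the publisher instance.

    Port 873 is the rsync daemon port from the flow table; all names
    here are placeholders, not the real deployment values.
    """
    return [
        'rsync', '--archive', '--delete', '--port=873',
        'rsync://%s/%s/' % (source_host, module),
        dest,
    ]
```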
=== Endorsements (T137890) ===
#
=== Support (T137890) ===
#
== Enable and document "WIP" workflow status in Gerrit ==
{{tracked|T135245}}{{../buttons|task=T135245|title=Enable and document "WIP" workflow status in Gerrit}}
Mailing list (wikitech-l) discussion summary from [https://phabricator.wikimedia.org/p/greg greg]:
https://lists.wikimedia.org/pipermail/wikitech-l/2016-May/085611.html
How to do it from Tim L (from Sept 2015) https://lists.wikimedia.org/pipermail/wikitech-l/2015-September/083172.html :
{{quotation|1=
Untested, the change would be something like:
```
diff --git a/project.config b/project.config
index 151eebd..93291e1 100644
--- a/project.config
+++ b/project.config
@@ -12,6 +12,7 @@
owner = group ldap/ops
label-Code-Review = -2..+2 group Project Owners
label-Code-Review = -1..+1 group Registered Users
+ label-WIP = -1..+0 group Registered Users
create = group Project Owners
editTopicName = group Registered Users
viewDrafts = group JenkinsBot
@@ -78,6 +79,11 @@
value = +2 Looks good to me, approved
copyAllScoresOnTrivialRebase = true
copyAllScoresIfNoCodeChange = true
+[label "WIP"]
+ function = AnyWithBlock
+ value = -1 Work in progress
+ value = 0 Ready for reviews
+ copyMinScore = true
[access "refs/meta/dashboards/*"]
create = group Project Owners
create = group platform-engineering
```
}}
Related: mediawiki-core-qunit-jessie runs against mediawiki/core and has no dependencies besides mediawiki/vendor. The failing builds are:
{| class="wikitable"
!Time !! Gerrit !! Console !! Chromium
|-
| Dec 12 16:05 UTC || [https://gerrit.wikimedia.org/r/#/c/325049/13 325049/13] || [https://integration.wikimedia.org/ci/job/mediawiki-core-qunit-jessie/9842/console 9842] || Chrome 53.0.2785
|-
| Dec 12 16:14:41 UTC || [https://gerrit.wikimedia.org/r/#/c/325049/14 325049/14] || [https://integration.wikimedia.org/ci/job/mediawiki-core-qunit-jessie/9843/console 9843] || Chrome 53.0.2785
|-
| Dec 16 21:47:08 UTC || [https://gerrit.wikimedia.org/r/#/c/325049/15 325049/15] || [https://integration.wikimedia.org/ci/job/mediawiki-core-qunit-jessie/10154/console 10154] || Chrome 55.0.2883
|-
|}
Note that Gerrit change 325049 hasn't been merged. I have rebuilt 9842 twice (via the Rebuild link in the Jenkins web interface) and it failed with the same error, so those patches seem to reproduce the issue at hand.
I have triggered the build against the master branch and it passed three times: [https://integration.wikimedia.org/ci/job/mediawiki-core-qunit-jessie/10304/console 10304], [https://integration.wikimedia.org/ci/job/mediawiki-core-qunit-jessie/10305/console 10305], [https://integration.wikimedia.org/ci/job/mediawiki-core-qunit-jessie/10306/console 10306].
=== Endorsements (T153597) ===
#
=== Support (T153597) ===
#
== MediaWiki support for Composer equivalent for JavaScript packages ==
{{tracked|T107561}}{{../buttons|task=T107561|title=MediaWiki support for Composer equivalent for JavaScript packages}}
We should have an equivalent for JavaScript to what we have with Composer for PHP. A simple way to manage dependencies and install required JavaScript modules automatically.
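What such a tool would have to do, at minimum, is compare an extension's declared JavaScript dependencies against what is installed, the way Composer reports missing PHP packages. A sketch (no such MediaWiki tooling exists yet; exact-version matching is a simplification):

```python
def missing_js_packages(required, installed):
    """Report dependency problems for an extension.

    required:  dict of package name -> exact required version (simplified;
               a real tool would support version ranges).
    installed: dict of package name -> installed version.
    Hypothetical helper for illustration only.
    """
    problems = {}
    for package, version in required.items():
        have = installed.get(package)
        if have is None:
            problems[package] = 'missing (want %s)' % version
        elif have != version:
            problems[package] = 'have %s, want %s' % (have, version)
    return problems
```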
I'm opening this bug (similar to ...) about a <code>Skin.mustache</code> that includes the partial template <code>sidebar.mustache</code> using <code><nowiki>{{>sidebar}}</nowiki></code>. I notice on my local wiki that if I edit <code>Skin.mustache</code> and view a page with the skin, I see the change (good!), but if I edit the partial <code>sidebar.mustache</code>, I don't see the change, even if I use <code>?action=purge</code> (bug!). Looking at <code>includes/TemplateParser.php</code> in core, it does a simple ...
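The stale-partial behaviour described above can be modelled with a toy renderer whose cache key covers only the root template's content, so edits to an included partial are invisible until the root itself changes. This is an illustration of the failure mode, not the real TemplateParser code.

```python
class TinyTemplateParser:
    """Toy mustache-like renderer demonstrating the stale-partial bug."""

    def __init__(self, templates):
        self.templates = templates   # name -> source text
        self.cache = {}              # (name, root source) -> rendered text

    def render(self, name):
        src = self.templates[name]
        key = (name, src)            # key covers only the ROOT's content
        if key not in self.cache:
            out = src
            # expand {{>partial}} includes at compile time
            while '{{>' in out:
                start = out.index('{{>')
                end = out.index('}}', start)
                partial = out[start + 3:end].strip()
                out = out[:start] + self.templates[partial] + out[end + 2:]
            self.cache[key] = out
        return self.cache[key]
```

Because the expanded partial text is baked into the cached result but absent from the key, editing the partial never invalidates the cache entry, while editing the root template does.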
integration/config
repo (e.g. editing a huge YAML file) and whitelisting of users (who may otherwise be unable to trigger some Jenkins jobs via Gerrit).
=== Proposed solution ===
Travis CI ''et similia'' only require a simple YAML file in the root directory of each code repository. Supporting such a model can greatly streamline the addition of CI to repositories managed by small volunteer teams, decrease reliance on third-party services and ease the work of the few employees dedicated to CI infrastructure.
----
I discussed this on [https://phabricator.wikimedia.org/tag/wikimedia-releng #wikimedia-releng] and [https://phabricator.wikimedia.org/p/mmodell mmodell] thought it was a nice idea; he is the one who came up with the name for the file. This will also let us customize CI per repository without needing to create a separate job.
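A repository-local CI manifest could look something like the following. The file name, keys, and values here are entirely hypothetical; nothing like this is currently read by Wikimedia CI.

```yaml
# Hypothetical per-repository CI configuration (illustrative name and keys)
jobs:
  - name: php-lint
    command: composer test
  - name: qunit
    command: npm test
trigger:
  # who may trigger these jobs from Gerrit, instead of whitelisting
  # users centrally in integration/config
  users: registered
```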
{| class="wikitable"
|-
| [Bb]ug 9123000#c123 || bug 9123000 [https://old-bugzilla.wikimedia.org/show_bug.cgi?id=9123000#c123 comment 123] (T9125000) ||
|-
| http://bugzilla.wikimedia.org/show_bug.cgi?id=9123000 || http://bugzilla.wikimedia.org/show_bug.cgi?id=9123000 (T9125000) ||
|-
| https://bugzilla.wikimedia.org/show_bug.cgi?id=9123000 || https://bugzilla.wikimedia.org/show_bug.cgi?id=9123000 (T9125000) ||
|-
| http://bugzilla.wikimedia.org/9123000 || http://bugzilla.wikimedia.org/9123000 (T9125000) ||
|-
| https://bugzilla.wikimedia.org/9123000 || https://bugzilla.wikimedia.org/9123000 (T9125000) ||
|-
| http://bugs.wikimedia.org/show_bug.cgi?id=9123000 || http://bugs.wikimedia.org/show_bug.cgi?id=9123000 (T9125000) ||
|-
| https://bugs.wikimedia.org/show_bug.cgi?id=9123000 || https://bugs.wikimedia.org/show_bug.cgi?id=9123000 (T9125000) ||
|-
| http://bugs.wikimedia.org/9123000 || http://bugs.wikimedia.org/9123000 (T9125000) ||
|-
| https://bugs.wikimedia.org/9123000 || https://bugs.wikimedia.org/9123000 (T9125000) ||
|-
|}
----
Use case:
@Chasemp: For sentences imported from Bugzilla tickets in comments like
'''* Bug 12345 has been marked as a duplicate of this bug. *'''