
Wikipedia:WikiProject Conservatism

From Wikipedia, the free encyclopedia


    Welcome to WikiProject Conservatism! Whether you're a newcomer or a regular, you'll receive encouragement and recognition for your achievements with conservatism-related articles. This project does not extol any point of view, political or otherwise, other than that of a neutral documentarian. Partly for this reason, the project's scope has long been conservatism broadly construed, taking in a healthy periphery of articles (e.g., more academic ones) for contextualization.

    Major alerts

    A broad collection of discussions that could lead to significant changes to related articles

    Did you know

    Articles for deletion

    • 06 Jun 2025 – Revue des questions historiques (talk · edit · hist) was AfDed by Mathglot (t · c); see discussion (4 participants; relisted)
    • 28 May 2025 – Avner Netanyahu (talk · edit · hist) AfDed by Longhornsg (t · c) was closed as delete by Asilvering (t · c) on 14 Jun 2025; see discussion (8 participants; relisted)
    • 27 May 2025 – Colcom Foundation (talk · edit · hist) AfDed by 30Four (t · c) was closed as keep by Malinaccier (t · c) on 19 Jun 2025; see discussion (8 participants; relisted)

    Categories for discussion

    Redirects for discussion

    Files for discussion

    Good article nominees

    Featured article reviews

    Good article reassessments

    Requests for comments

    Peer reviews

    Requested moves

    Articles to be merged

    Articles to be split

    Watchlists

    WatchAll (Excerpt)
    Excerpt from watchlist concerning all the articles in the project's scope
    Note that your own edits, minor edits, and bot edits are hidden in this tab

    List of abbreviations (help):
    D – Edit made at Wikidata
    r – Edit flagged by ORES
    N – New page
    m – Minor edit
    b – Bot edit
    (±123) – Page byte size change


    For a version of this watchlist about 3X as long, visit: Wikipedia:WikiProject Conservatism/All recent changes
    WatchHot (Excerpt)
    A list of 10 related articles with the most (recent) edits total
    328 edits List of contributors to Project 2025
    83 edits Political appointments of the second Trump administration
    65 edits Mahathir Mohamad
    61 edits Kaitlin Bennett
    45 edits Second presidency of Donald Trump
    42 edits Trump–Musk feud
    40 edits Department of Government Efficiency
    35 edits Donald Trump
    33 edits Benjamin Netanyahu
    31 edits Curtis Yarvin

    These are the articles that have been edited the most within the last seven days. Last updated 20 June 2025 by HotArticlesBot.



    For a version of this watchlist about 5X as long, visit: Wikipedia:WikiProject Conservatism/Hot articles recent changes
    WatchPop (Excerpt)
    A list of 500 related articles with the most (recent) views total

    This is a list of pages in the scope of Wikipedia:WikiProject Conservatism along with pageviews.

    To report bugs, please write on the Community tech bot talk page on Meta.

    List

    Period: 2025-05-01 to 2025-05-31

    Total views: 72,291,566

    Updated: 12:04, 6 June 2025 (UTC)

    Rank Page title Views Daily average Assessment Importance
    1 Donald Trump 1,723,245 55,588 B High
    2 Heil Hitler (song) 1,107,087 35,712 C Low
    3 Elon Musk 1,013,649 32,698 GA Low
    4 Phil Robertson 902,417 29,110 C Low
    5 Karoline Leavitt 751,559 24,243 B Low
    6 Kristi Noem 726,676 23,441 B Low
    7 Jeanine Pirro 598,885 19,318 B Low
    8 JD Vance 530,757 17,121 B Mid
    9 Vladimir Putin 509,545 16,436 B High
    10 Stephen Miller (political advisor) 495,030 15,968 B Low
    11 Narendra Modi 463,750 14,959 GA Top
    12 Friedrich Merz 444,670 14,344 C Mid
    13 Mike Waltz 425,519 13,726 C Low
    14 Marco Rubio 404,725 13,055 B Mid
    15 Imran Khan 399,996 12,903 C Low
    16 George W. Bush 393,225 12,684 B High
    17 Winston Churchill 391,942 12,643 GA Top
    18 Kelsey Grammer 389,001 12,548 B Low
    19 Ronald Reagan 386,465 12,466 FA Top
    20 Charlie Kirk 369,400 11,916 B Low
    21 Reform UK 364,852 11,769 C High
    22 Benjamin Netanyahu 327,633 10,568 B Mid
    23 Pete Hegseth 326,144 10,520 B Low
    24 Richard Nixon 324,257 10,459 FA High
    25 Gary Sinise 299,924 9,674 C Low
    26 George H. W. Bush 294,476 9,499 B High
    27 Pam Bondi 294,227 9,491 C Low
    28 Nigel Farage 280,018 9,032 B Mid
    29 Zionism 279,500 9,016 B Low
    30 White genocide conspiracy theory 276,663 8,924 B Low
    31 Boris Johnson 274,988 8,870 B High
    32 Barron Trump 272,564 8,792 B Low
    33 Dwight D. Eisenhower 264,744 8,540 B High
    34 Theodore Roosevelt 260,664 8,408 B High
    35 Laura Loomer 257,072 8,292 C Low
    36 Republican Party (United States) 252,691 8,151 B Top
    37 Mel Gibson 238,957 7,708 B Mid
    38 Jon Voight 232,836 7,510 C Low
    39 Cold War 227,962 7,353 B Top
    40 Liberal–National Coalition 221,921 7,158 C High
    41 Muhammad Ali Jinnah 218,179 7,038 FA High
    42 Jordan Peterson 213,489 6,886 B Low
    43 Project 2025 207,210 6,684 B Mid
    44 John Wayne 204,965 6,611 B Low
    45 French Revolution 204,811 6,606 B Top
    46 2025 German federal election 196,108 6,326 B High
    47 Margaret Thatcher 194,588 6,277 A Top
    48 Alternative for Germany 188,868 6,092 C Low
    49 Gerald Ford 188,214 6,071 B High
    50 Department of Government Efficiency 185,130 5,971 B High
    51 Trump family 175,552 5,662 B Low
    52 Rishi Sunak 175,304 5,654 B High
    53 Liberal Party of Australia 174,711 5,635 C High
    54 Bharatiya Janata Party 172,743 5,572 GA Top
    55 Linda McMahon 171,636 5,536 B Low
    56 Candace Owens 168,621 5,439 B Low
    57 Marjorie Taylor Greene 166,623 5,374 GA Low
    58 Taliban 164,957 5,321 B High
    59 Nawaz Sharif 162,240 5,233 B Unknown
    60 Kemi Badenoch 161,903 5,222 B Low
    61 Chuck Norris 160,492 5,177 B Low
    62 Stephen Baldwin 159,348 5,140 B Low
    63 Fyodor Dostoevsky 157,415 5,077 B Low
    64 Alliance for the Union of Romanians 156,684 5,054 B Unknown
    65 Dave Mustaine 156,145 5,036 C Low
    66 Nancy Mace 146,698 4,732 B Low
    67 Robert Duvall 142,197 4,587 B Low
    68 Recep Tayyip Erdoğan 142,189 4,586 B High
    69 Nayib Bukele 139,964 4,514 GA Low
    70 People's Action Party 139,257 4,492 C Mid
    71 Curtis Yarvin 136,976 4,418 C High
    72 Francisco Franco 133,375 4,302 C Mid
    73 Chiang Kai-shek 132,312 4,268 C Low
    74 Dan Bongino 130,827 4,220 C Mid
    75 Shirley Temple 130,373 4,205 B Low
    76 Megyn Kelly 129,322 4,171 B Low
    77 National Party of Australia 129,301 4,171 C High
    78 Rupert Murdoch 129,124 4,165 B Low
    79 Herbert Hoover 127,656 4,117 B Mid
    80 Manosphere 125,765 4,056 B Low
    81 James Caan 123,664 3,989 C Low
    82 Dick Cheney 122,475 3,950 C Mid
    83 Conservative Party (UK) 122,206 3,942 B High
    84 William McKinley 121,020 3,903 FA Low
    85 John Malkovich 120,342 3,882 C Low
    86 Bo Derek 120,155 3,875 Start Low
    87 Constitution of the United States 119,676 3,860 B High
    88 Rashtriya Swayamsevak Sangh 118,920 3,836 B Top
    89 Gary Cooper 118,691 3,828 FA Mid
    90 Grover Cleveland 118,558 3,824 FA Mid
    91 New World Order conspiracy theory 118,406 3,819 GA High
    92 Kayleigh McEnany 117,670 3,795 C Low
    93 Nick Fuentes 117,527 3,791 B Low
    94 Atal Bihari Vajpayee 117,311 3,784 GA High
    95 James Stewart 116,981 3,773 GA Low
    96 David Horowitz 115,972 3,741 B Mid
    97 George Santos 115,427 3,723 B Low
    98 Mike Johnson 114,642 3,698 C Mid
    99 Ben Shapiro 114,570 3,695 C Mid
    100 Abdul Qadeer Khan 114,055 3,679 C Low
    101 Charles de Gaulle 112,865 3,640 B Mid
    102 QAnon 110,539 3,565 GA Mid
    103 Enoch Powell 109,483 3,531 C High
    104 Angela Merkel 108,690 3,506 GA High
    105 Liz Truss 108,521 3,500 FA Mid
    106 John Roberts 108,474 3,499 B High
    107 Matt Gaetz 108,470 3,499 B Low
    108 Katter's Australian Party 107,628 3,471 C Low
    109 Lara Trump 106,309 3,429 C Low
    110 Dana Perino 106,134 3,423 C Low
    111 William Howard Taft 105,433 3,401 FA Mid
    112 John McCain 103,486 3,338 FA Mid
    113 Trump derangement syndrome 103,009 3,322 C Mid
    114 Woke 101,573 3,276 B Top
    115 Douglas Murray (author) 101,028 3,258 C Low
    116 Rachel Campos-Duffy 100,955 3,256 Start Low
    117 Andrew Scheer 100,883 3,254 C Unknown
    118 Clark Gable 100,450 3,240 B Low
    119 Warren G. Harding 100,287 3,235 FA Low
    120 Otto von Bismarck 99,867 3,221 B High
    121 Calvin Coolidge 99,392 3,206 FA High
    122 Russell Vought 99,387 3,206 C Mid
    123 Zia-ul-Haq 99,327 3,204 B High
    124 John Howard 99,299 3,203 B Mid
    125 Second presidency of Donald Trump 98,413 3,174 C Low
    126 James A. Garfield 98,001 3,161 FA Low
    127 Shinzo Abe 97,563 3,147 B Mid
    128 Tom Homan 97,181 3,134 C Low
    129 Dmitry Medvedev 96,979 3,128 C High
    130 Deng Xiaoping 96,458 3,111 B Low
    131 Ted Cruz 95,017 3,065 B Mid
    132 Javier Milei 94,974 3,063 B Mid
    133 Mike Pence 94,738 3,056 B Mid
    134 Ayn Rand 94,644 3,053 GA Mid
    135 Steve Bannon 94,223 3,039 B Mid
    136 Conservative Party of Canada 93,198 3,006 B High
    137 Patricia Heaton 92,623 2,987 C Low
    138 Charlton Heston 90,605 2,922 B Low
    139 Falun Gong 90,503 2,919 B Mid
    140 Neville Chamberlain 90,221 2,910 FA Mid
    141 James Woods 89,724 2,894 Start Low
    142 Sean Hannity 89,492 2,886 B Mid
    143 Red states and blue states 88,916 2,868 C Mid
    144 Law and Justice 88,650 2,859 C High
    145 Charles Lindbergh 88,561 2,856 B Low
    146 Călin Georgescu 88,347 2,849 C Low
    147 Danielle Smith 87,684 2,828 B Unknown
    148 James O'Keefe 86,361 2,785 C Low
    149 Trumpism 86,273 2,783 B Mid
    150 Amy Coney Barrett 85,977 2,773 B Low
    151 Paul von Hindenburg 85,916 2,771 C Mid
    152 Jemima Goldsmith 85,833 2,768 C Unknown
    153 Bob Katter 84,895 2,738 C Low
    154 Clarence Thomas 84,537 2,727 B Mid
    155 Christian Democratic Union of Germany 84,435 2,723 C High
    156 Viktor Orbán 83,952 2,708 C Mid
    157 Tony Hinchcliffe 83,455 2,692 B Low
    158 Jesse Watters 83,098 2,680 Start Low
    159 Dark Enlightenment 82,945 2,675 Start Mid
    160 T. S. Eliot 82,592 2,664 B Low
    161 Tucker Carlson 82,486 2,660 B High
    162 Far-right politics 82,443 2,659 B Low
    163 Khawaja Asif 81,837 2,639 B Unknown
    164 Condoleezza Rice 81,288 2,622 B Mid
    165 Mitt Romney 81,173 2,618 FA High
    166 Pauline Hanson's One Nation 80,720 2,603 C Mid
    167 Ron DeSantis 80,236 2,588 B Mid
    168 Karl Malone 80,184 2,586 Start Low
    169 John Kennedy (Louisiana politician) 80,118 2,584 C Low
    170 Greg Gutfeld 79,769 2,573 C Low
    171 Donald Rumsfeld 79,563 2,566 B Mid
    172 Alice Weidel 78,654 2,537 C Low
    173 Rivers of Blood speech 78,433 2,530 C Low
    174 David Cameron 78,401 2,529 B Top
    175 Arthur Wellesley, 1st Duke of Wellington 78,270 2,524 B Low
    176 GypsyCrusader 77,471 2,499 C High
    177 Lee Hsien Loong 77,320 2,494 C Mid
    178 Steve Doocy 77,124 2,487 Start Unknown
    179 Fox News 76,977 2,483 C Mid
    180 Bob Hope 76,686 2,473 B Low
    181 Bing Crosby 76,369 2,463 B Low
    182 Generation 76,209 2,458 B Mid
    183 Rudy Giuliani 75,856 2,446 C Mid
    184 Thomas Sowell 75,681 2,441 C Mid
    185 Matt Walsh (political commentator) 75,224 2,426 C Low
    186 Iran–Contra affair 74,763 2,411 GA Low
    187 Gadsden flag 74,729 2,410 B Low
    188 1964 United States presidential election 74,304 2,396 C Mid
    189 Anders Behring Breivik 74,029 2,388 C Low
    190 Cicero 73,971 2,386 B Mid
    191 George Wallace 73,784 2,380 B Mid
    192 Truth Social 73,608 2,374 B Low
    193 McCarthyism 72,861 2,350 C High
    194 Right-wing politics 72,171 2,328 C Top
    195 Sarah Palin 71,885 2,318 C Mid
    196 Elise Stefanik 71,405 2,303 B Low
    197 Vinayak Damodar Savarkar 71,015 2,290 B High
    198 Fourteen Words 70,724 2,281 Start Low
    199 Mullah Omar 70,572 2,276 B High
    200 Anna Paulina Luna 70,436 2,272 B Low
    201 Ted Nugent 70,346 2,269 C Low
    202 House of Bourbon 70,203 2,264 B High
    203 Neoliberalism 70,141 2,262 B Top
    204 Angie Harmon 69,714 2,248 C Low
    205 Susie Wiles 69,538 2,243 C Low
    206 Roger Ailes 68,905 2,222 C Mid
    207 Ben Carson 68,788 2,218 C Low
    208 Whig Party (United States) 68,500 2,209 C Low
    209 Great Replacement conspiracy theory 68,351 2,204 C Top
    210 Tony Abbott 68,088 2,196 C Mid
    211 Tammy Bruce 67,663 2,182 Start Low
    212 Laura Ingraham 67,424 2,174 C Mid
    213 Craig T. Nelson 67,070 2,163 Start Unknown
    214 Conservatism 67,052 2,162 B Top
    215 Anthony Eden 66,536 2,146 B Mid
    216 Nicolas Sarkozy 66,493 2,144 B High
    217 Stephen Harper 66,445 2,143 GA High
    218 John Locke 66,258 2,137 B Top
    219 David Duke 64,817 2,090 B Mid
    220 Spiro Agnew 64,736 2,088 FA Mid
    221 Brooke Rollins 64,630 2,084 Start Low
    222 Thomas Massie 64,456 2,079 B Low
    223 Social Democratic Party (Portugal) 64,309 2,074 Start Low
    224 Libertarianism 63,918 2,061 B High
    225 Chester A. Arthur 63,152 2,037 FA Low
    226 Daily Mail 63,010 2,032 B Mid
    227 Likud 63,001 2,032 C Low
    228 Capitalism 62,511 2,016 C Top
    229 John Major 62,507 2,016 B High
    230 Theresa May 62,179 2,005 B Mid
    231 Pat Sajak 62,121 2,003 C Low
    232 Right-wing populism 61,732 1,991 B High
    233 Sebastian Gorka 61,658 1,988 C Unknown
    234 Andrew Hastie 61,647 1,988 C Low
    235 Rand Paul 61,411 1,981 GA Mid
    236 Lauren Boebert 61,397 1,980 B Low
    237 Benjamin Harrison 61,059 1,969 FA Low
    238 Nancy Reagan 60,586 1,954 B Mid
    239 False or misleading statements by Donald Trump 60,502 1,951 B Low
    240 AI slop 60,303 1,945 B Low
    241 Liberal National Party of Queensland 60,265 1,944 Start Low
    242 Shigeru Ishiba 60,105 1,938 B Low
    243 Tommy Tuberville 60,100 1,938 B Low
    244 Mitch McConnell 59,942 1,933 B Mid
    245 People Power Party (South Korea) 59,693 1,925 C High
    246 Barbara Stanwyck 59,616 1,923 B Low
    247 L. K. Advani 59,406 1,916 B High
    248 Ustaše 59,214 1,910 C High
    249 The Heritage Foundation 59,058 1,905 B High
    250 David Mamet 58,942 1,901 C Low
    251 Joni Ernst 58,836 1,897 B Low
    252 Pakistan Muslim League (N) 58,760 1,895 B Low
    253 Confederation Liberty and Independence 58,544 1,888 B Low
    254 UK Independence Party 58,316 1,881 B Low
    255 Chuck Grassley 58,249 1,879 C Mid
    256 Winsome Earle-Sears 57,793 1,864 C Low
    257 Brett Cooper (commentator) 57,395 1,851 Start Low
    258 David Perdue 56,609 1,826 B Low
    259 Gamergate (harassment campaign) 56,605 1,825 C Mid
    260 Nacionalista Party 56,399 1,819 Start Low
    261 Scott Baio 56,198 1,812 Start Low
    262 Bill O'Reilly (political commentator) 56,113 1,810 B Mid
    263 White supremacy 56,016 1,806 B Low
    264 Make America Great Again 55,818 1,800 B High
    265 Melissa Joan Hart 55,651 1,795 B Low
    266 Greg Abbott 55,584 1,793 B Mid
    267 Kevin McCarthy 55,385 1,786 B Low
    268 Riley Gaines 55,181 1,780 B Mid
    269 Itamar Ben-Gvir 55,146 1,778 C Mid
    270 The Times of India 55,033 1,775 C Mid
    271 Rutherford B. Hayes 54,901 1,771 FA Low
    272 Neoconservatism 54,583 1,760 C Top
    273 Patrick Bet-David 54,449 1,756 C Low
    274 Tom Clancy 54,430 1,755 C Low
    275 Nikki Haley 54,265 1,750 B Low
    276 Hillbilly Elegy 53,963 1,740 B Low
    277 Jair Bolsonaro 53,805 1,735 B Mid
    278 Kellyanne Conway 53,687 1,731 B Low
    279 Newt Gingrich 53,455 1,724 B High
    280 W. B. Yeats 53,442 1,723 FA Low
    281 Brothers of Italy 52,949 1,708 B Mid
    282 Barry Goldwater 52,758 1,701 B High
    283 Tim Montgomerie 52,587 1,696 C Mid
    284 Left–right political spectrum 52,274 1,686 C Top
    285 Mahathir Mohamad 52,263 1,685 GA High
    286 Proud Boys 52,183 1,683 C Low
    287 Thom Tillis 52,094 1,680 B Low
    288 Harold Macmillan 52,037 1,678 B High
    289 Lindsey Graham 51,959 1,676 C Low
    290 Liberty University 51,954 1,675 B Mid
    291 Edward Teller 51,855 1,672 FA Low
    292 Oliver North 51,697 1,667 C Mid
    293 Leonard Leo 51,648 1,666 C Mid
    294 Gretchen Carlson 51,613 1,664 B Low
    295 Dave Ramsey 51,499 1,661 C Unknown
    296 Critical race theory 51,481 1,660 C Low
    297 Brett Kavanaugh 50,953 1,643 B High
    298 Ron Paul 50,847 1,640 C Mid
    299 Benjamin Disraeli 50,753 1,637 FA Top
    300 Terri Schiavo case 50,703 1,635 GA Low
    301 Blake Moore 50,592 1,632 Start Low
    302 Ray Bradbury 50,274 1,621 B Low
    303 Strom Thurmond 50,007 1,613 B Mid
    304 Ron Johnson 49,929 1,610 C Low
    305 National Rally 49,813 1,606 Unknown High
    306 Rush Limbaugh 49,771 1,605 B High
    307 Samuel Alito 49,498 1,596 C Mid
    308 Harmeet Dhillon 49,495 1,596 Start Low
    309 Denis Leary 48,977 1,579 C NA
    310 John Ratcliffe 48,818 1,574 C Low
    311 Dan Quayle 48,703 1,571 B Mid
    312 Victor Davis Hanson 48,647 1,569 B Mid
    313 James Cagney 48,498 1,564 B Low
    314 Milton Friedman 48,372 1,560 GA High
    315 Lee Zeldin 48,362 1,560 B Low
    316 History of tariffs in the United States 47,780 1,541 B Mid
    317 Last Man Standing (American TV series) 47,511 1,532 B Low
    318 Profumo affair 47,420 1,529 FA Mid
    319 Christopher Luxon 47,350 1,527 B Unknown
    320 Marc Andreessen 47,213 1,523 C Mid
    321 Pat Buchanan 47,136 1,520 B Mid
    322 Robert Menzies 46,964 1,514 C Mid
    323 Park Chung Hee 46,861 1,511 C Low
    324 Tomi Lahren 46,719 1,507 Start Low
    325 Bezalel Smotrich 46,349 1,495 C Mid
    326 Laissez-faire 45,747 1,475 C Top
    327 United Russia 45,691 1,473 B High
    328 Aleksandr Solzhenitsyn 45,488 1,467 B Mid
    329 William F. Buckley Jr. 45,454 1,466 B Top
    330 The Daily Telegraph 45,404 1,464 C Low
    331 Laura Bush 45,206 1,458 GA Low
    332 Kalergi Plan 45,138 1,456 Start Mid
    333 The Wall Street Journal 44,997 1,451 B Mid
    334 Anthony Scaramucci 44,729 1,442 C Low
    335 Hindutva 44,695 1,441 B Top
    336 Martin Heidegger 44,681 1,441 C Low
    337 Paul Ryan 44,617 1,439 C Mid
    338 Pat Boone 44,481 1,434 C Low
    339 Mary Matalin 44,376 1,431 C Low
    340 Howard Buffett 44,145 1,424 Start Low
    341 Stacey Dash 43,909 1,416 C Low
    342 Marine Le Pen 43,713 1,410 B Low
    343 Richard Grenell 43,540 1,404 C Low
    344 John Thune 43,536 1,404 C Low
    345 António de Oliveira Salazar 43,492 1,402 B Mid
    346 Federalist Party 42,977 1,386 B Low
    347 Jacobitism 42,887 1,383 B High
    348 Edward Wood, 1st Earl of Halifax 42,788 1,380 C Low
    349 Mike Huckabee 42,702 1,377 B Mid
    350 Liberal Democratic Party (Japan) 42,594 1,374 C High
    351 Classical liberalism 42,392 1,367 B Top
    352 Rick Scott 42,372 1,366 C Low
    353 Antonin Scalia 42,322 1,365 FA High
    354 Ayaan Hirsi Ali 42,318 1,365 B Low
    355 Grey Wolves (organization) 42,208 1,361 B Mid
    356 Roger Stone 42,198 1,361 C Low
    357 Bob Dole 41,948 1,353 B Low
    358 Harold Holt 41,942 1,352 B Mid
    359 CDU/CSU 41,799 1,348 C Low
    360 Sarah Huckabee Sanders 41,593 1,341 C Low
    361 Éamon de Valera 40,969 1,321 B High
    362 D. H. Lawrence 40,918 1,319 B Unknown
    363 Menachem Begin 40,895 1,319 B Mid
    364 John Birch Society 40,859 1,318 C Low
    365 Milo Yiannopoulos 40,447 1,304 C Low
    366 Elon Musk salute controversy 40,411 1,303 B Low
    367 Walter Brennan 40,403 1,303 C Low
    368 Ashley Moody 40,165 1,295 C Unknown
    369 First presidency of Donald Trump 39,902 1,287 B Low
    370 Brian Mulroney 39,808 1,284 B High
    371 Mark Levin 39,640 1,278 B High
    372 Alpha and beta male 39,369 1,269 C Low
    373 Tea Party movement 39,232 1,265 C Mid
    374 Traditionalist Catholicism 39,147 1,262 C Top
    375 Kelly Loeffler 39,061 1,260 B Low
    376 Ann Coulter 38,886 1,254 B Mid
    377 Neil Gorsuch 38,886 1,254 B Mid
    378 Liaquat Ali Khan 38,885 1,254 B Low
    379 Islamophobia 38,815 1,252 C Mid
    380 Islamism 38,726 1,249 B High
    381 Jeb Bush 38,626 1,246 B Low
    382 Deportation in the second presidency of Donald Trump 38,402 1,238 B Low
    383 John Layfield 38,316 1,236 B Low
    384 Booker T. Washington 38,009 1,226 B Low
    385 Original sin 37,871 1,221 C Low
    386 The Epoch Times 37,656 1,214 B Low
    387 Franz von Papen 37,555 1,211 B Low
    388 Jackson Hinkle 37,430 1,207 B Low
    389 Ginger Rogers 37,397 1,206 C Unknown
    390 Edward Heath 37,342 1,204 B High
    391 Bourbon Restoration in France 37,296 1,203 C High
    392 Turning Point USA 37,085 1,196 C Low
    393 Corey Lewandowski 36,972 1,192 C Low
    394 Patrick Spencer 36,924 1,191 Stub Unknown
    395 Doug Ford 36,891 1,190 B Low
    396 Chris Christie 36,835 1,188 C Low
    397 Alt-right 36,787 1,186 C Mid
    398 Conservatism in the United States 36,211 1,168 B Top
    399 Buddy Carter 36,211 1,168 Start Low
    400 Dinesh D'Souza 36,132 1,165 B Mid
    401 Fianna Fáil 36,055 1,163 B Low
    402 John C. Calhoun 36,026 1,162 FA Top
    403 Thomas Mann 35,573 1,147 C Mid
    404 The Daily Wire 35,567 1,147 C Low
    405 Liz Cheney 34,881 1,125 B High
    406 Flannery O'Connor 34,803 1,122 A Low
    407 John Bolton 34,759 1,121 C Mid
    408 Political appointments of the second Trump administration 34,740 1,120 List Low
    409 Edmund Burke 34,696 1,119 B Top
    410 New York Post 34,570 1,115 C Low
    411 Robert Jenrick 34,557 1,114 C Unknown
    412 Groypers 34,534 1,114 B Low
    413 Party of Young People 34,431 1,110 C Low
    414 Donald Trump and fascism 34,309 1,106 B Mid
    415 Primogeniture 34,288 1,106 Start Low
    416 Carl Schmitt 33,897 1,093 C Top
    417 Don King 33,853 1,092 B Low
    418 Breitbart News 33,597 1,083 C Mid
    419 Justice and Development Party (Turkey) 33,379 1,076 B Low
    420 Anti-communism 33,305 1,074 B Mid
    421 Reform Party of the United States of America 33,217 1,071 C Low
    422 Patriots for Europe 33,075 1,066 C Low
    423 Byron Donalds 33,009 1,064 C Low
    424 Pan-Islamism 33,001 1,064 C High
    425 Friedrich Hayek 32,935 1,062 B Top
    426 Julius Evola 32,926 1,062 B Low
    427 Progressivism 32,789 1,057 C Mid
    428 Jacob Rees-Mogg 32,626 1,052 C Low
    429 Free Democratic Party (Germany) 32,576 1,050 C Mid
    430 Social norm 32,534 1,049 C Top
    431 Deus vult 32,524 1,049 Start Low
    432 Michael Steele 32,460 1,047 B Low
    433 GiveSendGo 32,385 1,044 C Low
    434 David Frum 32,270 1,040 C Low
    435 Race and crime in the United States 32,260 1,040 C Mid
    436 Fred Thompson 32,200 1,038 B Low
    437 Adam Kinzinger 32,169 1,037 C Low
    438 Meir Kahane 32,116 1,036 B High
    439 Mark Rutte 32,035 1,033 C High
    440 Patriarchy 31,967 1,031 B Low
    441 Blue Dog Coalition 31,963 1,031 C Low
    442 John Rocker 31,959 1,030 C Unknown
    443 The Fountainhead 31,930 1,030 FA Low
    444 Geert Wilders 31,905 1,029 B Low
    445 British National Party 31,598 1,019 B Mid
    446 Larry Kudlow 31,598 1,019 B Low
    447 Redneck 31,591 1,019 C Low
    448 Loretta Young 31,554 1,017 C Low
    449 The Second Coming (poem) 31,496 1,016 Start Low
    450 Lee Jun-seok 31,324 1,010 C Low
    451 Joe Scarborough 31,042 1,001 B Low
    452 1924 United States presidential election 31,021 1,000 C Low
    453 Honoré de Balzac 30,787 993 FA High
    454 Chip Roy 30,732 991 B Low
    455 Christian nationalism 30,721 991 Start High
    456 Dennis Miller 30,706 990 Start Low
    457 Nick Land 30,706 990 C Low
    458 Ben Stein 30,700 990 C Low
    459 Otzma Yehudit 30,604 987 B Mid
    460 National Party (South Africa) 30,588 986 C Unknown
    461 Ross Douthat 30,528 984 Start Low
    462 Leader of the Conservative Party (UK) 30,373 979 List Low
    463 Dimes Square 30,364 979 Stub Low
    464 Jane Russell 30,332 978 B Low
    465 Richard B. Spencer 30,285 976 C Low
    466 Ward Bond 30,261 976 C Low
    467 Fidesz 30,252 975 C Unknown
    468 Rick Perry 30,230 975 B Mid
    469 Charles Koch 30,175 973 B Low
    470 Mike Lindell 30,101 971 C Low
    471 Dan Crenshaw 29,971 966 B Low
    472 Jerry Falwell 29,957 966 B High
    473 Moshe Dayan 29,820 961 B Mid
    474 Political spectrum 29,797 961 C Top
    475 Ulf Kristersson 29,795 961 B Low
    476 Naftali Bennett 29,677 957 B Mid
    477 Franklin Graham 29,645 956 B Low
    478 Helmut Kohl 29,632 955 B High
    479 Moshe Feiglin 29,487 951 C Low
    480 Tories (British political party) 29,482 951 C High
    481 Katie Britt 29,403 948 C Low
    482 David Koch 29,383 947 C Mid
    483 Islam in the United Kingdom 29,314 945 B Low
    484 John A. Macdonald 29,295 945 FA High
    485 Gavin McInnes 29,286 944 C Low
    486 Doug Collins (politician) 29,246 943 Start Low
    487 Thrash metal 29,237 943 B Low
    488 List of federal judges appointed by Donald Trump 29,195 941 List Low
    489 Reagan (2024 film) 29,039 936 C Low
    490 Anarcho-capitalism 29,032 936 B Low
    491 Gretchen Whitmer kidnapping plot 28,727 926 C Low
    492 William Rehnquist 28,710 926 B High
    493 12 Rules for Life 28,690 925 B Mid
    494 Jordan Bardella 28,677 925 C High
    495 Michael Knowles (political commentator) 28,578 921 Start Low
    496 Freedom Caucus 28,555 921 C Low
    497 Rumble (company) 28,528 920 Start Low
    498 Aleksandr Dugin 28,498 919 C Mid
    499 Phyllis Schlafly 28,442 917 B High
    500 Tom Cotton 28,256 911 C Low


    For a version of this watchlist about 3X as long, see: Wikipedia:WikiProject Conservatism/Recent changes
    Alternative watchlist prototypes (Excerpts)
    See also: Low-importance recent changes
    See also: Mid-importance recent changes
    See also: High-importance recent changes
    See also: Top-importance recent changes
    See also: Preconfigured recent vandalism shortlist

    Publications watchlist prototype beneath this line:


    For a watchlist potentially up to 30X in length, see Wikipedia:WikiProject Conservatism/Publications recent changes

    Watchlist of journalists, bloggers, commentators etc., beneath this line:


    For a watchlist potentially up to 30X in length, see Wikipedia:WikiProject Conservatism/Journalism recent changes

    Organizations watchlist beneath this line:


    For a watchlist potentially up to 30X in length, see Wikipedia:WikiProject Conservatism/Organizations recent changes

    Prototype political parties watchlist beneath this line:


    For a watchlist potentially up to 30X in length, see Wikipedia:WikiProject Conservatism/Political parties recent changes

    Prototype politicians watchlist beneath this line:


    For a watchlist potentially up to 30X in length, see Wikipedia:WikiProject Conservatism/Politicians recent changes

    Prototype MISC (drafts, templates etc.) watchlist beneath this line:


    For a watchlist potentially up to 30X in length, see Wikipedia:WikiProject Conservatism/MISC recent changes

    New articles

    A list of semi-related articles that were recently created

    This list was generated from these rules. Questions and feedback are always welcome! The search is run daily and covers roughly the most recent 14 days of results. Note: Some articles may not be relevant to this project.

    Rules | Match log | Results page (for watching) | Last updated: 2025-06-19 20:28 (UTC)

    Note: The list display can now be customized by each user. See List display personalization for details.















    In The Signpost

    One of various articles to this effect
    The Right Stuff
    July 2018
    DISCUSSION REPORT
    WikiProject Conservatism Comes Under Fire

    By Lionelt

    WikiProject Conservatism was a topic of discussion at the Administrators' Noticeboard/Incidents (AN/I). Objective3000 started a thread where he expressed concern regarding the number of RFC notices posted on the Discussion page, suggesting that such notices "could result in swaying consensus by selective notification." Several editors participated in the relatively brief six-hour discussion. The assertion that the project is a "club for conservatives" was countered by editors listing examples of users who "profess no political persuasion." It was also noted that notification of WikiProjects regarding ongoing discussions is explicitly permitted by the WP:Canvassing guideline.

    At one point the discussion segued to feedback about The Right Stuff. Member SPECIFICO wrote: "One thing I enjoy about the Conservatism Project is the handy newsletter that members receive on our talk pages." Atsme praised the newsletter as "first-class entertainment...BIGLY...first-class...nothing even comes close...it's amazing." Some good-natured sarcasm was offered with Objective3000 observing, "Well, they got the color right" and MrX's followup, "Wow. Yellow is the new red."

    Admin Oshwah closed the thread with the result "definitely not an issue for ANI" and directed editors to the project Discussion page for any further discussion. Editor's note: the design and color of The Right Stuff were originally chosen to mimic an old paper newspaper.

    Add the Project Discussion page to your watchlist for the "latest RFCs" at WikiProject Conservatism Watch (Discuss this story)

    ARTICLES REPORT
    Margaret Thatcher Makes History Again

    By Lionelt

    Margaret Thatcher is the first article promoted at the new WikiProject Conservatism A-Class review. Congratulations to Neveselbert. A-Class is a quality rating ranked higher than GA (Good article), but its criteria are not as rigorous as those for FA (Featured article). WikiProject Conservatism is one of only two WikiProjects offering A-Class review, the other being WikiProject Military History. Nominate your article here. (Discuss this story)
    RECENT RESEARCH
    Research About AN/I

    By Lionelt

    Reprinted in part from the April 26, 2018 issue of The Signpost; written by Zarasophos

    Out of over one hundred editors questioned, only twenty-seven (27%) are happy with the way reports of conflicts between editors are handled on the Administrators' Incident Noticeboard (AN/I), according to a recent survey. The survey also found that dissatisfaction has various causes, including "defensive cliques" and biased administrators, as well as fear of a "boomerang effect" due to the lack of a rule on the scope of AN/I reports. The survey also included an analysis of available quantitative data about AN/I. Some notable takeaways:

    • 53% avoided making a report due to fearing it would not be handled appropriately
    • "Otherwise 'popular' users often avoid heavy sanctions for issues that would get new editors banned."
    • "Discussions need to be clerked to keep them from raising more problems than they solve."

    In the wake of Zarasophos' article, editors discussed the AN/I survey at The Signpost and also at AN/I. Ironically, a portion of the AN/I thread was hatted due to "off-topic sniping." To follow up on the problems identified by the research project, the Wikimedia Foundation Anti-Harassment Tools team and Support and Safety team initiated a discussion. You can express your thoughts and ideas here.

    (Discuss this story)




    Is Wikipedia Politically Biased? Perhaps


    A monthly overview of recent academic research about Wikipedia and other Wikimedia projects, also published as the Wikimedia Research Newsletter.


    Report by conservative think-tank presents ample quantitative evidence for "mild to moderate" "left-leaning bias" on Wikipedia

    A paper titled "Is Wikipedia Politically Biased?"[1] answers that question with a qualified yes:

    [...] this report measures the sentiment and emotion with which political terms are used in [English] Wikipedia articles, finding that Wikipedia entries are more likely to attach negative sentiment to terms associated with a right-leaning political orientation than to left-leaning terms. Moreover, terms that suggest a right-wing political stance are more frequently connected with emotions of anger and disgust than those that suggest a left-wing stance. Conversely, terms associated with left-leaning ideology are more frequently linked with the emotion of joy than are right-leaning terms.
    Our findings suggest that Wikipedia is not entirely living up to its neutral point of view policy, which aims to ensure that content is presented in an unbiased and balanced manner.

    The author (David Rozado, an associate professor at Otago Polytechnic) has published ample peer-reviewed research on related matters before, some of which was featured in, e.g., The Guardian and The New York Times. In contrast, the present report is not peer-reviewed and was not posted in an academic venue, unlike most of the research we usually cover here. Rather, it was published (and possibly commissioned) by the Manhattan Institute, a conservative US think tank, which presumably found its results not too objectionable. (Also, some – broken – URLs in the PDF suggest that Manhattan Institute staff members were involved in the writing of the paper.) Still, the report indicates an effort to adhere to various standards of academic research publications, including some fairly detailed descriptions of the methods and data used. It is worth taking it more seriously than, for example, another recent report that alleged a different form of political bias on Wikipedia, which had likewise been commissioned by an advocacy organization and authored by an academic researcher, but was met with severe criticism by the Wikimedia Foundation (which called it out for "unsubstantiated claims of bias") and volunteer editors (see prior Signpost coverage).

    That isn't to say that there can't be some questions about the validity of Rozado's results, and in particular about how to interpret them. But let's first go through the paper's methods and data sources in more detail.

    Determining the sentiment and emotion in Wikipedia's coverage

    The report's main results regarding Wikipedia are obtained as follows:

    "We first gather a set of target terms (N=1,628) with political connotations (e.g., names of recent U.S. presidents, U.S. congressmembers, U.S. Supreme Court justices, or prime ministers of Western countries) from external sources. We then identify all mentions in English-language Wikipedia articles of those terms.

    We then extract the paragraphs in which those terms occur to provide the context in which the target terms are used and feed a random sample of those text snippets to an LLM (OpenAI’s gpt-3.5-turbo), which annotates the sentiment/emotion with which the target term is used in the snippet. To our knowledge, this is the first analysis of political bias in Wikipedia content using modern LLMs for annotation of sentiment/emotion."

    The sentiment classification rates each mention of a term as negative, neutral, or positive. (For the purpose of forming averages, this is converted into a quantitative scale from -1 to +1.) See the end of this review for some concrete examples from the paper's published dataset.
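
    To make this concrete, below is a minimal Python sketch of the annotation-and-averaging step just described. It is not the paper's actual code: the prompt wording, the helper names, and the handling of unexpected model output are assumptions for illustration; only the model name (gpt-3.5-turbo) and the -1..+1 scale come from the report.

    from collections import defaultdict
    from statistics import mean

    from openai import OpenAI  # needs an OPENAI_API_KEY in the environment

    client = OpenAI()
    LABEL_TO_SCORE = {"negative": -1, "neutral": 0, "positive": 1}

    def annotate_sentiment(term: str, snippet: str) -> str:
        """Ask the LLM how `term` is used in `snippet`; returns negative/neutral/positive."""
        response = client.chat.completions.create(
            model="gpt-3.5-turbo",  # the model named in the report
            temperature=0,
            messages=[{
                "role": "user",
                "content": (
                    f"Classify the sentiment with which the term '{term}' is used in the "
                    f"following paragraph. Answer with one word: negative, neutral, or positive.\n\n"
                    f"{snippet}"
                ),
            }],
        )
        return response.choices[0].message.content.strip().lower()

    def average_sentiment(mentions: dict[str, list[str]]) -> dict[str, float]:
        """Average per-term sentiment on the report's -1..+1 scale."""
        scores = defaultdict(list)
        for term, snippets in mentions.items():
            for snippet in snippets:
                label = annotate_sentiment(term, snippet)
                scores[term].append(LABEL_TO_SCORE.get(label, 0))  # treat anything else as neutral
        return {term: mean(vals) for term, vals in scores.items()}

    Run over the full sample of snippets, the resulting per-term averages are what feed the group comparisons (e.g. left- vs. right-leaning terms) discussed below.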

    The emotion classification uses "Ekman’s six basic emotions (anger, disgust, fear, joy, sadness, and surprise) plus neutral."

    The annotation method used appears to be an effort to avoid the shortcomings of popular existing sentiment analysis techniques, which often only rate the overall emotional stance of a given text without determining whether it actually applies to a specific entity mentioned in it (or in some cases even fail to handle negations, e.g. by classifying "I am not happy" as a positive emotion). Rozado justifies the "decision to use automated annotation" (which presumably rendered considerable cost savings, also by resorting to OpenAI's older GPT 3.5 model rather than the more powerful but more expensive GPT-4 API released in March 2023) by citing "recent evidence showing how top-of-the-rank LLMs outperform crowd workers for text-annotation tasks such as stance detection." This is indeed becoming a more widely used choice for text classification. But Rozado appears to have skipped the usual step of evaluating the accuracy of this automated method (and possibly improving the prompts it used) against a gold-standard sample from (human) expert raters.
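
    For what such a validation step could look like, here is a small sketch comparing the LLM's labels against a human-annotated gold standard and reporting agreement (accuracy and Cohen's kappa). Again, this is an assumption for illustration, not something the report did; the labels below are invented.

    from sklearn.metrics import accuracy_score, cohen_kappa_score

    # Hypothetical gold-standard labels from human expert raters vs. the LLM's labels
    human_labels = ["negative", "neutral", "positive", "neutral", "negative", "positive"]
    llm_labels   = ["negative", "neutral", "neutral",  "neutral", "negative", "positive"]

    print("accuracy:", accuracy_score(human_labels, llm_labels))
    print("Cohen's kappa:", cohen_kappa_score(human_labels, llm_labels))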

    Selecting topics to examine for bias

    As for the selection of terms whose Wikipedia coverage to annotate with this classifier, Rozado does a lot of due diligence to avoid cherry-picking: "To reduce the degrees of freedom of our analysis, we mostly use external sources of terms [including Wikipedia itself, e.g. its list of members of the 11th US Congress] to conceptualize a political category into left- and right-leaning terms, as well as to choose the set of terms to include in each category." This addresses an important source of researcher bias.

    Overall, the study arrives at 12 different groups of such terms:

    • 8 of these refer to people (e.g. US presidents, US senators, UK members of parliament, US journalists).
    • Two are about organizations (US think tanks and media organizations).
    • The other two groups contain "Terms that describe political orientation", i.e. expressions that carry a left-leaning or right-leaning meaning themselves:
      • 18 "political leanings" (where "Rightists" receives the lowest average sentiment and "Left winger" the highest), and
      • 21 "extreme political ideologies" (where "Ultraconservative" scores lowest and "radical-left" has the highest – but still slightly negative – average sentiment)

    What is "left-leaning" and "right-leaning"?

    As discussed, Rozado's methods for generating these lists of people and organizations seem reasonably transparent and objective. It gets a bit murkier when it comes to splitting them into "left-leaning" and "right-leaning", where the chosen methods remain unclear and/or questionable in some cases. Of course there is a natural choice available for US Congress members, where the confines of the US two-party system mean that the left-right spectrum can be easily mapped to Democrats vs. Republicans (disregarding a small number of independents or libertarians).

    In other cases, Rozado was able to use external data about political leanings, e.g. "a list of politically aligned U.S.-based journalists" from Politico. There may be questions about construct validity here (e.g. it classifies Glenn Greenwald or Andrew Sullivan as "journalists with the left"), but at least this data is transparent and determined by a source not invested in the present paper's findings.

    But for example the list of UK MPs used contains politicians from 14 different parties (plus independents). Even if one were to confine the left vs. right labels to the two largest groups in the UK House of Commons (Tories vs. Labour and Co-operative Party, which appears to have been the author's choice judging from Figure 5), the presence of a substantial number of parliamentarians from other parties to the left or right of those would make the validity of this binary score more questionable than in the US case. Rozado appears to acknowledge a related potential issue in a side remark when trying to offer an explanation for one of the paper's negative results (no bias) in this case: "The disparity of sentiment associations in Wikipedia articles between U.S. Congressmembers and U.K. MPs based on their political affiliation may be due in part to the higher level of polarization in the U.S. compared to the U.K."

    Most negative sentiment among Western leaders: former Australian PM Tony Abbott
    Most positive sentiment among Western leaders: former Australian PM Scott Morrison

    This kind of question becomes even more complicated for the "Leaders of Western Countries" list (where Tony Abbott scored the most negative average sentiment, and José Luis Rodríguez Zapatero and Scott Morrison appear to be in a tie for the most positive average sentiment). Most of these countries do not have a two-party system either. Sure, their leaders usually (as in the UK case) hail from one of the two largest parties, one of which is more to the left and the other more to the right. But it certainly seems to matter for the purpose of Rozado's research question whether that major party is more moderate (center-left or center-right, with other parties between it and the far left or far right) or more radical (i.e. extending all the way to the far-left or far-right end of the spectrum of elected politicians).

    What's more, the analysis for this last group compares political orientations across multiple countries. Which brings us to a problem that Wikipedia's Jimmy Wales had already pointed to back in 2006, in response to a conservative US blogger who had argued that there was "a liberal bias in many hot-button topic entries" on English Wikipedia:

    "The Wikipedia community is very diverse, from liberal to conservative to libertarian and beyond. If averages mattered, and due to the nature of the wiki software (no voting) they almost certainly don't, I would say that the Wikipedia community is slightly more liberal than the U.S. population on average, because we are global and the international community of English speakers is slightly more liberal than the U.S. population. ... The idea that neutrality can only be achieved if we have some exact demographic matchup to [the] United States of America is preposterous."

    We already discussed this issue in our earlier reviews of a notable series of papers by Greenstein and Zhu (see e.g.: "Language analysis finds Wikipedia's political bias moving from left to right", 2012), which had relied on a US-centric method of defining left-leaning and right-leaning (namely, a corpus derived from the US Congressional Record). Those studies form a large part of what Rozado cites as "[a] substantial body of literature [that]—albeit with some exceptions—has highlighted a perceived bias in Wikipedia content in favor of left-leaning perspectives." (The cited exception is a paper[2] that had found "a small to medium size coverage bias against [members of parliament] from the center-left parties in Germany and in France", and identified patterns of "partisan contributions" as a plausible cause.)

    Similarly, 8 out of the 10 groups of people and organizations analyzed in Rozado's study are from the US (the two exceptions being the aforementioned lists of UK MPs and leaders of Western countries).

    In other words, one potential reason for the disparities found by Rozado might simply be that he is measuring an international encyclopedia with a (largely) national yardstick of fairness. This possibility shouldn't lead us to dismiss his findings out of hand. But it is a bit disappointing that it is nowhere addressed in the paper, even though Rozado diligently discusses some other potential limitations of the results. For example, he notes that "some research has suggested that conservatives themselves are more prone to negative emotions and more sensitive to threats than liberals", but points out that the general validity of those research results remains doubtful.

    Another limitation is that a simple binary left vs. right classification might be hiding factors that can shed further light on bias findings. Even in the US with its two-party system, political scientists and analysts have long moved to less simplistic measures of political orientations. A widely used one is the NOMINATE method which assigns members of the US Congress continuous scores based on their detailed voting record, one of which corresponds to the left-right spectrum as traditionally understood. One finding based on that measure that seems relevant in context of the present study is the (widely discussed but itself controversial) asymmetric polarization thesis, which argues that "Polarization among U.S. legislators is asymmetric, as it has primarily been driven by a substantial rightward shift among congressional Republicans since the 1970s, alongside a much smaller leftward shift among congressional Democrats" (as summarized in the linked Wikipedia article). If, for example, higher polarization was associated with negative sentiments, this could be a potential explanation for Rozado's results. Again, this has to remain speculative, but it seems another notable omission in the paper's discussion of limitations.

    What does "bias" mean here?

    A fundamental problem of this study, which, to be fair, it shares with much fairness and bias research (in particular on Wikipedia's gender gap, where many studies similarly focus on binary comparisons that are likely to appeal to an intuitive sense of fairness), consists of justifying its answers to the following two basic questions:

    1. What would be a perfectly fair baseline, a result that makes us confident to call Wikipedia unbiased?
    2. If there are deviations from that baseline (often labeled disparities, gaps or biases), what are the reasons for that – can we confidently assume they were caused by Wikipedia itself (e.g. demographic imbalances in Wikipedia's editorship), or are they more plausibly attributed to external factors?

    Regarding 1 (defining a baseline of unbiasedness), Rozado simply assumes that this should imply statistically indistinguishable levels of average sentiment between left and right-leaning terms. However, as cautioned by one leading scholar on quantitative measures of bias, "the 'one true fairness definition' is a wild goose chase" – there are often multiple different definitions available that can all be justified on ethical grounds, and are often contradictory. Above, we already alluded to two potentially diverging notions of political unbiasedness for Wikipedia (using an international instead of US metric for left vs right leaning, and taking into account polarization levels for politicians).
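
    To illustrate what that assumed baseline amounts to in practice, a test of whether average sentiment is statistically indistinguishable between the two groups could look like the following sketch; the per-term scores are invented, and the report's own statistical procedure may differ.

    from scipy import stats

    # Hypothetical per-term mean sentiments on the -1..+1 scale
    left_terms  = [0.05, -0.10, 0.20, 0.00, 0.12, 0.08]
    right_terms = [-0.15, -0.05, -0.20, 0.02, -0.08, -0.11]

    # Welch's t-test: does mean sentiment differ between the two groups?
    t_stat, p_value = stats.ttest_ind(left_terms, right_terms, equal_var=False)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # a small p-value would indicate a disparity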

    But yet another question, highly relevant for Wikipedians interested in addressing the potential problems reported in this paper, is how much its definition lines up with Wikipedia's own definition of neutrality. Rozado clearly thinks that it does:

    Wikipedia’s neutral point of view (NPOV) policy aims for articles in Wikipedia to be written in an impartial and unbiased tone. Our results suggest that Wikipedia’s NPOV policy is not achieving its stated goal of political-viewpoint neutrality in Wikipedia articles.

    WP:NPOV indeed calls for avoiding subjective language and expressing judgments and opinions in Wikipedia's own voice, and Rozado's findings about the presence of non-neutral sentiments and emotions in Wikipedia articles are of some concern in that regard. However, that is not the core definition of NPOV. Rather, it refers to "representing fairly, proportionately, and, as far as possible, without editorial bias, all the significant views that have been published by reliable sources on a topic." What if the coverage of the terms examined by Rozado (politicians, etc.) in those reliable sources, in their aggregate, were also biased in the sense of Rozado's definition? US progressives might be inclined to invoke the snarky dictum "reality has a liberal bias" by comedian Stephen Colbert. Of course, conservatives might object that Wikipedia's definition of reliable sources (having "a reputation for fact-checking and accuracy") is itself biased, or applied in a biased way by Wikipedians. For some of these conservatives (at least those that are not also conservative feminists) it may be instructive to compare examinations of Wikipedia's gender gaps, which frequently focus on specific groups of notable people like in Rozado's study. And like him, they often implicitly assume a baseline of unbiasedness that implies perfect symmetry in Wikipedia's coverage – i.e. the absence of gaps or disparities. Wikipedians often object that this is in tension with the aforementioned requirement to reflect coverage in reliable sources. For example, Wikipedia's list of Fields medalists (the "Nobel prize of Mathematics") is 97% male – not because of Wikipedia editors' biases against women, but because of a severe gender imbalance in the field of mathematics that is only changing slowly, i.e. factors outside Wikipedia's influence.

    All this brings us to question 2. above (causality). While Rozado uses carefully couched language in this regard ("suggests" etc, e.g. "These trends constitute suggestive evidence of political bias embedded in Wikipedia articles"), such qualifications are unsurprisingly absent in much of the media coverage of this study (see also this issue's In the media). For example, the conservative magazine The American Spectator titled its article about the paper "Now We've Got Proof that Wikipedia is Biased."

    Commendably, the paper is accompanied by a published dataset, consisting of the analyzed Wikipedia text snippets together with the mentioned term and the sentiment or emotion identified by the automated annotation. For illustration, below are the sentiment ratings for mentions of the Yankee Institute for Public Policy (the last term in the dataset, as a non-cherry-picked example), with the term bolded:

    Dataset excerpt: Wikipedia paragraphs with sentiment for "Yankee Institute for Public Policy"
    positive "Carol Platt Liebau is president of the Yankee Institute for Public Policy.Liebau named new president of Yankee Institute She is also an attorney, political analyst, and conservative commentator. Her book Prude: How the Sex-Obsessed Culture Damages Girls (and America, Too!) was published in 2007."
    neutral "Affiliates

    Regular members are described as ""full-service think tanks"" operating independently within their respective states.

    Alabama: Alabama Policy Institute
    Alaska: Alaska Policy Forum
    [...]
    Connecticut: Yankee Institute for Public Policy
    [...]
    Wisconsin: MacIver Institute for Public Policy, Badger Institute, Wisconsin Institute for Law and Liberty, Institute for Reforming Government
    Wyoming: Wyoming Liberty Group"
    positive "The Yankee Institute for Public Policy is a free market, limited government American think tank based in Hartford, Connecticut, that researches Connecticut public policy questions. Organized as a 501(c)(3), the group's stated mission is to ""develop and advocate for free market, limited government public policy solutions in Connecticut."" Yankee was founded in 1984 by Bernard Zimmern, a French entrepreneur who was living in Norwalk, Connecticut, and Professor Gerald Gunderson of Trinity College. The organization is a member of the State Policy Network."
    neutral "He is formerly Chairman of the Yankee Institute for Public Policy. On November 3, 2015, he was elected First Selectman in his hometown of Stonington, Connecticut, which he once represented in Congress. He defeated the incumbent, George Crouse. Simmons did not seek reelection in 2019."
    negative "In Connecticut the union is closely identified with liberal Democratic politicians such as Governor Dannel Malloy and has clashed frequently with fiscally conservative Republicans such as former Governor John G. Rowland as well as the Yankee Institute for Public Policy, a free-market think tank."
    positive "In 2021, after leaving elective office, she was named a Board Director of several organizations. One is the Center for Workforce Inclusion, a national nonprofit in Washington, DC, that works to provide meaningful employment opportunities for older individuals. Another is the William F. Buckley Program at Yale, which aims to promote intellectual diversity, expand political discourse on campus, and expose students to often-unvoiced views at Yale University. She also serves on the Board of the Helicon Foundation, which explores chamber music in its historical context by presenting and producing period performances, including an annual subscription series of four Symposiums in New York featuring both performance and discussion of chamber music. She is also a Board Director of the American Hospital of Paris Foundation, which provides funding support for the operations of the American Hospital of Paris and functions as the link between the Hospital and the United States, funding many collaborative and exchange programs with New York-Presbyterian Hospital. She is also a Fellow of the Yankee Institute for Public Policy, a research and citizen education organization that focuses on free markets and limited government, as well as issues of transparency and good governance."
    positive "He was later elected chairman of the New Hampshire Republican State Committee, a position he held from 2007 to 2008. When he was elected he was 34 years old, making him the youngest state party chairman in the history of the United States at the time. His term as chairman included the 2008 New Hampshire primary, the first primary in the 2008 United States presidential election. He later served as the executive director of the Yankee Institute for Public Policy for five years, beginning in 2009. He is the author of a book about the New Hampshire primary, entitled Granite Steps, and the founder of the immigration reform advocacy group Americans By Choice."

    Briefly


    Other recent publications

    Other recent publications that could not be covered in time for this issue include the items listed below. Contributions, whether reviewing or summarizing newly published research, are always welcome.

    How English Wikipedia mediates East Asian historical disputes with Habermasian communicative rationality

    From the abstract: [3]

    "We compare the portrayals of Balhae, an ancient kingdom with contested contexts between [South Korea and China]. By comparing Chinese, Korean, and English Wikipedia entries on Balhae, we identify differences in narrative construction and framing. Employing Habermas’s typology of human action, we scrutinize related talk pages on English Wikipedia to examine the strategic actions multinational contributors employ to shape historical representation. This exploration reveals the dual role of online platforms in both amplifying and mediating historical disputes. While Wikipedia’s policies promote rational discourse, our findings indicate that contributors often vacillate between strategic and communicative actions. Nonetheless, the resulting article approximates Habermasian ideals of communicative rationality."

    From the paper:

    "The English Wikipedia presents Balhae as a multi-ethnic kingdom, refraining from emphasizing the dominance of a single tribe. In comparison to the two aforementioned excerpts [from Chinese and Korean Wikipedia], the lead section of the English Wikipedia concentrates more on factual aspects of history, thus excluding descriptions that might entail divergent interpretations. In other words, this account of Balhae has thus far proven acceptable to a majority of Wikipedians from diverse backgrounds. [...] Compared to other language versions, the English Wikipedia forthrightly acknowledges the potential disputes regarding Balhae's origin, ethnic makeup, and territorial boundaries, paving the way for an open and transparent exploration of these contested historical subjects. The separate 'Balhae controversies' entry is dedicated to unpacking the contentious issues. In essence, the English article adopts a more encyclopedic tone, aligning closely with Wikipedia's mission of providing information without imposing a certain perspective."

    (See also excerpts)

    Facebook/Meta's "No Language Left Behind" translation model used on Wikipedia

    From the abstract of this publication by a large group of researchers (most of them affiliated with Meta AI):[4]

    "Focusing on improving the translation qualities of a relatively small group of high-resource languages comes at the expense of directing research attention to low-resource languages, exacerbating digital inequities in the long run. To break this pattern, here we introduce No Language Left Behind—a single massively multilingual model that leverages transfer learning across languages. [...] Compared with the previous state-of-the-art models, our model achieves an average of 44% improvement in translation quality as measured by BLEU. By demonstrating how to scale NMT [neural machine translation] to 200 languages and making all contributions in this effort freely available for non-commercial use, our work lays important groundwork for the development of a universal translation system."

    "Four months after the launch of NLLB-200 [in 2022], Wikimedia reported that our model was the third most used machine translation engine used by Wikipedia editors (accounting for 3.8% of all published translations) (https://web.archive.org/web/20221107181300/https://nbviewer.org/github/wikimedia-research/machine-translation-service-analysis-2022/blob/main/mt_service_comparison_Sept2022_update.ipynb). Compared with other machine translation services and across all languages, articles translated with NLLB-200 has the lowest percentage of deletion (0.13%) and highest percentage of translation modification kept under 10%."

    "Which Nigerian-Pidgin does Generative AI speak?" – only the BBC's, not Wikipedia's

    From the abstract:[5]

    "Naija is the Nigerian-Pidgin spoken by approx. 120M speakers in Nigeria [...]. Although it has mainly been a spoken language until recently, there are currently two written genres (BBC and Wikipedia) in Naija. Through statistical analyses and Machine Translation experiments, we prove that these two genres do not represent each other (i.e., there are linguistic differences in word order and vocabulary) and Generative AI operates only based on Naija written in the BBC genre. In other words, Naija written in Wikipedia genre is not represented in Generative AI."

    The paper's findings are consistent with an analysis by the Wikimedia Foundation's research department that compared the number of Wikipedia articles to the number of speakers for the top 20 most-spoken languages, where Naija stood out as one of the most underrepresented.
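
    The shape of such a comparison is easy to reproduce in outline. The following sketch is only an illustration of the idea, not the Foundation's analysis: it pulls live article counts from the public MediaWiki siteinfo API and divides them by speaker estimates, where the speaker figures are rough placeholders (apart from the roughly 120 million Naija speakers quoted in the abstract above).

        # Illustrative only: articles per million speakers for two Wikipedias.
        # Article counts come from the public siteinfo API; speaker figures are
        # rough placeholder estimates, not the Wikimedia Foundation's dataset.
        import requests

        SPEAKERS_IN_MILLIONS = {"en": 1500, "pcm": 120}  # "pcm" is the Naija Wikipedia

        for lang, speakers in SPEAKERS_IN_MILLIONS.items():
            response = requests.get(
                f"https://{lang}.wikipedia.org/w/api.php",
                params={"action": "query", "meta": "siteinfo",
                        "siprop": "statistics", "format": "json"},
                timeout=30,
            )
            articles = response.json()["query"]["statistics"]["articles"]
            print(f"{lang}: {articles / speakers:,.0f} articles per million speakers")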

    "[A] surprising tension between Wikipedia's principle of safeguarding against self-promotion and the scholarly norm of 'due credit'"

    From the abstract:[6]

    "Although Wikipedia offers guidelines for determining when a scientist qualifies for their own article, it currently lacks guidance regarding whether a scientist should be acknowledged in articles related to the innovation processes to which they have contributed. To explore how Wikipedia addresses this issue of scientific 'micro-notability', we introduce a digital method called Name Edit Analysis, enabling us to quantitatively and qualitatively trace mentions of scientists within Wikipedia's articles. We study two CRISPR-related Wikipedia articles and find dynamic negotiations of micro-notability as well as a surprising tension between Wikipedia's principle of safeguarding against self-promotion and the scholarly norm of 'due credit'. To reconcile this tension, we propose that Wikipedians and scientists collaborate to establish specific micro-notability guidelines that acknowledge scientific contributions while preventing excessive self-promotion."
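
    The authors' tooling is not reproduced here, but the general idea of tracing a name through an article's revision history can be sketched against the public MediaWiki API. In the sketch below, the article title ("CRISPR") and the surname ("Doudna") are placeholders chosen purely for illustration, and counting raw string occurrences is a simplification of the paper's Name Edit Analysis.

        # Rough sketch: count how often a name appears in recent stored revisions
        # of an English Wikipedia article, using the MediaWiki revisions API.
        import requests

        API = "https://en.wikipedia.org/w/api.php"

        def name_mentions(title, name, limit=20):
            """Yield (timestamp, occurrence count) for the most recent revisions."""
            params = {
                "action": "query", "prop": "revisions", "titles": title,
                "rvprop": "timestamp|content", "rvslots": "main",
                "rvlimit": limit, "format": "json", "formatversion": "2",
            }
            page = requests.get(API, params=params, timeout=30).json()["query"]["pages"][0]
            for rev in page.get("revisions", []):
                yield rev["timestamp"], rev["slots"]["main"]["content"].count(name)

        for timestamp, count in name_mentions("CRISPR", "Doudna"):
            print(timestamp, count)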

    See also coverage of a different paper that likewise analyzed Wikipedia's coverage of CRISPR: "Wikipedia as a tool for contemporary history of science: A case study on CRISPR"

    "How article category in Wikipedia determines the heterogeneity of its editors"

    From the abstract:[7]

    " [...] the quality of Wikipedia articles rises with the number of editors per article as well as a greater diversity among them. Here, we address a not yet documented potential threat to those preconditions: self-selection of Wikipedia editors to articles. Specifically, we expected articles with a clear-cut link to a specific country (e.g., about its highest mountain, "national" article category) to attract a larger proportion of editors of that nationality when compared to articles without any specific link to that country (e.g., "gravity", "universal" article category), whereas articles with a link to several countries (e.g., "United Nations", "international" article category) should fall in between. Across several language versions, hundreds of different articles, and hundreds of thousands of editors, we find the expected effect [...]"

    "What do they make us see:" The "cultural bias" of GLAMs is worse on Wikidata

    From the abstract:[8]

    "Large cultural heritage datasets from museum collections tend to be biased and demonstrate omissions that result from a series of decisions at various stages of the collection construction. The purpose of this study is to apply a set of ethical criteria to compare the level of bias of six online databases produced by two major art museums, identifying the most biased and the least biased databases. [...] For most variables the online system database is more balanced and ethical than the API dataset and Wikidata item collection of the two museums."

    References

    1. ^ Rozado, David (June 2024). "Is Wikipedia Politically Biased?". Manhattan Institute. Dataset: https://doi.org/10.5281/zenodo.10775984
    2. ^ Kerkhof, Anna; Münster, Johannes (2019-10-02). "Detecting coverage bias in user-generated content". Journal of Media Economics. 32 (3–4): 99–130. doi:10.1080/08997764.2021.1903168. ISSN 0899-7764.
    3. ^ Jee, Jonghyun; Kim, Byungjun; Jun, Bong Gwan (2024). "The role of English Wikipedia in mediating East Asian historical disputes: the case of Balhae". Asian Journal of Communication: 1–20. doi:10.1080/01292986.2024.2342822. ISSN 0129-2986. (Closed access; accessible to Wikipedia Library users.)
    4. ^ Costa-jussà, Marta R.; Cross, James; Çelebi, Onur; Elbayad, Maha; Heafield, Kenneth; Heffernan, Kevin; Kalbassi, Elahe; Lam, Janice; Licht, Daniel; Maillard, Jean; Sun, Anna; Wang, Skyler; Wenzek, Guillaume; Youngblood, Al; Akula, Bapi; Barrault, Loic; Gonzalez, Gabriel Mejia; Hansanti, Prangthip; Hoffman, John; Jarrett, Semarley; Sadagopan, Kaushik Ram; Rowe, Dirk; Spruit, Shannon; Tran, Chau; Andrews, Pierre; Ayan, Necip Fazil; Bhosale, Shruti; Edunov, Sergey; Fan, Angela; Gao, Cynthia; Goswami, Vedanuj; Guzmán, Francisco; Koehn, Philipp; Mourachko, Alexandre; Ropers, Christophe; Saleem, Safiyyah; Schwenk, Holger; Wang, Jeff; NLLB Team (June 2024). "Scaling neural machine translation to 200 languages". Nature. 630 (8018): 841–846. Bibcode:2024Natur.630..841N. doi:10.1038/s41586-024-07335-x. ISSN 1476-4687. PMC 11208141. PMID 38839963.
    5. ^ Adelani, David Ifeoluwa; Doğruöz, A. Seza; Shode, Iyanuoluwa; Aremu, Anuoluwapo (2024-04-30). "Which Nigerian-Pidgin does Generative AI speak?: Issues about Representativeness and Bias for Multilingual and Low Resource Languages". arXiv:2404.19442 [cs.CL].
    6. ^ Simons, Arno; Kircheis, Wolfgang; Schmidt, Marion; Potthast, Martin; Stein, Benno (2024-02-28). "Who are the "Heroes of CRISPR"? Public science communication on Wikipedia and the challenge of micro-notability". Public Understanding of Science. doi:10.1177/09636625241229923. ISSN 0963-6625. PMID 38419208. (See also the related blog post.)
    7. ^ Oeberst, Aileen; Ridderbecks, Till (2024-01-07). "How article category in Wikipedia determines the heterogeneity of its editors". Scientific Reports. 14 (1): 740. Bibcode:2024NatSR..14..740O. doi:10.1038/s41598-023-50448-y. ISSN 2045-2322. PMC 10772120. PMID 38185716.
    8. ^ Zhitomirsky-Geffet, Maayan; Kizhner, Inna; Minster, Sara (2022-01-01). "What do they make us see: a comparative study of cultural bias in online databases of two large museums". Journal of Documentation. 79 (2): 320–340. doi:10.1108/JD-02-2022-0047. ISSN 0022-0418. (Closed access; a freely accessible version is available.)


    ToDo List

    Miscellaneous tasks

    Categories to look through

    (See also this much larger list of relevant articles without a lead image)

    Translation ToDo

    A list of related articles that are particularly good and notable enough to be worth a solid translation effort

    Requested articles (in general)


    Merging ToDo

    A list of related articles that may have resulted from a WP:POVFORK or that, at least, look like the functional equivalents of one
    Note that the exact target of a potential merge need not be specified here, and that multiple options (e.g. generous use of Template:Excerpt) might accomplish the same result