Sunday, December 31, 2017

Emerging Markets Don't Control Their Own Destiny

Their fate is in the hands of big central banks. So much for the "decline of the West."

Daniel Moss

It still doesn't pay to buck the Fed.

For all the hype about the decline of the West, it still largely controls whether emerging markets thrive or suffer.

The broad global pick-up in growth this year propelled emerging markets toward the biggest gains in stocks and currencies in almost a decade. China's debt binge did add ballast to the global expansion, and the country's neighbors are vulnerable to any sudden blow-up there, as implausible as it is. (A gradual unwinding, guided by the state, seems more likely.)

But in a year cluttered with commentary about the retreat of the U.S. and the contentious divorce proceedings between Britain and the EU, it's worth remembering that in some arenas the Old still punches way above its weight relative to the New. The withdrawal of monetary accommodation in the U.S. and the euro zone enters a new phase in 2018, and how that plays out will matter far more than anything emerging markets do themselves. Even Japan may begin easing up on liquidity.

The Fed is further ahead than anyone. Unlike in past Fed interest-rate cycles, big emerging market countries like Indonesia and Brazil have been able to cut borrowing costs. That's partly because inflation has been relatively low, and also because the Fed's actions have been predictable; forward guidance has been a reliable script. Here's the risk: What if the Fed feels it needs to be more aggressive in 2018 than indicated, because unemployment gets so low that wages and inflation pick up?

Emerging markets can probably handle four Fed hikes next year, rather than the three indicated by the celebrated "dot plot." The tension is that four aren't priced in by investors. Some models even show fewer than two hikes baked in. A shift in pricing could spell volatility in emerging market assets, says Luis Oganes, head of global emerging market research at JPMorgan. Things might start to get hairy if the Fed has to do still more in 2019.

The European Central Bank is on track to end quantitative easing in September. It doesn't say that baldly, only that the current whittled-down program of 30 billion euros a month runs until September. But you can't be as optimistic as Mario Draghi was last week, and then say you need more QE beyond the current horizon. It's a gradual phasing out, but when it ends, a chapter in monetary history is over.

Japan, for one, seems closer to the end of QE than to the beginning, even if no immediate cutoff is likely.

The second big threat to emerging markets also originates in the developed world. And it just wouldn't be December 2017 without mentioning bitcoin or the flattening U.S. yield curve, right? The yield curve gets the prize. While its record as a predictor of recession isn't perfect, it might be telling us something is amiss. There is little indication from elsewhere that a recession is approaching. Something to watch, cautions Oganes, is whether the curve points to reduced lending appetite from banks, which would in turn erode credit supply and constrain growth.

Political risks are scattered through individual countries, principally elections, though not on a level that is likely to prove systemic. Elections in Mexico and Brazil top the list. On the former, left-wing populist Andres Manuel Lopez Obrador is ahead in most opinion polls, and some of his positions are hard for investors to stomach. That said, Lopez Obrador is trying to placate critics. He's made several visits to New York recently and this month said he would name Carlos Urzua as finance minister if he wins. Urzua is a former finance minister of Mexico City and well respected. Lopez Obrador may be less of a revolutionary than investors fear.

Then again, perhaps politics doesn't matter so much anyway. After all, 12 months ago one of the most-cited risks was that Donald Trump would start a trade war and that globalization was dead. Either of those would have gravely wounded emerging market assets.

Good thing the developed world central banks are really running things. China and India are rising, but can't yet bend the financial world to their will consistently. It still doesn't pay to fight the Fed.

The top 10 governors’ races of 2018

Rubén Weinsteiner

Democrats face another critical test at the state level in 2018, hoping to rebuild power in governors' mansions around the country.

Years of strife with Illinois Democrats have left Republican Gov. Bruce Rauner in a weak position going into his reelection campaign.

The battle over the House majority has taken center stage in Washington. But Democrats face another critical test at the state level in 2018, with the party hoping to rebuild its power in governors' mansions around the country after several years of decline.

Republicans hold 33 governorships, to just 16 for Democrats, heading into 2018 — but that could change rapidly next November. The political environment looks bad for the GOP, and the current governors are term-limited in a number of key states, giving Democratic candidates an opportunity to run for open seats in blue states like New Mexico and Maine. The party has also been energized by the opportunity to win seats at the table in the next round of redistricting, which was controlled by Republicans in most states the last time congressional and state legislative district lines were drawn, in 2011 and 2012.

Republicans still have some room to expand, though, eyeing both Connecticut and Alaska — the only state with an independent governor — as pickup opportunities in 2018. Here are POLITICO's 10 governorships most likely to change parties in the 2018 elections:
1. Illinois — Republican Gov. Bruce Rauner is running for reelection.

Years of strife with Illinois Democrats have left Rauner in a weak position. A Morning Consult poll released in late October found Rauner’s approval rating at 30 percent, while a 55 percent majority said they disapproved of him — and the Democratic Governors Association has had a laser-like focus on Rauner for years ahead of his blue-state reelection run. One of the governor's strengths is his wealth, which Rauner has poured into advertising in Illinois. But billionaire Democrat J.B. Pritzker has emerged as the front-runner to face Rauner and will be able to match the governor in spending if he wins the nomination. It could be one of the most expensive state races ever.

Rep. Michelle Lujan Grisham is the front-runner in a crowded, divided Democratic primary for the New Mexico governorship. | Chip Somodevilla/Getty Images
2. New Mexico — Republican Gov. Susana Martinez is term-limited.

Martinez is term-limited in New Mexico, leaving an open gubernatorial race in a state that’s trending blue with a large Hispanic population. Rep. Michelle Lujan Grisham is the front-runner in a crowded, divided Democratic primary. She could face a House colleague in the general election: Republican Rep. Steve Pearce, who lost a previous bid for statewide office in 2008. Martinez’s approval ratings are underwater and Democrats see New Mexico, a state that Hillary Clinton won in 2016, as one of their best gubernatorial pickup opportunities in 2018.

There are nearly a dozen candidates in the Maine Democratic primary for governor, including state Attorney General Janet Mills, pictured in 2010. | Robert F. Bukaty/AP Photo
3. Maine — Republican Gov. Paul LePage is term-limited.

There are nearly a dozen candidates in the Maine Democratic primary for governor, including state Attorney General Janet Mills, former state House Speaker Mark Eves, and attorney Adam Cote. Some Republicans were hoping Sen. Susan Collins, who had floated running for governor, would jump into the primary, but she decided in October not to run, leaving a handful of lesser-known Republicans, including state Senate President Mike Thibodeau and former state Health and Human Services Commissioner Mary Mayhew, to vie for the nomination. Democrats are eager to tie whoever emerges from the primary to the controversial LePage, whose disapproval rating is over 50 percent.
4. Connecticut — Democratic Gov. Dannel Malloy is retiring.

Deep-blue Connecticut is actually one of Republicans' best opportunities in 2018. Malloy’s approval ratings were among the worst of any governor in the country, and he decided not to run for a third term. Republicans hope that his unpopularity will clear the way for their candidate next fall. There are almost a dozen candidates running in the Republican primary, and it’s unclear who will emerge as the nominee. A Tremont Public Advisors LLC poll conducted in mid-December found a generic Republican candidate beating a generic Democratic candidate — 35 percent to 23 percent, with 42 percent undecided — even though Connecticut has not voted Republican at the presidential level since 1988.

The Republican front-runner in Nevada is state Attorney General Adam Laxalt, though he still faces a tough road ahead in 2018. | Carolyn Kaster/AP
5. Nevada — Republican Gov. Brian Sandoval is term-limited.

Sandoval is leaving the Nevada governor’s mansion as one of the most popular governors in the country, and the Republican front-runner to replace him is state Attorney General Adam Laxalt. But Laxalt, a rising star in national Republican circles who has clashed with Sandoval in the past, still faces a tough road ahead in 2018. Nevada has been getting more Democratic, and operatives from both parties say the state Democratic Party is one of the most organized in the country. They are hoping to ride momentum from electing a Democratic senator and carrying the state for Hillary Clinton in 2016. Either Clark County Commission Chair Steve Sisolak or Vice Chair Chris Giunchigliani could prove formidable in the general election after a 2018 primary.

Most polling of Florida's Republican gubernatorial primary has shown state Agriculture Commissioner Adam Putnam leading. | Mark Wallheiser/AP Photo
6. Florida — Republican Gov. Rick Scott is term-limited.

Crowded primaries on both the Democratic and Republican sides in a state that often elects politicians by narrow margins make Florida difficult to predict. Most polling of the Republican gubernatorial primary has shown state Agriculture Commissioner Adam Putnam leading the field, but President Donald Trump may have scrambled the field by tweeting support for Rep. Ron DeSantis just before Christmas. Since Trump's tweet, DeSantis has won the support of a number of billionaire Republican donors, calling into question just how strong a hold Putnam has on the primary field. The Democratic primary is even hazier, with former Rep. Gwen Graham and Tallahassee Mayor Andrew Gillum often mentioned as top-tier candidates. Most head-to-head matchups between Putnam and one of the four Democrats running in the primaries have shown a single-digit race, and Democrats are bullish about winning the governor's race for the first time in two decades given the political environment.

Alaska is a reliably Republican state, and the GOP likes its odds against Bill Walker, one of the more unusual governors in the country. | Becky Bohrer/AP Photo
7. Alaska — Independent Gov. Bill Walker is running for reelection.

Walker is an independent, which means he doesn’t enjoy the support of either the Republican Governors Association or the Democratic Governors Association. No Democrat has jumped into the race to challenge Walker from that side, although former Sen. Mark Begich’s name has been floated. On the Republican side, the RGA and a trio of declared GOP gubernatorial candidates — former state House Speaker Mike Chenault, businessman Scott Hawkins and former state Senate President Charlie Huggins — are eager to unseat Walker. The Alaska Republican primary is late, on Aug. 21, so the field could get bigger or change before then. But the bottom line is that Alaska is a reliably Republican state, and the GOP likes its odds against one of the more unusual governors in the country.

Polling has shown former state Senate Democratic leader Gretchen Whitmer leading Michigan's Democratic primary. | Jose Juarez/AP Photo
8. Michigan — Republican Gov. Rick Snyder is term-limited.

Democrats are eager to paint whoever emerges out of the Republican primary as a carbon copy of the term-limited Snyder. Polling has shown former state Senate Democratic leader Gretchen Whitmer leading the Democratic primary field over Abdul El-Sayed, a physician and favorite of the progressive left, and engineer Shri Thanedar. On the Republican side, state Attorney General Bill Schuette is running against Lt. Gov. Brian Calley and state Rep. Patrick Colbeck. Trump has already endorsed Schuette, and polling has shown him with a comfortable lead over the Republican primary field. General-election polls show tight races between the major candidates.

Richard Cordray’s presence in the Ohio race has already attracted the support of high-profile Democrats. | Steve Helber/AP Photo
9. Ohio — Republican Gov. John Kasich is term-limited.

A late entrance by former CFPB Director Richard Cordray into the primary has added a Democrat with a national profile to the list of about five candidates competing for the nomination. But Republicans in Ohio for months have been gearing up for an aggressive primary and general election to succeed Kasich. State Attorney General Mike DeWine and Secretary of State Jon Husted have united under one ticket, while Rep. Jim Renacci, a former businessman first elected to the House in 2010, is running as a Trumpian outsider. Cordray’s presence in the race has already attracted the support of high-profile Democrats like Sen. Elizabeth Warren, who endorsed Cordray shortly after he jumped into the race.

A crowded Democratic primary in a Democratic-leaning state likely heralds a difficult reelection battle for Maryland Gov. Larry Hogan. | Stephan Savoia/AP Photo
10. Maryland — Republican Gov. Larry Hogan is running for reelection.

Hogan has remained popular and kept his distance from Trump in deep-blue Maryland. But a crowded Democratic primary with progressive energy in a Democratic-leaning state likely heralds a difficult reelection battle for Hogan. No clear front-runner has emerged from the primary, which includes former NAACP President Ben Jealous, Prince George’s County Executive Rushern Baker and former State Department official Alec Ross. Polling has shown Hogan with low-double-digit leads against a generic Democrat, but that’s before an actual nominee has been picked.
Rubén Weinsteiner

Does the White Working Class Really Vote Against Its Own Interests?

Trump’s first year in office revived an age-old debate about why some people choose race over class—and how far they will go to protect the system.


As his first year in the White House draws to a close, Donald J. Trump remains in almost every respect a singular character. He exists well outside the boundaries of what most observers previously judged possible, let alone respectable, in American politics. To catalogue the norms he has violated, the traditions he has traduced or trampled, and the rules—written and unwritten—that he has either cunningly sidestepped or audaciously blown to smithereens would require volumes. Love him or loathe him, Trump operates apart from history.

Yet if Trump defies history, paradoxically, he has also resurfaced questions that historians have long debated, including some that many considered long settled. In this sense, Trump hasn’t just defied history; he has changed it—and he has changed the way that we think about it, forcing us to look back on our past with a new lens.

The first of these questions, and perhaps the most fundamental, centers on the white working class. Are working-class white voters shooting themselves in the foot by making common cause with a political movement that is fundamentally inimical to their economic self-interest? In exchange for policies like the new tax bill, which several nonpartisan analyses conclude will lower taxes on the wealthy and raise them for the working class, did they really just settle for a wall that will likely never be built, a rebel yell for Confederate monuments most of them will never visit, and the hollow validation of a disappearing world in which white was up and brown and black were down?

If they did accept that bargain, why? Or are we missing something? Might working-class whites in fact derive some tangible advantage from their bargain with Trump? Is it really so irrational to care more about, say, illegal immigration than marginal income tax rates?

These are good questions. They’re also not new ones. The historian W.E.B. Du Bois asked them more than 80 years ago in his seminal work on Reconstruction, when he posited that working-class Southern whites were complicit, or at least passive instruments, in their own political and economic disenfranchisement. They forfeited real power and material well-being, he argued, in return for the “psychological” wages associated with being white.

Since then, the issue has inspired a vibrant debate among historians. Until last year, most agreed with Du Bois that the answer to the question was not so simple as “yes” or “no”—that whiteness sometimes conferred benefits both imaginary and real.

In the age of Trump, we’re once again pressure-testing Du Bois’ framework. As one might expect, it’s complicated. White identity pays dividends you can easily bank, and some that you can’t.


In 1935 Du Bois published his most influential treatise, Black Reconstruction, a reconsideration of the period immediately following the Civil War. One of the historical quandaries that Du Bois addressed was the successful effort of white plantation owners in the 1870s and 1880s to build a political coalition with poor, often landless, white men to overthrow biracial Reconstruction governments throughout the South.

“The theory of laboring class unity rests upon the assumption that laborers, despite internal jealousies, will unite because of their opposition to the exploitation of the capitalists,” wrote Du Bois, who trained at both the University of Berlin and Harvard, and whose grounding in Marxist political economy taught him to view politics through the lens of different but fixed stages in capitalist development. “This would throw white and black labor into one class,” he continued, “and precipitate a united fight for higher wages and better working conditions.”

That, of course, is not what happened. In most Southern states, poor whites and wealthy whites forged a coalition that overthrew biracial Reconstruction governments and passed a raft of laws that greatly benefited plantation and emerging industrial elites at the expense of small landowners, tenant farmers and factory workers. “It failed to work because the theory of race was supplemented by a carefully planned and slowly evolved method,” Du Bois wrote, “which drove such a wedge between white and black workers that there probably are not today in the world two groups of workers with practically identical interests who hate and fear each other so deeply and persistently and who are kept so far apart that neither sees anything of common interest.”

Du Bois famously posited that “the white group of laborers, while they received a low wage, were compensated in part by a sort of public and psychological wage. They were given public deference and titles of courtesy because they were white.”

Decades before so many white working-class citizens of Pennsylvania, Michigan, Ohio and Wisconsin—to say nothing of Alabama, West Virginia and Mississippi—cast their lot with a party that endeavors to raise their taxes and gut their health care, Du Bois identified the problem: Some wages aren’t denominated in hard currency. They carry a psychological payoff—even a spiritual one.


The most obvious time and place to pressure-test Du Bois’ theory is the Jim Crow South. In the 60-odd years between the collapse of Reconstruction and World War II, the South—still reeling from the Civil War, in which it lost the present-day equivalent of approximately $5.5 trillion in real property and wealth—slipped into a semi-permanent state of economic crisis.

In 1938, President Franklin Roosevelt declared the region “the Nation’s No. 1 economic problem.” It was, as historian Gavin Wright famously observed, a “low-wage region in a high-wage country,” one where two-thirds of the population lived in small towns of fewer than 2,500 people, derived meager incomes from agriculture, mining or manufacturing, and even in the midst of a national depression, stood out for poor health, want of education and lack of opportunities for upward mobility.

The vast majority of farmers, black and white, were tenants or sharecroppers, and repressive poll taxes disenfranchised not just black men and women, but also poor white people. Designed by wealthy plantation owners and industrialists, the poll tax was expressly a class measure, meant to preserve the region’s prevailing low-tax, low-wage, low-service economy. It was more ingenious and insidious than many people today realize. In Mississippi and Virginia, it was cumulative for two years; if a tenant farmer or textile worker couldn’t pay in any given year, not only did he miss an election cycle, he had to pay a full two years’ tax to restore his voting rights. In Georgia, the poll tax was cumulative from the time a voter turned 21 years old—meaning, if one missed 10 years, he or she would have to pay a decade’s worth of back taxes before regaining the right to vote. In Texas, the tax was due on February 1, in the winter off-season, when farmers were habitually strapped for cash. It was, as one Southern liberal observed at the time, “like buying a ticket to a show nine months ahead of time, and before you know who’s playing, or really what the thing is all about.”
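The compounding logic of these poll-tax schemes can be made concrete with a minimal sketch. The $1.00 annual rate below is a hypothetical round figure for illustration only (actual rates varied by state and year); the two functions mirror the two-year cap described for Mississippi and Virginia and the uncapped accumulation described for Georgia.

```python
ANNUAL_TAX = 1.00  # hypothetical per-year poll tax in dollars (illustrative, not historical)

def back_taxes_two_year_cap(years_missed):
    """Mississippi/Virginia model: cumulative, but capped at two years' tax."""
    return ANNUAL_TAX * min(years_missed, 2)

def back_taxes_since_21(years_missed):
    """Georgia model: cumulative from age 21, with no cap."""
    return ANNUAL_TAX * years_missed

# A tenant farmer who skipped ten election cycles:
print(back_taxes_two_year_cap(10))  # two years' tax restores the vote
print(back_taxes_since_21(10))      # a full decade of back taxes comes due
```

The uncapped Georgia scheme made the cost of re-enfranchisement grow linearly with every year a poor voter stayed away, which is exactly why a decade's absence meant a decade's worth of back taxes.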

Little wonder that in 1936, three of four voting-age adults outside the South participated in the presidential election, but in the South, just one in four cast ballots. The system kept men like Eugene Cox, a conservative Democrat who held the powerful post of House Rules Committee chairman, in power. In 1938, Cox won re-election with 5,137 votes, though his district in southwest Georgia had a total population of 263,606 residents.

Yet when working-class Southern whites could participate in the political process, they often jettisoned their natural class interests in favor of racial solidarity. Historians have focused special attention on the rise and fall of the Readjuster movement, a biracial coalition that controlled the legislature, governor’s office and most federal posts in Virginia between 1879 and 1883. Forged in opposition to a conservative Democratic establishment that had shuttered schools, imposed regressive taxes, and favored creditors over debtors, the alliance passed a raft of measures that presaged much of the Populist movement’s agenda in coming years. For a time, it held. But in 1883 Democrats campaigned with intense focus on the issue of inter-marriage and miscegenation—a rare phenomenon that nevertheless struck a raw nerve with white workers and farmers. They warned that Readjuster rule would result in “mixed schools now and mixed marriages for the future.” It worked. Conservative “Bourbon” Democrats regained control of state government and reintroduced regressive, one-party rule that benefited a small minority of Virginians.

To reduce Jim Crow politics to a single trajectory is to oversimplify a complicated story. But the problem of white working-class Southerners bedeviled generations of liberal activists and the historians who studied them. When the union federation Congress of Industrial Organizations (CIO) launched Operation Dixie, a massive effort to unionize Southern workers in the mid-1940s, organizers ran into the same wall: Conservative politicians and their wealthy patrons successfully used race as a cudgel to turn white workers away from collective bargaining agreements that would have raised their wages. Even those Southern populists who ostensibly opposed Bourbon rule—from Georgia’s Tom Watson in the early 20th century to Mississippi’s Theodore Bilbo in the 1930s—more often flipped the playbook and used race as a blunt instrument against their elite opponents.

Southern liberals in the 1930s and 1940s applied a sharp class focus and concluded that wealthy Democrats wanted, in historian Gavin Wright’s words, to keep labor “cheap and divided.” The white liberal writer Lillian E. Smith famously captured this thinking in her short story, “Two Men and a Bargain,” which began: “Once upon a time, down South, a rich white man made a bargain with a poor white ... ‘You boss the nigger, and I’ll boss the money.’”


Critically, Du Bois never insisted that the psychological wages of whiteness were wholly devoid of tangible value. What they forfeited in material benefits, working-class whites also recouped in limited power and privilege. “They were admitted freely with all classes of white people to public functions, public parks, and the best schools,” he wrote. “The police were drawn from their ranks. … The newspapers specialized on news that flattered the poor whites and utterly ignored the Negro except in crime and ridicule. On the other hand, the Negro was subject to insult; was afraid of mobs; was liable to the jibes of children and the unreasoning fears of white women; and was compelled almost continuously to submit to various badges of inferiority.” You couldn’t necessarily buy groceries with these benefits, but they were palpably meaningful.

David Roediger, a historian of class and race who writes with a Marxian lens, emphasized exactly this point in his classic volume, The Wages of Whiteness, published in 1991 (the title was a direct tribute to Du Bois). He encouraged a generation of scholars to consider that working-class whites may not have been unwitting dupes in their own economic subjugation; instead, they knowingly harvested certain real advantages of whiteness. While this pattern was most visible in the South, it also deeply influenced political culture in the North and West, where whiteness was no less central to popular conceptions of American citizenship. Roediger's own focus was on Northern workers in antebellum cities—workers undergoing the jarring transition from pre-industrial forms of work and leisure to a more regimented existence as wage laborers.

The workers whom Roediger describes, and whom dozens more scholars would similarly study, understood that American citizenship was predicated on race and independence; Congress, after all, had opened citizenship to all “free white persons” in 1790. That law remained on the books into the 20th century. But what did it mean to be “white?” Congress never made that point clear. Indeed, there was no immediate consensus that certain new immigrants met the qualification. And what did it mean to be “free?” Their new status as wage earners—economically dependent on other men to earn a living—seemingly made many working men and women something less than free. Many non-black workers keenly understood that they might be left outside the boundaries of citizenship. They also resented new forms of industrial discipline that their employers foisted onto them. Many addressed these anxieties by drawing a sharp dichotomy between white and black—citizen and slave—and placing themselves on one side of that divide.

They became avid purveyors of blackface minstrelsy—a popular form of entertainment in which working-class whites reveled in watching other working-class whites apply burnt cork to their faces and act out what the historian George Rawick (writing more generally about early American racism) described as a “pornography of [their] former [lives].” The black characters they portrayed on stage were shiftless, sexually promiscuous and rowdy; they reveled in pre-industrial activities like hunting. They were coarse. In short, they deflected on black people, both slave and free, the very same social demerits that wealthier whites—who were trying to impose new discipline on the urban working class—ascribed to them.

Playbills commonly “paired pictures of the performers in blackface and without makeup—rough and respectable,” Roediger observed. The former were labeled, “Plantation Darkeys.” The latter, “Citizens.” By culturally differentiating themselves from black people, actors and audience members alike established themselves as “free white persons.”

While it’s easy to imagine that working-class whites embraced the new racial dichotomy in order to enjoy leverage in the new urban job market, in many cases, black and white workers weren’t even in competition with each other. Many of the most popular blackface actors were former artisans and mechanics—coach makers, typesetters and wood craftsmen who were now increasingly likely to fall into “wage slavery.” They were unlikely to vie for employment with free black men, who were normally consigned to unskilled jobs as dockworkers, day laborers, and (until Irish women displaced them) domestic servants.

One group that did sometimes compete for unskilled jobs with African Americans was Irish immigrants. Regarded as racially suspect—depicted in political cartoons as dark and ape-like, and patently unqualified for citizenship—Irish immigrants became some of the most avid and violent practitioners of white identity politics. Even when they weren’t in direct competition with black men for jobs—as when a group of Irish handloom weavers was displaced by white Protestant weavers in Philadelphia in 1844—they donned blackface and mobbed their black neighbors. The point wasn’t to get their jobs back.

Indeed, more was at stake than cash wages. To achieve standing as free white persons—and to enjoy the many benefits of citizenship that accrued from that definition—working-class men in the antebellum era consciously asserted their white identity and set it apart from blackness through language, performance, politics and violence. To imagine that they didn’t understand the full impact of their decisions is to deny them any modicum of intelligence or agency.


If working-class whites historically derived both psychological and citizenship wages by privileging race over class, is it possible that they sometimes enjoyed real wages as well? Beginning 25 years ago, a rising generation of political historians including Thomas Sugrue, Kevin Kruse, Matthew Lassiter, Robert Self and Craig Steven Wilder concluded that they did. Giving special focus to labor and housing markets, they found that many working-class white families benefited directly from government policies that placed African Americans at a disadvantage.

Take housing. Beginning in the 1930s, most mortgages were underwritten by the Federal Housing Administration (FHA), a federal agency that insured banks against losses from homeowners who defaulted on their loans. The FHA insured these mortgages in return for the banks’ pledge to provide home loans at low interest rates and to give borrowers at least 15 and as many as 30 years to pay back their loans. At minimal expense to the federal government and with only the pledge of default insurance, the FHA freed up unprecedented levels of capital and helped create a postwar social order in which 60 percent of American households owned and accumulated wealth in their own homes.
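Why long amortization mattered can be seen from the standard fixed-rate annuity formula: stretching the same loan over more years shrinks each monthly payment. The loan amount and interest rate below are hypothetical round numbers chosen for illustration, not historical FHA figures.

```python
def monthly_payment(principal, annual_rate, years):
    """Monthly payment on a fixed-rate, fully amortizing loan."""
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # total number of monthly payments
    return principal * r / (1 - (1 + r) ** -n)

# A hypothetical $5,000 loan at 4.5% annual interest:
short_term = monthly_payment(5000, 0.045, 5)    # short pre-FHA-style term
long_term = monthly_payment(5000, 0.045, 30)    # FHA-style 30-year term
print(round(short_term, 2), round(long_term, 2))
```

The 30-year schedule cuts the monthly burden to roughly a quarter of the five-year figure, which is how FHA insurance put homeownership, and the wealth it built, within reach of ordinary wage earners.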

In deciding whether or not to insure mortgages, the FHA rated every census tract in the country. Assuming that houses lost value in neighborhoods that were racially mixed or primarily populated by African Americans and Latinos, the FHA assigned such areas lower scores or “redlined” them altogether, refusing to insure mortgages in these neighborhoods or insuring them on unfavorable terms. This meant that most black Americans could not secure mortgages, as their mere presence in a neighborhood would choke off affordable credit.

In a perverse twist, black residents in many Northern cities had little recourse but to rent cramped, sub-divided apartments in buildings whose white landlords often neglected repairs and upkeep, and the physical decay of their homes fed white Americans’ suspicions that black residents chose to live in squalor.

It was not just a matter of housing. A powerful combination of private-sector discrimination and nepotism within trade unions had long excluded black workers from well-paid, blue-collar industries. As George Meany, the president of the AFL-CIO, crassly admitted, “When I was a plumber, it never [occurred] to me to have niggers in the union!” Even in liberal bastions like New York City, African Americans in the 1950s and 1960s comprised less than 5 percent of all dock workers, skilled machinists, electricians or unionized carpenters—the types of jobs that afforded non-college-educated white men access to middle-class comfort and economic security in the post-war period. The black unemployment rate was double the city’s overall unemployment rate. And New York was better than most places. In Chicago, 17 percent of black adults in the early 1960s were unemployed. In Cleveland, 20 percent. In Detroit, 39 percent.

By the time federal and state officials got serious about enforcing fair employment laws in the 1970s, America’s manufacturing and extractive industries had already fallen into steady decline. In effect, two post-war forces most responsible for lifting millions of working-class families into middle-class comfort and privilege—the suburban housing boom and unionized blue-collar jobs—only became available en masse to black Americans just as the post-war boom drew to a close.

This wasn’t a simple case of discrimination or inequality. Working-class white families affirmatively enjoyed what the historian George Lipsitz termed a “possessive investment in whiteness.” They availed themselves of the G.I. Bill’s housing and education benefits, paid for in part by black people’s taxes, at a time when black veterans faced sharp limitations on where (or whether) they could draw the same benefits. They accumulated equity in their suburban homes and used it to send their children to college or to save for their retirement. They enjoyed access to public services—from public schools and public trash collection, to clean water and sewage—that were deficient in majority-minority neighborhoods. These advantages conferred second-order benefits, including better health and a higher average life expectancy.

In other words, whiteness did pay real wages. It delivered an inter-generational advantage to those who were in a position to claim it. And white working-class Americans seemed on some level to understand it. When in 1966 Lyndon Johnson attempted to ram through Congress a law banning racial discrimination in the sale and rental of housing, white working-class voters revolted both in the streets and at the polls. (Ronald Reagan, a washed-up former actor, unseated the otherwise popular incumbent governor of California, Pat Brown, largely by touting his opposition to the state’s open housing law.) That summer, when Martin Luther King Jr. led protests throughout the “bungalow belt” in Chicago’s working-class white neighborhoods and the nearby blue-collar suburb of Cicero, Polish, Italian, and Irish residents who had once been staunch Democratic voters erupted in fury. They pelted protesters with rocks and beat them with clubs amid cries of “White Power!”; “Burn them like Jews”; “We want Martin Luther Coon!”; “Roses are red, violets are black, King would look good with a knife in his back.”

Just a few years later, when the federal government began requiring that government contractors and the industrial unions that did business with them take affirmative steps to integrate their workforces, white voters gravitated to backlash politicians who promised to preserve their privilege in the job sector. Even as late as 1990, conservative Republican Senator Jesse Helms was able to make openly racist appeals on such grounds and pay no price. On the contrary, it was a winning formula.


The same dynamic that DuBois grappled with is on display today. In breaking for Donald Trump and the GOP, working-class white voters are manifestly undercutting their economic self-interest. To be sure, Trump didn’t campaign like an archetypal GOP plutocrat. He railed against free trade and immigration, policies that many white working-class citizens believe, with some justification, have hurt their communities. He promised to bring back manufacturing and coal mining jobs, eliminate generous tax loopholes for wealthy families like his own, and—like Andrew Jackson, after whom he has patterned his presidency—privilege the many over the few.

But Democrats and Never Trump Republicans shouted at the top of their lungs that Trump’s campaign promises either weren’t possible or wouldn’t help working-class voters as much as he pledged. And they appear to have been right. The president recently signed into law a tax bill whose benefits, according to the nonpartisan Tax Policy Center and the Congressional Budget Office, accrue principally to corporations and super-rich individuals; many middle-class and working-class families will ultimately face a tax hike. The administration and its congressional supporters have also taken steps to make health care less affordable or altogether inaccessible, destabilize retirement security for working-class families, and allow industrial polluters to despoil the air they breathe and the water they drink. Despite what Trump said on the campaign trail, his agenda does little to help and much to hurt struggling white families.

Of course, whiteness still delivers other dividends—as it always has. It makes one less likely to be killed by a police officer during a traffic stop. It enables white men to carry assault weapons (including long guns) in places of public accommodation, while a black man might be shot and killed by law enforcement officials merely for picking up a BB gun displayed on a sales rack at Walmart. It affords working-class white families the peace of mind that the government won’t invade homes or hospitals in pursuit of undocumented children or grandparents. Whiteness, in other words, continues to pay tangible benefits, and, right or wrong, it makes some sense that its primary beneficiaries are loath to support candidates who expressly promise to disrupt this privileged status.

Yet Trump has also, arguably more than any other candidate for president in the last hundred years (excepting third-party outliers like Strom Thurmond and George Wallace), played to the purely psychological benefits of being white. From his racially laden exhortations about black crime in Chicago and Latino gangs seemingly everywhere, to his attacks on an American-born federal judge of Mexican parentage and Muslim gold star parents, he has paid the white majority with redemption and revanchism. Trump might be increasing economic inequality, but at least working-class whites feel like they belong in Trump’s America. He urged them to privilege race over class when they entered their polling stations. And it didn’t stop there. As Ta-Nehisi Coates argues, Trump swept almost every white demographic group, forging a “broad white coalition that ran the gamut from Joe the Dishwasher to Joe the Plumber to Joe the Banker.” It’s not just blue-collar white people who seem blithely willing to sacrifice economic rationality for racial solidarity. After all, it arguably took a special kind of stupid for upper-middle-class suburbanites in high-tax states to support a party that just raised their taxes. (No, this wasn’t a bait-and-switch. The GOP leadership has talked openly about eliminating deductions for state and local taxes since 2014.) Unless, that is, you account for the wages of whiteness.

2017 and the curious demise of Europe's centre-left

Rubén Weinsteiner


Across Europe this year, traditional centre-left parties lost out to new forces of the left, the right and sometimes the centre. What’s behind this historic shift, and is Britain really immune? Jon Henley explains

So what happened to Europe’s centre-left?

The spectre that haunts Europe’s centre left has a name: Pasokification. In 2009, Greece’s once-great social democratic party won 43.9% of the national vote. Barely six years later, it could manage just 6.3%.

Atomised in France, all but wiped out in the Netherlands, humiliated in Germany, Europe’s mainstream centre left is in full retreat. Even in its one-time stronghold of Scandinavia, social democracy is now struggling.

There are many reasons. The embrace-the-market “Third Way” policies of leaders such as Tony Blair and Gerhard Schröder worked fine in the turn-of-the-century boom years but seem to offer little to today’s vulnerable centre-left voters.

The fallout from the 2008 financial crash – high unemployment, lower living standards, ongoing public spending cuts – has combined with long-term trends (globalisation, automation, immigration, changing class identities, declining union membership) to eat into the centre left’s core electorates.

Openly addressing those fears, populist far-right parties have attracted the votes of many who traditionally supported the centre left. The rise of a new anti-capitalist, anti-globalisation, anti-establishment far left has proved equally damaging.

The moderate European left that played such a fundamental part in rebuilding western Europe’s post-war democracy is not yet dead. But unless it can once more offer voters credible solutions to their present-day problems, it could be in terminal decline.
Where did all those votes go?

Although the centre-left’s decline was common, its beneficiaries differed from country to country; in France votes went to centrists and the harder left, in Germany to the extremes of left and right, and in the Netherlands to all points of the spectrum. The ramifications for each country’s government were varied too, with arguably the most serious consequences in the one country where the centre-left fared relatively well.
France
Centre-left party: Parti Socialiste (Socialist Party)
Election: Presidential, first round, 23 April
Change in vote share from previous: down 22.2 percentage points

In 2012, François Hollande swept into the Elysée palace at the head of an all-conquering Socialist party that also controlled parliament, the senate and a majority of France’s regions.

Five years later, he became the Fifth Republic’s first head of state not to seek re-election; the party’s official presidential candidate, Benoît Hamon, finished fifth; and after parliamentary elections the Socialists slumped from 280 MPs to just 30.

Outflanked to the right by the reforming centrist Emmanuel Macron and to the left by the radical firebrand Jean-Luc Mélenchon, the party of François Mitterrand managed just 7.4% of the national vote in the parliamentary elections. It is now firing 60 of its 100 staff and selling its illustrious party HQ on the rue Solférino to raise money.

The Netherlands
Centre-left party: Partij van de Arbeid (Labour Party)

Election: Lower house, 15 March
Change in vote share from previous: down 19.1 percentage points

The headline story of this year’s Dutch legislative elections was the worse-than-expected result of the populist anti-Islam politician Geert Wilders and his PVV Freedom party. But the poll also marked a crushing defeat for the Dutch Labour party.

Punished for backing the liberal policies pursued by the outgoing centre-right-led coalition of which it was a part, the PvdA saw its parliamentary party collapse from 38 MPs to nine after falling to just 5.7% of the vote.

It lost votes to several smaller parties in a now heavily fragmented Dutch political landscape, but above all to the Green party of the charismatic young Jesse Klaver (“the Jessiah”), which overtook it to become the largest party of the left.


Germany

Centre-left party: Sozialdemokratische Partei Deutschlands (Social Democratic Party of Germany)

Election: Bundestag, 24 September
Change in vote share from previous: down 5.2 percentage points

Germany’s SPD also paid the price for four years as minority partner in a coalition led by a centre-right behemoth, namely Angela Merkel’s CDU. It slumped in September’s poll to a score of 20.5%, the party’s worst since the second world war.

The SPD enjoyed an early boost when it chose former European parliament president Martin Schulz as its candidate for chancellor, but that soon faded. It swore after the election to spend the next four years in opposition to win back lost core support.

After the collapse of three-way coalition talks between the CDU, pro-business FDP and Greens, however, the SPD – a buttress of the European left for 150 years – came under intense pressure to reconsider and is now in cooperation talks with Merkel’s party.

Austria

Centre-left party: Sozialdemokratische Partei Österreichs (Social Democratic Party of Austria)

Election: Parliament, 15 October
Change in vote share from previous: up 0.1 percentage points

Austria’s SPÖ is an exception. Despite being in government since 2007, its share of the vote actually increased, by 0.1 percentage points to just under 27%, and it lost no MPs. Its respectable second place, however, had perhaps the biggest consequences.

Sharp swings from minor parties to the SPÖ’s main rivals – the centre-right ÖVP led by another charismatic young leader, Sebastian Kurz, and the far-right FPÖ – have left Austria the only country in western Europe with a far-right presence in government.

Britain

Labour sprang a big surprise in 2017, storming to a near 10-point swing in vote share and increasing the size of its parliamentary party from 232 MPs to 262 – an achievement most experts considered impossible when the June election was called.

The party’s leader, Jeremy Corbyn, called the result “a victory for hope”. But is it a mainstream centre-left party? Moderates might argue that in its present incarnation, the party – in power from 1997 to 2010 – is closer to the radical, militant idealism of Greece’s Syriza or Spain’s Podemos than to France’s PS or Germany’s SPD.


viernes, 29 de diciembre de 2017

Putin’s Medieval Dreams


As much of the world makes amends for social and political injustices of the past, Russia is lionizing its despots, raising statues to the worst of them. Behind this phenomenon is an ultra-nationalist brand of conservatism that seeks to take Russian politics back to the Middle Ages.
While much of the world is busy dismantling monuments to oppressors, Russians are moving in the opposite direction, erecting statues to medieval warlords who were famous for their despotism. Understanding this revival can shed light on the direction of Russia’s politics.

In October 2016, with the endorsement of Russia’s culture minister, Vladimir Medinsky, the country’s first-ever monument to Ivan the Terrible was unveiled in the city of Orel. A half-year later, President Vladimir Putin christened Moscow’s own tribute to the tyrant, declaring, erroneously, that “most likely, Ivan the Terrible never killed anyone, not even his son.” And a month ago, Vladimir Zhirinovsky, the leader of the ultra-nationalist Liberal Democratic Party of Russia, called for Lenin Avenue in Moscow to be renamed Ivan the Terrible Highway.

Most historians agree that Ivan lived up to his name; not only did he kill his son and other relatives, he also ordered the oprichnina, the state-led purges that terrorized Russia from 1565 to 1572. He also presided over Russia’s defeat in the Livonian War, and his misrule contributed to the Time of Troubles and the state’s devastating depopulation.

Joseph Stalin initiated the modern cult of Ivan the Terrible. But, since the mid-2000s, Russia’s Eurasia Party – a political movement led by the pro-fascist mystic Alexander Dugin – has moved to position Ivan as the best incarnation of an “authentic” Russian tradition: authoritarian monarchy.

Dugin’s brand of “Eurasianism” advocates the embrace of a “new Middle Ages,” where what little remains of Russian democracy is replaced by an absolute autocrat. In Dugin’s ideal future, a medieval social order would return, the empire would be restored, and the Orthodox church would assume control over culture and education.

Eurasianism, which was marginal in the 1990s, has gained considerable popularity in recent years by contributing to the formation of the so-called Izborsky Club, which unites the Russian far right. On several occasions, Putin has referred to Eurasianism as an important part of Russian ideology; he has even invoked it as a founding principle of the “Eurasian Economic Union,” a burgeoning trade area of former Soviet states.

Eurasianism has given ultra-nationalist groups common ground around which to unite. It has also given symbols of totalitarianism, like Ivan the Terrible and Stalin, new legions of support.

Chief among them are members of the Eurasia Party, who consider political terror the most effective tool of governance and call for a “new oprichnina” – a staunchly anti-Western Eurasian conservative revolution. According to Mikhail Yuriev, a member of the political council of the Eurasia Party and author of the utopian novel The Third Empire, the oprichniks should be the only political class, and they should rule by fear.

Ivan the Terrible is not the only medieval vestige being revived in Russia. Cultural vocabulary is also reverting. For example, the word kholop, which means “serf,” is returning to the vernacular, a linguistic devolution that parallels a troubling rise in Russia’s modern slavery. Data from the Global Slavery Index show that more than one million Russians are currently enslaved in the construction industry, the military, agriculture, and the sex trade. Moreover, serf “owners” are also happily identifying themselves as modern-day barins.

Even Russian officials speak approvingly of modern slavery. Valery Zorkin, who chairs the Constitutional Court, wrote in Rossiyskaya Gazeta, the official government newspaper, that serfdom has long been a “social glue” for Russia. And another medieval term – lyudi gosudarevy, which translates to “servants of his majesty” – has returned to favor among high-ranking bureaucrats.

Nostalgia for serfdom complements the desire for a return to autocracy. Prominent Russian intellectuals – including the filmmaker Nikita Mikhalkov, journalist Maksim Sokolov, and Vsevolod Chaplin, a Russian Orthodox cleric – call for the coronation of Putin, and petitions of support are gaining signatures online. Significantly, the protests against Putin’s regime in 2012 have since been interpreted not as a protest against Putin himself, but rather against the social order to which Eurasianism aspires.

Putin’s tacit support for the Eurasian vision of a neo-medieval Russia invokes the historical memory of Stalinism. According to Dugin, “Stalin created the Soviet Empire,” and, like Ivan the Terrible, expresses “the spirit of the Soviet society and the Soviet people.” No wonder, then, that monuments to Stalin, too, are multiplying in Russian cities.

Neo-medievalism is rooted in nostalgia for a social order based on inequality, caste, and clan, enforced by terror. The lionization of historical despots reflects the contemporary embrace of such pre-modern, radically anti-democratic and unjust values. For Ivan’s contemporary champions, the past is prologue.

“We’ve centralized all of our data to a guy called Mark Zuckerberg,” says Pirate Bay Founder

At its inception, the internet was a beautifully idealistic and equal place. But the world sucks and we’ve continuously made it more and more centralized, taking power away from users and handing it over to big companies. And the worst thing is that we can’t fix it — we can only make it slightly less awful.

That was pretty much the core of the talk Pirate Bay co-founder Peter Sunde gave at the tech festival Brain Bar Budapest. TNW sat down with the pessimistic activist and controversial figure to discuss how screwed we actually are when it comes to decentralizing the internet.
Forget about the future, the problem is now

In Sunde’s opinion, people focus too much on what might happen, instead of what is happening. He often gets questions about what a digitally bleak future might look like, but the truth is that we’re living it.
Everything has gone wrong. That’s the thing, it’s not about what will happen in the future it’s about what’s going on right now. We’ve centralized all of our data to a guy called Mark Zuckerberg, who’s basically the biggest dictator in the world as he wasn’t elected by anyone.
Trump is basically in control over this data that Zuckerberg has, so I think we’re already there. Everything that could go wrong has gone wrong and I don’t think there’s a way for us to stop it.

One of the most important things to realize is that the problem isn’t a technological one. “The internet was made to be decentralized,” says Sunde, “but we keep centralizing everything on top of the internet.”

To support this, Sunde points out that in the last 10 years, almost every up-and-coming tech company or website has been bought by the big five: Amazon, Google, Apple, Microsoft and Facebook. The ones that manage to escape the reach of the giants often end up adding to the centralization.
We don’t create things anymore, instead we just have virtual things. Uber, Alibaba and Airbnb, for example, do they have products? No. We went from this product-based model, to virtual product, to virtually no product whatsoever. This is the centralization process going on.

While we should be aware of the current effects of centralization, we shouldn’t overlook that it’s only going to get worse. There are a lot of upcoming tech-based services that are at risk of becoming centralized, which could have a huge impact on our daily lives.
We’re super happy about self driving cars, but who owns the self driving cars? Who owns the information about where they can and can’t go? I don’t want to ride in a self driving car that can’t drive me to a certain place because someone has bought or sold an illegal copy of something there.

Sunde firmly believes that this is a realistic scenario as companies will always have to put their financial gains first, before the needs of people and societies. That’s why there needs to be a greater ethical discussion about technology and ownership, if we don’t want to end up living in a corporate-driven dystopia (worse than our current one, that is).
Making a shitty situation slightly more tolerable

Feeling a bit optimistic, I asked Sunde whether we could still fight for decentralization and bring the power back to the people. His answer was simple.
No. We lost this fight a long time ago. The only way we can do any difference is by limiting the powers of these companies — by governments stepping in — but unfortunately the EU or the US don’t seem to have any interest in doing this.

So there’s still some chance for a less awful future, but it would require a huge political effort. However, in order to achieve that, the public needs to be informed about the need for decentralization — but historically that’s not likely to happen.
I would say we, as the people, kind of lost the internet back to the capitalist society, which we were hoping to take it back from. We had this small opening of a decentralized internet but we lost it by being naive. These companies try to sound good in order to take over, that they’re actually ‘giving’ you something. Like Spotify gives you music and has great passion for music, and all of the successful PR around it.
But what it does to us in the long term is more like smoking. Big data and Big Tobacco are really similar in that sense. Before, we didn’t realize how dangerous tobacco actually was, but now we know it gives you cancer. We didn’t know that big data could be a thing, but now we know it is. We’ve been smoking all our lives on big data’s products and now we can’t quit.

And just like with tobacco, it’s governments that need to create the restrictions. However, it’s difficult to see how any government — except for big players like the US and the EU — is supposed to be able to restrict the powerful tech giants.

Sunde feels that as the EU behemoth becomes bigger, it will be more difficult to pass laws that are actually for humans and that give people extended rights. That’s unfortunate, as the EU technically has the legislative power to make an actual difference when it comes to decentralization.
The EU could say that if Facebook wants to operate within the EU, they have to agree that all of the data has to be owned by the user, and not by Facebook. Which would be quite simple for the EU to do, but of course that would make Facebook really upset.
Then every country would be scared to be the first one to implement the law because Facebook would leave and all of its citizens would be without their tobacco. That’s the problem we’ll always have.

Sunde, however, is slightly optimistic (but not really) as he doesn’t feel that this fight has to necessarily go through monolithic governments to reach some kind of successful result. In fact, it might actually be more likely to succeed on a smaller national level.

In regards to this, Sunde names my beloved Iceland as an example, where the Pirate Party, running on a platform of groundbreaking digital policies, almost got into government. Dramatic changes on a national level, no matter how small the population, could have great effects on the global community. In short, countries can lead by example.

Sunde, who’s half Norwegian and half Finnish, says that another good example of leading digital policies on a national level is when Finland made access to the internet a human right in 2010. By giving people these rights, the government had to define what the internet actually is and prevented future discussions about censorship — bolstering people’s rights against further centralization.

If nation states can actually facilitate further decentralization, as Sunde suggests, then we might actually be able to hamper the immense power of big corporations. Countries like Estonia have shown that politicians can come up with digital policies that actually preserve citizens’ rights in a digital age.

However, we humans are illogical creatures that don’t necessarily do the things that are good for us: “It’s better for the people, but we don’t want to suffer that one single down-time second of our beloved tobacco.”

Instagram and the Cult of the Attention Web: How the Free Internet is Eating Itself

Jesse Weaver

I’m disappointed about Instagram’s most recent announcement. They’ll be shifting their photo feed from a chronological list to an algorithmically driven one, ordered based on which posts they think you will like most. My disappointment is not based in nostalgia or a lament of change. I’m disappointed because the decision is a symptom of a larger problem that is eating the web.

Over the past few decades a significant portion of the economy has shifted. Once upon a time, companies and services were geared toward enticing you out of your money. Today, the goal of many is to entice you out of your time, which, in turn, is leveraged as collateral to attract money from advertisers.

Our current version of the internet lives and breathes off a currency of human attention, with the success or failure of many internet companies predicated on how much of a person’s time they can capture.

This model has reshaped much of the internet into an “attention web”, with companies fighting tooth and nail to own every possible moment of your time.

As laid out in a recent New York Times piece about the Instagram change:
“These companies want to always, always give you the next best thing to look at,” said Brian Blau, a vice president at Gartner, an industry research firm. “If an algorithm can give you much more engaging content more frequently, you’ll stick around longer.”
The more time people spend using Instagram, the more often the company is able to serve people ads.

It’s the Faustian bargain we’ve all struck. In exchange for a “free” web, we give you our time. Unfortunately, this structure is unsustainable and is compromising both our experience of the web and the quality of the things we consume.

Time is more precious than money. Money is a renewable resource. Everyone always has the potential to make more money. Time, on the other hand, is finite. There are only so many hours in a day. By definition, you only have so much time to give.

The finite nature of time means that, in the world of the attention web, the competitive landscape is all-encompassing. Everything is in competition with everything else. Facebook is as much in competition with Twitter as it is with Spotify and Apple Music, Gawker and BuzzFeed, Hulu and YouTube, Candy Crush and Two Dots, Amazon and Walmart, Xbox and PlayStation, Chipotle and your family dinner table, your hobbies and your bed. Because in the attention web, time spent shopping, eating, talking, playing, or sleeping is time that you are not looking at ads. It’s why Facebook has experimented with in-feed shopping. It’s why they bought a messaging app and a VR company. It’s behind their big drive into video, as well as article self-publishing. They have to compete on all fronts to win the attention war. If they could serve up your meals they would.

Coca-Cola talks about trying to win “share of stomach”, acknowledging that it is not just in competition with the other players in the drink industry, but with every other food company and restaurant for the finite resource of stomach real estate. The attention web has taken this concept to a new scale that pits a vast array of industries against each other. This broad, unending competition for people’s time takes its toll on even the most popular services. See Twitter, Yahoo, Zynga and others.

As with all finite resources, there is a physical cap to how much time can be mined from the world, with population size as the forcing function. The number of people on the internet is directly proportional to the amount of time available. If you assume that technology companies want to maintain their growth curves, there are three possible avenues for them to take against this constraint:

Grow the size of the population with internet access.

Free up more time for the people who already have internet access.

Or create more people.

While no tech company is currently trying to create more people (except maybe Tinder), the other two paths have already started to manifest. Major players are trying to expand global internet access. Facebook’s internet.org initiative is geared toward bringing free internet access to populations without it, and Google’s Project Loon is designed to create a balloon-based network delivering reliable internet to isolated rural areas.

Google is also one of the best examples of a company taking the second avenue: free up more time for people who already have internet. Their push into self-driving car technology has a lot of potential benefits for humanity, but it also does something fundamental for Google and their business model. Time spent in the car is a vast untapped reserve of human attention. If your daily commute isn’t filled with trivial things like watching the road and trying not to kill people, you suddenly have a lot more time to search — and be served search ads. Building a self-driving car may seem like an extreme measure just to free up people’s time, but it’s really just the tech equivalent of fracking — the oil industry’s extreme attempt to unlock untapped reserves.

At some point though, the reserves run out, and as more and more competitors (from almost every industry) come onto the scene, all vying for their slice of the time pie, simply expanding internet access and freeing up time isn’t enough. You still have to win people’s attention.

Ostensibly the drive to capture share of attention should be a big win for consumers. It’s often positioned that way, as in Instagram’s characterization of their timeline change as a step “to improve your experience.” And, based on the principles of human-centered design, companies should be striving for the best possible user experience and the highest quality content in order to win the hearts, minds and, ultimately, the time of would-be users. But the attention web often takes a different direction.

Instead of streamlined experiences filled with quality content, we’ve seen the rise of clickbait headlines, listicles and ad-saturated UIs that are slow, cumbersome and sometimes downright unusable, especially on mobile screens.

In the attention web we end up with feeds stuffed with clickbait, and clicking through lands us on pages cluttered with auto-playing video ads and inline ads that suddenly appear mid-scroll.

The drive for attention has also influenced the way we talk about products. As designers we’re expected to make things “habit forming”. Get people “hooked”. And turn monthly “users” into daily “users”. The only other people I know who call their customers users are drug dealers.

This rhetoric has made companies more and more aggressive about pushing their agenda into our lives. Floods of emails, push notifications, text notifications, daily reminders, and weekly digests are the norm in the attention web.

We aren’t creating human-centered experiences, we are creating attention-centered experiences, which puts the needs of the business squarely ahead of the needs of the customer.

Which brings us back to Instagram.

A long time ago I picked my horse in the social media race and it was Instagram. It’s one of the few services that, in my opinion, completely nailed the intersection between human desire and the capabilities enabled by the internet. It’s the kind of product the internet was born to produce. And, as I see it, it has the potential to be around for the long haul.

The desire to preserve and share memories is uniquely human and is as old as cave drawings and the spoken word. It has always been in us and it will always be in us. The magic of Instagram is that it delivers on that innate desire in a beautifully crafted, deeply human experience. One that is so simple it becomes second nature, allowing you to co-create an interwoven story of your life and the lives happening around you, visualized, in real-time, as a stunningly artistic, chronological archive of photographs.

That is a beautiful and powerful value proposition. Something that hits on a real human need and drives real connection. It is an example of what creating a human-centered experience should ultimately be.

For a long time (surprisingly long, actually) Instagram maintained that magic, keeping the feature set small and the experience pure. That is in sharp contrast to the rest of the social world, which promises connection in a bloated tangle of features and gimmicks.

But, alas, a business has to pay the bills. And in the attention web, when the devil of revenue comes calling, the easy out is ads. Or, in the parlance of our time, “sponsored posts”.

So Instagram instituted sponsored posts. Step one. And now comes the second step: the algorithmic feed. The NYT piece states that this change won’t impact how Instagram ads are served, and it won’t; ads are already algorithmically targeted. But it changes something else. Something more fundamental.

An algorithmic feed changes the ability of influencers and brands on Instagram to reach people. It changes the ability of accounts to be discovered. Instagram is making this move just when brands are really starting to figure out how to leverage the platform. Strategically this makes sense for the business because brands have found the value of the platform, so when the new feed starts to erode that value, they’ll be more likely to stay and pay for promoted posts. The same scenario has already played out on Facebook (Instagram’s parent). And it works. Facebook is doing pretty well from a revenue perspective. And so, Instagram will continue to head down the same road. And, as has happened with other services, as the Instagram experience falls deeper and deeper into the attention web’s monetization trap it’s likely the magic will fade.

The problem is that you can’t fault Instagram, or any of the other services out there playing the attention web game. It’s we, the people of the internet, who have set the rules of engagement. We want our web and we want it for free. However, the inconvenient truth is that there is a cost to doing business, and at some point companies have to make money.

And so we sacrifice the magic. We devalue content and products by refusing to pay for the work it takes to create and maintain them. We are satisfied wading through poorly designed, ad-based experiences. And we allow our most precious resource, our time, to become a commodity to be traded, sold and manipulated. Our data is mined, our privacy discarded and our actions tracked all in the name of more targeted advertising.

And it’s not even the best scenario for companies either. In Q4 of 2015 Facebook brought in $5.9 billion in revenue with 1.59 billion monthly active users. That’s roughly $1.23 of revenue per user per month. If, in the same quarter, Facebook had moved away from ads and instead charged each active user just $1.50 a month for the service, their Q4 2015 revenue would have increased by about $1.2 billion, from $5.9 billion to roughly $7.1 billion.
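The arithmetic above can be sanity-checked with a quick back-of-the-envelope script, using only the figures already quoted in the text (reported Q4 2015 revenue and monthly active users):

```python
# Back-of-the-envelope check of the Facebook subscription arithmetic above.
q4_revenue = 5.9e9            # reported Q4 2015 revenue, USD
monthly_active_users = 1.59e9
months_in_quarter = 3

# Implied revenue per user per month under the ad model
rev_per_user_month = q4_revenue / monthly_active_users / months_in_quarter
print(round(rev_per_user_month, 2))  # → 1.24 (the article truncates to $1.23)

# Hypothetical $1.50/month subscription instead of ads
subscription_revenue = 1.50 * monthly_active_users * months_in_quarter
print(subscription_revenue / 1e9)                 # → 7.155, i.e. roughly $7.1B
print((subscription_revenue - q4_revenue) / 1e9)  # → 1.255, the ~$1.2B uplift
```

The $1.50 price point is the article's hypothetical, not a real proposal; the point is only that the quoted totals are internally consistent.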

Now, what if Facebook used that extra $1.2 billion to pay content creators for posting quality content on the platform, similar to what Netflix does? And what if other major platforms like Twitter and YouTube followed the same model? Suddenly the revenue sources for content creators start to diversify. The reliance on advertisers wanes. Feeds are no longer designed to hide creators and friends in order to drive ads; they are designed to promote connection and shine a light on creators. Bloated, ad-filled UIs start to disappear. Content loads faster. Creators develop more immersive content experiences focused on the people using them. The balance of power flips back to the user.

Maybe this is a utopian dream, and I’m sure a lot of people will say it would never work for lots of reasons. However, if we are willing to start paying for the products and services we use, we stop being the commodity and we start being the driver. And when users are the driver, companies will focus on adding value, not just grabbing our attention.

AFL-CIO calls for a break with “lesser of two evils” politics

By John Wojcik And Mark Gruenberg

Lee Saunders, Randi Weingarten, and Mark Dimondstein. | Credit: AFL-CIO and APWU

ST. LOUIS – The AFL-CIO convention here yesterday passed a political resolution that calls for a break with “lesser of two evils” politics, but it came up short on projecting a clear path to how that will be accomplished.

“The time has passed when we can passively settle for the lesser of two evils,” reads the main political resolution passed Tuesday by the AFL-CIO convention delegates. Lee Saunders, chair of the AFL-CIO’s political committee and president of AFSCME, and Randi Weingarten, president of the American Federation of Teachers, introduced the resolution. They lead the labor federation’s two largest unions. Convention managers yoked the resolution to another measure, also approved, discussing a labor party, though not by name.

“For decades the political system has failed working people,” Weingarten said. “Acting on behalf of corporations and the rich and powerful, the political system has been taking away, one after another, the pillars that support working people’s right to good jobs and secure benefits.”

The two measures, adopted October 24, followed a late Monday-evening meeting of supporters of reviving the Labor Party idea. It attracted about 50 delegates to an upstairs meeting room at the convention’s lead hotel. Their contention: Both the Democrats and the Republicans are under corporate domination.

The prime mover of a Labor Party motion at the convention, Postal Workers President Mark Dimondstein, has been calling for such a new formation since the passage of NAFTA in 1993, which he said showed both Democrats and Republicans were in the pockets of the corporate class.

Dimondstein made many of the same arguments for a Labor Party on the convention floor that he voiced in the meeting the night before, when Baldemar Velasquez of the Farm Labor Organizing Committee, Mark Dudzic of Labor’s Committee for Single Payer, and Donna DeWitt, former president of the South Carolina AFL-CIO, joined him.

Meeting participants differed over whether the nascent party should first build an organization and concentrate on issues, or get into political races, running the risk of becoming “spoilers” in the current political system, rigged in favor of the two existing parties.

“We had a vision to build a party of the working class. You have to have the labor movement at the table from the beginning,” of the effort, “or you’re building sand castles,” Dudzic explained. He was a leader in the original Labor Party effort of the late 1990s and early 2000s. Participants in the meeting agreed. “We cannot build a party of labor when the working class is in retreat,” he added. The question was how to move forward.

“We have to crawl before we walk, we have to walk before we run and we have to run before we sprint,” one attendee, Professional and Technical Engineers President Greg Junemann, said.

Velasquez contended pro-Labor Party members should participate in electoral politics, but starting at the local and state levels. But all agreed, as he put it, the Democrats “are not doing us any favors, never have and never will.”

Several unions, meanwhile, drew delegates to another meeting off the main convention floor: a meeting of Labor for Our Revolution. Seven national unions called on their member delegates to attend that Monday meeting, held across the street from the convention hall, to, in their words, continue the movement that grew out of the Sanders challenge in last year’s Democratic primaries.

The unions that called that meeting were the Amalgamated Transit Union, the American Postal Workers Union, the Brotherhood of Maintenance of Way Employes, the Communications Workers of America, the International Longshore and Warehouse Union, National Nurses United, and the United Electrical Workers. They were joined by the Massachusetts and South Carolina AFL-CIOs.

The Our Revolution organization they are backing claims more than 300 local chapters in the US and four state-wide chapters in Texas, Massachusetts, Wisconsin, and Maryland.

Former Communications Workers of America president Larry Cohen is the chair of Our Revolution and has been asking union leaders to become part of the local groups.

Rand Wilson, a leader in SEIU’s Local 888, is a steering committee member for Labor for Our Revolution. He explained at the group’s meeting here that unions getting involved in Our Revolution gives labor the ability to influence it so it has significant focus on the working class. “It also gives us the opportunity to organize our own members around broad, working class issues,” he said.

The group is pushing a number of bills in Congress, including Medicare for All and free college tuition.

The AFL-CIO is not actually pulling the plug on the Democrats, although its politics resolution, which doesn’t mention the party by name, is a clear warning to Democrats that labor support will not be taken for granted. The labor leaders who introduced the independent politics resolution, Saunders and Weingarten, are both members of the Democratic National Committee. The main resolution declared that for the 2018 elections labor would “define a pro-worker agenda…to hold as a joint standard for all officials, regardless of party.”

That resolution also commits the federation to establish a communications framework to send the agenda to unions and their allies, “prioritize year-round member-to-member communication” and greater internal organizing, mobilizing non-union workers and opposing suppression of union, young, old, female and minority-group voters.

AFL-CIO convention delegates went out into St. Louis neighborhoods to door-knock for a Right to Work education campaign. | AFL-CIO

The Labor Party supporters at the convention made the argument that the massive grassroots mobilization for the Democratic presidential primary candidacy of Sen. Bernie Sanders, I-Vt., and the later GOP triumph of Donald Trump – powered in part by defecting working class voters in key Great Lakes states – “showed the working class is done with the status quo.” Dimondstein made that point in both the upstairs session and on the convention floor. Sanders drew tens of thousands of union volunteers and many more union voters.

“We can’t take half a loaf, a quarter of a loaf, an eighth of a loaf, or even crumbs,” Dimondstein added.

He received applause when he pointed out on the convention floor that even when the Democrats gained total control of the presidency and Congress, in the 2008 election, they not only didn’t follow through on labor law reform and other top progressive and worker priorities, but instead produced the Trans-Pacific Partnership “free trade” pact and similar measures. “The Democratic Party was not delivering anything,” he said, “even when it had control of the White House, the Congress and the Senate.”

The Republicans entrenched union-busting, Democratic President Bill Clinton deregulated Wall Street, and Democratic President Jimmy Carter deregulated trucking, Dimondstein said.

Constructing a Labor Party, Dimondstein admitted, will be a long-range project and needs both community and labor support. “What would be wrong would be to confine this movement” for a Labor Party “to the institution of the two-party system.”

“Continuing to follow the same model, expecting a different result, is not a solution,” a delegate from Vermont said in the meeting.

Number of U.S. workers employed by foreign-owned companies is on the rise


Foreign-owned companies employed 6.8 million workers in the United States in 2015, up 22% from 2007, according to preliminary data from the U.S. Bureau of Economic Analysis. The increase is notably larger than overall U.S. private employment growth, which was 3.6% over the same span.

Among foreign enterprises, British-owned companies employed the highest number of U.S. workers in 2015 (around 1.1 million), followed by companies with majority ownership in Japan (around 856,000) and France, Germany and Canada (each over 600,000). These five countries alone accounted for a majority (58%) of U.S. employment by foreign-owned enterprises in 2015 and have made up the top five since at least 2007, the earliest year for which comparable data are available.

Overall, foreign-owned companies accounted for 5.5% of all U.S. private sector employment in 2015, up from 4.7% in 2007. This analysis counts full- and part-time employees of foreign multinational enterprises’ U.S. affiliates (such as corporate branches) that were majority-owned by their foreign parents in 2015, the most recent year available. The BEA provides country-level data for 41 countries and territories, as well as broader regional and global totals.

For British-owned enterprises, transportation and hospitality and food services were among the top-employing industries in the U.S. Jobs in manufacturing (especially transportation) and wholesale trade stood out for Japan.

While companies majority-owned in the UK and Japan have led the pack for foreign direct investment, U.S. employment by Chinese-owned companies has grown dramatically since 2007, showing by far the largest percentage increase in employment. Chinese enterprises employed nearly 44,000 U.S. workers in 2015, more than 30 times the 2007 figure. By comparison, companies owned in New Zealand (the second-fastest growth in U.S. employment) employed nearly five times as many U.S. workers in 2015 as in 2007 – paling in comparison with China.

While Canada is consistently a top employer among U.S. affiliates, employment by companies from the U.S.’s southern neighbor is rising quickly. Mexican-owned companies had nearly 80,000 U.S. workers on their payrolls in 2015, an 82% increase over 2007.

What foreign enterprises pay their U.S. workers

As the country with the most U.S.-affiliate employees, the UK also paid the most in total compensation ($84.9 billion) in 2015. Japan, the second-highest employer of U.S. affiliates, followed at $72.2 billion.

Foreign-owned companies overall paid $539.1 billion in total compensation to U.S. workers in 2015 – around $79,000 per employee. This figure includes all income received for work, including employer spending for employee benefit plans. Compensation by foreign companies was somewhat higher than the average among all U.S. private industry employers in 2015, which was around $63,600 per employee, according to Bureau of Economic Analysis data.
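The per-employee figure follows directly from the two aggregates quoted above; a one-line check (using only the totals stated in the text) confirms it:

```python
# Reproduce the compensation-per-employee figure from the quoted BEA totals.
total_compensation = 539.1e9  # USD paid by foreign-owned companies, 2015
us_workers = 6.8e6            # U.S. employees of foreign-owned companies, 2015

per_employee = total_compensation / us_workers
print(round(per_employee))  # → 79279, i.e. "around $79,000" per employee
```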

Saudi Arabian companies had the highest compensation per employee at around $163,000, more than double the figure for all foreign-owned companies. However, Saudi Arabian companies have consistently employed far fewer U.S. workers than other U.S. affiliates – British companies, for example, employed over 100 times as many in 2015.

Seven other countries’ U.S. affiliates had an average compensation per worker of more than $100,000: Bermuda, Norway, Venezuela, South Africa, Israel, Finland and Ireland; Switzerland’s enterprises paid just under six figures per employee on average. Of these countries, only Irish- and Swiss-owned companies employed more than 100,000 workers.

Despite the recent surge in U.S. employment by companies majority-owned in China, compensation per worker by Chinese enterprises trended downward from 2007 and ranked near the bottom of countries and territories analyzed for 2015 at around $49,000 per employee.

Some states have more foreign direct investment than others

U.S. affiliates of foreign-owned companies employ people in all 50 states and in U.S. territories. In 2015, the states with the largest shares of total private industry employment by foreign-owned companies were New Jersey (8.1%), South Carolina (8.0%) and New Hampshire (7.7%), followed by Kentucky, Indiana, Hawaii, Connecticut and Delaware (each more than 7%). The states with the largest populations in general also had the most employees who worked for foreign-owned companies: California, Texas, New York, Florida and Illinois. California led, with 715,800 such workers.

Rising contribution to U.S. GDP

Foreign-owned companies contributed $894.5 billion to the U.S. gross domestic product in 2015, a 16% increase from 2007 and 6.4% of the total contribution to GDP by U.S.-based private industry that year. While 2007 is the earliest comparable year for these data, it also marks the start of the Great Recession, which appears to have had a larger impact on U.S. affiliates of foreign enterprises than on U.S. private businesses.

During the 2007-2009 recession, the contribution to the GDP by U.S. affiliates declined at an average rate of 6.7%, compared with a 1.6% average decline for U.S. private businesses. After the recession, GDP contributions rose faster for U.S. affiliates than for U.S. private businesses. Between 2009 and 2014, U.S. affiliate contributions to the GDP increased 8.3% on average, nearly twice the rate of U.S. private businesses.
Rubén Weinsteiner