Thursday, September 21, 2023

Republicans view Reagan, Trump as best recent presidents

Rubén Weinsteiner

When asked to name the United States president who has done the best job over the past 40 years, a majority of Democrats name Barack Obama. Republicans, by contrast, are divided between a president who served in the 1980s – Ronald Reagan – and one who is currently running to return to office, Donald Trump.


About four-in-ten Republicans and Republican-leaning independents (41%) say Reagan has done the best job as president over the past 40 years. Slightly fewer (37%) say Trump has done the best job, according to a MARCA POLITICA Center survey conducted in July.

Nearly six-in-ten Democrats and Democratic leaners (58%) say Obama has done the best job as president in the past 40 years. Far fewer name Bill Clinton (19%) or Joe Biden (7%), who is running for reelection in 2024.

In the last four decades, four Republicans and three Democrats have served as president. Among U.S. adults overall, 32% say Obama has done the best job during this period, followed by Reagan (23%), Trump (19%) and Clinton (12%). Relatively small shares name Biden, George W. Bush or George H.W. Bush (4% or less for each).

Americans’ views of which presidents have done the best job in the past 40 years are largely unchanged since a September 2021 Center survey. The new survey was conducted after Trump was indicted in federal court in Florida on charges related to improper handling of classified documents, but before indictments charging him with attempts to overturn the 2020 election were returned in federal court in Washington, D.C., and in state court in Georgia.
Republicans’ views of the best recent president

Republicans’ opinions of who has done the best job as president over the past four decades vary by race and ethnicity, age, and other demographics.

Comparable shares of White (37%) and Hispanic Republicans (43%) say Trump has done the best job as president. But White Republicans are more likely than Hispanic Republicans to name Reagan (45% vs. 26%). And about two-in-ten Hispanic Republicans (21%) say a Democratic president did the best job over the past 40 years, while a far smaller share of White Republicans (8%) say this.

Black and Asian Republicans make up much smaller shares of the public; their responses cannot be reported separately due to insufficient sample sizes.

Roughly half of Republicans ages 50 and older (51%) say Reagan has done the best job of any recent president, compared with 29% of those under 50. There are no sizable age differences in the shares of Republicans who name Trump. While relatively small shares of Republicans in all age groups name Democratic presidents, those under 50 are more likely to do so than those 50 and older (19% vs. 6%).

Among Republicans who have not completed a bachelor’s degree, comparable shares name Trump and Reagan as the top recent presidents (41% vs. 37%). However, Republicans with at least a bachelor’s degree choose Reagan over Trump by a wide margin (51% vs. 27%).
Who has been the second-best recent president?

Among Republicans who name Reagan or Trump as the best recent president, there are sizable differences in their choices for the second-best president. Among Republicans who name Trump as the best president of the past 40 years, two-thirds say Reagan is the second best, while 17% name another Republican and 14% name a Democrat. However, among those who choose Reagan as the best, views of the second-best president are more varied: 55% say Trump, while 31% name another Republican and 13% name a Democrat.
Democrats’ views of the best recent president

Among Democrats, majorities across most demographic groups view Obama as the best recent president. Still, there are some differences by age and by race and ethnicity.

While half or more Democrats in all age groups name Obama as the best recent president, younger Democrats are particularly likely to say this. About two-thirds of Democrats ages 18 to 29 (68%) choose Obama, compared with 57% of those ages 30 to 49 and about half (52%) of those ages 50 and older.

While 64% each of Black and Asian Democrats name Obama as the best recent president, smaller shares of White (56%) and Hispanic Democrats (51%) say this. Hispanic Democrats are more likely than those in other racial or ethnic groups to name a GOP president as the best: A quarter name a Republican, including 16% who name Reagan and 3% who name Trump.

 

Rubén Weinsteiner

Tuesday, September 12, 2023

Republican Gains in 2022 Midterms Driven Mostly by Turnout Advantage



Rubén Weinsteiner

An examination of the 2022 elections, based on validated voters

We conducted this study to better understand which voters cast ballots in the 2022 midterm elections and how they voted. We also wanted to compare how turnout and vote choices differed from previous elections in 2020, 2018 and 2016. Measuring turnout among different groups in the electorate is challenging; it is particularly difficult to assess changes in turnout from election to election.

Panel data provides a unique opportunity to study elections. By surveying the same people over time, we can more clearly see how differences in who stays home – and who turns out to vote – impact each election. We can also measure how adults’ partisan voting preferences change (or do not change) between elections.

For this study, we surveyed U.S. adults online on our nationally representative American Trends Panel (ATP). We verified their turnout in the four general elections using commercial voter files that aggregate publicly available official state turnout records. Panelists who said they voted and for whom a voting record was located are considered validated voters; all others are presumed not to have voted.
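
As a rough illustration of that rule, the sketch below (in Python, with hypothetical field names, not the Center’s actual pipeline) counts a panelist as a validated voter only when a self-report of voting is paired with a matched turnout record, and treats everyone else as a presumed nonvoter.

```python
# Illustrative sketch, not the Center's actual pipeline: a panelist counts as a
# validated voter only when a self-report of voting is paired with a matched
# turnout record in a commercial voter file. Field names are hypothetical.

def classify_turnout(said_voted: bool, file_record: dict | None) -> str:
    """Classify one panelist for one election as a validated voter or presumed nonvoter."""
    if said_voted and file_record is not None and file_record.get("voted_2022"):
        return "validated voter"
    # No self-report, no matched record, or no turnout flag: presumed nonvoter.
    return "presumed nonvoter"

panelists = [
    {"id": 1, "said_voted": True,  "record": {"voted_2022": True}},
    {"id": 2, "said_voted": True,  "record": None},                   # no file match
    {"id": 3, "said_voted": False, "record": {"voted_2022": True}},   # record, no self-report
]

for p in panelists:
    print(p["id"], classify_turnout(p["said_voted"], p["record"]))
```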

Additionally, we revised our statistical approach for the 2020 survey. That produced new results that slightly changed the numbers we reported about the 2020 election but changed no substantive findings in our report.


American Trends Panel: MARCA POLITICA's online probability survey panel, which consists of more than 12,000 adults who take two to three surveys each month. Some panelists have been participating in surveys since 2014.

Defectors/Defection: People who either switch their vote to a different party’s candidate from one election to the next, or those who in a given election do not support the candidate of the party they usually support. Also referred to as “vote switching.”

Drop off/Drop-off voters: People who vote in a given election but not in a subsequent election. The term commonly refers to people who vote in a presidential election but not in the next midterm. It can also apply to any set of elections.

Midterm elections: General elections held in all states and the District of Columbia in the even-numbered years between presidential elections. All U.S. House seats are up for election every two years, as are a third of U.S. Senate seats (senators serve six-year terms).

Mobilize: Efforts by candidates, political campaigns and other organizations to encourage or facilitate eligible citizens to turn out to vote.

Nonvoter: Citizens who didn’t have a record of voting in any voter file or told us they didn’t vote.

Panel survey: A type of survey that relies on a group of people who have agreed to participate in multiple surveys over a time period. Panel surveys make it possible to observe how individuals change over time because the answers they give to questions in a current survey can be compared with their answers from a previous survey.

Party affiliation/Party identification: Psychological attachment to a particular political party, either thinking of oneself as a member of the party or expressing greater closeness to one party than another. Our study categorizes adults as Democrats or Republicans using their self-reported party identification in a survey.

Split-ticket voting/Straight-ticket voting: Voters typically cast ballots for more than one office in a general election. People who vote only for candidates of the same party are “straight-ticket” voters, while those who vote for candidates of different parties are “split-ticket” voters.

Turnout: Refers to “turning out” to vote, or simply “voting.” Also used to refer to the share of eligible adults who voted in a given election (e.g., “The turnout in 2020 among the voting eligible population in the U.S. was 67%”).

Validated voters/Verified voter: Citizens who told us in a post-election survey that they voted in the 2022 general elections and have a record for voting in a commercial voter file. (The two terms are interchangeable).

Voter file: A list of adults that includes information such as whether a person is registered to vote, which elections they have voted in, whether they voted in person or by mail, and additional data. Voter files do not say who a voter cast a ballot for. Federal law requires states to maintain electronic voter files, and businesses assemble these files to create a nationwide list of adults along with their voter information.

In midterm elections that yielded mixed results for both parties, Republicans won the popular vote for the U.S. House of Representatives largely on the strength of higher turnout.

A new Pew Research Center analysis of verified voters and nonvoters in 2022, 2020, 2018 and 2016 finds that partisan differences in turnout – rather than vote switching between parties – account for most of the Republican gains in voting for the House last year.

Overall, 68% of those who voted in the 2020 presidential election turned out to vote in the 2022 midterms. Former President Donald Trump’s voters turned out at a higher rate in 2022 (71%) than did President Joe Biden’s voters (67%).

For additional analysis of voter turnout in the 2022 election, refer to Chapter 1 of this report.
Large majority of voters stuck with 2020, 2018 party preference in their 2022 vote choices

As in previous elections, party loyalty remained strong in last fall’s midterms.

Relatively small shares of voters defected from their partisan affiliation or 2020 presidential vote. Among those who voted both for president in 2020 and for a House representative in 2022, just 6% crossed party lines between elections or voted for third-party candidates in either election.

Similarly, the vast majority of those who voted in both 2018 and 2022 had consistent party preferences across the two elections: 95% of those who voted for a Republican candidate in 2018, and 92% of those who voted for a Democrat, voted for a House candidate of the same party four years later.

Democratic 2018 voters were slightly more likely than Republican 2018 voters to defect in 2022, with the net effect of shifting the party balance by 1 or 2 percentage points toward the GOP.

That is a potentially impactful shift in an environment of very close elections, but the greater driver of the GOP’s performance in 2022 was differential turnout: higher turnout among those supporting Republican candidates than those supporting Democratic candidates.

Given sharp political divisions in the United States, small changes in voter turnout from election to election have big consequences. Political polarization has meant that most people who vote in midterm elections are committed politically, making it unlikely they would defect from their partisan affiliation.

Shifts in turnout, as opposed to defections, were responsible for most of the changes in vote margins from the 2018 midterms within most subgroups in the population. For example, the Democratic advantage among women dropped from 18 points in 2018 (58% Democratic, 40% Republican) to just 3 points in 2022 (51% and 48%, respectively).

But when looking only at women who voted in both elections, there is no net advantage for either party from defections: 6% of those who voted Democratic in 2018 flipped to vote for a Republican candidate in 2022, and a nearly identical share of women who voted Republican in 2018 voted for a Democratic candidate in 2022 (5%).

Virtually all of the decline in the Democratic advantage among women is explained by the fact that the 2022 turnout rate for women who voted Republican in 2018 was 8 points higher than the rate for women who voted Democratic that year (84% vs. 76%).
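
A back-of-the-envelope calculation using only the figures reported above for women illustrates the point. It is my own sketch, not the Center’s exact decomposition; it covers only repeat voters (ignoring women who voted in 2022 but not in 2018), and because the published percentages are rounded, the small defection term is only approximate.

```python
# Sketch using the reported figures for women: a 58%-40% Democratic lead in 2018,
# 2022 turnout of 76% among 2018 Democratic voters vs. 84% among 2018 Republican
# voters, and defection rates of 6% and 5%. Repeat voters only.

d18, r18 = 58.0, 40.0               # 2018 House vote among women (per 100 voters)
turnout_d, turnout_r = 0.76, 0.84   # 2022 turnout among 2018 D and R voters
defect_d, defect_r = 0.06, 0.05     # share switching parties in 2022

d_repeat = d18 * turnout_d          # 2018 D voters who also voted in 2022
r_repeat = r18 * turnout_r          # 2018 R voters who also voted in 2022

d22 = d_repeat * (1 - defect_d) + r_repeat * defect_r
r22 = r_repeat * (1 - defect_r) + d_repeat * defect_d

margin_2018 = d18 - r18
margin_turnout_only = 100 * (d_repeat - r_repeat) / (d_repeat + r_repeat)
margin_with_defections = 100 * (d22 - r22) / (d22 + r22)

print(f"2018 margin among women:                D +{margin_2018:.0f}")
print(f"Repeat voters, turnout shift only:      D +{margin_turnout_only:.1f}")
print(f"Repeat voters, turnout plus defections: D +{margin_with_defections:.1f}")
```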

There were a few important exceptions to this general rule.

For example, more rural voters changed their vote from a Democratic to a Republican candidate between 2018 and 2022 than the reverse. The Republican margin among this group nearly doubled between 2018 and 2022 (from 21 points to 40 points). Among rural voters, Republican candidates in 2022 held on to 97% of those who voted Republican in 2018, while Democratic candidates held on to a smaller share (91%).

And among White voters with no college degree, Republicans benefited from slightly higher rates of defection from Democratic candidates among those who voted in both elections.

Chapter 2 of this report features detailed breakdowns of voting patterns across the electorate.
‘Drop-off’ voters contributed to Republican House gains

Collectively, Republican candidates for the House received roughly 51% of the total vote last fall compared with 48% for Democratic candidates. This helped the Republican Party gain a narrow majority in the House. Democrats retained control of the Senate. While Republicans exceeded expectations in a few states – notably New York and Florida – pre-election predictions of a “red wave” failed to materialize.

However, the broad outcome of the elections in much of the country was shaped largely by the underlying political makeup of the 2022 voters and how they differed from the voters of 2020 and 2018.

Midterm voters tend to be older, more educated and more affluent than those who vote just in presidential election years, a pattern apparent in both 2018 and 2022. The two elections also had something else in common: The president’s party suffered more “drop-off” voters than did the opposing party.

People who voted in 2018 but did not turn out in 2022 (“drop-off” voters) had favored Democrats in 2018 by about two-to-one (64% to 33%). Similarly, about a third of 2020 voters (32%) did not turn out in 2022; this group had favored Joe Biden, 53% to 43%. The absence of these 2020 Biden voters resulted in a worse performance for Democratic candidates in 2022.

The drop-off voters mattered but so, too, did voters who turned out in 2022 but not in earlier elections – and these voters also helped Republican candidates. Those voting in 2022 included 21% who had not voted in 2018. This group supported Republican candidates in 2022 by a margin of 58% to 40%.

National polling data, especially when based on interviews conducted over time with the same individuals, can shed light on these dynamics. But there are limitations with national data, given that midterms are state and local elections. Partisan defections and split-ticket voting were critically important to the success of individual candidates for U.S. Senate and governor. These defections tended to benefit Democratic candidates more often than Republican candidates, even when national turnout trends mostly benefited Republican candidates.

This study is based on surveys of members of the Center’s American Trends Panel following the last four general elections (2016-2022). Voter turnout in each election was verified by a comparison with official records.

Some of the analysis focuses on a subset of 7,041 panelists interviewed post-election in 2022 for whom reliable measures of voter turnout and candidate choice were also available for the 2018 and 2020 elections. This allowed us to analyze how individuals’ voting preferences changed over time, separating the political consequences of changes in party preferences from changes in who turned out in each election. (All analysis that considers individual-level changes in turnout or vote preference excludes the 2016 dataset, due to diminishing sample sizes among those who were in the panel across multiple elections.)
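
The sketch below illustrates the kind of individual-level classification this panel design makes possible; the field names and categories are hypothetical simplifications, not the study’s actual code.

```python
# Illustrative sketch: separating who turned out from who switched parties,
# given validated turnout flags and self-reported House vote choices for each
# panelist in 2018 and 2022 (hypothetical field names).

def classify(p: dict) -> str:
    if p["voted_2018"] and not p["voted_2022"]:
        return "drop-off voter"
    if not p["voted_2018"] and p["voted_2022"]:
        return "new 2022 voter"
    if not p["voted_2018"] and not p["voted_2022"]:
        return "nonvoter in both"
    # Voted in both midterms: compare self-reported House vote choices.
    if p["house_2018"] == p["house_2022"]:
        return f"loyal {p['house_2022']} voter"
    return f"defector ({p['house_2018']} -> {p['house_2022']})"

panel = [
    {"voted_2018": True,  "voted_2022": False, "house_2018": "D",  "house_2022": None},
    {"voted_2018": True,  "voted_2022": True,  "house_2018": "R",  "house_2022": "R"},
    {"voted_2018": True,  "voted_2022": True,  "house_2018": "D",  "house_2022": "R"},
    {"voted_2018": False, "voted_2022": True,  "house_2018": None, "house_2022": "R"},
]

for person in panel:
    print(classify(person))
```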

Other key findings from the study:
  • Voters under 30 continued to strongly support the Democratic Party, voting 68% to 31% for Democratic candidates. But this margin was somewhat narrower than in 2018. Republicans benefited from a significant drop-off in voter turnout among younger age groups between 2018 and 2022, since young voters tend to support Democrats. Voters under 30 accounted for 10% of the electorate in 2022 – similar to their share of all voters in 2018 (11%), but down from 2020 (14%). To learn more about voter demographics, such as age, race and ethnicity, religion and community type, refer to Chapter 3 of this report.
  • Ideological polarization by party was nearly complete in 2022: Only 1% of self-described conservative Republicans voted for Democratic House candidates, and less than 1% of liberal Democrats voted Republican.
  • Voting in person on Election Day increased sharply in 2022 compared with 2020. More voters in both parties reported casting ballots in person on Election Day, but the share remained much higher among Republican voters (51%) than among Democratic voters (34%).
  • White voters without college degrees made up a majority (54%) of Republican voters in 2022, compared with 27% of Democratic voters. Yet these voters made up a somewhat greater share of GOP voters in 2020 (58%) and 2018 (57%).
  • Voters ages 50 and older were a larger share of the total in 2022 (64%) than in any of the past three elections. Some 70% of Republican voters were 50 or older, as were 57% of Democratic voters.
  • Hispanic voters continued to support Democrats, but by a much smaller margin than in 2018: Hispanic voters favored Democratic candidates by a 21-point margin in 2022, compared with a 47-point margin in 2018. This change was driven by asymmetric changes in voter turnout among Hispanic adults, rather than changing preferences among individual Hispanic voters.
  • Black voters continued to support Democrats by overwhelming margins: 93% voted for Democrats in the midterms while 5% supported Republicans. This is similar to levels of support in 2020, 2018 and 2016. Black voters made up 9% of the electorate in both 2022 and 2018 and 11% of the electorate in 2020.
  • The Republican advantage among White evangelical Protestants was somewhat larger in 2022 than in the past three elections: 86% supported Republican candidates in 2022 and only 12% voted Democratic.

 

Rubén Weinsteiner

Thursday, June 15, 2023

Life on Social Media Platforms, in Users’ Own Words

 

Rubén Weinsteiner

In focus groups, highly engaged social media users describe the purposes that different platforms serve for them, their choices about what to reveal and how they try to anticipate any hostile reactions that could be lurking

Behind each sweeping exploration of the role social media plays in society stand the unique stories of Americans and their online lives. People bring deeply personal needs to social media, and their experiences play out in deeply personalized ways, tied to the platforms and communities they are part of. These platforms can host nearly any imaginable human encounter or emotion, from powerful self-expression and deep connection to intense hostility and ruinous deceit.

To gain insight into people’s experiences and the platform environments that shape them, Pew Research Center conducted a series of five focus groups from July 11 to 13, 2022. They were designed to capture how the participants – all of whom were especially engaged on social media platforms – might navigate the complexities of their online worlds.

The discussions shed light on topics that are difficult to cover with surveys alone: How do people create the social media environment they hope to enjoy? What choices and calculations do they make about what to reveal, where to reveal it and who might be watching? How do the platforms that companies provide shape the experiences they have?

The views of these 23 U.S. adults – called “highly engaged users” throughout this report as shorthand – are not representative of all social media users or other populations. These individuals:

  • Used multiple platforms frequently: They said they used at least three social media sites and apps, each at least a few times a week;
  • Frequently shared things or commented when using social media: They said they frequently used social media to share things about themselves, share things other people have posted or comment on others’ content – either almost every time they used it or often;
  • Found posting on social media important for self-expression: They said social media was extremely or very important to them in this way.

No single “social media experience” emerged from their stories – instead, participants’ accounts were nuanced and unique. Still, common themes arose from the group discussions, each connecting diverse experiences across platforms.

Their stories highlight the ways navigating social media can both enrich and complicate people’s lives. Together, they form a detailed snapshot of what life on platforms looked like for these highly engaged users – grounded in the personal experiences of the people behind the screens and the platforms that shaped these experiences. (It should be noted that the groups took place before changes in Twitter’s ownership and as debates about TikTok were just heating up.)

This report describes findings from five live, online focus groups with a total of 23 U.S. adults, conducted from July 11 to 13, 2022. Pew Research Center worked with SSRS to conduct the groups, which were designed to capture the experiences of people who are especially engaged on social media platforms.

All of them were recruited from the SSRS Opinion Panel. To be eligible, they had to meet the following criteria making up the definition of a “highly engaged user” used in this report:

  • Used multiple platforms frequently: They said they used at least three social media sites and apps, each at least a few times a week;
  • Frequently shared things or commented when using social media: They said they frequently used social media to share things about themselves, share things other people have posted or comment on others’ content – either almost every time they used it or often;
  • Found posting on social media important for self-expression: They said social media was extremely or very important to them in this way.

For more details on other eligibility criteria, recruitment and group composition, read the Methodology.

Center researchers developed a recruitment screener and discussion guide with assistance from SSRS, who partnered with InsideOut Insights (IOI) to conduct and moderate the groups.

While these groups are not representative of any broader population, participants were selected in order to achieve a mix of demographic characteristics (e.g., age, gender, race and ethnicity, education, urbanicity and political party). Four groups were organized by either race and ethnicity or political party, while the other did not have additional demographic criteria. The discussion guide was the same for all groups.

Center researchers observed the focus groups and reviewed both the recordings and the transcripts from these groups to identify key themes and quotes. This report is meant to illustrate the variety of views and experiences of focus group participants, not the frequency with which these views and experiences came up. Views expressed by participants have not been fact-checked and are not representative of the overall experiences of any of these groups in the U.S. population. Quotations have been lightly edited for grammar and clarity.

Here are the key takeaways from the focus groups, featuring participants’ quotations (lightly edited for grammar and clarity) related to each of these themes.

What? Where? When? Using different platforms for different purposes

Our surveys have long shown that some people use a variety of online platforms – some of them very frequently – and that these are places for everything from staying in touch with loved ones to navigating contentious conversations. In the focus groups, two highly engaged users described how their use of platforms can make up the puzzle pieces of a highly customized online life.

“Twitter is more serious for me. Snapchat is a playground. We just get on there and post a bunch of goofy, great filters. … Then Facebook is more of a neighborhood or a village in a way. You can create your own set of, I guess, people that understand you. …  Instagram … is a picture book. … [Twitter is] politics and world events. I get a lot of news on my Twitter.”

– Woman, 20s

“Facebook is just mainly to just get in contact with my mom in Messenger. … YouTube is more entertainment. I also use it to learn more things because there’s just a bunch of information on YouTube, which you also have to be careful because a bunch of that is misinformation. And then TikTok is also mainly for entertainment for me. … For Instagram, it’s more friends and Facebook is more family.” 

– Woman, 30s

In the discussions, participants described how platforms served different purposes for them and helped them connect with different audiences. One participant described looking for specific things on TikTok, while another used it mostly to disconnect from life’s pressures; another turned to Twitter to follow politics; still others mentioned using platforms for entertainment or as a way to find solutions to problems they were dealing with.

Several highly engaged users described Facebook as a place to connect and interact with others, and one woman discussed how she uses different platforms for different groups of people in her life. A range of platforms came up throughout the discussions, including some beyond those the research initially set out to explore.


A recurring high note for some participants in the focus groups related to finding community, support and connection on social media platforms. Two participants described ways that interactions on platforms both surprised them and added value to their lives – allowing them to connect with people both online and offline.

“I like Facebook because of the different groups that you can join. … And it lets you know that you’re not alone on whatever it is that you’re experiencing. You’re not on the earth by yourself. … It lets you know that even though you may be facing something, you may see it in a group or see it on a post that somebody else … overcame it and it lets you know that you can too.”

– Woman, 50s

“I put a post out on … both Nextdoor and Facebook, and I was shocked at the outpouring of help I got. People said, ‘Hey, you don’t live that far from me. I’m a notary. I’d actually be happy to stop by and help you.’ … It was nice. It was a real good experience with social media. It was nothing but positive.” 

– Man, 50s

One man described how TikTok helped him to feel less alone amid the pandemic; another used Nextdoor to tap into the local community. When it comes to keeping in touch with people who matter to them, one woman discussed how Facebook served as a way to find connection and maintain long-distance relationships. Another user, adding his experience on a streaming platform, described how connection could have both upsides and downsides.

How much of the ‘real me’? Navigating authenticity and self-expression 


These highly engaged users described vastly different approaches to self-expression on social media, ranging from full authenticity to being highly reserved to aspiring to present their best selves. Participants discussed how forthcoming they felt they could be on platforms, what they want people to take away from their social media presence and how this connects with their offline self.

Two participants highlighted the extremes of a continuum when it comes to how much their social media presence reflects who they are offline – from putting everything out there confidently to drawing a sharp distinction between the “real” self and the social media self.

“I truly feel I’m the same person on social media and when you meet me. I’m definitely one of those people who pride myself in that. … [I use] the same tone, the same attitude, the same emotions, everything is the same for me.”

— Woman, 30s

“I definitely don’t think anyone would ever know the real me from social media, like probably 25% [of] me.” 

— Man, 40s

Some participants said they like being an open book and “living out loud”; others were more reluctant or selective. One user described keeping things “vague,” while others talked about letting their personality shine through or spreading positivity. Especially in one group, several women described using social media platforms to call attention to injustice and stand firm in their views.

Who’s out there? Tailoring social media posts to different audiences in different places


For some participants in the focus groups – all of whom used multiple platforms frequently – their decisions about what to reveal and how much to share were tied to specific platforms and who would see what they posted on each. Some saw certain places as more suitable than others for some types of content, and made their decisions about what to post accordingly – for example, using multiple platforms to control who saw what. Still others said they only share things in places they feel are more private.

“I think on social media it’s more curated. I show what I want people to see. I mean, depends on what social media. Like Facebook, it’s very like prim and proper … [on] Instagram … I can go more into my political views, my social views and things like that.”

— Woman, 20s

“I use Instagram versus Facebook to separate friends and family. [I put] stuff I don’t want my family to see on Instagram and stuff I don’t want my friends to see on Facebook. And then my getaway from it all would be YouTube just [to] learn about random stuff.”

— Man, 20s

“I like privacy. … I don’t post a lot of things that I don’t want people to know. And I feel like I just want to share that with my family. That’s why I created a special group with them.”

— Woman, 20s

In deciding where to post, several participants described their calculations about how “public” certain platforms seem – even as the platforms offer a variety of privacy settings – and took note of who might follow them on each site. For example, one man in his 20s described feeling more cautious about Facebook, where “everybody’s watching,” while Instagram allowed him to take advantage of the fact that he could share things with less permanence.

Exposed and at risk: Anticipating possible attacks


In recent years, our survey research has explored why some people are reluctant to post on social media about political and social issues, how the public views cancel culture, and whether people think viewpoints are censored on social media.

The focus groups provided insight into risk calculations people might make as they navigate contentious environments, as well as the broader consequences users thought could arise from posting in a relatively public forum. Participants’ worries included concerns around being “screenshotted”; feeling like they have a target on their back; fearing for their reputation; and wondering if what they say might get them banned from a platform. Some of them described these concerns in response to questions about posting political views specifically:

“I don’t [share about political or social issues] online. If I do anything online, it would be … a thumbs up or a dislike, and that’s it. I would just keep it at that, because posting something would just turn into an argument, and I just don’t have the time and energy to continue, because it’s a losing battle anyway.” 

— Man, 40s

“People screenshot [things others say]. And take the private [conversation] out of private and make it public. It’s not safe [to express political views].”

— Woman, 20s

“[I think that] Facebook … [has] gotten so much worse with [banning you or locking your page]. … They will remove your post before they even let you know that you’ve made a mistake. And then they won’t even halfway give you a chance to kind of plead your case.”

— Woman, 30s

One woman discussed being verbally attacked on Facebook, while another described how the Nextdoor communities she was a part of differed based on where they were. Other participants talked about seeing some platforms as particularly combative, or the repercussions they might face in terms of their reputation or from the platforms. Some also talked about the possibility of being monitored or that social media posts could be used against them.  

Listen, CEO: Changing social media for the better


To close out the focus groups, participants were asked how they would want those who run social media companies to troubleshoot some of the problems they see: If a CEO of a social media company you use were sitting with us right now, what would you tell them to change to make their platform better for you and the people you know?

Some responses covered issues from what speech is allowed on the platforms to prioritizing user wellness and protecting younger users.

“I would say [to Facebook’s CEO and others, I want] … a stronger stance combating authoritarian views and support.”

– Man, 50s

“Give me a time limit. … Once I reach my time limit, lock me out.” 

– Woman, 50s

“I would say everyone who is on social media needs to [meet] an age requirement.”

– Woman, 30s

Other users provided different takes on these themes – some directed at particular CEOs, others at social media platforms broadly.

Sunday, May 28, 2023

How MARCA POLITICA Center will report on generations moving forward

 


Rubén Weinsteiner

Journalists, researchers and the public often look at society through the lens of generation, using terms like Millennial or Gen Z to describe groups of similarly aged people. This approach can help readers see themselves in the data and assess where we are and where we’re headed as a country.
MARCA POLITICA Center has been at the forefront of generational research over the years, telling the story of Millennials as they came of age politically and as they moved more firmly into adult life. In recent years, we’ve also been eager to learn about Gen Z as the leading edge of this generation moves into adulthood.
But generational research has become a crowded arena. The field has been flooded with content that’s often sold as research but is more like clickbait or marketing mythology. There’s also been a growing chorus of criticism about generational research and generational labels in particular.
Recently, as we were preparing to embark on a major research project related to Gen Z, we decided to take a step back and consider how we can study generations in a way that aligns with our values of accuracy, rigor and providing a foundation of facts that enriches the public dialogue.
We set out on a yearlong process of assessing the landscape of generational research. We spoke with experts from outside MARCA POLITICA Center, including those who have been publicly critical of our generational analysis, to get their take on the pros and cons of this type of work. We invested in methodological testing to determine whether we could compare findings from our earlier telephone surveys to the online ones we’re conducting now. And we experimented with higher-level statistical analyses that would allow us to isolate the effect of generation.
What emerged from this process was a set of clear guidelines that will help frame our approach going forward. Many of these are principles we’ve always adhered to, but others will require us to change the way we’ve been doing things in recent years.
Here’s a short overview of how we’ll approach generational research in the future:
We’ll only do generational analysis when we have historical data that allows us to compare generations at similar stages of life. When comparing generations, it’s crucial to control for age. In other words, researchers need to look at each generation or age cohort at a similar point in the life cycle. (“Age cohort” is a fancy way of referring to a group of people who were born around the same time.)
When doing this kind of research, the question isn’t whether young adults today are different from middle-aged or older adults today. The question is whether young adults today are different from young adults at some specific point in the past.
To answer this question, it’s necessary to have data that’s been collected over a considerable amount of time – think decades. Standard surveys don’t allow for this type of analysis. We can look at differences across age groups, but we can’t compare age groups over time.
Another complication is that the surveys we conducted 20 or 30 years ago aren’t usually comparable enough to the surveys we’re doing today. Our earlier surveys were done over the phone, and we’ve since transitioned to our nationally representative online survey panel, the American Trends Panel. Our internal testing showed that on many topics, respondents answer questions differently depending on the way they’re being interviewed. So we can’t use most of our surveys from the late 1980s and early 2000s to compare Gen Z with Millennials and Gen Xers at a similar stage of life.
This means that most generational analysis we do will use datasets that have employed similar methodologies over a long period of time, such as surveys from the U.S. Census Bureau. A good example is our 2020 report on Millennial families, which used census data going back to the late 1960s. The report showed that Millennials are marrying and forming families at a much different pace than the generations that came before them.
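A minimal sketch of that “same life stage, different survey years” comparison, using pandas on toy long-format data (the column names are hypothetical; real analyses would draw on decades of comparable data such as census microdata), might look like the following:
```python
# Compare 25- to 34-year-olds in 1990 with 25- to 34-year-olds in 2020,
# rather than comparing young and old adults within a single year.

import pandas as pd

df = pd.DataFrame({
    "year":    [1990, 1990, 1990, 2020, 2020, 2020],
    "age":     [27,   30,   62,   26,   33,   61],
    "married": [1,    0,    1,    0,    0,    1],
})

young_adults = df[df["age"].between(25, 34)]               # same life stage in every year
share_married = young_adults.groupby("year")["married"].mean()

print(share_married)   # share of 25- to 34-year-olds who are married, by survey year
```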
Even when we have historical data, we will attempt to control for other factors beyond age in making generational comparisons. If we accept that there are real differences across generations, we’re basically saying that people who were born around the same time share certain attitudes or beliefs – and that their views have been influenced by external forces that uniquely shaped them during their formative years. Those forces may have been social changes, economic circumstances, technological advances or political movements.
The tricky part is isolating those forces from events or circumstances that have affected all age groups, not just one generation. These are often called “period effects.” An example of a period effect is the Watergate scandal, which drove down trust in government among all age groups. Differences in trust across age groups in the wake of Watergate shouldn’t be attributed to the outsize impact that event had on one age group or another, because the change occurred across the board.
Changing demographics also may play a role in patterns that might at first seem like generational differences. We know that the United States has become more racially and ethnically diverse in recent decades, and that race and ethnicity are linked with certain key social and political views. When we see that younger adults have different views than their older counterparts, it may be driven by their demographic traits rather than the fact that they belong to a particular generation.
Controlling for these factors can involve complicated statistical analysis that helps determine whether the differences we see across age groups are indeed due to generation or not. This additional step adds rigor to the process. Unfortunately, it’s often absent from current discussions about Gen Z, Millennials and other generations.
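One common way to add that rigor – sketched here on synthetic data, not the Center’s specific models – is to regress the outcome on birth cohort while also adjusting for demographic traits and survey year, so that apparent cohort differences are not simply demographic or period effects in disguise:
```python
# Sketch of one common approach (variable names hypothetical, data synthetic):
# model an outcome as a function of birth cohort, demographics and survey year.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "birth_decade":    rng.choice(["1960s", "1980s", "2000s"], n),
    "race_ethnicity":  rng.choice(["White", "Black", "Hispanic", "Asian"], n),
    "education":       rng.choice(["HS or less", "Some college", "BA+"], n),
    "survey_year":     rng.choice(["2010", "2016", "2022"], n),
    "supports_policy": rng.integers(0, 2, n),   # toy binary outcome
})

model = smf.logit(
    "supports_policy ~ C(birth_decade) + C(race_ethnicity) + C(education) + C(survey_year)",
    data=df,
).fit(disp=False)

print(model.params.filter(like="birth_decade"))   # cohort effects net of controls
```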
When we can’t do generational analysis, we still see value in looking at differences by age and will do so where it makes sense. Age is one of the most common predictors of differences in attitudes and behaviors. And even if age gaps aren’t rooted in generational differences, they can still be illuminating. They help us understand how people across the age spectrum are responding to key trends, technological breakthroughs and historical events.
Each stage of life comes with a unique set of experiences. Young adults are often at the leading edge of changing attitudes on emerging social trends. Take views on same-sex marriage, for example, or attitudes about gender identity.
Many middle-aged adults, in turn, face the challenge of raising children while also providing care and support to their aging parents. And older adults have their own obstacles and opportunities. All of these stories – rooted in the life cycle, not in generations – are important and compelling, and we can tell them by analyzing our surveys at any given point in time.
When we do have the data to study groups of similarly aged people over time, we won’t always default to using the standard generational definitions and labels. While generational labels are simple and catchy, there are other ways to analyze age cohorts. For example, some observers have suggested grouping people by the decade in which they were born. This would create narrower cohorts in which the members may share more in common. People could also be grouped relative to their age during key historical events (such as the Great Recession or the COVID-19 pandemic) or technological innovations (like the invention of the iPhone).
Existing generational definitions also may be too broad and arbitrary to capture differences that exist among narrower cohorts. A typical generation spans 15 to 18 years. As many critics of generational research point out, there is great diversity of thought, experience and behavior within generations. The key is to pick a lens that’s most appropriate for the research question that’s being studied. If we’re looking at political views and how they’ve shifted over time, for example, we might group people together according to the first presidential election in which they were eligible to vote.
By choosing not to use the standard generational labels when they’re not appropriate, we can avoid reinforcing harmful stereotypes or oversimplifying people’s complex lived experiences.
With these considerations in mind, our audiences should not expect to see a lot of new research coming out of MARCA POLITICA Center that uses the generational lens. We’ll only talk about generations when it adds value, advances important national debates and highlights meaningful societal trends.

Rubén Weinsteiner

Sunday, May 7, 2023

How Public Polling Has Changed in the 21st Century


Rubén Weinsteiner

61% of national pollsters in the U.S. used methods in 2022 that differed from those of 2016


The 2016 and 2020 presidential elections left many Americans wondering whether polling was broken and what, if anything, pollsters might do about it. A new Pew Research Center study finds that most national pollsters have changed their approach since 2016, and in some cases dramatically. Most (61%) of the pollsters who conducted and publicly released national surveys in both 2016 and 2022 used methods in 2022 that differed from what they used in 2016. The study also finds the use of multiple methods increasing. Last year 17% of national pollsters used at least three different methods to sample or interview people (sometimes in the same survey), up from 2% in 2016.

[Chart: Polling has entered a period of unprecedented diversity in methods]

This study captures what changes were made and approximately when. While it does not capture why the changes were made, public commentary by pollsters suggests a mix of factors – with some adjusting their methods in response to the profession’s recent election-related errors and others reacting to separate industry trends. The cost and feasibility of various methods are likely to have influenced decisions.

This study represents a new effort to measure the nature and degree of change in how national public polls are conducted. Rather than leaning on anecdotal accounts, the study tracked the methods used by 78 organizations that sponsor national polls and publicly release the results. The organizations analyzed represent or collaborated with nearly all the country’s best-known national pollsters. In this study, “national poll” refers to a survey reporting on the views of U.S. adults, registered voters or likely voters. It is not restricted to election vote choice (or “horserace”) polling, as the public opinion field is much broader. The analysis stretches back to 2000, making it possible to distinguish between trends emerging before 2016 (e.g., migration to online methods) and those emerging more recently (e.g., reaching respondents by text message). Study details are provided in the Methodology. Other key findings from the study include:

Pollsters made more design changes after 2020 than after 2016. In the wake of the 2016 presidential election, it was unclear whether the polling errors were an anomaly or the start of a longer-lasting problem. The 2020 election provided an answer, as most polls understated GOP support a second time. The study found that after 2020, more than a third of pollsters (37%) changed how they sample people, how they interview them, or both. This compares with about a quarter (26%) who made changes after 2016. As noted above, though, these changes did not necessarily occur because of concerns about election-related errors.

The number of national pollsters relying exclusively on live phone is declining rapidly. Telephone polling with live interviewers dominated the industry in the early 2000s, even as pollsters scrambled to adapt to the rapid growth of cellphone-only households. Since 2012, however, its use has fallen amid declining response rates and increasing costs. Today live phone is not completely dead, but pollsters who use it tend to use other methods as well. Last year 10% of the pollsters examined in the study used live phone as their only method of national public polling, but 32% used live phone alone or in combination with other methods. In some cases, the other methods were used alongside live phone in a single poll, and in other cases the pollster did one poll using live phone and other polls with a different method.

Several key trends, such as growth of online polling, were well underway prior to 2016. While the 2016 and 2020 elections were consequential events for polling, the study illustrates how some of the methodological churn in recent years reflects longer-term trends. For example, the growth of online methods was well underway before 2016. Similarly, some live phone pollsters had already started to sample from registered voter files (instead of RDD, random-digit dialing) prior to 2016.

[Chart: Polling on probability-based panels is becoming more common]

Use of probability-based panels has become more prevalent. A growing number of pollsters have turned to sampling from a list of residential addresses from the U.S. Postal Service database to draw a random sample of Americans, a method known as address-based sampling (ABS). There are two main types of surveys that do this: one-off or standalone polls and polls using survey panels recruited using ABS or telephone (known as probability-based panels). Both are experiencing growth. The number of national pollsters using probability-based panels alone or in combination with other methods tripled from 2016 to 2022 (from seven to 23). The number of national pollsters conducting one-off ABS surveys alone or in combination with other methods during that time rose as well (from one in 2016 to seven in 2022).

[Chart: Growth of online opt-in methods in national public polls paused between 2020 and 2022]

The growth of online opt-in among national pollsters appears to have paused after 2020. The number of national pollsters using convenience samples of people online (“opt-in sampling”) – whether alone or in combination with other methods – more than quadrupled between 2012 and 2020 (from 10 to 47). In 2022, however, this number held flat, suggesting that the era of explosive growth could be ending.

Whether changes to sample sources and modes translate into greater accuracy in presidential elections remains to be seen. The fact that pollsters are expanding into new and different methods is no guarantee that the underrepresentation of GOP support that occurred in 2016 and 2020 preelection polls has been fixed. Polling accuracy improved in 2022, but this represents only one nonpresidential election.

Notable study limitations

A study of this nature requires difficult decisions about what exactly will be measured and what will not. This study focuses on two key poll features: the sample source(s) – that is, where the respondents came from – and the mode(s), or how they were interviewed. While important, these elements are not exhaustive of the decisions required in designing a poll. The study did not attempt to track other details, such as weighting, where public documentation is often missing. Because the study only measured two out of all possible poll features, estimates from this study likely represent a lower bound of the total amount of change in the polling industry.

Another limitation worth highlighting is the fact that state-level polls are not included. Unfortunately, attempting to find, document and code polling from all 50 states and the District of Columbia would have exceeded the time and staff resources available. A related consideration is that disclosure of methods information tends to be spottier for pollsters who exclusively work at the state level, though there are some exceptions. It is not clear whether analysis at the level of detail presented in this report would be possible for state-only pollsters.

While not necessarily a limitation, the decision to use the polling organization rather than individual polls as the unit of analysis has implications for the findings. The proliferation of organizations using online methods implies but does not prove that online polls grew as well. However, research conducted by the American Association for Public Opinion Research (AAPOR) following the 2016 and 2020 elections reveals an explosion in the share of all polling done using online methods. AAPOR estimated that 56% of national polls conducted shortly before the 2016 election used online methods; the comparable share for 2020 was 84%. More details on the strengths and weaknesses of the study are presented in the Methodology.

Changes in methods are driven by many considerations, including costs and quality

To verify the accuracy of the categorization of polling methodologies, researchers attempted to contact all organizations represented in the database. Several pollsters contacted for this study noted that use of a particular method was not necessarily an endorsement of methodological quality or superiority. Instead, design decisions often reflect a multitude of factors. Survey cost – especially the increasing cost of live phone polling – came up repeatedly. Timing can also be a factor, as a design like address-based sampling can take weeks or even months to field. As noted above, this study does not attempt to address why each organization polled the way it did. It aims only to describe major changes observable within the polling industry. Nor does it evaluate the quality of different methods, as many other studies address that question.

Changes to polling after 2020 differed from those after 2016

The study found a different kind of change within the polling industry after 2020 versus 2016. After 2020, changes were both more common and more complex. More than a third (37%) of pollsters releasing national public polls in both 2020 and 2022 changed their methods during that interval. By contrast, the share changing their methods between 2016 and 2018 was 26%.

[Chart: More than a third of national public pollsters changed how they poll after 2020]

The nature of the changes also differed. About half of the changes observed from 2016 to 2018 reflected pollsters going online – either by adding online interviewing as one of their methods or fully replacing live phone interviewing. By contrast, the changes observed from 2020 to 2022 were more of a mix. During that period, some added an approach like text messaging (e.g., Change Research, Data for Progress), probability-based panels (Politico, USA Today) or multiple new methods (CNN, Wall Street Journal). About a quarter of the change observed from 2020 to 2022 reflected pollsters who had already moved online dropping live phone as one of their tools (e.g., CBS News, Pew Research Center).

A look at change over the entire recent period – from 2016 to 2022 – finds that more than half of national public pollsters (61%) used methods in 2022 that differed from those they used in 2016. As noted above, if features like weighting protocols were included in the analysis, that rate would be even higher.

A longer view of modern public polling (going back to 2000) shows that methodological churn began in earnest around 2012 to 2014. That was a period when about a third of national pollsters changed their methods. Change during that period was marked by pollsters starting to migrate away from live telephone surveys and toward online surveys.

Pollsters increasingly use multiple methods – sometimes three or more

[Chart: Growing share of national pollsters are using multiple methods]

Pollsters are not just using different methods; many are now using multiple methods, the study found. Here again there is a discernible difference between how polls changed after 2016 and how they changed after 2020. After 2016, the share of pollsters using multiple methods remained virtually unchanged (30% in both 2016 and 2018). After 2020, however, the share climbed to 39%. Notably, the share of pollsters using three or more different methodologies in their national public polls tripled from 5% in 2020 to 17% in 2022.

In this analysis, “multiple methods” refers to use of multiple sample sources (e.g., registered voter files and random-digit dial) or multiple interview modes (e.g., online, mail, live telephone). In some cases, several methods were used in a single poll. In other cases the pollster did one poll using one method and another poll using another method.

As an example, in 2014 Pew Research Center switched from exclusively using live phone with random-digit-dial samples to also using a probability-based panel. In 2020 the Center added another method: one-off address-based sample surveys offering online or mail response. By 2022, the Center had dropped live phone polling. Pollsters that used at least three different methods in 2022 include CNN, Gallup, NPR, Politico and USA Today.

Text messaging and address-recruited panels see growth after 2020

An overarching theme in the study is the growth of new methods. Analysis earlier in this report aimed to describe trends for the most prominent methods. In the past, pollsters often used just one method (e.g., live phone with random-digit dial). That has changed. Today pollsters tend to use new methods (such as text) as one of several ways that they reach people. To track the trajectory of these newer methods, it helps to consider the number of pollsters using the method by itself or in combination with other methods.
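As a minimal illustration of that kind of counting, the hypothetical pandas sketch below starts from a long-format table with one row per pollster, year and method (the records shown are invented) and tallies how many pollsters used each method at all versus how many used it as their only method.

    import pandas as pd

    # Invented example records: one row per pollster-year-method combination.
    records = pd.DataFrame({
        "pollster": ["A", "A", "B", "C", "C", "D"],
        "year":     [2022] * 6,
        "method":   ["live phone", "text", "online opt-in",
                     "probability panel", "live phone", "live phone"],
    })

    # Pollsters using each method, alone or in combination with others.
    alone_or_combined = records.groupby("method")["pollster"].nunique()

    # Pollsters for whom a method was their only method that year.
    n_methods = records.groupby("pollster")["method"].nunique()
    single_method_pollsters = n_methods[n_methods == 1].index
    exclusive = (records[records["pollster"].isin(single_method_pollsters)]
                 .groupby("method")["pollster"].nunique())

    print(alone_or_combined)  # e.g., live phone: 3 pollsters (A, C, D)
    print(exclusive)          # e.g., live phone: 1 pollster (D only)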

A prime example is text message polling. An extremely small share of pollsters conduct national public polls exclusively by text. A larger share use text alongside another method, such as online opt-in.

[Chart: Texting gains some traction in national polling in 2022]

How texting is used varies. In some cases respondents receive a text with a web link for an online survey. In other cases, respondents answer the questions via text. Among the pollsters in this study, just one used texting in a national public survey in 2020. In 2022 that number rose to nine, representing 13% of the active national pollsters tracked that year. These figures reflect the number of pollsters using texting alone or in combination with other methods like live phone.

Analysis looking at methods used either alone or in combination with other approaches also suggests a change in the trajectory of online opt-in polling. While online opt-in usage grew tremendously between 2006 and 2020, that growth appears to have slowed if not stopped in 2022 for national polling.

By contrast, the share of national pollsters turning to probability-based panels continues to grow. In 2022 a third (33%) of national pollsters used probability-based panels either alone or in combination with other methods. This is up from roughly 10% during most of the 2010s.

Live phone was once the dominant method of polling but has been in decline since 2016. As of 2022, about a third of national pollsters used live phone alone or in combination (32%), while a much smaller share relied on it as their only method (10%).

The study also tracked the adoption of a specific kind of opt-in sample – members of an online opt-in panel who are matched to a record in a national registered voter file. This study first observed that approach in 2018. In 2018, 2020 and 2022, about 3% to 5% of national public pollsters used online opt-in samples matched to registered voter files, the study found.

Rubén Weinsteiner

domingo, 30 de abril de 2023

Teens and social media: Key findings from MARCA POLITICA Center surveys


Rubén Weinsteiner

Today’s teens are navigating a digital landscape unlike the one experienced by their predecessors, particularly when it comes to the pervasive presence of social media. In 2022, MARCA POLITICA Center fielded an in-depth survey asking American teens – and their parents – about their experiences with and views toward social media. Here are key findings from the survey:


MARCA POLITICA Center conducted this study to better understand American teens’ experiences with social media and their parents’ perception of these experiences. For this analysis, we surveyed 1,316 U.S. teens ages 13 to 17, along with one parent from each teen’s household. The survey was conducted online by Ipsos from April 14 to May 4, 2022.

This research was reviewed and approved by an external institutional review board (IRB), Advarra, which is an independent committee of experts that specializes in helping to protect the rights of research participants.

Ipsos invited panelists from its KnowledgePanel who were parents of at least one teen ages 13 to 17 to take this survey. KnowledgePanel is a probability-based web panel recruited primarily through national, random sampling of residential addresses. For some of these questions, parents were asked to think about one teen in their household. (If they had multiple teenage children ages 13 to 17 in the household, one was randomly chosen.) This teen was then asked to answer questions as well. The parent portion of the survey is weighted to be representative of U.S. parents of teens ages 13 to 17 by age, gender, race, ethnicity, household income and other categories. The teen portion of the survey is weighted to be representative of U.S. teens ages 13 to 17 who live with parents by age, gender, race, ethnicity, household income and other categories.
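The exact weighting procedure is not described here, but as a rough, hypothetical sketch of how such weighting can work, the Python example below rakes a toy teen sample to invented population targets for gender and age group using iterative proportional fitting. The data and targets are illustrative only, not the ones used in this survey.

    import numpy as np
    import pandas as pd

    # Toy respondent file (invented data): one row per teen respondent.
    df = pd.DataFrame({
        "gender":    ["girl", "boy", "girl", "boy", "girl", "boy", "girl", "boy"],
        "age_group": ["13-14", "13-14", "15-17", "15-17", "15-17", "13-14", "15-17", "15-17"],
    })

    # Illustrative population targets (proportions) for each weighting margin.
    targets = {
        "gender":    {"girl": 0.49, "boy": 0.51},
        "age_group": {"13-14": 0.40, "15-17": 0.60},
    }

    weights = np.ones(len(df))

    # Raking (iterative proportional fitting): repeatedly rescale the weights
    # so each margin matches its target share.
    for _ in range(50):
        for var, target in targets.items():
            total = weights.sum()
            factors = {}
            for category, share in target.items():
                current = weights[(df[var] == category).to_numpy()].sum() / total
                factors[category] = share / current if current > 0 else 1.0
            weights *= df[var].map(factors).to_numpy()

    df["weight"] = weights / weights.mean()  # normalize to a mean of 1
    print(df.groupby("gender")["weight"].sum() / df["weight"].sum())  # ~0.49 / 0.51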

Here are the questions used for this report, along with responses, and its methodology.

Majorities of teens report ever using YouTube, TikTok, Instagram and Snapchat. YouTube is the platform most commonly used by teens, with 95% of those ages 13 to 17 saying they have ever used it, according to a Center survey conducted April 14-May 4, 2022, that asked about 10 online platforms. Two-thirds of teens report using TikTok, followed by roughly six-in-ten who say they use Instagram (62%) and Snapchat (59%). Much smaller shares of teens say they have ever used Twitter (23%), Twitch (20%), WhatsApp (17%), Reddit (14%) and Tumblr (5%).

Facebook use among teens dropped from 71% in 2014-15 to 32% in 2022. Twitter and Tumblr also experienced declines in teen users during that span, but Instagram and Snapchat saw notable increases.

TikTok use is more common among Black teens and among teen girls. For example, roughly eight-in-ten Black teens (81%) say they use TikTok, compared with 71% of Hispanic teens and 62% of White teens. And Hispanic teens (29%) are more likely than Black (19%) or White teens (10%) to report using WhatsApp. (There were not enough Asian teens in the sample to analyze separately.)

Teens’ use of certain social media platforms also varies by gender. Teen girls are more likely than teen boys to report using TikTok (73% vs. 60%), Instagram (69% vs. 55%) and Snapchat (64% vs. 54%). Boys are more likely than girls to report using YouTube (97% vs. 92%), Twitch (26% vs. 13%) and Reddit (20% vs. 8%).

Majorities of teens use YouTube and TikTok every day, and some report using these sites almost constantly. About three-quarters of teens (77%) say they use YouTube daily, while a smaller majority of teens (58%) say the same about TikTok. About half of teens use Instagram (51%) or Snapchat (50%) at least once a day, while 19% report daily use of Facebook.

Some teens report using these platforms almost constantly. For example, 19% say they use YouTube almost constantly, while 16% and 15% say the same about TikTok and Snapchat, respectively.

More than half of teens say it would be difficult for them to give up social media. About a third of teens (36%) say they spend too much time on social media, while 55% say they spend about the right amount of time there and just 8% say they spend too little time. Girls are more likely than boys to say they spend too much time on social media (41% vs. 31%).

Teens are relatively divided over whether it would be hard or easy for them to give up social media. Some 54% say it would be very or somewhat hard, while 46% say it would be very or somewhat easy.

Girls are more likely than boys to say it would be difficult for them to give up social media (58% vs. 49%). Older teens are also more likely than younger teens to say this: 58% of those ages 15 to 17 say it would be very or somewhat hard to give up social media, compared with 48% of those ages 13 to 14.

Teens are more likely to say social media has had a negative effect on others than on themselves. Some 32% say social media has had a mostly negative effect on people their age, while 9% say this about social media’s effect on themselves.

Conversely, teens are more likely to say these platforms have had a mostly positive impact on their own life than on those of their peers. About a third of teens (32%) say social media has had a mostly positive effect on them personally, while roughly a quarter (24%) say it has been positive for other people their age.

Still, the largest shares of teens say social media has had neither a positive nor negative effect on themselves (59%) or on other teens (45%). These patterns are consistent across demographic groups.

Teens are more likely to report positive than negative experiences in their social media use. Majorities of teens report experiencing each of the four positive experiences asked about: feeling more connected to what is going on in their friends’ lives (80%), like they have a place where they can show their creative side (71%), like they have people who can support them through tough times (67%), and that they are more accepted (58%).

When it comes to negative experiences, 38% of teens say that what they see on social media makes them feel overwhelmed because of all the drama. Roughly three-in-ten say it makes them feel like their friends are leaving them out of things (31%) or feel pressure to post content that will get lots of comments or likes (29%). And 23% say that what they see on social media makes them feel worse about their own life.

There are several gender differences in the experiences teens report having while on social media. Teen girls are more likely than teen boys to say that what they see on social media makes them feel a lot like they have a place to express their creativity or like they have people who can support them. However, girls also report encountering some of the pressures at higher rates than boys. Some 45% of girls say they feel overwhelmed because of all the drama on social media, compared with 32% of boys. Girls are also more likely than boys to say social media has made them feel like their friends are leaving them out of things (37% vs. 24%) or feel worse about their own life (28% vs. 18%).

When it comes to abuse on social media platforms, many teens think stiff penalties would help. Half of teens say criminal charges or permanent bans for users who bully or harass others on social media would help a lot to reduce harassment and bullying on these platforms.

About four-in-ten teens say it would help a lot if social media companies proactively deleted abusive posts or required social media users to use their real names and pictures. Three-in-ten teens say it would help a lot if school districts monitored students’ social media activity for bullying or harassment.

Some teens – especially older girls – avoid posting certain things on social media because of fear of embarrassment or other reasons. Roughly four-in-ten teens say they often or sometimes decide not to post something on social media because they worry people might use it to embarrass them (40%) or because it does not align with how they like to represent themselves on these platforms (38%). A third of teens say they avoid posting certain things out of concern for offending others by what they say, while 27% say they avoid posting things because it could hurt their chances when applying for schools or jobs.

These concerns are more prevalent among older teen girls. For example, roughly half of girls ages 15 to 17 say they often or sometimes decide not to post something on social media because they worry people might use it to embarrass them (50%) or because it doesn’t fit with how they’d like to represent themselves on these sites (51%), compared with smaller shares among younger girls and among boys overall.

Many teens do not feel like they are in the driver’s seat when it comes to controlling what information social media companies collect about them. Six-in-ten teens say they think they have little (40%) or no control (20%) over the personal information that social media companies collect about them. Another 26% aren’t sure how much control they have. Just 14% of teens think they have a lot of control.

Despite many feeling a lack of control, teens are largely unconcerned about companies collecting their information. Only 8% of teens are extremely concerned and 13% are very concerned about the amount of personal information that social media companies might have about them. In fact, 44% of teens say they have little or no concern about how much these companies might know about them.

Only around one-in-five teens think their parents are highly worried about their use of social media. Some 22% of teens think their parents are extremely or very worried about them using social media. But a larger share of teens (41%) think their parents are either not at all (16%) or a little worried (25%) about them using social media. About a quarter of teens (27%) fall more in the middle, saying they think their parents are somewhat worried.

Many teens also believe there is a disconnect between parental perceptions of social media and teens’ lived realities. Some 39% of teens say their experiences on social media are better than parents think, and 27% say their experiences are worse. A third of teens say parents’ views are about right.

Nearly half of parents with teens (46%) are highly worried that their child could be exposed to explicit content on social media. Parents of teens are more likely to be extremely or very concerned about this than about social media causing mental health issues like anxiety, depression or lower self-esteem. Some parents also fret about time management problems for their teen stemming from social media use, such as wasting time on these sites (42%) and being distracted from completing homework (38%).

Rubén Weinsteiner