How Your Values Shape What You Share Online

What happens when your values match the message of a post?


If you use social media, chances are you’ve shared content someone else posted: a meme, a clever joke, or a news article. While sharing is a core part of online engagement, it also has a downside: it can contribute to the spread of misinformation.


That’s why it’s important to understand the factors that lead people to share news stories online, especially stories that contain false or misleading information.


A 2025 study published in the Journal of Experimental Psychology: General, led by Suhaib Abdurahman at the University of Southern California, explored how a person’s moral values affect their willingness to share certain posts. Specifically, the researchers investigated whether value alignment between a social media post and its audience contributes to the spread of misinformation.


To do this, they focused on issues with potential political relevance. Prior research shows that liberals tend to emphasize individualizing values such as care and equality, while conservatives are more likely to prioritize binding values like loyalty, patriotism, and respect for authority.


The team created mock social media posts linking to either accurate news stories or ones containing misinformation. Each post was framed in one of three ways: highlighting an individualizing value, a binding value, or no specific value (neutral framing).


In one experiment, participants saw several of these posts and indicated whether they would consider sharing them. They also completed a questionnaire assessing the extent to which they personally endorsed individualizing or binding values. The neutral posts served as a control, helping to isolate the influence of value alignment.


The findings, replicated across studies, were striking. People were more likely to share posts that reflected values they strongly identified with, and this effect was particularly pronounced when the content included misinformation. In other words, posts that “felt right” because they aligned with a person’s values were more likely to be shared, even when they were false.


Interestingly, this value-post match didn’t lead participants to reflect more deeply on the content. Nor did their analytical thinking skills lessen the effect. The emotional comfort of value alignment appeared to override critical evaluation.


One study even analyzed real-world Twitter data. Researchers examined engagement with posts shared by users classified as liberal or conservative, based on who they followed. Using machine learning, they identified whether each post’s language reflected individualizing, binding, or neutral values.


The real-world data echoed the lab results. Liberal accounts received more engagement when their posts emphasized individualizing values, while conservative accounts gained more traction with binding-value framing. This suggests that the value-alignment effect shapes behavior not only in controlled settings but also in everyday online interactions.


Ultimately, this research helps explain why misinformation spreads so easily. When a post aligns with our core values, it generates a sense of resonance, a positive emotional response that increases our desire to share it. That “good feeling” can override the impulse to fact-check or question the accuracy of the information.


In a digital landscape where platforms do little to filter false content, these tendencies allow misinformation to thrive. And because our values often align with our political identities, this dynamic means that strongly held political beliefs may be reinforced through exposure to and the sharing of false information.
