A quick check of an online review platform can show you how thousands of other people rated a restaurant's food and service, feedback that restaurants thrive on.
There has been a lot of research on online reviews, but an Arizona State University professor's paper is breaking new ground by looking at the actual words that people use in their reviews. He and his colleagues found that when the online review platform Yelp started allowing users to post simultaneously on Facebook, it changed the nature of those reviews.
The result was a double-edged sword for the sites — more reviews, which Yelp wants, but more “emotional” language, which users say is less helpful, according to previous research. In other words, more quantity but less quality.
“Online reviews help consumers make decisions about which products to purchase, and firms want to leverage that to advertise their products and have good word of mouth,” says Yili Hong, an assistant professor of information systems in the W. P. Carey School of Business.
“There is a long stream of research in our discipline looking at user-generated content, but one thing the research hasn’t delved into is the textual aspects.”
That’s because it’s much harder to quantify the content of words than to count the number of stars in a rating or the number of words in a review.
Hong and his colleagues wanted to see how integrating with Facebook — where friends could see friends’ consumer reviews — would change their words.
“How will this affect people’s behavior in writing reviews? They don’t want to disagree with their friends,” Hong says.
The team had a natural experiment to work with: Yelp integrated with Facebook in July 2009, while TripAdvisor did not integrate until 15 months later, giving the researchers a window to examine. They then randomly selected nearly 4,000 restaurants in New York City, Los Angeles, Chicago, Philadelphia, and Phoenix that were reviewed on both platforms from 2008 to 2012.
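This kind of staggered-adoption setup is commonly analyzed with a difference-in-differences comparison: the change on the integrated platform is measured net of the change on the not-yet-integrated one. The sketch below illustrates only that arithmetic; the numbers are made up, and the paper's actual estimation strategy is not shown here.

```python
# Illustrative difference-in-differences arithmetic for a
# treated platform (integrated with Facebook) vs. a control
# platform (not yet integrated). All values are hypothetical.

def diff_in_diff(treated_pre, treated_post, control_pre, control_post):
    """Change in the treated group, net of the control group's trend."""
    return (treated_post - treated_pre) - (control_post - control_pre)

# Hypothetical mean emotional-word rates before/after integration.
effect = diff_in_diff(treated_pre=0.040, treated_post=0.055,
                      control_pre=0.041, control_post=0.043)
```

Here the treated platform's emotional-word rate rose by 0.015 while the control's rose by only 0.002, so the net effect attributed to integration would be about 0.013.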
They used automated text-mining software to calculate the presence of words in three linguistic categories — emotional, cognitive, and negation (disagreeing) language — in reviews of the restaurants on both Yelp and TripAdvisor. Then they compared the wording.
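The category-counting step can be sketched as a simple dictionary-based tally, in the style of LIWC-type text analysis. The word lists below are purely illustrative, not the study's actual lexicon, and the function name is an assumption for this sketch.

```python
# Minimal sketch of dictionary-based category counting.
# The word lists are illustrative stand-ins, not the
# lexicon the researchers actually used.

import re

CATEGORIES = {
    "emotional": {"love", "amazing", "terrible", "happy", "awful"},
    "cognitive": {"because", "think", "reason", "consider", "know"},
    "negation": {"no", "not", "never", "disagree"},
}

def category_rates(review: str) -> dict:
    """Return each category's share of the review's words."""
    words = re.findall(r"[a-z']+", review.lower())
    total = len(words) or 1  # avoid division by zero on empty text
    return {
        cat: sum(w in lexicon for w in words) / total
        for cat, lexicon in CATEGORIES.items()
    }

rates = category_rates("I love this place because the pasta is amazing!")
```

A review like the one above would score on both the emotional list ("love", "amazing") and the cognitive list ("because"), letting the two platforms' reviews be compared rate by rate.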
They found that when reviewers knew that their Facebook friends would see their reviews, they used more emotional language — and more positive emotions — and less cognitive language. There was also a big decrease in “negation” words.
That’s not necessarily a good thing for the review sites, Hong says.
“If you’re very emotional, people will think you’re just coming to dump your emotions in the reviews as opposed to being logical,” he says.
One takeaway for review sites: Consider ways to encourage users to be more logical and less emotional when writing reviews.
Online reviews are a huge business, both for the platforms — which make money by selling ads on the site — and the reviewed companies. Hong has two other papers on user-generated content published in the journal Management Science. In one, he and his team measured how people could be persuaded to write online reviews. The best way? A combination of financial incentives and peer pressure — telling them how many of their peers had contributed reviews. In another study, he found that using push alerts to tell reviewers how many “likes” they had compared with other reviewers only prompted them to produce more if they were winning. Low-performing reviewers would slack off when they discovered they weren’t competitive in “likes.”
Analyzing the linguistic features of reviews is the next frontier of research as more sophisticated evaluation techniques develop, Hong says.
“Nowadays people are thinking about tweets and other social-media things, and they want to look into the text to measure things,” he says.
“Maybe they will even be able to measure sarcasm one day.”
First published in ASU Now on March 7, 2017.