In news media, echo chamber is a metaphorical description of a situation in which beliefs are amplified or reinforced by communication and repetition inside a closed system. By visiting an "echo chamber", people are able to seek out information that reinforces their existing views, potentially as an unconscious exercise of confirmation bias. This may increase political and social polarization and extremism. The term is a metaphor based on the acoustic echo chamber, where sounds reverberate in a hollow enclosure.
Another emerging term for this echoing and homogenizing effect on the Internet within social communities is cultural tribalism.
The Internet has expanded the variety and amount of accessible political information. On the positive side, this may create a more pluralistic form of public debate; on the negative side, greater access to information may lead to selective exposure to ideologically supportive channels. In an extreme "echo chamber", one purveyor of information will make a claim, which many like-minded people then repeat, overhear, and repeat again (often in an exaggerated or otherwise distorted form) until most people assume that some extreme variation of the story is true.
It is important to distinguish between echo chambers and filter bubbles. Both concepts relate to the ways individuals are exposed to content devoid of clashing opinions, and colloquially the terms might be used interchangeably. However, echo chamber refers to the overall phenomenon by which individuals are exposed only to information from like-minded individuals, while filter bubbles are a result of algorithms that choose content based on previous online behavior, such as search histories or online shopping activity.
The echo chamber effect occurs online when like-minded people congregate and develop tunnel vision. Participants in online discussions may find their opinions constantly echoed back to them, which reinforces their individual belief systems. Individuals often participate in echo chambers because they feel more confident that their opinions will be readily accepted by others there. This is possible because the Internet provides access to a wide range of readily available information. People increasingly receive their news online through nontraditional sources, such as Facebook, Google, and Twitter, which have established personalization algorithms that cater specific information to individuals' online feeds. This method of curating content has replaced the function of the traditional news editor. The mediated spread of information through online networks carries the risk of an algorithmic filter bubble.
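The feedback loop described above can be illustrated with a minimal sketch. This is not any platform's actual algorithm; the topic-overlap scoring, data shapes, and function names are illustrative assumptions only. The point is that ranking candidate stories by similarity to past clicks naturally narrows what a user sees next:

```python
from collections import Counter

def rank_feed(articles, click_history, top_n=3):
    """Toy personalization: rank candidate articles by how much their
    topics overlap with topics the user has clicked before.
    (Hypothetical sketch -- not any real platform's system.)"""
    prefs = Counter(topic for art in click_history for topic in art["topics"])

    def score(article):
        # Articles on already-clicked topics score higher, so they
        # surface first -- the start of a filter bubble.
        return sum(prefs[t] for t in article["topics"])

    return sorted(articles, key=score, reverse=True)[:top_n]

# A user whose history contains only gun-control stories...
history = [
    {"topics": ["gun-control"]},
    {"topics": ["gun-control", "politics"]},
]
candidates = [
    {"title": "Gun control debate heats up", "topics": ["gun-control"]},
    {"title": "New immigration bill", "topics": ["immigration"]},
    {"title": "Local sports roundup", "topics": ["sports"]},
]
feed = rank_feed(candidates, history)
# ...sees the gun-control story ranked first; each click then feeds
# back into the history, reinforcing the same ranking.
```

In a real system the scoring would involve far richer signals, but any ranking that rewards similarity to past behavior exhibits this self-reinforcing property.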
Online social communities become fragmented when like-minded people group together and members hear arguments in only one direction. On certain platforms, such as Twitter, echo chambers are more likely to be found when the topic is political in nature than when it is seen as more neutral. Social networking communities are powerful reinforcers of rumors because people trust evidence supplied by their own social group more than they trust the news media. This can create significant barriers to critical discourse within an online medium. Social discussion and sharing suffer when people have a narrow information base and do not reach outside their network.

Echo chambers can also be detrimental to a person's well-being. The filter bubble can distort realities that we assumed could not be altered by outside sources. The Farnam Street blog argues that the filter bubble has a bigger impact than we think: it can create echo chambers that lead users to believe that the opinion or perspective they see, for instance through ads, is the only one that is right. Political ads in constant circulation on the Internet, for example, can make users think theirs is the only correct opinion. Put otherwise: "If we don't like facts, we don't believe them. If we DO like something presented to us as fact, even if it is false, we tend to believe it. If we see too much of our viewpoint and perspectives every day, we believe that there are no other opinions and that ours is the correct one in all cases."
Many offline communities are also segregated by political beliefs and cultural views. The echo chamber effect may prevent individuals from noticing changes in language and culture involving groups other than their own. Online echo chambers can sometimes influence an individual’s willingness to participate in similar discussions offline. A 2016 study found that “Twitter users who felt their audience on Twitter agreed with their opinion were more willing to speak out on that issue in the workplace”.
Ideological echo chambers have existed in many forms for centuries. The echo chamber effect has most often been cited as occurring in politics.
Coverage of the McMartin preschool trial was criticized by David Shaw in his 1990 Pulitzer Prize-winning articles: "None of these charges was ultimately proved, but the media largely acted in a pack, as it so often does on big events, and reporters' stories, in print and on the air, fed on one another, creating an echo chamber of horrors." He said the case "exposed basic flaws" in news organizations, including "Laziness. Superficiality. Cozy relationships" and "a frantic search to be first with the latest shocking allegation". Reporters and editors, he wrote, "often abandoned" the journalistic principles of "fairness and skepticism" and "frequently plunged into hysteria, sensationalism and what one editor calls 'a lynch mob syndrome.'"
The 2016 United States presidential election triggered a stream of discourse about echo chambers in media. Constituents were more likely to absorb information about topics such as gun control and immigration that aligned with their preexisting beliefs, as they were more likely to view information they already agreed with. Facebook is more likely to suggest posts congruent with a user's existing standpoints, so repetition of already stable standpoints prevailed over a diversity of opinions. Journalists argue that diversity of opinion is necessary for true democracy because it facilitates communication, and that echo chambers, like those occurring on Facebook, inhibit it. Some believed echo chambers played a large part in the success of Donald Trump in the 2016 election.
The echo chamber effect has also been cited outside politics, for example in the medical discussion of opioid drugs as suitable for long-term pain maintenance.
Some companies have also made efforts to combat the effects of echo chambers through algorithmic approaches. A high-profile example is the change Facebook made to its "Trending" page, an on-site news source for its users. Facebook modified the "Trending" page by transitioning from displaying a single news source to multiple news sources per topic or event. The intended purpose was to expand the breadth of news sources for any given headline, and therefore expose readers to a variety of viewpoints. There are also startups building apps with the mission of encouraging users to open up their echo chambers. UnFound.news offers an AI (artificial intelligence)-curated news app that presents readers with news from diverse and distinct perspectives, helping them form rational and informed opinions rather than succumbing to their own biases; it also nudges readers toward different perspectives if their reading pattern is biased toward one side or ideology. Another example is a beta feature on BuzzFeed News called "Outside Your Bubble". This experiment adds a module at the bottom of BuzzFeed News articles that shows reactions from various platforms, such as Twitter, Facebook, and Reddit. The concept aims to bring transparency and prevent biased conversations by diversifying the viewpoints readers are exposed to.
Hampton, Keith N.; Shin, Inyoung; Lu, Weixu (2017). "Social media and political discussion: when online presence silences offline conversation". Information, Communication & Society. 20 (7): 1090–1107. doi:10.1080/1369118x.2016.1218526. ISSN 1369-118X.
Barberá, Pablo; Jost, John T.; Nagler, Jonathan; Tucker, Joshua A.; Bonneau, Richard (2015). "Tweeting From Left to Right". Psychological Science. 26 (10): 1531–1542. doi:10.1177/0956797615594620. PMID 26297377.
David Brock, Blinded by the Right: The Conscience of an Ex-Conservative (New York, NY: Three Rivers Press, 2002).
Jeff Chester, "A Present for Murdoch", The Nation, December 2003: "From 1999 to 2002, his company spent almost $10 million on its lobbying operations. It has already poured $200,000 in contributions into the 2004 election, having donated nearly $1.8 million during the 2000 and 2002 campaigns."
Jim Lobe for Asia Times: "the structure's most remarkable characteristics are how few people it includes and how adept they have been in creating new institutions and front groups that act as a vast echo chamber for one another and for the media"