3,505 people helped us go beyond echo chambers over the past 6 months

When explaining the problem of fake news, we often mention the effects of echo chambers, in which we confine ourselves on social media, and of filter bubbles created by algorithms, on the emergence and spread of these kinds of stories.

These structures, which complicate access to accurate news or opposing views in a polarized media environment, leave us all trapped within our own vague realities. The tools we use to search for suspicious stories on the internet may be insufficient to reach some false stories or misinformation.

To avoid this situation and find suspicious stories spreading in different echo chambers, we have been using our social media accounts, our website, the [email protected] email address, and a WhatsApp hotline since October 26, 2016. We ask internet users to share the stories they find suspicious through these channels.

The allegation that a message about boron mines, which circulated on WhatsApp, was sent by TMMOB is not true.

When we publish an analysis on our website or share content on social media about a suspicious story reported by internet users, we send them feedback and inform them of the truth.

We have seen that this practice helped us reach different echo chambers and find false stories spreading through messaging apps such as WhatsApp and Telegram and through email groups, which are called "dark social" because they cannot be measured or tracked. Since October 26, 2016, we have received 4,534 reports in total through all these channels, covering 2,098 individual pieces of suspicious content. 62% of these reports concerned political news, and with 2,621 reports, Twitter was the most used channel.

18.2% of the reports have been finalized, 19.3% have been archived according to our significance, urgency and prevalence criteria, and 62.5% could not be concluded because the data were not available or credible.


We replied to 3,666 of the 4,534 reports with "Hello, thank you for your report. We've taken it under review." 22.41% of the reports taken under review were concluded with an analysis or social media content, and we gave feedback about the results to the users who had reported the suspicious stories.

77.59% of these reports did not turn into an analysis because, upon review, there was insufficient evidence, they were based on interpretation, or they did not meet our significance, urgency and prevalence criteria.

The circular that allegedly bans using the word "hayırlı" (used in Turkish for greetings and blessings, and very close to "hayır," the Turkish word for "no") was sent to us 27 times via WhatsApp.

We replied to the other 881 reports with "Hello, thank you for your report. While verifying, our priorities are significance, urgency and prevalence. We have added it to our archive." 6% of the archived reports turned into analyses, and we gave feedback to the users who had reported them.

We have often been asked, "How can I help you in your work?" Three people on the teyit.org team regularly scan media and social media during the day to detect suspicious stories. But another 3,505 people have also helped us by sending, via social media, suspicious stories they came across on the internet.

As seen above, reports are important for us to go beyond echo chambers and to detect what people are talking about in "dark social". We will continue our efforts to receive more stories from different chambers through our hotline channels and to develop new tools in the future.

Translation: Melek Güler