Inside the bizarre world of internet trolls and propagandists - Andrew Marantz
Lesson created by Lauren McAlpine using TED-Ed's lesson creator. Video from the TED YouTube channel.
Let’s Begin…
Journalist Andrew Marantz spent three years embedded in the world of internet trolls and social media propagandists, seeking out the people who are propelling fringe talking points into the heart of conversation online. Go down the rabbit hole of online propaganda and misinformation, and learn how we can start to make the internet less toxic.
About TED Talk Lessons
TED Talk Lessons are created by TED-Ed using phenomenal TED Talks. Do you have an idea for a lesson? Create it now using any video from YouTube.
Meet The Creators
- Video created by TED
- Lesson Plan created by Lauren McAlpine
Lauren McAlpine
Lesson creator
What responsibility do social media platforms have in moderating hate speech or misinformation?
Comments are closed on this discussion.
kenzie bradshaw
Lesson completed
Social media platforms have a responsibility to moderate hate speech and misinformation to
Annie Bargan
Lesson completed
It's argued that social media platforms have a great responsibility in moderating hateful or threatening speech. They are regarded as providers of platforms where people can share their ideas and express their feelings about various issues. Even though social media platforms cannot always establish whether information is valid, they ought to promote more helpful and creative content that is likely to motivate others to do useful things and contribute to the world or society. Their best bet is to spot whether content has high engagement and what exactly it provokes: emotion, reason, or something else. Thereafter they should determine whether the content is valid and whether it carries any hateful subtext, and decide whether it should be blocked.
Mallory Johnson
Lesson completed
I feel social media platforms should monitor the things their users say or upload. Usually, accounts that promote hate just have a post taken down or receive a warning for violating community guidelines, so there aren't really any serious consequences for online trolls. If social media platforms took a stronger stance against hate speech and misinformation, then social media wouldn't be such a toxic environment.
Manuela Koch
Lesson completed
Social media platforms should alter their goals: instead of maximizing engagement, which triggers hate speech, they should try to maximize validity.
Carol Allen
Lesson in progress
I think this is a great topic given what social media platforms have done following the riots of 6 January 2021. Now platforms are shutting down accounts. Right? Wrong?
I equate it to yelling fire in a movie theater. Others say it's unconstitutional under the First Amendment. I'd be curious to hear the arguments on both sides of this topic, and I'd like to know the speaker's thoughts on it.
Raya Lantin
Lesson in progress
Social media platforms need to alter their algorithms so that their networks no longer optimize for maximal engagement, because that is what allows misinformation and hate speech to spread. Misinformation and hate speech, when worded the right way, can arouse emotional responses in people that cause them to believe what is being said.
Mohammed Talib Khan
Lesson completed
Make human decency and good behavior cool again!
Steve Frank
Lesson in progress
In response to minseo kim:
Does that mean not being influenced by the world outside? No references, even to great thinkers?
minseo kim
Lesson completed
It's great to be independent-minded; we all should be independent-minded, but just be smart about it. That is only the very beginning of any meaningful conversation. All the interesting stuff happens after that point.