YouTube tries to find and delete licensed, third-party material on its servers, as well as illicit material. When unacceptable material is found, it is usually replaced on YouTube with one of four messages:
- This video is no longer available due to a copyright claim by (Fox/Warner Brothers/Etc).
- This video has been removed because its content violated YouTube's Terms of Service.
- This video is not available in your country.
- This video is no longer available because the YouTube account associated with this video has been terminated.
Not long after I started using YouTube, I saw a video that had been replaced with message #1. I assumed that the video had been ripped from a television broadcast or something of that nature. I thought nothing of my run-in with message #1.
Some weeks later, I clicked on a video and got message #2. Curious, I went to YouTube's Terms of Service and read the following: "Respect the community. We're not asking for the kind of respect reserved for nuns, the elderly and brain surgeons. We mean don't abuse the site." That seemed fair enough. I closed the page wondering how terrible the video had been.
But then came the times when messages #1 and #2 popped up to explain why my two most popular videos were no longer accessible to the public. I had not used copyrighted material or abused the site, and I could not work out why the videos had been removed. But the long rebuttal process required me to give out all kinds of information, so I simply deleted the videos from my account and uploaded them again. Everything was restored but the view count.
Some time later, with views approaching five digits on some of my videos, YouTube closed my account entirely. I received no explanation, but I had a clue: a woman had been arguing with me in the "comments" section of one of my videos and had begun leaving rude messages on the "discussion" section of my profile. Putting two and two together, I considered the possibility that she had been so flustered that she had actually petitioned YouTube to delete my channel because she did not like what I had to say. Luckily, some other YouTube users had already downloaded my videos and reposted them. To this day, they are still searchable on those other channels.
Over the years, I have seen videos on polarizing issues disappear from YouTube, quite often after the view count passed 20,000, which seems to be the "sweet spot" at which a video breaks out of its niche audience. This is probably because, with an ever-increasing number of views, the video gets pushed higher and higher in the search results when one searches for its associated keywords. I suspect there may be a knee-jerk response among some viewers who come to a video expecting the opposite of what they hear, leading them to flag it in irritation. Having observed how the liberal-left has increasingly tried to silence criticism by calling it "hate", I find that explanation plausible. The scary thing is, if that is the case, it means somebody at YouTube sympathized with this woman's hostile reaction and granted her wish.
Of course, YouTube should be able to choose what it puts on its servers. But when videos are deleted under a false pretext and the viewer who attempts to access them gets message #1 or #2, the viewer is placed in a false reality where free speech abounds and public opinion is there to be accessed. I would argue that the illusion of an open forum with impartial gatekeepers is more dangerous than the lack of an open forum with biased gatekeepers: it is better to know that not all views are represented than to labor under the false impression that they are.
These conclusions are particularly important in view of the movement to create "some organization" to cleanse the internet of copyright violators. Which organization could handle such a policing responsibility is not clear. But how quickly this privilege could be abused by whatever organization is in charge is, after my experience on YouTube, very clear.