Until his death at the hands of U.S. forces in 2011, Anwar al-Awlaki was a major voice for Islamic extremists, advocating war and attacks against the United States and helping to inspire numerous terrorist attacks around the world.
Over the years, counterterrorism experts and governments have pushed for his videos, Islamist propaganda, and other content that incites violence to be banned from YouTube.
As for the task of triaging the massive streams of video uploaded to YouTube every minute to flag questionable content: the company already uses AI to sniff this material out, and such systems will likely improve over time, further automating the process. Most of the Awlaki videos still in YouTube's archive are news reports about his life and death and debates over his killing and work, among other related material.
Terror organizations such as the Islamic State in Iraq and Syria (ISIS) have used online platforms to recruit and radicalize people across the world. The video-sharing site has already removed several extremist videos.
While Juniper Downs, YouTube's global director of public policy, noted this week that such measures could make it harder for certain videos to gain an audience, I believe there's greater merit in creating a space that's free of extremist content and safer for people across the world to use. "You just don't want to make it easy for people to listen to a guy who wants to harm us."
Recently, Google, Facebook and Twitter admitted that their platforms were used by Russian operatives during the 2016 presidential election to spread propaganda designed to influence political opinion and sway the vote.