YouTube Hints at ‘Further Consequences’ for Logan Paul After Controversial Suicide Video

YouTube has responded to a controversial video that video blogger Logan Paul uploaded last week showing an apparent suicide victim, saying that it’s taking steps “to ensure a video like this is never circulated again.”

The response comes after the popular YouTube star apologized for and removed a video of himself and his friends taken in Japan’s notorious Aokigahara forest, which is known as a magnet for suicides.

Although Paul initially said the video was intended to raise awareness about suicide, critics slammed him for seemingly making light of a serious situation. At the time, YouTube issued a short statement saying that it “prohibits violent or gory content posted in a shocking, sensational or disrespectful manner,” but it did not say whether it would take any action against Paul or if it would delete his account.

In Tuesday’s statement, YouTube elaborated by saying it is “looking at further consequences” for Paul, although it didn’t specify what those would be or why it waited a week to react beyond the original short statement. It acknowledged that it failed to communicate quickly enough with users about the incident, saying: “Many of you have been frustrated with our lack of communication recently. You’re right to be.”

YouTube also expressed shock at the video and added that “suicide is not a joke, nor should it ever be a driving force for views.”

Paul, who has more than 15 million YouTube subscribers, has not uploaded a video since apologizing in a separate video last week.

YouTube has faced criticism that it fails to properly screen its service for inappropriate content. In November, several big-name advertisers like Adidas and candymaker Mars said they would suspend advertising on YouTube because their online ads sometimes ran alongside videos that appeared to exploit children.

The streaming service has been trying to crack down on inappropriate content by hiring more human reviewers and using technology to automate the process of flagging offensive videos.

“Now, we are applying the lessons we’ve learned from our work fighting violent extremism content over the last year in order to tackle other problematic content,” wrote YouTube CEO Susan Wojcicki in a blog post in December. “Our goal is to stay one step ahead of bad actors, making it harder for policy-violating content to surface or remain on YouTube.”