Frequently Asked Questions
This page answers frequently asked questions about Report Harmful Content.
What is Report Harmful Content?
Launched in December 2018, Report Harmful Content is a national reporting centre designed to assist everyone in reporting harmful content online. We aim to:
- Provide information on the community guidelines of commonly used social networking sites (see ‘what are community guidelines?’ for more information).
- Give advice on how to report problems to social media.
- Mediate between you and social media (e.g. escalate unresolved reports or explain why content hasn’t been removed).
- Provide assistance in removing harmful content from platforms.
Report Harmful Content is based at South West Grid for Learning (SWGfL), an online safety charity and the lead partner in the UK Safer Internet Centre. At SWGfL, we work in collaboration with Childnet International and the Internet Watch Foundation (IWF). We have over 8 years of experience in dealing with online safety concerns and online harms, gained through running two helplines: the Professionals Online Safety Helpline (POSH) and the Revenge Porn Helpline.
Our specialist advice and support draws on these partnerships and experiences, and makes use of the excellent working relationships that we have established with social media platforms.
What can Report Harmful Content help me with?
If you have come across content online that you suspect is harmful but are unsure, you can find help in our advice section. This section contains information about the different types of harmful content you might find online.
If you decide you want to make a harmful content report to a social media platform, the report section of our website provides you with clear guidance on how to do so, including up-to-date information on community standards and direct links to the correct reporting facilities across multiple platforms.
If you have already submitted a harmful content report to social media and have received an unsatisfactory reply (e.g. the harmful content has not been removed or addressed), we might be able to further assist you. We can review responses to reports about the following eight types of online harm:
- Threats
- Impersonation
- Bullying or Harassment
- Self-Harm or Suicide Content
- Online Abuse
- Violent Content
- Unwanted Sexual Advances
- Pornographic Content
We studied the community guidelines of several different platforms and found that these types of harm are the most likely to violate their terms. Our team of experts can assess industry responses to these eight harms and escalate your unresolved report, if deemed appropriate.
What can’t Report Harmful Content help me with?
You might wonder why we don’t offer reporting support for online harms other than the main eight (Online Abuse, Bullying or Harassment, Threats, Impersonation, Unwanted Sexual Advances, Violent Content, Self-Harm or Suicide Content and Pornographic Content; for more information, see ‘what can Report Harmful Content help me with?’). This is because there are often alternative routes to resolution where other categories of harmful content are concerned. Further details can be found under the Other category in the advice section of our website.
When it comes to the main eight harms, there may also be some cases where we are unable to review or escalate the response you have received from social media. For example, we cannot escalate reports where we agree with the platform’s decision not to remove the content. We are also unable to escalate reports which we deem to be off topic (e.g. where the report concerns an offline harm rather than an online one) or where the report is about an illegal harm and there is a more appropriate recourse (e.g. contacting the police). Nor can we assist with reports about online harms which fall outside the remit of the project (e.g. spam or incapacitation).
In any of the above instances, we will always explain why we are unable to assist you and direct you to more appropriate sources of support.
We are unable to take reports of sexual images of under 18s; you can report these directly to the Internet Watch Foundation. If you see something online that supports, directs or glorifies terrorism, we recommend that you report it to Action Counters Terrorism.
Under the GDPR, we are unable to accept information from children under the age of 13. If you are under 13 and need help removing harmful content online, please ask your parent or guardian to complete our reporting wizard on your behalf.
How do you define harmful content?
In simple terms, harmful content is anything online which causes a person distress or harm. This encompasses a huge range of content and is highly subjective; what may be harmful to one person might not be considered an issue by someone else.
At Report Harmful Content, we review reports made to social media about the following eight types of online harm:
- Online Abuse
- Bullying or Harassment
- Threats
- Impersonation
- Unwanted Sexual Advances (Not Image Based)
- Violent Content
- Self-Harm or Suicide Content
- Pornographic Content
For more information, see ‘what can Report Harmful Content help me with?’
How do I submit a report to Report Harmful Content?
Before you submit a report to us, it is essential that you have already reported the material directly to the social media service, using their online tools, at least 48 hours beforehand. You can find information on how to do this in the report section of our website. If you have already reported to social media and would like the outcome reviewed by us, you can submit a report via our reporting wizard.
What should I include in my report?
When making a report to Report Harmful Content you will be directed to our reporting wizard. So that we can help you quickly and efficiently, please provide as much detail as possible in the ‘supporting information’ section of the wizard. This might include details such as the nature of the harm, who it is directed at, who the perpetrator is, whether you know the perpetrator, when you submitted your report to social media, what the outcome was and other ‘bigger picture’ details (e.g. whether this online harm is part of a broader pattern of harassment or abuse). Make sure to also provide the URL of the harmful content and to upload the response you have received from social media (ideally as a screenshot).
When should I go to the police?
At Report Harmful Content we deal with a wide range of online harms, some of which might be considered illegal. If you are being threatened, harassed or blackmailed online, we recommend that you report it to the police. If you or someone you know is in immediate danger, please contact the police on their 999 emergency number. Further details, including a list of relevant laws, can be found on our ‘When should I go to the police?’ page.
What are community guidelines?
Millions of individuals use social media to connect with friends and family and share experiences. In order to ensure that social media remains a safe and respectful environment for everyone, most social media platforms have community guidelines. These are a set of rules which outline what is expected of you as a social media user, including how you should treat other users and what behaviour is and is not tolerated. Report Harmful Content offers advice and support on content and behaviour that breaches the community guidelines of commonly used social media platforms.
Report Harmful Content has not escalated my report.
There are a number of reasons why your report may not have been escalated. We are unable to escalate reports to social media where we agree with the platform’s decision not to remove the content. We are also unable to escalate reports which we deem to be off topic (e.g. where the report concerns an offline harm rather than an online one) or where the report is about an illegal harm and there is a more appropriate recourse (e.g. contacting the police). Nor can we assist with reports about online harms which fall outside the remit of the project (e.g. spam or incapacitation).
In any of the above instances, we will always explain why we are unable to assist you and direct you to more appropriate sources of support.
If you disagree with our decision because you feel like your report has been misunderstood, we recommend that you submit another report containing additional information and/or evidence.
I have been accused of posting harmful content.
Social networking sites have community guidelines in place to prevent harm from occurring on their platforms (see ‘what are community guidelines?’). If someone makes a report about your behaviour or content on a social networking site, and you are deemed to have breached community guidelines, certain features of your account may be temporarily blocked or suspended. Whilst we appreciate how frustrating this can be, we are usually unable to assist with this issue.
Often social media users do not intend to cause harm. In these instances, we recommend familiarising yourself with the platform’s community guidelines, so as to prevent further misunderstandings.
If you feel that someone has made a report about you in order to unfairly target you (e.g. as part of a harassment campaign), and you have unsuccessfully raised these concerns with the social media platform, then we might be able to help you. Submit a report to us using our reporting wizard, being sure to provide as much detail as possible.
Harmful content on other platforms.
If the platform hosting the harmful content is not listed on our report page, this may be because we don’t yet have a partnership with this service. It might also be because there are other issues involved with reporting to certain platforms. Nevertheless, if you have discovered harmful content on a platform other than those listed on the report page, you can submit a report through the ‘Other’ link on our reporting wizard.
How long after submitting a report will I hear back from you?
We aim to reply to reports within 72 hours.
What should I do if I am not satisfied with your response?
If we are unable to escalate a report to social media on your behalf, we will always explain why and direct you to more appropriate sources of support. If you disagree with our decision because you feel like your report has been misunderstood, we recommend that you submit another report containing additional information and/or evidence.
If you are still not satisfied, you can submit your feedback to us via our satisfaction survey, which you will find attached to all of our communications.
Where can I send comments and feedback?
We welcome all comments and feedback from our service users. You will find a satisfaction survey attached to all of our communications, where you can share your experiences of using Report Harmful Content.
I am a professional working with children and young people.
Report Harmful Content is a national reporting centre designed to assist everyone in reporting harmful content online.
If you are a member of the children’s workforce, you might also find it useful to contact the Professionals Online Safety Helpline (POSH). This is one of our sister helplines, and we work very closely with them. They are able to offer more specific advice to members of the children’s workforce, as they have an in-depth understanding of the additional online safety issues involved in this line of work.
I am under 18.
Report Harmful Content can assist all adults and young people between the ages of 13 and 18 with reporting harmful content online. Under the GDPR, we are unable to accept information from children under the age of 13. If you are under 13 and need help removing harmful content online, please ask your parent or guardian to complete our reporting wizard on your behalf.
I am concerned about my child’s social media use.
If you have a concern regarding a specific piece of content on your child’s social media account, then you can report it directly to social media. For advice on how to do this, see the report section of our website. If you have already reported to social media and would like the outcome reviewed by us, you can submit a report via our reporting wizard.
Under the GDPR, we are unable to accept information from children under the age of 13. If your child is under 13, you will need to submit a report on their behalf. If your child is over the age of 13, we recommend that they submit reports themselves.
If you have more general concerns about your child’s social media use (e.g. staying safe, monitoring behaviour), you can find help and tips in the ‘advice for parents’ section of our website.
Someone is targeting my workplace and/or staff members.
Report Harmful Content can help with this. We commonly deal with reports concerning threats or harassment in the workplace, alongside professional impersonation and intellectual property violations, and we are well equipped to offer advice and support on all of these issues.
Someone is pretending to be me online.
We can help with this. Account hacking and impersonation are among the most common issues we deal with, and we are well equipped to offer advice and support if someone is pretending to be you online.
I am a police officer.
If you are a police officer dealing with harmful content as part of a criminal investigation, you might be wondering if we can help. Each criminal investigation is unique: there will be times when we are able to assist and times when we cannot. We recommend that you submit a report to us, which we will assess on a case-by-case basis. If we are unable to assist you, we will always explain why and direct you to more appropriate sources of support.
I have been a victim of a crime.
Unfortunately, we are unable to assist with removing harmful content related to criminal investigations. This is to ensure that the investigation is not affected in any way. If you have concerns about harmful content related to your investigation, speak to a police officer or Victim Support.