You Could Be Binging And Streaming Sadness

A research team set out to investigate the negative and positive mental health implications associated with using YouTube. Individuals under the age of 29, and those who regularly watch content about other people's lives, appear to be the ones experiencing the most negative effects.

Dr. Luke Balcombe and Emeritus Professor Diego De Leo from Griffith University's School of Applied Psychology explained that the parasocial relationships fostered between followers and content creators may be a cause for concern; neutral or positive outcomes from these closer creator-follower relationships appear to be comparatively rare.

“These online ‘relationships’ can fill a gap for people who, for example, have social anxiety; however, it can exacerbate their issues when they don’t engage in face-to-face interactions, which are especially important in developmental years,” says Balcombe. “We recommend individuals limit their time on YouTube and seek out other forms of social interaction to combat loneliness and promote positive mental health.”

Many parents are concerned about the amount of time their children spend on YouTube and similar platforms, and it appears they are right to be concerned. However, most parents say it is difficult to constantly monitor their children's use of these platforms, whether for educational or other purposes.

For this study, spending more than two hours a day on YouTube was considered high-frequency use, and more than five hours was considered saturated use. The researchers concluded that much more needs to be done to prevent suicide-related content from being recommended to users. The platform's suggested-viewing algorithm recommends videos based on previous searches, which can potentially send users even further down an alarmingly disturbing and dangerous rabbit hole.

While it is possible to report alarming content, it most often goes unreported. When it is reported, the content remains actively online while the report is investigated, and due to the sheer volume of content being uploaded, it is nearly impossible to catch it all. When content is flagged as having the potential to promote self-harm or suicide, the platform generates a warning asking users if they want to continue watching, but they can still proceed with a simple click if there are no parental controls in place.

Parental controls are an important part of keeping this type of content away from those who are vulnerable, including children and adolescents, who in general are highly suggestible. Your computer has built-in safeguards such as parental controls, as do most smart TVs and mobile devices. Additionally, most platforms have their own parental controls, which add an extra layer of security and peace of mind. If you are not sure how to set these up, a search on Google or YouTube itself can help you find the information you need to safeguard what kind of content your family watches on all of your devices.

“With vulnerable children and adolescents who engage in high-frequency use, there could be value in monitoring and intervention through artificial intelligence,” Dr. Balcombe continues. “We’ve explored human–computer interaction issues and proposed a concept for an independent-of-YouTube algorithmic recommendation system which will steer users toward verified positive mental health content or promotions.”

“YouTube is increasingly used for mental health purposes, mainly for information seeking or sharing, and many digital mental health approaches are being tried with varying levels of merit, but with over 10,000 mental health apps currently available, it can be really overwhelming knowing which ones to use, or even which ones to recommend from a practitioner point of view.”

“There is a gap for verified mental health or suicide tools based on a mix of AI-based machine learning, risk modeling and suitably qualified human decisions, but by getting mental health and suicide experts together to verify information from AI, digital mental health interventions could be a very promising solution to support increasing unmet mental health needs,” he concludes.

As with anything you read on the internet, this article should not be construed as medical advice; please talk to your doctor or primary care provider before changing your wellness routine. This article is not intended to provide a medical diagnosis, recommendation, treatment, or endorsement.

Opinion Disclaimer: The views and opinions expressed in this article are those of the author and do not necessarily reflect the official policy of WHN/A4M. Any content provided by guest authors is of their own opinion and is not intended to malign any religion, ethnic group, club, organization, company, individual, or anyone or anything.

Content may be edited for style and length.

References/Sources/Materials provided by:

https://news.griffith.edu.au/2023/05/15/impacts-of-youtube-on-loneliness-and-mental-health/

https://www.griffith.edu.au/

https://www.mdpi.com/2227-9709/10/2/39
