Tell Me How You Really Feel: Mood Prediction Tech


Privacy is a choice, but we may not always be conscious of how much we’re sharing. Time and time again we’ve learned how our networked world gathers our information, in some cases without any real transparency. From location data leaking from fitness devices to law enforcement passively collecting biometric information with facial recognition software, our lives are often quantified without our full knowledge. Even our genetic quest for self-knowledge may be subject to privacy concerns.

Now, in addition to who we are, where we are, and what we are, there’s a fresh effort underway to see inside our lives: how we really feel.

Sifting Social Media for Your Mood

From BBC Future’s exploration of social media’s “impact on mental health and well-being” comes the piece, “How Social Media Betrays Your Mood.” A new wave of studies hopes to leverage machine learning and data mining to read between your tweets and Instagram posts. The objective? Detection of mental health issues.

According to researchers, “our ability to predict suicidal thoughts and behaviour has not materially improved across 50 years of research.” The hope is that clues in our feeds will shine a light on depression, suicidal thoughts, schizophrenia, and PTSD. With enough data, we might just be able to more accurately “inform diagnostic and screening tools.”
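To make the idea concrete, here is a deliberately simplified sketch of the kind of text-based mood scoring these studies build on. Every word list, weight, and threshold below is invented for illustration; real research uses validated clinical instruments and far more sophisticated models than a keyword lexicon.

```python
# Toy lexicon-based mood scoring -- all words and weights are
# hypothetical, chosen only to illustrate the general approach.
NEGATIVE = {"hopeless": -2.0, "alone": -1.0, "tired": -0.5, "worthless": -2.0}
POSITIVE = {"grateful": 1.5, "excited": 1.0, "happy": 1.0}

def mood_score(post: str) -> float:
    """Sum lexicon weights for the words in a post; 0.0 is neutral."""
    lexicon = {**NEGATIVE, **POSITIVE}
    return sum(lexicon.get(word, 0.0) for word in post.lower().split())

def flag_for_review(posts: list[str], threshold: float = -1.5) -> list[str]:
    """Return posts scoring below a (hypothetical) risk threshold."""
    return [p for p in posts if mood_score(p) < threshold]
```

Even this toy version hints at the privacy stakes the article raises: a few lines of code, pointed at a public feed, can start sorting people into categories they never consented to.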

While the impulse towards mental health care is positive, the privacy concerns are profound. As the article states:

“What if digital traces of your mental health become visible to all? You might be targeted by pharmaceutical companies or face discrimination from employers and insurers. In addition, some of these types of projects aren’t subject to the rigorous ethical oversight of clinical trials. Users are frequently unaware their data has been mined.”

And what if the technology making these assertions about your mental health is flawed? Transparency in methodology is a major issue, given how hard it can be to see inside the “black box” algorithms behind these programs.

Though the technology may deliver on promises of improved diagnosis and treatment for those suffering from mental health issues, it’s also an important reminder to pause before you post and consider how what you share publicly on platforms may have unintended future effects.

The Promise and Peril of a Mood Prediction App

Social media is not the only area where our “emotional weather” is under scrutiny. In fact, “psychologists and technologists are together trying to build emotional databases that teach machines how to read human feelings by compiling a bunch of data about biological signals that indicate impending changes in order to digitally predict moods,” according to a recent article in Quartz.

The idea is in line with a host of other tracking technologies designed to help people interested in the “quantified self.” The movement seeks greater self-knowledge through gathering data, interpreting trends, and forecasting ways in which we might change behaviors to positively impact our lives. One app, Mindstrong, is a well-funded project in pursuit of mood monitoring and prediction, but according to Scientific American, evidence of its efficacy remains scant.
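The gather-interpret-forecast loop of the quantified self can be sketched in a few lines. The ratings, window size, and notion of a “downward trend” below are all hypothetical, not drawn from Mindstrong or any other app; the point is only to show how little machinery such inference requires.

```python
# Illustrative quantified-self sketch: smooth self-reported daily mood
# ratings (say, 1-10) with a moving average, then flag a sustained
# decline. Data and thresholds are hypothetical.
def moving_average(ratings: list[float], window: int = 3) -> list[float]:
    """Average each rating with up to (window - 1) of its predecessors."""
    smoothed = []
    for i in range(len(ratings)):
        chunk = ratings[max(0, i - window + 1): i + 1]
        smoothed.append(sum(chunk) / len(chunk))
    return smoothed

def trending_down(ratings: list[float], window: int = 3) -> bool:
    """True if the smoothed series strictly declines over its last `window` points."""
    tail = moving_average(ratings, window)[-window:]
    return all(b < a for a, b in zip(tail, tail[1:]))
```

Whether such a simple trend signal reflects anything clinically meaningful is exactly the efficacy question Scientific American raises.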

Some argue biometric markers are insufficient indicators of our mood, and furthermore that the digital collection methods used to inform the apps are flawed. Lisa Feldman Barrett, a neuroscientist and psychology professor at Northeastern University, believes that our attempt to outsource our emotional self-knowledge to technology is wrongheaded. From her perspective, we are not helplessly subject to an internal emotional weather; we have a more active role in creating and managing our mood than we might think. Naturally, the privacy issues at hand are significant. Who might use this information to make decisions that impact your healthcare or your job? How can we know corporations won’t release this information to law enforcement without notifying us first?

The Search for Self-Awareness

Technology is unquestionably capable of helping us derive greater self-awareness through mindful tracking and pattern recognition. But when it comes to our mental health, we must ask ourselves how much we want to share with strangers and corporations versus the guaranteed privacy of a counselor’s office. How we really feel should not be part of public record until we choose to make our feelings known.

#selfknowledge #privacy #socialmedia #quantifiedself #facialrecognition #moodprediction

