All posts in “Social Good”

What your Instagram posts might reveal about experiencing depression

Whether you realize it or not, you leave a trail of clues about your mental health on social media. Emerging research suggests that words, characters, and even emoji can reveal information about people’s moods and mental well-being. 

That might seem obvious when it comes to explicitly emotional posts, but information about mental health can be embedded in even the most standard of social media messages — and a new field of data science is trying to understand how to detect and interpret those signals. 

In a new study published in EPJ Data Science, two researchers accessed the Instagram accounts of 166 volunteers and then applied machine learning to their collective 43,950 images in order to identify and predict depression. By comparing those predictions to each individual’s clinical diagnosis, the researchers found that their model outperformed the average rate at which physicians accurately diagnose depression in patients.

“Doctors don’t have visibility into our lives the way our mobile phone does.”

In other words, an Instagram account may have the potential to reveal whether you’re experiencing depression. The right algorithm might just be better at making that prediction than a trained physician. 

“Doctors don’t have visibility into our lives the way our mobile phone does,” said Chris Danforth, co-author of the study and Flint Professor of Mathematical, Natural, and Technical Sciences at the University of Vermont. “It knows a lot more about us than we know about ourselves.”

There are, however, a few caveats to the study’s fascinating discovery. 

The researchers recruited the volunteers through Amazon Mechanical Turk, an online marketplace that matches workers with businesses and developers. Once the volunteers learned that the study required letting the researchers access their Instagram accounts, more than half of them dropped out. The small sample size and lack of demographic information mean it’s impossible to generalize the findings to the larger population. 

The study analyzed color, filters, face detection, and user comments and engagement to make its predictions. 

Instagram photos posted by depressed individuals had HSV values shifted towards those in the right photograph, compared with photos posted by healthy individuals.

Image: EPJ Data Science

Photos posted by depressed users were more often bluer, darker, and grayer, hues that previous research has associated with negative mood. They were also less likely to use Instagram filters, but disproportionately chose the black-and-white “Inkwell” filter when they did take advantage of the tool. In contrast, “healthy” participants favored the tint-lightening “Valencia” filter. Meanwhile, the more comments a post received, the more likely someone with depression had posted it.
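To get a feel for what the algorithm is measuring, here is a minimal sketch, not the study’s actual code, of the kind of color feature the researchers extracted: the average hue, saturation, and value (HSV) of a photo’s pixels. It uses only Python’s standard library; `colorsys` returns each channel in the 0–1 range.

```python
# Illustrative sketch of a per-photo HSV color feature, the kind of
# signal the study fed into its machine learning model. Not the
# authors' implementation.
import colorsys

def mean_hsv(pixels):
    """Average (hue, saturation, value) of an iterable of (r, g, b)
    pixels with 0-255 channels. Each returned channel is in 0-1."""
    hsv = [colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
           for r, g, b in pixels]
    n = len(hsv)
    return tuple(sum(px[i] for px in hsv) / n for i in range(3))

# A dark, desaturated "photo" (all slate-gray pixels) scores low on
# both saturation and value, the direction the study found depressed
# users' photos shifted toward.
h, s, v = mean_hsv([(60, 60, 70)] * 100)
print(round(s, 2), round(v, 2))  # prints "0.14 0.27"
```

In a real pipeline, a number like this would be computed per image and averaged per account before being handed to a classifier.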

The researchers also enlisted a second group of volunteers to rate the photos for “happiness, sadness, likability, [and] interestingness.” The human assessments for sadness and happiness predicted which participants experienced depression, but they didn’t correlate with the mental health signals picked up by machine learning. 

That’s a promising finding because it means algorithms may excel at detecting conditions where humans fail. 

“There has been anecdotal and preliminary evidence that these sorts of signals are present and relevant to mental health, but this study provides compelling evidence of their utility,” said Glen Coppersmith, founder and CEO of the startup mental health analytics company Qntfy. (Coppersmith wasn’t involved in the study.)

“It’s important that computers can do this.” 

Danforth said he can imagine a future in which people opt to download an app that analyzes their social media for signs of emotional or psychological distress and sends a message to an individual’s doctor when they need to be seen by a mental health professional. 

That future, of course, requires a lot more research — and the trust of social media users. Allowing an app to collect and analyze social media data in the context of mental health raises serious questions about security and privacy. In particular, that information must never get into the hands of insurers and potential employers, who might use it to make decisions about a person’s health care premiums or employment. 

But Danforth believes such technology serves an important public good. 

“It’s important that computers can do this,” he said. “It’s better if we can get somebody who [might] die by suicide in 2018 in front of a psychologist sooner because there’s something about their social media that made it clear to the machine that they needed help and it wasn’t obvious to the people around them.” 

That might not be a possibility you imagined when you first started uploading pictures to Instagram, but, if successful, an algorithm like that could change the way we detect and treat mental health conditions in the 21st century. 

If you want to talk to someone or are experiencing suicidal thoughts, text the Crisis Text Line at 741-741 or call the National Suicide Prevention Lifeline at 1-800-273-8255. Here is a list of international resources.


This new AP course could drastically improve diversity in tech

It’s no secret that tech has a diversity problem. 

The big-name Silicon Valley companies that release their staffing figures every year often seem embarrassed to do so, because their workplaces are not inclusive or representative of the larger population — even after those companies vocally commit to changing the status quo.

And every year, we argue over the same question: Why don’t firms like Yahoo, Facebook, and Google hire more women and people of color? 

Hadi Partovi, CEO of Code.org, a nonprofit that works to create more diversity in tech, believes there’s finally one promising solution within reach. 

That would be a popular new Advanced Placement course called Computer Science Principles. It debuted in classrooms nationwide last year as a broader introduction to computer science. Instead of focusing narrowly on programming, as the traditional course and exam do, CS Principles teaches subjects like networking, big data, cybersecurity, and app development.

“The traditional AP exam is if you want to become a coder,” Partovi says. “The new exam is if you want to become a well-educated, well-rounded person.” 

It also has the potential to fundamentally change what’s known as the “pipeline problem,” or what tech companies describe as the difficulty of recruiting underrepresented applicants out of college and into tech jobs because they comprise a minority of graduates with a degree in computer science.  

After just one year in high schools, the course more than doubled the number of female and underrepresented students who took a computer science AP exam. In 2016, only 12,642 female students took the traditional exam. This year, that number trended upwards as it has in previous years, and an additional 15,028 female students took the new test.

Similarly, the participation of “underrepresented minorities,” which includes African American and Hispanic students, increased by nearly 170 percent from 2016 to 2017, with 13,024 students taking the new exam. Seventy percent of students received a passing score on the CS Principles test. 

Partovi expected a surge in the number of students taking the new AP course and exam, but he was “ecstatic” to see participation in computer science at the high school level skyrocket. 

He credits that partly to the course’s widespread appeal. It’s a modern take on computer science that relates to the internet and app development, even requiring students to submit a portfolio of app projects in order to complete the exam. 

Code.org, along with other educational organizations, helped develop the curriculum for the course and has trained teachers with no computer science background how to teach the material in their classrooms. 

Partovi is hopeful that the new AP course and exam will encourage more female and underrepresented students to pursue computer science in college and in their careers. That shift could be evident within four to five years as the first crop of students finishes their bachelor’s degrees and enters the job market. 

Trevor Packer, senior vice president of Advanced Placement and Instruction at the College Board, comments on the new AP CS test.

Whether or not that can help reduce the so-called pipeline problem is a controversial subject. Many critics believe the challenges to diversifying tech’s ranks have more to do with a culture that is unwelcoming or dismissive of people whose backgrounds and experiences don’t match the stereotype of the brogrammer in a hoodie.  

Partovi believes that both insular culture and pipeline challenges are at the heart of why tech companies haven’t improved at hiring more diverse employees. If the AP course and test give young students the confidence and knowledge to stick with computer science in college when they might not have otherwise done so, it will make an important difference. 

“The trend is in the right direction,” says Partovi.

That’s good news that the tech industry — and those who hope to join it one day — desperately needs.


4 things you should do when you’re the victim of revenge porn

Rob Kardashian decided on Wednesday to exact revenge on his former fiancée Blac Chyna by publishing nude and explicit images of her on social media. Now the internet is talking about one of the worst risks of our digital lives: so-called revenge porn.

Also known as nonconsensual porn, the act of publishing intimate photos meant to stay private in order to shame or emotionally harm someone is a crime in many states. In Kardashian’s case, the posts were ultimately removed from both Instagram and Twitter, and Chyna’s lawyer told People that she’s considering all “legal remedies.” 

“If you’re a good person, you don’t want to believe someone did something bad like this.”

While Chyna has a high-powered lawyer at her side, most victims of nonconsensual porn don’t know how to get such images removed from the internet, when to contact the police, or how to collect preliminary evidence. 

Anisha Vora knows this process all too well. In 2012, an ex-boyfriend posted nude images of her online, along with her contact information. Since then, Vora has become a victim advocate; for a time, she volunteered for the Cyber Civil Rights Initiative (CCRI), a nonprofit advocacy organization that provides resources to people whose images are posted online without their consent.

“If you’re a good person, you don’t want to believe someone did something bad like this,” Vora says.

While the shock of betrayal can be hard to shake, Vora recommends focusing on the following four steps to protect your privacy and online reputation.

1. Report the nonconsensual images to the social media platform where they were posted. 

You might be tempted to immediately contact the person responsible for posting your intimate images, but Vora advises against reaching out. Your ex, she says, might try to convince you they’ve been hacked or urge you not to report their behavior to the authorities. 

If you want the images taken down, your best bet is to contact the social media platforms where they were published. Facebook, Instagram, Snapchat, Tumblr, and Twitter all have guidelines for how to report images that violate the platforms’ terms of service. Every company uses different language to describe such abuse, so you should include in any written complaint that the images contain nudity and were posted without your consent.

For more on what to expect, the CCRI maintains comprehensive instructions on how to file reports with major social media companies and the biggest search engines. 

2. Screenshot the images. 

Screenshots are a short-term strategy for gathering evidence that you can share with social media companies or the police. In addition to documenting the images themselves, you’ll want to screenshot search engine results of your name and any messages or friend requests you’ve received after the images and your identity were published. Being able to show the negative impact of nonconsensual porn on your reputation and identity can be useful if you want to build a legal case.  

Ultimately, your case won’t come down to the screenshots themselves, since companies and investigators can collect digital information like a user’s account history and the IP address associated with the published images. Yet having the screenshots compiled and printed can be a valuable resource for discussing the case with investigators or a lawyer. 

3. File a report with the police. 

Dozens of states and Washington, D.C., have laws against nonconsensual pornography. Vora recommends filing a police report regardless of whether it’s illegal where you live. In some cases, publishing the images might not be punishable by law, but the behavior involved can qualify as harassment or stalking. The report also creates a paper trail at the outset, and can be useful if you want to file charges or pursue a civil suit. 

Even if you don’t want to take legal action at first, Vora says the harassment can escalate to the point where you might need to in the future, particularly if your identity and contact information are disclosed along with the photos. C.A. Goldberg, a New York-based law firm that specializes in nonconsensual porn cases, keeps a list of the state laws here, and you can read more about filing a police report in CCRI’s FAQ. Additionally, CCRI maintains a list of lawyers across the country who’ve volunteered to represent victims pro bono or for a reduced fee. 

4. Consider hiring a takedown service to remove your images. 

Once images have been posted on social media or to websites, they can quickly spread across the internet, thanks to users who screenshot or download them before the posts have been removed. That can make it exhausting to keep track of where the images appear. While Vora’s images first appeared on just three websites, they’ve since shown up on more than 4,000 sites. 

“Knowing that someone else is handling that, it’s such a weight off my shoulders.” 

You can try to handle the takedown requests yourself, but that can be time-consuming and occasionally risky. Some porn websites are run by small-time purveyors and you won’t necessarily know who you’re dealing with. If someone offers to take down the images for a fee, Vora says it’s a scam.  

She instead recommends using a reputable takedown service, but only after making sure that investigators have collected the data they need to build your case. Takedown services, which track where your images appear and contact the site owners directly, can be costly depending on the extent of tracking you need. Vora estimates that the annual cost can run anywhere from $100 to $1,000. The CCRI recommends DMCA Defender and Copybyte as trustworthy companies. 

“Knowing that someone else is handling that, I don’t have to think twice or worry about it,” Vora says. “It’s such a weight off my shoulders.” 


Rob Kardashian owes Blac Chyna, and all of us, an apology

Rob Kardashian wants revenge on his ex-fiancée, Blac Chyna, and he could go to jail in pursuit of it. 

Kardashian made that clear Wednesday when he relentlessly posted nude images designed to shame and embarrass Chyna on social media. The pair, who have an infant daughter, are again fighting in public, this time over accusations involving infidelity and betrayal. 

While nonconsensual porn is terrible no matter who shares it, Kardashian bears a higher level of responsibility given that he used his fame and social media megaphone to exact revenge. By posting the private images, Kardashian endorsed the practice of shaming a woman for her sexuality.

He already owes Chyna an apology for his social media abuse, but the public deserves one too. 

The saga began when Kardashian posted to Instagram a series of explicit images that he appeared to have taken or that Chyna had sent to him. When his account was disabled, presumably for violating the platform’s terms of service, he turned to Twitter. Within about 30 minutes, Twitter had removed the nude images of Chyna that Kardashian shared, though his account remains active. 

Image: rob kardashian / twitter

While Kardashian can still reach his 7.4 million Twitter followers, he may land in jail for the juvenile outburst. Nonconsensual porn, or “revenge porn,” is illegal in California and can be punishable by jail time and a fine, though not all states have similar laws. In California, images or recordings of intimate body parts meant to stay private cannot be distributed with the intent to cause emotional distress, according to the law. If the victim is indeed psychologically harmed, the perpetrator may be guilty of a misdemeanor.

You might think that violating the companies’ terms of service, doing something potentially illegal, and terrorizing someone you once loved in front of an audience of millions could convince Kardashian to take a deep breath and step back from social media. But his string of text-only tweets documenting Chyna’s alleged betrayals suggests he thought only of channeling his rage and disappointment. 

“But she couldn’t remain loyal and cheated and f**ked way too many people and she got caught and now this is all happening and it’s sad,” Kardashian wrote on Twitter. 

So in case it wasn’t clear already: Don’t be like Rob Kardashian. Don’t use your heartbreak to justify emotional terror. If you feel manipulated by a partner, think of a more productive way to address that pain than resorting to the age-old misogyny of calling her a slut in the public square. Therapy, anger management — even a long walk in the woods could help. 

Neither Instagram nor Twitter would comment on Kardashian’s individual account. “We want to maintain a safe and supportive environment on Instagram, and we work quickly to remove reported content that violates our community guidelines,” a spokesperson for Instagram said. 

Two high-profile lawyers who specialize in nonconsensual porn cases both tweeted about Kardashian’s posts. Carrie Goldberg, founder of the firm C.A. Goldberg, told People that Kardashian faces an uphill legal battle.

“This is sort of the classic, quintessential revenge porn,” she said. “Rob has made the work of a prosecutor or a victim’s attorney quite easy so far as to even post about the very motive behind his outrageous act of posting these private and nude photos of his ex.”

Lisa Bloom, of The Bloom Firm, linked to coverage of Kardashian’s tweets and wrote, “Revenge porn is a crime, a civil wrong, a form of domestic partner violence, and a violation of social media’s terms of use. And truly sick.” 

But you shouldn’t need lawyers to convince you of why nonconsensual porn is so devastating. There are countless first-person accounts, many from women who won’t share their last name on dates, never know what prospective employers will find out about them in a Google search, and who are even stalked and harassed by strangers who saw their picture and personal information on the internet. If such a threat doesn’t seem real, just pick up a recent issue of Time, which documented those experiences in horrifying detail. 

Despite the work of victims and advocates to raise awareness of nonconsensual porn, Kardashian’s cruel tantrum arguably just gave people in possession of intimate images an excuse to feel satisfied — vindicated, even — in making those pictures public. 

For that he owes everyone a sincere apology — at the very least.


‘Gender champions’ in rural Kenya work with new hotline to protect women and girls

[embedded content]

Drought in a developing country can mean many things: a lack of water, a lack of food and nutrition, and a lack of economic growth that puts even more pressure on impoverished communities relying on farming for their livelihoods. 

For women and girls, it also means a lack of protection. In Africa, women do 90 percent of the work of gathering water and wood. During a drought, they have to walk even longer distances to find potable water for themselves and their families. That makes them more susceptible to violence and attacks from men in remote areas — which often go unreported.

But a new initiative in rural northern Kenya turns to technology and members of the community to make the region safer, and put an end to gender-based violence.

The Building Resilience and Adaptation to Climate Extremes and Disasters (BRACED) program has launched a new hotline and “gender-support desk” in Wajir, northeastern Kenya. According to the Thomson Reuters Foundation, community members, police officers, health workers, and more have all joined forces to offer a toll-free number for girls who have been attacked, with the goal of bringing the predators to justice.

“We will continue working on this until Wajir County will be free of gender-based violence.”

“In my community, if a girl is raped, we will give you an animal and that’s the end of the story,” Sophie, the “oldest activist in Wajir,” says in the video above.

The hotline initiative, funded by the UK Department for International Development (DFID) and led by humanitarian organization Mercy Corps, aims to change that.

When a girl calls the number, facilitators alert police officers and health workers who not only investigate the situation, but also provide the survivor with “moral and medical support.” Once allegations are confirmed, the gender desk will help her bring the case to court.

Meanwhile, men who want to make a difference in the community — dubbed “gender champions” — are encouraged to speak out on local radio shows to denounce violence against women and promote gender equality.

While effective in its approach (and its specificity), this isn’t exactly the first hotline of its kind in rural Kenya to help girls. In 2014, for example, the Kenyan government launched a hotline to curb female genital mutilation (FGM) and child marriage.

Approximately 45 percent of women in Kenya between the ages of 15 and 49 have experienced physical or sexual violence, according to USAID. In Wajir, there have already been 10 reported cases of rape this year. The severe drought, which has affected 17 African countries for more than two years, has worsened the problem in northern Kenya, where there’s less rainfall than in the rest of the country. People often come into conflict over land and water, and some use violence against women as a way to send a message.

With the prevalence of mobile phones throughout the country, using a hotline to tackle the issue is a smart and impactful move.

“We will continue working on this until Wajir County will be free of gender-based violence, free from violation of human rights,” Sophie says.

You can read more about the initiative and its impact here.

[H/T Thomson Reuters Foundation]
