Thousands of Facebook users are giving a big thumbs-down to the idea that the social network may be reading their unpublished thoughts.
An online petition on Care2, a social network for healthy living, boasts more than 27,000 “signatures” from people who want Facebook to “stop stalking our unposted thoughts!” The group is referring to posts, comments and status updates that people type out, then delete before ever sharing them.
The problem with the petition: Facebook says it isn’t collecting users’ unposted thoughts.
“Facebook does not collect or track any content that people have chosen not to post,” a Facebook spokesperson told Mashable on Friday.
It appears the petition’s creators are acting on a recent study authored by a Facebook data scientist and a former company intern. The study was the focus of a mid-December article from Slate, and explored “self-censorship” among Facebook users. In other words, how often do users type out posts, status updates or comments, then delete the text before publishing?
The study examined 3.9 million Facebook users over a 17-day period and found that 71% of them “self-censored” at least once.
The study was meant to identify when people typed and then deleted, not what people typed and then deleted.
The study goes on to explain that user data was anonymized, meaning no activity was tied back to any one individual’s identity. “Content of self-censored posts and comments was not sent back to Facebook’s servers: Only a binary value that content was entered at all,” according to the study.
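To make that distinction concrete, the sketch below shows what client-side instrumentation of this kind could look like: the only thing reported is a boolean flag that a draft was typed and then abandoned, never the text itself. This is a minimal illustration, not Facebook’s actual code; the names, the data shape, and the threshold for what counts as “content entered” are all assumptions.

```typescript
// Hypothetical event shape: note there is no field for the draft's text,
// only a binary value indicating that content was entered at all.
interface SelfCensorshipEvent {
  surface: "post" | "comment"; // where the composer appeared
  contentEntered: boolean;     // the single binary value sent back
}

// Placeholder transport; a real client would batch events to a server.
function logEvent(event: SelfCensorshipEvent): void {
  console.log("self-censorship event:", JSON.stringify(event));
}

// "Self-censored" here means text was typed but never published.
// The study's exact criteria are not spelled out in this article, so a
// simple non-empty check stands in for whatever threshold was used.
function trackDraftOutcome(draft: string, wasPublished: boolean): void {
  const contentEntered = draft.trim().length > 0;
  if (contentEntered && !wasPublished) {
    logEvent({ surface: "post", contentEntered: true });
  }
}

// Example: a user types a post, then deletes it instead of publishing.
trackDraftOutcome("I probably shouldn't say this...", false);
```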
Michael McTernan started the Care2 petition after reading the article published by Slate last month. McTernan says he often thinks twice before posting to social sites like Facebook. “Who will see this, and what will they think of me if I post it?” he wrote in an email to Mashable.
McTernan is concerned with Facebook gathering metadata around users’ unpublished thoughts, even if the company isn’t collecting the actual content, he says.
“I think the fact they released a study on user behavior that acknowledges people do this, and that Facebook sees this as a problem, is a huge red flag,” he wrote. “Also, the fact the study was released and they are being pedantic about whether or not the content is collected is indicative of a corporation with fast and loose ethics.”
The technology is still in place and could be activated at any time to gather information on whether users are self-censoring. (Again, Facebook says the actual content would not be collected.)
This might be handy to Facebook if, for example, the company were testing a new product or feature. It could see whether a feature or policy change caused users to self-censor more often, or whether something as simple as a redesign of the text box led to increased self-censorship.
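As a rough sketch of how such aggregated binary signals might feed an A/B test, the snippet below compares the self-censorship rate of a control group against a group seeing a redesigned composer. The cohort numbers are made up for illustration, and the names are hypothetical, not a real Facebook pipeline.

```typescript
// Per-variant aggregate: how many exposed users abandoned at least
// one draft. Only counts are needed, never any draft content.
interface CohortStats {
  users: number;        // users exposed to this variant
  selfCensored: number; // users who self-censored at least once
}

function selfCensorshipRate(stats: CohortStats): number {
  return stats.users === 0 ? 0 : stats.selfCensored / stats.users;
}

// Illustrative numbers only (the control rate echoes the study's 71%).
const control: CohortStats = { users: 100_000, selfCensored: 71_000 };
const redesign: CohortStats = { users: 100_000, selfCensored: 74_500 };

const delta = selfCensorshipRate(redesign) - selfCensorshipRate(control);
console.log(
  `Rate change after redesign: ${(delta * 100).toFixed(1)} percentage points`
);
// A positive delta would suggest the redesign makes users hold back more.
```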
Source: Mashable