
Yellow Journalism in the Digital Age


  • Yellow Journalism in the Digital Age

    Facebook admits manipulating users' emotions by modifying news feeds

    Website publishes details of experiment in which users can be made to feel better or worse through 'emotional contagion'

    It already knows whether you are single or dating, the first school you went to and whether you like or loathe Justin Bieber. But now Facebook, the world's biggest social networking site, is facing a storm of protest after it revealed it had discovered how to make users feel happier or sadder with a few computer keystrokes.

    It has published details of a vast experiment in which it manipulated information posted on 689,000 users' home pages and found it could make people feel more positive or negative through a process of "emotional contagion".

    In a study with academics from Cornell and the University of California, Facebook filtered users' news feeds – the flow of comments, videos, pictures and web links posted by other people in their social network. One test reduced users' exposure to their friends' "positive emotional content", resulting in fewer positive posts of their own. Another test reduced exposure to "negative emotional content" and the opposite happened.
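
    The outcome measure described above is simple word counting: the researchers scored what percentage of the words in each user's subsequent status updates were positive or negative, using the LIWC2007 word lists. A minimal sketch of that measurement in Python, with tiny invented word lists standing in for LIWC:

[CODE]
# Sketch of the study's outcome measure: the share of positive and
# negative words in a user's status updates. The real study used the
# LIWC2007 word lists; these tiny sets are illustrative stand-ins.

POSITIVE = {"happy", "great", "love", "wonderful", "glad"}
NEGATIVE = {"sad", "awful", "hate", "terrible", "angry"}

def emotion_shares(status_updates):
    """Return (positive %, negative %) of words across a user's updates."""
    words = [w.strip(".,!?").lower()
             for update in status_updates
             for w in update.split()]
    if not words:
        return 0.0, 0.0
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return 100 * pos / len(words), 100 * neg / len(words)

# e.g. updates written after positive posts were filtered from a feed
print(emotion_shares(["Feeling sad today", "What an awful commute"]))
[/CODE]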

    The study concluded: "Emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks."

    Lawyers, internet activists and politicians said this weekend that the mass experiment in emotional manipulation was "scandalous", "spooky" and "disturbing".

    On Sunday evening, a senior British MP called for a parliamentary investigation into how Facebook and other social networks manipulated emotional and psychological responses of users by editing information supplied to them.

    Jim Sheridan, a member of the Commons media select committee, said the experiment was intrusive. "This is extraordinarily powerful stuff and if there is not already legislation on this, then there should be to protect people," he said. "They are manipulating material from people's personal lives and I am worried about the ability of Facebook and others to manipulate people's thoughts in politics or other areas. If people are being thought-controlled in this kind of way there needs to be protection and they at least need to know about it."

    A Facebook spokeswoman said the research, published this month in the US journal Proceedings of the National Academy of Sciences, was carried out "to improve our services and to make the content people see on Facebook as relevant and engaging as possible".

    She said: "A big part of this is understanding how people respond to different types of content, whether it's positive or negative in tone, news from friends, or information from pages they follow."

    But other commentators voiced fears that the process could be used for political purposes in the run-up to elections, or to encourage people to stay on the site by feeding them happy thoughts and so boosting advertising revenues.

    In a series of Twitter posts, Clay Johnson, the co-founder of Blue State Digital, the firm that built and managed Barack Obama's online campaign for the presidency in 2008, said: "The Facebook 'transmission of anger' experiment is terrifying."

    He asked: "Could the CIA incite revolution in Sudan by pressuring Facebook to promote discontent? Should that be legal? Could Mark Zuckerberg swing an election by promoting Upworthy [a website aggregating viral content] posts two weeks beforehand? Should that be legal?"

    It was claimed that Facebook may have breached ethical and legal guidelines by not informing its users they were being manipulated in the experiment, which was carried out in 2012.

    The study said altering the news feeds was "consistent with Facebook's data use policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research".

    But Susan Fiske, the Princeton academic who edited the study, said she was concerned. "People are supposed to be told they are going to be participants in research and then agree to it and have the option not to agree to it without penalty."

    James Grimmelmann, professor of law at the University of Maryland, said Facebook had failed to gain "informed consent" as defined by the US federal policy for the protection of human subjects, which demands explanation of the purposes of the research and the expected duration of the subject's participation, a description of any reasonably foreseeable risks and a statement that participation is voluntary. "This study is a scandal because it brought Facebook's troubling practices into a realm – academia – where we still have standards of treating people with dignity and serving the common good," he said on his blog.

    It is not new for internet firms to use algorithms to select content to show to users, and Jacob Silverman, author of Terms of Service: Social Media, Surveillance, and the Price of Constant Connection, told Wired magazine on Sunday that the internet was already "a vast collection of market research studies; we're the subjects".

    "What's disturbing about how Facebook went about this, though, is that they essentially manipulated the sentiments of hundreds of thousands of users without asking permission," he said. "Facebook cares most about two things: engagement and advertising. If Facebook, say, decides that filtering out negative posts helps keep people happy and clicking, there's little reason to think that they won't do just that. As long as the platform remains such an important gatekeeper – and their algorithms utterly opaque – we should be wary about the amount of power and trust we delegate to it."

    Robert Blackie, director of digital at Ogilvy One marketing agency, said the way internet companies filtered information they showed users was fundamental to their business models, which made them reluctant to be open about it.

    "To guarantee continued public acceptance they will have to discuss this more openly in the future," he said. "There will have to be either independent reviewers of what they do or government regulation. If they don't get the value exchange right then people will be reluctant to use their services, which is potentially a big business problem."


  • #2
    Re: Yellow Journalism in the Digital Age

    Originally posted by don View Post
    Facebook admits manipulating users' emotions by modifying news feeds

    Website publishes details of experiment in which users can be made to feel better or worse through 'emotional contagion'

    ...In a study with academics from Cornell and the University of California, Facebook filtered users' news feeds – the flow of comments, videos, pictures and web links posted by other people in their social network. One test reduced users' exposure to their friends' "positive emotional content", resulting in fewer positive posts of their own. Another test reduced exposure to "negative emotional content" and the opposite happened.

    The study concluded: "Emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks."...

    What's the big deal? Isn't this the same reason we've paid so much attention over the decades to all the mindless (but apparently emotionally satisfying) mush on television? They needed to run an "experiment" to discover this? All they needed to do was watch a few summer beer adverts to see the influence of "reduced exposure to negative emotional content" :-)
    Last edited by GRG55; June 29, 2014, 06:02 PM.

    Comment


    • #3
      Re: Yellow Journalism in the Digital Age

      http://www.telegraph.co.uk/technolog...-emotions.html

      http://www.independent.co.uk/life-st...d-9571004.html
      Last edited by vt; June 29, 2014, 08:41 PM.

      Comment


      • #4
        Re: Yellow Journalism in the Digital Age

        Originally posted by GRG55 View Post
        What's the big deal? Isn't this the same reason we've paid so much attention over the decades to all the mindless (but apparently emotionally satisfying) mush on television? They needed to run an "experiment" to discover this? All they needed to do was watch a few summer beer adverts to see the influence of "reduced exposure to negative emotional content" :-)
        no big deal at all . . .

        I’ve spent pretty much all day reading as much as possible about the extremely controversial Facebook “emotional contagion” study in which the company intentionally altered its news feed algorithm to see if it could manipulate its users’ emotions. In case you weren’t aware, Facebook is always altering your news feed under the assumption that there’s no way they could fill your feed with all of your “friends’” pointless, self-absorbed, dull updates (there’s just too much garbage).

        As such, Facebook filters your news feed all the time, something which advertisers must find particularly convenient. In any event, the particular alteration under question occurred during one week in January 2012, and the company filled some people’s feeds with positive posts, while others were fed more negative posts.

        Once the data was compiled, academics from the University of California, San Francisco and Cornell University were brought in to analyze the results. Their findings were then published in the prestigious Proceedings of the National Academy of Sciences. They found that:

        For people who had positive content reduced in their News Feed, a larger percentage of words in people’s status updates were negative and a smaller percentage were positive. When negativity was reduced, the opposite pattern occurred. These results suggest that the emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks.

        You probably know most of this already, but here is where it starts to get really strange. Initially, the press release from Cornell highlighting the study said at the bottom: “The study was funded in part by the James S. McDonnell Foundation and the Army Research Office.” Once people started asking questions about this, Cornell claimed it had made a mistake, and that there was no outside funding. Jay Rosen, Journalism Professor at NYU, seems to find this highly questionable. He wrote on his Facebook page that:


        Strange little turn in the story of the Facebook “emotional contagion” study. Last month’s press release from Cornell highlighting the study had said at the bottom: “The study was funded in part by the James S. McDonnell Foundation and the Army Research Office.”

        Why would the military be interested? I wanted to know. So I asked Adam D.I. Kramer, the Facebook researcher, that question on his Facebook page, where he has posted what he called a public explanation. (He didn’t reply to my or anyone else’s questions.) See: https://www.facebook.com/akramer/pos...52987150867796

        Now it turns out Cornell was wrong! Or it says it was wrong. The press release now reads: “Correction: An earlier version of this story reported that the study was funded in part by the James S. McDonnell Foundation and the Army Research Office. In fact, the study received no external funding.”

        Why do I call this strange? Any time my work has been featured in an NYU press release, the PR officers involved show me drafts and coordinate closely with me, for the simple reason that they don’t want to mischaracterize scholarly work. So now we have to believe that Cornell’s Professor of Communication and Information Science, Jeffrey Hancock, wasn’t shown or didn’t read the press release in which he is quoted about the study’s results (weird) or he did read it but somehow failed to notice that it said his study was funded by the Army when it actually wasn’t (weirder).

        I think I would notice if my university was falsely telling the world that my research was partially funded by the Pentagon… but, hey, maybe there’s an innocent and boring explanation that I am overlooking.


        It gets even more interesting from here. The Professor of Communication and Information Science, Jeffrey Hancock, whom Mr. Rosen mentions above, has a history of working with the U.S. military, specifically the Minerva Initiative. In case you forgot what this is, the Guardian reported on it earlier this year. It explained:

        A US Department of Defense (DoD) research program is funding universities to model the dynamics, risks and tipping points for large-scale civil unrest across the world, under the supervision of various US military agencies. The multi-million dollar program is designed to develop immediate and long-term “warfighter-relevant insights” for senior officials and decision makers in “the defense policy community,” and to inform policy implemented by “combatant commands.”

        Launched in 2008 – the year of the global banking crisis – the DoD ‘Minerva Research Initiative’ partners with universities “to improve DoD’s basic understanding of the social, cultural, behavioral, and political forces that shape regions of the world of strategic importance to the US.”
        SCG News has written one of the best articles I have seen yet on the links between the Facebook study and the Department of Defense. It notes:


        In the official credits for the study conducted by Facebook you’ll find Jeffrey T. Hancock from Cornell University. If you go to the Minerva Initiative website you’ll find that Jeffrey Hancock received funding from the Department of Defense for a study called “Cornell: Modeling Discourse and Social Dynamics in Authoritarian Regimes”. If you go to the project site for that study you’ll find a visualization program that models the spread of beliefs and disease.

        Cornell University is currently being funded for another DoD study right now called “Cornell: Tracking Critical-Mass Outbreaks in Social Contagions” (you’ll find the description for this project on the Minerva Initiative’s funding page).
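
        For what it's worth, a "visualization program that models the spread of beliefs and disease" is usually a network-contagion simulation. Here is a minimal, purely hypothetical sketch of the general idea; the graph, seed set and adoption probability are invented, and nothing below is taken from the actual Minerva-funded project:

[CODE]
import random

# Hypothetical sketch of a belief/disease contagion model of the kind
# the quoted project page reportedly visualizes: an independent-cascade
# process over a social graph. All numbers here are made up.

def spread(graph, seeds, p=0.2, rng=None):
    """Return the set of nodes a belief reaches from the seed nodes."""
    rng = rng or random.Random(42)  # fixed seed for a repeatable demo
    adopted, frontier = set(seeds), list(seeds)
    while frontier:
        node = frontier.pop()
        for neighbor in graph.get(node, []):
            # Each exposure converts a susceptible neighbor with probability p.
            if neighbor not in adopted and rng.random() < p:
                adopted.add(neighbor)
                frontier.append(neighbor)
    return adopted

# A toy five-person social graph, seeded with one initial believer.
graph = {"a": ["b", "c"], "b": ["c", "d"], "c": ["d"], "d": ["e"]}
print(spread(graph, seeds={"a"}))
[/CODE]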


        So I went ahead and looked at the study mentioned above, and sure enough I found this:

        [Screenshot: the Minerva Initiative funding page, listing Jeff Hancock on the Cornell project]

        There he is, Jeff Hancock, the same guy who analyzed the Facebook data for Cornell, which initially claimed funding from the Pentagon and then denied it.

        I call bullshit. Stinking bullshit.

        So it seems that Facebook and the U.S. military are likely working together to study civil unrest and work on ways to manipulate the masses into apathy or misguided feelings of contentment in the face of continued banker and oligarch theft. That possibility is extremely disturbing, but the affair would be troubling even without the military connection.

        For one thing, although governments and universities need to take certain precautions when conducting such "research," private companies like Facebook apparently do not. Rather, all they have to do is get people to click "I accept" on a terms of service agreement they never read, which allows companies to do almost anything they want to you, your data and your emotions. What we basically need to do as a society is completely update our laws. For starters, if a private corporation is going to, let's say, totally violate your most basic civil liberties as defined under the Bill of Rights, a simple terms of service agreement should not be sufficient. For more invasive violations of such rights, perhaps a one-page, simple-to-read document explaining clearly which of your basic civil liberties you are giving away should be mandatory.

        For example, had Facebook not partnered at the university level to analyze this data, we wouldn’t even know this happened at all. So what sort of invasive, mind-******* behavior do you think all these large corporations with access to your personal data are up to? Every. Single. Day.

        The Faculty Lounge blog put it perfectly when it stated:


        Academic researchers’ status as academics already makes it more burdensome for them to engage in exactly the same kinds of studies that corporations like Facebook can engage in at will. If, on top of that, IRBs didn’t recognize our society’s shifting expectations of privacy (and manipulation) and incorporate those evolving expectations into their minimal risk analysis, that would make academic research still harder, and would only serve to help ensure that those who are most likely to study the effects of a manipulative practice and share those results with the rest of us have reduced incentives to do so. Would we have ever known the extent to which Facebook manipulates its News Feed algorithms had Facebook not collaborated with academics incentivized to publish their findings?

        We can certainly have a conversation about the appropriateness of Facebook-like manipulations, data mining, and other 21st-century practices. But so long as we allow private entities freely to engage in these practices, we ought not unduly restrain academics trying to determine their effects. Recall those fear appeals I mentioned above. As one social psychology doctoral candidate noted on Twitter, IRBs make it impossible to study the effects of appeals that carry the same intensity of fear as real-world appeals to which people are exposed routinely, and on a mass scale, with unknown consequences. That doesn’t make a lot of sense. What corporations can do at will to serve their bottom line, and non-profits can do to serve their cause, we shouldn’t make (even) harder—or impossible—for those seeking to produce generalizable knowledge to do.

        I strongly dislike Facebook as a company. However, this is much bigger than just one experiment by Facebook with what appears to be military ties. What this is really about is the frightening reality that these sorts of things are happening every single day, and we have no idea it’s happening. We need to draw the lines as far as to what extent we as a society wish to be data-mined and experimented on by corporations with access to all of our private data. Until we do this, we will continue to be violated and manipulated at will.


        Michael Krieger

        Comment


        • #5
          Re: Yellow Journalism in the Digital Age

          Does this stuff remind anyone of what reggie was often saying?
          I.e. using internet as a method for selectively pushing information to control the masses?
          Also need to agree with the post above that previously this job could be done with tv, newspapers and magazines, which all are losing influence in the modern world...
          engineer with little (or even no) economic insight

          Comment


          • #6
            Re: Yellow Journalism in the Digital Age

            Is what some call the new Surveillance State simply what's always been, now expressed in a digital environment, or is it a qualitative change brought about by a confluence of historical forces?

            Comment


            • #7
              Re: Yellow Journalism in the Digital Age

              Originally posted by FrankL View Post
              Does this stuff remind anyone of what reggie was often saying?...
              Reggie is here with me, PCR and Tyler Durden. Always room for another.

              Comment


              • #8
                Re: Yellow Journalism in the Digital Age

                Originally posted by FrankL View Post
                Does this stuff remind anyone of what reggie was often saying?
                I.e. using internet as a method for selectively pushing information to control the masses?
                Also need to agree with the post above that previously this job could be done with tv, newspapers and magazines, which all are losing influence in the modern world...
                Yes, it reminds me of reggie's postings.

                Comment


                • #9
                  Re: Yellow Journalism in the Digital Age

                  Originally posted by FrankL View Post
                  Does this stuff remind anyone of what reggie was often saying?
                  I.e. using internet as a method for selectively pushing information to control the masses?
                  Also need to agree with the post above that previously this job could be done with tv, newspapers and magazines, which all are losing influence in the modern world...
                  Yes. This is a part of what he was talking about.

                  Be kinder than necessary because everyone you meet is fighting some kind of battle.

                  Comment


                  • #10
                    Re: Yellow Journalism in the Digital Age

                    bit by bit . . .

                    US military studied how to influence Twitter users in Darpa-funded research

                    • Defense Department spent millions to research social networks
                    • Studies focused on Occupy and Middle East protests
                    • Projects also analysed memes, celebrities and disinformation


                    The activities of users of Twitter and other social media services were recorded and analysed as part of a major project funded by the US military, in a program that covers ground similar to Facebook’s controversial experiment into how to control emotions by manipulating news feeds.

                    Research funded directly or indirectly by the US Department of Defense’s military research department, known as Darpa, has involved users of some of the internet’s largest destinations, including Facebook, Twitter, Pinterest and Kickstarter, for studies of social connections and how messages spread.

                    While some elements of the multi-million dollar project might raise a wry smile – research has included analysis of the tweets of celebrities such as Lady Gaga and Justin Bieber, in an attempt to understand influence on Twitter – others have resulted in the buildup of massive datasets of tweets and additional types of social media posts.

                    Several of the DoD-funded studies went further than merely monitoring what users were communicating on their own, instead messaging unwitting participants in order to track and study how they responded.

                    Shortly before the Facebook controversy erupted, Darpa published a lengthy list of the projects funded under its Social Media in Strategic Communication (SMISC) program, including links to actual papers and abstracts.

                    The project list includes a study of how activists with the Occupy movement used Twitter, as well as a range of research on tracking internet memes and on understanding how influence behaviour (liking, following, retweeting) happens on popular social media platforms like Pinterest, Twitter, Kickstarter, Digg and Reddit.


                    “Through the program, Darpa seeks to develop tools to support the efforts of human operators to counter misinformation or deception campaigns with truthful information.”

                    However, papers leaked by NSA whistleblower Edward Snowden indicate that US and British intelligence agencies have been deeply engaged in planning ways to covertly use social media for purposes of propaganda and deception.



                    Earlier this year, the Associated Press also revealed the clandestine creation by USAid of a Twitter-like, Cuban communications network to undermine the Havana government. The network, built with secret shell companies and financed through a foreign bank, lasted more than two years and drew tens of thousands of subscribers. It sought to evade Cuba's stranglehold on the internet with a primitive social media platform.

                    Of the funding provided by Darpa, $8.9m has been channeled through IBM to a range of academic researchers and others. A further $9.6m has gone through academic hubs like Georgia Tech and Indiana University.

                    Facebook, the world’s biggest social networking site, has apologised for the study, which involved secret psychological tests on nearly 700,000 users in 2012 and prompted outrage from users and experts alike, saying it was “poorly communicated” to the public.

                    The experiment, which resulted in a scientific paper published in the March issue of Proceedings of the National Academy of Sciences, hid “a small percentage” of emotional words from people’s news feeds, without their knowledge, to test what effect that had on the statuses or “likes” that they then posted or reacted to.

                    However, it appears that Facebook was involved in at least one other military-funded social media research project, according to the records recently published by Darpa.

                    The research was carried out by Xuanhuai Wang, an engineering manager at Facebook, as well as Yi Chang, a lead scientist at Yahoo labs, and others based at the Universities of Michigan and Southern California.

                    The project, which related to how users understood and consumed information on Twitter, at one point analysed the tweets, retweets and other interactions spawned by Lady Gaga (described as “the most popular elite user on Twitter”) and Justin Bieber (“who is extremely popular among teenagers”).


                    [Photo: Facebook's COO, Sheryl Sandberg]

                    Several studies related to the automatic assessment of how well different people in social networks knew one another, through analysing frequency, tone and type of interaction between different users. Such research could have applications in the automated analysis of bulk surveillance metadata, including the controversial collection of US citizens’ phone metadata revealed by Snowden.
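
                    In code, that general approach amounts to a weighted count of interactions per user pair. A minimal sketch, with invented interaction types and weights (the funded studies would have fitted real models to data rather than hand-picking numbers like these):

[CODE]
# Hypothetical tie-strength scoring from interaction metadata, in the
# spirit of the research described above. The categories and weights
# are invented for illustration only.

INTERACTION_WEIGHTS = {
    "direct_message": 3.0,  # private channels suggest closer ties
    "reply": 2.0,
    "mention": 1.5,
    "retweet": 1.0,
    "like": 0.5,            # low-effort signals count least
}

def tie_strength(interactions):
    """Score one user pair from (interaction_type, count) pairs."""
    return sum(INTERACTION_WEIGHTS.get(kind, 0.0) * count
               for kind, count in interactions)

# Example: a pair with frequent replies and likes but no private messages.
print(tie_strength([("reply", 12), ("like", 40), ("retweet", 3)]))
[/CODE]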

                    Studies which received military funding channeled through IBM included one called "Modeling User Attitude toward Controversial Topics in Online Social Media", which analysed Twitter users’ opinions on fracking.



                    One of multiple studies looking into how to spread messages on the networks, titled “Who Will Retweet This? Automatically Identifying and Engaging Strangers on Twitter to Spread Information”, did just that.

                    The researchers explained: “Since everyone is potentially an influencer on social media and is capable of spreading information, our work aims to identify and engage the right people at the right time on social media to help propagate information when needed.”
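
                    As a rough illustration of the “identify and engage the right people” step, one could imagine ranking candidate users with a hand-weighted score. The features and weights below are invented; the paper itself describes trained models rather than a fixed formula like this:

[CODE]
# Hypothetical ranking of candidate users to engage, illustrating the
# idea quoted above. Features and weights are invented; the actual
# paper trained models on Twitter data.

def spread_score(user):
    """Heuristic likelihood that a stranger will pass a message on."""
    return (0.4 * user["retweets_per_day"]      # habitual sharers
            + 0.3 * user["topic_overlap"]       # 0..1 match with the message
            + 0.2 * user["followers"] ** 0.5    # reach, damped
            + 0.1 * user["recent_activity"])    # 0..1, active right now

candidates = [
    {"name": "a", "retweets_per_day": 5, "topic_overlap": 0.9,
     "followers": 400, "recent_activity": 1.0},
    {"name": "b", "retweets_per_day": 1, "topic_overlap": 0.2,
     "followers": 10000, "recent_activity": 0.1},
]
best = max(candidates, key=spread_score)
print(best["name"], round(spread_score(best), 2))
[/CODE]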



                    “According to federal regulations of human experimentation, for studies that don’t affect the environment of online users, and whereas one can freely gather online data – say, from the public Twitter feed – there is no requirement of informed consent. This is the framework under which our Twitter study was carried out; moreover, all our studies on Twitter look into aggregate collective phenomena and never at the individual level.”

                    A colleague added: “In our lab we study all aspects of the diffusion of information in social media.

                    “This work has broad applications as we strive to understand fundamental mechanisms of social communication, such as how ideas and ‘memes’ compete for our attention, how they sometimes go viral, etc.”

                    Comment
