The Golden Globes’ Social Team Is as Stressed About Meme-ing Celebs as You


When Tina Fey and Amy Poehler hosted the Golden Globes for the first time two years ago, their opening monologue featured many an LOL moment, but it wasn’t a joke that immortalized their gig. It was a high-five. “Ang Lee’s been nominated for Best Director for the Life of Pi…,” Poehler said, “…which is what I’m gonna call the six weeks after I take this dress off!” Then, with a perfect blend of irony and genuine glee, the two slapped hands, and it almost immediately became the enduring memory of the 2013 show—not who was nominated, not who won (like Jennifer Lawrence beating out Meryl Streep for Best Actress), but rather this one little gesture. The next day, few remembered who wore what, but the high-five GIF lived on in social feeds forevermore.


This likely happened for two reasons: First, people will make GIFs of almost anything and post them ad infinitum. Second, Fey and Poehler are two of the Internet’s most beloved broads, and when you pair them with the bloodsport that is live-reacting to event television, it’s only natural that their high-five would go viral (along with tweeted quotes, Lawrence’s “I beat Meryl” acceptance speech, and other oddities). “I worked on 30 Rock for many years and then worked on Parks and Rec, so I know how popular these two are with this [online] audience,” says Jared Goldsmith, NBC Entertainment’s head of digital marketing. “They’re able to convey a sensibility that’s in line with how people at home are watching.” But regardless of hosting brilliance, this kind of social potential exists any time average folk—armed with laptops, smartphones, and opinions—watch rich beautiful people get awards. Like the Oscars or the Olympics or the World Cup, many millions of people watch the Golden Globes, and many of them are simultaneously snarking about it on Twitter and posting the highlights on Tumblr.


The Globes went on to peak at 19,886 tweets per minute that night, up from a high of 7,472 TPM the year before. During last year’s telecast—also hosted, just like tomorrow night’s installment, by Poehler and Fey—that number went up again, to 28,117 tweets per minute (the TPM peaked when Breaking Bad won the award for best drama on TV). There were 2.4 million tweets total about the Globes telecast in 2014—a 39 percent increase over the year prior—and the volume of tweets (not to mention Facebook posts, Tumblr’d GIFs, and Instagram “likes”) will likely be even higher this weekend. Goldsmith anticipates it will be the “most social Golden Globes ever.”


The Hollywood Foreign Press Association, the nonprofit organization that hands out the Globes, knows this well. Five years ago the HFPA asked Michael Carter, then the chief information officer at the California Institute of the Arts (he’s now COO), to help with social media. In 2010, he covered the Globes by himself, armed only with a smartphone and a laptop; “it was quite a task,” he says. This year he has a team of five helping him. And in addition to their constant output from the event, the HFPA has forged partnerships with Instagram (to have fashion photographer Ellen von Unwerth post pics of winners), Facebook (for a “Facebook Lounge” where celebs can answer online questions), and Twitter, which will host a “GIF Mirror” among other things. (They’re also bringing a “Kick Cam” to capture celeb shoes—a move that prevents Elisabeth Moss from flipping Twitter the bird like she did E! on their red carpet “Mani Cam” last year.)


The goal for the Globes’ social media endeavors, Carter says, is to show the audience things they can’t see on TV—but this year he’s also making sure those watching on their second screen will have video content as well. “We’ll be sharing live video from the actual show as it happens,” he says. “So, if there’s a cool moment during the show, you’ll be able to watch it online as well.” In other words, if Tina and Amy do something hysterical, you no longer have to wait for some friendly netizen to rip the clip and put it on YouTube.


But at the same time, waiting to see what the Internet will seize on is part of the fun. Does being hand-fed the night’s memes by companies—whose sole aim is, as Goldsmith says, to “extend the couch”—take away their luster? Things go viral organically, after all, not because they came from any official feed. Tweeting and Facebooking during televised live events became the new hotness because those things served as backchannels, a way to talk about the gowns and glamor without being a correspondent from E! Will there come a time when @GoldenGlobes tweeting the most popular Vine of the night makes it feel like that moment your mom friended you on Facebook?


Perhaps, but not if Carter and his team have anything to do with it. For one, he acknowledges that they’re just trying to keep up with the online conversation like everyone else. (There’s a monitor in the social team’s “war room” just for this purpose.) And, like the best friend of a celeb who gets a last-minute invite to the ceremony, they’re also just happy to be there. “The chatter online about the Golden Globes is amazing,” he says. “It would happen without us—but it’s fun to be a part of it.”



New Novel Pits Darwin Against God, Indiana Jones-Style


James Morrow (photograph: Witek Kaczanowski)



James Morrow is widely regarded as the foremost satirist in science fiction. His new novel, Galapagos Regained, tells the story of a Victorian actress named Chloe Bathhurst who attempts to use Darwin’s theory of natural selection to disprove the existence of God. Morrow spent six years writing the book, then a few more trying to sell it. In recent years atheist-oriented films like The Golden Compass and Creation have faced a public backlash, and Morrow isn’t sure whether that might have made some publishers leery of the book.


“The publishers who turned it down came up with other reasons,” Morrow says in Episode 132 of the Geek’s Guide to the Galaxy podcast. “So I’ll never know if they thought it was just too incendiary.”


Galapagos Regained is a novel of ideas, full of politics, philosophy, and theology, but it’s also a globe-spanning tale bursting with battles, shipwrecks, and narrow escapes. In this Morrow was influenced by the journeys of Darwin himself. Today we tend to picture the famous scientist as a white-bearded patriarch or quiet invalid, but in fact his theories were shaped by a lifetime of wild adventure.


“The young Darwin was indeed this kind of Indiana Jones figure,” says Morrow. “And I very much had that in mind when I conceived of Chloe’s escapades.”


Those escapades, which take Chloe from the halls of Oxford to the Amazon jungle to the rocky shores of the Galapagos, are beset by uncertainty and doubt, but for Morrow the question of God is more clear-cut.


“If there were such a thing as a disproof of God, if there could be such a thing, it seems to me it would look a lot like Darwinian materialism coupled to the argument from evil,” he says. “And I think that one-two punch, for me, causes God to go belly up.”


Listen to our complete interview with James Morrow in Episode 132 of the Geek’s Guide to the Galaxy podcast (above), and check out some highlights from the discussion below.


James Morrow on South America:


“I wanted the book to be entertaining. I’ve always liked the truism that all art is entertainment, that all drama is melodrama—it doesn’t work the other way around, not all melodrama is drama and not all entertainment is art. But I love epics, I love Jules Verne. A lot of the South American material is an homage to Voltaire’s Candide. And I said, OK, I just want to see what happens if I put my characters down in that zone. … I had read several books about Amazonia as part of the research, because Chloe has to find her way across the continent of South America after she’s shipwrecked off the coast of Brazil. And when she gets to Peru, she gets caught up in the Great Rubber War. The research I’d done had given me a lot of information about the rubber industry and how horribly exploitative it was of the natives—of the Indian population. The mistreatment that’s documented maps on to the historical facts, sad to say. There was not actually an event that was called the Great Rubber War—that’s sort of a poetic conceit on my part—but a generation or so after the events of Galapagos Regained, there is a terrible conflict in the country of Colombia, on the Rio de Mayo, that does correspond to the middle section of my book.”


James Morrow on Charles Darwin:


“I can appreciate why many people would regard Darwin’s theory as bad news—he brought bad news back from the Galapagos Islands. But for me the story doesn’t end there. There’s something exhilarating, for me, about our interconnectedness to everything that’s alive right now, and has ever lived, and ever will live. I think Darwin’s sin, the reason he makes people so nervous, is not that he killed God, but that he replaced God. He didn’t just make a case for atheism, he also made a case for something that’s equivalent to God, it just happens to be materialist. … He replaced God with something that for me is far more magnificent than anything one finds in scripture, far more magnificent—complex, detailed, exhilarating, transcendent—than anything ever encountered in the zone of prophets insisting that their revelations are the case. He pushed the reset button on the whole of the Western psyche, you know, he refreshed the screen, except something brand new came up that we weren’t expecting. The Christian narrative … is beautiful, it’s coherent, it’s very satisfactory, but it doesn’t seem to have anything to do with the world that we’re actually in.”


James Morrow on Rosalind Franklin:


“[Her] work was crucial to Watson and Crick’s unraveling the structure of the DNA molecule—the X-ray photographs she took and also her interpretation of those photographs. It’s sometimes forgotten that what James Watson pilfered from her files was not simply the pictures, but was her understanding that the phosphate chains of the DNA molecule were anti-parallel, and this strongly suggests a double helix. Rosalind Franklin was famously and notoriously ignored and forgotten. When Watson and Crick and Maurice Wilkins shared their Nobel Prize, they did not even mention her from the podium. … If you read James Watson’s book called The Double Helix, while it offers many fascinating insights into how scientific research actually progresses and what an all-too-human enterprise it is, he takes such a sardonic and childish and—to be sure—sexist view of Rosalind Franklin that you want to throw the book across the room.”


James Morrow on Teilhard de Chardin:


“This was a time when the Catholic Church was not reconciled to Darwin. They’ve done better in recent generations, although I think there’s still more work to do. He was almost kicked out of the Jesuit order, given his passion for evolutionary theory. I think the Holy Office—which is the euphemism for the Inquisition—the Holy Office regarded him as a borderline heretic. … [He was] quite unequivocal that evolution had occurred on this planet, that Darwin had nailed it, that the theory of natural selection accounted for the transmutation of species in a way that the Book of Genesis never begins to do. But Teilhard took it into this mystical realm, this teleological realm, where we are a transitional species—which of course Darwin would have agreed with, but not in the sense that Teilhard meant it—where we are transitional in that we are on our way to a rendezvous with the Cosmic Christ, and all human minds are going to meld, and the consciousness that we enjoy, day in and day out, will seem feeble, pathetic, a mere whisper, compared to the transcendent chorus of our eventual fusion with the divine, and our union with this Omega Point that lies outside of time and space. It’s very clever, and it’s satisfying in an intellectual way, but as I said before, it doesn’t seem to describe the world that we’ve actually inherited.”



The Feds Got the Sony Hack Right, But the Way They’re Framing It Is Dangerous


Pedestrians in Seoul, South Korea watch a news program showing North Korean leader Kim Jong-Un delivering a speech, on Thursday, Jan. 1, 2015. Ahn Young-joon/AP



The FBI’s statement that North Korea is responsible for the cyber attack on Sony Pictures Entertainment has been met with various levels of support and criticism, which has polarized the information security community. At its core, the debate comes down to this: Should we trust the government and its evidence or not? But I believe there is another view that has not been widely represented: that of those who trust the government but disagree with the precedent being set.


Polarization and Precedents


The government knew when it released technical evidence surrounding the attack that what it was presenting was not enough. The evidence presented so far has been lackluster at best, and by the government’s own admission, additional information used to arrive at the conclusion that North Korea was responsible was withheld. Indeed, the NSA has now acknowledged helping the FBI with its investigation, though it is still unclear what exactly the nature of that help was.


But in presenting inconclusive evidence to the public to justify the attribution, the government opened the door to cross-analysis that would obviously not reach the same conclusion it had reached. It was likely done with good intentions, but it came off to the security community as incompetence, with a bit of pandering.



Robert M. Lee


Robert M. Lee is a PhD candidate at King’s College London and an active-duty Air Force Cyber Warfare Operations Officer who has led operational teams in the Air Force and Intelligence Community.




When I served in the intelligence community as an analyst and team lead doing digital network analysis, dealing with these types of threat attribution cases was the norm. What was not the norm was going public with attribution. I understand the reason for wanting to publicly identify attackers, and I also understand the challenges of identifying attackers while at the same time preserving sources and methods. Being open with evidence does have serious consequences. But being entirely closed with evidence is a problem, too. The worst path, though, is the middle ground. The problem in this case is that the government made a decision to have public attribution without the needed public evidence to prove it. It sets a dangerous international precedent whereby we’re saying to the world, “We did the analysis, don’t question it—it’s classified—just accept it as proof.”


This opens up scary possibilities. If Iran had reacted the same way when its nuclear facility was hit with the Stuxnet malware, we likely would have all critiqued it. The global community would not have accepted “we did analysis but it’s classified so now we’re going to employ countermeasures” as an answer. If the attribution was wrong and there was an actual countermeasure or response to the attack, then the lack of public analysis could have led to incorrect and drastic consequences. But with the precedent now set—what happens next time? In a hypothetical scenario, China, Russia, or Iran would be justified in claiming that an attack against their private industry was the work of a nation-state, saying that the evidence is classified, and then employing legal countermeasures. This could be used inappropriately for political posturing and goals. The Sony case should not be oversimplified, as there were no clear-cut correct answers, but it’s important to understand the precedent being set and the potential for blowback.


I Believe the FBI


Let me be clear. I’m not one of the people in the infosec community who thinks the government got the attribution wrong. I agree with the attribution supporters who say the FBI has access to more data than the public has and can therefore reach a better conclusion. The FBI and the intelligence community have highly competent professionals and have experience working on these types of cases. And in this case, they’ve also engaged the private sector to add outside expertise. This combination of internal government expertise with industry expertise was a mature response to a complex situation.


In my intelligence work, we did tech analysis with government sources and methods on a regular basis for attribution. Sometimes we got it right. Sometimes we got it wrong, because we’re human, and technical data is not magic—it is not always easy to interpret correctly. But finished intelligence reports that have examined multiple sources of data and competing analyses are often highly accurate. That type of quality intelligence product is what the FBI has internally.


I believe that North Korea probably did hack Sony. I do trust the government in that regard. I do not trust the standard it is setting, however, and I will never accept “it’s classified and we can’t tell you, but we’re going to publicly blame someone anyway” as a legitimate response. I believe the FBI’s analysis is likely right. But I also believe the critics to be correct.


The Critics Are Right


I don’t think the critics are posing the best counter theories on the attribution issue in the Sony hack—pointing the finger at company insiders—and I don’t think they have enough data to “know” anything about who did it. But the critics accurately state that technical analysis is prone to bias and error, making inherent trust in the government’s theory unwise. The evidence presented so far does not conclusively show that North Korea was responsible for the Sony attack. And by its nature, the information security community does not generally accept “because I said so” and “trust us” as adequate answers. Not blindly trusting information is exactly what makes for a good infosec professional. And asking tough questions is an important part of solidifying theories and reaching appropriate conclusions. The FBI should have predicted this response from the community when it decided to publicly attribute while withholding significant portions of the evidence. What the government chose was a middle ground that not only polarized the community but set a bad precedent. More transparency would have strengthened the case and established a higher bar for attribution.


The government in the future needs to pick one path and stick to it. It either needs to realize that attribution in a case like this is important enough to risk disclosing sources and methods or it needs to realize that the sources and methods are more important and withhold attribution entirely or present it without any evidence. Trying to do both results in losses all around. There will be lessons learned from this, but whether or not they get applied will be determined by history.


These views do not represent or constitute an opinion by the U.S. Government, Department of Defense, or Air Force. They are the author’s views alone.



The Golden Globes’ Social Team Is as Stressed About Meme-ing Celebs as You


TinaFeyAmyPoehlerHighFive


When Tina Fey and Amy Poehler hosted the Golden Globes for the first time two years ago, their opening monologue featured many an LOL moment, but it wasn’t a joke that immortalized their gig. It was a high-five. “Ang Lee’s been nominated for Best Director for the Life of Pi…,” Poehler said, “…which is what I’m gonna call the six weeks after I take this dress off!” Then, with a perfect blend of irony and genuine glee, the two slapped hands, and it almost immediately became the enduring memory of the 2013 show—not who was nominated, not who won (like Jennifer Lawrence beating out Meryl Streep for Best Actress), but rather this one little gesture. The next day, few remembered who wore what, but the high-five GIF lived on in social feeds forevermore.


This likely happened for two reasons: First, people will make GIFs of almost anything and post them ad infinitum. Second, Fey and Poehler are two of the Internet’s most beloved broads, and when you pair them with the bloodsport that is live-reacting to event television, it’s only natural that their high-five would go viral (along with tweeted quotes, Lawrence’s “I beat Meryl” acceptance speech, and other oddities). “I worked on 30 Rock for many years and then worked on Parks and Rec, so I know how popular these two are with this [online] audience,” says Jared Goldsmith, NBC Entertainment’s head of digital marketing. “They’re able to convey a sensibility that’s in line with how people at home are watching.” But regardless of hosting brilliance, this kind of social potential exists any time average folk—armed with laptops, smartphones, and opinions—watch rich beautiful people get awards. Like the Oscars or the Olympics or the World Cup, many millions of people watch the Golden Globes, and many of them are simultaneously snarking about it on Twitter and posting the highlights on Tumblr.


jennifer-lawrence-beat-meryl


The Globes went on to peak at 19,886 tweets per minute that night, up from a high of nearly 7,472 TPM the year before. During last year’s telecast—also hosted, just like tomorrow night’s installment, by Poehler and Fey—that number went up again, to 28,117 tweets per minute (the TPM peaked when Breaking Bad won the award for best drama on TV). There were 2.4 million tweets total about the Globes telecast in 2014—a 39 percent increase over the year prior—and the volume of tweets (not to mention Facebook posts, Tumblr’d GIFs, and Instagram “likes”) will likely be even higher this weekend. Goldsmith anticipates it will be the “most social Golden Globes ever.”


The Hollywood Foreign Press Association, the nonprofit organization that hands out the Globes, knows this well. Five years ago the HFPA asked Michael Carter, then the chief information officer at the California Institute of the Arts (he’s now COO), to help with social media. In 2010, he covered the Globes by himself, armed only with a smartphone and a laptop; “it was quite a task,” he says. This year he has a team of five helping him. And in addition to their constant output from the event, the HFPA has forged partnerships with Instagram (to have fashion photographer Ellen von Unwerth post pics of winners), Facebook (for a “Facebook Lounge” where celebs can answer online questions), and Twitter, which will host a “GIF Mirror” among other things. (They’re also bringing a “Kick Cam” to capture celeb shoes—a move that prevents Elisabeth Moss flipping Twitter the bird like she did E! on their red carpet “Mani Cam” last year.)


The goal for the Globes’ social media endeavors, Carter says, is to show the audience things they can’t see on TV—but this year he’s also making sure those watching on their second screen will have video content as well. “We’ll be sharing live video from the actual show as it happens,” he says. “So, if there’s a cool moment during the show, you’ll be able to watch it online as well.” In other words, if Tina and Amy do something hysterical, you no longer have to wait for some friendly netizen to rip the clip and put it on YouTube.


But at the same time, waiting to see what the Internet will seize on is part of the fun. Does being hand-fed the night’s memes by companies—whose sole aim is, as Goldsmith says, to “extend the couch”—take away their luster? Things go viral organically, after all, not because they came from any official feed. Tweeting and Facebooking during televised live events became the new hotness because those things served as backchannels, a way to talk about the gowns and glamor without being a correspondent from E! Will there come a time when @GoldenGlobes tweeting the most popular Vine of the night make it feel like that moment your mom friended you on Facebook?


Perhaps, but not if Carter and his team have anything to do with it. For one, he acknowledges that they’re just trying to keep up with the online conversation like everyone else. (There’s a monitor in the social team’s “war room” just for this purpose.) And, like the best friend of a celeb who gets a last-minute invite to the ceremony, they’re also just happy to be there. “The chatter online about the Golden Globes is amazing,” he says. “It would happen without us—but it’s fun to be a part of it.”



New Novel Pits Darwin Against God, Indiana Jones-Style


Morrow

Witek Kaczanowski



James Morrow is widely regarded as the foremost satirist in science fiction. His new novel, Galapagos Regained , tells the story of a Victorian actress named Chloe Bathhurst who attempts to use Darwin’s theory of natural selection to disprove the existence of God. Morrow spent six years writing the book, then a few more trying to sell it. In recent years atheist-oriented films like The Golden Compass and Creation have faced a public backlash, and Morrow isn’t sure whether that might have made some publishers leery of the book.


“The publishers who turned it down came up with other reasons,” Morrow says in Episode 132 of the Geek’s Guide to the Galaxy podcast. “So I’ll never know if they thought it was just too incendiary.”


Galapagos Regained is a novel of ideas, full of politics, philosophy, and theology, but it’s also a globe-spanning tale bursting with battles, shipwrecks, and narrow escapes. In this Morrow was influenced by the journeys of Darwin himself. Today we tend to picture the famous scientist as a white-bearded patriarch or quiet invalid, but in fact his theories were shaped by a lifetime of wild adventure.


“The young Darwin was indeed this kind of Indiana Jones figure,” says Morrow. “And I very much had that in mind when I conceived of Chloe’s escapades.”


Those escapades, which take Chloe from the halls of Oxford to the Amazon jungle to the rocky shores of the Galapagos, are beset by uncertainty and doubt, but for Morrow the question of God is more clear-cut.


“If there were such a thing as a disproof of God, if there could be such a thing, it seems to me it would look a lot like Darwinian materialism coupled to the argument from evil,” he says. “And I think that one-two punch, for me, causes God to go belly up.”


Listen to our complete interview with James Morrow in Episode 132 of the Geek’s Guide to the Galaxy podcast (above), and check out some highlights from the discussion below.


James Morrow on South America:


“I wanted the book to be entertaining. I’ve always liked the truism that all art is entertainment, that all drama is melodrama—it doesn’t work the other way around, not all melodrama is drama and not all entertainment is art. But I love epics, I love Jules Verne. A lot of the South American material is an homage to Voltaire’s Candide . And I said, OK, I just want to see what happens if I put my characters down in that zone. … I had read several books about Amazonia as part of the research, because Chloe has to find her way across the continent of South America after she’s shipwrecked off the coast of Brazil. And when she gets to Peru, she gets caught up in the Great Rubber War. The research I’d done had given me a lot of information about the rubber industry and how horribly exploitative it was of the natives—of the Indian population. The mistreatment that’s documented maps on to the historical facts, sad to say. There was not actually an event that was called the Great Rubber War—that’s sort of a poetic conceit on my part—but a generation or so after the events of Galapagos Regained, there is a terrible conflict in the country of Colombia, on the Rio de Mayo, that does correspond to the middle section of my book.”


James Morrow on Charles Darwin:


“I can appreciate why many people would regard Darwin’s theory as bad news—he brought bad news back from the Galapagos Islands. But for me the story doesn’t end there. There’s something exhilarating, for me, about our interconnectedness to everything that’s alive right now, and has ever lived, and ever will live. I think Darwin’s sin, the reason he makes people so nervous, is not that he killed God, but that he replaced God. He didn’t just make a case for atheism, he also made a case for something that’s equivalent to God, it just happens to be materialist. … He replaced God with something that for me is far more magnificent than anything one finds in scripture, far more magnificent—complex, detailed, exhilarating, transcendent—than anything ever encountered in the zone of prophets insisting that their revelations are the case. He pushed the reset button on the whole of the Western psyche, you know, he refreshed the screen, except something brand new came up that we weren’t expecting. The Christian narrative … is beautiful, it’s coherent, it’s very satisfactory, but it doesn’t seem to have anything to do with the world that we’re actually in.”


James Morrow on Rosalind Franklin:


“[Her] work was crucial to Watson and Crick’s unraveling the structure of the DNA molecule—the X-ray photographs she took and also her interpretation of those photographs. It’s sometimes forgotten that what James Watson pilfered from her files was not simply the pictures, but was her understanding that the phosphate chains of the DNA molecule were anti-parallel, and this strongly suggests a double helix. Rosalind Franklin was famously and notoriously ignored and forgotten. When Watson and Crick and Maurice Wilkins shared their Nobel Prize, they did not even mention her from the podium. … If you read James Watson’s book called The Double Helix , while it offers many fascinating insights into how scientific research actually progresses and what an all-too-human enterprise it is, he takes such a sardonic and childish and—to be sure—sexist view of Rosalind Franklin that you want to throw the book across the room.”


James Morrow on Teilhard de Chardin:


“This was a time when the Catholic Church was not reconciled to Darwin. They’ve done better in recent generations, although I think there’s still more work to do. He was almost kicked out of the Jesuit order, given his passion for evolutionary theory. I think the Holy Office—which is the euphemism for the Inquisition—the Holy Office regarded him as a borderline heretic. … [He was] quite unequivocal that evolution had occurred on this planet, that Darwin had nailed it, that the theory of natural selection accounted for the transmutation of species in a way that the Book of Genesis never begins to do. But Teilhard took it into this mystical realm, this teleological realm, where we are a transitional species—which of course Darwin would have agreed with, but not in the sense that Teilhard meant it—where we are transitional in that we are on our way to a rendezvous with the Cosmic Christ, and all human minds are going to meld, and the consciousness that we enjoy, day in and day out, will seem feeble, pathetic, a mere whisper, compared to the transcendent chorus of our eventual fusion with the divine, and our union with this Omega Point that lies outside of time and space. It’s very clever, and it’s satisfying in an intellectual way, but as I said before, it doesn’t seem to describe the world that we’ve actually inherited.”



The Feds Got the Sony Hack Right, But the Way They’re Framing It Is Dangerous


Pedestrians in Seoul, South Korea watch a news program showing North Korean leader Kim Jong-Un delivering a speech, on Thursday, Jan. 1, 2015.

Pedestrians in Seoul, South Korea watch a news program showing North Korean leader Kim Jong-Un delivering a speech, on Thursday, Jan. 1, 2015. Ahn Young-joon/AP



The FBI’s statement that North Korea is responsible for the cyber attack on Sony Pictures Entertainment has been met with various levels of support and criticism, which has polarized the information security community. At its core, the debate comes down to this: Should we trust the government and its evidence or not? But I believe there is another view that has not been widely represented. Those who trust the government, but disagree with the precedent being set.


Polarization and Precedents


The government knew when it released technical evidence surrounding the attack that what it was presenting was not enough. The evidence presented so far has been lackluster at best, and by the government’s own admission, additional information used to reach the conclusion that North Korea was responsible was withheld. Indeed, the NSA has now acknowledged helping the FBI with its investigation, though it is still unclear exactly what the nature of that help was.


But in presenting inconclusive evidence to the public to justify the attribution, the government opened the door to cross-analysis that would obviously not reach the same conclusion it had reached. This was likely done with good intentions, but it came off to the security community as incompetence, with a bit of pandering.



Robert M. Lee


Robert M. Lee is a PhD candidate at King’s College London and an active-duty Air Force Cyber Warfare Operations Officer who has led operational teams in the Air Force and the Intelligence Community.




When I served in the intelligence community as an analyst and team lead doing digital network analysis, dealing with these types of threat-attribution cases was the norm. What was not the norm was going public with attribution. I understand the reasons for wanting to publicly identify attackers, and I also understand the challenge of identifying attackers while preserving sources and methods. Being open with evidence has serious consequences. But being entirely closed with evidence is a problem, too. The worst path, though, is the middle ground. The problem in this case is that the government chose public attribution without the public evidence needed to prove it. That sets a dangerous international precedent whereby we’re saying to the world, “We did the analysis, don’t question it—it’s classified—just accept it as proof.”


This opens up scary possibilities. If Iran had reacted the same way when its nuclear facility was hit with the Stuxnet malware, we all likely would have critiqued it. The global community would not have accepted “we did analysis, but it’s classified, so now we’re going to employ countermeasures” as an answer. If the attribution had been wrong and there had been an actual countermeasure or response to the attack, the lack of public analysis could have led to incorrect and drastic consequences. But with the precedent now set, what happens next time? In a hypothetical scenario, China, Russia, or Iran would be justified in claiming that an attack against their private industry was the work of a nation-state, declaring the evidence classified, and then employing legal countermeasures. This could be used inappropriately for political posturing and goals. The Sony case should not be oversimplified—there were no clear-cut correct answers—but it’s important to understand the precedent being set and the potential for blowback.


I Believe the FBI


Let me be clear. I’m not one of the people in the infosec community who thinks the government got the attribution wrong. I agree with the attribution supporters who say the FBI has access to more data than the public has and can therefore reach a better conclusion. The FBI and the intelligence community have highly competent professionals and have experience working on these types of cases. And in this case, they’ve also engaged the private sector to add outside expertise. This combination of internal government expertise with industry expertise was a mature response to a complex situation.


In my intelligence work, we regularly did technical analysis for attribution using government sources and methods. Sometimes we got it right. Sometimes we got it wrong, because we’re human, and technical data, while valuable, is not always easy to interpret correctly. But finished intelligence reports that have examined multiple sources of data and competing analyses are often highly accurate. That type of quality intelligence product is what the FBI has internally.


I believe that North Korea probably did hack Sony. I do trust the government in that regard. I do not trust the standard it is setting, however, and I will never accept “it’s classified and we can’t tell you, but we’re going to publicly blame someone anyway” as a legitimate response. I believe the FBI’s analysis is likely right. But I also believe the critics to be correct.


The Critics Are Right


I don’t think the critics are posing the best counter-theories on the attribution issue in the Sony hack—pointing the finger at company insiders—and I don’t think they have enough data to “know” anything about who did it. But the critics accurately note that technical analysis is prone to bias and error, making blind trust in the government’s theory unwise. The evidence presented so far does not conclusively show that North Korea was responsible for the Sony attack. And by its nature, the information security community does not generally accept “because I said so” and “trust us” as adequate answers. Not blindly trusting information is exactly what makes a good infosec professional, and asking tough questions is an important part of solidifying theories and reaching appropriate conclusions. The FBI should have predicted this response from the community when it decided to publicly attribute the attack while withholding significant portions of the evidence. What the government chose was a middle ground that not only polarized the community but set a bad precedent. More transparency would have strengthened the case and established a higher bar for attribution.


In the future, the government needs to pick one path and stick to it. It either needs to decide that attribution in a case like this is important enough to risk disclosing sources and methods, or it needs to decide that sources and methods are more important and withhold attribution entirely or present it without any evidence. Trying to do both results in losses all around. There will be lessons learned from this, but whether or not they get applied will be determined by history.


These views do not represent or constitute an opinion by the U.S. Government, Department of Defense, or Air Force. They are the author’s views alone.