The Weird Way Facebook and Instagram Are Making Us Happier


Materialism and the urge to ‘keep up with the Joneses’ are our current culture’s version of an essential animal and human trait. They are our way of shaking our manes, flashing our feathers, and howling like monkeys. But there’s been a shift recently—in some circles, people have embraced an idea called experientialism. For these people, it is now not only socially acceptable but also socially expected to prefer experiences over stuff. And this idea is already starting to spread from these innovators to the mainstream because of one of the 21st century’s most important innovations: Facebook.


A New Kind of Conspicuous Consumption


If you ask them, most experientialists would laugh at the idea that they try to keep up with the Joneses. Yet one of the most counterintuitive things about experientialists in general is that, although they want nothing to do with keeping up with the Joneses in the traditional sense, many consume just as conspicuously as even the most status-conscious materialists. I blame Facebook.


Excerpted from Stuffocation: Why We’ve Had Enough of Stuff and Need Experience More Than Ever

Remember how friends used to tell you about their latest vacation? They would invite you over for dinner, and, as the after-dinner chocolates were passed around, pull out their photos and bore you for a bit.


Now, using Facebook, Instagram, Twitter, and all the other social media sites, you can share every last detail of your trip in real time. You can let everyone know that, right now, you are watching the sunrise over Angkor Wat or the sunset from the rooftop of your riad in Marrakech, or that you are on a chairlift in the Alps, or that you have just finished packing and cannot wait to go. You need not, of course, broadcast your thoughts and updates only when you are on vacation. Why not share that you have just run a marathon, that you are at a Rolling Stones concert or a TEDx conference, or that you are thrilled because someone bought you flowers? Today, where you are, how you are feeling, what you are doing, and what you have done have suddenly become valuable social currency—just as they were before the 20th century.


Anyone can buy most material goods, but not everyone can be at the event you are Instagramming a picture from.




Back then, most people lived in small communities. Everyone knew everybody else in the village. That meant everyone was just as likely to know what you did with your time as how many possessions you owned, and how expensive and how good those possessions were. So, for signaling your status to others and establishing your place in the village’s social hierarchy, what you did was as important as what you owned. To signal status, the conspicuous consumption of leisure—that is, experiences—was equal to the conspicuous consumption of goods.

It was the arrival of cities that changed all that. The mass migrations of the 20th century, from small communities where everyone knew everyone else to large metropolises where you barely knew your neighbor, meant that what you did with your time became virtually useless as a way to signify status. In the relative anonymity of urban and, to a lesser extent, suburban life, your neighbors, friends, colleagues at work, and the people you passed on the street were much more likely to see what you owned than know what you did.


A material possession could deliver far more status than an experiential purchase. And so, in the 20th century, the conspicuous consumption of leisure was not nearly so effective as the conspicuous consumption of goods at telling others who you were.


Social media has turned this on its head. Now only a few people, relatively, might see your new sofa, or the car parked in your driveway. But with all your friends and followers on Twitter, Facebook, Pinterest, and Instagram, many more will now know you are partying in Ibiza, are in the front row of a Jay-Z concert, or that you have just completed a Tough Mudder assault course. And these people are more likely to be in your peer group, the people, in other words, whose opinion you are most interested in.


Facebook is giving us a new way to worry that we may not be keeping up with the Joneses.




Social media also plays a vital role in making experiences appear more valuable, thanks to the “rarity principle.” According to this idea, the bigger the difference between the number of people who have access to something and the number of people who know about it, the rarer and more valuable the thing is. Anyone, after all, can buy most material goods, but not everyone can be at the event you are tweeting about or Instagramming a picture from.

The Double-Edged Sword of Social Media


Thanks to social media, we want to keep up with what the Joneses are doing. Are we going to enough pop-ups, conferences, and concerts—like all our friends and acquaintances seem to be? This concern has become so widespread that it has a new name: fear of missing out, better known by its acronym, FOMO. At the birth of the experiential era, four out of every 10 people aged 18 to 34 in the US and UK say they sometimes worry that they are missing out. Facebook, you might say, is giving us a new way to keep up with the Joneses, and a new way to worry that we may not be keeping up.


FOMO is, at the least, problematic for experientialism. Because if this new way of living is just as likely to deliver anxiety and stress as materialism, how is it an improvement? Thought of in these terms, experientialism might even sound worse than materialism.


In today’s hyper-connected, 24/7 world, the game has changed. Not only do we notice material status cues when we see people in the real world, we are also getting and giving status updates through Facebook, Twitter, and all the other social networks. And since we check these throughout the day—when we get up and when we go to bed, on the toilet, on the train, in the classroom, and in the office—that means that we are playing the game more regularly, and thinking about the game more too. As we do that, we are more likely to end up feeling anxious and stressed, and perhaps depressed, about status.


There is another change in the game that is having an even more damaging effect, I believe, on happiness. In the past as we went about our everyday lives, we would not only see people with fancier cars and watches and clothes. We would also encounter people with cheaper, older, more threadbare, and less designer equivalents to the stuff we had. That mix would leave us feeling secure. It felt okay not being at the top of the social ladder, as long as you weren’t at the bottom of the pile either.


Comparing experiences is less clear-cut than it is with material goods, which means you're less likely to think about the status implications of what you do.


Now, think about the last time you looked at a social network. Have you ever noticed how Facebook and other social networks sometimes bring to mind upscale magazines filled with the look-but-don’t-touch lifestyles of the rich and lucky?


Your friends’ lives may well not be quite so perfect, of course. Life for most people, after all, is not a flawless timeline of weekends away and weddings in glamorous places. And if you stop to think about it, you know that. But it is hard to keep that in perspective. And since we are all connected to so many people on Facebook, there is always someone jetting off to Miami, having lunch in Lima, lounging on a boat in the Mediterranean, or attending a wedding in the Caribbean.


This constant bombardment leaves us feeling that we are always at the bottom of the pile looking up. And that, in a meritocratic system like ours, can leave us feeling anxious, stressed, and depressed.


So all of this puts Facebook and other social networks in the curious position of supporting the rise of experientialism, while also undermining its benefits. This suggests, ironically, that if you buy into experientialism, you could end up with just as much anxiety, stress, and depression as you would have had in gentler, more materialistic times. But, and it is a very important but, even though keeping up with the Joneses through experiences has the potential to be anxiety-inducing, experientialism is still better than materialism.


Despite the double-edged sword of social media, it’s important to learn and remember four discoveries social scientists have made in recent years: that experiences are more likely to make us happy because we are less likely to get bored with them, more likely to see them with rose-tinted glasses, and more likely to think of them as part of who we are, and because they are more likely to bring us closer to other people and are harder to compare.


Even if experiences can be compared, the comparison is less clear-cut than it is with material goods, and that means you are less likely to think about the comparison, less likely to regret your choice afterward, and less likely to think about the status implications of what you do. So if you want to be happier, save yourself the hard work of all that thinking, and just choose experiences instead. They’re not only the smart route to happiness. Thanks to Facebook, they’re also the best way to shake your tail feathers.


Reprinted from Stuffocation by James Wallman with permission of Spiegel & Grau, an imprint of Random House, a division of Penguin Random House, LLC. Copyright (c) James Wallman, 2015.


Editor: Samantha Oltman (@samoltman)



“We Are Not Ready”: Ebola Analysis from Front-Line Workers (And Bill Gates)

Bill Gates speaks at TED 2015 in Vancouver. (Image courtesy TED)


We’ve just passed a difficult and little-noticed anniversary: Last week, the Ebola epidemic in West Africa marked its first birthday. Though the viral outbreak has been contained, it is still not under control: According to the World Health Organization, cases continue in Sierra Leone and are rising again in Guinea. Liberia was about to record an entire incubation period without a new case — a signal that the chain of person-to-person transmission might have been broken — but on Friday, it announced that it had found a single new case. How that woman became infected is unclear; it is possible that she represents, not a new outbreak, but a brief interruption in an otherwise promising trend.


It has been decades since there was an epidemic of this persistence and magnitude. No other Ebola outbreak matches it; nor does the 2003 epidemic of SARS. You would have to go back to the early days of HIV in the 1980s, or to the flu pandemics in 1968, 1957 or even 1918, to find an outbreak that sickened so many people, challenged international response capacity so much, and instilled such fear in other countries.


The anniversary has triggered reflections. Some criticize the response to Ebola for being inadequate and slow. Others — such as two talks at last week’s TED conference, one by Bill Gates — extract lessons that should inform responses to future epidemics.


Because there will be future epidemics. That’s for sure.


In a report published this morning, Doctors Without Borders (also known as MSF, its acronym in French) excoriates the international response to the discovery of the first cases a year ago, deriding a “global coalition of inaction”:




The Ebola outbreak proved to be an exceptional event that exposed the reality of how inefficient and slow health and aid systems are to respond to emergencies. ‘Business as usual’ was exposed on the world stage, with the loss of thousands of lives.



The assessment gives low marks to the governments of the affected countries, for attempting to debunk or suppress early reports; to the World Health Organization, for failures of leadership; and to aid agencies in the West, for being slow to send personnel and supplies. Conscientiously, MSF also scores its own response, examining its slowness to mobilize the full organization, difficulty in choosing among competing priorities, and struggles to balance care for patients while exercising its duty to protect its employees.


Meanwhile: Last week was the influential annual TED conference, which focuses on “ideas worth spreading.” Two prominent members of the international response to Ebola spoke at the conference, and simultaneously published pieces based on their talks to ensure that their ideas had wide distribution. (I attended TED as a speaker and heard both.) TED is known for a kind of indomitable technological optimism — biosynthetic design, genetic engineering, self-driving cars — so it was refreshing and a little surprising to see how attentively the TED audience listened to discussions that were measured, if not outright pessimistic.


First, Seth Berkley, chief executive of GAVI, the Vaccine Alliance, described how Ebola has shone a spotlight on a fundamental problem in vaccine development. “We have known about Ebola since 1976. We have had ample opportunity to study it; 24 outbreaks of it have occurred so far, and we have had vaccine candidates available now for more than a decade,” he said. But, he added: “The people most at risk from these diseases are also the ones least able to pay for vaccines. This leaves little in the way of market incentives for manufacturers to develop vaccines, unless there are large numbers of people at risk in wealthy countries. It is simply too commercially risky.”


In a companion essay in Nature, Berkley argues for changing the way vaccines are funded, switching to a government-industry partnership that could get vaccines out quickly once the basic research is paid for.



When an outbreak occurs and vaccines are needed, it would help significantly to have vectors ready to deliver them. With the right investment, these vectors, typically a harmless virus or bacterium, could be prepared and tested in advance. Crucially, they could be pressed into service to tackle a range of diseases. Four of the five Ebola vaccines currently going through clinical trials use vectors developed and tested for HIV.


Such generic vectors would, in effect, modularize the vaccine development process — conducting much of the safety testing and ironing out manufacturing processes for different vectors ready for the addition of a ‘payload’ antigen. By developing such mechanisms in advance, and pre-testing them for safety and dose, we can save significant amounts of money and time by having stockpiles frozen and ready for use or efficacy testing as soon as an outbreak occurs.



In another talk, philanthropist Gates expanded Berkley’s call for better vaccine development into a multi-stranded argument for better preparation for outbreaks. Recalling the obsession with global nuclear war that dominated his childhood, he said:



If anything kills over 10 million people in the next decades, it is most likely to be a highly infectious virus, rather than a war: not missiles, but microbes. We have invested a huge amount in nuclear deterrence. We actually have invested very little in a system to stop an epidemic. We are not ready for the next epidemic.



In an essay published simultaneously in the New England Journal of Medicine, Gates describes the investments that would be necessary to create a global warning and response system. His list, from the end of the piece:



This system should:



  • be coordinated by a global institution that is given enough authority and funding to be effective,

  • enable fast decision making at a global level,

  • expand investment in research and development and clarify regulatory pathways for developing new tools and approaches,

  • improve early warning and detection systems, including scalable everyday systems that can be expanded during an epidemic,

  • involve a reserve corps of trained personnel and volunteers,

  • strengthen health systems in low- and middle-income countries, and

  • incorporate preparedness exercises to identify the ways in which the response system needs to improve.



In the same way that there has never been an outbreak quite like this Ebola epidemic — globally unnerving but geographically limited, fast-moving and yet contained enough that it can be assessed from a distance — there probably has never been an outbreak that offered such concrete examples of things to do better next time. For the sake of everyone’s health, let’s hope those lessons are learned.



Glee Is Gone, But Its GIFs Will Live Forever

Grant Gustin (now The Flash on CW) during his time as a less-heroic Dalton Academy Warbler on Glee. Fox



After six seasons, Glee sang its way offstage on Friday—a cold slushy to the face for viewers who had been following the New Directions since the time when the wordplay in their name was still funny. But the Fox show isn’t leaving us empty-handed; Glee‘s legacy is a crapload of fantastic GIFs we can use long after its characters are but a distant memory.


Because of the emotional teacup ride Glee took us on weekly, there are GIFs for almost every occasion: crying, hugging, dancing, slapping, more crying, judging … you get the idea. Basically, there’s not an Internet flame war out there that can’t be deaded with a reaction from the students of William McKinley High School.


But while Glee’s greatest gift might have been its plethora of GIFs, we can’t put them all here (our load times would suck). So instead, we decided to stick to just the best diva moments from the show—and frankly, even that was a lot to cull through. What lives below are some of the greatest bits of shade, judgment, and sass from the show’s many seasons. And although Beyoncé’s definition that “a diva is a female version of a hustla” still stands, we—and Blaine Anderson—would like to remind you that guys can be divas too.



Hell to the No


The Use Case: Calling out someone’s BS.

The Background: If there’s one talent Mercedes Jones (Amber Riley) has besides being the best park-and-bark player on Glee, it’s her ability to shut. It. Down. Her “hell no” abilities are flawless.



Sue Said Knock You Out


The Use Case: Wanting to punch someone through a screen.

The Background: When Sue’s new team loses to New Directions at regionals, she goes full Tyson on the woman who announces the results.



The Sassier “No”


The Use Case: Not just dismissing someone’s question, but making them feel stupid for even asking it.

The Background: Rachel Berry (Lea Michele) already looks like she’s trying hard to fight off a bad case of BRF, so when she actually expresses her disapproval it’s just that much better.



Throwing Shade, Santana Style


The Use Case: Telling someone they’re acting ugly.

The Background: The way Santana Lopez (Naya Rivera) throws it, it’s not subtle enough to be called shade, but who cares? She will toss it at anyone, any time, any place—including small children.



The Full Read


The Use Case: “Oh, so it’s like that?”

The Background: Kurt being…Kurt.



Ugh


The Use Case: Being over it.

The Background: Who needs one? Just a classic Rachel Berry eye roll.



Not Amused


The Use Case: Shaming someone who’s acting out of pocket.

The Background: Two cheerleaders (Brittany/Heather Morris and Quinn/Dianna Agron) and one goth girl staring at you with that bored look in their eyes that just says you’re utterly uninteresting.



Out and Proud


The Use Case: Expressing general fabulousness.

The Background: During a rousing performance of Lady Gaga’s “Born This Way,” Kurt’s jacket is ripped away to reveal a T-shirt that definitively answers all questions.



Lactating With Rage


The Use Case: Being so mad you can’t even think straight.

The Background: Sue Sylvester drops insults the way Eminem drops bars—densely multilayered, and sometimes nonsensical.



“I’m Tiyaaahd!”


The Use Case: Having had enough of someone’s bad behavior.

The Background: During a performance of “It’s All Over” from Dreamgirls, Kurt only gets about one whole line, but he delivers it with perfection.



Inside Laser Printer Toner: Wax, Static, Lots of Plastic


Toner is one of those everyday products we all take for granted. When the printer runs low you pop a new cartridge in—out of sight, out of mind. Well, we got to wondering what’s actually in that cartridge … so we busted one open. Bad idea! (More on that later.) But we’re all cleaned up now and back with answers.


Turns out toner is mostly powdered plastic—and that’s key to the whole technology. Plastic has two handy properties: You can move it around like magic with static electricity, and then you can melt it onto the paper for crisp, smudge-proof images. This technique of printing with powder instead of ink is called xerography (xeros is Greek for “dry”), and it works the same whether you’re printing or copying. In fact, Gary Starkweather invented the laser printer at Xerox in 1969, in a famous bit of rogue engineering, by modding one of the company’s office copiers. (He had to work in secret after his boss ordered him to drop the idea.)


See, a photocopier has a rotating drum that’s coated with a semiconductor like selenium; that coating converts light into electricity, just like in a solar cell. By bouncing bright light off a hard copy (… or select parts of your anatomy) and onto the drum, the machine creates a ghostly reflection of the original in static charges for toner to stick to. Starkweather realized you could use the same rig to print digital files by scanning a laser directly on the drum. The only difference is in how the electrostatic image is generated.


Aside from plastic, early printer toners in the 1970s contained little more than soot and rust. The latter—iron oxide—made it magnetic, for better control in the imaging process. That wouldn’t work for color printing, which came along in 1994; the dark oxide would have turned the colors brown. But manufacturers have come up with other additives and refinements to improve speed and image quality. Actual formulations are custom-designed for specific machines, so ingredient lists can vary. But here’s the basic recipe for most newer printers.


Polyester


Color toners are 85 to 95 percent plastic, milled into a superfine powder; the smaller the grains, the better the image resolution. Because plastic doesn’t conduct electricity, the particles can hold a static charge—and like socks in a dryer, they’ll cling to anything with an opposite charge. Laser printers use that clinginess to get the toner onto the imaging drum, and from there onto a sheet of paper. The page then goes through hot fuser rolls that melt the plastic and smoosh it into the paper fibers. A variety of polymers can be used, but polyester, the stuff of disco suits and soda bottles, is the top choice nowadays. It’s pricier than the old standby, styrene acrylate, but it makes for bright colors, smells less toxic, and has a lower melting point, which saves energy and lets the machine run faster. Just handle those cartridges with care: Toner spills are a mess, and inhaling tiny airborne particles can do a number on your lungs. Oh, and don’t wash your pants in hot water; that low melting point will turn your cotton Dockers into a polyester blend.


Polypropylene Wax


The first xerographic copiers in the early ’60s used radiant heat, like toaster ovens, to melt the toner onto the page; unfortunately, the boss’s memos sometimes caught on fire. (Xerox’s flagship model came with a small extinguisher.) Fuser rolls fixed that problem but caused a new one: Toner would stick to the rollers and smudge the next page. The solution? Add polypropylene wax for lubrication. It’s a polymer like polyester, but its long carbon strands have fewer chemical gewgaws hanging off them, so the molecules can easily slip and slide past one another.


Carbon Black


Polyester is clear. To make it look black, manufacturers stir in this grimy stuff—essentially high-purity soot. Made by burning tar or creosote, carbon black is mainly used to toughen up rubber products; it’s why tires are black. It’s also a class II carcinogen, but once the melted plastic hardens on your copies it’s safely sealed in place. Chemically, it’s a jumble of carbon atoms over which float clouds of shared electrons. Because these electrons have lots of room to move, they can absorb light energy at all visible wavelengths. The result: No light reflects back to your retina, an absence your brain calls “black.” (If you think about it, you can’t actually see these words. You’re inferring their shape from the white space around them.)


Pigment Yellow 180


Along with black, color printers have separate cartridges for yellow, magenta, and cyan toners, and these four can be overlaid to make any other hue. The yellow comes from this benzimidazolone compound. Like all organic pigments, it has alternating single and double bonds that again leave electrons free to absorb light—but not all of it. Here, short-wavelength violet light is trapped while longer-wavelength yellow passes through, bouncing off the page and into your eyeballs.


Pigment Red 122


Compounds of quinacridone produce a range of intense reddish hues, depending on their exact makeup and arrangement. They’re super durable, which is why they’re favored in exterior paints—think cherry-red sports cars. In Red 122 (2,9-dimethyl-quinacridone), the flat molecules are stacked up like dinner plates in a neat crystal structure; that shifts the reflected color toward the blue end of the spectrum, yielding magenta.


Pigment Blue 15:3


Copper phthalocyanine produces cyan, a rather alarming hue midway between green and blue. Surgical gowns are made in this color because it’s complementary to crimson (blood spatters on cyan look black). This common pigment is also used as a thin-film semiconductor in solar cells. It could even power quantum computers someday, since its electrons can hang out in a state of superposition for long periods.


Fumed Silica


Microscopic glass beads (SiO2) on the surface of the toner particles provide a silky, almost liquid flow. That’s essential to spread toner over the page at the maniacal speeds of modern office printers. It’s especially needed in polyester toners, which are more prone to caking. Fun project: Make your own fumed silica by vaporizing beach sand in a 3,000-degree-Celsius electric arc.


Charge Control Agents


As the toner leaves the cartridge, it brushes against a metering blade, which gives it a static charge. Scientists call that triboelectrification, and it’s what actually makes those socks cling in the dryer or a balloon stick to the wall after you rub it on your sweater. You’re literally scraping electrons from one material onto another (tribo- means to rub, same root as in diatribe). Here, putting a negative bias on the toner makes it cling to the imaging drum, and added bits of iron, chromium, or zinc help boost and hold the charge. Pro tip: If you ever do spill toner, don’t try to vacuum it up. Without special gear, all that agitation can spark a violent, albeit colorful, dust explosion.


Special thanks to John Cooper, Toner Research Services.



The Weird Way Facebook and Instagram Are Making Us Happier


Materialism and the urge to ‘keep up with the Joneses’ is our current culture’s version of an essential animal and human trait. It is our way of shaking our manes, flashing our feathers, and howling like monkeys. But there’s been a shift recently—in some circles, people have embraced an idea called experientialism. For these people, it is now not only socially acceptable but also socially expected to prefer experiences over stuff. And this idea is already starting to spread from these innovators to the mainstream because of one of the 21st century’s most important innovations: Facebook.


A New Kind of Conspicuous Consumption


If you ask them, most experientialists would laugh at the idea that they try to keep up with the Joneses. Yet one of the most counterintuitive things about experientialists in general is that, although they want nothing to do with keeping up with the Joneses in the traditional sense, many consume just as conspicuously as even the most status-conscious materialists. I blame Facebook.


Excerpted from Stuffocation: Why We've Had Enough of Stuff and Need Experience More Than Ever Excerpted from Stuffocation: Why We’ve Had Enough of Stuff and Need Experience More Than Ever

Remember how friends used to tell you about their latest vacation? They would invite you over for dinner, and, as the after-dinner chocolates were passed around, pull out their photos and bore you for a bit.


Now, using Facebook, Instagram, Twitter, and all the other social media sites, you can share every last detail of your trip in real time. You can let everyone know that, right now, you are watching the sunrise over Angkor Wat or the sunset from the rooftop of your riad in Marrakech, or that you are on a chairlift in the Alps, or that you have just finished packing and cannot wait to go. You need not, of course, broadcast your thoughts and updates only when you are on vacation. Why not share that you have just run a marathon, that you are at a Rolling Stones concert or a TEDx conference, or that you are thrilled because someone bought you flowers? Today, where you are, how you are feeling, what you are doing, and what you have done have suddenly become valuable social currency—just as they were before the 20th century.


Anyone can buy most material goods, but not everyone can be at the event you are Instagramming a picture from.




Then, most people lived in small communities. Everyone knew everybody else in the village. That meant everyone would just as likely know what you did with your time as how many possessions you owned, and how expensive and how good those possessions were. That meant, for signaling your status to others and establishing your place in the village’s social hierarchy, what you did was as important as what you owned. To signal status, the conspicuous consumption of leisure—that is, experiences—was equal to the conspicuous consumption of goods.

It was the arrival of cities that changed all that. The mass migrations of the 20th century, from small communities where everyone knew everyone else to large metropolises where you barely knew your neighbor, meant that what you did with your time became virtually useless as a way to signify status. In the relative anonymity of urban and, to a lesser extent, suburban life, your neighbors, friends, colleagues at work, and the people you passed on the street were much more likely to see what you owned than know what you did.


A material possession could deliver far more status than an experiential purchase. And so, in the 20th century, the conspicuous consumption of leisure was not nearly so effective as the conspicuous consumption of goods at telling others who you were.


Social media has turned this on its head. Now only a few people, relatively, might see your new sofa, or the car parked in your driveway. But with all your friends and followers on Twitter, Facebook, Pinterest, and Instagram, many more will now know you are partying in Ibiza, are in the front row of a Jay-Z concert, or that you have just completed a Tough Mudder assault course. And these people are more likely to be in your peer group, the people, in other words, whose opinion you are most interested in.


Facebook is giving us a new way to worry that we may not be keeping up with the Joneses.




Social media also plays a vital role in making experiences appear more valuable, thanks to their pivotal role in the “rarity principle.” According to this idea, the bigger the difference between the number of people who have access to something and the number of people who know about it, the rarer and more valuable the thing is. Anyone, after all, can buy most material goods, but not everyone can be at the event you are tweeting about or Instagramming a picture from.

The Double-Edged Sword of Social Media


Thanks to social media, we want to keep up with what the Joneses are doing. Are we going to enough pop-ups, conferences, and concerts—like all our friends and acquaintances seem to be? This concern has become so widespread that it has a new name: fear of missing out, better known by its acronym, FOMO. At the birth of the experiential era, four out of every 10 people aged 18 to 34 in the US and UK say they sometimes worry that they are missing out. Facebook, you might say, is giving us a new way to keep up with the Joneses, and a new way to worry that we may not be keeping up.


FOMO is, at the least, problematic for experientialism. Because if this new way of living is just as likely to deliver anxiety and stress as materialism, how is it an improvement? Thought of in these terms, experientialism might even sound worse than materialism.


In today’s hyper-connected, 24/7 world, the game has changed. Not only do we notice material status cues when we see people in the real world, we are also getting and giving status updates through Facebook, Twitter, and all the other social networks. And since we check these throughout the day—when we get up and when we go to bed, on the toilet, on the train, in the classroom, and in the office—that means that we are playing the game more regularly, and thinking about the game more too. As we do that, we are more likely to end up feeling anxious and stressed, and perhaps depressed, about status.


There is another change in the game that is having an even more damaging effect, I believe, on happiness. In the past, as we went about our everyday lives, we would not only see people with fancier cars and watches and clothes. We would also encounter people with cheaper, older, more threadbare, and less designer equivalents of the stuff we had. That mix would leave us feeling secure. It felt okay not being at the top of the social ladder, as long as you weren’t at the bottom of the pile either.


Comparing experiences is less clear-cut than it is with material goods, which means you're less likely to think about the status implications of what you do.


Now, think about the last time you looked at a social network. Have you ever noticed how Facebook and the other social networks sometimes bring to mind upscale magazines filled with the look-but-don’t-touch lifestyles of the rich and lucky?


Your friends’ lives may well not be quite so perfect, of course. Life for most people, after all, is not a flawless timeline of weekends away and weddings in glamorous places. And if you stop to think about it, you know that. But it is hard to keep that in perspective. And since we are all connected to so many people on Facebook, there is always someone jetting off to Miami, having lunch in Lima, lounging on a boat in the Mediterranean, or attending a wedding in the Caribbean.


This constant bombardment leaves us feeling that we are always at the bottom of the pile looking up. And that, in a meritocratic system like ours, can leave us feeling anxious, stressed, and depressed.


So all of this puts Facebook and other social networks in the curious position of supporting the rise of experientialism, while also undermining its benefits. This suggests, ironically, that if you buy into experientialism, you could end up with just as much anxiety, stress, and depression as you would have had in gentler, more materialistic times. But, and it is a very important but, even though keeping up with the Joneses through experiences has the potential to be anxiety-inducing, experientialism is still better than materialism.


Despite the double-edged sword of social media, it’s important to learn and remember four discoveries social scientists have made in recent years: that experiences are more likely to make us happy because we are less likely to get bored with them, more likely to see them with rose-tinted glasses, and more likely to think of them as part of who we are, and because they are more likely to bring us closer to other people and are harder to compare.


Even if experiences can be compared, the comparison is less clear-cut than it is with material goods, and that means you are less likely to think about the comparison, less likely to regret your choice afterward, and less likely to think about the status implications of what you do. So if you want to be happier, save yourself the hard work of all that thinking, and just choose experiences instead. They’re not only the smart route to happiness. Thanks to Facebook, they’re also the best way to shake your tail feathers.


Reprinted from Stuffocation by James Wallman with permission of Spiegel & Grau, an imprint of Random House, a division of Penguin Random House, LLC. Copyright (c) James Wallman, 2015.


Editor: Samantha Oltman (@samoltman)



“We Are Not Ready”: Ebola Analysis from Front-Line Workers (And Bill Gates)

We’ve just passed a difficult and little-noticed anniversary: Last week, the Ebola epidemic in West Africa achieved its first birthday. Though the viral outbreak has been contained, it is still not under control: According to the World Health Organization, cases continue in Sierra Leone and are rising again in Guinea. Liberia was about to record an entire incubation period without a new case — a signal that the chain of person-to-person transmission might have been broken — but on Friday, it announced that it had found a single new case. How that woman became infected is unclear; it is possible that she represents, not a new outbreak, but a brief interruption in an otherwise promising trend.


It has been decades since there was an epidemic of this persistence and magnitude. No other Ebola outbreak matches it; nor does the 2003 epidemic of SARS. You would have to go back to the early days of HIV in the 1980s, or to the flu pandemics in 1968, 1957 or even 1918, to find an outbreak that sickened so many people, challenged international response capacity so much, and instilled such fear in other countries.


The anniversary has triggered reflections. Some criticize the response to Ebola as inadequate and slow. Others — including two speakers at last week’s TED conference, one of them Bill Gates — extract lessons that should inform responses to future epidemics.


Because there will be future epidemics. That’s for sure.


In a report published this morning, Doctors Without Borders (also known as MSF, its acronym in French) excoriates the international response to the discovery of the first cases a year ago, deriding a “global coalition of inaction”:




The Ebola outbreak proved to be an exceptional event that exposed the reality of how inefficient and slow health and aid systems are to respond to emergencies. ‘Business as usual’ was exposed on the world stage, with the loss of thousands of lives.



The assessment gives low marks to the governments of the affected countries, for attempting to deny or suppress early reports; to the World Health Organization, for failures of leadership; and to aid agencies in the West, for being slow to send personnel and supplies. Conscientiously, MSF also scores its own response, examining its slowness to mobilize the full organization, its difficulty in choosing among competing priorities, and its struggles to balance care for patients with its duty to protect its employees.


Meanwhile: Last week was the influential annual TED conference, which focuses on “ideas worth spreading.” Two prominent members of the international response to Ebola spoke at the conference, and simultaneously published pieces based on their talks to ensure that their ideas had wide distribution. (I attended TED as a speaker and heard both.) TED is known for a kind of indomitable technological optimism — biosynthetic design, genetic engineering, self-driving cars — so it was refreshing and a little surprising to see how attentively the TED audience listened to discussions that were measured, if not outright pessimistic.


First, Seth Berkley, chief executive of GAVI, the Vaccine Alliance, described how Ebola has shone a spotlight on a fundamental problem in vaccine development. “We have known about Ebola since 1976. We have had ample opportunity to study it; 24 outbreaks of it have occurred so far, and we have had vaccine candidates available now for more than a decade,” he said. But, he added: “The people most at risk from these diseases are also the ones least able to pay for vaccines. This leaves little in the way of market incentives for manufacturers to develop vaccines, unless there are large numbers of people at risk in wealthy countries. It is simply too commercially risky.”


In a companion essay in Nature, Berkley argues for changing the way vaccines are funded, switching to a government-industry partnership that could get vaccines out quickly once the basic research is paid for.



When an outbreak occurs and vaccines are needed, it would help significantly to have vectors ready to deliver them. With the right investment, these vectors, typically a harmless virus or bacterium, could be prepared and tested in advance. Crucially, they could be pressed into service to tackle a range of diseases. Four of the five Ebola vaccines currently going through clinical trials use vectors developed and tested for HIV.


Such generic vectors would, in effect, modularize the vaccine development process — conducting much of the safety testing and ironing out manufacturing processes for different vectors ready for the addition of a ‘payload’ antigen. By developing such mechanisms in advance, and pre-testing them for safety and dose, we can save significant amounts of money and time by having stockpiles frozen and ready for use or efficacy testing as soon as an outbreak occurs.



In another talk, philanthropist Gates expanded Berkley’s call for better vaccine development into a multi-stranded argument for better preparation for outbreaks. Recalling the obsession with global nuclear war that dominated his childhood, he said:



If anything kills over 10 million people in the next decades, it is most likely to be a highly infectious virus, rather than a war: not missiles, but microbes. We have invested a huge amount in nuclear deterrence. We actually have invested very little in a system to stop an epidemic. We are not ready for the next epidemic.



In an essay published simultaneously in the New England Journal of Medicine, Gates describes the investments that would be necessary to create a global warning and response system. His list, from the end of the piece:



This system should:



  • be coordinated by a global institution that is given enough authority and funding to be effective,

  • enable fast decision making at a global level,

  • expand investment in research and development and clarify regulatory pathways for developing new tools and approaches,

  • improve early warning and detection systems, including scalable everyday systems that can be expanded during an epidemic,

  • involve a reserve corps of trained personnel and volunteers,

  • strengthen health systems in low- and middle-income countries, and

  • incorporate preparedness exercises to identify the ways in which the response system needs to improve.



In the same way that there has never been an outbreak quite like this Ebola epidemic — globally unnerving but geographically limited, fast-moving and yet contained enough that it can be assessed from a distance — there probably has never been an outbreak that offered such concrete examples of things to do better next time. For the sake of everyone’s health, let’s hope those lessons are learned.



Archaea: Surviving in hostile territory

Many strange creatures live in the deep sea, but few are odder than archaea, primitive single-celled bacteria-like microorganisms. Archaea go to great lengths -- eating methane or breathing sulfur or metal instead of oxygen -- to thrive in the most extreme environments on the planet.



Recently, while searching the ocean's depths off the coast of Santa Monica, California, a team of UC Santa Barbara scientists discovered something odder still: a remarkable new virus that seemingly infects methane-eating archaea living beneath the ocean's floor. The investigators were further surprised to discover that this virus selectively targets one of its own genes for mutation and, moreover, that some archaea do too. The researchers' findings appear today in the journal Nature Communications.


"Our study illustrates that self-guided mutation is relevant to life within Earth's subsurface and uncovers mechanisms by which viruses and archaea can adapt in this hostile environment," said David Valentine, a professor in UCSB's Department of Earth Science and at the campus's Marine Science Institute (MSI). "These findings raise exciting new questions about the evolution and interaction of the microbes that call Earth's interior home."


Using the submarine Alvin, Valentine and colleagues collected samples from a deep-ocean methane seep by pushing tubes into the ocean floor and retrieving sediments. The contents were brought back to the lab and fed methane gas, which helped the archaea in the samples grow. When the team assayed the samples for viral infection, they discovered a new virus with a distinctive genetic fingerprint that suggested its likely host was methane-eating archaea.


"It's now thought that there's more biomass inside Earth than anywhere else, just living very, very slowly in this dark, energy-limited, starved environment," said co-author Sarah Bagby, a postdoctoral scholar in the Valentine lab.


The researchers used the genetic sequence of the new virus to chart other occurrences in global databases. "We found a partial genetic match from methane seeps in Norway and California," said lead author Blair Paul, a postdoctoral scholar in the Valentine lab. "The evidence suggests this viral type is distributed around the globe in deep ocean methane seeps."


Further investigation revealed another unexpected finding: a diversity-generating retroelement that greatly accelerates mutation of a specific section of the viral genome. Such small genetic elements had previously been identified in bacteria and their viruses, but never among archaea or the viruses that infect them. While the self-guided mutation element in the archaeal virus clearly resembled the known bacterial elements in many respects, the researchers found that it has a divergent evolutionary history.


"The target of guided mutation -- the tips of the virus that make first contact when infecting a cell -- was similar," said Paul. "The ability to mutate those tips is an offensive countermeasure against the cell's defenses -- a move that resembles a molecular arms race."


Having found guided mutation in a virus infecting archaea, the scientists reasoned that archaea themselves might use the same mechanism for genetic adaptation. Indeed, in an exhaustive search, they identified parallel features in the genomes of a more mysterious subterranean group of archaea known as nanoarchaea. Unlike the deep-ocean virus that uses guided mutation to alter a single gene, nanoarchaea target at least four distinct genes.


"This is a new record," said Bagby. "Previously, a few bacteria had been observed to target two genes with this mechanism. That may not seem like a huge difference, but targeting four is extraordinary. If they're all firing at once, suddenly the number of combinations of protein variants in play is really massive."


According to Valentine, the genetic mutation that engenders these potential variations may be a key element to survival of archaea beneath Earth's surface. "The cell is choosing to modify certain proteins," he explained. "It's doing its own protein engineering internally. While we don't know what those proteins are being used for, I think learning about the process can tell us something about the environment in which these organisms thrive. Right now, we know so little about life in that environment."




Story Source:


The above story is based on materials provided by the University of California, Santa Barbara. The original article was written by Julie Cohen. Note: Materials may be edited for content and length.



Glee Is Gone, But Its GIFs Will Live Forever

Grant Gustin (now The Flash on CW) during his time as a less-heroic Dalton Academy Warbler on Glee. Fox



After six seasons, Glee sang its way offstage on Friday—a cold slushy to the face for viewers who had been following the New Directions since the time when the wordplay in their name was still funny. But the Fox show isn’t leaving us empty-handed; Glee‘s legacy is a crapload of fantastic GIFs we can use long after its characters are but a distant memory.


Because of the emotional teacup ride Glee took us on weekly, there are GIFs for almost every occasion: crying, hugging, dancing, slapping, more crying, judging … you get the idea. Basically, there’s not an Internet flame war out there that can’t be deaded with a reaction from the students of William McKinley High School.


But while Glee’s greatest gift might have been its plethora of GIFs, we can’t put them all here (our load times would suck). So instead, we decided to stick to just the best diva moments from the show—and frankly, even that was a lot to cull through. What lives below are some of the greatest bits of shade, judgment, and sass from the show’s many seasons. And although Beyoncé’s definition that “a diva is a female version of a hustla” still stands, we—and Blaine Anderson—would like to remind you that guys can be divas too.


HellToTheNo Fox

Hell to the No


The Use Case: Calling out someone’s BS.

The Background: If there’s one talent Mercedes Jones (Amber Riley) has besides being the best park-and-bark player on Glee, it’s her ability to shut. It. Down. Her “hell no” abilities are flawless.


SuePunch Fox

Sue Said Knock You Out


The Use Case: Wanting to punch someone through a screen.

The Background: When Sue’s new team loses to New Directions at regionals, she goes full Tyson on the woman who announces the results.


RachelNo Fox

The Sassier “No”


The Use Case: Not just dismissing someone’s question, but making them feel stupid for even asking it.

The Background: Rachel Berry (Lea Michele) already looks like she’s trying hard to fight off a bad case of BRF, so when she actually expresses her disapproval it’s just that much better.


SantanaShade Fox

Throwing Shade, Santana Style


The Use Case: Telling someone they’re acting ugly.

The Background: The way Santana Lopez (Naya Rivera) throws it, it’s not subtle enough to be called shade, but who cares? She will toss it at anyone, any time, any place—including small children.


KurtJudgingYou Fox

The Full Read


The Use Case: “Oh, so it’s like that?”

The Background: Kurt being…Kurt.


RachelUGH Fox

Ugh


The Use Case: Being over it.

The Background: Who needs one? Just a classic Rachel Berry eye roll.


NotAmused Fox

Not Amused


The Use Case: Shaming someone who’s acting out of pocket.

The Background: Two cheerleaders (Brittany/Heather Morris and Quinn/Dianna Agron) and one goth girl staring at you with that bored look in their eyes that just says you’re the most uninteresting person in the room.


LikesBoys Fox

Out and Proud


The Use Case: Expressing general fabulousness.

The Background: During a rousing performance of Lady Gaga’s “Born This Way,” Kurt’s jacket is ripped away to reveal a T-shirt that definitively answers all questions.


LactatingWithRage Fox

Lactating With Rage


The Use Case: Being so mad you can’t even think straight.

The Background: Sue Sylvester drops insults the way Eminem drops bars—densely multilayered, and sometimes nonsensical.


Dreamgirls Fox

“I’m Tiyaaahd!”


The Use Case: Having had enough of someone’s bad behavior.

The Background: During a performance of “It’s All Over” from Dreamgirls, Kurt only gets about one whole line, but he delivers it with perfection.