Kingsman’s DNA Is Like a Mash-Up of James Bond and Kick-Ass


Kingsman

Jaap Buitendijk/20th Century Fox



Every movie started somewhere as a tiny shadow of an idea, a single-celled organism that evolved into a walking, talking, fighting, exploding, and complex being. And if we look at the DNA, if you will, of these organisms, we can trace their origins. Here is a breakdown of the primordial soup that brought Kingsman: The Secret Service to the big screen.


Antecedents


Director Matthew Vaughn has been circling Kingsman for a long time now—starting with his love of spy cinema from the 1970s. While doing press for the movie Layer Cake all the way back in 2005, the director spoke to IGN FilmForce about his potential involvement with the 007 film Casino Royale, which he nearly directed, and has long been public about his desire to get in on the espionage game. He told Den of Geek last month, “I’ve always loved Bond. There were two franchises that I would always have dropped everything to do as a director. Bond was one. Star Wars was the other. And neither of them came my way, so…” So, Kingsman happened instead! And Vaughn was so set on the project he even walked away from X-Men: Days of Future Past to do it. Despite having the keys to the X kingdom after reviving the franchise with First Class, Vaughn felt Kingsman was ultimately more in line with his sensibilities.


And when we say “sensibilities” we mean Kick-Ass-meets-James-Bond-meets-Lock, Stock and Two Smoking Barrels, the film he produced for fellow stylized-crime enthusiast Guy Ritchie. Vaughn telegraphed his violent tendencies in Lock, Stock and, with the exception of the criminally underappreciated Stardust from 2007, has stayed true to that aesthetic of cartoonishly charged brutality ever since. Fortunately for the director, he found a kindred spirit in comics luminary and ultra-violence enthusiast Mark Millar. Millar is the co-creator of both the Kick-Ass and The Secret Service series, making Kingsman their second collaboration. Together, the pair might be doing more work than anyone to counterbalance the dark superhero motif established by Christopher Nolan—a style that Vaughn himself has explicitly declared his boredom with. This is a man committed to putting asses in seats and delivering a helluva good time, and considering he counts Back to the Future, Reservoir Dogs, Rocky III, and Scarface among his most influential movies in a British Film Institute poll, it looks like the populists have found their Van Wilder in Matthew Vaughn.



The Hot Yet Little-Known Trend That’ll Supercharge AI



Then One/WIRED



When Andrew Ng trained Google’s army of computers to identify cat videos using artificial intelligence, he hit a few snags.

Google’s worldwide network of data centers housed more computers than he needed for the job, but harnessing all that power wasn’t easy. When a server broke down—an everyday occurrence when you’re using 1,000 machines at a time—it slowed down his calculations.


According to Ng, this is one of the big unreported stories in the world of deep learning, the hottest trend these days in big data and artificial intelligence: it’s not necessarily suited to cloud computing—i.e. the techniques the Googles and the Amazons and the Facebooks typically use to run software across tens of thousands of machines.


Not long after Ng’s AI experiment, a Stanford University researcher named Adam Coates came up with a better way to do things. He used a different type of microprocessor, called a graphics processing unit, or GPU, to string together a three-computer system that could do the work of Google’s 1,000-computer cloud. It was a remarkable accomplishment.


“That big disparity in resources that you needed to run these experiments is because on the one hand the GPUs are a lot faster themselves, but also because once you have a much smaller system that’s much more tightly integrated, there are sort of economies of scale,” says Coates, who now works for Andrew Ng at Chinese search giant Baidu.


Gamers know about GPUs because they often buy special GPU cards to speed up their video game experience. But even before Ng was experimenting at Google, academics knew about GPUs too. That’s because they have exceptional math-crunching abilities, which make them ideal for deep learning. Initially, researchers only wrote deep learning software for single-computer systems. What Coates had done was show how to build deep learning networks over many GPU-based computers. And his work is now trickling down to many others. Google and Facebook are using GPUs, but so are the labs that run some of the world’s biggest supercomputers: Oak Ridge National Laboratory and Lawrence Livermore National Laboratory. There, they hope to take advantage of these powerful chips and the kind of ultrafast networking gear that has become widely used in supercomputers.
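The appeal of GPUs here is concrete: the core of a neural network’s forward pass is dense matrix arithmetic, exactly the kind of highly parallel work GPUs excel at. A minimal NumPy sketch of one fully connected layer makes the point (all of the sizes below are illustrative, not figures from the article):

```python
import numpy as np

# A single dense (fully connected) neural-network layer is, at its core,
# one big matrix multiplication followed by a cheap elementwise nonlinearity.
def dense_layer(x, weights, bias):
    """Forward pass: y = relu(x @ W + b)."""
    return np.maximum(0.0, x @ weights + bias)

rng = np.random.default_rng(0)
batch = rng.standard_normal((64, 1024))   # 64 inputs, 1024 features each
W = rng.standard_normal((1024, 4096))     # layer mapping 1024 -> 4096 units
b = np.zeros(4096)

out = dense_layer(batch, W, b)
print(out.shape)  # (64, 4096)
```

That one `x @ weights` line is roughly 64 × 1024 × 4096 multiply-adds, and deep networks chain many such layers over many batches, which is why raw matrix-multiply throughput dominates training time.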


Supercomputers Meet Deep Learning


On the east side of Oak Ridge National Laboratory’s Tennessee campus, there’s an 80-acre research facility called the Spallation Neutron Source, or SNS. Built in 2006, it blasts the world’s most intense beams of neutrons at materials to help physicists and chemists understand the inner structure of materials and how they form.


The SNS produces more data than can be completely analyzed (hundreds of terabytes), but Oak Ridge scientists believe that they could use deep learning algorithms to more quickly identify patterns in the data—identifying patterns is a deep learning specialty—and improve their analysis.


The issue is widespread. It’s not uncommon for scientific simulations to produce 700 terabytes of data each time they are run. That’s more than all of the information housed in the Library of Congress. “In the science world there is a big data problem,” says Robert Patton, an Oak Ridge computer scientist. “Scientists are now doing simulations that are simply producing too much data to analyze,” he says.


But GPU-powered deep learning could change things—especially when fused with the super-fast networking capabilities of high-performance computers such as Oak Ridge’s Titan supercomputer. Titan is a little different from a Google cloud. It too spans thousands of machines, but it can more quickly swap data in and out of each machine’s memory and push it to another machine. So, at Oak Ridge, researchers have resolved to run deep learning algorithms on Titan.


Facebook uses GPUs too, but its lead deep-learning researcher, Yann LeCun, isn’t writing off the CPU entirely. “We use a GPU-based infrastructure to train our deep learning models. Conventional CPUs are just too slow,” he says. “But new CPU chips—with many cores—may approach the performance of GPUs in the near future.”


The Big Rewrite


Before they can realize their AI ambitions, the Oak Ridge supercomputer geeks must write deep-learning software that works on their towering supercomputers. And that will likely take years of work, the researchers say.


Andrew Ng’s original Google work built models of cat videos that had 1 billion parameters in them—helping the algorithms to build an almost human understanding of the subtleties of the images in the videos and distinguish between, for example, a YouTube cat video and one featuring a chinchilla.


Over at Lawrence Livermore Labs, they’ve built software that includes 15 billion parameters—15 times as many as the original Google experiment—and they intend to go even higher. “We hope by the end of this project to have built the world’s largest neural network training algorithm that’s enabled by high performance computing,” says Barry Chen, a Knowledge Systems and Informatics group leader with the labs.


The Google Way


What’s Google doing? Well, it’s moving to GPUs too. But it’s taking a different route. The tech giant has already built a new and remarkable deep learning system called DistBelief, and it can run on either GPUs or CPUs within its sprawling cloud.


Google splits up the number crunching job into hundreds of smaller clusters of between 1 and 32 machines, which gradually tweak the data model that Google has assembled. It’s a giant compute operation that gradually gives Google’s software the ability to distinguish between things like a chair and a stool, or the word shift and the word ship.
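That split-and-average scheme can be illustrated with a toy data-parallel training loop. This is a hypothetical sketch of the general idea only, not DistBelief itself (the real system is asynchronous and runs across actual machines rather than in one process):

```python
import numpy as np

def gradient(theta, x, y):
    # Gradient of mean squared error for a simple linear model y ≈ x @ theta.
    return 2 * x.T @ (x @ theta - y) / len(y)

rng = np.random.default_rng(1)
x = rng.standard_normal((1200, 5))
true_theta = np.arange(5, dtype=float)
y = x @ true_theta

theta = np.zeros(5)                            # the shared model parameters
shards = np.array_split(np.arange(1200), 8)    # 8 simulated worker clusters

for step in range(200):
    # Each "cluster" works only on its own shard of the data; in a real
    # system these gradient computations run on separate machines.
    grads = [gradient(theta, x[s], y[s]) for s in shards]
    theta -= 0.1 * np.mean(grads, axis=0)      # central averaged update

print(np.round(theta, 2))  # converges toward [0, 1, 2, 3, 4]
```

The payoff of this structure is fault tolerance: if one shard’s worker dies mid-step, the central update can simply proceed with the gradients that did arrive, which is why individual machine failures need not derail the whole job.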


So machines can fail inside Google’s data center—that’s inevitable—but when they do, the consequences aren’t so severe. In fact, the entire system is designed so that Google’s researchers won’t even notice when there is a failure, says Greg Corrado, a Google research scientist.


“The larger question of cloud computing vs HPC [high performance computing] is a matter of taste, company culture, and available resources,” he says. “I’ve done both. I’m very happy with Google’s internal system of course.”



The Best and Worst Public Transit Systems, According to Twitter



Jens Schott Knudsen/Getty Images



If Twitter’s good for one thing, it’s complaining. It’s the easy, free way to convert quiet muttering into griping everyone and anyone can hear, without the need to scream out loud and brand yourself a lunatic. So it’s no surprise that when Lisa Schweitzer, an associate professor at USC’s Sol Price School of Public Policy, started collecting tweets for a study on how people talk about public transit agencies, she found gobs of evidence to work with.


The results of her study, published this month in the Journal of the American Planning Association, ranked 10 of the largest public transit agencies in the US and Canada by how well regarded they are on Twitter. Based on Schweitzer’s “mean sentiment score” and more than 60,000 tweets collected between 2010 and 2014, Twitter was nicest to Vancouver’s TransLink, which was followed by Portland, Oregon’s TriMet and Toronto’s TTC. The harshest tweets concerned systems in Chicago, Philadelphia, Boston, and New York. For comparison, Schweitzer calculated scores for public figures (the sentiment score ranged from William Shatner to Osama Bin Laden), airlines, police departments, and welfare programs (the full chart is at the bottom of this post).


Schweitzer used text mining to pick out positive and negative words from the tweets (and manually added terms including brokedown, wtf, scam, epicfail, pervy, and unsuck). Machine learning helped spot things like parody accounts and unusually frequent tweeters. Schweitzer and her graduate students also analyzed some 5,000 tweets by hand, to ensure they lined up with the computer system’s interpretations. Reasons for complaint included delays, facilities, staff conduct, public mismanagement, and the class, race, and gender of other riders.
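A bare-bones version of that lexicon approach looks something like this. The word lists below are tiny stand-ins for illustration, not Schweitzer’s actual lexicon, and real text-mining pipelines add weighting, negation handling, and the machine-learning filters described above:

```python
# Minimal lexicon-based sentiment scoring: count positive and negative
# words in a tweet and normalize to a score between -1 and 1.
POSITIVE = {"great", "fast", "clean", "reliable", "thanks"}
NEGATIVE = {"late", "delay", "dirty", "wtf", "epicfail", "unsuck", "scam"}

def sentiment(tweet: str) -> float:
    """Score in [-1, 1]: (positives - negatives) / words matched."""
    words = tweet.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return 0.0 if pos + neg == 0 else (pos - neg) / (pos + neg)

print(sentiment("train late again wtf"))      # -1.0
print(sentiment("clean car and fast ride"))   # 1.0
```

A “mean sentiment score” for an agency is then just this score averaged over every tweet mentioning it.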


Here’s the funny thing: The transit systems’ scores don’t line up with service quality (judged by on-time performance). But the unsurprising fact that public griping doesn’t necessarily match reality doesn’t make the data useless. Because Schweitzer did find one factor that predicts “mean sentiment”—the way the transit agencies themselves behave on Twitter.


Tweets from the people running subways and buses in New York, Los Angeles, Washington, D.C., and other cities weren’t included in the data set, but Schweitzer took them into consideration, and found they matter a great deal. Some agencies use Twitter to pump out impersonal blasts announcing service disruptions. Others make the effort to respond to user complaints and interact with them. That second approach, it turns out, makes a major difference in how the agency is perceived online. “Transit companies that respond to other social media users have statistically more favorable opinions expressed about the transit agency for just about every measure I considered,” Schweitzer writes.


The clearest example is provided by Philadelphia’s SEPTA, which in late 2011 started a @SEPTA_SOCIAL account for dialog with riders. After one year, Schweitzer found, its score went from -0.3 to nearly 0 (on a scale from -1 to 1). That’s a 70 percent improvement.
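That 70 percent figure is consistent with measuring the change against the starting score’s magnitude. A quick check, assuming “nearly 0” means roughly -0.09 (the exact end value is our assumption, since the article doesn’t give it):

```python
before, after = -0.3, -0.09   # "nearly 0"; the exact end value is assumed
improvement = (after - before) / abs(before)
print(f"{improvement:.0%}")   # 70%
```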


So what’s the takeaway? If you’re looking for a low-investment way to improve your public image on Twitter, use Twitter as a tool for conversation, not one-way communication. It may seem that someone complaining to 18 followers that their train is late doesn’t matter, but Schweitzer makes the point that social media does influence broader public perceptions.


“If planners seek to support strong public transit systems as a key element in building equitable and sustainable communities, they should encourage positive public sentiment about the service, in part by encouraging public transit agencies to use interactive social media approaches.”


Or, you know, make the trains run on time.


[Chart: mean sentiment scores for transit agencies and comparison categories]

USC Sol Price School of Public Policy




New mechanism that controls immune responses discovered

UT Southwestern Medical Center researchers have identified a common signaling mechanism to produce interferon -- one of the main proteins used to signal the immune system when the body needs to defend itself against a virus, tumor, or other diseases.



The findings are important for understanding the body's immune defense system, searching for compounds to turn the immune system on or off, and they may help combat autoimmune diseases, in which overactive immune cells attack healthy tissues.


"Our work reveals a common mechanism by which three distinct pathways lead to the production of type-I interferons," said Dr. Zhijian "James" Chen, Professor of Molecular Biology and in the Center for the Genetics of Host Defense at UT Southwestern, and a Howard Hughes Medical Institute (HHMI) Investigator. "Ultimately, we believe that understanding this mechanism will facilitate the design and development of medications to treat human diseases such as lupus."


The findings appear online in the journal Science.


The results show how a protein called interferon regulatory factor 3 (IRF3), which controls the production of type-I interferons, is activated and how this pathway is tightly controlled. The failure of this control system can lead to autoimmune disorders such as systemic lupus erythematosus, which causes joint pain and swelling, and can damage the brain, heart, lungs, kidneys, and digestive tract. Lupus affects more than 1.5 million Americans, and is more common in young and middle-aged women than in men.


A normal function of interferons is to defend the body against infections from viruses, bacteria and parasites. Previous research has identified specific pathways that induce interferons in response to distinct infectious agents, but how these different pathways converge on IRF3 to induce interferons was not understood.


Dr. Chen and his team studied a protein called MAVS, which they discovered in 2005 and showed to be an adaptor protein essential for interferon induction by RNA viruses such as influenza virus. In the new study, they found that MAVS is modified by the addition of a phosphate group (phosphorylated) by an enzyme called TBK1 when cells are infected by a virus, and that this modification is important for IRF3 activation.


Upon closer examination, they found the amino acid sequence that is phosphorylated in MAVS is very similar to those of two other adaptor proteins, STING and TRIF, which mediate interferon induction in response to DNA viruses and bacteria, respectively. Further research confirmed that all three adaptor proteins are phosphorylated at the common sequence motif and that this phosphorylation allows each of the adaptor proteins to bind IRF3, thereby facilitating IRF3 phosphorylation by TBK1. The phosphorylated IRF3 becomes activated to induce type-I interferons.


"Although TBK1 is required for IRF3 activation, TBK1 alone is not sufficient. Phosphorylation of the adaptor proteins provides a 'license' for TBK1 to phosphorylate IRF3," said Dr. Chen, who holds the George L. MacGregor Distinguished Chair in Biomedical Science. "This hitherto unrecognized mechanism ensures that type-I interferons are produced only when a proper adaptor protein is engaged in cells that are infected by pathogens."




Story Source:


The above story is based on materials provided by UT Southwestern Medical Center. Note: Materials may be edited for content and length.



Two Revolutionary New Sex Toys, Plus 14 Top-Selling Dildos



Adam Voorhes; Prop Styling: Robin Finlay



Hello Touch X


When it comes to pleasure peripherals, dildos aren’t for everyone. So some companies are thinking outside the (dick in a) box with new sex toys, like Jimmyjane’s powerful fingertip vibrator. Inspired by Sigourney Weaver’s power-loader exosuit in Aliens, it’s meant to be an extension of your body.


Two-pronged touch

Most ladies want stroking, not just poking, so for its signature model, Jimmyjane took inspiration from the popular Rabbit vibrator, building two flexible vibrating fingerpads—like rabbit ears—to deliver clitoral sensation in stereo.


Coin motors

Fingertip vibes often use large DC motors, but the X hides a 14,000-rpm coin motor, half the size of a penny, in the sleeve of each pad. Designers tested its intensity and power with an accelerometer until it outperformed the competition.


Charging pack units

The disposable AAAs in many buzzies drain quickly, forcing you to choose between your TV remote and your orgasm. The X’s lithium-ion battery delivers stronger vibrations and recharges with a USB cable.


Silicone sheath

Materials in sex toys are largely unregulated, but these fingerpads are made of nonporous, medical-grade silicone to help keep bacteria at bay. A waterproof seal around the battery pack connection protects its insides.


E-stimulation

Vibratory caresses not intense enough for you? The X has two electrical stimulation pads, one positive, one negative. Once your skin completes the circuit, up to 15 milliamps of current increase blood flow and contract muscles. Yowza!



Adam Voorhes; Prop Styling: Robin Finlay



Eva


Most vibrators need hands to keep them in place—which prevents fingers from doing other, funner things. So sex educator Alexandra Fine set out to find a hands-free fix. After a DIY attempt involving a half-dollar wrapped in Saran wrap, she teamed up with mechanical engineer Janet Lieberman to form Dame Products. The result: an innovative, low-profile couples vibrator for the cliterati. Look, ma, no hands!


Location matters

Many clit-specific vibrators are U-shaped, to hook inside—but they tend to have a numbing effect on your partner. The Eva sits directly over the clitoris, applying vibration just where it’s wanted.


Loving arms

Friends who volunteered for testing complained that the vibrator fell off while doing the deed. So Lieberman gave it wings to stay in place. “As you open and close your legs,” Fine says, “it opens and closes its wings with you.”


Flexible fit

Lieberman used a 3-D printer to iterate the flexible wings 75 times. She played with materials, curves, and angles to create a lightweight plastic form that would bend and snap back, helping the Eva stay snug.


Powerhouse

The Eva is powerful despite its size, packing 7 g’s of acceleration into a petite 1.1-inch egg. Users cycle through three pulse strengths by pressing a large, easy-to-access button in the center.


Rubber cover

To encase the Eva’s funky shape in silicone, Fine couldn’t use traditional injection molding—its high temperatures would damage the electronics inside. So she vulcanized the rubber with a cooler curing process.


Plus 14 Top-Selling Dildos


Companies like Jimmyjane and Dame may be transcending the limitations of penis imitation, but the dildo still towers over the competition. At erotic shop Adam & Eve, which sells 2 million–plus sex toys a year, 14 of the top 25 are penis-shaped (more or less).


*Units sold in 2014. Chart: Chris Philpot



Like Twitter and Netflix, LinkedIn Clips Lines To Outside Services


LinkedIn is changing the way outside services plug into its popular business-centric social network.


Under the new rules, outside apps and websites can still use LinkedIn to log you into their own services and perform other tasks that rely on the social network’s data. But after May 12, LinkedIn will shut down access to some data. For example, outside services will no longer have access to the length and breadth of your LinkedIn profile (even if you give your approval), and they won’t be allowed to send invitations to connect on LinkedIn.
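The “log you into their own services” part works through the standard OAuth 2.0 authorization-code flow. As a rough sketch (the endpoint URL and scope name below are placeholders, not LinkedIn’s actual values), an outside app starts the flow by sending the user to the provider’s consent screen:

```python
# Minimal sketch of the OAuth 2.0 authorization-code flow that
# "sign in with LinkedIn"-style integrations rely on. The endpoint
# URL and scope names are illustrative placeholders, not taken
# from LinkedIn's documentation.
from urllib.parse import urlencode

AUTH_ENDPOINT = "https://www.example.com/oauth/authorize"  # placeholder

def build_authorization_url(client_id, redirect_uri, scopes, state):
    """Step 1: send the user to the provider's consent screen.

    After the user approves, the provider redirects back to
    redirect_uri with a short-lived ?code=... parameter, which the
    app exchanges server-side for an access token. What that token
    is then allowed to read (profile fields, connections, and so on)
    is exactly the part LinkedIn is now restricting.
    """
    params = {
        "response_type": "code",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": " ".join(scopes),
        "state": state,  # anti-CSRF nonce the app verifies on return
    }
    return f"{AUTH_ENDPOINT}?{urlencode(params)}"

url = build_authorization_url(
    client_id="my-app-id",
    redirect_uri="https://myapp.example.com/callback",
    scopes=["basic_profile"],
    state="random-nonce-123",
)
print(url)
```

The login handshake itself survives the policy change; what changes is how much of a user’s profile that access token can reach afterward.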


If companies and developers still want access, they’ll have to apply for inclusion in an official LinkedIn partnership program.


LinkedIn says it’s trying to cut down on apps that don’t benefit its users. “We’ve obviously seen some great unique applications being built on LinkedIn,” says company spokesperson Julie Inouye. “But we’ve seen a couple that are spammy, or sites that are using LinkedIn to grow their membership, which we don’t think is in our users’ best interests.”


The move is part of a larger shift in the way web giants allow outside applications to interact with their services. Most notably, Twitter—which once allowed widespread access to its service through APIs, or application programming interfaces—is now significantly restricting what developers can do. In 2012, it limited the number of users that outside services were allowed to juggle.


More recently, Netflix shut down its public API entirely.


The result is an internet that’s not quite as fluid as it once was, an internet that’s divided into pieces. That said, even as larger players close off their APIs, the number of open APIs listed on Programmable Web, a directory of such services, continues to grow.



Bubonic bottleneck: Scientists overturn dogma on the plague

For decades, scientists have thought the bacteria that cause the bubonic plague hijack host cells at the site of a fleabite and are then taken to the lymph nodes, where the bacteria multiply and trigger severe disease. But UNC School of Medicine researchers discovered that this accepted theory is off base. The bacteria do not use host cells; they traffic to lymph nodes on their own and not in great numbers.



In fact, most of the plague-causing bacteria -- called Yersinia pestis -- get trapped in a bottleneck either in the skin, while en route to the lymph node, or in the node itself. Only a few microbes break free to infect the lymph node and cause disease.


"Anytime you find something where the host is winning, you want to exploit it," said Virginia Miller, PhD, professor of microbiology and immunology and senior author of the paper in PLoS Pathogens. "If we can understand how the host and the bacteria contribute to this bottleneck, then this could become something we'd target so we could either ramp up what's causing the bottleneck or slow down the infection."


The discovery offers much-needed information about how virulent insect-borne diseases such as plague, malaria, and dengue cause infection. The findings also present new routes for research on how bacterial strains cause disease despite the immune system's best efforts.


The plague, which killed millions of people during the Middle Ages, is contracted by several people each year in the western United States. Outbreaks have occurred in the recent past in India and Africa, and one is unfolding right now in Madagascar. Standard antibiotics are effective against Y. pestis if taken early enough. But infection can go undetected for days, making diagnosis difficult and antibiotics less effective the longer the bacteria take root.


There are three kinds of plague, all caused by Y. pestis: bubonic, which is contracted through a fleabite; pneumonic, which is contracted by breathing in the bacteria; and septicemic, which is a severe infection of the blood.


Miller's team studies the pneumonic and bubonic versions. Three years ago, Rodrigo Gonzalez, PhD -- a UNC graduate student at the time and now a postdoctoral fellow at Harvard -- searched the scientific literature for data confirming the accepted notion that Y. pestis gets trafficked by human phagocytic cells from the fleabite site to the lymph nodes.


Scientists readily accepted this idea because when Y. pestis microbes are added to phagocytic cells in culture, the cells do soak up the bacteria.


Phagocytes essentially eat harmful microbes, and because these cells traffic through the lymphatic system, scientists came to the logical conclusion that phagocytes take the Y. pestis to the lymph nodes.


But Gonzalez and Miller knew that a fleabite does not penetrate all layers of skin like an injection does. The bites of fleas and mosquitoes are intradermal; they occur within the layers of skin. Gonzalez and Miller agreed that testing this long-held theory was a worthy project.


Gonzalez spent months developing an accurate way to mimic the flea bite in the lab so that the proper amount of bacteria would get transferred into the skin of mice. Then Miller's team created 10 special DNA sequences and added them to the chromosome of Y. pestis to generate 10 different strains. This did not affect virulence of the bacteria but allowed Miller's team to tag the microbes so that the researchers could identify which bacteria traveled from the "bite site" to the lymph nodes.
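The logic of the barcoding experiment can be illustrated with a toy simulation (our own sketch, not the study’s actual analysis): seed a “bite site” with an even mix of ten tagged but otherwise identical strains, let only a handful of cells through the bottleneck, and count how many distinct tags turn up in the “lymph node.”

```python
# Toy illustration (not the study's code) of how neutral DNA barcodes
# reveal a population bottleneck: if only one or two of ten tags show
# up downstream, most founding bacteria never made it through.
import random

def surviving_tags(n_tags=10, inoculum_per_tag=1000, survivors=2, seed=0):
    rng = random.Random(seed)
    # Even mix of tagged but otherwise identical bacteria at the bite site.
    population = [tag for tag in range(n_tags) for _ in range(inoculum_per_tag)]
    # The bottleneck: only a tiny number of cells reach the lymph node.
    reached_node = rng.sample(population, survivors)
    return set(reached_node)

tags = surviving_tags()
print(f"{len(tags)} of 10 tags reached the node: {sorted(tags)}")
```

With only two survivors drawn, at most two distinct tags can ever appear, matching the “only one or two of the 10” observation; letting many more cells through would recover all ten tags, the signature of no bottleneck at all.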


"We found that only one or two of the 10 bacteria made it to the lymph node," Miller said. "But they got there fast -- within five or ten minutes after the bacteria were introduced. We know that if a bacterium is traveling in a host cell, it would not move that fast because host cells are slow; they kind of crawl through the lymphatic system instead of flowing through fluid like bacteria can."


Miller's team is currently conducting experiments to figure out how most of the bacteria are prevented from infecting the lymph node.


"We may have found a point of vulnerability," Miller said. "Exploiting it could lead to new ways to defeat Yersinia pestis and other insect-borne pathogens."



This iPad Case Makes Real Buttons Rise Out of Your Keyboard



Tactus



There’s no need to be embarrassed. We’ve all thought it before. You love your smartphone. You adore its spacious screen and its helpful apps. Still, you miss your old BlackBerry keyboard.


It’s the one place where smartphones were a big step backward: We went from sending no-look texts under the table to appending “sent from my phone, please excuse typos” disclaimers to emails. Even with clever autocorrect algorithms and fancy swipe-to-spell software, typing on a screen doesn’t compare to typing on a real keyboard. So are we stuck here forever? Maybe not. What if there were a touchscreen that could give you real, physical buttons—but only when you need them?


That’s what Tactus Technology is trying to build. The California startup has spent nearly five years developing a technique that makes see-through buttons materialize on top of touchscreens, as if by magic. Now, it’s readying its first consumer product, an iPad Mini case called Phorm. It’s not exactly a touchscreen typing revelation, but it is an intriguing look at how we might supercharge our flat glass gizmos in the future.


Aiming for a Better Typing Experience


Tactus’ shapeshifting buttons rely on a technology called microfluidics, long used in inkjet printers. In this case, it involves a transparent panel, carved with imperceptibly small grooves, that sits on top of a device’s display—“a screen protector on steroids,” as Tactus co-founder Craig Ciesla puts it. When triggered, a change in pressure sends tiny amounts of fluid through the grooves, causing a predetermined pattern of small bubbles to rise up from the surface of the screen.



How To Ask Your BFF If She’ll Run Your Facebook Page in the Event of Your Untimely Death


Editors’ note: Today Facebook announced that you can choose an heir to run your page after you die. This is how you might go about telling your heir they’ve been chosen.


Monica, sit down. Here’s some iced tea. It’s sweetened just how you like.


No, let’s talk about book club later. First, I need to ask you something really important.


I’ve been thinking, and Mon, we’ve known each other a long time.


Yes, 8 years. Wow, we’re so old. But you look great! I’m just saying, wow, we’re old.


Speaking of old, that brings me to what I want to ask: Mon, I might die.


No, I’m not dying! I mean any more than we all are every second of every day, you know, because of the weird edge of the present moment that is constantly slicing closer and closer toward nonexistence.


No, I’m healthy and everything is great, but you know, I could die any second, and I’ve been thinking about how you are so important to me. You get me. Like, for instance, that time we went to see Sex and the City 2 with the girls, and I was so pissed afterward about how racist it was? And dumb? And they were like, shut up Emily, stop denigrating a great franchise. But you, Mon, you totally had my back.


OK, yes, I am leaving you something. Not in my will really, but–look, it might seem silly. But I really want you to have it, because I think you can handle it.


No, not my car. I mean, you can have her if you want? But she needs a new carburetor, so… Look, Mon, I’m talking about Facebook.


I want you, if you can handle it, to take on the responsibility of running my Facebook page when I die.


I know this is a lot to take in. Your eye is twitching. Should I spike that tea?


Look, Mon, you can do this! You are so capable and strong and loving and insightful. You know never to post too many status updates a day. You know not to whine and complain on Facebook because nobody needs that shit. You know that only important articles are worth posting, but that every rule was made to be broken so the occasional cat video is good too! You know that petitions have no place on Facebook but pleas for justice do. You know I don’t post on other people’s walls. You know never to poke. You know to be funny, but never leave typos uncorrected. You know to be generous with my likes. You know to comment saying, “happy birthday, babe!” when it’s one of our friends’ birthdays, and to NEVER write HBD.


Do you know about how to refresh a share attachment on a post? And how to edit? Don’t worry. I’ll teach you.


You are the only one who can take this on, Mon. And I am so lucky to have you. I know you won’t take this lightly.


Oh, and I know you won’t ever post a Facebook note. No one uses those.


Don’t cry. I know I am crying, but it’s just because, I don’t know, it’s crazy that we have to think about this! We’re so old and fragile! But I feel so much better knowing my legacy is in good hands. Tell everyone I love them. But don’t make them feel weird.


OK, OK, drink your iced tea. Here, it needs a dash more vodka.


Yeah, book club is going to suck. I haven’t even started yet.



Pinterest and Apple Join Forces to Make App Discovery Easier



Pinterest



Pinterest wants to make it easier for you to find the best apps out of a sea of 1.2 million options in the Apple App Store. To do that, the digital scrapbooking social network is teaming up with Apple on a new product called App Pins.


The tool, which Pinterest announced Wednesday in a blog post, will allow the company’s users to pin apps to their Pinterest boards just like they would an apple cobbler recipe or one of those mason jar planter thingies. As part of the partnership, Apple will also be creating a few Pin boards of its own, where it will highlight top apps. Users who want to download a certain app can do so right from Pinterest on their phones.


This deal is, in many ways, part of Pinterest’s much larger ambition to become an alternative search engine. Last year, the company debuted its so-called guided search, which helps users narrow their searches by suggesting subcategories within a search that might lead to more specific search results. So, a search for “cake” might yield suggestions like “pops” or “decorating.” It’s search for people who don’t know exactly what they’re looking for, but want to do some browsing to find the answer. It’s window-shopping search. And it’s this promise of discovery—curated by a body of users with, let’s be real, frustratingly good taste—that makes Pinterest an appealing place to uncover new apps.


It’s also become an appealing place for advertisers. This year, Pinterest began allowing U.S.-based advertisers to buy Promoted Pins, after a promising pilot program with select advertisers. During that beta test, Promoted Pins got a 30-percent boost in exposure, thanks to Pinterest users re-posting the pins on their own boards. Though Pinterest told The New York Times it’s not planning on making money from App Pins, that doesn’t mean developers couldn’t begin shifting their advertising dollars to Pinterest.


With or without advertising, though, App Pins should be a welcome feature for app developers. Until now, getting exposure for their apps required either being selected and featured by Apple staffers or paying to promote the app on other platforms, like Facebook. With App Pins, developers will now have access to an army of potential promoters.



The Week’s Best Music Videos: Haim Teams With Calvin Harris to Channel Stevie Nicks


This week’s best music videos are a tribute to doing more with less. More specifically, they’re a testament to the fact that you don’t need a lot of money to bring your music to life visually, you just need a lot of heart and a handful of friends who are down for whatever. Or you need to be a witch. Being a witch also increases your odds of making something worthwhile—or at least attention-getting.


“Champagne Kisses”—Jessie Ware



Cape Watch: Angelina Jolie Just Might Direct Captain Marvel



Marvel Entertainment (left), Kodansha (center), Sony Pictures Entertainment (right)



As strange as it might seem, there really have been things happening in the world of superhero movies recently other than the Sony/Marvel Spider-Man deal. Although, admittedly, that remains one of the biggest stories around. If the almost literally unbelievable rumor about who Marvel is considering to direct Captain Marvel turns out to be true, however, all bets are off because that news would be huge. But don’t take our word for it, read on to catch up on the biggest stories of the past seven days’ worth of superhero movie news.


SUPER IDEA: Marvel Helping Out With Spider-Man


More details are emerging about the deal between Marvel and Sony over future usage of Spider-Man on the big screen, and it’s beginning to look as though things aren’t going to be as different as it originally seemed. According to Variety, plans for the three Spider-Man spin-off movies are still on track and Marvel will have no creative input on anything other than the core Spidey flicks. However, Spider-Man himself might be a bit different than what we’re used to. The Hollywood Reporter notes Sony is looking for an actor “much younger” than Andrew Garfield for the role, as well as a writer for the reboot. Interestingly enough, that report also suggests that this is the beginning of something bigger, with both Marvel reacquiring rights to the character and, impressively, Marvel parent company Disney buying Sony being mooted as possibilities. Well, if you can’t beat ‘em, buy ‘em, right? (Wait. Marvel had pretty much beaten Sony in terms of superhero movies, so that doesn’t work…)

Why this is super: A lot of people are very, very excited about this prospect and the Spider-Man movies are now hotter than they’ve been in years, so clearly something’s gone right for the franchise. Shame about poor Andrew Garfield, though.


MEH IDEA: You Really Can’t Keep An Old Mutant Telepath Down, It Turns Out


If you thought that X-Men: Days of Future Past was a good send-off for Patrick Stewart’s elder statesman version of Charles Xavier, prepare to be disappointed: He’s not done yet. “What I’m very excited about is that we have been talking about a Wolverine movie, which would team Hugh Jackman and myself together,” the actor revealed in an interview earlier this week.

Why this is villainy: Patrick, you have been great in the role, there’s no denying that. But it would be really, really nice if we could have fewer X-Men movies that were all about Xavier, Magneto, and Wolverine, and more that were about … well, any of the many other characters that are part of the franchise. It’s time, let’s face it.


SUPER IDEA: Marvel Aims High For Its Captain Marvel Director (Maybe)


File under: “Wait, that can’t be for real.” No less an authority than OK! Magazine reports that Marvel is looking to hire Angelina Jolie as the director of 2018’s Captain Marvel, with the magazine suggesting that the studio was preparing to pay her $20 million for the pleasure. Yes, that’s right, Marvel Studios paying a director $20 million for Captain Marvel.

Why this is super: If this were in any way believable, it would be super indeed. What better way to get attention for the studio’s (long overdue) first female-led movie than to have one of the biggest female movie stars—and, as Unbroken showed, a pretty grand director—helm it? Unfortunately, it just doesn’t have the ring of truth, in large part because of that $20 million figure, which is astonishingly high even for a studio as famously stingy as Marvel. We’re all on board with Marvel going for a female director, though. In fact, this is where we remind you Selma director Ava DuVernay is open to making a Marvel origin story movie. Get on that, Marvel.


SUPER IDEA(S): The Wacky Shortlist of Potential Deadpool Female Co-Stars


The news that Fox has drawn up a shortlist of female co-stars for Deadpool came as quite a surprise, considering that he usually goes stag (or, at least, he did until his wedding last year). But it’s an interesting list of hopefuls, from The Red Band Society’s Rebecca Rittenhouse and Teen Wolf’s Crystal Reed to Morena Baccarin (Homeland, Firefly, a guest spot on The OC) and Orange Is the New Black’s Taylor Schilling. We all know it should really go to Judy Greer, though, right? Just checking.

Why this is super: That is a really strange list of actresses up for the same part, which either means that producers have no idea what they want, or the part is open enough that different actors can bring different things to it. Either way, we’re intrigued. (Of course, we’re just a year away from the movie’s release, so hurry up and cast that role, producers.)


SUPER IDEA: Astro Boy Becoming the New Iron Man


Apparently, the appetite for superhero movies is such that an Australian company has plans to turn the classic Japanese manga series Astro Boy into the next Iron Man. “We’ve seen him as a manga, an anime, and an animated movie but we’ve never seen him as a live-action movie or him as a superhero,” producer Zareh Nalbandian told The Hollywood Reporter.

Why this is super: There’s a reason we’ve not seen him as a superhero … and that’s because he’s not really a superhero. Astro Boy (Mighty Atom, if we’re going by his original name) is an adorable boy robot who fights monsters and saves the world to impress his “dad”—which, admittedly, is kind of Iron Man, only with added adorableness. Hrm. As much as we feel cynical about this turn of events (and sad it might mean we don’t see the movie version of Pluto that was tossed around a while back), it just might work…



This App Wants You To Borrow Money From Friends, Not Banks





I Wish More Games Were as Weird as This Guy’s Interactive Alphabet



N is for nose, neighbors, and neck. Vectorpark



A lot has changed in the last 15 years. The web retired its drug rug, grew up, and got a job. Smartphones arrived, bringing new possibilities and priorities. Everyone became obsessed with making things smart and seamless and intuitive.


Like a monk on a mountain, Patrick Smith seems to have been oblivious to it all. At least that’s the impression you get looking at his work. The strange little interactive things Smith is making today aren’t much different from the strange little interactive things he was making in 2000, when he first started releasing Flash games under the name Vectorpark. And they’ve always felt a little bit… different. Smith’s enigmatic games never explain what they are, or how you’re supposed to play with them. They’re just there, crisp and flat, waiting to reveal themselves—but only if you make the effort.


To his small but enthusiastic band of fans, this exquisitely crafted weirdness is what makes every Vectorpark release a cause for celebration. And Smith’s latest work, an interactive alphabet for the iPad, is yet another welcome breath of fresh weird arriving in decidedly un-weird times.


An Interactive Alphabet Without Barnacles


The new app, out today for the iPad for $4, is called Metamorphabet. It’s an interactive alphabet where each letter transforms into other things that begin with that letter. The first one you get is a big, blocky A. Swipe at it, and it morphs into an arch. Give it another swipe, and it grows antlers. Swipe again and it starts ambling across the screen. (Arch, antlers, ambling—get it?)



F is for foot. Vectorpark



This might not sound incredibly exciting, and Smith is quick to admit that Metamorphabet is one of the more straightforward things he’s done. But as is always the case with his work, the magic is in the details. It’s how the arch jiggles when you touch it, or the way dragging the antlers to the side makes them creak and bend, sending the tiny blue birds alighted there scattering. It’s all the little touches that make each surreal tableau feel so convincingly alive.


Metamorphabet is probably Smith’s least game-like creation yet. Nonetheless, it’s up for the grand prize at next month’s Independent Game Festival. Generic though the concept may be, Smith made the interactive alphabet that only he could make. Every transformation is both unexpected and perfectly fluid. Like a dream, it’s totally nonsensical, but it makes perfect sense. Getting everything just right took a few years, a process Smith refers to as “getting rid of the barnacles.”