New Dr. Who Game Teaches Kids to Code



BBC



The Doctor is facing a new challenge: teaching your nine-year-old how to code.


No, it may not be as daunting as a Cybermen invasion. But for your kid, it could be as fun as it is educational.


On Wednesday, the BBC will launch a game called “The Doctor and the Dalek.” Designed by Dr. Who’s creator in tandem with Britain’s Somethin’ Else creative agency, the game aims to help children “pick up core programming principles as they play.” This means using logical reasoning, variables, loops, and repetition to help the Doctor save the universe—the kind of stuff that should spell programming fun for six- to 12-year-olds.
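The concepts the BBC names (logical reasoning, variables, loops, and repetition) look much the same in any language. Here is a minimal Python sketch; the "charge" scenario and the function name are invented for illustration, not taken from the game:

```python
# A variable tracks state, a loop repeats an action, and a logical
# condition decides when to stop. The charging scenario is hypothetical,
# just to make the three ideas concrete.
def steps_to_full_charge(charge, charge_per_step):
    """Count how many loop repetitions it takes to reach 100."""
    steps = 0
    while charge < 100:            # logical condition, re-checked each pass
        charge += charge_per_step  # the variable is updated by repetition
        steps += 1
    return steps

print(steps_to_full_charge(20, 10))  # prints 8
```

The same loop-and-condition pattern underlies most of the puzzles such games pose, whatever the on-screen story.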


“Getting children inspired is the big thing for us around this game. When you say ‘coding and programming’ straight away it feels like a very dry topic, but our aim was to show children you can have fun,” the BBC’s Jo Pierce told the Guardian. The show’s lead writer and executive producer Steven Moffat is also involved in the game, the Guardian reports.


The game is another step for the so-called code literacy movement, which seeks to bring programming skills to a much broader range of people. The BBC’s game aims to help kids with the new computer science curriculum laid down by the British government, itself one of the broader efforts to improve code literacy.


In the game, the Doctor joins forces with a refugee Dalek on the home world of the warlike Sontarans. “The Doctor and the Dalek” is voiced by the current Doctor, Peter Capaldi, and comes with additional educational material for parents and instructors.


But before your child will be able to play the new Doctor Who game, there is an initial puzzle to solve. The BBC’s kids’ gaming website is available only to U.K. residents. If you’re in the U.S., you’ll have to find a way around that.



As Online Viewing Soars, Internet TV Will Soon Be the Only TV


Roku streaming stick. Photo: Josh Valcarcel/WIRED




More people are watching TV online than ever—a lot more. Viewers may not be cutting the cable cord altogether, but growth in the number who want to watch TV over a different set of pipes is surging, according to a new report from Adobe. If anyone was still wondering why HBO and CBS plan to offer an online-only option, the trend is clear: the internet is where people want to watch. In more and more homes, online TV isn’t a geeky novelty, a sidelight to the traditional version. It’s just what TV looks like now.


Adobe is in a position to know because its software runs the platform that nearly all US cable customers use to log into the online versions of their subscriptions, according to the company. Researchers tracked 165 billion online video views and 1.53 billion logins over a year, and they found that total TV viewing over the internet grew by 388 percent in mid-2014 compared to the same time a year earlier—a near-quintupling. And the increase is more than just a few diehards binge-watching: the number of unique viewers well more than doubled, growing 146 percent year-over-year.
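As a quick arithmetic check on those glosses: growth of g percent over a year-earlier baseline means a total multiplier of 1 + g/100, which is where "near-quintupling" and "well more than doubled" come from. A two-line sketch:

```python
# Convert a year-over-year percent-growth figure into the total
# multiplier relative to the year-earlier baseline.
def growth_multiplier(percent_growth):
    return 1 + percent_growth / 100

print(growth_multiplier(388))  # 4.88, i.e. a near-quintupling
print(growth_multiplier(146))  # 2.46, i.e. well more than doubled
```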


Eventually cable will follow bunny ears into the basement of dead technology, and online TV will be called something else: plain old TV.


According to analyst Tamara Gaffney, three factors drove this growth: more apps and sites for watching, more content to watch on those apps and sites, and the World Cup. Sports act as a kind of “appetizer” whetting viewers’ appetites for the flexibility and breadth of online TV, Gaffney says. The World Cup was an especially strong lure because the internet was the only way to watch the many games that traditional TV lacked the bandwidth to show. But Gaffney said once viewers came for sports, they stayed for everything else.


“Households generally connect because of sports,” she says. “But then when they start to use online television, they start to branch out.”


Back to the Big Screen


According to Adobe, viewers have branched out so much that, for the first time, viewers watched more movies online than sports. The average was 4.5 movies per month, Adobe says, versus two a year ago. Viewing of “episodic television” over the internet also saw a sharp increase. The jump is all the more remarkable since Adobe’s survey doesn’t include the main streaming services: Netflix, Amazon Prime, and Hulu. In other words, many people still paying for cable are less interested in watching TV the cable way and more interested in watching it the way Netflix has taught them to expect.


“When you want to really binge-view something you didn’t know you wanted to watch until the season was over, you’re going to turn to the online option,” Gaffney says.


The turn toward online TV, however, hasn’t entirely meant a turn away from the television itself. In fact, the gadgets that showed the greatest increase in their share of use for internet-TV watching were gaming consoles and over-the-top devices such as Apple TV and Roku. Where a year ago they accounted for just 3 percent of online TV viewing, they now account for 10 percent. These devices all pipe TV from the internet to the big screen, and the one thing viewers are still most eager to watch there is sports.


“During the Olympics and World Cup, people wanted to watch on the big screens in their living room,” Gaffney says.


Broadcast Bottoms Out


But unlike traditional TV, the appeal of the online version is that it doesn’t require viewers to commit to a single piece of hardware. More than half (51 percent) of all online TV viewing happens on iOS apps, according to Adobe’s figures. And Gaffney believes the iPhone 6 and 6 Plus’ larger screens will only cement Apple’s place as the platform of choice for online TV. Apple’s timing appears to be good as smartphones, driven by the embrace of phablets, start to surpass tablets as the online video-viewing device of choice, according to Adobe’s findings.


Because the Adobe report covers TV-watching by viewers who are paying for cable, it’s hard to read the results as a sign that more cord-cutting is coming. But getting viewers habituated to the internet way of watching TV could hasten a less aggregated future, says Gaffney, one where customers may prefer the more personalized service of an HBO-only login, for example, versus one for their entire cable package.


The big loser would seem to be broadcast TV, since the traditional way the broadcast networks spread the news about their new shows is on their networks. For people watching TV online, those are ads they’ll never see. What they’re more likely to see is chatter on social media—if a new show generates enough buzz, wannabe viewers can track down old episodes online and binge-watch to catch up. More and more, this is the way TV is now, and there’s no reason the trend will stop. Eventually cable will follow bunny ears into the basement of dead technology, and online TV will be called something else: plain old TV.



Salmonella-infected mice that were given antibiotics became superspreaders

Salmonella-infected mice that were given antibiotics became sicker and began shedding far more bacteria in their feces than they had before.



Some people infected with pathogens spread their germs to others while remaining symptom-free themselves. Now, investigators at the Stanford University School of Medicine believe they may know why.


When the scientists gave oral antibiotics to mice infected with Salmonella typhimurium, a bacterial cause of food poisoning, a small minority -- so-called "superspreaders" that had been shedding high numbers of salmonella in their feces for weeks -- remained healthy; they were unaffected by either the disease or the antibiotic. The rest of the mice got sicker instead of better and, oddly, started shedding like superspreaders. The findings point to a reason for superspreaders' ability to remain asymptomatic. They also pose ominous questions about the widespread, routine use of sub-therapeutic doses of antibiotics in livestock.


About 80 percent of all antibiotics used in the United States are given to livestock -- mainly cattle, pigs and chickens -- because doing so increases the animals' growth rates. Experts have already voiced concerns about how this practice contributes to the rise of drug-resistant pathogens. But the new study, published online Oct. 20 in Proceedings of the National Academy of Sciences, highlights an entirely different concern.


"We've shown that the immune state of an infected mouse given antibiotics can dictate how sick that mouse gets and also carries implications for disease transmission," said Denise Monack, PhD, associate professor of microbiology and immunology and the study's senior author. "If this holds true for livestock as well -- and I think it will -- it would have obvious public health implications. We need to think about the possibility that we're not only selecting for antibiotic-resistant microbes, but also impairing the health of our livestock and increasing the spread of contagious pathogens among them and us."


Upon invading the gut, S. typhimurium produces a powerful inflammation-inducing endotoxin, which annually results in an estimated 1 million cases of food poisoning, 19,000 hospitalizations and nearly 400 deaths in the United States. Passed from one individual to the next via fecal-oral transmission, it is known to produce a curious pattern of pathology among infected individuals: Some 70-90 percent of those infected shed fairly light amounts of bacteria (and so are not very contagious). But the remaining 10-30 percent -- superspreaders -- remain symptom-free yet shed huge amounts of bacteria, causing the great bulk of the pathogen's spread through a population. The reasons for this dichotomy have not been understood.
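The skew described above, in which 10-30 percent of hosts cause the great bulk of transmission, is easy to see with a toy calculation. The shedding ratio below is a hypothetical illustration, not a figure from the study:

```python
# Toy model: if a fraction of hosts are superspreaders shedding
# shed_ratio times as much as everyone else, what share of all shed
# bacteria do they contribute? Both inputs here are hypothetical.
def superspreader_share(frac_super, shed_ratio):
    """Fraction of total shed bacteria contributed by superspreaders."""
    super_total = frac_super * shed_ratio
    normal_total = (1 - frac_super) * 1  # non-superspreaders shed 1 unit each
    return super_total / (super_total + normal_total)

# 20% of hosts shedding 100x as much account for ~96% of all shedding.
print(round(superspreader_share(0.20, 100), 3))  # prints 0.962
```

Even a modest shedding advantage concentrates transmission in the asymptomatic minority, which is why identifying them matters for public health.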


Evading detection


From a public health standpoint, knowing how to easily and quickly identify superspreaders could help curtail or even prevent epidemics, Monack said. Yet superspreaders don't appear to be sick, so they evade treatment. At the moment, the only way to determine which category a person or beast belongs to is by inspecting each individual's stool, a procedure that would be inconvenient at best even with livestock.


But the Stanford team has discovered that the immune systems of superspreaders and non-superspreaders are in differing states, raising the possibility of a blood test that could make identifying superspreaders more practical.


Salmonella infection in mice is not uncommon, said Monack. "Mice in a barn can be infected with salmonella for a long time and not get sick. They run around perfectly healthy. They're happy little incubators for salmonella."


In Monack's lab, more than 1 in 5 salmonella-infected mice are superspreaders. "The mice we use are inbred," she noted. "So this difference in response to salmonella infection can't be just a simple matter of genetic mutations."


The Stanford investigators had previously published work showing that giving non-superspreader mice an oral antibiotic, which kills some of the friendly microbes that ordinarily inhabit mammals' intestines and provide protection against invading pathogens, led to a rapid increase in salmonella shed in their feces.


In the new study, the scientists gave streptomycin, an antibiotic, to salmonella-infected mice. They were surprised by the results. Overnight, the majority that had been shedding relatively low levels of salmonella in their feces now evidenced very high levels of the pathogen in both their gut and their feces. And within a few days, these antibiotic-treated, formerly low-shedding mice became visibly ill. "They lost weight, had ruffled fur and hunched up in the corners of their cages," Monack said. "They also began to shed much larger quantities of bacteria." Several of them died. What was most surprising, though, was that superspreaders kept on shedding large amounts of bacteria while remaining blithely asymptomatic. Examination of the animals' intestines showed that gut concentrations of S. typhimurium in former non-superspreaders now rivaled those of superspreaders.


Giving the mice another antibiotic, neomycin, produced the same outcomes.


Symptom-free superspreaders


Postdoctoral scholar Smita Gopinath, PhD, the study's lead author, demonstrated that while all the animals harbored the pathogenic bacteria in their gut, the superspreaders -- despite carrying even higher intestinal levels of salmonella and harboring more gut inflammation than the other mice -- had a dampened immune response: Their overall systemic levels of several important pro-inflammatory signaling proteins, secreted by various types of immune cells to whip the immune system into an antimicrobial froth, were substantially lower than those of mice that had morphed from non-superspreaders to sickened superspreaders.


That explained the absence of symptoms in superspreaders, Monack said. Rather than mounting a heightened immune response to the pathogen, superspreaders appear to simply shrug off its presence. "Instead of jousting with the germ, they tolerate it," she said. "Their immune cells have been rewired and aren't responding to the inflammatory signals in the intestines the same way."


Antibiotics actually cause precisely the opposite of the intended effect in the salmonella-infected mouse population, Monack said. "The superspreaders stay healthy and keep on shedding and transmitting disease. Somehow, in an as yet unknown manner, they're coping with S. typhimurium. The others temporarily shed more bacteria than before, although they're too sick to spread much disease."


The bacteria shed in bulk by former non-superspreader mice were every bit as infectious and virulent as those shed by bona fide superspreaders.


Could it happen in humans?


The phenomenon shown in mice hasn't yet been shown in humans, but should be checked out, said Monack. "We humans shouldn't take antibiotics lightly," she said. "We need to consider whether they're always beneficial when they're given to animals across the board, or when we take them ourselves."


On the positive side, she said, "if we can figure out what leads to this immune dampening in superspreaders, it could potentially be helpful in suppressing symptoms of people with chronic inflammatory intestinal disorders, such as Crohn's disease or inflammatory bowel disease."


Other Stanford co-authors of the study are professor of comparative medicine Donna Bouley, DVM, PhD; assistant professor of chemical and systems biology Joshua Elias, PhD; and graduate student Joshua Lichtman.


The study was supported by the Burroughs Wellcome Fund and the National Institutes of Health (grant R01A1095396).



Fish tale: New study evaluates antibiotic content in farm-raised fish

Antibiotics -- one of modernity's great success stories -- are charms that come with a curse. Their overuse in human and animal populations can lead to the development of resistant microbial strains, posing a dire threat to global health.



In a new study, Hansa Done, PhD candidate, and Rolf Halden, PhD, researchers at Arizona State University's Biodesign Institute, examine antibiotic use in the rapidly expanding world of global aquaculture.


Done and Halden measured the presence of antibiotics in shrimp, salmon, catfish, trout, tilapia and swai, originating from 11 countries. Data showed traces of 5 of the 47 antibiotics evaluated.


The research findings and a discussion of their implications appear in the current issue of the Journal of Hazardous Materials.


Charting resistance


The menace of germs bearing resistance to our best medical defenses is reaching crisis proportions. Each year, resistant microbes sicken some 2 million people in the U.S. alone and kill about 23,000, according to the Infectious Diseases Society of America.


On September 18, President Obama proposed the first governmental steps to address the problem, establishing a task force to be co-chaired by the secretaries of Health and Human Services, the Department of Defense, and the Department of Agriculture.


The new initiative to rein in antibiotic overuse has been welcomed in the medical community, though many believe that much more needs to be done to safeguard society. The chief complaint is that the proposed measures largely ignore the largest consumers of antibiotics -- animals farmed for human consumption, including fish.


"The threat of living in a post-antibiotic era cannot be avoided without revising current practices in the use of antibiotics in animal husbandry, including in aquaculture," says Halden.


Halden, who directs the Biodesign Institute's Center for Environmental Security, is a leading authority on the human and environmental impact of chemicals, particularly their fate once their useful life has ended. In previous research, he has explored the intricate pathways from production to postconsumption fate of antimicrobials and the risks posed.


The new study examines the persistence of antibiotics in seafood raised by modern aquaculture. The research area is largely unexplored, as the primary focus of studies of antibiotics has been on drugs used in human medicine. The current research is the first to evaluate previously unmonitored antibiotics; it represents the largest reconnaissance conducted to date on antibiotics present in seafood.


Farming lifestyle


Aquaculture has undergone rapid growth to meet the burgeoning global demand, nearly tripling over the past 20 years to an estimated 83 million metric tons in 2013. The large increase has led to widespread antibiotic use, applied both to prevent and treat pathogens known to infect fish. The broad effects on health and the environment associated with these practices remain speculative.


Several natural mechanisms exist to help pathogenic microbes evade immune responses or develop drug resistance over time. The overuse of antibiotics, whether for human ingestion in hospitals or for agricultural or aquacultural use, can seriously exacerbate this problem, enriching microbes that bear particular genetic mutations, rendering them antibiotic resistant. In a biological arms race, antibiotics applied to combat disease run the risk of producing multi-drug resistant organisms that are increasingly difficult to kill.


In the new study, 27 seafood samples were examined for the presence of antibiotics. The samples represent five of the top 10 most consumed seafood varieties in the U.S.: shrimp, tilapia, catfish, swai, and Atlantic salmon. The National Oceanic and Atmospheric Administration (NOAA) acquired the samples from stores in Arizona and California.


Five antibiotics were present in detectable amounts: oxytetracycline in wild shrimp, farmed tilapia, farmed salmon and farmed trout; 4-epioxytetracycline in farmed salmon; sulfadimethoxine in farmed shrimp; ormetoprim in farmed salmon; and virginiamycin in farmed salmon that had been marketed as antibiotic-free.


Oxytetracycline, the most commonly used antibiotic in aquaculture, was the most prevalent in the study samples. Surprisingly, the study also detected this antibiotic in wild-caught shrimp imported from Mexico, which the authors suggest may be due to mislabeling, coastal pollution from sewage contamination, or cross-contamination during handling and processing.


On the bright side, all seafood analyzed was found to be in compliance with U.S. FDA regulations; however, the authors note that sub-regulatory antibiotic levels can promote resistance development, according to their extensive meta-analysis of existing literature. (Publications linking aquaculture with antibiotic resistance have increased more than 8-fold from 1991-2013.)
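The distinction the authors draw, compliant yet still carrying residue, can be sketched as a simple check. The tolerance and measurements below are hypothetical placeholders, not actual FDA limits or study values:

```python
# Compare a measured antibiotic residue against a regulatory tolerance.
# Returns whether the sample is compliant, and whether it carries a
# sub-regulatory residue (legal, but the kind of low-level exposure the
# authors warn can still promote resistance). All numbers hypothetical.
def check_residue(measured_ng_per_g, tolerance_ng_per_g):
    compliant = measured_ng_per_g <= tolerance_ng_per_g
    sub_regulatory = compliant and measured_ng_per_g > 0
    return compliant, sub_regulatory

print(check_residue(7.0, 100.0))    # (True, True): legal, but nonzero residue
print(check_residue(150.0, 100.0))  # (False, False): over the limit
```

The point of the study is precisely the first case: every sample passed the legal check, yet nonzero residues remained a resistance concern.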


Antibiotics also have the potential to affect the animals themselves, producing alterations in how genes are turned on or off and physiological anomalies. (The latter may include malformations of the spine in trout exposed to the antibiotic oxytetracycline, though more work will be needed to clarify this association.)


Proper monitoring of antibiotic residues in seafood is particularly critical, due to the fact that many antibiotics used in aquaculture are also used in human medicine, for example amoxicillin and ampicillin -- common therapeutics for the treatment of bacterial infections, including pneumonia and gastroenteritis.


The future of fish


The use of antibiotics in aquaculture can produce a variety of unintended consequences in addition to antibiotic resistance, including antibiotic dissemination into the surrounding environment, residual concentrations remaining in seafood, and high antibiotic exposure for personnel working in aquaculture facilities.


Changes in aquaculture are needed to ensure the practice can be carried out on a large scale in a sustainable manner. Currently, massive aquaculture operations threaten the health of the seas because of the large volumes of fish waste they emit, which carry excess nutrients, pathogens and drug-resistance genes.


Additionally, many types of farmed fish rely on fishmeal produced from by-catch caught in fishing nets. Several pounds of fishmeal are often required to raise a single pound of farmed fish, thereby contributing to the overfishing of the seas and depletion of ocean diversity.


The current study offers a warning that antibiotics present at levels well below regulatory limits can still promote the development of drug-resistant microorganisms. The dramatic increase in resistant and multi-drug-resistant bacterial strains documented over the past three decades indicates that much more thorough monitoring of seafood supplies is needed, along with a better scientific understanding of the nexus of global aquaculture, antibiotic use, drug-resistance emergence and regulatory measures.



Pediatric allergology: Fresh milk keeps infections at bay

A study by researchers of Ludwig-Maximilians-Universitaet (LMU) in Munich shows that infants fed on fresh rather than UHT cow's milk are less prone to infection. The authors recommend the use of alternative processing methods to preserve the protectants found in the natural product.



A pan-European study, led by Professor Erika von Mutius, Professor of Pediatric Allergology at LMU and Head of the Asthma and Allergy Department at Dr. von Hauner's Children's Hospital, reports that fresh cow's milk protects young children from respiratory infections, febrile illness and inflammation of the middle ear. Their results appear in the Journal of Allergy and Clinical Immunology. As untreated cow's milk may itself contain pathogenic microorganisms and could pose a health risk, the researchers argue for the use of processing methods that preserve the protective agents present in raw milk.


The findings are the latest to emerge from the long-term PASTURE study, which is exploring the role of dietary and environmental factors in the development of allergic illness. The study initially recruited 1000 pregnant women who were asked to document their children's diet and state of health at weekly intervals during the first year of life. "Among children who were fed on fresh, unprocessed cow's milk the incidence of head colds and other respiratory infections, febrile illness and middle-ear inflammation was found to be significantly lower than in the group whose milk ration consisted of the commercially processed ultra-pasteurized product," says Dr. Georg Loss of Dr. von Hauner's Hospital, first author of the new paper. Ingestion of farm milk reduced the risk of developing these conditions by up to 30%, and the effect was diminished if the milk was heated at home before consumption. Conventionally pasteurized milk retained the ability to reduce the risk of febrile illness, while exposure to the higher temperatures used in UHT processing eliminated the effect altogether. Importantly, the positive impact of raw milk could be clearly separated from the confounding effects of other elements of the children's nutrition.


Impact on inflammation


"The effects of diverse milk treatments are presumably attributable to differentially heat-resistant components present in fresh milk. Compounds that are sensitive to heating seem to play a particularly important role in protection against respiratory-tract and ear infections," says Loss.


At the end of the first year of life, blood samples were obtained from the children enrolled in the study, and tested for biochemical indicators of immunological function. Infants fed on unprocessed milk were found to have lower levels of the C-reactive protein, which is a measure of inflammation status. "Other studies have shown that higher levels of inflammation are related to the subsequent emergence of chronic conditions such as asthma and obesity. Consumption of unprocessed milk may therefore reduce the risk of developing asthma," Loss explains.


Industrial processing of milk involves short-term heating of the raw product. Conventionally pasteurized milk has been exposed to temperatures of 72-75°C for 15 seconds, while ultra-pasteurized milk undergoes heating at around 135°C for a few seconds. The latter is also homogenized to disperse the milk fats, which prevents the formation of cream. "Consumption of unprocessed milk itself is not entirely without risk," says Loss. Indeed, untreated milk may contain pathogenic bacteria that cause serious illnesses. Examples include the enterohemorrhagic Escherichia coli (EHEC) strains that are associated with severe diarrhea and kidney failure, and the microorganisms that cause listeriosis and tuberculosis. The researchers therefore suggest that alternative processing methods are needed for the industrial treatment of raw milk. "With novel, milder treatments one could produce milk that is free of pathogenic microorganisms but retains the protective agents found in fresh milk," says Loss.
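The three processing regimes the paragraph describes can be summarized as simple data. The temperatures and times are those given in the text ("a few seconds" for UHT is represented as an assumed 4), and the 130°C cutoff used to classify UHT is an illustrative assumption:

```python
# The milk treatments described in the text, as data. UHT duration and
# the classification threshold are assumptions for illustration.
TREATMENTS = {
    "raw":         {"temp_c": None, "seconds": 0},   # untreated
    "pasteurized": {"temp_c": 72,   "seconds": 15},  # 72-75 C, 15 s
    "uht":         {"temp_c": 135,  "seconds": 4},   # ~135 C, a few seconds
}

def is_uht(name):
    """Classify a treatment as ultra-high-temperature (assumed >= 130 C)."""
    temp = TREATMENTS[name]["temp_c"]
    return temp is not None and temp >= 130

print([name for name in TREATMENTS if is_uht(name)])  # prints ['uht']
```

The study's finding maps onto this ladder: the protective effect was strongest for raw milk, partly preserved by conventional pasteurization, and eliminated by UHT processing.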


The perks of country life


In addition to fats and carbohydrates, cow's milk contains proteins that can modulate the function of the immune system. "In many respects, the composition of cow's milk is similar to that of human milk," says Loss. It has long been known that breast-feeding protects infants from infection, although how milk actually affects the early immune function remains unclear. It is possible that some of the factors involved interact directly with viruses or that they promote the development of a healthy immune system by altering the composition of the gut microflora.


Feeding young children with cow's milk is also contentious because it can provoke allergic reactions. Among the children who participated in the PASTURE study only 2% had developed an allergy to milk or other food items prior to their first birthday.


That living in the country has positive effects on the immune system has been demonstrated in several previous studies. Together these investigations show, as Erika von Mutius notes, that "children who grow up on traditional dairy farms are least likely to develop allergies."


The 1000 pregnant women involved in the PASTURE study were recruited in rural areas of Bavaria, Finland, France, Switzerland and Austria, and approximately half of them live on farms. As well as monitoring maternal nutrition during pregnancy, the study will regularly assess their children's health and developmental status during the first 10 years of life, in order to elucidate the role of environmental factors in the etiology of allergic disease. The study is being carried out by teams at LMU and the German Center for Lung Research, universities in Ulm, Marburg, Basel, Helsinki, Kuopio (Finland) and Besançon (France) and children's hospitals in St. Gallen (Switzerland) and Schwarzach (Austria).






IBM Stock Plummets as Company Abandons Chip Business


IBM CEO Ginni Rometty. Image: IBM

IBM CEO Ginni Rometty. Image: IBM



IBM’s stock price is on the decline after the company agreed to pay Globalfoundries $1.5 billion to take on its ailing chipmaking business and abandoned its controversial “2015 Roadmap” to deliver $20 in earnings per share by next year.


News of the Globalfoundries deal arrived on Sunday, via Bloomberg, and IBM CEO Ginni Rometty confirmed the agreement during the company’s earnings call on Monday morning, when she also deep-sixed the 2015 Roadmap. The company’s stock price is down 7 percent on the day.


The sale of the chip business is part of IBM’s longer-term shift away from hardware toward higher-margin businesses, and in the age of cloud computing—where businesses can easily rent computing power over the internet without buying their own hardware—many other hardware giants have been following IBM’s lead. Earlier this month, HP announced plans to split into two separate companies, one for its desktop PC line and one for its cloud computing and enterprise software business. And last year, Dell went private to focus on transforming itself into a cloud and software company, while storage giant EMC spun off its cloud computing and big data units into a new company called Pivotal.


IBM started down this path more than a decade ago when it acquired consulting company PricewaterhouseCoopers in 2002 and sold its desktop computing line to Chinese company Legend, now known as Lenovo, in 2005. But while IBM beat Dell and HP to the party by several years, it has continued to struggle with its own transformation.


As reported by Bloomberg, IBM has sought a buyer for its chip business since at least last year. But the unit, which loses $1.5 billion a year, was a tough sell. The company could simply have shuttered the business, but it still depends on its chips to power its high-end mainframe line—one of the only hardware businesses IBM still retains. The company was ultimately forced to pay Globalfoundries, a former division of chipmaker AMD, to take it over.


That’s just one of many reasons that IBM wasn’t able to keep the 2015 Roadmap promise, first made by former IBM CEO Sam Palmisano in 2010. To reach that goal, IBM has worked to transform itself into not just a consulting company, but also a provider of cloud services to rival the likes of Google and Amazon. To that end, IBM acquired cloud computing provider SoftLayer last year, and invested $1 billion into its Watson brand of analytics services.


But those initiatives have come too late to transform IBM into a cloud powerhouse in time to deliver on the 2015 Roadmap, so the company has spent much of the year trying to find other ways to improve margins — most notably through a series of brutal layoffs. None of that was enough to reach its goals. Although IBM has abandoned the 2015 Roadmap, given the dumping of the chip division and the confirmation of yet another round of layoffs, it appears that IBM’s cost-cutting will be as ruthless as ever.


On a brighter note, Rometty confirmed that cloud revenues are up. But, as Bloomberg points out, the company expects cloud revenues of only $3.1 billion this year, a small fraction of IBM’s $100 billion total revenue last year.



Microsoft Offers Cloud Device to Battle Google and Amazon


Microsoft CEO Satya Nadella.

Microsoft CEO Satya Nadella. Microsoft



Microsoft will soon offer a hardware appliance that will let businesses run something akin to its Azure cloud computing service inside their own data centers.


At a press event in San Francisco on Monday, Scott Guthrie, who oversees Microsoft’s cloud business, said the appliance is based on hardware from Texas-based computer-server-seller Dell, and that it will arrive sometime next month. The idea is that the hardware device will make it easier for businesses to run software both inside their own data centers and atop Azure, an online service that provides access to computing power over the net. Known as the “Microsoft Cloud Platform System,” the device will run software that mimics what’s available on Microsoft Azure.


In years past, Microsoft indicated that it would offer such an appliance through Dell and other partners, but this effort never quite came to fruition—until now. The news arrived as Guthrie and Microsoft CEO Satya Nadella laid out Microsoft’s larger strategy for competing in a modern world where businesses are increasingly moving their software onto cloud computing services.


After pioneering the modern notion of cloud computing, Amazon is still the market’s dominant player. And Google, which conceived so many of the underlying technologies that drive this market, is now pushing its own cloud services in a big way. But Nadella believes that Microsoft can challenge these web giants, not only because it can match—and perhaps even exceed—the scale of their online operations, but also because it can help businesses set up and run their own software and hardware in their own data centers. “This,” Nadella said on Monday, “is something that only Microsoft does.”


The pitch is that businesses are a long way from moving all of their software onto cloud services—for reasons of privacy, security, cost, and, well, inertia, they will continue to run applications on their own hardware—and that Microsoft is in a unique position to help them do so in a way that complements their use of the cloud. Yes, companies like HP, IBM, and VMware have made a similar pitch, but Nadella’s argument is that none of these companies run cloud services at “hyperscale.” In other words, because Microsoft operates such a wide network of data centers, it can compete with Amazon and Google on price, and it can reach businesses in more locations worldwide.


Reaching more locations is important, not only to provide greater performance for local customers but to let them store data within local borders. Some businesses are reluctant to store data overseas, and some governments forbid it.


According to the run rate disclosed in its latest financial earnings report, Microsoft’s cloud business is pulling in about $4.5 billion a year. That includes revenues from its online office software, Office365, as well as from Microsoft Azure, where businesses can rent computing power. Still, this represents only a small fraction of the more than $86 billion generated by all its products and services. But the hope is that this will become a much larger share of the company’s overall business in the years to come.
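A "run rate," for readers unfamiliar with the term, simply extrapolates the most recent quarter to a full year. A minimal sketch of the arithmetic, using the article's figures (the quarterly number is an assumption implied by the $4.5 billion annualized total):

```python
# A run rate annualizes the latest quarter: quarterly revenue x 4.
implied_quarterly = 4.5 / 4          # $B per quarter; assumption implied by the article
annual_run_rate = implied_quarterly * 4
total_revenue = 86.0                 # $B; Microsoft's yearly total per the article
share = annual_run_rate / total_revenue
print(f"{share:.1%} of total revenue")  # cloud is roughly 5.2% of the total
```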


It will have to if the company is to remain relevant in the modern tech world. It’s telling that Nadella was on hand for today’s event, but not for the event last month where Microsoft unveiled its early work on its latest operating system, Windows 10. The Windows franchise is still hugely important to Microsoft, across all sorts of devices, from computer servers to desktops, tablets, and wearables. But Nadella realizes that in order for Microsoft to thrive, it’s vitally important to offer software and cloud computing services that can be accessed across the internet.


By offering an appliance alongside its cloud services, Microsoft hopes to encourage businesses to choose Azure over services from the likes of Amazon and Google, which do not offer such hardware. And at the same time, the company has laid additional groundwork in an effort to compete with these giants.


Set to open a new set of data centers in Australia later this month, Microsoft will soon operate cloud services in 19 regions across the globe, and in recent years, it has revamped Azure to ensure that it can run most any software, including the Linux open source operating system and the many tools that run atop it. This is in some ways a big change for the company, which traditionally tried to shut out Linux for fear that it would eat into the market share of Windows, and it’s a necessary one. Linux is now hugely popular among those building modern online software.


Indeed, Microsoft revealed today that about 20 percent of the computing power served up by Azure is now used to run Linux. As Nadella put it: “Microsoft loves Linux.”


Update: This story has been updated with additional information from Microsoft’s press event and to correct the figures describing the revenue generated by the company’s products and services.



How to Stop Apple From Snooping on Your OS X Yosemite Searches


yosemite7

Christina Bonnington / WIRED



Today’s web users have grudgingly accepted that search terms they type into Google are far from private. But over the weekend, users of Apple’s latest operating system discovered OS X Yosemite pushes the limits of data collection tolerance one step further: its desktop search tool Spotlight uploads your search terms in real time to Apple’s remote servers, by default.


Fortunately for Apple’s angry users, however, this is one privacy invasion that’s easy to cut short.


Apple describes the new “feature” as an effort to include search results in Spotlight from iTunes, its App Store, and the Internet. If the user has enabled “Location Services” on his or her Mac, the computer’s location will be siphoned up to Apple, too, “to make suggestions more relevant to you.” And Apple notes on a Spotlight preferences description that the search terms will also be shared with Microsoft’s Bing search engine, an even more surprising destination for queries that Mac users likely believed they were typing in the privacy of their own computer.


“This is a very disappointing move for Apple,” said Runa Sandvik, a privacy-focused developer for the Freedom of the Press Foundation and a former developer for the anonymity software Tor. Why is this such a problem? She points to the hypothetical example of a journalist searching for sensitive files on his or her own computer, words which would then be shared with both Apple and Microsoft.


Sandvik notes that Apple doesn’t collect the private results of those desktop searches, and that Microsoft receives only common search terms from Spotlight without any personally identifying information about users. But given that Yosemite’s search-term-sucking setting is enabled by default, many users won’t even be aware of it. “For Apple to automatically learn about your location and your search terms when you’re using your computer normally isn’t something a lot of people would approve of if they knew about it,” Sandvik says.


Screen Shot 2014-10-20 at 1.17.53 PM

A screenshot of the Spotlight settings. Turning off functions 19, 20 and 21 will prevent Spotlight search terms from being shared with Apple and Microsoft. Credit: Ashkan Soltani



Luckily, Yosemite’s search-snooping can be switched off in seconds. In Mac OS X’s System Preferences, the functions can be found under “Spotlight” and then “Search Results.” From there you need to disable “Spotlight Suggestions,” “Bookmarks and History,” and “Bing Web Searches.” If you use Safari you will then need to disable the same “Spotlight Suggestions” function in the browser (under “Preferences” and then “Search”) to avoid having terms you type into its address bar shared with Apple by default too.


To make that privacy fix even simpler, developer Landon Fuller has written it into a simple Python script that he calls “Fix-MacOSX,” which he’s made available for download. “Mac OS X has always respected user privacy by default, and Mac OS X Yosemite should too,” the site reads. “Since it doesn’t, you can use the code to the left to disable the parts of Mac OS X which are invasive to your privacy.” The script is only the first step in what Fuller describes as a continuing project to identify ways that Yosemite “phones home” to Apple and to plug those privacy leaks.
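Under the hood, scripts like Fuller’s work by editing the property-list (plist) files that back OS X preference panes. The sketch below illustrates that general mechanism with Python’s standard `plistlib` module on a throwaway file; the file name and item names here are invented for illustration, and the real Spotlight plist layout is undocumented and may differ.

```python
import plistlib
from pathlib import Path

# Hypothetical preferences file and item names, for illustration only;
# the actual Spotlight plist uses different, undocumented keys.
PLIST = Path("Spotlight-example.plist")

# Create a sample preferences file resembling a list of toggleable items.
sample = {
    "orderedItems": [
        {"name": "APPLICATIONS", "enabled": True},
        {"name": "SPOTLIGHT_SUGGESTIONS", "enabled": True},
        {"name": "WEB_SEARCH", "enabled": True},
    ]
}
PLIST.write_bytes(plistlib.dumps(sample))

DISABLE = {"SPOTLIGHT_SUGGESTIONS", "WEB_SEARCH"}

def disable_items(path, names):
    """Load a plist, switch off the named items, and write it back."""
    prefs = plistlib.loads(path.read_bytes())
    for item in prefs["orderedItems"]:
        if item["name"] in names:
            item["enabled"] = False
    path.write_bytes(plistlib.dumps(prefs))

disable_items(PLIST, DISABLE)

# APPLICATIONS stays enabled; the two search-sharing items flip off.
prefs = plistlib.loads(PLIST.read_bytes())
for item in prefs["orderedItems"]:
    print(item["name"], item["enabled"])
```

On a real system the same idea applies to the per-user preference files that System Preferences writes, which is why the settings change takes effect without reinstalling anything.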


As easy as the fix for Apple’s new Spotlight leaks may be, it’s unlikely most people will change their default settings, says Sandvik. That could potentially make their search and location data available to marketers or even law enforcement. She contrasts Apple’s aggressive new desktop data collection with its move to encrypt iOS devices so that even police with a warrant can’t force Apple to unlock them—a change widely applauded by privacy advocates. “Apple is talking about encryption in iOS on the one hand, and then they make this move with OS X, to enable all this logging and tracking by default,” she says. “It’s something not a lot of users are going to be aware of.”


We have reached out to Apple for comment and will update if we hear back.



How the Amazon Fire Phone Could Transform Customer Service


Amazon's Mayday...

Amazon’s Mayday… Amazon.com



Amazon, ever the customer-centric company, recently released its first smartphone, the Amazon Fire Phone. With it, the world’s largest online retailer and creator of the highly successful Kindle e-readers and tablets took on the challenge of redefining the way people interact with their smartphones. Instead of tapping and touching, users can experience gesture control, mainly through face- and eye-tracking capabilities. Customer service is more immediate, with apps that give virtual customer service agents the ability to make adjustments on the phone in real time. According to Jeff Bezos, Amazon’s CEO, the Fire Phone is not only about the unique and sophisticated hardware, but also about the unique user experiences and superior customer service it makes possible.


However, it has become clear that Amazon has chosen a difficult path. With the steep price of $649 (without a contract), the Fire Phone hasn’t yet created huge waves in the market since its launch. This could be due to the fact that it’s only offered through the AT&T and Amazon websites. And even if the product is purchased via Amazon, the SIM is locked to prevent it from being used on other wireless carriers.


As with all new market entries, low sales volume could also create supply chain difficulties. Accessories and cases may be harder to obtain, and selections will be limited. Additionally, few locations will service the phone, and parts could be difficult to source. To make things easier for its customers, Amazon will most likely swap out phones rather than repair them, which many customers will prefer. From a hardware perspective, broken cameras are a typical problem with smartphones, and the Fire Phone has four. That could raise the number of repairs, or in this case exchanges, needed for each new phone.


The product reviews of the device have been mixed, especially compared to the launches of competing products like the Galaxy and recent iPhones. The hardware design and performance are good, but the FireOS software is what truly impresses, with a genuinely unique new user interface. It’s a device for tech lovers seeking something really new, even if they have to wait for a wide selection of apps and accessories to use with it.


Amazon’s Firefly app is one of the most innovative customer-centric and retail-focused applications introduced through the Fire Phone. With more than 10 million items in its database, it provides superior shopping support and makes life easier for the sophisticated shopper. Its open SDKs are another competitive advantage, allowing developers to build apps of their own. One company, MyFitnessPal, has already built a camera feature into its health-conscious app to allow users to determine the nutritional content of packaged foods by snapping a photo of the label.


Mayday is the customer service highlight of the Fire Phone. It connects users with a tech support agent within 10-20 seconds and agents can access the device to walk users through each step of the troubleshooting process. Moreover, agents can highlight what they’re doing in real-time by drawing on the user’s screen. Amazon agents stay on the line and answer questions or provide tips on how to use the device until the customer is transferred to billing or until their problem is resolved.


These customer support apps are important breakthroughs for Amazon and the Fire Phone, as it will take time for users to become comfortable with a new OS and UI. Although Amazon is still iterating and perfecting the Fire Phone based on user feedback, Bezos is surely paying attention to the initial market feedback and will build those considerations into future versions. So far, it appears that Amazon’s customer-centric innovations will help to drive its success in the smartphone market, and may become the model for smartphone customer service across the industry.


Raul Sfat is Vice President of Sales and Marketing at B2X. Bobby Penn, who co-authored this post, is Vice President of Business Development at B2X.



Are Digital Magazines Dead?


Print's days may be numbered, but are digital's?

Print’s days may be numbered, but are digital’s? desbyrnephotos/Flickr



When pondering the future of digital magazines, the “I’m not dead yet” scene in Monty Python and the Holy Grail may come to mind. Is the digital magazine industry ready to be carted off with the rest of the dead? Gregg Hano, CEO of MAG+, wrote a great piece pointing out that we are actually just in the infancy of digital magazines. Digital magazines at the moment represent only a small portion of total magazine circulation, but their subscriber base doubled from 2012 to 2013 (AAM semiannual periodical snapshot report). Correspondingly, the number of digital magazines published each year is rising, especially in international markets.


It is often forgotten that the digital publication industry has only been around since 2010. This should come as no surprise considering it is also the birth year of the modern tablet industry. As is to be expected with any emerging market, it takes several years for the pioneers of the digital magazine age to develop an earnest understanding of the underlying technologies. At the same time, digital magazines are far less static than traditional publications, given the devices they are viewed on and the intimacy of the user experience. Understanding how to properly produce content for such a new, yet familiar medium has been an exercise in passion and patience requiring a set of skills that takes years to develop.


Digital publications must also deal with a number of barriers that other publishing avenues have never encountered. Unlike their print counterparts, these publications are constrained by consumer uptake of a small subset of digital devices. A mere 3% of the US population owned a tablet following the initial iPad release in 2010. In the first part of 2013 that number approached 34% (Pew Research Internet Project). The barriers to digital magazine distribution are thus decreasing. At the same time, digitizing platforms are broadening the scope of where digital magazines can be published, such as within websites and on smartphones.


Other industries are beginning to recognize the advantages of entering the digital magazine publication realm — namely the ease with which they are able to distribute content and capture a unique set of data. Of all the interested industries, retailers are the most enthusiastic. They are drifting away from the traditional catalog layout toward lookbooks, which offer a more lifestyle-oriented experience to consumers. In an attempt to increase productivity and decrease production costs, brands are also beginning to use digital publications to showcase new products to retailers. Business-to-business digital magazines are growing even faster than consumer-focused ones, reinforcing the point that we are just at the beginning of what should be a continued expansion of the digital magazine market.


With viewership increasing year after year and more organizations producing content, it looks as though we are witnessing only the beginning of this form of communication. It is, however, still far too early to speculate about what the path of its evolution will look like. As barriers continue to fall, the adoption rate for digital magazines should see growth. Digital magazines provide a unique experience in an age where information flows quickly and readers jump between content. They have the opportunity to captivate audiences in ways that many of their counterparts cannot. Creativity, therefore, will reward digital magazine publishers well.


Ryan Jones is co-founder and CEO of Pixbi.









Microsoft Offers Cloud Device to Battle Google and Amazon


Microsoft CEO Satya Nadella.

Microsoft CEO Satya Nadella. Microsoft



Microsoft will soon offer a hardware appliance that will let businesses run something akin to its Azure cloud computing service inside their own data centers.


At a press event in San Francisco, Scott Guthrie, who oversees Microsoft’s cloud business, said that the appliance is based on hardware from Texas-based computer-server-seller Dell, and that it will arrive sometime next month. The idea is that the hardware device will make it easier for businesses to run software both inside their own data centers and atop Azure, an online service that provides access to computing power over the net.


In years past, Microsoft indicated that it would offer such an appliance through Dell and other partners, but this effort never quite came to fruition—until now. The news arrived as Guthrie and Microsoft CEO Satya Nadella laid out Microsoft’s larger strategy for competing in a modern world where businesses are increasingly moving their software onto cloud computing services.


After pioneering the modern notion of cloud computing, Amazon is still the market’s dominant player. And Google, which conceived so many of the underlying technologies that drive this market, is now pushing its own cloud services in a big way. But Nadella believes that Microsoft can challenge these web giants, not only because it can match—and perhaps even exceed—the scale of their online operations, but also because it can help businesses setup and run their own own software and hardware in their own computer data centers. “This,” Nadella said on Monday, “is something that only Microsoft does.”


The pitch is that businesses are a long way from moving all of their software onto cloud services—for reasons of privacy, security, cost, and, well, inertia, they will continue to run applications on their own hardware—and that Microsoft is in a unique position to help them do so in a way that complements their use of the cloud. Yes, companies like HP, IBM, and VMware have made a similar pitch, but Nadella’s argument is that none of these companies run cloud services at “hyperscale.” In other words, because Microsoft operates such a wide network of data centers, it can compete with Amazon and Google on price, and it can reach businesses in more locations worldwide.


According to its latest financial earnings report, Microsoft’s cloud business is pulling in about $4.5 billion a year. That includes revenue from its online office software, Office 365, as well as from Microsoft Azure, where businesses can rent computing power. Still, this represents only a small fraction of the $70 billion the company makes each year across all its products and services. But the hope is that this will become a much larger share of the company’s overall business in the years to come.


It will have to if the company is to remain relevant in the modern tech world. It’s telling that Nadella was on hand for today’s event, but not for the event last month where Microsoft unveiled its early work on its latest operating system, Windows 10. The Windows franchise is still hugely important to Microsoft, across all sorts of devices, from computer servers to desktops, tablets, and wearables. But Nadella realizes that in order for Microsoft to thrive, it’s vitally important to offer software and cloud computing services that can be accessed across the internet.


By offering an appliance alongside its cloud services, Microsoft hopes to encourage businesses to choose Azure over services from the likes of Amazon and Google, which do not offer such hardware. And at the same time, the company has laid additional groundwork in an effort to compete with these giants.


Set to open a new set of data centers in Australia later this month, Microsoft will soon operate cloud services in 19 regions across the globe, and in recent years, it has revamped Azure to ensure that it can run most any software, including the Linux open source operating system and the many tools that run atop it. This is in some ways a big change for the company, which traditionally tried to shut out Linux for fear that it would eat into the market share of Windows, and it’s a necessary one. Linux is now hugely popular among those building modern online software.


Indeed, Microsoft revealed today that about 20 percent of the computing power served up by Azure is now used to run Linux. As Nadella put it: “Microsoft loves Linux.”



How to Stop Apple From Snooping on Your OS X Yosemite Searches



Christina Bonnington / WIRED



Today’s web users have grudgingly accepted that search terms they type into Google are far from private. But over the weekend, users of Apple’s latest operating system discovered that OS X Yosemite pushes the limits of data collection tolerance one step further: by default, its desktop search tool Spotlight uploads your search terms in real time to Apple’s remote servers.


Fortunately for Apple’s angry users, however, this is one privacy invasion that’s easy to cut short.


Apple describes the new “feature” as an effort to include search results in Spotlight from iTunes, its App Store, and the Internet. If the user has enabled “Location Services” on his or her Mac, the computer’s location will be siphoned up to Apple, too, “to make suggestions more relevant to you.” And Apple notes on a Spotlight preferences description that the search terms will also be shared with Microsoft’s Bing search engine, an even more surprising destination for queries that Mac users likely believed they were typing in the privacy of their own computer.


“This is a very disappointing move for Apple,” said Runa Sandvik, a privacy-focused developer for the Freedom of the Press Foundation and a former developer for the anonymity software Tor. Why is this such a problem? She points to the hypothetical example of a journalist searching for sensitive files on his or her own computer, words which would then be shared with both Apple and Microsoft.


Sandvik notes that Apple doesn’t collect the private results of those desktop searches, and that Microsoft receives only common search terms from Spotlight without any personally identifying information about users. But given that Yosemite’s search-term-sucking setting is enabled by default, many users won’t even be aware of it. “For Apple to automatically learn about your location and your search terms when you’re using your computer normally isn’t something a lot of people would approve of if they knew about it,” Sandvik says.



A screenshot of the Spotlight settings. Turning off functions 19, 20 and 21 will prevent Spotlight search terms from being shared with Apple and Microsoft. Credit: Ashkan Soltani



Luckily, Yosemite’s search-snooping can be switched off in seconds. In Mac OS X’s System Preferences, the functions can be found under “Spotlight” and then “Search Results.” From there you need to disable “Spotlight Suggestions,” “Bookmarks and History,” and “Bing Web Searches.” If you use Safari you will then need to disable the same “Spotlight Suggestions” function in the browser (under “Preferences” and then “Search”) to avoid having terms you type into its address bar shared with Apple by default too.


To make that privacy fix even simpler, developer Landon Fuller has written it into a simple Python script that he calls “Fix-MacOSX,” which he’s made available for download. “Mac OS X has always respected user privacy by default, and Mac OS X Yosemite should too,” the site reads. “Since it doesn’t, you can use the code to the left to disable the parts of Mac OS X which are invasive to your privacy.” The script is only the first step in what Fuller describes as a continuing project to identify ways that Yosemite “phones home” to Apple and to plug those privacy leaks.
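The general approach of a script like Fuller’s is straightforward: read the user’s Spotlight preference file (a plist), flip the “enabled” flag off for the offending result categories, and write it back. The sketch below illustrates that idea with Python’s standard `plistlib`; the plist path, the `orderedItems` layout, and the category names are assumptions for demonstration, since Spotlight’s preference schema is undocumented—Fuller’s actual script is the authoritative version.

```python
import plistlib  # stdlib reader/writer for Apple's plist format
from pathlib import Path

# Hypothetical path and category names -- stand-ins for whatever the
# real script targets, not a verified Apple API.
SPOTLIGHT_PLIST = Path.home() / "Library/Preferences/com.apple.Spotlight.plist"
DISABLE = {"MENU_SPOTLIGHT_SUGGESTIONS", "MENU_WEBSEARCH"}

def disable_suggestions(prefs: dict) -> dict:
    """Turn off the 'enabled' flag for each result category in DISABLE."""
    for item in prefs.get("orderedItems", []):
        if item.get("name") in DISABLE:
            item["enabled"] = False
    return prefs
```

A complete tool would load the plist with `plistlib.load`, run it through `disable_suggestions`, write it back with `plistlib.dump`, and then restart Spotlight so the change takes effect.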


As easy as the fix for Apple’s new Spotlight leaks may be, it’s unlikely most people will change their default settings, says Sandvik. That could potentially make their search and location data available to marketers or even law enforcement. She contrasts Apple’s aggressive new desktop data collection with its move to encrypt iOS devices so that even police with a warrant can’t force Apple to unlock them—a change widely applauded by privacy advocates. “Apple is talking about encryption in iOS on the one hand, and then they make this move with OS X, to enable all this logging and tracking by default,” she says. “It’s something not a lot of users are going to be aware of.”


We have reached out to Apple for comment and will update if we hear back.



Are Digital Magazines Dead?



Print’s days may be numbered, but are digital’s? desbyrnephotos/Flickr



When pondering the future of digital magazines, the “I’m not dead yet” scene in Monty Python and the Holy Grail may come to mind. Is the digital magazine industry ready to be carted off with the rest of the dead? Gregg Hano, CEO of MAG+, wrote a great piece pointing out that we are actually just in the infancy of digital magazines. Digital magazines currently represent only a small portion of total magazine circulation, but their subscriber base doubled from 2012 to 2013 (AAM semiannual periodical snapshot report). Not coincidentally, the number of digital magazines published each year is rising, especially in international markets.


It is often forgotten that the digital publication industry has only been around since 2010. This should come as no surprise considering it is also the birth year of the modern tablet industry. As is to be expected with any emerging market, it takes several years for the pioneers of the digital magazine age to develop an earnest understanding of the underlying technologies. At the same time, digital magazines are far less static than traditional publications, given the devices they are viewed on and the intimacy of the user experience. Understanding how to properly produce content for such a new, yet familiar medium has been an exercise in passion and patience requiring a set of skills that takes years to develop.


Digital publications must also deal with a number of barriers that other publishing avenues have never encountered. Unlike their print counterparts, these publications are constrained by consumer uptake of a small subset of digital devices. A mere 3% of the US population owned a tablet following the initial iPad release in 2010. In the first part of 2013 that number approached 34% (Pew Research Internet Project). The barriers to digital magazine distribution are thus decreasing. At the same time, digitizing platforms are broadening the scope of where digital magazines can be published, such as within websites and on smartphones.


Other industries are beginning to recognize the advantages of entering the digital magazine publication realm—namely the ease with which they are able to distribute content and capture a unique set of data. Of all the interested industries, retailers are the most enthusiastic. They are drifting away from the traditional catalog layout toward lookbooks, which offer a more lifestyle-oriented experience to consumers. In an attempt to increase productivity and decrease production costs, brands are also beginning to use digital publications to showcase new products to retailers. The growth of business-to-business digital magazines is even greater than that of consumer-focused magazines, reinforcing the point that we are just at the start of what should be a continued expansion of the digital magazine market.


With viewership increasing year after year and more organizations producing content, it looks as though we are witnessing only the beginning of this form of communication. It is, however, still far too early to speculate about what the path of its evolution will look like. As barriers continue to fall, the adoption rate for digital magazines should grow. Digital magazines provide a unique experience in an age where information flows quickly and readers jump around between content. They have the opportunity to captivate audiences in ways that many of their counterparts cannot. Creativity, therefore, will serve digital magazine publishers well.


Ryan Jones is co-founder and CEO of Pixbi.



All the New Things in iOS 8.1



With Apple Pay, you can use Touch ID to complete mobile payments. Josh Valcarcel/WIRED



Apple launched iOS 8 with some major features absent. Some were meant to work with the company’s yet-to-be-released desktop OS, Yosemite. Others were presumably still being fine-tuned. Now, with the official release of iOS 8.1, Apple is finally bringing these holdouts to its mobile platform.


First, some caveats. You’ll need to have OS X Yosemite installed to take advantage of some of these features. Others require a device with Touch ID, meaning an iPhone 5s or later, a new iPad Air 2, or a third-generation iPad mini. If you meet those requirements, here’s what to look forward to in the free iOS 8.1 download.


Apple Pay


Apple’s long-rumored payment platform finally arrives on iOS 8-running devices with the 8.1 update.


You add a card to Apple Pay by taking a picture of it. After verification from your bank, the card is added to Passbook, where you can access it for future purchases using an iPhone 6 or 6 Plus. Owners can quickly open a card in Passbook to use in retail stores, paying by tapping their phone at NFC payment terminals. A touch of the Touch ID sensor confirms the purchase.


Hundreds of apps and retailers are already set up for Apple Pay, including Nike, Petco, Whole Foods, and Bloomingdale’s. But Apple expects most Apple Pay usage to come on the digital front at first, through in-app purchases from titles like Instacart, AirBnB, and StubHub. Because their devices have Touch ID, iPhone 5s and new iPad owners will also be able to take advantage of Apple Pay’s in-app purchases.


Instant Hotspot


No Wi-Fi connection? Not a problem with Instant Hotspot. Instant Hotspot lets your Mac remotely access your iPhone’s personal hotspot automatically if they’re located near one another. Your iPhone appears as a network in your Mac’s Wi-Fi dropdown menu, so you can connect to it with a tap. Once connected, your Mac displays your phone’s battery life and signal strength at the top of the Wi-Fi menu. Hotspot also automatically disconnects when you quit browsing.


Unfortunately, this feature doesn’t work for iOS users on some carriers. If your plan doesn’t allow tethering, you will get a message saying you should visit your carrier to turn on this hotspot feature.


Beta iCloud Photo Library Access


Apple’s iCloud Photo Library launches as a beta in iOS 8.1. It looks and functions much like the Photos app on iOS, except it’s available in the browser through iCloud.com. If you didn’t already switch it on when you downloaded iOS 8, you can enable it by going into your device settings and navigating to iCloud > Photos. As it’s still a beta, you’ll want to back up your photos before enabling the feature.


Previously, Photo Stream, Apple’s free cloud-based photo storage, stored only your last 1,000 photos for 30 days. iCloud Photo Library is different. It stores as much as you want, provided you’ve got that much space in iCloud. Thus, all photos uploaded to iCloud Photo Library count toward your iCloud storage. Upgrade pricing is listed here, with options ranging from $0.99 for 20 GB of storage to $20 for 1 TB of iCloud storage. You can upgrade your storage level right on your iOS device.
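Those two tiers imply very different per-gigabyte rates. A quick back-of-the-envelope comparison, assuming (as Apple’s tiers are billed) that the quoted prices are monthly:

```python
# iCloud tiers quoted above: capacity in GB -> monthly price in dollars.
TIERS = {20: 0.99, 1000: 20.00}

def dollars_per_gb(capacity_gb: int, monthly_price: float) -> float:
    """Effective monthly cost per gigabyte for a storage tier."""
    return monthly_price / capacity_gb

for gb, price in sorted(TIERS.items()):
    print(f"{gb} GB at ${price:.2f}/mo -> ${dollars_per_gb(gb, price):.4f} per GB")
```

At roughly $0.05 per gigabyte versus $0.02, the 1 TB tier is the better per-gigabyte deal, though most users’ photo libraries will fit comfortably in the smaller plans.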


If you’re not interested in online storage and also not happy with the Photo Stream version in iOS 8, Apple is bringing back the option of Camera Roll in 8.1.


SMS Relay


With iOS 8.1, another Continuity feature, SMS Relay, gets the green light. SMS Relay lets you send and receive SMS messages on either your iOS device or your Mac (provided the Mac is running OS X Yosemite). This means you can text your friends and family regardless of which Apple device each of you is using.



A Click Above




“Thank God Amazon created a new flagship e-reader. The Kindle Paperwhite is a terrible piece of crap.”


“For too long have we suffered the Tyranny of the Kindle Paperwhite and its myriad flaws.”


“I’ve resisted buying an e-reader for years because the best available example, the Kindle Paperwhite, is utter garbage.”


These are fictional excerpts from imaginary reviews. Nobody wrote any of that stuff, because the two-year owner of the e-reader crown—the Kindle Paperwhite—is an exceptional gadget. Just last year, we gave it a 9/10, which is effectively the highest rating WIRED gives out.


And with good reason: You have to use the Kindle Paperwhite a LOT to find flaws. Sure, its touchscreen isn’t the most accurate input device on the market. While we’re picking at micro-scabs, the recessed screen feels a little dated in an era of flush-faced, full-color tablets that are dollar-bill-thin.


And yet Amazon didn’t just update the Paperwhite to refresh the top of its e-reader heap; the company created a completely different product: the Kindle Voyage.


Here is what you want to know: Yes, the Kindle Voyage is better than the Kindle Paperwhite. It’s thinner, faster, brighter, lighter, newer, has a better screen, has more memory (4GB vs. last year’s Paperwhite’s 2GB), commands more magical elf armies, owns a Ferrari, and is nicer to your grandmother.


The Screen


With a resolution of 300ppi, the Voyage’s 16-level grayscale e-ink screen is on a density par with smartphone screens a generation or so ago. It’s not going to blow your brain out of your ears with an incredible facsimile of the real world, but you are only looking at words. The letters that make up those words are very smoothly rendered. Whereas you can discern the pixels of rounded letters and diagonal lines on the 212ppi Paperwhite with the aid of a magnifying loupe, you have to squint to see them on the Voyage in similar circumstances.
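Pixel density is just the diagonal pixel count divided by the diagonal screen size in inches. Taking the published panel specs for these models as given (1448×1072 for the Voyage, 1024×758 for the Paperwhite, both on 6-inch screens), the quoted densities check out:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution over diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1448, 1072, 6.0)))  # Voyage: 300
print(round(ppi(1024, 758, 6.0)))   # Paperwhite: 212
```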


The driving point of the Voyage project internally, according to Amazon, was to get even closer to the actual experience of reading on paper. The boosted contrast and sharpness really helps that, but the screen is further enhanced: It sits closer to the surface of the frontside glass. The letters almost appear printed on the underside of the glass.


That glass is etched to further resemble dead tree—in look as well as feel. Its matte surface is intended to cut glare. It does, but not noticeably better than its predecessor. The roughened glass also aims to feel like a page in an actual book. OK. If you’re being very very picky, the Paperwhite’s plastic screen actually feels more like paper than the Voyage’s glass front.


Who cares. Physical books are inferior to e-readers. (That is my opinion. Bring on the debate in the comments, and we will respectfully disagree with one another.) If you want to read a book in the middle of the night, you need to turn on the lamp and wake up your bedmate or make a sheet-tent and bust out the flashlight that will then have a dead battery the next time the power goes out or there’s a disaster or you are searching under your bed for a missing sock.


Tablets and illuminated e-readers thrust us into the future with their LEDs, and the Voyage has the best glowing screen yet. Its light is a skosh cooler than previous Kindles’, and, if you let it, it will automatically brighten or dim the screen to match the ambient light in the room. This is a wonderful feature, and it really reduces eye fatigue when you’re reading in a dark room.


The Reading Experience


There’s only one major step forward on the Voyage, and, fortunately, it directly affects the reading experience. The three-zone touchscreen on the Paperwhite (and its oft-forgotten predecessor, the Kindle Touch) was never its strongest attribute. While the Voyage retains the touch zones—and their ability to let you tap forward, backward, or into a menu—it also adds dedicated page-turn button-y touchstrips on the left- and right-hand bezels.


These are great, and they make one-handed reading much easier. They aren’t perfect, though. Their extremely shallow click is aided by a haptic buzz, which is a pretty important UX touch. It soothes some of the did-I-hit-that-right nervousness that accompanies any form of touchscreen. The buzz works quite well on a naked tablet, but not so much if you have the Voyage in one of its available cases. (Easy fix: Skip the cases—they’re like 50 bucks, and this thing is tough as a cactus.)


The other point against the haptic buttons is that both sides trigger the same forward page-turn—even though the one on the left sits adjacent to the zone on the touchscreen that sends you in reverse. This is probably a boon for left-handed readers, but it would be great if they were at least customizable. Some readers like to flip back and forth.


Extra Points


Testing battery life on any new e-reader is essentially impossible. After charging our test-Voyage fully, I turned on both the Wi-Fi and 3G radios and read a book. Then I gave the thing to WIRED’s edit fellow, Max, and asked him to flip pages rapid-fire for an hour straight. (Follow him on Twitter, he’s cool.) Then I read another book. At the end of a week of brutality, the battery is about two-thirds-full.


That effectively means the thing has unlimited battery life. Most people will probably be fine if they remember to plug the Voyage in once a month.


And if you should happen to drop the Voyage while fumbling with the charger, don’t fret. This is no delicate tab. Even though it’s only 7.6mm thick (vs the Paperwhite’s 9.1) and 6.4 ounces (vs the Paperwhite’s 7.5) it takes falls like a stuntman. I threw it onto my hardwood floor 100 times (sorry, Neighbors!) and barely even nicked the thing. Then I took it to work and threw it around the office like a crazy person. No damage. I carried it to and from work in my bag with the cover off for most of a week, and I can’t find a scratch on the screen. It’s a beast.


The engineering that went into making the Voyage was, according to Amazon, some of the hardest work its hardware engineers have ever done. The effort shows. The screen is a real step forward for e-ink readers, and the magnesium back, faceted with the same design language as the Fire HDX, displays some of the best fit and finish in the Amazon product family. In fact, it’s hard to find fault with either the direction or execution of the Voyage.


But not impossible.


Why isn’t this thing waterproof? It doesn’t take a Hugo winner to imagine a situation in which a Kindle would have to withstand a substance that falls from the sky somewhat unpredictably and covers some 71 percent of our planet’s surface. You get caught in a rainstorm and your bag isn’t waterproof. You’re reading at the beach. You’re reading in the tub. You’re done reading in bed, and, when calling it a night, you accidentally knock over your bedside glass of water with your Kindle.


These aren’t extraordinary circumstances; they’re everyday occurrences. So it’s great that companies like Waterfi will sell you a customized waterproof Kindle (works great, highly recommend), and points to Kobo and Pocketbook for their off-the-shelf waterproof readers, but Amazon needs to tackle this problem as well.


This shortcoming becomes especially clear in light of the Voyage’s price. While you can get an excellent Paperwhite starting at $119, the Voyage starts at $199; it almost hits the $300 mark when you add 3G and delete the special offers. With that kind of a price discrepancy, you have to either be fanatically devoted to having The Newest Thing or a serious reader to choose the Voyage over the Paperwhite. That said, if you have the scratch, pony up: Once you get used to those clicky strips on the Voyage’s bezel, it is pretty hard to go back.


Update 10:23AM: Updated to acknowledge Kobo and Pocketbook’s waterproof readers.