4K TV’s Biggest Content Problem: No Live Broadcasts Anytime Soon


Samsung SUHD 4K TV

Alex Washburn/WIRED



Every story about 4K TV has a cookie-cutter plot: The new 4K TVs look great. They’re affordable now, and a 4K future is inevitable. Great story! Except you won’t find 4K content on the broadcast dial any time soon. You have to stream it, which requires gobs of bandwidth. And there really isn’t much to watch in 4K yet anyway.


By the end of 2015, that last part shouldn’t be a problem. Netflix and Amazon are busy bulking up their content coffers. Streaming won’t be the only option for long, because 4K Blu-ray is due by the holidays. But while we’ll have more tack-sharp content to watch, it’ll only be the canned variety: streamed video, video on demand, video on discs. Movies and series, not live 4K TV.


The basic scenario that used to be synonymous with “watching TV”—you turn on your 4K TV, surf the channel guide, and choose between live 4K ESPN and live 4K NBC and live 4K Food Network using a 4K remote that you 4King hate—is at least a year away. Probably closer to seven years away. It may never even happen.


Does it matter? Since season one of House of Cards landed on Netflix in one glorious chunk, our viewing habits have shifted from “appointment TV” to “binge-watch on the weekend.” Old-school TV is an anachronism now that media services have adapted to our schedules, not the other way around.


That’s not to say live TV isn’t still important for a limited pool of content: live sports, political events, and the really big shows like the Super Bowl, the Oscars, the World Cup, the Olympics, and so on. This is a level of programming that needs live video to be relevant. Which is why these events will dictate how live 4K is delivered.


The 4K Front Lines: Live NBA Tests


On January 15, the Milwaukee Bucks took on the New York Knicks in London. True to form, the Knicks trailed 14-0 at the outset and lost their 16th straight game. But this was much more than another depressing showing by the Knicks—it was the first trans-Atlantic Ultra HD streaming test for a live NBA game.


It was an in-house test for the NBA in a partnership with BT Sport, which has a broadcast deal with the NBA in the UK and Ireland. BT Sport is a major player in these early stages of live 4K testing, having conducted similar tests for rugby games and golf’s Ryder Cup.


At London’s O2 Arena, BT Sport arranged the crew and hardware, including eight 4K cameras and a dedicated 4K broadcast truck. In the truck, the feed was encoded, compressed and streamed out at 15Mbps. That stream sailed the Atlantic with help from an Akamai content delivery network. Nearly 3,500 miles away, at NBA headquarters in Manhattan, employees gathered in a lunchroom to watch the game on a 4K TV. To inject a bit of real-world uncertainty into the test, the footage was streamed over the office’s public Internet.


When the feed worked, it looked great. Employees could see a real difference between the 4K feed and the same game in HD on other TVs in the lunchroom. The extra resolution was especially noticeable when scanning the crowd and the players on the bench; you could see facial features on people several rows deep.


But there were intermittent outages, as well as a noticeable delay between the 4K feed and the HD feed. This was no surprise; that’s why they do in-house tests. According to Steve Hellmuth, Executive Vice President of Operations and Technology for NBA Entertainment, feed latency was a known issue going in.


“There’s definitely a bit of a delay in the live feed due to encoding,” says Hellmuth, who notes that the same issues apply to HD video. “When I do streaming in the United States in HD, there’s at least a six-second delay.”


Hellmuth says that a 4K feed to O2 Arena’s suites went very smoothly. There was no streaming involved with that footage; it was a direct feed from the 4K switcher on site. So while the promise of live 4K streaming to the home is strong, there are still puzzles to be solved before making it a viable reality: Encoding times, general stability, and the varying residential connection speeds.


Live 4K streams can be compressed down to less than 20Mbps. That’s still much fatter than a 1080p live stream, typically 6Mbps, and too burdensome for most home broadband connections.


There was another NBA test, too. Time Warner SportsNet, Time Warner Cable, and Cisco collaborated on their own Ultra HD testing for a December 23 game between the Los Angeles Lakers and Golden State Warriors. This one involved piping a massive 12Gbps feed from the Staples Center in western Los Angeles about a dozen miles south to the city of El Segundo, where it was encoded to HEVC at a SportsNet facility, then fed back as a 19Mbps stream to monitors at the Staples Center. The feat was pulled off using a dedicated high-bandwidth Time Warner network.


Ken Dumont, manager of business development at Cisco Service Provider Video Software & Solutions, says it went well.


“It was fabulous video, the most impressive thing I’ve ever seen video-wise,” he says.


Considering Time Warner and Cisco used a dedicated network for that delivery, this Lakers game test was more of an internal “let’s see what we can do” rather than a real-world demonstration. 19Mbps is a pretty fat stream. Netflix’s “Super HD” 1080p streams, currently used for HD shows or movies, have a 6Mbps bitrate, so we’re talking about a stream with a bitrate more than three times that of 1080p HD.


If it were available to the public, watching December’s Lakers game on your own couch would have required a very fast household download speed. And even if you can get 20 or 25Mbps to your home, there are other factors—network bottlenecks, old hardware along the data pipe, multiple devices connected to your home network—that could keep the stream from coming in cleanly.
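

For a rough sense of that math, here’s a quick back-of-envelope check in Python. The bitrates are the figures quoted above; the 25 percent headroom factor is purely an illustrative assumption about how much margin a home connection needs for other devices, Wi-Fi loss, and momentary bitrate spikes.

```python
# Rough bandwidth math for the streams discussed above. The bitrates
# come from the article; the headroom factor is an illustrative
# assumption, not a spec from any provider.

UHD_LIVE_MBPS = 19.0  # the Lakers test stream
HD_LIVE_MBPS = 6.0    # a typical 1080p live stream / Netflix "Super HD"

def can_carry(stream_mbps, downlink_mbps, headroom=1.25):
    """Is the downlink fast enough, with margin for other devices,
    Wi-Fi loss, and momentary bitrate spikes?"""
    return downlink_mbps >= stream_mbps * headroom

print(round(UHD_LIVE_MBPS / HD_LIVE_MBPS, 1))  # 3.2 -- roughly 3x the HD bitrate
print(can_carry(UHD_LIVE_MBPS, 25))            # True: 25Mbps just clears it
print(can_carry(UHD_LIVE_MBPS, 20))            # False: 20Mbps leaves no margin
```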


That high-bitrate live video may be problematic for U.S. homes now, but Dumont sees potential for live 4K video to gain traction sooner in countries with excellent Internet speeds.


“South Korea is once again leading the world with this,” Dumont says. “They have the bandwidth in their individual homes and they are moving that way. We have sold 4K (encoders) to a broadcaster within Korea, and they’re about to place more orders. So they’re moving.”


Beyond the NBA: Smaller Bitrates for the Masses


Comcast performed its own informal tests during the 2014 Winter Olympics in Sochi, but the company says that test was an early pass at the possibilities rather than a real-world case study. They used cinema 4K cameras, which didn’t have autofocus systems—not ideal cameras for sports.


While Olympic Broadcasting Services plans to launch its own live video channels in time for the 2016 Summer Olympics in Rio de Janeiro, according to the Hollywood Reporter, 4K video will not be part of those offerings. That leaves the 2018 World Cup and the 2018 Winter Olympics as the first time we may see live 4K broadcasts on a global scale.


A couple of U.S. events may get there first. Both the Oscars and the Super Bowl seem like excellent matches for live 4K, but any plans to stream them in Ultra HD are hush-hush. Disney/ABC Television says it doesn’t have anything to announce regarding plans for 4K Oscars. The NFL Network does a combined production with CBS for Thursday Night Football games, and they already use 4K cameras to capture goal-line and sideline plays for sharper enlarged replays. Still, the NFL says there are no definitive plans to shoot and broadcast live games in 4K yet.


Live-streaming the Oscars will likely be easier than live-streaming a sporting event. One of the hurdles with the NBA tests was the high frame rates (50fps for the London test and 60fps for the Lakers game), which translate to higher bitrates for the video stream. This higher frame-rate stuff also might not work with some early 4K sets: 4K video at 24fps is compatible with the HDMI 1.4 spec, but 50fps and 60fps 4K feeds require an HDMI 2.0-capable set.
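

As a sketch of that compatibility rule, the check below maps 4K frame rates to the HDMI version a set would need. It deliberately ignores real-world caveats such as chroma subsampling; the function and its thresholds are an illustration, not a quote from the HDMI spec.

```python
# Simplified version check for the rule described above: 4K at up to
# 30fps fits within HDMI 1.4, while 4K at 50fps or 60fps needs an
# HDMI 2.0-capable set. Lower resolutions work either way.

def min_hdmi_version(width: int, height: int, fps: int) -> str:
    if width >= 3840 and height >= 2160:
        return "HDMI 1.4" if fps <= 30 else "HDMI 2.0"
    return "HDMI 1.4"  # 1080p and below

print(min_hdmi_version(3840, 2160, 24))  # HDMI 1.4 -- movies, episodic TV
print(min_hdmi_version(3840, 2160, 60))  # HDMI 2.0 -- live sports feeds
```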


Comcast says it is concentrating on streaming 24p video first. The company may have a first-generation box for episodic content and movies (4K at 24fps), and a second one for sports (4K at 60fps). The faster frame rate for sports means the video bitrate is around 18 or 19 Mbps, while for 24fps movies and episodic content, it’s around 12Mbps—much more manageable given the current network infrastructure, and much easier to deliver to a wider audience. By using new types of compression and a new workflow, the company thinks it can shrink those bitrates even more without compromising picture quality.
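

To put those bitrates in household terms, here’s roughly what an hour of viewing costs in raw data, assuming a constant bitrate for simplicity (real streams are adaptive, so treat these as ballpark figures).

```python
# Approximate data volume per hour at the bitrates quoted above,
# assuming a constant bitrate; adaptive streaming will vary in practice.

def gb_per_hour(mbps: float) -> float:
    return mbps * 3600 / 8 / 1000  # megabits per second -> gigabytes per hour

print(round(gb_per_hour(12), 1))  # ~5.4 GB/hour: 4K movies and shows at 24fps
print(round(gb_per_hour(19), 1))  # ~8.6 GB/hour: 4K sports at 60fps
print(round(gb_per_hour(6), 1))   # ~2.7 GB/hour: 1080p "Super HD"
```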


Infrastructure: The Big Picture


Even after the 4K streams have been optimized at the source, they’ll still require at least two to three times the bandwidth you’d need today to watch a 1080p HD feed. This is a problem that the industry can only solve by reorganizing its infrastructure, something that requires not only a significant capital investment, but also a lot of time. It will likely take years for cable providers to provision enough bandwidth on existing systems to deliver 4K video from the networks.


The analog reclamation process of just a few years ago, when cable companies repurposed parts of the analog broadcast spectrum for the digital-TV switchover, took years. Broadcasters also face costly upgrades of their own to handle live 4K—encoding, switching, and other hardware. Many of them may be waiting for 8K (the next big leap in picture resolution, which quadruples the pixel count) before making those kinds of expensive upgrades. They don’t want to have to do it twice, back-to-back.


Paul O’Donovan, principal research analyst at Gartner, says the costs will be lower for the players in the delivery ecosystem who don’t have to deal with as many cables and routers. Like satellite, for instance.


“I think satellite will explode with a wealth of 4K content very soon, around the globe,” says O’Donovan. “It’s not an issue of bandwidth or capacity, it’s more to do with adding or replacing equipment along the distribution channel. This is more expensive for cable operators and for network TV companies than it is for satellite pay-TV operators or Internet-delivery systems.”



But until satellite services and 4K set-top boxes are actually available, we’ll be stuck with the growing “app-ification” of television. With no 4K equivalent to devices like Roku boxes and sticks, the Amazon Fire TV box and stick, and Apple TV, the primary delivery mechanism for UHD video on a 4K TV right now is through TV-installed apps. There’s quite a bit of fragmentation on that front at the moment, making the brand of set you buy a gateway to the content you’ll get.


Gartner’s O’Donovan thinks app exclusivity is merely a temporary issue.


“This is initial product differentiation by manufacturers and service providers to gain early market share,” says O’Donovan. “These product ties will dilute quickly as more 4K content comes… But the old pay-TV model via (set-top box) will still continue because there will always be deals to limit access to certain content through a single provider.”


Pulling in a 4K signal over the air should also be possible, but it will take years if it happens at all. First, major networks will need to decide to broadcast content in 4K and upgrade their equipment. Then they’ll need to get on the same page regarding next-generation broadcasting technologies.


The most promising of those is ATSC 3.0, a proposed standard for television tuners that would not only allow over-the-air 4K broadcasts, but could also broadcast directly to mobile devices and add interactive elements to broadcast TV. That’s at least a few years out, and not all major networks are fully behind ATSC 3.0. Also, because ATSC 3.0 isn’t backwards-compatible with the ATSC tuners in today’s TVs, you’ll need new hardware.


But new hardware is the easy part. Shiny new 4K sets are nudging the old stuff off the shelves. In fact, if you want to buy a new 60-inch TV now, good luck finding a new model with all the latest features that isn’t a 4K TV. And when you take it home and turn it on, be prepared to get your 4K goodness through a built-in app. Because you certainly won’t be watching anything live in 4K. Not for years.



Gemalto Confirms It Was Hacked But Insists the NSA Didn’t Get Its Crypto Keys


Gemalto CEO Olivier Piou (C) arrives for a press conference in Paris, February 25, 2015. Kenzo Tribouillard/AFP/Getty



Gemalto, the Dutch maker of billions of mobile phone SIM cards, confirmed this morning that it was the target of attacks in 2010 and 2011—attacks likely perpetrated by the NSA and British spy agency GCHQ. But even as the company confirmed the hacks, it downplayed their significance, insisting that the attackers failed to get inside the network where the cryptographic keys that protect mobile communications are stored.

Gemalto came to this conclusion after just a weeklong investigation, prompted by a report in The Intercept last week that the NSA and GCHQ had hacked into the firm’s network in 2011 and gained access to a huge cache of the cryptographic keys used with its SIM cards.


“The investigation into the intrusion methods described in the document and the sophisticated attacks that Gemalto detected in 2010 and 2011 give us reasonable grounds to believe that an operation by NSA and GCHQ probably happened,” Gemalto wrote in a press release on Wednesday. But, the company said, “The attacks against Gemalto only breached its office networks and could not have resulted in a massive theft of SIM encryption keys.”


Many in the information security community ridiculed Gemalto for asserting this after such a short investigation, particularly since the NSA has been known to deploy malware and techniques capable of completely erasing any signs of an intrusion after the fact to thwart forensic discovery of a breach.


“Very impressive, Gemalto had no idea of any attacks in 2010, one week ago. Now they know exactly what happened,” French developer and security researcher Matt Suiche wrote on Twitter.


Chris Soghoian, chief technologist for the American Civil Liberties Union, had the same reaction.


“Gemalto, a company that operates in 85 countries, has figured out how to do a thorough security audit of their systems in 6 days. Remarkable,” he tweeted.


The Intercept alleged in its story that the spy agencies had targeted employees of the Dutch firm, reading their siphoned emails and scouring their Facebook posts to obtain information that would let them hack employee machines. Once on Gemalto’s network, The Intercept reported, the spy agencies planted backdoors and other tools to give them a persistent foothold. We “believe we have their entire network,” boasted the author of a government PowerPoint slide that was leaked by Snowden to journalist Glenn Greenwald.


If true, this would be a damning breach. Gemalto is one of the leading makers of SIM cards; its cards are used in part to help secure the communications of billions of customers’ phones around the world on AT&T, T-Mobile, Verizon, Sprint and more than 400 other wireless carriers in 85 countries. Stealing the crypto keys would allow the spy agencies to wiretap and decipher encrypted phone communications between mobile handsets and cell towers without the assistance of telecom carriers or the oversight of a court or government.


Edward Snowden criticized the agencies for the hack in an Ask Me Anything session for Reddit on Monday. “When the NSA and GCHQ compromised the security of potentially billions of phones (3g/4g encryption relies on the shared secret resident on the sim),” Snowden wrote, “they not only screwed the manufacturer, they screwed all of us, because the only way to address the security compromise is to recall and replace every SIM sold by Gemalto.”


In its statement on Wednesday, however, Gemalto said the intrusions it detected during the relevant time period were not successful, apparently contradicting the NSA slide asserting that the spy agencies had taken over “their entire network.” Gemalto said that in June 2010 it had detected suspicious activity aimed at one of its French outlets “where a third party was trying to spy on the office network.” But the company said “action was immediately taken to counter the threat.”


The following month, the company wrote, a second incident occurred involving a phishing attack, with fake emails sent to one of Gemalto’s mobile operator customers that appeared to come from legitimate Gemalto email addresses. Gemalto said it had “immediately informed the customer and also notified the relevant authorities both of the incident itself and the type of malware used.”


Gemalto also said that the hacking operations of the NSA and GCHQ, as described by The Intercept, were aimed at intercepting encryption keys as they were exchanged between mobile operators and their suppliers, but by 2010, when the hacks occurred, Gemalto had “already widely deployed a secure transfer system with its customers and only rare exceptions to this scheme could have led to theft.” Even then, it noted, the number of keys stolen would have been small and their use to the spy agencies would have been limited.


“In the case of an eventual key theft,” Gemalto said Wednesday, “the intelligence services would only be able to spy on communications on second generation 2G mobile networks. 3G and 4G networks are not vulnerable to this type of attack.”



Could an HIV drug beat strep throat, flesh-eating bacteria?

With antibiotic resistance on the rise, scientists are looking for innovative ways to combat bacterial infections. The pathogen that causes conditions from strep throat to flesh-eating disease is among them, but scientists have now found a tool that could help them fight it: a drug approved to treat HIV. Their work, appearing in the journal ACS Chemical Biology, could someday lead to new treatments.



Douglas A. Mitchell and colleagues point out that Streptococcus pyogenes is responsible for more than 600 million illnesses and 500,000 deaths globally every year. A major factor in the pathogen's ability to cause disease is its production of a toxin called streptolysin S, or SLS. If scientists could figure out a way to jam the bacterial machinery that makes the compound, they could develop new therapies to fight the pathogen and slow the spread of antibiotic resistance. But not much is known about how S. pyogenes makes SLS. Mitchell's team wanted to start filling in the blanks.


The researchers turned to an HIV drug called nelfinavir. Although the drug's target is an HIV protein, it is also known to incidentally block a key enzyme in patients. That enzyme is related to one in S. pyogenes that is critical for producing SLS. The scientists made several nelfinavir-like compounds that stopped the bacteria from making the toxin in lab tests. They conclude that the drug and its variants could help future efforts to understand how the deadly bacterium works and how to stop it.


The authors acknowledge funding from the National Institutes of Health.




Story Source:


The above story is based on materials provided by the American Chemical Society. Note: Materials may be edited for content and length.



Widely used food additives promote colitis, obesity and metabolic syndrome, shows study of emulsifiers

Emulsifiers, which are added to most processed foods to aid texture and extend shelf life, can alter the gut microbiota composition and localization to induce intestinal inflammation that promotes the development of inflammatory bowel disease and metabolic syndrome, new research shows.



The research, published Feb. 25 in Nature, was led by Georgia State University Institute for Biomedical Sciences' researchers Drs. Benoit Chassaing and Andrew T. Gewirtz, and included contributions from Emory University, Cornell University and Bar-Ilan University in Israel.


Inflammatory bowel disease (IBD), which includes Crohn's disease and ulcerative colitis, afflicts millions of people and is often severe and debilitating. Metabolic syndrome is a group of very common obesity-related disorders that can lead to type-2 diabetes, cardiovascular and/or liver diseases. Incidence of IBD and metabolic syndrome has been markedly increasing since the mid-20th century.


The term "gut microbiota" refers to the diverse population of 100 trillion bacteria that inhabit the intestinal tract. Gut microbiota are disturbed in IBD and metabolic syndrome. Chassaing and Gewirtz's findings suggest emulsifiers might be partially responsible for this disturbance and the increased incidence of these diseases.


"A key feature of these modern plagues is alteration of the gut microbiota in a manner that promotes inflammation," says Gewirtz.


"The dramatic increase in these diseases has occurred despite consistent human genetics, suggesting a pivotal role for an environmental factor," says Chassaing. "Food interacts intimately with the microbiota so we considered what modern additions to the food supply might possibly make gut bacteria more pro-inflammatory."


Addition of emulsifiers to food seemed to fit the time frame and had been shown to promote bacterial translocation across epithelial cells. Chassaing and Gewirtz hypothesized that emulsifiers might affect the gut microbiota to promote these inflammatory diseases and designed experiments in mice to test this possibility.


The team fed mice two very commonly used emulsifiers, polysorbate 80 and carboxymethylcellulose, at doses seeking to model the broad consumption of the numerous emulsifiers that are incorporated into almost all processed foods. They observed that emulsifier consumption changed the species composition of the gut microbiota and did so in a manner that made it more pro-inflammatory. The altered microbiota had enhanced capacity to digest and infiltrate the dense mucus layer that lines the intestine, which is normally largely devoid of bacteria. Alterations in bacterial species resulted in bacteria expressing more flagellin and lipopolysaccharide, which can activate pro-inflammatory gene expression by the immune system.


Such changes in bacteria triggered chronic colitis in mice genetically prone to this disorder, due to abnormal immune systems. In contrast, in mice with normal immune systems, emulsifiers induced low-grade or mild intestinal inflammation and metabolic syndrome, characterized by increased levels of food consumption, obesity, hyperglycemia and insulin resistance.


The effects of emulsifier consumption were eliminated in germ-free mice, which lack a microbiota. Transplant of microbiota from emulsifier-treated mice to germ-free mice was sufficient to transfer some parameters of low-grade inflammation and metabolic syndrome, indicating a central role for the microbiota in mediating the adverse effect of emulsifiers.


The team is now testing additional emulsifiers and designing experiments to investigate how emulsifiers affect humans. If similar results are obtained, it would indicate a role for this class of food additive in driving the epidemic of obesity, its inter-related consequences and a range of diseases associated with chronic gut inflammation.


While detailed mechanisms underlying the effect of emulsifiers on metabolism remain under study, the team points out that avoiding excess food consumption is of paramount importance.


"We do not disagree with the commonly held assumption that over-eating is a central cause of obesity and metabolic syndrome," Gewirtz says. "Rather, our findings reinforce the concept suggested by earlier work that low-grade inflammation resulting from an altered microbiota can be an underlying cause of excess eating."


The team notes that the results of their study suggest that current means of testing and approving food additives may not be adequate to prevent use of chemicals that promote diseases driven by low-grade inflammation and/or which will cause disease primarily in susceptible hosts.


This study was funded by the National Institutes of Health and Crohn's & Colitis Foundation of America.




Story Source:


The above story is based on materials provided by Georgia State University. Note: Materials may be edited for content and length.



Beyond Wearables: New Frontiers in Interactive Tech


Bristol University researchers’ SensaBubble can project images, text and emoji onto the bubbles it releases. Courtesy Sensabubble



In the final months of 2014, wearable technology sparked significant media and consumer attention – not least thanks to the announcement of the Apple Watch. But as wearables move from the margins into the mainstream, it’s time to consider the next wave of interactive technology.


Smartwatches shift existing technology to a new location – from the pocket to the wrist. More exciting are second-skin devices such as embeddables, ingestibles and hearables. These will include flexible technologies that blend into our skin; devices that are controlled by eye-motion sensors; and earbuds that measure and respond to our heart rate. All are part of the ongoing journey to create technology that is so deeply interwoven with our lives that it becomes almost invisible. The benefits of this transition will be better usability and genuinely groundbreaking technology that goes beyond incremental developments.


Pioneering concepts are already pushing the boundaries of interactive technology. I want to explore some of my favourite examples – those that reveal what we might be able to expect in the near future.


Technology and Emotion


The rise of big data is helping companies understand their customers better than ever before. In the future, consumer technology will communicate and translate its owner’s emotions, giving brands the chance to respond with relevant services – before they are even asked for.


Researchers from the Korea Advanced Institute of Science and Technology (KAIST) are developing a sensor capable of recording goosebumps on the wearer’s skin. This could be used to measure emotional responses to a wide range of stimuli, enabling advertisers and entertainment companies to monitor consumer engagement more accurately. Here in the UK, British Airways has experimented with a blanket that reflects the wearer’s emotional state by subtly shifting colours. BA says the blanket will allow it to provide more responsive service by quickly identifying when customers are relaxed or nervous.


Rebooting Interaction Design


Increasingly, developers are taking technology beyond personal devices into less tangible realms. Created by researchers at Bristol University, the SensaBubble can project images, text and even emoji onto the surfaces of the bubbles it releases, transforming them into a playful – and transient – form of sensory branding. Lead researcher Diego Martinez told us that air or empty space could be the next frontier for interaction design. He envisions a future where shopping malls might be fitted with SensaBubbles that prompt passers-by to pop sensory advertisements – which could range from a trending hashtag to a preview of the latest Chanel perfume.


Some brands are embracing this idea by layering existing technologies. Rebecca Minkoff, for example, has opened its first stores featuring interactive, over-sized screens, where customers can browse and request products to try on. Customers will then receive a text message to alert them to a free changing room. Once inside, they can use an interactive mirror to ask for different sizes or items. This new application of technology will change the consumer’s retail experience, while at the same time giving stores valuable information on customer preferences.


Extraordinary Everyday Objects


2014 also saw the first signs that embeddable electronics are becoming viable products. Motorola unveiled a super-thin, smartphone-unlocking digital tattoo that could spell the end of the password. Apple, meanwhile, is exploring the potential of smart earbuds (or hearables): the tech giant has filed a patent for headphones that can measure the wearer’s temperature, activity levels, heart rate and perspiration.


Soon, technology will creep into the fabric of our clothes and homes. Studio XO, a technology, fashion and music lab, is already weaving sound-responsive light-emitting diodes (LEDs) into clothing, enabling the likes of Lady Gaga and the Black Eyed Peas to glow to the beat. Integrated tech will unlock the much talked-about Internet of Things, a concept that relies on everyday objects having the ability to communicate with their owners.


So, as consumers, we can look forward to more and more ways to interact with technology, brands and our surroundings, as the border between the physical and the digital is eroded by products that straddle both worlds. Devices such as the smartwatch represent the intermediary step before the next wave of groundbreaking technology, which will further assimilate itself within our lives and homes. Technologists are constantly exploring ways in which gadgets, apps and products can acknowledge and respond to multiple stimuli – which go way beyond the touch of a button.


Hayley Ard is Head of Consumer Lifestyle at Stylus Media Group.



Watch Two Women Find Out They Might Be Twins Thanks to a YouTube Video


The Internet has made it so that you can find anything online: a date, a new TV, illicit substances. But what if it accidentally helped you find your family?


In the documentary Twinsters, a young French fashion student named Anaïs Bordier accidentally finds a YouTube video of a woman who looks just like her. That woman, Samantha Futerman, was an aspiring actress living in Los Angeles. She also happened to have been born the same day as Bordier, in Busan, South Korea, and was also put up for adoption.


After her discovery, Bordier sent Futerman a message on Facebook, and the two spent the ensuing months corresponding and eventually meeting to discover if they were truly separated at birth. Twinsters, directed by Futerman and Ryan Miyamoto and premiering at the South by Southwest film festival, follows what happens when Futerman goes to London to meet her possible twin and eventually takes the test that will prove whether they are a biological match.


Check out the premiere of the trailer for Twinsters above, then watch the film to see whether it’s really possible to find a biological needle in a haystack on YouTube.



Google’s ‘Android for Work’ Gives Your Phone a Split Personality



Screenshot: Google



Google has unveiled a new set of applications, online services, and industry partnerships designed to promote the use of its Android mobile operating system in the workplace.


Known as Android for Work, this rather broad effort is meant to drive the use of Android not only on smartphones used inside the world’s businesses, but on digital payment kiosks that serve consumers inside cafes and retail stores. “We believe that Android is the right solution and the right platform to bring mobility at work to more people,” says Rajen Sheth, the father of Google Apps, the company’s suite of office applications, who now oversees its efforts to push Android and Chrome OS, Google’s laptop operating system, into businesses.


Basically, Google is offering a way for companies and workers to securely separate their work apps from their personal apps on a single device. On the same phone, for instance, you run one incarnation of Evernote for personal use and another for business use. Through partner companies, the internet giant is offering a single piece of software that lets businesses and individuals create this separation on existing phones, and in the future, Sheth tells WIRED, handset makers will offer phones preloaded with the software.


With this Android for Work program—which the company first discussed this past summer—Google hopes to challenge Apple, which has quietly pushed iPhones and iPads into the world’s businesses, and Microsoft, whose Windows Phone OS is largely intended for use in the workplace. Like Apple, Google is first and foremost a company that offers products and services to consumers, but through its Google for Work organization, it often repackages its consumer tools for use in the business world.


Using technology it acquired in buying a startup called Divide, Sheth says, Google has built a tool it calls the Android for Work app. But that name doesn’t really do the thing justice. Sheth describes it as “an app of apps.”


In short, when you run this tool, it creates what the company calls Work Profiles—essentially a way to digitally separate work applications and data from personal software—while installing a few business-centric apps and a new incarnation of the Android app store, known as Google Play for Work, that will offer all sorts of other business software.


With this new tool, Sheth says, Google hopes to reach the “bring-your-own-device” market, the many people who aim to use a single phone for both personal and work tasks. In the past, he says, it was often difficult for businesses and individuals to securely and reliably separate work data from personal data, but Android for Work aims to change that. It lets individuals access work and personal software through separate usernames and passwords, for instance, and it lets companies remove business software from phones without touching personal data. On each phone, Work apps appear alongside personal apps, but they’re tagged with their own identifying icon (see image above).


In order to push these tools and other business applications onto phones—and get those phones into businesses—Google is partnering with a wide range of vendors, from enterprise mobile management (EMM) companies—the firms that help businesses manage their fleets of mobile phones—to handset makers like Samsung and HTC and software makers like Salesforce.com and Box.com.


According to Sheth, EMMs—such as MobileIron—are now offering Android for Work software to businesses that wish to use it today, and as time goes on, handset makers will offer phones preloaded with these tools. Meanwhile, other partners are building apps that will be available through the Google Play for Work store.


Google and its partners have already tested much of this technology with some businesses, including venerable retailer Woolworths and insurance company Guardian Life, and now, for the first time, the same tools are available to the business world at large.



Google AI Plays Atari Like the Pros



Google DeepMind



Last year Google shelled out an estimated $400 million for a little-known artificial intelligence company called DeepMind. Since then, the company has been pretty tight-lipped about what’s been going on behind DeepMind’s closed doors, but here’s one thing we know for sure: There’s a professional videogame tester who’s pitted himself against DeepMind’s AI software in a kind of digital battle royale.

The battlefield was classic videogames. And according to new research published today in the journal Nature, Google’s software did pretty well, smoking its human competitor in a range of Atari 2600 games like Breakout, Video Pinball, and Space Invaders, and playing close to the human’s level most of the time.


Google didn’t spend hundreds of millions of dollars because it’s expecting an Atari revival, but this new research does offer a hint as to what Google hopes to achieve with DeepMind. The DeepMind software uses two AI techniques: one called deep learning, the other deep reinforcement learning. Deep-learning techniques are already widely used at Google, and also at companies such as Facebook and Microsoft. They help with perception—helping Android understand what you’re saying, and Facebook know whose photo you just uploaded. But until now, nobody has really matched Google’s success at merging deep learning with reinforcement learning—algorithms that make the software improve over time, using a system of rewards.
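To make that “system of rewards” idea concrete, here is a minimal sketch of plain tabular Q-learning in Python, the reinforcement-learning recipe that DeepMind’s Deep Q network builds on. It is not DeepMind’s code: the real system swaps the lookup table below for a deep neural network that reads raw game pixels, and the `env` object here is a hypothetical stand-in for an Atari-style game loop with `reset()` and `step()` methods and hashable states.

```python
import random
from collections import defaultdict

ALPHA, GAMMA, EPSILON = 0.1, 0.99, 0.1   # learning rate, discount factor, exploration rate

def q_learn(env, actions, episodes=1000):
    """Learn action values purely from rewards, one game at a time."""
    q = defaultdict(float)               # q[(state, action)] -> estimated long-term reward
    for _ in range(episodes):
        state = env.reset()
        done = False
        while not done:
            # Mostly exploit the best-known action, occasionally explore.
            if random.random() < EPSILON:
                action = random.choice(actions)
            else:
                action = max(actions, key=lambda a: q[(state, a)])
            next_state, reward, done = env.step(action)
            # The reinforcement step: nudge this estimate toward the reward just
            # earned plus the discounted value of the best follow-up action.
            best_next = 0.0 if done else max(q[(next_state, a)] for a in actions)
            q[(state, action)] += ALPHA * (reward + GAMMA * best_next - q[(state, action)])
            state = next_state
    return q
```

The key line is the update: every action’s estimated value gets nudged toward the reward it just earned plus the discounted value of the best move available next, which is exactly the improve-over-time-from-rewards loop described above.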


By merging these two techniques, Google has built “a general-learning algorithm that should be applicable to many other tasks,” says Koray Kavukcuoglu, a Google researcher. The DeepMind team says they’re still scoping out the possibilities, but clearly improved search and smartphone apps are on the radar.


But there are other interesting areas as well. Google engineering guru Jeff Dean says that AI techniques being explored by Google—and other companies—could ultimately benefit the kinds of technologies that are being incubated in the Google X research labs. “There are potential applications in robots and self-driving-car kinds of things,” he says. “Those are all things where computer vision is pretty important.”


Google says that its AI software, which it has dubbed the “Deep Q network agent,” reached at least 75 percent of its professional tester’s score in 29 of the 49 games it tried out. It did best in Video Pinball.


Deep Q works best when it lives in the moment—bouncing balls in Breakout, or trading blows in video boxing—but it doesn’t do so well when it needs to plan over the long term: climbing down ladders and jumping over skeletons to retrieve keys in Montezuma’s Revenge, for example. Poor old Deep Q scored a big fat zero in that game.



The New Moto E: Super-Cheap Phones Are Getting Really Good


Motorola’s commitment to choice goes much further than letting you color your own Moto X. The way the company sees it, owning a smartphone at all should be a choice—and the Moto E was built explicitly to make that choice easier. It’s made to be one thing, and one thing only: cheap as hell.


Today, just ahead of Mobile World Congress, Motorola slyly announced a new version of the Moto E by messengering a mysterious package to a few members of the press. The phone came in an adorable box made to look like a diorama of a product announcement—but Motorola wanted to keep this one a little more personal, and just let us use the phone.


I’ve been using it all morning, and the craziest thing about it isn’t that it’s only $149, or that it has LTE and a newer version of Android than even most flagship smartphones. No, the craziest thing about the Moto E is that it seems like a pretty great phone.


The biggest downside is the screen. In almost every case, the first way manufacturers save money on a smartphone is to downsize and down-res the screen. The new Moto E has a 4.5-inch, 960 x 540 display that looks blurry and dim next to almost any recent model, and its desaturated color profile has me wiping off the screen every ten seconds trying to make it clearer. I mean, come on, though: it’s $149. Unlocked. Off contract. The screen’s fine.


Like last year’s model, most of the rest of the Moto E is good, and not just for a super-cheap smartphone. The 5-megapixel camera takes decent pictures, though none of Motorola’s devices are exactly at the vanguard of smartphone photography. There’s a front-facing camera now, too, for all your selfie-taking. (One thing I keep hearing over and over from phone makers: everyone, especially in developing markets, is obsessed with selfies. Everyone.) The Qualcomm Snapdragon 410 processor is probably noticeably slower than the chip inside your phone, but it does a decent job. The plastic body is a little thick, but its trademark curved back is sturdy and comfortable.


There are also these really clever interchangeable bumpers: you’re not changing the whole color of the phone, just the accents around the side. My white E shipped with black, white, yellow, and blue bumpers, and there are a few other options as well; I’m using the blue. This would have been a flagship phone a couple of years ago. It’s a reminder of how quickly technology improves, and of the fact that smartphones have been pretty good for a while now.



Motorola



This year, the E got more access to some of Motorola’s clever software enhancements. It uses Moto Display to show notifications and the time without turning the screen on, and has the nifty wrist-flicking camera motion. Best of all, it runs Android 5.0 Lollipop, which very few phones at any price can claim.


In 2014, the average selling price of a smartphone was estimated to be $297. Motorola undercut that once, with the Moto G, which quickly became the company’s best-selling phone of all time. The Moto E went even further down the pricing pyramid, and the new model makes that phone a much more powerful option. LTE is turning on in markets around the world, and at $149 the price is hard to beat. (There’s a $129 version, too, which is the same minus LTE capability.)


Are you going to ditch your iPhone for the Moto E? Nah. But for the billions of people who have never been able to afford a smartphone, the options are simultaneously becoming more affordable and more impressive. That’s a remarkable combination.



McLaren’s Latest Supercar Is a Stripped Down Track Warrior



McLaren is bringing its latest supercar, the 675LT, to the Geneva Motor Show next month. McLaren



“It was a car that needed to be built,” says JP Canton, a spokesman for McLaren. “The market demanded it.”


The car in this case is the McLaren 675LT, a powered-up, stripped-down, track-focused take on the 650S supercar. And the market is everyone for whom the 650S—which starts at $265,000—just isn’t enough. That’s saying something: The 650S delivers 641 horsepower from a twin-turbocharged 3.8-liter V8. It has a top speed of 207 mph and runs from 0 to 60 mph in three seconds. It’s engineered to make anyone in loafers feel like a pro racer.


But some customers wanted a version that was even more focused on performance yet still street-legal (with things like headlights and airbags), so McLaren obliged. It modified the V8 to deliver a perfectly demonic 666 horsepower (that’s 675 PS, in metric horsepower) and stripped out bits like the air conditioner to trim the car’s weight by more than 200 pounds, a 7 percent drop. Those who can’t handle sweat-soaked seersucker can have the A/C put back in at no cost.



It’s a track-focused but street-legal version of the 650S. McLaren



A full 33 percent of the parts used in the 675LT are new, and it’s all about improving performance. McLaren stripped down the interior and swapped the leather-wrapped, cushy seats for racing versions that are 33 pounds lighter. It made the turbochargers from titanium, not stainless steel, to save weight. It used aluminum for the bolts on the 7-speed transmission. It recalibrated the powertrain to shift gears faster and deliver better acceleration. The result is a track-destined machine that offers 1 horsepower for every four pounds it weighs, an impressive ratio.


The 675LT (LT is for “Longtail,” a nod to the McLaren F1 GTR Long Tail that raced at the 24 Hours of Le Mans) posts some wild stats. It hits 60 mph in 2.9 seconds, and 124 mph in 7.9. Top speed is actually down a touch, to 205 mph, a downside of gearing that favors acceleration instead (which we prefer, since you’ll definitely do the 0-to-60 run but probably never reach top speed). If you drive it carefully, it’ll go 20 miles on a gallon. Please never do that; McLaren promises a limited run, and we don’t want to regret letting you get one.


We’ll know more about pricing (spoiler: way more than $265K) and how many 675LTs McLaren plans to sell when the car appears at the Geneva Motor Show next month.


As for looks, there’s nothing too crazy: The 675LT resembles basically every other car in its class, as individual design loses out to the physics of aerodynamics and immutable government safety regulations. But in the absence of a design for the ages, we’ll gladly take a vehicle that takes something good like the 650S and makes it even better.



Pricing hasn’t been revealed yet, but expect this thing to cost a bunch more than the $265K 650S. McLaren




Magic: The Story of an Accidentally Founded, Wildly Viral Startup



WIRED



When I moved to Brooklyn, my roommate hooked me up with his weed dealer. He laid out shockingly specific instructions for contacting the guy. First, I had to text a certain number with the message “Is this mm?” Those capitalizations, that punctuation. If I was cool, I’d get a text back from a different (and ever-changing) number with the menu for the day. The prices were non-negotiable, there were no questions; I was to reply with a product name and a number of grams. It was complicated and felt somehow dangerous, but it worked: With just two texts, I summoned a carnival of cannabis to my front door.


Magic, a new company that was never meant to launch but this weekend went completely viral, is that text-message delivery system gone global (and legit). This is an awful cliche and I hate myself for writing this, but it really is Uber for everything. As in, you want something, they get it for you. There’s no brilliant algorithm behind it, no clever hack—just a bunch of people using every service, trick, and tool they can find. They’ll order from Instacart or Seamless; they’ll call the manufacturer or go to the store. Magic is the aggregator of aggregators, a dispatcher for every service and store on earth. Depending on how you read it, it’s either an impossibly cynical commentary on the sad state of our own resourcefulness, or an earth-shattering productivity tool.


You just text a number and tell it what you want. You can ask for anything: a burger for dinner, or four tickets to the game tonight. Magic can find you the part you need to fix your bike. It can literally buy a car and charge it to your credit card. Think about all the times you’ve ever said “I would give anything to not be doing this right now.” Magic will do that thing for a few bucks.


Personally, I wanted headphones. So I found the number on the website, and wrote “I need a pair of JayBird BlueBuds X.” (I hear they’re good.) Then I stared at my phone as if it might suddenly sprout wireless earbuds. Instead, I got a text back: “Welcome to Magic! Due to high demand, you’ve been placed on our waitlist.” I was #178 in line. Another text promised that I could tweet and Facebook about Magic to move up the list, so of course I did because what else is social media for? A few minutes later I was still #178.



Jaybird



Magic sucks at waiting lists, but, then again, it wasn’t supposed to have one. It wasn’t even supposed to be a company. It was just something Mike Chen and his buddies made to see if it would work.


Chen is CEO of Bettir, a fledgling company developing a blood-pressure app inside the Y Combinator startup accelerator. The app is all about personalized coaching, chatting with you to explain what your blood pressure means and how you can improve it. Building their back-end system led Chen and his four co-founders—Ben Godlove, Nic Novak, Michael Rubin, and David Merriman—to wonder: how much can you really do over text? What if you could just say “I want a pepperoni pizza” or “I need the soonest possible reservation at Nobu,” and it would just happen? You wouldn’t have to worry about the how, the where, or the by whom—that would be somebody else’s problem. All you’d have to do is send a text.


So he registered a phone number using the business-friendly internet calling service Twilio, and texted it. “We made this number as a test,” Chen told me, “and I had a few people standing by.” Chen’s first request was a demo for his co-founders. He texted the number asking for chicken fried rice, and his co-founder Michael Rubin made sure it got delivered. Their exchange amounted to paying your buddy a few bucks to order Seamless for you.
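For a sense of how little machinery such a text line needs, here is a rough sketch of a Twilio-backed inbound-SMS webhook in Python. It is an illustration, not Magic’s actual code: the `/sms` route, the canned reply, and the bare `print` dispatcher are all placeholders, and it assumes the `flask` and `twilio` packages plus a Twilio number configured to POST incoming messages to this URL.

```python
from flask import Flask, request
from twilio.twiml.messaging_response import MessagingResponse

app = Flask(__name__)

@app.route("/sms", methods=["POST"])
def incoming_sms():
    order = request.form.get("Body", "")    # the text the customer sent
    sender = request.form.get("From", "")   # the customer's phone number
    # In the earliest version of a service like this, "dispatch" is literally
    # a person reading requests like this one and placing the order by hand.
    print(f"New request from {sender}: {order}")
    reply = MessagingResponse()             # TwiML reply sent back over SMS
    reply.message("Got it! We'll text you a price shortly.")
    return str(reply)

if __name__ == "__main__":
    app.run(port=5000)
```

Because every reply travels back over SMS, the whole operation can hide behind a single phone number, with the real work happening wherever the dispatcher happens to be.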


Magic’s first order: chicken fried rice


It worked, so Chen sent the number to a few of his friends for them to use, too. Then he made a simple website, so he didn’t have to explain it to each person individually. “Very quickly,” he says, “they emailed me back, saying ‘hey is it okay if I share this with a few of my friends?'” He said, “Sure,” and started thinking about doing some user studies to develop this idea into a real product. Then, this past Saturday, his site mysteriously appeared on Product Hunt, that curator of cool products in Silicon Valley. It also hit Hacker News and immediately jumped to the top. In a matter of minutes, Magic went from 30 users to thousands.


The team scrambled, enlisting friends and family members—and at least one co-founder’s girlfriend—to start answering requests. There are still just 10 to 20 people on the team, according to Chen, though he says others have probably been hired while we’ve been on the phone. Still, though: If you text Magic a pizza order right now, odds are good one of its founders is calling Papa John’s.


Or, in my case, Insomnia Cookies. After I paid $50 to jump 178 spots in line, I got a VIP number and texted again. By now I was hungry, so I ordered a dozen chocolate chip cookies. Five minutes and $26 later, they were supposedly on their way—but they never arrived, and I never heard why. I never got my money back, either.


In Magic’s quest to have everything “just handled like magic,” as Chen described it (with the third or fourth accidental pun of our conversation), Magic’s people apparently forgot to give the delivery guy my number. Since all I’d ever done was send a text to a mysterious number, I had no way to find out.


Okay, so Magic is not to be trusted with my $26. Clearly, the best thing to do was try and give them $26,000. To test Magic’s limits, I then came frighteningly close to buying a 2015 VW Jetta.



Volkswagen



“I want to buy a 2015 VW Jetta,” I wrote. The very friendly person on the other side of my text conversation said sure, they could help, and asked me a dozen questions about colors and interiors, where I was located, and whether I wanted a CD player. Suddenly I realized: Oh my God, they’re actually going to buy me a car. I’m excited about the future, but not yet ready to blow my credit limit with a single text.


They would have done it, though. As my magician—which is really what Magic’s workers are called—reminded me, Magic’s goal is to do anything, anytime, as long as it’s legal. Need someone to stand in line for you, clean your apartment, help you plan your party, or find you a tiger? The tiny team at Magic is down. (They’re looking into the dubious legality of tiger-purchase now, and I get the sense Chen really hopes Magic gets to buy one.)


“You know what?” Chen says, dead serious. “People do want cars, and they do want helicopters to Vegas right now. We’re not here to say what you should or shouldn’t want, we’re here to make it possible to have.” If this company has a mission statement, that’s it.


“People do want cars, and they do want helicopters to Vegas right now.”


Luckily for Magic, Chen says most people’s immediate needs are more “Thai food” than “tiger.” When I tried the service again a few hours later, still in the midst of the company’s viral explosion, I just wanted dinner. I asked for “a good bottle of Cabernet Sauvignon” between $17 and $25, plus a butternut squash and chorizo soup from the diner downstairs, which definitely exists but isn’t exactly on the menu. After a few texts about my address and my chorizo nationality preferences, the conversation just went silent. An hour later I gave up and ordered Seamless for myself, the old-fashioned way. My magician eventually resurfaced, full of apologies and promises of future chorizo, but not until I was already finished eating.


I’ve never once actually gotten what I wanted from Magic.


Let me repeat that. I’ve never actually gotten anything from Magic. Chen assured me most people are having far better experiences, and that seems to be true, but the service is clearly already slowing under the influx of requests. Food is the most common type of request, and the easiest to fulfill thanks to apps like Seamless and DoorDash. But it’s only going to get tougher for Magic. “Uber for everything” is easy enough to pull off when you’re a bunch of funded entrepreneurs with money to spare and a hankering for fried rice; making Magic work for anything, anywhere, and for everyone is much harder. Even with all the helpful dry-cleaning services and food-delivery apps it can turn to, scaling with this exponential demand is nigh impossible.


If they can somehow figure it out, though, there’s always money to be made selling convenience. The company’s business model is simple: Get you what you want and charge you a little extra for the service. Magic makes you pay before it does anything, and the cost represents the price of your order plus the time and effort spent to acquire it. I never negotiated the price with a magician, though I’m sure you could. Everything is paid using Stripe, another Y Combinator startup, which you sign up for from a link in Magic’s response to your first text. Chen says Magic never even sees your credit card information. (His response to my question about what the company does with all the private information it asks you to send in a text—names, addresses—is less convincing, and consists mostly of not allowing photos in the office. Clearly, Magic’s viral hit happened so fast the team hasn’t given privacy a real thought yet.)
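As a rough illustration of that model (and emphatically not Magic’s code), here is what the server-side half of a Stripe charge can look like in Python: the customer’s card is exchanged for a one-time token on Stripe’s side, and the backend charges that token for the order price plus a markup. The API key, token, prices, and helper name below are placeholders.

```python
import stripe

stripe.api_key = "sk_test_placeholder"   # hypothetical secret key; never hard-code a real one

def charge_order(token, item_cost_cents, service_fee_cents, description):
    # Charge the order price plus a convenience markup, all in cents.
    return stripe.Charge.create(
        amount=item_cost_cents + service_fee_cents,
        currency="usd",
        source=token,                    # one-time card token from Stripe's checkout flow
        description=description,
    )

# Example: a $19 cookie order with a $7 service fee.
# charge_order("tok_visa", 1900, 700, "A dozen chocolate chip cookies")
```

Because only the token ever reaches the server, the service can take payment up front without ever handling raw card numbers, which is the point of routing everything through Stripe.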


The basic model can work. Chen says he’s already seen overwhelmingly that people are willing to pay a little more to just not worry about anything. There’s no system for figuring out how much to charge for an order, though, which has already cost the company money when projects were harder than expected. Every single task and interaction is currently handled by a person, which will quickly become untenable. As he describes building dozens of software iterations just in the last 48 hours, Chen already sounds like he’s forgotten what his bed feels like.


But he’s sold on Magic, and Bettir is rapidly getting out of the blood-pressure business. He’s even still using the service himself, as crazy as it sounds. And he knows he’s riding a rare wave of virality, so he’s committed to getting Magic right and doing it quickly. He has to figure out how to scale, how to automate, how to hire enough contractors all over the country and pay them well enough that they’ll stick around long enough to upend an entire industry with a little technology and a lot of elbow grease. It sounds insane. It also sounds like Uber.


I never did get that Cabernet. But at 11:22 on Monday night, I got a text. There’s a place with the perfect wine for me, it said, but it’s closed. “But I will be in touch with you tomorrow,” my magician wrote, “with a tasty treat. :)”


I saved the number.