Acer Bets Big on Chromebooks with First 15.6-Inch Model


The Acer Chromebook 15 features a 1920 x 1080 resolution 15.6-inch display. Acer



Chromebooks were a hot item this holiday season, and it looks like PC makers will push that momentum further in 2015. Acer, the world’s second-largest Chromebook seller according to Gartner, hopes to woo more users onto the web-based notebook platform with the world’s first 15.6-inch Chromebook, among other announcements for the new year.


The Acer Chromebook 15 has modest specs. It houses a fifth-generation Intel Celeron processor and a 1920 x 1080 full HD display (buyers can alternatively opt for a cheaper 1366 x 768 panel). As for connectivity, it’s got 802.11ac Wi-Fi and Bluetooth LE, along with USB 3.0, USB 2.0, and HDMI ports. Weighing in at 4.85 pounds, it promises eight hours of battery life.


While the availability date is TBD, the Chromebook 15 will start at a very reasonable $250 and come in 16 GB or 32 GB SSD variants with up to 4 GB of RAM. Acer also updated its smaller Chromebook 13 with a touchscreen display, though details on when that model will be available are coming later.


The Aspire V 17 Nitro Black Edition (VN7-791) and its backlit keyboard. Acer



Acer is also updating its Aspire V 17 Nitro notebook series with Intel 3D cameras. The camera module, composed of a traditional camera, an IR camera, and an IR laser projector, can interpret 3D movements so you can control the screen without ever touching a mouse or keyboard. It can also act as a 3D scanner, which is useful for generating 3D models and for 3D printing. Acer includes a couple of programs that take advantage of these capabilities.


Beyond this fancy front-facing shooter, the Aspire V 17 comes packed with a Core i7 processor, Nvidia GeForce graphics (with up to 4 GB of dedicated video RAM for super-smooth gameplay), a 128 GB or 256 GB SSD, and up to 16 GB of memory. The notebook also includes a redesigned fan intended to run quieter, keep the PC cooler, and reduce dust build-up. You can also opt for a Blu-ray or DVD drive. The Aspire V 17 Nitro goes on sale in January at a price yet to be announced.


And in the home theater arena, Acer’s also adding a new short-throw projector (the H7550ST) to its collection. It has a Chromecast built in, 2D to 3D conversion, and the ability to stream HD audio over Bluetooth to speakers or a headset. It’ll be available in March for $1,000.


We’ll get a chance to go hands-on with these new products (and many others) beginning Sunday at CES in Las Vegas.



Walter Isaacson on The Imitation Game and Making Alan Turing Famous


Walter Isaacson. Patrice Gilbert



The recent film The Imitation Game stars Benedict Cumberbatch as Alan Turing, a British mathematical genius who helped the Allies win World War II by working to break the German Enigma code. After the war Turing was persecuted for his homosexuality, and subjected to cruel and degrading treatment that led him to take his own life. Last year Turing received a posthumous pardon from the Queen, and his legacy endures in such areas as mathematics, computer science, and artificial intelligence. One of his admirers is Walter Isaacson, whose new book The Innovators profiles Turing and other digital pioneers.


“One of the reasons I wrote this book is because I wanted to make people like Alan Turing famous,” Isaacson says in Episode 131 of the Geek’s Guide to the Galaxy podcast. “And now I must admit that Benedict Cumberbatch, by playing him, has done that a thousand times better than I ever could have.”


Isaacson is famous for his biographies of such figures as Benjamin Franklin, Albert Einstein, and Steve Jobs. But lately he’s come to feel that the biography format puts too much emphasis on individual personalities. The Innovators tries to show that great breakthroughs mostly come from team efforts, something The Imitation Game conveys very well.


“What the movie does show clearly is that Turing comes to the realization that you can’t do it alone, you’ve got to collaborate and be part of a team,” Isaacson says.


Isaacson hopes the film will inspire audiences to seek out more information about the real-life story of Turing, whether that means turning to The Innovators or to other works of nonfiction such as Alan Turing: The Enigma by Andrew Hodges.


“The movie does get to some real truths by taking literary license, but also the real story of Alan Turing is just a beautiful, heroic, and tragic story,” he says.


Listen to our complete interview with Walter Isaacson in Episode 131 of the Geek’s Guide to the Galaxy podcast (above), in which he discusses the work of Alan Turing and other digital pioneers, and check out some highlights from the discussion below.


Walter Isaacson on Ada Lovelace:


“She was Lord Byron’s daughter, and thus she was kind of poetic, but her mother was a mathematician, so she developed what she called ‘poetical science,’ and she loved looking at how punch cards were instructing the looms of industrial England in the 1830s to make beautiful patterns. She had a friend, Charles Babbage, who was making a numerical calculator, and she realized that with punch cards that calculator could do anything—art, music, words, as well as numbers. And so to me she’s a patron saint of the revolution. … So I think that women have been at the forefront of pioneering the art of programming, but they’ve been written out of history a bit, and they really haven’t had as much of a role since then as they should have. … My daughter first introduced me to the importance of Ada Lovelace, because she was 15 and a computer geek, and she said that the only computer programmer who was a woman she’d ever heard of was Oracle in the Batman comics. And then she heard of Ada Lovelace, so she got excited, because she realized that real women could be programmers.”


Walter Isaacson on the creation of the Internet:


“When I was at Time magazine, we wrote the story that it was done to survive a nuclear attack, and we got a letter from Steve Crocker, who was in charge of what was called the ‘Request for Comments’—these were the ideas and rules and protocols for doing the Internet. And he sent us a letter saying, ‘No, that’s not why the Internet was created. It was created because we wanted to decentralize control over it.’ And Time magazine was very arrogant back in those days, so it sent a letter back to Steve Crocker saying, ‘No, we’re not going to print your letter, because we have better sources than you about why it was done.’ And I thought, ‘Well, that’s ridiculous.’ But when I was doing this book I still had the right to go back rummaging through the archives at Time magazine, and I tried to find out who was the better source—it turned out to have been Steve Lukasik, who had become the head of ARPAnet, and Steve Lukasik said, indeed, that’s how he got the money from the colonels in the Pentagon, or Congress, by emphasizing it would survive a Russian attack. And he said, ‘You can tell Steve Crocker that he was on the bottom and I was on the top, so he didn’t really know what was happening.’ When I sat and had coffee with Steve Crocker, interviewing him for this book, I told him that, and he stroked his chin and said, ‘You can tell Steve Lukasik that I was on the bottom and he was on the top, so he didn’t know what was happening.’”


Walter Isaacson on “Al Gore invented the Internet”:


“It got a little annoying after a while, because people would laugh and think, ‘Ha ha, what an original joke.’ And so I did do a bit on why Al Gore was important. When I was running digital management for Time magazine in the early 1990s, you could not as an average person go right onto the Internet. You could only go on the Internet if you were part of a university or a research group, something like that. And in 1992, Al Gore passes the Gore Act of 1992, which opens up the Internet so that anybody who can dial up with a modem and get to an online service like AOL or CompuServe or Prodigy, or just wants to dial up, can go directly onto the Internet. This transforms the digital revolution. It makes it not just a network of research centers, but it makes it into the Internet we have today. At that time, speaking of WIRED and Time magazine, Louis Rossetto and I were friends—he had founded WIRED—and we were both on AOL and CompuServe, these proprietary services. And it was in late 1993, I remember talking to him about, ‘Why don’t we go directly onto the Internet?’ Especially since the World Wide Web had been developed by Tim Berners-Lee, which made it easier to navigate to places on the Internet. And that was a big transforming thing that happens in 1992-1994 where the number of websites goes from zero to 10,000 in one year, and it’s largely because of the Gore Act of 1992, which opens up the Internet to the general public.”


Walter Isaacson on artificial intelligence:


“It always seems to be 20 years away. In fact, at the beginning of this year, if you just search it, you’ll find stories in the New York Times saying that neuromorphic chips are being developed that’ll mimic the human mind, and in 20 years we’ll have artificial intelligence. It always seems to be a bit of a mirage, and it always seems that things like Google or Wikipedia that combine human creativity with machine power always make greater advances than machine power alone does. … This is something that Garry Kasparov figures out when he gets beaten by the IBM machine Deep Blue. He decides to create a contest in which humans working with computers can play against either the best computer or the best human grandmaster. And in all of these contests, the combination of the human and machine—even if it’s amateur players working with laptop machines—tends to beat the grandmaster or the best computer. And this is a game—chess—which you have to remember is simply an algorithmic rule-driven game, so eventually computers should be able to crack that totally. On far more complicated things like ‘Should the NSA be allowed to eavesdrop?’ that’s a question I don’t think machines will ever be able to answer as well as a combination of machines and humans could.”


