SAR11, the oceans' most abundant organism, has the ability to produce methane

The oxygen-rich surface waters of the world's major oceans are supersaturated with methane -- a powerful greenhouse gas that is roughly 20 times more potent than carbon dioxide -- yet little is known about the source of this methane.



Now a new study by researchers at Oregon State University demonstrates the ability of some strains of the oceans' most abundant organism -- SAR11 -- to generate methane as a byproduct of breaking down a compound for its phosphorus.


Results of the study are being published this week in Nature Communications. The research was funded by the National Science Foundation and the Gordon and Betty Moore Foundation.


"Anaerobic methane biogenesis was the only process known to produce methane in the oceans and that requires environments with very low levels of oxygen," said Angelicque "Angel" White, a researcher in OSU's College of Earth, Ocean, and Atmospheric Sciences and co-author on the study. "In the vast central gyres of the Pacific and Atlantic oceans, the surface waters have lots of oxygen from mixing with the atmosphere -- and yet they also have lots of methane, hence the term 'marine methane paradox.'


"We've now learned that certain strains of SAR11, when starved for phosphorus, turn to a compound known as methylphosphonic acid," White added. "The organisms produce enzymes that can break this compound apart, freeing up phosphorus that can be used for growth -- and leaving methane behind."


The discovery is an important piece of the puzzle in understanding Earth's methane cycle, scientists say. It builds on a series of studies conducted by researchers from several institutions around the world over the past several years.


Previous research has shown that adding methylphosphonic acid, or MPn, to seawater produces methane, though no one knew exactly how. Then a laboratory study led by David Karl of the University of Hawaii and OSU's White found that an organism called Trichodesmium could break down MPn, making the compound a potential source of phosphorus -- a nutrient essential to every living organism.


However, Trichodesmium is comparatively rare in the marine environment and unlikely to be the sole source of the vast amounts of methane in surface waters.


So White turned to Steve Giovannoni, a distinguished professor of microbiology at OSU, who not only maintains the world's largest bank of SAR11 strains, but who also discovered and identified SAR11 in 1990. In a series of experiments, White, Giovannoni, and graduate students Paul Carini and Emily Campbell tested the capacity of different SAR11 strains to consume MPn and cleave off methane.


"We found that some did produce a methane byproduct, and some didn't," White said. "Just as some humans have a different capacity for breaking down compounds for nutrition than others, so do these organisms. The bottom line is that this shows phosphate-starved bacterioplankton have the capability of producing methane and doing so in oxygen-rich waters."


SAR11 is the smallest free-living cell known and also has the smallest genome, or genetic structure, of any independent cell. Yet it dominates life in the oceans, thrives where most other cells would die, and plays a huge role in the cycling of carbon on Earth.


These bacteria are so dominant that their combined weight exceeds that of all the fish in the world's oceans, scientists say. In a marine environment that's low in nutrients and other resources, they are able to survive and replicate in extraordinary numbers -- a milliliter of seawater, for instance, might contain 500,000 of these cells.


"The ocean is a competitive environment and these bacteria apparently won the race," said Giovannoni, a professor in OSU's College of Science. "Our analysis of the SAR11 genome indicates that they became the dominant life form in the oceans largely by being the simplest."


"Their ability to cleave off methane is an interesting finding because it provides a partial explanation for why methane is so abundant in the high-oxygen waters of the mid-ocean regions," Giovannoni added. "Just how much they contribute to the methane budget still needs to be determined."


Since the discovery of SAR11, scientists have been interested in the bacteria's role in Earth's carbon budget. Now their possible involvement in methane production gives the study of these organisms new importance.



Mechanism that prevents lethal bacteria from causing invasive disease is revealed

Researchers at the University of Liverpool's Institute of Infection and Global Health have made an important advance in understanding how the bacterium that causes pneumonia, meningitis and septicaemia can remain harmlessly in the nose and throat.



Streptococcus pneumoniae is a 'commensal', which can live harmlessly in the nasopharynx as part of the body's natural bacterial flora. However, in the very young and old it can invade the rest of the body, leading to serious diseases such as pneumonia, sepsis and meningitis, which claim up to a million lives every year worldwide.


The conditions that drive this bacterium from harmless commensal to major pathogen, however, have not been well understood.


Scientists at the University have now uncovered the mechanisms by which this occurs and how it is regulated by the host immune system.


They found that a specialised group of white blood cells called T regulatory cells are activated by the pneumococcus and move to dampen down a damaging pro-inflammatory response from the host immune system.


When white blood cells attack bacteria they cause inflammation and, if uncontrolled, this inflammation can become excessive and damage host tissues, allowing the bacteria to spread into the rest of the respiratory system and other organs in the body.


The first author of the study, immunologist Dr Daniel Neill said: "These bacteria are quite happy to live in your nose and it's not in their interests to spread and kill their host. This is why they activate T regulatory cells: to keep the immune system in check and ensure their own survival.


"Our findings suggest induction of T regulatory cell responses in the upper airways reduces the risk of inflammatory damage that could lead to bacterial invasion and the development of disease.


"Understanding this process can now lead us to investigate how the bacteria go from this state to causing lethal infections."


The senior author of the study, Professor Aras Kadioglu said: "Vaccines are an essential part of our fight against this disease and have been highly successful.


"However, they do not protect us against all strains of pneumococci. Therefore, understanding the key immunological interactions with the pneumococcus, in the very first site they enter and colonise the human body is crucial to future development of better vaccines.


"In this study we have revealed how there is a delicate balance between the ability of the pneumococcus to colonise the host nasopharynx and the critical need of the immune system to prevent damaging inflammation in this key site.


"We hope that this will lead to developing novel therapies based on modulating the host immune system to prevent subsequent invasive disease."


The paper 'Density and Duration of Pneumococcal Carriage Is Maintained by Transforming Growth Factor β1 and T Regulatory Cells' was published in the American Journal of Respiratory and Critical Care Medicine.




Story Source:


The above story is based on materials provided by the University of Liverpool. Note: Materials may be edited for content and length.



From antibiotics to yeast: Latest student science heads for space

Astronauts on future missions may nibble on lettuce and grow their own antibiotics, depending on the results of research that student scientists plan to conduct on the International Space Station.



Mission 5 of the Student Spaceflight Experiments Program (SSEP) is scheduled to launch to the space station on July 11. A total of 1,344 proposals yielded 15 selected investigations for the flight. These investigations represent a diversity of subject matter from bacteria to tadpole shrimp and locations from Massachusetts to Arizona.


The provision of food in space proved a popular topic, serving as the focus of four studies. Students from Riebli Elementary and Mark West Charter School in California will examine whether Triops longicaudatus, or tadpole shrimp, could be grown in microgravity as a food source for long-term missions. The species' small size and high protein content make that an attractive possibility.


Ninth-graders from Cesar Chavez Public Charter School for Public Policy in Washington focused their attention on whether radish roots and shoots will grow differently in microgravity. Students from Cottage Lane Elementary in Rockland County, New York and Hillsborough County, Florida, envisioned astronauts growing their own lettuce. The Cottage Lane students hope to determine how long the plant takes to germinate in microgravity, while the Florida group looks at the frequency of lettuce seed germination in space.


Several investigations have potential applications for keeping humans healthy in orbit and on the ground. Northland Preparatory Academy students in Flagstaff, Arizona, will analyze how microgravity affects onion cell DNA replication. Onion cell mutations could have ramifications for other organisms, including astronauts. The team at Academy at Shawnee in Kentucky wonders whether microgravity would increase the rate of yeast fermentation in honey. Yeast fermentation on Earth is used to produce alcohol, which could serve as an antiseptic or in food production in space.


A study by Murray Hill Middle School in Maryland investigates the effects of microgravity on microencapsulation, a process that could be used to help control the rate at which a drug is released in the body. If you cut a Dugesia planarian worm, would it grow back in microgravity? Sixth-graders at North Attleborough Middle School in Massachusetts want to know the answer, which could eventually be put to use healing wounds in space and on Earth.


Students at Mendenhall Middle School in North Carolina will examine whether calcium sulfate crystals grown in space differ in size from those on Earth. Crystal formation may cause jellyfish born in microgravity to lose their sense of direction and could potentially affect humans in the same way.


Other student groups focused on fungi and bacteria. A team from Brookhaven Academy in Mississippi will determine whether the bacterium Ralstonia eutropha maintains its ability to produce polyhydroxyalkanoates (PHA) in microgravity. A biodegradable polyester created by bacterial fermentation, PHA can be used to make things such as skin grafts and valve replacements, which would come in handy up in space.


Eighth-graders at Pennsauken Phifer Middle School in New Jersey will examine the growth rate in microgravity of Penicillium, which future astronauts could grow as an antibiotic to treat infections. Montachusett Regional Vocational Technical School in Massachusetts has a team monitoring the effect of microgravity on the growth of Bacillus subtilis, also useful as an antibiotic.


Mold also is on the mind of students at New Explorations into Science, Technology and Math High School in New York. Their investigation of the effect of microgravity on the growth of mold on white bread will show the amount of mold present in dust on the station and the ground.


Two teams are interested in rust in space. St. Peter's School students in Kansas City, Missouri, want to determine how microgravity affects oxidation, which could rust the interior and exterior of spacecraft. A team from Milton L. Olive Middle School in New York will evaluate the effectiveness of a commercial spray corrosion inhibitor, Rust-Oleum's 'Stops Rust,' in microgravity.


This wide range of subjects illustrates the diversity of the space station as a microgravity research laboratory. These are also experiments that can't easily be done anywhere else, as NCESSE director Jeff Goldstein, Ph.D., explains. "Typically, researchers change one variable, hold everything else constant, and then see what happens. Gravity is a key variable, one we take for granted and also one that we don't have the ability to change at will." The space station offers students the ability to ask: what system would I like to explore, with gravity seemingly turned off, in order to assess its role?


Schools submit proposals to compete for a spot, with selection based on protocols patterned after what NASA uses with the professional community.


So far, more than 30,000 students have had the experience of designing experiments for microgravity through SSEP, a program of the National Center for Earth and Space Science Education (NCESSE) in the U.S. and the Arthur C. Clarke Institute for Space Education internationally. A strategic partnership with NanoRacks LLC, working with NASA under a Space Act Agreement, makes the space station available as a student laboratory. Participation by nine of the Mission 5 communities was possible in part thanks to a grant to NCESSE from the Center for the Advancement of Science in Space (CASIS), a National Partner on SSEP.


Eighty-five communities across the U.S. and Canada have participated in SSEP, 19 of them in multiple flights. Students share their results at the Annual SSEP National Conference in Washington each July. Whether or not any of them continue conducting science investigations in space when they grow up, the astronauts of the future will be grateful for their hard work now.




Story Source:


The above story is based on materials provided by NASA. Note: Materials may be edited for content and length.



High-quality gene catalog of human gut microbiome created

Researchers from BGI, working within the Metagenomics of the Human Intestinal Tract (MetaHIT) project and in collaboration with other institutions around the world, have established the highest-quality integrated gene set for the human gut microbiome to date -- a close-to-complete catalogue of the genes of the microbes that reside inside us and massively outnumber our own cells. While the roughly 20,000 genes in the human genome have been available for over a decade, the gene catalog of the microbiome, our much larger "other genome," has to date been much more poorly understood and characterized.



The data released from this study should facilitate further research on the interactions between human and microbial genomes, and bring us closer to an understanding of how to maintain the microbial balance that keeps us healthy. The latest study was published online today in the journal Nature Biotechnology.


Each of our guts is colonized by more than 3 pounds of microorganisms that can break down toxins, manufacture vitamins and essential amino acids, and form a barrier against invaders. However, until now there has been a lack of comprehensive and uniformly processed database resources cataloging the human gut microbiota around the world, which has hindered our knowledge of the genetic and functional mechanisms of human gut microbes.


In this study, researchers established a catalog of human gut microbial genes by processing 249 newly sequenced samples and 1,018 published samples from MetaHIT, the Human Microbiome Project (HMP) and a large diabetes study from China, as well as 511 sequenced genomes of gut-related bacteria and archaea. The expanded cohort is at least three times larger than those used for previous gene catalogs.


Based upon the catalog, researchers investigated the gut microbiota of healthy Chinese and Danish adults, and found the two cohorts greatly differed in nutrient metabolism as well as xenobiotic detoxification, which might be influenced by the differences in diet and environment. In addition, they observed enrichment in possible antibiotic resistance genes both at the population level (penicillin resistance in Danes and multidrug resistance in Chinese) and in the individual-specific genes, which highlighted the need for close monitoring of direct and indirect exposure to antibiotics.


Individual-specific genes contributed overwhelmingly to the increased total gene number in the integrated gene catalog and were overrepresented among genes responsible for the synthesis of cell wall components, DNA-related functions such as transposases, endonucleases and DNA methylases, and phage-related proteins. Such individual-specific genes likely reflect adaptation and might reflect the distinct combination of genetic, nutritional and medical factors in a host.


This nonredundant reference catalog of over 9.8 million genes is freely accessible through the website and the data have also been deposited in BGI's GigaScience Database, GigaDB and the SRA. It provides a much expanded and invaluable resource for global researchers to more deeply explore the geographical, genetic, temporal and physiological characteristics of gut microbes.


Junhua Li, Research Scientist from BGI, said, "Catalogs of reference genes in the human gut microbiome should facilitate quantitative characterization of multi-omic data from the gut microbiome to understand its variation across populations in human health and disease."




Story Source:


The above story is based on materials provided by BGI Shenzhen. Note: Materials may be edited for content and length.



Revolutionary approach to studying intestinal microbiota

An international research team within the MetaHIT consortium, coordinated by INRA and involving teams from CEA, CNRS and Université d'Evry, has developed a new method to analyse the global genome, or metagenome, of the intestinal microbiota. This method markedly simplifies microbiome analysis and renders it more powerful. The scientists have thus been able to sequence and assemble the complete genomes of 238 intestinal bacteria, 75% of which were previously unknown. This work is being published on 6 July 2014 in Nature Biotechnology.



Research carried out in recent years on the intestinal microbiota has completely overturned our vision of the human gut ecosystem. Indeed, from "simple digesters" of food, these bacteria have become major factors in understanding certain diseases such as obesity, type 2 diabetes, or Crohn's disease. Important and direct links have also been demonstrated between these bacteria and the immune system, as well as with the brain. It is estimated that 100,000 billion bacteria populate the gut of each individual (10 to 100 times more than the number of cells in the human body), and their diversity is considerable, estimated at around a thousand different bacterial species in the human intestinal metagenome. However, because only 15% of these bacteria had previously been isolated and characterized by genome sequencing, an immense number of the microbial genes already identified still needed to be assigned to a given species.


Researchers from INRA, together with teams from CEA (Genoscope), CNRS and Université d'Evry in France, and scientists from other countries, have developed a new method that can markedly facilitate analysis of the gut metagenome, while at the same time improving the quality of the data obtained. To achieve this, they started from a simple hypothesis:



  • Within a bacterial species harboured by the gut of an individual, the relative abundance of genes remains constant, since every bacterium of a given species carries the same genes.

  • However, the abundance of these different species can vary markedly between individuals, from 10-fold to 1,000-fold, so the abundance of the genes each species harbours varies to the same extent.

  • By measuring the abundance of bacterial genes across different individuals, it is therefore possible to group together the genes of a specific bacterial species, because their abundances are the same within a particular individual but differ between individuals (a minimal code sketch of this idea follows the list).
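
The principle is easy to make concrete in code. The sketch below is illustrative only -- it is not the MetaHIT pipeline, which used far more scalable clustering over millions of genes -- and simply groups genes whose abundance profiles are strongly correlated across samples, with all data, names and thresholds invented for the example:

import numpy as np

def co_abundance_clusters(abundance, min_corr=0.9):
    """Group genes whose abundances rise and fall together across samples.

    abundance: (n_genes, n_samples) matrix of per-sample gene abundances.
    min_corr:  Pearson correlation above which two genes are grouped.
    """
    corr = np.corrcoef(abundance)        # gene-by-gene correlation matrix
    unassigned = set(range(abundance.shape[0]))
    clusters = []
    while unassigned:
        seed = unassigned.pop()          # arbitrary seed gene
        members = {g for g in unassigned if corr[seed, g] >= min_corr}
        unassigned -= members
        clusters.append(sorted(members | {seed}))
    return clusters

# Toy data: 6 genes measured in 6 individuals. Genes 0-2 belong to one
# hypothetical species, genes 3-5 to another; within a species the genes
# keep a fixed ratio, while the species itself varies between individuals.
species_a = np.array([10., 50., 5., 80., 20., 40.])
species_b = np.array([70., 5., 60., 10., 90., 15.])
genes = np.vstack([species_a, 2.5 * species_a, 0.3 * species_a,
                   species_b, 4.0 * species_b, 0.5 * species_b])

print(co_abundance_clusters(genes))      # two groups: [0, 1, 2] and [3, 4, 5]

Genes that are scaled copies of their species' profile correlate perfectly, while genes from different species do not, so the toy recovers the two species. The real analysis applied the same logic, at vastly greater scale, to the samples described in the next paragraph.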


An analysis of 396 stool samples from Danish and Spanish individuals allowed the researchers to cluster these millions of genes into 7,381 co-abundance groups of genes. Approximately 10% of these groups (741) corresponded to bacterial species, referred to as metagenomic species (MGS); the others corresponded to bacterial viruses (848 bacteriophages were discovered), plasmids (circular bacterial DNA fragments) or genes which protect bacteria from viral attack (known as CRISPR sequences). 85% of these MGS (around 630) were previously unknown bacterial species.


Using this new approach, the researchers succeeded in reconstituting the complete genomes of 238 of these unknown species, without prior culture of the bacteria. Living without oxygen, in an environment that is difficult to characterise and reproduce, most of these gut bacteria cannot be cultured in the laboratory. Until now, analysis of the metagenome was based on comparing the genes detected in a sample with those listed in catalogues of genes from bacteria that were known and could be cultivated in a laboratory (about 15% of gut bacteria), so it was impossible to assign genes to non-culturable bacteria.


The authors also demonstrated more than 800 dependent relationships within the 7,381 gene co-abundance groups; for example, phages require the presence of a host bacterium to survive. These dependent relationships enable a clearer understanding of the survival mechanisms of a micro-organism in its ecosystem. It is also the first time that an analysis has clarified the relationships between different biological entities in the gut microbiota, which will facilitate their detection, isolation and culture.


This study sheds unprecedented, highly detailed light on microbial communities in humans. The method thus developed enables considerably simpler analysis of genes in the gut microbiota; it is now possible to study just a few thousand genetic elements, or hundreds of species, rather than the millions of genes that make up the metagenome. The method also markedly improves the reliability and accuracy of statistical analyses.




Story Source:


The above story is based on materials provided by INRA-France. Note: Materials may be edited for content and length.



New knowledge about intestines uncovered: microorganisms, bacterial viruses in intestinal flora identified

Researchers from DTU Systems Biology have mapped 500 previously unknown microorganisms in human intestinal flora, as well as 800 previously unknown bacterial viruses (also called bacteriophages) which attack intestinal bacteria.



To map the microorganisms, the researchers developed a new principle for analysing DNA sequence data, which they have named the co-abundance principle. It assumes that different pieces of DNA from the same organism will occur in the same amount in a sample, and that this amount will vary across a series of samples.


"Using our method, researchers are now able to identify and collect genomes from previously unknown microorganisms in even highly complex microbial societies. This provides us with an overview we have not enjoyed previously," says Professor Søren Brunak who has co-headed the study together with Associate Professor Henrik Bjørn Nielsen.


So far, 200-300 intestinal bacterial species have been mapped. Now, the number will be more than doubled, which could significantly improve our understanding and treatment of a large number of diseases such as type 2 diabetes, asthma and obesity.


Viruses -- not antimicrobial agents


The two researchers have also studied the mutual relations between bacteria and viruses.


"Our study tells us which bacterial viruses attack which bacteria, something which has a noticeable effect on whether the attacked bacteria will survive in the intestinal system in the long term," says Henrik Bjørn Nielsen


Previously, bacteria were studied individually in the laboratory, but researchers are becoming increasingly aware that in order to understand the intestinal flora, you need to look at the interaction between the many different bacteria found.


And once we understand how intestinal bacteria interact, we can potentially develop more selective ways to treat a number of diseases.


"Ideally we will be able to add or remove specific bacteria in the intestinal system and in this way induce a healthier intestinal flora," says Søren Brunak.


This is particularly interesting in relation to the growing problem of antimicrobial resistance, which many consider a real threat to global health.


"We have previously been experimenting with using bacteria and viruses to fight disease, but this was shelved because antimicrobial agents have been so effective in combating many infectious diseases. If we can learn more about who attacks who, then bacterial viruses could be a viable alternative to antimicrobial agents. It is therefore extremely important that we now can identify and describe far more relations between bacteria and the viruses that attack them," says Henrik Bjørn Nielsen.




Story Source:


The above story is based on materials provided by the Technical University of Denmark. The original article was written by Signe Gry Braad. Note: Materials may be edited for content and length.



Why Was This Year’s July 4 Box Office So Bad? Blame Michael Bay


Image: michaelbay.com



Few holiday weekends mean as much to the film industry as July Fourth: the holy trifecta of summertime schedules, long weekends, and family togetherness has traditionally made it one of the prime box-office opportunities of the year. Yet, this weekend was…well, to say it was tepid is an understatement: the top 12 releases managed a cumulative domestic gross of just $120 million, down 46 percent from last year.


On the surface, the causes are obvious. First, the holiday landed on a Friday, shortening what’s typically at least a four-day release window in which studios can rake in the cash with stunts like Wednesday show times. Second, Hurricane Arthur rained fury down on the Northeast corridor. And third, there were no true tentpole movies. Yes, America loves Melissa McCarthy, but road tripping with her and Susan Sarandon in Tammy doesn’t quite fit the historical algorithm for Independence Day-levels of boffo box office. (And we’ll just leave Deliver Us From Evil alone, since that’s clearly a February release that lost its way.)


But that’s just looking at what happened on the weekend in question. It’s only by looking at the previous weekend that you learn everything you need to know: Michael Bay tanked everyone else’s holiday weekend.


You can’t change the calendar, but you can sure as hell pick your release date based around it—and that’s exactly what Paramount did with Transformers: Age of Extinction, the latest installment of its multi-billion dollar mechanized workhorse. No matter how many rotten tomatoes you throw at Bay, the man still owns the July 4 money-printing machine. And when you own something you can do whatever you want with it—including ditch it for a pre-holiday release, and then watch your movie run roughshod over two straight weeks of box-office competition.


It’s not very American, but when no one wants to compete with you, you win by default—and why should any studio with half a braintrust in charge want to try? 2011’s Transformers: Dark of the Moon and the original 2007 Transformers are the first and fourth biggest July 4 openings of all time. And even though Age of Extinction opted for a June 27th premiere, its second-week take still claimed the top spot for the holiday weekend. Even a modest $37 million long weekend is still good enough to beat runner-up Tammy by a cool $16 million, and to top the 17th highest-grossing Fourth of July release of all time (a little flick called Armageddon). In fact, three of Bay’s pictures are on the holiday’s all-time top 20 list, more than any other single director. But that pales in comparison to how he runs the summer.



It’s not like Bay is impervious to flops, but even his second-worst-performing movie, Pain & Gain, managed to double its budget and debut at number one. In fact, besides The Island, no Michael Bay directorial effort has debuted below the top spot, and the past two Transformers movies have retained that spot through their second weeks, with Age of Extinction making it a hat trick. Fox knew better than to compete, opting to forego the Fourth and roll out its much-anticipated Dawn of the Planet of the Apes on July 11th, giving Hasbro’s army a full two weeks to settle down before launching its own box-office invasion.


What we witnessed during this holiday drought was a lesson being learned by major film distributors: as Michael Bay goes, so goes the nation. And if you’re going to map out your summer release schedule for maximum success, best to give a wide berth to the Optimus Prime (or, actually, Megatron) of tentpole movies. In an era when hundred-million-opening-or-bust is the measure of a movie’s success, it’s no wonder that every studio turned yellow at the specter of Bay—even if that means we the consumers get meager offerings as a result. So when Independence Day 2015 rolls around and you’re revving up for a long weekend of brews, BBQs, fireworks and multiplex fun, just hope that Bad Boys 3 hasn’t claimed a late June/early July release. If it has, your pickings will likely be slim.



How Today’s F1 Cars Are So Amazingly Safe (And Horribly Uncomfortable)


Raikkonen during qualifying for the British Grand Prix. Image: Ferrari



If you saw Kimi Raikkonen hit a wall at 150 mph during the British Grand Prix, you appreciate how remarkably strong, and safe, a modern Formula 1 car is. The Ferrari driver experienced a 47G impact when he went nose-first into the wall, yet limped away with no major injuries.
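
To get a feel for what 47G means, here is a back-of-the-envelope estimate (ours, not telemetry, and treating the peak figure as if it were a constant deceleration from full speed):

$$v = 150\ \text{mph} \approx 67\ \text{m/s}, \qquad a = 47g \approx 461\ \text{m/s}^2$$

$$t = \frac{v}{a} \approx 0.15\ \text{s}, \qquad d = \frac{v^2}{2a} \approx 4.9\ \text{m}$$

That is, stopping from 150 mph within roughly five meters and a sixth of a second. The real crash was messier and 47g was only a momentary peak, but the numbers convey the violence the car's survival cell has to absorb.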


We’ve seen no shortage of spectacular crashes in F1 in recent years, and each is a testament to the level of safety engineered into the cars. Raikkonen’s shunt during the first lap was scary to see. The Finn ran wide through Turn 5, went into the ample run-off area, then clipped a rain gully as he tried to get back on track. That sent him careening into the wall, then spinning back across the track into the opposite wall. Williams F1 driver Felipe Massa—making his 200th career start—couldn’t avoid the careening Ferrari and ran right into him. Amazingly, no one was seriously injured in the pile-up, and Raikkonen suffered nothing more than a sore ankle.



Fireworks Fails and Other Independence Day Blunders Caught on Camera



As this past weekend proved, fireworks remain the most American thing you can do with your Independence Day. It doesn’t matter if you can’t even see them because it’s only 8 p.m. and the sun doesn’t set for another hour, someone will be so excited by what John Oliver calls “sparkly guns you can fire in the sky” that they’ll set them off anyway just to hear them make their slight screams and upsetting-to-dogs-everywhere pops. But what is it about fireworks—and many of the other Fourth of July festivities—that make them quite so compelling, considering their potential for, well, blowing up in the faces of those planning them?


I mean, sure, there’s historical precedent behind fireworks being at the center of most people’s July 4th celebrations. The very first Independence Day featured fireworks, as did George Washington’s inauguration, making the very idea of shooting off fireworks seem pretty damn American, thank you very much. But if historical accuracy were the deciding factor for most holiday celebrations, Wikipedia would have no reason to list “family reunions” or “baseball games” as traditional July 4 activities, yet it does.


That said, family reunions are safer ways to spend your holiday than messing around with fireworks. (After all, your aunt might not approve of your career, but her disdain is hardly likely to result in you losing an eyebrow.) And that’s just the beginning. Independence Day is chock full of ways to hurt yourself over the course of a long weekend—and, if you have a camera/smartphone handy, to make a Reddit-ready viral video or Vine. Here are just a few ways Fourth of July festivities could go wrong, presented by the quick videographers who were there to catch them.



What are you going to simulate? [Pharyngula]



The EU is sinking €1.2bn (and the US is proposing to spend more, $3 billion) into a colossal project to build a supercomputer simulation of the human brain. To which I say, “What the hell? We aren’t even close to building such a thing for a fruit fly brain, and you want to do that for an even more massive and poorly mapped structure? Madness!” It turns out that I’m not the only one thinking this way: European scientists are exasperated with the project.



"The main apparent goal of building the capacity to construct a larger-scale simulation of the human brain is radically premature," Peter Dayan, director of the computational neuroscience unit at UCL, told the Guardian.


"We are left with a project that can’t but fail from a scientific perspective. It is a waste of money, it will suck out funds from valuable neuroscience research, and would leave the public, who fund this work, justifiably upset," he said.



There is a place for Big Science. I’d suggest that when you’re at the preliminary exploratory stage, as we are with human brain function, it’s better to fund many small exploratory parties to map out the terrain, rather than launching a huge invasion with charts that are made out of speculation. We know a computer simulation is going to fail, because we don’t know what it’s going to simulate. So why are they doing this? Maybe it’s a question of who “they” are.



Alexandre Pouget of Geneva University, a signatory of the letter, said that while simulations were valuable, they would not be enough to explain how the brain works. "There is a danger that Europe thinks it is investing in a big neuroscience project here, but it’s not. It’s an IT project," he said. "They need to widen the scope and take advantage of the expertise we have in neuroscience. It’s not too late. We can fix it. It’s up to Europe to make the right decision."



I’ve noticed this, that a lot of gung-ho futurists and computer scientist types have this very naive vision of how the brain works — it’s just another computer. We can build those. Build a big enough computer, and it’ll be just like the brain. Nope. That’s operating on ignorance. And handing ignorant people billions of dollars to implement a glorious model of their ignorance is an exercise in futility.




Closure on the Obokata/STAP affair [Pharyngula]


I’ve been following the story of stimulus-triggered acquisition of pluripotency (STAP) cells with considerable interest, and there’s a good reason for that: from the very beginning, it contradicted how I’d always thought about cell states, and if it were true, I’d have to rethink a lot of things, which was vexing. But on the other hand, empirical results always trump mental models, so if the results held up, there was no question but that I’d have to go through that uncomfortable process of reorganizing my preconceptions. It would be OK, though, because there’d be a great prize at the end.


Well, it turns out that I don’t have to reboot my brain after all, because now that all the flailing about is over, STAP is a product of sloppiness and fakery, and is dead.


So here’s the controversy, and why I found it vexatious. We want to be able to specify cell states; in particular, we’d love to be able to take any cell from the human body, tickle it with a few specific signals, and see it throw away all of its historical constraints and become a different cell type altogether. In particular, the Holy Grail is to find the right combination of switches to cause any cell to become a pluripotent stem cell — the kind of cell we can then induce to become any other cell type we might need.


We know this can’t be impossible, and is probably even fairly simple, because we know that cells can do this already (well, to some degree; your body accomplishes this task by setting aside reserve populations of stem cells. It’s also likely that some cell types are so tightly locked in by the process of differentiation that their state is not reversible). The idea is that we just need to find the right combination of signals/genes — the right kind of key — and we can unlock the cell, and make it open to additional inductions that will allow us to manipulate it.


We have some idea of the shape of the key. Yamanaka identified four genes, Oct4, Sox2, cMyc, and Klf4, that, when activated, switch cells into a pluripotent state, making induced pluripotent stem cells, or iPS cells. It works. The handicap right now is that we only have a kind of brute-force method of switching those genes on, and two of them are oncogenic, so it’s as if we’ve got a rather clumsy key that opens the lock, but also damages it in unfortunate ways. The resolution to that problem is learning how to finesse the genes — we need to figure out how to more delicately switch on the necessary genes in a way other than bluntly transfecting cells with copies of the genes that are always on.


Then along came Haruko Obokata, an investigator in Japan who announced that she could induce stem cells with simple, generic stress, such as by exposing them to acid or physically pushing on the cells. It was like saying she didn’t need a specific key; all you needed to do was shake the lock really hard, and it would spontaneously pop open. What, really? That just seems too simple. It would be phenomenally awesome if true, but it seemed unlikely. But then, I remember this one lab I worked in where all the publicly popular drugs, like ketamine, were kept locked in a drawer to which only the PI had a key…but the countertop wasn’t secured to the bench, so if you knew about it, you could just lift the top and get easy access. It was a backdoor to the goodies that was so stupid you couldn’t believe it existed, but it did.


Could it be that cells similarly had a stupid weakness that could be so easily exploited? The short answer is no; read the whole article by David Cyranoski.



But the paper that set out the fundamental technique was soon shot full of holes. There was plagiarized text in the article. Figures showed signs of manipulation, and some images were identical or nearly identical to those used later in the same paper and elsewhere to represent different experiments. More damning were genetic analyses that strongly suggested the cells were not what they were purported to be. And although deriving STAP cells was advertised as simple and straightforward, no one has yet been able to repeat the experiment.


Within the space of six months, Obokata was found guilty of misconduct by her institution; well-respected scientists, including RIKEN head Ryoji Noyori, bowed their heads in apology; and both papers were retracted. In the end, the evidence for STAP cells seemed so flimsy that observers began to ask where were the extra precautions and the ‘extraordinary proof’ that had been promised post-Hwang.



It sure would have been nice to have a simple technique for generating stem cells, but I have to confess to being a bit relieved. There’s the vindication of prior thinking and the value of incrementally improving our stem cell protocols, of course, but also, I’d personally rather that it weren’t trivial to switch my cells to a de-differentiated pluripotent state — that’s a recipe for easy cancer generation, too. It is somehow reassuring to think that evolution has shaped multi-cellular organisms to be somewhat resistant to spontaneously going all stem-celly under stress.



The Next Big Programming Language You’ve Never Heard Of


Image: Getty



Andrei Alexandrescu didn’t stand much of a chance. And neither did Walter Bright.


When the two men met for beers at a Seattle bar in 2005, each was in the midst of building a new programming language, trying to remake the way the world creates and runs its computer software. That’s something pretty close to a hopeless task, as Bright knew all too well. “Most languages never go anywhere,” he told Alexandrescu that night. “Your language may have interesting ideas. But it’s never going to succeed.”


Alexandrescu, a graduate student at the time, could’ve said the same thing to Bright, an engineer who had left the venerable software maker Symantec a few years earlier. People are constantly creating new programming languages, but because the software world is already saturated with so many of them, the new ones rarely get used by more than a handful of coders—especially if they’re built by an ex-Symantec engineer without the backing of a big-name outfit. But Bright’s new language, known as D, was much further along than the one Alexandrescu was working on, dubbed Enki, and Bright said they’d both be better off if Alexandrescu dumped Enki and rolled his ideas into D. Alexandrescu didn’t much like D, but he agreed. “I think it was the beer,” he now says.


Andrei Alexandrescu. Photo: Ariel Zambelich/WIRED



The result is a programming language that just might defy the odds. Nine years after that night in Seattle, a $200-million startup has used D to build its entire online operation, and thanks to Alexandrescu, one of the biggest names on the internet is now exploring the new language as well. Today, Alexandrescu is a research scientist at Facebook, where he and a team of coders are using D to refashion small parts of the company’s massive operation. Bright, too, has collaborated with Facebook on this experimental software, as an outside contractor. The tech giant isn’t an official sponsor of the language—something Alexandrescu is quick to tell you—but Facebook believes in D enough to keep him working on it full-time, and the company is at least considering the possibility of using D in lieu of C++, the venerable language that drives the systems at the heart of so many leading web services.


C++ is an extremely fast language—meaning software built with it runs at high speed—and it provides great control over your code. But it’s not as easy to use as languages like Python, Ruby, and PHP. In other words, it doesn’t let coders build software as quickly. D seeks to bridge that gap, offering the performance of C++ while making things more convenient for programmers.


Among the giants of tech, this is an increasingly common goal. Google’s Go programming language aims for a similar balance of power and simplicity, as does the Swift language that Apple recently unveiled. In the past, the programming world was split in two: the fast languages and the simpler modern languages. But now, these two worlds are coming together. “D is similar to C++, but better,” says Brad Anderson, a longtime C++ programmer from Utah who has been using D as well. “It’s high performance, but it’s expressive. You can get a lot done without very much code.”




In fact, Facebook is working to bridge this gap with not one but two languages. As it tinkers with D, the company has already revamped much of its online empire with a new language called Hack, which, in its own way, combines speed with simplicity. While using Hack to build the front-end of its service—the webpages you see when you open the service in your web browser—Facebook is experimenting with D on the back-end, the systems that serve as the engine of its social network.


But Alexandrescu will also tell you that programmers can use D to build anything, including the front-end of a web service. The language is so simple, he says, you can even use it for quick-and-dirty programming scripts. “You want to write a 50-line script? Sure, go for it.” This is what Bright strove for—a language suitable for all situations. Today, he says, people so often build their online services with multiple languages—a simpler language for the front and a more powerful language for the back. The goal should be a single language that does it all. “Having a single language suitable for both the front and the back would be a lot more productive for programmers,” Bright says. “D aims to be that language.”


The Cape of a Superhero


When Alexandrescu discusses his years of work on D, he talks about wearing the “cape of a superhero”—being part of a swashbuckling effort to make the software world better. That’s not said with arrogance. Alexandrescu, whose conversations reveal a wonderfully self-deprecating sense of humor, will also tell you he “wasn’t a very good” programming language researcher at the University of Washington—so bad he switched his graduate studies to machine learning. The superhero bit is just a product of his rather contagious enthusiasm for the D project.


For years, he worked on the language only on the side. “It was sort of a free-time activity, in however much free-time a person in grad school can have, which is like negative,” says Alexandrescu, a Romanian who immigrated to the States in the late ’90s. Bright says the two of them would meet in coffee shops across Seattle to argue the ins and outs of the language. The collaboration was fruitful, he explains, because they were so different. Alexandrescu was an academic, and Bright was an engineer. “We came at the same problems from opposite directions. That’s what made the language great–the yin and the yang of these two different viewpoints of how the language should be put together.”


For Alexandrescu, D is unique. It’s not just that it combines speed and simplicity. It also has what he calls “modeling power.” It lets coders more easily create models of stuff we deal with in the real world, including everything from bank accounts and stock exchanges to automotive sensors and spark plugs. D, he says, doesn’t espouse a particular approach to modeling. It allows the programmer “to mix and match a variety of techniques to best fit the problem.”




He ended up writing the book on D. But when he joined Facebook in 2009, it remained a side project. His primary research involved machine learning. Then, somewhere along the way, the company agreed to put him on the language full-time. “It was better,” he says, “to do the caped-superhero-at-night thing during the daytime.”


For Facebook, this is still a research project. But the company has hosted the past two D conferences—most recently in May—and together with various Facebook colleagues, Alexandrescu has used D to rebuild two select pieces of Facebook software. They rebuilt the Facebook “linter,” known as Flint, a means of identifying errors in other Facebook software, and they fashioned a new Facebook “preprocessor,” dubbed Warp, which helps generate the company’s core code.


In both cases, D replaced C++. That, at least for the moment, is where the language shines the most. When Bright first started the language, he called it Mars, but the community that sprung up around the language called it D, because they saw it as the successor to C++. “D became the nickname,” Bright says. “And the nickname stuck.”


The Interpreted Language That Isn’t


Facebook is the most high-profile D user. But it’s not alone. Sociomantic—a German online advertising outfit recently purchased by British grocery giant Tesco for a reported $200 million—has built its operation in D. About 10,000 people download the D platform each month. “I’m assuming it’s not the same 10,000 every month,” Alexandrescu quips. And judging from D activity on various online developer services—from GitHub to Stack Overflow—the language is now among the 20 to 30 most popular in the world.


For coder Brad Anderson, the main appeal is that D feels like interpreted languages such as Ruby and PHP. “It results in code that’s more compact,” he says. “You’re not writing boilerplate as much. You’re not writing as much stuff you’re obligated to write in other languages.” It’s less “verbose” than C++ and Java.


Yes, like C++ and Java, D is a compiled language, meaning that you must take time to transform it into executable software before running it. Unlike with interpreted languages, you can’t run your code as soon as you write it. But it compiles unusually quickly. Bright—who worked on C++, Java, and JavaScript compilers at Symantec and Sun Microsystems—says this was a primary goal. “When your compiler runs fast,” he says, “it transforms the way you write code.” It lets you see the results much faster. For Anderson, this is another reason that D feels more like an interpreted language. “It’s usually very, very fast to compile–fast enough that the edit [and] run cycle usually feels just like an interpreted language.” He adds, however, that this begins to change if your program gets very large.




What’s more, Anderson explains, a D program has this unusual ability to generate additional D code and weave this into itself at compile time. That may sound odd, but the end result is a program more finely tuned to the task at hand. Essentially, a program can optimize itself as it compiles. “It makes for some amazing code generation capabilities,” Anderson says.


The trouble with the language, according to Alexandrescu, is that it still needs a big-name backer. “Corporate support would be vital right now,” he says. This shows you that Facebook’s involvement only goes so far, and it provides some insight into why new languages have such trouble succeeding. In addition to backing Hack, Facebook employs some of the world’s leading experts in Haskell, another powerful but relatively underused language. What D needs, Alexandrescu says, is someone willing to pump big money into promoting it. The Java programming language succeeded, he says, because Sun Microsystems put so much money behind it back in the ’90s.


Certainly, D still faces a long road to success. But this new language has already come further than most.



Clever Building Solves Its Space Problem by Imitating an Armadillo




Paris in the 1800s was incredibly dense and dirty. The sweeping boulevards and airy town squares that characterize the French city today were constructed a bit later, in the mid-1800s, in accord with the vision of urban planner Georges-Eugène Haussmann. One such street is the Avenue des Gobelins, in the 13th arrondissement. To build that wide boulevard, Haussmann had to have nearby buildings—including an old cinema—demolished and rebuilt in the middle of a strangely shaped triangular city block.


Which is why, when Renzo Piano’s architecture firm took on a project for the Pathé Foundation (Fondation Jérôme Seydoux-Pathé) cinematography organization, the firm had a tiny morsel of land to work with.


Constraints often stoke creativity, and in this case, they’ve led to a curved dome of a building that arches upward, like a giant balloon armadillo squeezing its way through the streets of Paris. The Pathé Foundation needed an office building, and because the guts of the original theater that occupied the space couldn’t be adapted to accommodate that, the Renzo Piano team knocked it down.




Amazingly enough, according to lead architect Thorsten Sahlmann, the former building ate up even more of the neighbors’ space. “When we started the building we said, ‘Okay, when we demolish we have to do a building that’s doing better for the neighbors around,’ ” he says. “It’s one of the reasons why the building has this organic shape, because actually it reacts much better to the constraints of the site.”


Image: Renzo Piano Building Workshop



By replacing a rectangular structure with a sloping, round one, Sahlmann and his team managed to create over 6,000 extra square feet of courtyard and ground space for the neighbors. Also, thanks to the distance between the ground floor and the nearby occupants, the new Pathé Foundation building could have transparent glass walls and allow more natural sunlight to reach the ground floor. From the third floor upwards, where the neighboring facades start to creep closer together, Sahlmann and his team covered the exterior in perforated aluminum siding that lets you see outward from within but presents itself as opaque from outside, creating the illusion of privacy.


They had another constraint along the way that had to do with the original theater building. Before demolishing the old building, Sahlmann and his team had a historical preservation society look over the site. It wasn’t deemed historically significant, save for the facade: “The sculptures on the facade were done by Rodin, when he was a student and he needed some money,” Sahlmann says. “It was known as the Cinema de Rodin, because everybody knew that the two sculptures were by Rodin.” Keeping the scale of the facade meant that the oddly shaped building would need to be constructed in two parts; as a result, the entryway atrium leads first to a courtyard garden, and then to a spiraling staircase that winds up into the dome.