We host news of discoveries in various fields of science, with a focus on space, medical treatments, fringe science, microbiology, chemistry and physics, while providing commercial and cultural context and deeper insight.
Research into one of the biggest recent stem-cell “breakthroughs” has been withdrawn because of “critical errors”.
Scientists in Japan had claimed stem cells could be made cheaply, quickly and ethically just by dipping blood cells into acid.
They have now written a retraction that apologises for “multiple errors” in their report.
Nature, the journal that published the findings, is reviewing how it checks scientific papers.
Stem cells can become any other type of tissue and are already being investigated to heal the damage caused by a heart attack and to restore sight to the blind.
Researchers around the world described the acid-bath stem-cell findings as a “game changer,” “remarkable” and “a major scientific discovery”.
However, errors were rapidly discovered: parts had been lifted from earlier work and presented as though they were new research, and leading scientists were unable to produce stem cells using acid in their own laboratories.
An investigation by the Riken research institute in Japan found that scientist Dr Haruko Obokata had fabricated her work in an intentionally misleading fashion.
The retraction states: “These multiple errors impair the credibility of the study as a whole and we are unable to say without doubt whether the stimulus-triggered acquisition of pluripotency stem cells phenomenon is real.
“Ongoing studies are investigating this phenomenon afresh, but given the extensive nature of the errors currently found we consider it appropriate to retract both papers.”
The affair brings back memories of the false claims by world-renowned cloning scientist Hwang Woo-suk.
He claimed he had produced embryonic stem cells from cloned human embryos, but those findings were later found to be “intentionally fabricated”.
A Nature editorial stated that the public’s trust in science was at stake in the latest controversy.
It added: “Although editors and referees could not have detected the fatal faults in this work, the episode has further highlighted flaws in Nature’s procedures and in the procedures of institutions that publish with us.”
It did say, however, that a review was under way to tighten the checking of images used in papers.
The acid-bath stem cells research has not been completely discredited and research is continuing to see if stem cells can be produced using the method.
Chris Mason, a professor of regenerative medicine at University College London, originally said the results were “a very exciting, but surprise, finding” and added: “It looks a bit too good to be true.”
After the retraction, he told the BBC: “I’m surprised that Nature took so long when there was so much material showing problems with the papers. I don’t understand that.”
However, he said the system of peer review, in which fellow scientists critique papers before they are published, would struggle to pick up the problems in this research.
He said: “If you’re a reviewer you can only review the material you’re given. You have to take it on trust. You’re not a detective looking for fraud.
'Good day for science'
"If you have to act as a super-sleuth, that’s impossible for anyone to ever do."
He praised the way social media had uncovered and shared the errors, which could have otherwise taken years to unpick.
"I would argue this is not an embarrassing day for science, I think it’s a good day for science and it shows we work well to weed out inferior publications."
Dr Dusko Ilic, a senior lecturer in stem-cell science at King’s College London, said: “It is easy to be judgmental, and pointing fingers after all is over.
"Gaining knowledge is difficult. It requires both time and persistence, I hoped that Haruko Obokata would prove at the end all those naysayers wrong.
"Unfortunately, she did not. The technology, indeed, sounded too good to be true, though I still find fascinating how a 30-year-old scientist could pass scrutiny of her co-workers and multiple reviewers in Nature with a complete fabrication."
The UK Medical Research Council’s Prof Robin Lovell-Badge added: “The stem cell community has been expecting these retractions to come for a while.
"This story illustrates how the stem cell field can rapidly detect bad science and reject it.
"It also illustrates both the problems and benefits of hype, this was potentially important research because of the novelty of the claims in an important field, but it was hyped far beyond reality, by some of the authors and by their perhaps willing victims, the media."
As co-director of the Harvard Stem Cell Institute, David Scadden hopes to inspire his students to join the ranks of researchers who might one day cure Parkinson’s or Alzheimer’s or diabetes. But all too often these days, he is losing out to Wall Street, or other higher-paying pursuits.
“They are seeing their senior mentors spending more and more time writing grants and going hat in hand,” Scadden said, in a phone interview. “That’s not a good way to inspire the best and brightest.”
It is an empirical fact that there’s now far less money going toward research science than there used to be, due first and foremost to the decrease in government spending on such research. The budget of the National Institutes of Health is lower (in inflation-adjusted dollars) than at any point since 2000, and 22 percent lower than it was in 2003.
This has meant that the former star students who chose to spend their lives in a lab, working with stem cells or sequencing genomes (the kind of work that most experts believe will usher in the next great medical revolution), are more reliant than ever on a handful of wealthy Americans to fund basic research.
That source of funding, while vital, is unstable and relatively scarce.
“We’re going to lose a generation of young scientists, and that’s not something you can make up,” said Dr. Laurie Glimcher, Dean of Weill Cornell Medical College.
Traditionally, researchers rely on a three-legged stool for funding.
While one leg is made up of government grants, another is made up of industry, which pays scientists to research treatments that can be the next billion-dollar idea. But when it comes to discovery science, whose outcomes are inherently unpredictable and which is often conducted without targeting a specific disease, industry tends to be risk-averse.
That leaves philanthropy.
Fiona Murray, a professor of entrepreneurship at M.I.T., published a paper in 2012 finding that philanthropy, both private and corporate, provides almost 30 percent of the annual research funds for leading universities. She also found that while federal funds have been declining, philanthropic funds have been increasing.
“The role of science philanthropy—gifts from wealthy individuals, grants from private foundations to scientific research, and endowment income earmarked for research—is an underappreciated aspect of philanthropy in higher education whose importance becomes clear by examining trends in funding university research,” Murray wrote. “Industry contributions (usually regarded as the alternative funding stream for university research) amount to less than 6 percent of university research funding. In striking contrast, science philanthropy makes up almost 30 percent of university research funding and has been growing at almost 5 percent annually.”
It’s that funding that has made possible a series of breakthroughs that could have outsize clinical implications during the next few decades.
On the west side of Harlem, in an unadorned building, some of the most exciting research in medicine is taking place, and almost all of it is being paid for by private donors.
The New York Stem Cell Foundation is supported in part through the Druckenmiller Foundation. The stem cell foundation’s fellowship program is the largest dedicated stem cell fellowship program in the world, and its lab is one of only two in the country working successfully on a procedure known as somatic cell nuclear transfer.
Remember Dolly the sheep? It’s that kind of science, but a bit further along, and instead of cloning mammals, scientists work to create cells, organs or tissues that can replace diseased cells in the human body.
The federal government, for political and ethical reasons, won’t fund any of it.
Though President Obama reversed the Bush administration’s position on funding stem cell research, no new embryonic stem cell lines can be supported because of something called the Dickey-Wicker Amendment, which is renewed every year and prohibits federal funding for synthesizing new stem cell lines. (The Obama administration allows N.I.H. funding for research on new lines that were created with private dollars.)
“It’s really an illusion that the government has both feet in,” said Susan Solomon, C.E.O. of the New York Stem Cell Foundation. “Without philanthropy we would not have a single stem cell research program in this country.”
This past April, a team of scientists from the foundation created the first disease-specific embryonic stem cell line with two sets of chromosomes.
That means researchers were able to create patient-specific stem cells from an adult human with type 1 diabetes that can give rise to the cells lost in the disease, according to Dr. Dieter Egli, who led the research and conducted many of the experiments.
The stem cell experiments began at Harvard and the skin biopsies were done at Columbia. But isolation of the cell nuclei from these skin biopsies couldn’t be conducted in the federally funded laboratories at Columbia, and Harvard scientists had to stop research in 2008 because restrictions in Massachusetts prevented them from obtaining human eggs for research.
Stem cell science is one of the most conspicuous areas of research in which philanthropy is being asked to pick up the slack left by receding public investment, and in this field donors have indeed stepped in to make up for the lack of public money.
But overall, money for research is way down, reflecting a general aversion by both the public and private sectors to fund medical-science projects without likely short-term rewards.
“As resources have shrunk, the ability to tolerate risk on bold ideas of course decreases,” Scadden said. “There is an increased attention to ‘what’s the payoff, what’s the return on investment? Can you show me a direct link to the way my constituents benefit?’”
The problem is that science, generally speaking, doesn’t work like that. Telomeres, the tips of chromosomes that protect DNA during cell replication, were discovered in the 1930s. No one knew what they did or if they were of practical importance. Today, scientists think they might hold the key to battling tumor development.
“Today’s medical miracles are yesterday’s wild ideas in a basic laboratory,” Scadden said.
Donations to stem cell work aren’t much different in that respect from the $12.5 million donation by former Microsoft C.T.O. Nathan Myhrvold for a telescope that will search for extraterrestrial life. Scientists haven’t found any yet, and may never, but that shouldn’t be the point.
“While it’s impossible to predict exactly what we will find with a new scientific instrument, we should remember that interesting science is not just about the likelihood of end results—it is also about the serendipity that occurs along the way,” Myhrvold said in 2000 when donating the money.
Last year, the American Association for the Advancement of Science began a coalition of funders that aims to double philanthropic support for basic science over the next five years.
“The concern of this group is that there is such a big push on translational science at the expense of discovery science, which is essentially feed-corn,” said Vicki Chandler, chief program officer for science at the Gordon and Betty Moore Foundation in California, one of several foundations that joined the coalition. “And if we keep heading [toward] that balance there may not be as many great things to translate from in the next decade.”
Despite the changing proportions of money for research, the United States government remains the principal lifeline for science. The N.I.H. budget is just under $30 billion, much of which is invested in research grants. Private funding for basic research, the kind that doesn’t attach itself to a specific disease or therapy, is only somewhere between $2 billion and $4 billion.
The question is how far or fast the balance between public and private funding is shifting, and where it will end.
The more it shifts, the more research scientists are coming to rely on a select few donors, who can drive the agenda.
T. Denny Sanford, for example, donated $100 million last year to the creation of the Sanford Stem Cell Clinical Center at the University of California, San Diego.
“I believe we’re on the cusp of turning years of hard-earned knowledge into actual treatments for real people in need,” Sanford said last November. “I want this gift to push that reality faster and farther.”
In 2012, Mort Zuckerman pledged $200 million for the Mortimer B. Zuckerman Mind Brain Behavior Institute at Columbia University.
The Ansary Stem Cell Institute at Weill Cornell exists because of a $15 million donation from Shahla Ansary and her husband Hushang Ansary, a major Republican donor and former Iranian diplomat.
But Murray, the M.I.T. professor, found that these examples are more the exception than the rule. Her research found “little support” for the notion that philanthropists fill gaps left by federal funding.
“In addition,” she wrote, “few philanthropists appear to seek to identify such gaps. This fact is underscored by one key fact about philanthropy: philanthropists are more concentrated in their giving to specific (translational) fields than the government, suggesting that with few exceptions … patrons add support to already well-funded wealthy fields instead of filling gaps.”
That’s frustrating, Glimcher said, because scientific research seems so close to potentially important breakthroughs on so many medically significant fronts.
Given new tools for genetic sequencing and a better understanding of chemistry, basic discoveries can yield clinical trials faster than ever before. But not if that basic science isn’t funded.
“It’s frustrating right now because we are at a time when we can translate basic discoveries into new therapies for patients,” Glimcher said. “Philanthropic research can be a temporary stop-gap and a wonderful addition to reduce the sting of cuts in government funding, and it can top off government funding, but it’s never going to replace government.”
NASA prepares to drag an asteroid into orbit around the moon.
What is the goal for the Asteroid Redirect Mission?
Through the Asteroid Redirect Mission, NASA will identify, capture and redirect an asteroid to a stable orbit around the moon, where astronauts will explore it in the 2020s and return with samples. The mission is an important early step in learning to operate with less dependence on Earth, as human explorers of Mars will need to do. It will be an unprecedented technological feat that will lead to new scientific discoveries and technological capabilities, while helping us learn to protect our home planet. The overall objectives of the Asteroid Redirect Mission are:
• Conduct a human exploration mission to an asteroid in the mid-2020s, providing systems and operational experience required for human exploration of Mars.
• Demonstrate an advanced solar electric propulsion system, enabling future deep-space human and robotic exploration with applicability to the nation’s public and private sector space needs.
• Enhance detection, tracking and characterization of Near Earth Asteroids, enabling an overall strategy to defend our home planet.
• Demonstrate basic planetary defense techniques that will inform impact threat mitigation strategies to defend our home planet.
• Pursue a target of opportunity that benefits scientific and partnership interests, expanding our knowledge of small celestial bodies and enabling the mining of asteroid resources for commercial and exploration needs.
What are the requirements for the asteroid NASA hopes to capture?
NASA is working on two concepts for the mission: the first is to fully capture a very small asteroid in open space, and the second is to collect a boulder-sized sample off of a much larger asteroid. Both concepts would require redirecting an asteroid mass less than 32 feet (10 meters) in size into the moon’s orbit.
NASA’s search for candidate asteroids for ARM is a component of the agency’s existing efforts to identify all Near-Earth Objects (NEOs) that could pose a threat to the Earth. More than 11,140 NEOs have been discovered as of June 9. Approximately 1,483 of those have been classified as potentially hazardous. Some of these NEOs become potential candidates for ARM because they are in orbits very similar to Earth’s and come very close to the Earth-Moon system in the early 2020s, a prerequisite for redirecting the captured asteroid mass into lunar orbit.
To date, nine asteroids have been identified as potential candidates for the ARM full capture option, having favorable orbits and estimated to be within the right size range. Sizes have been established for three of the nine candidates. Another asteroid — 2008 HU4 — will pass close enough to Earth in 2016 for interplanetary radar to determine some of its characteristics, such as size, shape and rotation. The other five will not get close enough to be observed again before the final mission selection, but NASA’s NEO Program is finding a few additional potential candidate asteroids every year. One or two of these get close enough to Earth each year to be well characterized.
Boulders have been directly imaged on all larger asteroids visited by spacecraft so far, such as Itokawa by the Japanese Hayabusa mission, making retrieval of a large boulder a viable concept for ARM. During the next few years, NASA expects to add several candidates for this option, including asteroid Bennu, which will be imaged up close by the agency’s Origins-Spectral Interpretation-Resource Identification-Security-Regolith Explorer (OSIRIS-REx) mission in 2018. High resolution interplanetary radar is also able to image the surfaces of asteroids that pass close to the Earth and infer the presence of large boulders.
Where will the asteroid be redirected? Reports suggest an orbit around the moon.
After an asteroid mass is captured, the spacecraft will redirect it to a stable orbit around the moon called a “Lunar Distant Retrograde Orbit.” Astronauts aboard an Orion spacecraft, launched from a Space Launch System (SLS) rocket, will explore the asteroid there in the mid-2020s. Learning to maneuver large objects utilizing the gravity fields of the Earth and moon will be valuable capabilities for future activities. Potentially, either mission concept might test technology and techniques that can be applied to planetary defense if needed in the future.
How will ARM fit NASA’s goal to visit Mars?
The mission provides experience in human spaceflight beyond low-Earth orbit, building on our experiences on the International Space Station, and testing new systems and capabilities in the proving ground of cis-lunar space, toward the ultimate goal of a crewed mission to Mars. ARM leverages and integrates existing programs in NASA’s Science, Space Technology, and Human Exploration and Operations directorates to provide an affordable and compelling opportunity to exercise our emerging deep-space exploration capabilities on the path to Mars. ARM will test the transport of large objects using advanced high-power, long-life solar electric propulsion; automated rendezvous and docking; deep-space navigation; integrated robotic and crewed vehicle stack operations in the deep-space environment; and spacewalks out of Orion that will be needed for future cis-lunar space and Mars missions.
NASA’s strategy is that the ARM SEP module and spacecraft bus would be upgradable for the first cargo missions to Mars and its moons. We might do so by procuring these systems commercially to lower cost and for reproducibility. Another option is to repurpose the ARM vehicle after its first mission, as a lowest cost option for transportation. These are among the options being studied this year.
Just as we need agreed values to measure everything from temperature to time, astronomers have now developed a new stellar scale as a “ruler” to help them classify and compare data on star discoveries.
Previously, as with the longitude problem of fixing locations on Earth 300 years earlier, there was no unified system of reference for calibrating the heavens.
The astronomers selected 34 initial ‘benchmark’ stars to represent the different kinds of stellar populations in our galaxy, such as hot stars, cold stars, red giants and dwarfs, as well as stars covering the different chemical patterns, or “metallicity”, in their spectra, as this is the “cosmic clock” which allows astronomers to read a star’s age.
This detailed range of information on the 34 stars forms the first value set for measuring the millions of stars that the Gaia satellite, an unmanned space observatory of the European Space Agency, aims to catalogue.
Many of the benchmark stars can be seen with the human eye, and have been studied for most of human history — dating to the very first astronomical records from ancient Babylon.
"We took stars which had been measured a lot so the parameters are very well-known, but needed to be brought to the same scale for the new benchmark - essentially, using the stars we know most about to help measure the stars we know nothing about," said Paula Jofre from Institute of Astronomy at Britain’s University of Cambridge.
"This is the first attempt to cover a wide range of stellar classifications, and do everything from the beginning - methodically and homogeneously,” Jofre added.
NASA’s Curiosity rover has now been exploring the Red Planet for a full Martian year.
Curiosity wraps up its 687th day on Mars today (June 24), NASA officials said, meaning the 1-ton robot has completed one lap around the sun on the Red Planet. (While Earth orbits the sun once every 365 days, Mars is farther away and thus takes considerably longer to do so.)
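The 687-day figure can be checked with Kepler's third law: for a body orbiting the sun, the orbital period in years equals the semi-major axis in astronomical units raised to the 3/2 power. A quick back-of-the-envelope sketch (the only input is Mars' mean distance from the sun, roughly 1.524 AU):

```python
# Kepler's third law for solar orbits: T [years] = a [AU] ** 1.5
mars_semi_major_axis_au = 1.524   # Mars' mean distance from the sun

period_years = mars_semi_major_axis_au ** 1.5
period_earth_days = period_years * 365.25  # convert to Earth days

print(round(period_earth_days))  # ~687 Earth days in one Martian year
```

The result matches the mission milestone: about 1.88 Earth years, or 687 Earth days, per Martian year.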
Curiosity touched down on the night of Aug. 5, 2012, kicking off a mission to determine if Mars has ever been capable of supporting microbial life. The six-wheeled rover quickly delivered, finding that an area near its landing site called Yellowknife Bay was indeed habitable billions of years ago.
The $2.5-billion mission, known officially as the Mars Science Laboratory (MSL), has made other important discoveries during its time on the Martian surface, too. For example, Curiosity’s measurements of radiation levels — made during its eight-month cruise through space and while on the planet’s surface — suggest that the risk of radiation exposure is not a “showstopper” for manned Mars missions. The rover’s data should help researchers design the shielding astronauts will require on such missions, NASA officials said.
Curiosity has also scanned Mars’ air for methane, a gas that here on Earth is predominantly produced by living organisms. The rover’s instruments have found no traces of the gas, in contrast to some previous observations made by Red Planet orbiters.
Curiosity left Yellowknife Bay last July and is now on the way to the base of Mount Sharp, which rises more than 3 miles (5 kilometers) into the sky from the center of Mars’ Gale Crater. The huge mountain has long been Curiosity’s ultimate science destination; mission scientists want the rover to climb up Mount Sharp’s foothills, reading a history of the planet’s changing environmental conditions along the way.
Unexpected damage to Curiosity’s metal wheels has slowed progress toward Mount Sharp a bit, forcing the mission team to rethink and revise its driving plans. The rover has made it about halfway to the mountain’s base, with about 2.4 miles (3.9 km) left to cover, NASA officials said.
"Over the next few months, the science team is really excited to get to Mount Sharp, where we think the layered rocks there have captured the major climate changes in Mars’ history," Curiosity deputy project scientist Ashwin Vasavada said in a new NASA video marking the rover’s first Martian year. "We can’t wait to get there and figure it all out, but it’s going to take a lot of driving."
South African scientists contributed significantly towards the knowledge base that helped an international experiment make a breakthrough in proving a particle discovered in July 2012 is a type of Higgs boson, a finding that could be the most substantial physics discovery of our time.
The Higgs particle is the missing piece of the Standard Model of Physics, a set of rules that outline the fundamental building blocks of the universe, such as protons, electrons and atoms. Finding it starts a new era for science, because scientists will be able to probe previously uninvestigated parts of the universe.
The European Organisation for Nuclear Research (CERN) yesterday said the CMS experiment at the Large Hadron Collider (LHC) had found new results on an important property of the Higgs particle. The discovery of the elusive particle was announced almost two years ago.
Bruce Mellado, an associate professor at the University of the Witwatersrand’s School of Physics, says the finding is “certainly an important milestone in determining that what we discovered is a Higgs boson”. He notes the ATLAS experiment, in which SA is involved, has reported a similar result.
Locally, about 70 South Africans are involved in the global project and, while the team is small in comparison to those from other countries, there are substantial benefits coming out of its involvement. Four universities are participating in the programme: Wits, University of Cape Town, the University of Johannesburg, and the University of KwaZulu-Natal.
As a result, says Mellado, SA has contributed “significantly” towards the knowledge base that paved the way for yesterday’s announcement. The Higgs boson gives matter mass and holds the physical fabric of the universe together.
The particle is named after Peter Higgs, who, in the 1960s, was one of six authors who theorised about the existence of the particle. It is commonly called the “God Particle”, after the title of Nobel physicist Leon Lederman’s “The God Particle: If the Universe Is the Answer, What Is the Question?” (1993), according to Wikipedia.
Yesterday’s announcement, hailed as a major breakthrough, is the result of work done at the LHC, the £2.6 billion “Big Bang” particle accelerator at the centre of the hunt for the Higgs boson. The LHC has been dubbed the world’s largest experiment and is housed at CERN.
The LHC is the largest scientific instrument ever built. It lies in an underground tunnel with a circumference of 27km that straddles the French-Swiss border, near Geneva, and has been heralded as the most important new physics discovery machine of all time.
"With our ongoing analyses, we are really starting to understand the mechanism in depth," says CMS spokesperson Tiziano Camporesi. "So far, it is behaving exactly as predicted by theory."
The LHC was offline for maintenance and upgrading during the last 18 months, and preparations are now under way for it to restart early in 2015 for its second three-year run. The experiment will run until 2030 and will be upgraded to 10 times its initial design specification, with the ability to collect 100 times more data.
"Much work has been carried out on the LHC over the last 18 months or so, and it’s effectively a new machine, poised to set us on the path to new discoveries," says CERN Director General Rolf Heuer.
A study from a project co-chaired by former 1st District congressman Doug Bereuter says climate change threatens to undermine not only how much food can be grown but also the quality of that food, as altered weather patterns lead to a less desirable harvest.
Crops grown by many of the nation’s farmers have a lower nutritional content than they once did, according to the report by the Chicago Council on Global Affairs.
(File photo, Feb. 7, 2014: the cracked-dry bed of the Almaden Reservoir in San Jose, Calif., where the state is suffering one of its worst droughts.)
Research indicates that higher carbon dioxide levels in the atmosphere have reduced the protein content in wheat, for example. And the International Rice Research Institute has warned that the quality of rice available to consumers will decline as temperatures rise, the report noted.
The council has been examining the effects of climate change on food for several months as part of a project co-chaired by former Agriculture Secretary Dan Glickman and former Rep. Doug Bereuter, R-Neb., president emeritus of the Asia Foundation.
Others on the advisory group for the project include prominent agribusiness leaders, such as Jose Luis Prado, president of Quaker Foods North America, and Paul E. Schickler, president of DuPont Pioneer; scientists and academic leaders; former Kansas Gov. John Carlin, now chair of the Kansas Bioscience Authority; and Howard Buffett, a Nebraska farmer and grandson of Berkshire Hathaway CEO Warren Buffett.
The U.S. should embrace research into animal biology and plant management with the kind of enthusiasm it did space exploration in the 1960s, the council said, warning that the consequences of inaction could be severe.
“History has shown that with adequate resources and support, agriculture can meet growing production demands and adapt to some changes in climate,” Bereuter said in a news release. “But greater emphasis on adaptation must begin now.”
The report, titled “Advancing Global Food Security in the Face of a Changing Climate,” was released Thursday at the council’s Global Food Security Symposium 2014 in Washington, where 500 policymakers and scientists were gathered.
“Adaptation must begin now,” the report said. “Developing the necessary scientific breakthroughs and broadly disseminating them will require years, even decades of lead time.”
Climate change initially will produce both winners and losers when it comes to food production, the report said, but research has indicated that growing regions everywhere will eventually suffer from global warming.
The report calls on the U.S. government to integrate climate change adaptation into its global food security strategy. Recommendations include:
* Passing legislation for a long-term global food and nutrition security strategy.
* Increasing spending for agricultural research on climate change adaptation.
* Collecting better data and making information on weather more widely available to farmers. There are significant global data gaps right now on weather, water, crop performance, land use and consumer preferences.
* Increasing spending for partnerships between U.S. universities and those in low-income countries.
* Urging that food security be addressed through the United Nations Framework Convention on Climate Change and the Post-2015 Sustainable Development Goals.
Another conclusion, closer to the soil: Plant and animal germplasm preservation for domesticated and wild species needs to be a priority.
“As temperatures rise, rainfall patterns change and variability increases, farmers will need to figure out what their new normal might become, and, in fact, whether change is the new normal,” the report concluded.
What if your cell phone didn’t come with a battery? Imagine, instead, if the material from which your phone was built was a battery.
The promise of strong load-bearing materials that can also work as batteries represents something of a holy grail for engineers. And in a letter published online in Nano Letters last week, a team of researchers from Vanderbilt University describes what it says is a breakthrough in turning that dream into an electrocharged reality.
The researchers etched nanopores into silicon layers, which were infused with a polyethylene oxide-ionic liquid composite and coated with an atomically thin layer of carbon. In doing so, they created small but strong supercapacitor battery systems, which stored electricity in a solid electrolyte, instead of using corrosive chemical liquids found in traditional batteries.
These supercapacitors could store and release about 98 percent of the energy that was used to charge them, and they held onto their charges even as they were squashed and stretched at pressures up to 44 pounds per square inch. Small pieces of them were even strong enough to hang a laptop from—a big, fat Dell, no less.
Although the supercapacitors resemble small charcoal wafers, they could theoretically be molded into just about any shape, including a cell phone’s casing or the chassis of a sedan.
They could also be charged, and discharged, in less time than traditional batteries.
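The reported ~98 percent round-trip efficiency can be put in concrete terms with the standard capacitor energy formula, E = ½CV². The sketch below is a back-of-the-envelope illustration only; the capacitance and voltage values are hypothetical and not taken from the Vanderbilt paper.

```python
# Illustrative supercapacitor energy figures (hypothetical values,
# not from the Nano Letters paper).

def stored_energy_joules(capacitance_f: float, voltage_v: float) -> float:
    """Energy stored in a capacitor: E = 1/2 * C * V^2."""
    return 0.5 * capacitance_f * voltage_v ** 2

def recovered_energy_joules(input_energy_j: float, efficiency: float = 0.98) -> float:
    """Energy recoverable at the ~98% round-trip efficiency the article cites."""
    return input_energy_j * efficiency

# Example: a 100 F cell charged to 2.7 V (typical supercapacitor voltage).
e_in = stored_energy_joules(capacitance_f=100.0, voltage_v=2.7)
e_out = recovered_energy_joules(e_in)
print(e_in, e_out)
```

At these assumed values the cell stores 364.5 joules and gives back about 357 of them, which is what makes supercapacitors attractive next to chemical batteries, whose round-trip losses are typically larger.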
“We’ve demonstrated, for the first time, the simple proof-of-concept that this can be done,” says Cary Pint, an assistant professor in the university’s mechanical engineering department and one of the authors of the new paper. “Now we can extend this to all kinds of different materials systems to make practical composites with materials specifically tailored to a host of different types of applications. We see this as being just the tip of a very massive iceberg.”
Pint says potential applications for such materials would go well beyond “neat tech gadgets,” eventually becoming a “transformational technology” in everything from rocket ships to sedans to home building materials.
“These types of systems could range in size from electric powered aircraft all the way down to little tiny flying robots, where adding an extra on-board battery inhibits the potential capability of the system,” Pint says.
And they could help the world manage the intermittency of renewable energy on power grids, where powerful batteries are needed to help keep the lights on when the sun is down or the wind is not blowing.
“Using the materials that make up a home as the native platform for energy storage to complement intermittent resources could also open the door to improve the prospects for solar energy on the U.S. grid,” Pint says. “I personally believe that these types of multifunctional materials are critical to a sustainable electric grid system that integrates solar energy as a key power source.”
An international team of scientists has made a major step forward in our understanding of how enzymes ‘edit’ genes, paving the way for correcting genetic diseases in patients.
Researchers at the Universities of Bristol, Münster and the Lithuanian Institute of Biotechnology have observed the process by which a class of enzymes called CRISPR – pronounced ‘crisper’ – bind and alter the structure of DNA.
The results, published in the Proceedings of the National Academy of Sciences (PNAS) today, provide a vital piece of the puzzle if these genome editing tools are ultimately going to be used to correct genetic diseases in humans.
CRISPR enzymes were first discovered in bacteria in the 1980s as an immune defence used by bacteria against invading viruses. Scientists have more recently shown that one type of CRISPR enzyme – Cas9 – can be used to edit the human genome - the complete set of genetic information for humans.
These enzymes have been tailored to accurately target a single combination of letters within the three billion base pairs of the DNA molecule. This is the equivalent of correcting a single misspelt word in a 23-volume encyclopaedia.
To find this needle in a haystack, CRISPR enzymes use a molecule of RNA - a nucleic acid similar in structure to DNA. The targeting process requires the CRISPR enzymes to pull apart the DNA strands and insert the RNA to form a sequence-specific structure called an ‘R-loop’.
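The "needle in a haystack" problem is easy to make concrete: a 20-base guide sequence has roughly 4²⁰ (about a trillion) possible spellings, so a given 20-base target is very likely to occur only once even in a genome of billions of bases. The toy sketch below, with an entirely hypothetical randomly generated "genome", illustrates that scale of uniqueness; it is not how CRISPR enzymes physically search DNA.

```python
import random

# Hypothetical toy example: locating one 20-letter target sequence
# in a long random DNA string.
random.seed(0)
genome = "".join(random.choice("ACGT") for _ in range(1_000_000))

# Pick a known 20-base site to act as the guide-RNA target.
guide_target = genome[123_456:123_476]

# Collect every position where the target occurs.
hits = []
start = genome.find(guide_target)
while start != -1:
    hits.append(start)
    start = genome.find(guide_target, start + 1)

print(hits)  # a 20-base sequence is almost always unique, even in a megabase
```

Scaling the same arithmetic up to the three billion bases of the human genome is why a well-chosen 20-base guide can, in principle, single out one site, and why off-target matches at similar sequences remain the key engineering concern the researchers describe.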
The global team tested the R-loop model using specially modified microscopes in which single DNA molecules are stretched in a magnetic field. By altering the twisting force on the DNA, the researchers could directly monitor R-loop formation events by individual CRISPR enzymes.
This allowed them to reveal previously hidden steps in the process and to probe the influence of the sequence of DNA bases.
Professor Mark Szczelkun, from Bristol University’s School of Biochemistry, said: “An important challenge in exploiting these exciting genome editing tools is ensuring that only one specific location in a genome is targeted.
"Our single molecule assays have led to a greater understanding of the influence of DNA sequence on R-loop formation. In the future this will help in the rational re-engineering of CRISPR enzymes to increase their accuracy and minimise off-target effects. This will be vital if we are to ultimately apply these tools to correct genetic diseases in patients. "
Blood Test Has Potential to Predict Alzheimer’s