February 18, 2010, 9:40 PM CT
Genes, environment, or chance?
Biologists attribute variations among individual organisms to differences in genes or environment, or both. But a newly released study of nematode worms with identical genes, raised in identical environments, has revealed another factor: chance.
It's another source of variation for researchers to consider. "Scientists have been exploring whether organisms evolve different ways to cope with genetic and environmental variation," said author Scott Rifkin, an assistant professor of biology at UC San Diego. "This study adds random variation to that mix."
Rifkin, who joined the UCSD faculty this fall, completed the study while working at MIT. The paper, co-authored with Arjun Raj, who contributed equally to the work, and with Erik Andersen and Alexander van Oudenaarden of MIT, appears in the February 18 issue of Nature.
Rifkin and colleagues looked at the development of the gut in C. elegans.
In some, but not all, worms with mutations in a gene called skn-1, the gut failed to develop, even though the embryos were genetically identical and incubated together.
"Often when people look at variation in a trait among organisms they try to trace it back to genetic differences or differences in environmental conditions or some combination of the two. In our study there were no such differences, and so we hypothesized that the only other source for the variation could be differences that arose at random during the process of development," Rifkin said.........
Posted by: Nora Read more Source
January 25, 2010, 8:08 AM CT
Supercomputers to explore nuclear energy
An elevation plot of the highest-energy neutron flux distribution from an axial slice of a nuclear reactor core is shown superimposed over the same slice of the underlying geometry. The figure shows the rapid spatial variation in the high-energy neutron distribution within each plate, along with the more slowly varying global distribution. The figure is significant because UNIC allows researchers to capture both of these effects simultaneously.
Ever wanted to see a nuclear reactor core in action? A new computer algorithm developed by scientists at the U.S. Department of Energy's (DOE) Argonne National Laboratory allows researchers to view nuclear fission in much finer detail than ever before.
A team of nuclear engineers and computer researchers at Argonne National Laboratory is developing the neutron transport code UNIC, which enables scientists for the first time to obtain a highly detailed description of a nuclear reactor core.
The code could prove crucial in the development of nuclear reactors that are safe, affordable and environmentally friendly. Modeling the complex geometry of a reactor core requires billions of spatial elements, hundreds of angles and thousands of energy groups, all of which lead to problem sizes with quadrillions of unknowns.
Such calculations exhaust computer memory of the largest machines, and therefore reactor modeling codes typically rely on various approximations. But approximations limit the predictive capability of computer simulations and leave considerable uncertainty in crucial reactor design and operational parameters.
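A quick back-of-the-envelope calculation shows how the unknowns multiply (the specific numbers below are round placeholders for the article's "billions", "hundreds" and "thousands", not UNIC's actual mesh sizes):

```python
# Count of unknowns in a fully explicit transport calculation: the solution
# is resolved per spatial element, per angle, per energy group.
spatial_elements = 2e9   # "billions of spatial elements"
angles = 500             # "hundreds of angles"
energy_groups = 2000     # "thousands of energy groups"

unknowns = spatial_elements * angles * energy_groups
print(f"unknowns: {unknowns:.1e}")   # on the order of 1e15: quadrillions

# At 8 bytes per double, storing a single copy of the solution vector alone:
petabytes = unknowns * 8 / 1e15
print(f"storage for one solution vector: {petabytes:.0f} PB")
```

Numbers like these make clear why such calculations exhaust the memory of even the largest machines, and why production codes have historically relied on averaging approximations instead.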
"The UNIC code is intended to reduce the uncertainties and biases in reactor design calculations by progressively replacing existing multilevel averaging techniques with more direct solution methods based on explicit reactor geometries," said Andrew Siegel, a computational scientist at Argonne and leader of Argonne's reactor simulation group.........
Posted by: Edna Read more Source
January 12, 2010, 8:53 AM CT
Faster and More Efficient Software for the Air Force
Dr. Myra Cohen and her team of researchers at the University of Nebraska-Lincoln have developed an algorithm and open-source tool that generates tests 300 times faster than its predecessor and reduces overall software testing time. (Credit: University of Nebraska)
Scientists at the University of Nebraska-Lincoln have addressed the issue of faulty software by developing an algorithm and open-source tool that generates tests 300 times faster than its predecessor and reduces current software testing time.
The new algorithm has potential to increase the efficiency of the software testing process across systems.
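The article doesn't name the algorithm, but research in this area commonly builds on combinatorial interaction testing, where a small suite is chosen so that every pair of option settings appears together in at least one test. The naive greedy sketch below (an assumed illustration with a made-up configuration model, not the Nebraska tool, whose algorithm is far faster) shows why such suites can be much smaller than exhaustive testing:

```python
from itertools import combinations, product

# Hypothetical configuration model: each factor has a few possible settings.
factors = {
    "os":      ["linux", "windows"],
    "db":      ["sqlite", "postgres", "mysql"],
    "browser": ["firefox", "chrome"],
    "cache":   ["on", "off"],
}
names = list(factors)

def all_pairs():
    """Every (factor, value) pairing that a pairwise suite must cover."""
    pairs = set()
    for f1, f2 in combinations(names, 2):
        for v1 in factors[f1]:
            for v2 in factors[f2]:
                pairs.add(((f1, v1), (f2, v2)))
    return pairs

def pairs_of(test):
    """The value pairs exercised by one concrete test."""
    return {((f1, test[f1]), (f2, test[f2])) for f1, f2 in combinations(names, 2)}

def greedy_pairwise_suite():
    """Repeatedly pick the candidate test covering the most uncovered pairs."""
    remaining = all_pairs()
    candidates = [dict(zip(names, vals)) for vals in product(*factors.values())]
    suite = []
    while remaining:
        best = max(candidates, key=lambda t: len(pairs_of(t) & remaining))
        remaining -= pairs_of(best)
        suite.append(best)
    return suite

suite = greedy_pairwise_suite()
print("exhaustive tests:", len(list(product(*factors.values()))))  # 24
print("pairwise tests:  ", len(suite))
```

Even this toy greedy search covers all pairwise interactions with far fewer tests than the 24 exhaustive combinations; the gap widens dramatically as the number of options grows.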
The project, funded in part by an Air Force Office of Scientific Research (AFOSR) Young Investigator Award and through a National Science Foundation Early CAREER Award, is of particular interest to the military because of the potential to reduce errors in theater. The technology will also be helpful to the private sector, where some agencies report financial losses of up to $50 billion per year because of poor software.
"Software failures have the potential to cause financial, environmental or bodily harm," said lead researcher, Dr. Myra Cohen. "Our techniques will help to improve the quality of software in the military to help ensure that those systems behave properly in the field".
"The ultimate goal of research like this is not just to reduce software testing costs, but to do so while maintaining or even increasing confidence in the tests themselves," said AFOSR Program Manager, Dr. David Luginbuhl who is overseeing Cohen's work.........
Posted by: John Read more Source
September 23, 2009, 7:17 AM CT
Exposing dangerous invisible pollution
Worried that dust from a nearby construction zone will harm your family's health? A new Tel Aviv University tool could either confirm your suspicions or, better yet, set your mind at rest.
Prof. Eyal Ben-Dor and his Ph.D. student Dr. Sandra Chudnovsky, of TAU's Department of Geography, have developed a sensor called "Dust Alert" - the first of its kind - to help families and authorities monitor the quality of the air they breathe. Like an ozone gas or carbon monoxide meter, it measures the concentration of small particles that may contaminate the air in your home. Scientific studies of "Dust Alert" appeared recently in the journal Science of the Total Environment and in Urban Air Pollution: Problems, Control Technologies and Management Practices.
"It works just like an ozone meter would," says Prof. Ben-Dor. "You put it in your home or office for three weeks, and it can give you real-time contamination levels in terms of dust, pollen and toxins." Functioning like a tiny chemistry lab, the device can precisely determine the chemical composition of the toxins, so homeowners, office managers and factories can act to improve air quality.
Using the measurements, Prof. Ben-Dor can sometimes find a quick remedy for a dusty or pollen-filled home. The solution could be as easy as keeping a window open, he says. "We've found through our ongoing research that some simple actions at home can have a profound effect on the quality of air we breathe."
Posted by: John Read more Source
September 15, 2009, 7:59 AM CT
Dual simulation improves crash performance
Damage to a component made out of high-strength steel after a crash test. (© Fraunhofer IWM)
Crash tests often produce startling results. A new simulation process which factors in deformation during production as well as preliminary damage can predict the results of a crash test more accurately than ever.
There are components that save lives: if a car rolls over during an accident, the 'B-pillar' plays a key role. It forms one of the connections between the floor and roof of the vehicle and is designed to prevent the passenger cell from deforming too much. The materials from which the B-pillar is manufactured therefore need to meet very exacting requirements: to save fuel they need to be ultra-lightweight, yet at the same time need to be tremendously strong and must not break. Yet what does the optimum component actually look like? With the aid of countless experiments, simulations and crash tests, the auto industry has been getting nearer to answering this question. Now Fraunhofer scientists are providing further impetus to development.
Engineers will commonly carry out a range of virtual tests. Known materials properties provide the basic knowledge in such a scenario. "We are well aware of the physical and mechanical characteristics of the materials in their original state," says Dr. Dong-Zhi Sun, group leader at the Fraunhofer Institute for Mechanics of Materials IWM.
Yet during the course of the manufacturing process, the components change: with a B-pillar, for instance, the material goes through a complicated manufacturing chain. As it is deformed and stretched, minor damage such as pore formation may occur. "If you're going to fit these kinds of parts into vehicles, you need to take into account their deformation history during manufacture," explains Sun.
That's why the scientists have developed a special method: "With our failure model, we can simulate manufacturing processes more effectively," explains Sun. "To ensure we understand the manufacturing processes inside out, we work together closely with automakers and materials producers."
Thanks to the simulation, the scientists can precisely model and analyze the deformation of the component during manufacture. So they know to what extent the process affects the properties of the end product, and whether the manufacturing process gives rise to potential preliminary damage such as pore formation and microcracks. The engineers combine the results of the process simulation with a crash simulation, which is conducted using a newly developed material model.
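The article doesn't give the failure model's equations, so the following is a purely hypothetical sketch (a simple linear damage rule with invented numbers, not Fraunhofer IWM's actual model). It captures the underlying idea: plastic strain during forming consumes part of the material's failure budget, leaving less capacity for the crash.

```python
# Toy damage-accumulation model: a scalar damage variable D grows with
# plastic strain, and the part fails when D reaches 1.
FAILURE_STRAIN = 0.40   # hypothetical strain at failure for virgin material

def damage(plastic_strain: float, d0: float = 0.0) -> float:
    """Accumulate damage linearly with plastic strain, starting from d0."""
    return min(1.0, d0 + plastic_strain / FAILURE_STRAIN)

# Forming the B-pillar pre-strains the sheet, leaving preliminary damage.
d_after_forming = damage(plastic_strain=0.12)

# Crash strain the formed part can still absorb before failure:
remaining_strain = (1.0 - d_after_forming) * FAILURE_STRAIN

print(f"damage after forming:  {d_after_forming:.2f}")
print(f"crash strain capacity: {remaining_strain:.2f} "
      f"(vs {FAILURE_STRAIN:.2f} if the forming history is ignored)")
```

A crash simulation that ignores the forming history would overestimate the part's strain capacity, which is exactly the kind of bias the combined process-plus-crash simulation is meant to remove.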
Posted by: John Read more Source
Tue, 23 Dec 2008 02:56:43 GMT
The Most Exciting Future Biophysics Tool
If you could wish for any capabilities in an instrument to help you with your research, what would they be? It might not be hard to come up with a useful super power that’s way out of reach of current or near-future technology, but what about something you might actually have in the next 10 or 20 years?
One of my interests is high resolution imaging, either by scanning probe or fluorescence microscopy, and I’ve seen and taken advantage of some great electron microscopy as well (although I haven’t done any myself). Each of these methods in their current most common form has advantages and disadvantages: scanning probe microscopies tend to be slow but offer high resolution with little sample preparation, fluorescence microscopy suffers from lower resolution but has pretty good acquisition rate and molecular specificity, and electron microscopy involves more complicated sample preparation that can distort the sample and only provides a snapshot, but it can provide truly exquisite images at a range of spatial scales.
These methods are all providing new insights into every area of cell biology and biophysics—fluorescence microscopy especially is now a staple of almost every lab in these fields—but it’s the ways that these methods are being pushed beyond their current limits that are truly exciting. New tools have always provided new insights, but I think cell biology is poised to be completely revolutionized in the next few decades.
Take atomic force microscopy. High resolution in water, but painfully slow. Wouldn’t it be nice if it were faster? It is. The animated gif on the right is an AFM movie taken at 12 frames per second in Toshio Ando’s lab at Kanazawa University in Japan. You’re seeing a single myosin molecule undergo a conformational change in real time. Single molecule fluorescence methods have provided a lot of insight into the mechanism of molecular motor motion (they walk) but there are still finer scales to investigate and high-speed AFM may prove to be the tool of choice in the very near future.
That’s very nice for in vitro work, but ultimately cells are where the action is. I want an instrument that will reduce the vast majority of cell biology to computer science. That will “only” require the convergence of three existing technologies: cryo-electron tomography, environmental scanning electron microscopy, and femtosecond electron diffraction. The ultimate fantasy of course is an atomic scale femtosecond movie of a living cell over hours. That would give you a complete genetic, proteomic, biophysical, and biochemical picture of cell function. You would still need interesting perturbations to ask questions, but all the answers would be provided by a single instrument and clever data mining. Even relaxing the goal by orders of magnitude in every direction to 10 nm spatial resolution and millisecond time resolution in a one minute movie would be radical.
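Even that relaxed goal implies daunting data volumes. A rough estimate (assuming a 10-micrometre cell, a typical size, and a single byte per voxel; both assumptions are mine, not from any instrument spec):

```python
# Data volume for the "relaxed" goal: a 10-micrometre cell imaged at
# 10 nm voxels, one frame per millisecond, for one minute.
cell_size_nm = 10_000          # assumed 10 um cell edge
voxel_nm = 10                  # target spatial resolution
frames = 60 * 1000             # 1 kHz frame rate for one minute

voxels_per_frame = (cell_size_nm // voxel_nm) ** 3   # 1000^3 voxels
total_voxels = voxels_per_frame * frames

print(f"voxels per frame: {voxels_per_frame:.1e}")
print(f"total voxels:     {total_voxels:.1e}")
print(f"at 1 byte/voxel:  {total_voxels / 1e12:.0f} TB per movie")
```

Tens of terabytes per one-minute movie, before any chemistry or dynamics is layered on top: the "clever data mining" would be doing a lot of the work.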
Sounds far-fetched, but don’t forget that we’ve already got Wolfgang Baumeister pushing cryo-electron tomography toward whole cells, and environmental scanning electron microscopes that work in water vapour. At a recent talk, Ahmed Zewail spoke about an ultrafast approach to electron diffraction and imaging. He showed a picture of a cell they took with it, and he says their goal is to do a time-resolved version of electron diffraction in a cell within a few years.
Maybe he wasn’t even exaggerating…
While on the topic of things that might be possible in the future, nanotech enthusiasts might also be interested in a newly launched nanotechnology blog.
Posted by: Andre Read more Source
Sun, 07 Dec 2008 17:10:34 GMT
Healthcare in Second Life
I just found a playlist on Youtube that is dedicated to healthcare in Second Life, the virtual world. Numerous videos about tools for medical education and sites for patient support.
Posted by: Bertalan Read more Source
Thu, 23 Oct 2008 04:13:43 GMT
I'm on vacation, so please accept my apologies for the brief entries. -- Daniel.
I'm not sure of the identity of this one, but I suspect Cladina rangiferina, or reindeer moss (though it's really a lichen). This was growing at ~850m (2800ft) in elevation. It was a common sight in the White Pass area, although I must admit it does look a bit different when a macro lens is used (see other images of Cladina spp.).
It also seems that all Cladina species are now lumped into Cladonia; the USDA PLANTS database still uses Cladina.
Posted by: Daniel Mosquin Read more Source
September 10, 2008, 9:01 PM CT
First beam for Large Hadron Collider
An international collaboration of researchers today sent the first beam of protons zooming at nearly the speed of light around the 17-mile-long underground circular path of the Large Hadron Collider (LHC), the world's most powerful particle accelerator, located at the CERN laboratory near Geneva, Switzerland.
The researchers also accelerated a second beam of protons through the path in the opposite direction, the goal being head-on collisions of protons that can offer clues to the origin of mass and new forces and particles in the universe. The second beam made one turn around the LHC.
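How near to the speed of light? A back-of-the-envelope calculation using special relativity (the figures below assume the LHC's design energy of 7 TeV per proton; the first circulating beams actually ran at the lower injection energy):

```python
import math

# Relativistic kinematics for a proton at the LHC's design energy.
E_TeV = 7.0                 # design beam energy per proton
m_p_GeV = 0.938272          # proton rest mass in GeV
c = 299_792_458             # speed of light, m/s
ring_km = 26.659            # LHC circumference (about 17 miles)

gamma = E_TeV * 1000 / m_p_GeV          # Lorentz factor E / (m c^2)
beta = math.sqrt(1 - 1 / gamma**2)      # v / c

turns_per_second = beta * c / (ring_km * 1000)

print(f"gamma = {gamma:.0f}")
print(f"v/c   = {beta:.9f}")
print(f"revolutions per second: {turns_per_second:.0f}")
```

At full energy each proton is within about three metres per second of light speed and laps the 17-mile ring more than eleven thousand times a second.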
Celebrations across the United States and around the world mark the LHC's first circulating beams, an occasion more than 15 years in the making. An estimated 10,000 people from 60 countries have helped design and build the accelerator and its massive particle detectors, including more than 1,700 scientists, engineers, students and technicians from 94 U.S. universities and laboratories supported by the U.S. Department of Energy Office of Science and the National Science Foundation.
UCR faculty Robert Clare, John Ellison, J. William Gary, Gail Hanson and Stephen Wimpenny, along with postdoctoral researchers and graduate students, are involved in the LHC's Compact Muon Solenoid (CMS) experiment, a large particle-capturing detector whose discoveries are expected to help answer questions such as: Are there undiscovered principles of nature? What is the origin of mass? Do extra dimensions exist? What is dark matter? How can we solve the mystery of dark energy? And how did the universe come to be?
Posted by: John Read more Source
September 10, 2008, 8:09 PM CT
UC Santa Barbara has key role in Large Hadron Collider project
Engineer Dean White holds one of the detectors assembled at UCSB.
(Santa Barbara, Calif.) -- Earlier today, some 300 feet below the Earth's surface, in a circular tunnel so extensive that it travels from Switzerland into France and back again, researchers at the Large Hadron Collider (LHC) at the European Organization for Nuclear Research (CERN) in Geneva fired the first beams of protons that they hope will eventually produce history-making science.
A contingent of more than 40 faculty members, graduate students, postdoctoral researchers, engineers, technicians, and undergraduates from UC Santa Barbara has worked for eight years to help construct the experimental apparatus. The UCSB group is part of an international effort that is now embarking on a 15-year quest to try to answer fundamental questions about the universe.
The startup of the LHC marked a milestone for the UCSB particle physics program. The group has played a key role in constructing one of the four major experiments now in place: the Compact Muon Solenoid (CMS), a complex array of instruments for detecting subatomic particles. The device weighs more than 12,000 tons and is as tall as a four-story building.
UCSB's team is led by four members of its experimental high-energy physics faculty. Professor Joseph Incandela has been in Switzerland for the past year, shepherding the CMS experiment as deputy physics coordinator. Shuttling back and forth between Santa Barbara and Switzerland have been professors Claudio Campagnari, Jeffrey Richman, and David Stuart. The faculty members are unanimous in their praise for CERN's monumental effort in building the LHC, the world's largest particle accelerator.
Posted by: John Read more Source