Biomass energy calculator developed

By , March 4, 2013 7:30 pm

The U.S. Department of Energy’s National Renewable Energy Laboratory (NREL) has developed a new biomass analysis tool that estimates the energy that can be recovered from biomass products. The approach is based on pyrolysis of the biomass, which provides information on the energy content of the fuel.
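For readers who want a back-of-the-envelope feel for what such a calculator reports, here is a minimal sketch using the published Channiwala-Parikh correlation to estimate higher heating value from elemental composition. To be clear, this is not the NREL tool's pyrolysis-based method, just a common correlation, and the example composition is merely representative:

```python
# Minimal sketch: estimate the higher heating value (HHV) of dry biomass
# from its ultimate analysis using the Channiwala-Parikh correlation.
# This is NOT the NREL tool's pyrolysis-based method; it is a common
# published correlation, shown here only to give a feel for the numbers.

def hhv_mj_per_kg(c, h, o, n, s, ash):
    """Estimate HHV in MJ/kg (dry basis); inputs are wt% of each component."""
    return (0.3491 * c + 1.1783 * h + 0.1005 * s
            - 0.1034 * o - 0.0151 * n - 0.0211 * ash)

# Illustrative composition, roughly typical of a lignocellulosic feedstock
print(round(hhv_mj_per_kg(c=43.7, h=5.6, o=43.3, n=0.6, s=0.1, ash=6.7), 1))
# -> about 17.2 MJ/kg, in the usual range for herbaceous biomass
```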

The approach is interesting, but it focuses on the lowest-value use of the fuel content rather than on bioprocess conversion capabilities to generate transportation fuels. Certainly it is good to be able to look at this worst-case scenario as a baseline. It would be ideal if we progressed to the point where such a calculator could address fermentability based on lignin content or similar.

For more information see:

http://www.oemoffhighway.com/blog/10887343/speeding-up-the-biomass-analysis-process

http://www.nrel.gov/news/features/feature_detail.cfm/feature_id=2127

New Year’s musings

By , January 1, 2013 6:38 pm

It is the start of a new year and time for some reflection. My family and I have had a large change with a move from Arizona to Nebraska. I have a similar role to the one I held as department head, but now with a much larger department. The move has been good, but I am somewhat surprised by the differences in culture between two fairly similar institutions. Nebraska is much more dominated by agriculture, which is no surprise, but this influence reaches much further into the university than I had anticipated.

In both places people are highly supportive, but the infrastructure and resources are quite different. The use of digital technologies in Nebraska is much greater, probably because a more predictable stream of funding makes it possible to keep abreast of technology improvements and thus operate more efficiently administratively.

Wherever one goes, people are essentially the same and their motivations are not greatly different. In academic institutions we want to develop innovative research projects that truly create change, we teach students with the aim of filling the engineering job pipeline, and we work with our clients to help them solve regionally important problems.

A key challenge we face in the academic community is how to become more efficient in our educational programs, especially so that we can reach a broad student base and train them in practical skills. The movement to online courses allows us to reach more students, but we lose much of the connection between content experts and learners. We have lost many of the hands-on activities and much practical know-how. This is to be expected given our increasing reliance on information, which comes at the cost of experience in getting things done outside the digital world. We must find ways to swing back to a greater balance. Failure to do so will lead to a decline in the true value of a college education. I am hopeful that we will be able to make good strides in the coming year.

The Morrill Act of 1862 applied in 2012

By , August 28, 2012 4:17 pm

This year we celebrate the 150th anniversary of the Morrill Act, enacted by the US Senate and House of Representatives and signed into law by President Lincoln. The act set aside federal lands for the states to use or sell to fund the development of what have been termed land-grant institutions, initially one per state. Examples include Texas A&M, University of Arizona, Michigan State University, Purdue University, Rutgers University, and University of Nebraska (note that there is no consistency in the use of the word “state” in institution titles).

The goal of the act was for the “endowment, support, and maintenance of at least one college where the leading object shall be, without excluding other scientific and classical studies, and including military tactics, to teach such branches of learning as are related to agriculture and the mechanic arts, in such manner as the legislatures of the States may respectively prescribe, in order to promote the liberal and practical education of the industrial classes in the several pursuits and professions in life.”

Since this act was written and enacted so long ago, it is certainly fair to ask whether the land-grant mission still applies today. Many have written on this question. A land-grant university exists to educate the people of its state and to solve problems through academic endeavors, research, and extension programs.

I see an important part of this land-grant mission as an emphasis on (though not a restriction to) educating the masses in practical skills, especially toward the pursuit of a profession, which is not necessarily the goal in the liberal arts and performing arts. Discussions of the meaning of the Morrill Act today often relocate the word “arts” from its descriptor “mechanic” and place it next to “liberal.” While the liberal arts are certainly an important part of education, they do not appear to be the foundation upon which land-grant institutions were developed. They were often the focus of the higher-education institutions of the time, which were not available to the general public due to a myriad of factors.

Today land-grant institutions see their mission as driven by a need to be responsive to local needs (in their state and region), to make education as accessible to the public as possible, and to foster the use of science-based methods for conducting business. Faculty, staff, and students at such institutions need to keep in mind our tradition of reaching out to our communities, being productive agents of change within them, and using fact-based evidence in the conduct of our business.

Is science cool again?

By , August 27, 2012 5:58 pm

This question is asked by Adam Ruben in the August 24th, 2012 issue of Science. The question has been raised in response to a number of successful, high-profile science events: the Mars rover Curiosity, the discovery of the “God particle” (the Higgs boson) at CERN, and so on.

Adam comes to the conclusion that no, science is not cool, and that he doesn’t want it to be cool, since that implies it could be a fad like the Macarena (shudder), one that could lose favor at the whim of the public. He writes that science feels cool again because we haven’t tried to force the issue. The public saw some really difficult and important things actually work rather than fail catastrophically.

I think a key point here is that we see the human side of science and realize that maybe someday I (or my kids) could do some pretty neat things that impact the world in a positive way. When we present science without the faces of the scientists, without their appearance as real people, science feels like an untouchable act done by others. We, the public, invariably want to see that the people doing neat things are “just like us,” or at least people with whom we could see ourselves sharing a meal or beverage.

But another part of cool is the unattainable quality that comes from effortless cool. I’m thinking of Fonzie on the old Happy Days show as the epitome of cool, at least to individuals now in their 40s. Good science, science that is creative and changes the culture, doesn’t come effortlessly, but only through hard work, trial and (lots of) error, and persistence. That sounds much more like professional athletics, which perhaps the science community should use as a model. Imagine the possible endorsements: the latest high-end sneaker designed for standing next to the lab bench; nutrition bars that provide intelligence, creativity, and stamina (steroid-free, for certain); and of course, high-performance pocket protectors for the ultimate in lab coat security.

Creating pictures with molecules of life

By , June 18, 2012 10:39 pm

A recent report in Nature (DOI: 10.1038/nature11075) presents work by researchers at Harvard’s Wyss Institute for Biologically Inspired Engineering, who produced structures from 42-base strands of nucleic acid. Each strand represents one pixel: it associates with complementary strands to form building blocks, or tiles, that self-assemble into larger structures. The resemblance to rapid prototyping is striking, but on a more fundamental level the structures look like a child’s set of building blocks.
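To make the one-strand-per-pixel bookkeeping concrete, here is a toy sketch; the strand identifiers are hypothetical placeholders, and none of the real sequence design, complementarity checking, or assembly thermodynamics is modeled:

```python
# Toy sketch of the "molecular canvas" idea: every pixel position on the
# canvas corresponds to one unique strand (represented here only by an ID),
# and a shape is drawn by including only the strands for the "on" pixels.

def strands_for_image(image):
    """Given a 2D list of 0/1 pixels, return the strand IDs to include."""
    return [f"strand_{row}_{col}"          # hypothetical identifier
            for row, pixels in enumerate(image)
            for col, on in enumerate(pixels)
            if on]

# A 4x5 "smiley" drawn on the canvas
smiley = [
    [0, 1, 0, 1, 0],
    [0, 0, 0, 0, 0],
    [1, 0, 0, 0, 1],
    [0, 1, 1, 1, 0],
]
print(strands_for_image(smiley))
```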

Being able to control macroscopic structure by altering nucleic acid base sequence is not a new idea and has been under investigation since at least 2004 (Shih et al., Nature 2004; Chworos et al., Science 2004). What is novel about the approach shown now is the size (albeit visualized by AFM) and the high degree of precision of the complex shapes produced. A full alphanumeric character set (including emoticons) is demonstrated.

Nucleic acids provide a wonderful degree of modularity, with point-to-point resolution for structural changes. Polymer chemists are likely to be jealous of this ability to control the sequence of molecular units.

The potential of this approach is intriguing to ponder. We can use molecular-level control to construct larger-scale (at least AFM-visible) structures, which could then be used as casts upon which all sorts of traditional polymers could be laid down and formed into unique molecular-level shapes. The nucleic acid structures themselves are unlikely to offer much stability or longevity, necessitating the use of other, higher-performance polymers. These DNA structures could also be applied as templates for cell growth, especially when intercalating compounds are introduced to the nucleic acid building blocks for functionalization.

How will the public perceive such use of the molecules of life? While there is much to be learned by employing nucleic acids in this way, researchers need to be cognizant of the need to move toward practical uses rather than the overproduction of pretty pictures. Interesting possibilities.

Future of medicine – the critical path forward

By , April 14, 2012 5:02 pm

I saw a presentation yesterday by Carolyn Compton, the new head of C-Path, the Critical Path Institute located in Tucson, AZ. This group has the goal of accelerating the development of new medicines by working at the interface between pharmaceutical companies, academic researchers and clinicians, the FDA, and the public.

Dr. Compton spoke especially on how the sequenced genomes of individuals could impact medical diagnostics. The cost and time have decreased dramatically, to the point where it is now feasible to sequence an individual’s genome in less than 24 hours at a cost of less than US$1,000. This information could be used to select patient-specific interventions. For example, breast cancers that are HER2+ should be treated with Herceptin®, since this intervention is truly life saving; but if the cancer is HER2- (not overexpressing HER2), the treatment is not likely to be effective, and a treatment cost of roughly US$100k per year is poorly spent. In this example, knowledge of a disease’s genetic fingerprint is crucial in selecting a treatment.
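As a toy illustration of this kind of genotype-driven decision rule (the function and its wording are my own hypothetical simplification, certainly not clinical guidance):

```python
# Hypothetical sketch of a marker-driven treatment rule like the HER2
# example above. Real decisions weigh many more factors; this only shows
# the shape of the logic.

def suggest_therapy(her2_positive: bool) -> str:
    if her2_positive:
        return "HER2+: trastuzumab (Herceptin)-based regimen likely beneficial"
    return "HER2-: trastuzumab unlikely to help; consider other regimens"

print(suggest_therapy(True))
print(suggest_therapy(False))
```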

Dr. Compton also spoke on identifying many more paths by which a diagnostic physician’s judgement could be replicated by detailed heuristics built on better diagnostic tools. She referenced the long-repeated notion that individuals can weigh only about five inputs when making a decision. What is to be done when there are more than five relevant pieces of information?

We are moving toward greater utilization of computerized expert systems to guide medical diagnostics. How quickly these are implemented depends not only on the development of the technology but also on acceptance by the public and by the medical community. Biological engineers are developing tools for rapid detection of SNPs, of microbial agents of disease, and of physiological function. Many of these use lab-on-a-chip approaches, microfluidics, nanotech, etc. Will we ever get to the point where medical diagnostics do not require human intervention?

I think that medicine will eventually become fully automated: an individual who is not feeling well would self-collect a relevant sample (perhaps a finger stick) and feed it into equipment that quickly reports back not only what the problem is but also a list of recommended interventions. Certainly many conditions are not driven by the host genome; they include infections, environment-gene interactions, and physical trauma. My assumption is that deep diagnostic techniques for each of these will follow in the wake of personalized genetic medicine.

Do we need to rely on the judgement of a physician? Many (most, but certainly not all) have excellent judgement and can see connections and use intuition. Could we someday be able to replicate this intuition, especially an intuition developed through varied training and experiences, a sequence that could be replicated digitally? If such a path is feasible for medicine, it should also be feasible for environmental challenges, personalized nutrition, and the like. I do have faith in technology, although we must be cautious not to overhype the coming developments, which would raise the public’s expectations unrealistically and too quickly.

What’s in a name – quite a lot more than one might think

By , April 8, 2012 5:06 pm

This past spring saw the passing of Renato Dulbecco (1913-2012), who was eulogized by David Baltimore in the March 30 issue of Science. Dulbecco was a pioneer in bacteriophage genetics and cell transformation by viral DNA, and he was among the earliest proponents of the Human Genome Project. He developed the first method for quantifying animal viruses in culture. He won the Nobel Prize for demonstrating that viral DNA can be integrated into cellular DNA, leading to the inescapable conclusion that genes can cause cancer. A key enzyme discovered in that line of work by his Nobel co-laureates has come to be known as reverse transcriptase. Dulbecco remained scientifically active well into his 90s.

There is great importance in scientists understanding the history of their field and the brilliant individuals whose keen intuition built the framework of our understanding of how the world works. A simple way to make this connection is to recognize that many of the tools we use today are named after real people, either because they were the first to develop the tool or because a colleague named it for them out of respect.

Dulbecco’s name is attached to one of the most prevalent mammalian cell culture media: Dulbecco’s Modified Eagle Medium, or DMEM. This medium contains all of the sugars, amino acids, and small molecules (usually supplemented with serum) needed to keep mammalian cells alive for many generations. Dulbecco’s work on cellular transformation not only unlocked a key mechanism in the onset of cancer but also provided the concept for generating the immortalized cell lines used in numerous laboratories today. One of Dulbecco’s mentors was Salvador Luria, whose name has become synonymous with LB (Luria broth), used to grow many types of bacteria.

What other common laboratory tools have been named after preeminent scientists whose names may not immediately conjure a real individual?

Genetics of sunflowers and van Gogh’s obsession

By , April 2, 2012 3:54 pm

As any art lover would recognize, Vincent van Gogh’s paintings of sunflowers often depict flower shapes not commonly seen in nature. These post-impressionist paintings may reflect his interpretation of the flower’s shape, or the unusual structures could arise from a very tangible set of genetic variations. A study by John Burke (University of Georgia), recently published in PLoS Genetics, addresses sunflower genes that lead to alterations in flower petal symmetry.

The large head of a sunflower is not a single flower but is composed of many smaller flowers called florets. Burke’s work led to the discovery of two mutations in a single gene (HaCYC2c) that affect floret symmetry: “Mis-expression of this gene causes a double-flowered phenotype, similar to those captured in Vincent van Gogh’s famous nineteenth-century paintings, whereas loss of gene function causes radialization of the normally bilaterally symmetric ray florets.” It appears that Burke’s group has identified the mutation behind the double-flowered varieties painted by van Gogh.
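For readers who think in code, the reported genotype-to-phenotype mapping boils down to something like the following (the labels are my own shorthand for the paper’s findings, not its nomenclature):

```python
# Shorthand summary of the HaCYC2c findings described above; the keys and
# descriptions are my paraphrase, not the paper's terminology.
HACYC2C_PHENOTYPES = {
    "wild type":        "bilaterally symmetric ray florets (typical sunflower)",
    "mis-expression":   "double-flowered head, as in van Gogh's unusual blooms",
    "loss of function": "radialized (tubular) ray florets",
}

for genotype, phenotype in HACYC2C_PHENOTYPES.items():
    print(f"{genotype}: {phenotype}")
```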

Does this new knowledge diminish the aesthetic quality and artistic value of van Gogh’s paintings? We have a small poster replica of one of these paintings in our house, and I had always thought that the double-headed flowers were flights of fancy rather than an accurate record of plants that once lived. Having evidence that flowers shaped just like those in the painting actually grew increases my joy in viewing it.

The intersection between art and science will become increasingly well developed as we gain greater depth of understanding in the relationship between gene expression and phenotype. While this removes some of the mystery behind the presence of surprising aspects of our world, to the scientist, this new knowledge increases the beauty. There is much to be gained by having scientists view the world from the perspective of the artist; a much less developed approach is to encourage the artist to view the world through the lens of the scientist.

Heading to IBE 2012 in Indy

By , February 28, 2012 4:32 pm

The annual international meeting of IBE in 2012 will be held in Indianapolis, IN, starting on Thursday, March 1st. There will be talks on a variety of developing areas, including synthetic biology, renewable biofuels, microdevices, biomaterials, iGEM, environmental engineering, and bio-nanotech. We’ll also have a new session on “biological engineering basics” for those who may have a narrow background and want to learn more about the breadth of our field.

This year IBE is developing communities: groups centered on specific topical content in Biology, Systems Engineering and Bioenergy; Design and Education in Biological Engineering; Biosensors, BioNano and Biomedical Engineering; and Ecological Engineering. This should be a good way to catalyze further discussion on how to drive the research agenda.

Hope to see you in Indy.

Farmer’s little friend

By , November 15, 2011 8:49 pm

There is an interesting story on a new robot developed to automate the harvesting of crops on a farm. The story appeared in Wired Science.

Manual labor on farms has become a mounting problem in recent years. On the one hand, new immigration laws (post-9/11) have made it much more difficult for growers to bring in the help they need at harvest time. At the same time, food safety guidelines have led to increased regulation of the training of on-farm help. Despite the high unemployment rate, many crops are spending more time in the field than is ideal.

So, how do we solve all these challenges at once? AUTOMATE.

Automation and robotics have been used in a number of agricultural harvesting areas for many years. New tractors with autoguidance, GPS, A/C, etc. have made in-field work much more pleasant and safe. Automation has become a major part of greenhouse operations, especially in Japan.

Joe Jones, a co-inventor of iRobot’s Roomba vacuum-cleaning robot, has taken an interest in horticulture, which could make use of small, relatively inexpensive, mobile material-handling robots. His venture-backed company, Harvest Automation, bootstrapped the development of prototype robots using funding from a number of sources and has been field-testing them at nurseries around the U.S.

Small mobile robots that tend crops are just emerging, and most of the action is in produce. Row crops provide a semi-structured environment, and several companies are marketing four-wheeled robots with computer vision systems that monitor and, in some cases, tend crops.

For more info on ag automation, see:

http://azstarnet.com/business/local/article_464ed1c7-637f-525f-84dc-8059a1c36e1f.html

or

http://westernfarmpress.com/vegetables/automated-lettuce-thinning-machine-prototype-photos
