Pondering eating out at a fast food or family restaurant? Have a look at the Center for Science in the Public Interest's 2013 extreme eating awards. A few of the standard meals are north of 3,000 calories.
Speaking of food, one wonders how much people really eat throughout Thanksgiving day as well as at the big meal. I've seen a variety of guesses ranging from 2,500 to 4,500 calories for the day and 1,500 to 3,000 for the big meal. Self-reported counts are probably not terribly useful, but it would be interesting to see if anyone has done better than guesstimate work.
Sustainable can mean many things. As population grows and the trend toward eating meat continues, calories per acre is an increasingly important metric. The wide variety of exotic foods produced naturally may not make the cut. Factory farming may have to increase everywhere. What about monocultures in the Amazon? (via the NY Times). Lots of hard choices ahead ...
The amount of solar radiation at the Earth's orbit is about 1,350 watts per square meter. When you correct for the fact that the Earth is a rotating sphere with weather, the figure drops to an average of about 170 watts per square meter. This drives our weather and life on Earth. You can convert it to electricity with an efficiency in the range of 10 to 20%, heat your home, or do any of the other things that usually come to mind when one hears "solar energy." More important, and central to life, is the fact that plants convert sunlight to chemical energy through photosynthesis. The process is very inefficient - most crops are a few tenths of a percent efficient. So inefficient that you start to worry about having enough cropland to feed people.
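A quick back-of-the-envelope calculation makes the point. This is only a sketch using the round numbers above; the 0.3% crop efficiency and 2,000 kcal/day figures are assumed for illustration:

```python
# Back-of-the-envelope solar-to-food arithmetic (illustrative round numbers).

solar_constant = 1350.0                 # W/m^2 at Earth's orbit

# A rotating sphere intercepts sunlight over its cross-section (pi*r^2)
# but spreads it over its whole surface (4*pi*r^2) - a factor of 4.
top_of_atmosphere = solar_constant / 4  # ~337 W/m^2 averaged over the globe

# Clouds, reflection, and atmospheric absorption cut roughly half of that,
# leaving the ~170 W/m^2 average mentioned above.
surface_avg = 170.0                     # W/m^2

crop_efficiency = 0.003                 # 0.3% - "a few tenths of a percent"
food_power = surface_avg * crop_efficiency    # ~0.5 W/m^2 as chemical energy

# A person eating ~2,000 kcal/day runs on about 100 W of food power.
person_watts = 2000 * 4184 / 86400            # ~97 W
print(f"Idealized cropland per person: {person_watts / food_power:.0f} m^2")
```

That works out to a couple hundred square meters per person - but only as an idealized floor. Real agriculture needs far more area, since fields aren't productive year-round and much of a plant isn't edible.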
We've been very clever at figuring out how to run plants closer to their best efficiency - nitrogen-based fertilizers and the green revolution were fundamental breakthroughs that allow us to produce enough nutritional calories to feed humanity (although equitable distribution is a big issue and over a billion go hungry). There are a few more tricks that can be used, but few foresee big revolutionary steps, and there is the question of what happens with global warming - most indications are for lower crop yields.
A compounding factor is meat consumption. Plants are engines for turning solar energy into chemical energy, and animals use that energy to grow and live. The energy cost of making an animal is very high - most of the feed calories are lost along the way. You can improve the efficiency of the process by choosing the right animals (chickens are much more efficient than cattle, for example), keeping their lifespan to slaughter optimal, and making sure they don't waste too much energy along the way. This is often enhanced with factory farming, but the process is still inefficient.
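A rough sketch of that inefficiency - the conversion figures below are commonly quoted ballpark numbers, assumed here for illustration rather than taken from any cited study:

```python
# Feed calories in vs. edible meat calories out (ballpark illustrative figures).
conversion_efficiency = {
    "chicken": 0.11,   # ~11% of feed calories become edible calories (assumed)
    "pork":    0.10,   # ~10% (assumed)
    "beef":    0.03,   # ~3%  (assumed)
}

feed_kcal = 10_000  # calories of plant feed grown
for animal, eff in conversion_efficiency.items():
    print(f"{feed_kcal:,} kcal of feed -> ~{feed_kcal * eff:,.0f} kcal of {animal}")
# Eating the plant calories directly avoids the conversion loss entirely,
# which is why calories per acre favors plant-based diets.
```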
Most affluent societies eat a fair amount of meat - an interesting problem to think about is what happens if the average living standard rises to the point where meat consumption increases. We're seeing that in China and India now. Will economics take care of this, or will it encourage more marginal agricultural practices that may not be robust against global warming?
One solution that has been proposed is artificial meat ... here is a brief summary of what is happening now (hat tip to Aaron). There is a real problem with an uncanny valley effect for people used to meat. I'm a vegetarian and lost any interest in meat long ago - it is a non-issue for me personally. But if it had good results and was cheaper than real meat, perhaps it could reduce the demand on agriculture a bit. If it worked, it would be a many-billion-dollar industry.
The Norwegian Army is introducing meatless Mondays for environmental reasons... I once ate in one of the officers' mess halls at West Point after giving a talk and was surprised that they had no problem producing a good vegan lunch. I was told that a small percentage of the ranks are vegan and that they are even offered vegan MREs... But while the US military has numerous green initiatives, I doubt we'll see meatless Mondays any time soon.
The study reveals that these recipe calorie increases are caused partly by changes in ingredients, but also because people are now accustomed to eating bigger portions of food. Food is less expensive and a smaller part of consumer income now than it was in 1936, so cookbooks adapt to today's social norms of more calorie-dense foods and bigger servings of those foods. Serving sizes have increased gradually through the years and cookbook editions. The largest jump is a 33.2% increase in serving sizes since 1996 alone. This expanded portion size helps explain why calories per serving have increased from an average of 268.1 calories to 436.9 calories, a 63% increase in calories per serving. The chicken gumbo recipe, for example, went from making 14 servings at 228 calories each in the 1936 edition to making 10 servings at 576 calories each in the 2006 version. What is the best way to prevent this sneaky remake of the classics? Wansink states: "These recipes were once intended to serve nearly twice as many people as they do today, so don't let a full portion get anywhere near your plate."
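Separating the ingredient effect from the portion effect in the gumbo example is simple arithmetic, using the figures quoted above:

```python
# Chicken gumbo, per the figures quoted above.
old_servings, old_kcal = 14, 228   # 1936 edition
new_servings, new_kcal = 10, 576   # 2006 edition

old_total = old_servings * old_kcal                  # 3,192 kcal per recipe
new_total = new_servings * new_kcal                  # 5,760 kcal per recipe

print(f"Whole recipe: {old_total} -> {new_total} kcal "
      f"(+{new_total / old_total - 1:.0%})")         # +80% from ingredients
print(f"Per serving:  {old_kcal} -> {new_kcal} kcal "
      f"(+{new_kcal / old_kcal - 1:.0%})")           # +153% overall
# Richer ingredients account for the ~80% jump in the whole recipe; cutting
# the same pot from 14 servings to 10 supplies the rest of the per-serving rise.
```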
In developing nations, rural areas, and even one's own home, limited access to expensive equipment and trained medical professionals can impede the diagnosis and treatment of disease. Many qualitative tests that provide a simple "yes" or "no" answer (like an at-home pregnancy test) have been optimized for use in these resource-limited settings. But few quantitative tests—those able to measure the precise concentration of biomolecules, not just their presence or absence—can be done outside of a laboratory or clinical setting. By leveraging their discovery of the robustness of "digital," or single-molecule quantitative assays, researchers at the California Institute of Technology (Caltech) have demonstrated a method for using a lab-on-a-chip device and a cell phone to determine a concentration of molecules, such as HIV RNA molecules, in a sample. This digital approach can consistently provide accurate quantitative information despite changes in timing, temperature, and lighting conditions, a capability not previously possible using traditional measurements.
In a study published on November 7 in the journal Analytical Chemistry, researchers in the laboratory of Rustem Ismagilov, Ethel Wilson Bowles and Robert Bowles Professor of Chemistry and Chemical Engineering, used HIV as the context for testing the robustness of digital assays. In order to assess the progression of HIV and recommend appropriate therapies, doctors must know the concentration of HIV RNA in a patient's bloodstream, called the viral load. The problem is that the viral load tests used in the United States, such as those that rely on amplification of RNA via the polymerase chain reaction (PCR), require bulky and expensive equipment, trained personnel, and access to infrastructure such as electricity, all of which are often not available in resource-limited settings. Furthermore, because it is difficult to control the environment in these settings, viral load tests must be "robust," or resilient to changes such as temperature and humidity fluctuations.
Many traditional approaches for measuring viral load involve converting a small quantity of RNA into DNA, which is then multiplied through DNA amplification - allowing researchers to see how much DNA is present in real time after each round of amplification by monitoring the varying intensity of a fluorescent dye marking the DNA. These experiments - known as "kinetic" assays - produce a readout reflecting changes in intensity over time, called an amplification curve. To find the concentration of the original bulk RNA sample, the amplification curve is then compared with standard curves representing known concentrations of RNA. Since assays such as those for HIV require many rounds of DNA amplification to build a sufficiently bright fluorescent signal, small errors introduced by changes in environmental conditions compound exponentially - meaning that these kinetic measurements are not robust enough to withstand changing conditions.
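The compounding is easy to see in the standard amplification model, N_n = N0 * (1 + E)^n, where E is the per-cycle efficiency. A small sketch with assumed illustrative values:

```python
# Why kinetic (real-time) amplification assays are fragile: a small per-cycle
# efficiency error compounds exponentially. Illustrative parameters only.
n_cycles = 30
E_nominal = 0.95      # assumed per-cycle efficiency under ideal conditions
E_perturbed = 0.90    # slightly degraded, e.g. by a temperature fluctuation

fold_nominal = (1 + E_nominal) ** n_cycles
fold_perturbed = (1 + E_perturbed) ** n_cycles

print(f"Nominal amplification:   {fold_nominal:.2e}x")
print(f"Perturbed amplification: {fold_perturbed:.2e}x")
print(f"Apparent quantification error: {fold_nominal / fold_perturbed:.1f}x")
# A ~2.6% per-cycle difference grows into a >2x error after 30 cycles, so the
# inferred concentration is only as good as the control of the conditions.
```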
In this new study, the researchers hypothesized that they could use a digital amplification approach to create a robust quantitative technique. In digital amplification, a sample is split into enough small volumes such that each well contains either a single target molecule or no molecule at all. Ismagilov and his colleagues used a microfluidic device they previously invented, called SlipChip, to compartmentalize single molecules from a sample containing HIV RNA. SlipChip is made up of two credit card-sized plates stacked atop one another; the sample is first added to the interconnected channels of the SlipChip, and with a single "slip" of the top chip, the channels turn into individual wells.
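The payoff of the digital approach is that the readout is a count: tally the wells that light up and invert the Poisson statistics to get a concentration. A minimal sketch - the formula is the standard digital-assay estimator, and the well counts and volume below are made-up illustrative values, not numbers from the paper:

```python
import math

def digital_concentration(positive_wells, total_wells, well_volume_ul):
    """Estimate concentration (molecules/uL) from a digital assay readout.

    Molecules land in wells at random, so counts per well are Poisson:
    P(empty well) = exp(-lam), hence lam = -ln(1 - fraction_positive).
    """
    fraction_positive = positive_wells / total_wells
    if fraction_positive >= 1.0:
        raise ValueError("every well is positive - sample must be diluted")
    lam = -math.log(1.0 - fraction_positive)  # mean molecules per well
    return lam / well_volume_ul

# Hypothetical readout: 300 of 1,280 wells fluoresce, 0.005 uL per well.
print(f"~{digital_concentration(300, 1280, 0.005):.0f} molecules/uL")
```

Because each well only needs a yes/no call, the count survives the drift in timing, temperature, and lighting that would distort a kinetic curve - which is the robustness described above.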
See more at the Caltech press release: http://www.caltech.edu/content/slipchip-counts-molecules-chemistry-and-cell-phone
If that was not disappointing enough to those who believe that high-protein diets lead to more weight loss (they don’t), the evidence for other approaches is even more disillusioning, as it consistently ranges between “low” and “insufficient”.
Thus, low fat approaches appear no better than high-fat (strength of evidence: “moderate”), while the evidence in support of low-calorie diets, complex vs. simple carbs, glycemic load, Mediterranean-style diets, lower-fat lacto-ovo-vegetarian or vegan-style, or lower fat high dairy/calcium with added fiber and/or low glycemic index/load foods, use of liquid and bar meal replacements, or even very low calorie approaches is largely “insufficient” to make any reasonable recommendations in favour of any of these strategies versus any other.
Not that people do not lose on any or all of these diets as long as they are “energy restricted” – of course they do!
But what is lacking is evidence that any of these countless dietary approaches confers any meaningful advantage (in terms of amount of weight lost, metabolic benefits, or sustainability of weight loss) compared to any other.
So, whilst millions of “bestseller” diet books may continue to make millions for their authors and publishers by touting one revolutionary weight loss solution after another, they are essentially closer to religious belief systems than scientific fact.
It appears after all, that the single recommendation that best summarizes all of the actual evidence on diet and weight management boils down to “eat less” – and we all know how effective that recommendation turns out to be.
What does work is not getting overweight in the first place. Once you do, it is extremely difficult to regulate weight back down to a normal level - our genetics prevent it, and we haven't had enough time to evolve a metabolism that can deal with a superabundance of food. Then again, we may not want such bodies in the long term - superabundance is historically rare and there are no guarantees it will exist in the future.
The diet industry is a sure-fire business. Lots of customers and often good short-term success for most diets (hence a lot of positive word-of-mouth promotion) - it is relatively easy to lose weight in the short term. But ultimately our hard-wired metabolism takes over and most of us gain back more weight than we lost. So we try another diet and enter a very unhealthy yo-yo dieting existence. It is like a strange game of musical chairs with the chairs being diets. In this variation new chairs are constantly being added rather than removed.
Lots of limitations to the study quoted below, but interesting:
The recent study "The Impact of Sustained Engagement on Cognitive Function in Older Adults: The Synapse Project," published in the journal Psychological Science by the psychology researcher Denise Park and her colleagues at the University of Texas at Dallas, is an example of an extremely well-designed study that attempts to tease out the benefits of participating in a structured activity versus receiving formal education and acquiring new skills. The researchers assigned subjects with a mean age of 72 years (259 participants were enrolled, but only 221 subjects completed the whole study) to a 14-week program in one of five intervention groups: 1) learning digital photography, 2) learning how to make quilts, 3) learning both digital photography and quilting (half of the time spent in each program), 4) a "social condition" in which members participated in a social club involving activities such as cooking, playing games, watching movies, reminiscing, and going on regular field trips, but without the acquisition of any specific new skills, or 5) a "placebo condition" in which participants were provided with documentaries, informative magazines, word games and puzzles, and classical-music CDs and asked to perform and log at least 15 hours a week of such activities. None of the participants carried a diagnosis of dementia, and all were novices to digital photography and quilting. Upon subsequent review of the activities in each of the five intervention groups, it turned out that each group spent an average of about 16-18 hours per week on the aforementioned activities, without any significant difference between the groups. Lastly, a sixth group of participants was not enrolled in any specific program but was merely asked to keep a log of their activities, serving as a no-intervention control.
When the researchers assessed the cognitive skills of the participants after the 14-week period, the type of activity they had been enrolled in had a significant impact on their cognition. For example, the participants in the photography class showed a much greater improvement in episodic memory and visuospatial processing than those in the placebo condition. On the other hand, cognitive processing speed increased most in the dual-condition group (photography and quilting) as well as in the social condition. The general trend was that the groups which placed the highest cognitive demands on the participants and also challenged them to be creative (acquiring digital photography skills, learning to make quilts) showed the greatest improvements.