Does Sugar Make Dietary Fat Less OK?


I have defended dietary fat as healthy in many blog posts.

But there is one circumstance in which dietary fat might not be so great: if you are still eating sugar. In last week's post, "Heidi Turner, Michael Schwartz and Kristen Domonell on How Bad Sugar Is," I quote this:

Schwartz agrees that sugar can cause major health problems, but says it isn’t acting alone. The most potent way to activate the brain’s reward system is actually by combining sugar with fat, he says. And much of the American diet contains both of these components.

There is a claim here about complementarity in badness: that is, in the presence of sugar, dietary fat is worse than in the absence of sugar. I take the view that in the absence of sugar, dietary fat—other than the big mistake of trans fat—is quite healthy. But it is logically possible that dietary fat eaten too close in time to sugar is unhealthy. Let me spin out a possible theory. I should say first that I am not really persuaded by the "overwhelmingly rewarding" theory that Michael Schwartz is putting forward. Sugar is extremely rewarding. Fat is extremely rewarding. Is the combination of sugar and fat that much more rewarding than sugar in combination with nonfat foods or fat in combination with nonsugar foods?

Instead, let me discuss things from the standpoint of the satiation-to-calorie ratio I talk about in "Letting Go of Sugar." Dietary fat by itself, or in combination with other foods low on the insulin index (see "Forget Calorie Counting; It's the Insulin Index, Stupid"), is quite satiating: it will make you feel full quite fast. But sugar has a negative satiation-to-calorie ratio: it makes you feel less full. So add enough sugar to your dietary fat, and the dietary fat's normal tendency to make you feel full will be neutralized.

The idea that sugar neutralizes the tendency of dietary fat to make you feel full still doesn't seem to make dietary fat any worse than anything else combined with sugar. But what if, in addition to the mechanisms that normally make dietary fat satiating, there is a volumetric mechanism that makes you feel full when there is a high volume of food in your stomach? It would make sense that sugar can neutralize some mechanisms that make you feel full, but not the volumetric mechanism. But dietary fat doesn't have a lot of volume per calorie, so if the normal mechanism that makes dietary fat so very satiating is neutralized, there is no volumetric backup mechanism for fat. Sugar gets past the main safeguard that keeps you from wanting to overeat dietary fat, and that's it.

Solution? Don't eat sugar. See "Letting Go of Sugar" for how to get there. Sugar is bad whether or not it is combined with dietary fat. Even on the theory above, dietary fat is only bad when combined with sugar.

 

Don't miss these other posts on diet and health and on fighting obesity:

Also see the last section of "Five Books That Have Changed My Life" and the podcast "Miles Kimball Explains to Tracy Alloway and Joe Weisenthal Why Losing Weight Is Like Defeating Inflation." If you want to know how I got interested in diet and health and fighting obesity and a little more about my own experience with weight gain and weight loss, see my post "A Barycentric Autobiography."

Eric Morath Defines Full Employment

In the article above, Eric Morath gives a nice definition of "full employment," which means the same thing as "the natural level of employment":

Full employment is the term economists use to describe a sweet spot in the economy—a point in a business cycle when unemployment is very low, but not so low that it starts stoking severe wage and price inflation. The Fed’s goal is to keep the economy in that sweet spot for as long as possible.

Heidi Turner, Michael Schwartz and Kristen Domonell on How Bad Sugar Is

Link to the article shown above

Heidi Turner is a medical nutrition therapist at The Seattle Arthritis Clinic. Michael Schwartz, M.D., is director of the University of Washington Medicine Diabetes Institute and the Nutrition Obesity Research Center there. Kristen Domonell interviewed both of them about sugar for the University of Washington Medicine website. Here are some highlights (bulleting added):

Heidi Turner

  • Sugar is the universal inflammatory ... Everyone is sugar intolerant.
  • Addictive qualities aside, there’s also a large social element at play, says Turner. Bad day? Turn to sugar. Celebration at work? Just add sugar. It’s both delicious and comforting, which is part of the reason it’s so hard to get away from, she says.

Michael Schwartz

  • The most potent way to activate the brain’s reward system is actually by combining sugar with fat, [Schwartz] says. And much of the American diet contains both of these components.
  • “I wouldn’t say people become dependent on it in the way they become dependent on a drug,” says Schwartz. “But for some people, the anticipation of eating something that is highly rewarding becomes an important focus for how they live each day.”

Finally, based on these interviews, Kristen Domonell herself writes: 

  • Eating a diet that’s high in added sugar is bad news for your heart, according to a major 2014 study. The researchers found that eating more than the recommended amount of added sugar [in this case 25 grams a day for women, 36 grams for men and 12 grams for children] may increase your risk of dying from heart disease. Even if you go to the gym and eat your greens regularly, you aren’t immune from the effects of sugar on your health. Eating a high-sugar diet can set you up for disease, even if you’re otherwise healthy, according to a new study. Researchers found unhealthy levels of fat in the blood and livers of men who ate a high-sugar diet, which may increase the risk of heart disease, they report.
  • And while many people eat sugar as a pick-me-up, it could be having the opposite effect. One recent study found that men who ate a high-sugar diet were more likely to develop depression or anxiety than those who ate a diet lower in sugar.
  • Sugar is hiding out where you least expect it—in everything from dressings and sauces to whole grain bread.

I really like the quotations above. But let me detail some things in the article that I disagree with. First, whole fruit is more problematic than they suggest. Most types of fruit have quite a bit of sugar in them, and despite the fiber content, most types of fruit have a substantial insulin kick. I discuss this in "Forget Calorie Counting; It's the Insulin Index, Stupid."

Second, the "recommended amounts of sugar" (25 grams a day for women, 36 for men and 12 for children) give the wrong idea that eating a little sugar is going to be OK. I discuss in "Letting Go of Sugar" why it is better to cut out sugar almost entirely. 

In the same vein, it is strange when Kristen writes:

In other words, if you’re forced to choose between white table sugar and honey, go for the honey. But if it’s a choice between honey or no sugar at all, going sugar-free is your best bet.

The likelihood that someone will put a gun to your head and say "Eat either the table sugar or the honey, or I'll shoot" is quite low. 

Finally, Michael Schwartz buys into what I think is a faulty evolutionary story:

Back when food was way scarcer, our ancient ancestors needed to take every advantage they had to consume high calorie foods. So the human brain evolved to perceive sugar—and fat—as very rewarding, says Schwartz. Today, our brains are still wired for feast or famine, even though you can buy thousands of calories of food for a couple bucks at the local convenience store.

Other than for hibernation, animals in the wild don't get terribly fat. There is a good reason. Being too fat would make an animal slow, which is bad for a predator and bad for prey. In the environment of evolutionary adaptation, humans, too, played the roles of predator and prey, and it wouldn't have paid to be too fat. The trouble for us is processed food, which didn't exist in the environment of evolutionary adaptation. On this, see "The Problem with Processed Food." In designing processed food in an effort to get people to eat a lot and buy a lot of each particular product, the food industry puts sugar into almost everything. That is a good indication of the power of sugar.

 

Don't miss these other posts on diet and health and on fighting obesity:

Also see the last section of "Five Books That Have Changed My Life" and the podcast "Miles Kimball Explains to Tracy Alloway and Joe Weisenthal Why Losing Weight Is Like Defeating Inflation." If you want to know how I got interested in diet and health and fighting obesity and a little more about my own experience with weight gain and weight loss, see my post "A Barycentric Autobiography."

 

 

John Locke: Kings as War Leaders

When I was 10 years old, the 26th Amendment to the US Constitution extended the vote to everyone at least 18 years old. The main argument for this extension of the franchise was that someone old enough to die for his country was old enough to vote. (The argument focused on young men who were being drafted, but young women got the vote, too.) This association of military service and the right to vote goes way back, as John Ferejohn and Frances McCall Rosenbluth's book "Forged Through Fire: War, Peace, and the Democratic Bargain" argues. Elites need soldiers to fight fiercely in order to preserve national autonomy. Letting soldiers vote helps in that.

John Locke points to the converse in Sections 107-110 of his Second Treatise on Government: “Of Civil Government” (in Chapter VIII, "Of the Beginning of Political Societies"). People under threat know they need a war leader. The war leader, often called a king, therefore often has the consent of the governed for relatively autocratic rule in the conduct of war. But John Locke argues that this selection of a war leader did not confer on the war leader autocratic power in other domains. Of course, a successful war leader could often seize autocratic power in other domains, but that power did not have the same legitimacy as the autocratic power over the conduct of war. I have included the end of Section 106 as a lead-in to Sections 107-110:

§. 106. ... all petty monarchies, that is, almost all monarchies, near their original, have been commonly, at least upon occasion, elective.

 §. 107. First then, in the beginning of things, the father’s government of the childhood of those sprung from him, having accustomed them to the rule of one man, and taught them that where it was exercised with care and skill, with affection and love to those under it, it was sufficient to procure and preserve to men all the political happiness they sought for in society. It was no wonder that they should pitch upon, and naturally run into that form of government, which from their infancy they had been all accustomed to; and which, by experience, they had found both easy and safe. To which, if we add, that monarchy being simple, and most obvious to men, whom neither experience had instructed in forms of government, nor the ambition or insolence of empire had taught to beware of the encroachments of prerogative, or the inconveniences of absolute power, which monarchy in succession was apt to lay claim to, and bring upon them; it was not at all strange, that they should not much trouble themselves to think of methods of restraining any exorbitances of those to whom they had given the authority over them, and of balancing the power of government, by placing several parts of it in different hands. They had neither felt the oppression of tyrannical dominion, nor did the fashion of the age, nor their possessions, or way of living, (which afforded little matter for covetousness or ambition) give them any reason to apprehend or provide against it: and therefore it is no wonder they put themselves into such a frame of government, as was not only, as I said, most obvious and simple, but also best suited to their present state and condition; which stood more in need of defence against foreign invasions and injuries, than of multiplicity of laws. The equality of a simple poor way of living, confining their desires within the narrow bounds of each man’s small property, made few controversies, and so no need of many laws to decide them, or variety of officers to superintend the process, or look after the execution of justice, where there were but few trespasses, and few offenders. Since then those, who liked one another so well as to join into society, cannot but be supposed to have some acquaintance and friendship together, and some trust one in another; they could not but have greater apprehensions of others, than of one another: and therefore their first care and thought cannot but be supposed to be, how to secure themselves against foreign force. It was natural for them to put themselves under a frame of government which might best serve to that end, and chuse the wisest and bravest man to conduct them in their wars, and lead them out against their enemies, and in this chiefly be their ruler.

  §. 108. Thus we see, that the kings of the Indians in America, which is still a pattern of the first ages in Asia and Europe, whilst the inhabitants were too few for the country, and want of people and money gave men no temptation to enlarge their possessions of land, or contest for wider extent of ground, are little more than generals of their armies; and though they command absolutely in war, yet at home and in time of peace they exercise very little dominion, and have but a very moderate sovereignty, the resolutions of peace and war being ordinarily either in the people, or in a council. Though the war itself, which admits not of plurality of governors, naturally devolves the command into the king’s sole authority.

  §. 109. And thus in Israel itself, the chief business of their judges, and first kings, seems to have been to be captains in war, and leaders of their armies; which (besides what is signified by going out and in before the people, which was to march forth to war, and home again in the heads of their forces) appears plainly in the story of Jephtha. The Ammonites making war upon Israel, the Gileadites in fear send to Jephtha, a bastard of their family whom they had cast off, and article with him, if he will assist them against the Ammonites, to make him their ruler; which they do in these words, “And the people made him head and captain over them,” Judges xi. 11. which was, as it seems, all one as to be judge. “And he judged Israel,” Judges xii. 7. that is, was their captain-general six years. So when Jotham upbraids the Shechemites with the obligation they had to Gideon, who had been their judge and ruler, he tells them, “He fought for you, and adventured his life far, and delivered you out of the hands of Midian,” Judg. ix. 17. Nothing mentioned of him, but what he did as a general: and indeed that is all is found in his history, or in any of the rest of the judges. And Abimelech particularly is called king, though at most he was but their general. And when, being weary of the ill conduct of Samuel’s sons, the children of Israel desired a king, like all the nations to judge them, and to go out before them, and to fight their battles, 1 Sam. viii. 20. God granting their desire, says to Samuel, “I will send thee a man, and thou shalt anoint him to be captain over my people Israel, that he may save my people out of the hands of the Philistines,” ix. 16. As if the only business of a king had been to lead out their armies, and fight in their defence; and accordingly at his inauguration pouring a vial of oil upon him, declares to Saul, that the Lord had anointed him to be captain over his inheritance, x. 1. And therefore those, who after Saul’s being solemnly chosen and saluted king by the tribes at Mispah, were unwilling to have him their king, made no other objection but this, “How shall this man save us?” v. 27. as if they should have said, this man is unfit to be our king, not having skill and conduct enough in war, to be able to defend us. And when God resolved to transfer the government to David, it is in these words, “But now thy kingdom shall not continue: the Lord hath sought him a man after his own heart, and the Lord hath commanded him to be captain over his people,” xiii. 14. As if the whole kingly authority were nothing else but to be their general: and therefore the tribes who had stuck to Saul’s family, and opposed David’s reign, when they came to Hebron with terms of submission to him, they tell him, amongst other arguments they had to submit to him as to their king, that he was in effect their king in Saul’s time, and therefore they had no reason but to receive him as their king now. “Also (say they) in time past, when Saul was king over us, thou wast he that leddest out and broughtest in Israel, and the Lord said unto thee, Thou shalt feed my people Israel, and thou shalt be a captain over Israel.”

  §. 110. Thus, whether a family by degrees grew up into a commonwealth, and the fatherly authority being continued on to the elder son, every one in his turn growing up under it, tacitly submitted to it, and the easiness and equality of it not offending any one, every one acquiesced, till time seemed to have confirmed it, and settled a right of succession by prescription: or whether several families, or the descendents of several families, whom chance, neighbourhood, or business brought together, uniting into society, the need of a general, whose conduct might defend them against their enemies in war, and the great confidence the innocence and sincerity of that poor but virtuous age, (such as are almost all those which begin governments, that ever come to last in the world) gave men one of another, made the first beginners of commonwealths generally put the rule into one man’s hand, without any other express limitation or restraint, but what the nature of the thing, and the end of government required: which ever of those it was that at first put the rule into the hands of a single person, certain it is no body was intrusted with it but for the public good and safety, and to those ends, in the infancies of commonwealths, those who had it commonly used it. And unless they had done so, young societies could not have subsisted; without such nursing fathers tender and careful of the public weal, all governments would have sunk under the weakness and infirmities of their infancy, and the prince and the people had soon perished together.

One thing John Locke leaves out of this story of political evolution as a response to war is an account of how the collective pursuit of war often caused an atrophy of respect for natural-law justice between nations. A group of warriors raiding another tribe often encourages its members in theft, rape and militarily unnecessary murder that many of those individuals would shrink from without such social support for depredations on the other tribe. If John Ferejohn and Frances McCall Rosenbluth are right that war fosters democracy, the price of democracy is high.

Link to the Wikipedia article for "Saul." Above is an image of the painting "The Battle of Gilboa," by Jean Fouquet. As Wikipedia notes, "the protagonists [are] depicted anachronistically with 15th Century armour."

Tushar Kundu: Pulling America Back Together

Link to Tushar Kundu's LinkedIn page

I am delighted to present a guest post from Tushar Kundu. I work with Tushar very closely in his role as a full-time Research Assistant for the Well-Being Measurement Initiative that Dan Benjamin, Kristen Cooper, Ori Heffetz and I run. Tushar is one of the most impressive individuals I have ever met. What follows are Tushar's words:


In my junior year of high school, I was asked to write an essay about whether I believed the American Dream exists. I passionately argued, and believed, that it did. I placed my faith in the American meritocracy, claiming that with enough hard work and grit, enough had been provided so that anyone should be able to achieve success. However, if I were to respond to the prompt today, I would come to almost the exact opposite conclusion, likely bashing my original stance as a naïve take that ignores centuries of history and evidence to the contrary. Despite the apparent flip-flop, there exists common ground between the two views. In particular, it is a belief about what America could be that has continued to mold the lens through which I view this country. I believe that placing the realization of the American Dream back in our sights is essential for the health of this nation, particularly as leaders across the political spectrum have continued to let it fall by the wayside.

 

Growing up in 21st century America, I’ve always felt a tension between two competing stories about what this country stands for. I am the child of two immigrants who came to the U.S. for college, my father with $20 in his pocket. My parents successfully built up enough capital to purchase a home in the Bay Area, and provided me an exceedingly comfortable, nurturing childhood in one of the most diverse, dynamic areas in the world. There is no doubt that this upbringing represents the causal link to the words I penned five years ago. Of course I believed the American Dream to be alive and well; here it was, right in front of my eyes!

 

My shift in perspective was precipitated by a shift in location, as I travelled across the country for college. Contrary to what some may believe, my evolution in thought is not captured by the simplistic narrative of indoctrination within the liberal bubble of a college campus. Instead, what proved to be most transformative was that rather than being taught what to think, in a basic way, we were taught how to think. I was exposed to an overwhelming amount of new information, alternative viewpoints, and multiple ways to interpret the same set of facts. This meant distilling information and curating signal from the noise became an invaluable, necessary skill. I began to demand rigor and empirical justification from my information sources. On top of this, I benefited most from being thrown into an environment filled with brilliant professors, mentors, and peers who inspired and challenged me. Altogether, this experience pushed hard on my beliefs, so that I was forced to question their origins rather than rationalizing them as objective truth. This led me to the realization that often what I believed to be a truism about the human condition was only a projection of my own experience onto the lives of others.

 

I always knew that for me, working hard would lead to success. And I knew that I was privileged relative to most people. What I had not considered was the extent to which this privilege had permeated and defined my life. It’s not simply that I’ve been provided a mountain of financial and emotional support, but also opportunities for personal investment, safety nets in case I fail, a lack of serious responsibility, social and professional networks, and flexibility to decide my pursuits. The future of this country depends not only on our ability to address the implications of growing inequality in an economic sense, but also the chasms that have emerged within each of these domains of life.

 

To understand how these inequalities are manifested, it is vitally important to listen to others’ descriptions of how they operate on a day-to-day basis. Even then, we should learn to accept that as individuals, there are some aspects of the human experience that are simply inaccessible, no matter how attentively one listens. For example, I can sympathize with and believe you when you tell me the difficulties of being a woman, being looked down on by the “elite” class, being an African-American, or being laid off from the factory-line job you held for twenty years. But that doesn’t mean I can truly understand every problem held by every person. It would be the epitome of arrogance to assume otherwise. Internalizing this is part of developing deference towards others. To be clear, I don’t mean to advocate for blind belief or a moral equivalence of all ills, but rather for adopting a default position of respect towards people’s stated concerns, and the treatment of each individual as the foremost expert on the harms he or she endures.

 

Developing deference is not easy; even the sheer amount of concern within your immediate circle can be overwhelming. Scroll through Twitter, turn on cable news, or chat with your neighbors and it will seem like every day brings a new outrage. Cries of populist uprisings, growing polarization, and widespread discontent have led to no end of theorizing on the cause of the chaos, in addition to wild speculation about future disasters sure to descend upon us. Yet before we throw up our hands, we need to take a step back and ask: why do things feel so bad? Independent of the problems that we face, it remains the case that we each have an enormous present-bias. Things may feel bad, but consider the state of affairs merely fifty years ago. The country was embroiled in civil riots, it experienced multiple high-profile political assassinations, communist hysteria ran high, and broad segments of society were locked out of accessing entire spheres of American life. In this context, today does not seem so uniquely terrible. I want to emphasize this point: Things can and do get better. Acknowledging this does not invalidate our very real problems; rather it refocuses our attention to the light at the end of what can be a dismally dark tunnel. Maintaining the belief that the long-term trend remains positive helps us avoid resigning ourselves to an inevitable fate, succumbing to the false notion that we are powerless.

 

Thus far I have stressed that we should recognize the legitimacy of people’s grievances, and that we have the power to address them. Naturally, the question that follows pertains to how we go about addressing them. Fixing what ails America will require no end of energy, dedication, and ingenuity that must be drawn from all corners of the country. But I know it can be done. It can be done because I have faith in American exceptionalism. Not in the sense that America has a God-given ability to lead the world in every which way. In fact, we lag behind on many metrics, from our aging infrastructure, to our cash-strapped education system, to our increasingly expensive patchwork of a health care system. Rather, we are exceptional in what we aspire to be. Never in the course of human existence has a pluralistic, multi-cultural state been successfully established, let alone been the model of peak economic dynamism, enlightened cultural and ethical values, and strength in global leadership. Yet this is exactly what the self-assigned label of the shining city on a hill demands. Despite our shortcomings, Americans should be unashamedly prideful in this vision. It is only natural that we undergo growing pains during this grand experiment, particularly now, when stability engendered from the multi-generational consolidation of power within a single social group is finally being upended. New voices are being introduced every day, and they continue to begin new, necessary conversations.

 

It is in this context that a sense of collective identity is key. Think of identities as quilts, formed through a careful stitching together of experiences, producing a mosaic totally unique to each individual. A troubling trend involves the growing uniformity of patterns within these mosaics, as our natural divisions become deeply entrenched along an increasing number of lines. Today, knowing a person’s stance on transgender bathrooms provides a strong signal for all sorts of other information from their race, religion, and education to how they take their coffee. Many today would never entertain the thought of marrying someone of the opposite political party. How can any form of resolution be on the table when sacrificing a single inch is tantamount to a rejection of who you are as a person? We have erected a wall, and if we are to tear it down, we must combat our instincts and acknowledge a shared obligation to each other as human beings before our obligation to our political tribe.

 

But this is far easier said than done. We can attempt to look to history for times in which racial, political, and ideological polarizations were superseded by an appeal to a common identity. The most frequent answer is quite depressing: wartime. By pointing the finger at an easily identifiable enemy, war brings about a binding force that pushes us to put aside previously factious differences in pursuit of the greater good. To be sure, in no way should we consider war for the purpose of producing societal cohesiveness. However, we can learn from our history in order to identify other ways in which we may create a similar reprioritization of our individual allegiances. Fundamentally, such consideration highlights the utility of the nation-state as one of the most powerful units of social interaction. At times I am tempted to dismiss this fact of human behavior, but reclaiming the mantle of patriotism need not be shameful. If we are careful to avoid whitewashing history, and to distinguish our idealistic aspirations from the uncomfortable truths of reality, fostering pride in our country represents the most promising path forward.

 

One way to concretely incorporate patriotism into the sphere of public policy would be to establish mandatory public service. At first blush, this is directly at odds with our dearly held value of individual freedom. But it is truly the strength of the state that guarantees any freedoms at all. The overwhelming majority of Americans are educated by public teachers in public schools. This provides the knowledge, skills, and freedom of independent thought required to pursue our various callings. Environmental protections secure our right to live healthy lives by keeping our air, water, and communities free of pollutants. Our ability to move freely within the country is built on the continued maintenance of our roads, bridges, and airports. Servicemen and women, police officers, and firefighters ensure our physical safety on a daily basis. In some respects, we already acknowledge this shared obligation to the state through our payment of taxes. If we can agree that we each owe some debt to society, public service would merely represent the repayment of this debt with our time instead of our income.

 

Furthermore, there are several significant advantages to public service relative to standard taxation. A key distinction involves the accommodation of individual preference in how one chooses to serve one’s country. When thinking about mandatory public service, what immediately comes to mind is serving in the military. However, any realistic implementation of the policy would need to feature much more. It would build upon already existing organizations such as Teach for America, the Peace Corps, and AmeriCorps, all of which model the types of work that invoke a sense of societal contribution without requiring combat training.

 

In addition to building upon existing programs, the implementation of mandatory public service would create an opportunity to examine other ways to best utilize our public institutions. In the aftermath of the financial crisis, it has become clear that lack of access to reliable credit has led to catastrophic harm. It’s a national tragedy that despite being the richest country in the world, nearly half of all Americans would struggle to come up with 400 dollars in the case of an emergency. As with many issues, we don’t need to reinvent the wheel. Consider an old idea that has recently reentered mainstream thought: establishing banking services at every U.S. post office. Ideas in this vein pair perfectly with mandatory public service, as asking more from our public institutions increases the number of available jobs, while mandatory public service drives up the supply of Americans ready to fill them.

 

Not only would public service represent an expansion of our individual choice set relative to taxation, but it would also feature greatly improved outcomes with respect to national sentiment. It is the rare individual for whom the payment of taxes causes no resentment. To varying extents, we each feel a sense of fundamental unfairness; I’ve earned my income, why do I have to give it up? To add further insult, we are likely to watch our money disappear behind an opaque screen of red tape, leaving us firmly in the dark about where it ultimately ends up. In sharp contrast, public service affords citizens the ability to see the fruit of their labor with their own eyes. Finally, and perhaps most importantly, a national public service program would push people of radically different backgrounds together, bringing us one step closer towards realizing the goal of a cohesive national identity. Shared experiences, both good and bad, form the soil in which we may cultivate a sense of gratitude towards our fellow citizens, and pride in our role within the narrative of Americans striving to create a society like no other that has come before.

 

There is no doubt that this proposal would face fierce opposition, but it is not the only approach we have at our disposal. Less radical would be to build on existing programs that move us towards providing adequate opportunity to all, regardless of background. Consider the Affordable Care Act. While deeply polarizing, the most popular part of the law, protection against discrimination based on pre-existing conditions, sheds light on the types of policy that can garner popular support. Specifically, I refer to the class of actions that protects fundamental human rights, and guarantees everyone a chance to fulfill their potential. This is vague by intention; I recognize there exists a huge range in thinking about what it could mean. While I may dream of Medicare for all, expanding broadband to rural areas, a universal basic income, a carbon tax, a far more progressive tax system, addressing homelessness through public housing (and adding a ton of market-rate housing while we are at it), and a doubling or tripling of immigration quotas, this sounds like a dystopian nightmare to some. But if I tallied up the opinions of every American, I maintain the popular consensus would be far closer to the vision I outlined than it would be to the other end of the spectrum. Examine the status quo and ask yourself if you are satisfied. If we want to revive, or perhaps birth for the first time, the reality of the American Dream, complacency is all that stands in our way.

Nina Teicholz on the Bankruptcy of Counting Calories

Calories-in/calories-out is a useful identity. But in isolation it tends to give people the wrong idea about successful weight loss. In particular, it says nothing about when a particular combination of calories in and calories out will lead to suffering and when it won't. As I emphasize in "Prevention is Easier Than Cure of Obesity":

By "what works" I mean not only being successful at losing weight and keeping it off, but also doing so with a minimum of suffering. As an economist, I would consider suffering a bad thing, even if suffering had no adverse effects on health whatsoever. But suffering also makes a weight loss program difficult to sustain, so suffering does have a bad effect on health. So minimizing suffering is crucial.

The calories-in/calories-out identity is typically thought of this way:

Weight Gain in Calories = Calories Consumed - Calories Expended

What sneaks in with this arrangement of the identity is the questionable idea that calories consumed and calories expended are fixed quantities not subject to any deeper forces. Rearranging the identity gives a different perspective:

Calories Expended = Calories Consumed + Weight Loss in Calories

This rearrangement subtly hints at the idea that, holding calories consumed fixed, effective weight loss that puts a lot of fatty acids and ketones into the bloodstream from metabolized body fat might make one feel more energetic and so raise calories expended. Conversely, relatively ineffective weight loss combined with a low level of calories consumed will lead to internal starvation, with all kinds of body signals going out to discourage energy expenditure and encourage the consumption of more calories. Those body signals are exactly the kinds of signals that can cause suffering.
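
To make the bookkeeping concrete, here is a minimal sketch in Python with purely hypothetical numbers; the function names and the 2,000- and 2,400-calorie figures are my own illustrative choices, not measurements of anything.

```python
# A minimal sketch of the calories-in/calories-out identity.
# All numbers are hypothetical, chosen only to illustrate the bookkeeping.

def weight_gain_in_calories(calories_consumed, calories_expended):
    """Weight Gain in Calories = Calories Consumed - Calories Expended."""
    return calories_consumed - calories_expended

def implied_calories_expended(calories_consumed, weight_loss_in_calories):
    """Rearranged: Calories Expended = Calories Consumed + Weight Loss in Calories."""
    return calories_consumed + weight_loss_in_calories

intake = 2000.0  # hypothetical calories consumed in a day

# First arrangement: if expenditure happens to be 2,400 calories,
# the identity books a 400-calorie weight loss (a "gain" of -400).
print(weight_gain_in_calories(intake, 2400.0))   # -400.0

# Second arrangement: the same 400-calorie loss, with the same intake,
# implies expenditure of 2,400 calories. The two arrangements agree by
# construction; the identity is silent about which quantity adjusts.
print(implied_calories_expended(intake, 400.0))  # 2400.0
```

The only point of the sketch is that once two of the three quantities are pinned down, the third follows; the identity itself says nothing about the deeper forces that determine which quantity does the adjusting.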

Nina Teicholz attacks naive misunderstandings of calories in/calories out in her May 20, 2018 Los Angeles Times article "Counting calories won't reduce obesity. So why are we requiring restaurants to post them?" I agree that posting calories is not a particularly effective public health intervention. I would go so far as to say it would be much more useful to post the insulin index of different types of food. (See "Forget Calorie Counting; It's the Insulin Index, Stupid." One benefit of requiring the posting of the insulin index for restaurant food is that the research would get done to measure the insulin index for more types of food.) The virtue of low-insulin-index food is that it has a high ratio of satiation to calories—where here by "satiation" I mean "being satiating." With food that has a high ratio of satiation to calories, it will feel natural to stop eating before consuming too many calories. You won't have to try so hard to stop.

Here is what Nina says about counting calories:

Although we've long held on to the intuitive idea that slimming down is merely a matter of beating the math — create a caloric deficit of 3,600 calories and lose a pound of fat — the evidence has been stacking up against it for more than a century.

Since the early 1900s, medical research has shown that people do lose weight on calorie-restricted diets — in the short term. But in most cases, they quickly gain it back. Reviewing hundreds of papers on dieting published already by 1959, two researchers concluded in the AMA Archives of Internal Medicine: "Most obese persons will not stay in treatment for obesity. Of those who stay in treatment, most will not lose weight, and of those who do lose weight, most will regain it."

Moreover, the researchers found, people usually put back on more weight than they'd lost. This cruel twist is due to the fact that a person's metabolic rate slows down to accommodate semi-starvation, but it doesn't bounce back, resulting in a stubbornly depressed metabolism. To maintain that weight loss, it appears a person must restrict calories for life — a state of deprivation that, as it turns out, few humans can sustain. The two AMA authors wrote that the most common "ill effects" of constant hunger include nervousness, weakness and irritability, and, to a lesser extent, fatigue and nausea.

Yet we seem committed to the myth that weight loss is merely a matter of calories in vs. calories out. That's why it's front-page news when researchers discovered that most participants in "The Biggest Loser" reality TV show didn't maintain their new, low weight — and that six years out, several weighed more than when they appeared on the show.

If counting calories in the usual way doesn't work, what does work? Nina points to some hints:

Insufficient sleep, for instance, may impair fat loss, as one small controlled trial concluded. Not getting enough sleep also increases the hunger hormone, ghrelin, according to another study. Chronic stress also appears to stimulate ghrelin, as well as the stress hormone cortisol, which is thought to weaken the body's ability to metabolize carbohydrates.

The most promising area of obesity research focuses on the effects of eating carbohydrates. Some 70 clinical trials now show that restricting carbohydrates is a highly effective way of fighting obesity. Low-carbohydrate diets are either equally or more effective than low-calorie diets, according to an analysis in JAMA.

One of the reasons low-carb diets work is precisely that they don't require counting calories. People are allowed to eat as much as they like, so long as they keep carbohydrates low. In part because foods with protein are satiating, people on this diet don't get hungry. Their metabolism doesn't slow down, and they aren't required to sustain a state of semi-starvation.

One recent survey of some 1,500 people found that more than a third of them were able to keep off more than 20 pounds and maintain a low-carb diet for two years or more. Another study, conducted at Stanford, found that subjects successfully lost weight without monitoring calories simply by eating high-quality "real" foods and more vegetables while reducing refined carbohydrates.

The big thing that Nina is missing is the idea of fasting, or time-restricted eating. As I say in "4 Propositions on Weight Loss," the bottom line, in my book, is this:

... for a large fraction of people, fasting—combined with avoiding sugar, bread, rice and potatoes—is a powerful, not-too-painful tool for weight loss.

 

Don't miss these other posts on diet and health and on fighting obesity:

Also see the last section of "Five Books That Have Changed My Life" and the podcast "Miles Kimball Explains to Tracy Alloway and Joe Weisenthal Why Losing Weight Is Like Defeating Inflation." If you want to know how I got interested in diet and health and fighting obesity and a little more about my own experience with weight gain and weight loss, see my post "A Barycentric Autobiography."

Should the U.S. Dollar Be Weak or Strong?

Link to the article above

The words "strong" and "weak" for a currency can mislead people into thinking a strong currency is good and a weak currency is bad. That isn't right. As I was quoted by Angelo Young in the article above:

“It is not right to say in an unqualified way that a ‘strong’ dollar is good, or to say that a ‘weak’ dollar is good,” Miles Spencer Kimball, a professor of economics at the University of Colorado Boulder, told Salon.

It depends on why the dollar is strong or weak, as the sketch after the list below illustrates.

  • If the dollar gets stronger because demand for US products has increased, that is a good sign.
  • If the dollar gets stronger because the US government is borrowing a huge amount, including from foreigners who have to buy dollars to lend to the US government, that is a bad sign.
  • If the dollar gets stronger because the Fed is raising rates to keep the economy from overheating, that is appropriate. 
  • If the dollar gets weaker because demand for US products has fallen, that is a bad sign. 
  • If the dollar gets weaker because Americans are saving more and put some of the savings into foreign assets that they trade away dollars for, that is a good sign. 
  • If the dollar gets weaker because the Fed is cutting rates to bring the economy out of a recession, that is appropriate. 
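
Purely as an illustration (the dictionary, the cause labels, and the function name below are my own paraphrases of the bullet points above, not anything from the article), one can encode these assessments so that the verdict attaches to the cause of the move rather than to its direction:

```python
# A toy restatement of the cases above: the verdict depends on why the dollar
# moved, not on which way it moved. The labels paraphrase the bullet points.
ASSESSMENTS = {
    ("stronger", "higher demand for US products"): "good sign",
    ("stronger", "heavy US government borrowing, including from foreigners"): "bad sign",
    ("stronger", "Fed raising rates to keep the economy from overheating"): "appropriate",
    ("weaker", "lower demand for US products"): "bad sign",
    ("weaker", "Americans saving more and buying foreign assets"): "good sign",
    ("weaker", "Fed cutting rates to bring the economy out of a recession"): "appropriate",
}

def assess(direction, cause):
    """Look up the verdict for a (direction, cause) pair; with an unknown cause, there is no verdict."""
    return ASSESSMENTS.get((direction, cause), "no verdict: it depends on the cause")

print(assess("stronger", "higher demand for US products"))  # good sign
print(assess("stronger", "some unspecified shock"))          # no verdict: it depends on the cause
```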

Unfortunately, I don't know of a convenient, compact, vivid terminology for changes in exchange rates that makes the direction clear without seeming to assert a value judgment that doesn't necessarily follow. If the dollar strengthens, it can be called the dollar appreciating, which also sounds good. If the dollar weakens, it can be called the dollar depreciating, which also sounds bad. But strengthening/appreciation can be bad and weakening/depreciation can be good. It all depends.