American virtue is running low. Here’s how we get it back.  Part V

The rule of law depends as much on citizen virtue as on constitutional structure. And our collective civic virtue is currently under strain

By Justin Collings  Oct 2, 2021

Editor’s note: In his April 4 address at the general conference of The Church of Jesus Christ of Latter-day Saints, President Dallin H. Oaks spoke of his belief that “the United States Constitution contains at least five divinely inspired principles”: popular sovereignty, the separation of powers, federalism, individual rights and the rule of law. This is the final essay in a five-part series that addresses each of these principles.

Americans often note proudly that ours is the oldest written constitution still in force anywhere in the world. We are right to be proud, but some might quibble with our chronology. The most venerable constitution, they might argue, applies only in Massachusetts.

Drafted in 1779 and ratified in 1780, the Massachusetts constitution was perhaps the most influential of the state charters adopted between the Declaration of Independence in 1776 and the activation of the U.S. Constitution in 1789. (As an example of experimental federalism in action, the national Constitution drew liberally on the best features of state constitutions.) Like the U.S. Constitution, the Massachusetts constitution provided for a bicameral legislature, a strong chief executive (the governor) with a veto power over legislation, and an independent judiciary. The Massachusetts constitution enshrined the core principles discussed earlier in this series: popular sovereignty, individual rights and the separation of government powers.

Remarkably, this far-sighted, influential and enduring document was largely the work of a single mind and pen. They were the mind and pen of a Braintree, Massachusetts, lawyer by the name of John Adams.


John Adams was a colorful character — proud, pugnacious, cranky, opinionated, stubborn, sturdy, loveable, generous, magnanimous, fearless, talkative, blunt and honest to a fault. He loved his country passionately and was loyal through and through. He was the funniest of the founding fathers, as well as the hardest-working and best-read. To my taste, he was also the best writer. (Jefferson might have been a more elegant stylist, but I would take Adams’s pungent, potent, pictorial prose over Jefferson’s controlled cadences any day of the week.)

Adams was a deep, broad and penetrating thinker, and he thought about political philosophy until the last day of his nine-decade life. Political science, Adams believed, was the science of human happiness. That made it, in his view, the most important science of all.

In the spring of 1776, Adams pushed a resolution through the Continental Congress that all the states should draft their own, independent constitutions — a resolution that, in Adams’s view, should be regarded as the real declaration of independence. That same year, to help the states in their efforts, Adams wrote “Thoughts on Government,” a concise and incisive handbook on constitutional drafting. Aside from Tom Paine’s “Common Sense,” Adams’s tract was the most influential political pamphlet of that historic year. When Adams drafted a constitution for Massachusetts, he largely followed his own advice.

One of the remarkable features of Adams’s draft was that it outlined the why as well as the what of constitutionalism. “In the government of this commonwealth,” Part One concluded, “the legislative department shall never exercise the executive and judicial powers, or either of them; the executive shall never exercise the legislative and judicial powers, or either of them; the judicial shall never exercise the legislative and executive powers, or either of them; to the end it may be a government of laws, and not of men.”

The purpose of the separation of powers, then — as well as of other constitutional principles — was to establish the rule of law. It was to prevent arbitrary, unlimited, or tyrannical rule. It was to ensure that governors and governed alike obeyed the rules of the game.


The concept is beguilingly simple and tremendously powerful. America’s founders were familiar with Edward Gibbon’s assessment of history as “little more than the register of the crimes, follies and misfortunes of mankind.” Much of the criminality, folly and misfortune, they believed, was the work of unconstrained rulers — rulers who thought themselves above the law, acted on that belief, and got away with it.

The U.S. Constitution was designed to prevent this. Never, in an independent and unified America, would a tyrant rule with absolute sway — changing the laws whenever he pleased or flouting the laws at will. Nor would there be an American Caligula, the Roman tyrant who reportedly published the laws at the tops of tall pillars, where citizens could not read them. In the American Republic, the laws would be accessible and clear — within the reach of every citizen.

The rule of law requires that laws apply only prospectively (hence the Constitution’s ban on ex post facto laws, which retroactively criminalize behavior that was legal when performed) and that they apply generally to all citizens (hence the Constitution’s ban on bills of attainder, which single out named individuals or groups for special punishment or unfavorable treatment). The rule of law honors human freedom by creating the conditions under which all citizens can plan the course of their future lives. Where the rule of law prevails, citizens know what is legal and what is not. They know how to order their lives to comply with the laws. They know that the law applies impartially to everyone, that it has no favorites and no foes. They know that their lives are governed by the provisions of written laws, not by the whims of untamed rulers. Theirs, they know, is a government of laws and not of men.


The rule of law, to repeat, is a precondition for human freedom. But the rule of law also has preconditions of its own. John Adams knew this well, and he spent a lifetime worrying whether those preconditions were secure. He described some of them in the Massachusetts constitution itself. One was education. “Wisdom, and knowledge, … diffused generally among the body of the people, being necessary for the preservation of their rights and liberties,” Adams wrote, “… it shall be the duty of legislatures and magistrates … to cherish the interests of literature and the sciences, and all seminaries of them.” The government, Adams continued, had a duty “to encourage private societies and public institutions … for the promotion of agriculture, arts, sciences, commerce, trades, manufactures, and a natural history of the country; to countenance and inculcate the principles of humanity and general benevolence, public and private charity, industry and frugality, honesty and punctuality in their dealings; sincerity, good humor, and all social affections, and generous sentiments among the people.”

This was an ambitious legislative agenda, if ever there was one. (I, for one, would appreciate more “good humor” among today’s politicians and the people at large.) But Adams knew that although the government might “cherish” education and “encourage” civic graces, it could never compel them. The same was true of what Adams regarded as the most essential precondition of republican government. “Our Constitution,” he observed of the federal one, “was made only for a moral and religious people. It is wholly inadequate to the government of any other.”

But a government committed to religious liberty can’t compel people to be moral and religious. Nor can a government committed to liberty more generally compel the other preconditions of the rule of law. Writing in 1967, a German thinker named Ernst-Wolfgang Böckenförde described the dilemma this way: “The liberal, secular state depends on conditions that it cannot guarantee.”

Böckenförde was not using “liberal” and “secular” in a partisan or polemical way. By “liberal,” he meant only a state committed to liberty; by “secular,” he meant a state without an official church. His point was that a free state cannot survive unless its citizens are committed to freedom. But the state can’t require such commitment without ceasing to be free.

In a similar way, the rule of law cannot survive without what has been called “obedience to the unenforceable.” If citizens do not obey the law voluntarily, no number of police officers will ever be enough. But obedience to the unenforceable is, by definition, unenforceable. The rule of law requires moral commitments that the law alone cannot ensure. This is republican government’s greatest dilemma, its most enduring danger. As Böckenförde put it, “This is the great gamble that” constitutional governments “have undertaken for the sake of freedom.”


It is a gamble whose outcome depends on us. The rule of law is the grand constitutional principle that encompasses all others. But the rule of law depends as much on citizen virtue as on constitutional structure. And our collective civic virtue is currently under strain.

For America’s founding generation, “virtue” had a concrete meaning: it meant placing the common, public good ahead of selfish, private interests. Virtue, for the Constitution’s framers, meant public-spirited sacrifice.

Such virtue now seems in short supply. Perhaps it always has been. John Adams spent at least 60 years lamenting the lack of virtue among the Americans of his day. But today we seem to lack even a common sense of the civic virtue toward which we aspire. We lack not only virtue but a shared definition of virtue.

Too many today think of virtue primarily in terms of how one votes and what one (re)tweets. Too many would rather signal virtue than cultivate it. Too many engage in showy self-assertion; too few practice quiet self-denial. Virtue, in our day, has largely become a question of joining the right team — of flagging, in approved ways, that one belongs to the right side.

But being on the right side, I contend, is less important than doing the right things. Virtue is less a matter of partisan allegiance than a question of personal character.

As this series closes, let me suggest that the best way to defend and sustain the Constitution is to start close to home. Indeed, you might best start at home. For the Constitution rests upon the foundation of a host of supporting institutions — local governments, volunteer organizations, churches, schools, neighborhoods and, above all, families.

If you want to uphold the Constitution, it wouldn’t hurt to learn a bit more about it. I hope that these essays have contributed modestly toward that end. But you will serve the Constitution and the country even more valiantly if you make some personal sacrifices to inculcate in the rising generation a love for those virtues that John Adams extolled: education, art, science, literature, humanity, benevolence, industry, frugality, punctuality, honesty, sincerity, generosity, good humor and charity.

Surely, as Saint Paul proclaimed, “the greatest of these is charity.” But while we’re at it, let’s not neglect good humor. Somewhere in civic heaven, John Adams will be smiling.

Justin Collings is a professor at Brigham Young University Law School and a fellow at the Wheatley Institution.

Deseret News

**************

The Bill of Rights guarantees fundamental freedoms. Is it waning? Part IV    

If Americans generally abandon their allegiance to the paramount freedoms of the First Amendment, there will be little the Supreme Court can do to save us from ourselves

By Justin Collings  Sep 25, 2021, 10:00pm MDT

Billy Gobitis was not a troublemaker, but he was in trouble at school. Big trouble. Along with his older sister, Lillian, age 12, 10-year-old Billy had been expelled. The year was 1935; the place was Minersville, Pennsylvania; and the Gobitis siblings, in defiance of a school district requirement, had refused to salute the American flag. Other students threw rocks at them. Billy’s teacher tried to physically force his arm into the salute position (in those days, this meant extending the arm upward at a forty-five-degree angle with the palm turned up, a gesture with eerie similarity to the Roman salute, then popular in Fascist Italy and Nazi Germany).

Billy was a respectful boy, and he felt he should explain his actions in a letter to the school board. “I am a true follower of Christ,” the young Jehovah’s Witness explained, scrawling out a determined, if somewhat uneven, schoolboy cursive. “I do not salute the flag, not because I do not love my country, but I love my country and I love God more and I must obey his commandments.” One of those commandments, Billy believed, forbade worshiping idols, and he thought the flag salute fell afoul of that command.

Billy’s family suffered for their convictions. The community boycotted their grocery store. Some family members were physically attacked. The children were placed in a private school, which the parents could barely afford. To offset the cost of private tuition, Billy’s father brought a lawsuit against the Minersville School District. He took his case all the way to the Supreme Court.

And lost badly. By a vote of 8 to 1, the justices affirmed the school district’s decisions. By then it was 1940, and Billy Gobitis was 15. Europe was at war, and America was in turmoil. Requiring students to salute the flag and recite the Pledge of Allegiance was the norm at schools across the country. The court explained that such practices helped “to evoke that unifying element without which there can ultimately be no liberties, civil or religious.” Exempting the Gobitis children “might cast doubts in the minds of the other children which would themselves weaken the effect of the exercise.” Besides, the school district’s countervailing interest was of the highest order: “National unity,” the justices observed, “is the basis of national security.”

As for the Gobitis family’s claims that their religious liberty had been infringed, the court explained that Billy and Lillian’s parents remained free to teach them whatever religious principles they pleased — at home.

The decision was a bitter disappointment for the Gobitis family. For Jehovah’s Witnesses around the country, it proved devastating. Branded by the country’s highest court as agents of disloyalty, Witnesses faced a flurry of ferocious persecution. From Maine to Wyoming, Witnesses were beaten and jailed, tarred and feathered, their sanctuaries burned to the ground. The ACLU told the Department of Justice that 1,500 Witnesses had been physically attacked in 300 communities nationwide. “They’re traitors,” explained one sheriff as Witnesses were driven out of his community; “the Supreme Court says so. Ain’t you heard?”

The justices read reports of these events with horror and alarm. Several began rethinking their earlier decision. Some stated publicly that they had gotten it wrong. In 1943, the chance came to revisit the issue in a strikingly similar case. This one involved a West Virginia Witness, also expelled for refusing to salute the flag.

The court announced its decision in the case of West Virginia v. Barnette on June 14, 1943. Six justices voted in favor of the Witness and against the precedent from the Gobitis case. Somewhere, 18-year-old Billy Gobitis must have been cheering. Appropriately enough, it was Flag Day.

The court’s opinion, by Justice Robert Jackson, is perhaps the most eloquent statement of First Amendment freedoms in Supreme Court history. Jackson was a gifted prose stylist, and in the Barnette case he rose to the occasion with arguably the greatest opinion he ever wrote.

The opinion concluded with a ringing climax. “If there is any fixed star in our constitutional constellation,” Jackson wrote, “it is that no official, high or petty, can prescribe what shall be orthodox in politics, nationalism, religion, or other matters of opinion or force citizens to confess by word or act their faith therein.” By forcing students to salute the flag in violation of their religious convictions, the court held, school districts around the country were unconstitutionally prescribing orthodoxy.

But shouldn’t unelected judges defer to decisions made by the people’s democratically elected representatives? Not when fundamental rights are at play. “The very purpose of a Bill of Rights,” Jackson explained, “was to withdraw certain subjects from the vicissitudes of political controversy, to place them beyond the reach of majorities and officials and to establish them as legal principles to be applied by the courts. One’s right to life, liberty, and property,” Jackson continued, “to free speech, a free press, freedom of worship and assembly, and other fundamental rights may not be submitted to vote; they depend on the outcome of no elections.”


Most Americans know this in their bones, even if they know little else about the Constitution. There are some rights that no official can infringe, some spheres of life where government power must not intrude. Popular sovereignty is the foundation of our constitutional order, but there are some things even more sacred than collective self-governance, some rights beyond even democracy’s reach. The Declaration of Independence calls such rights “unalienable”; they cannot be surrendered or sold. Legitimate government exists to “secure” them. No government is authorized to subvert them.

As I have suggested earlier in this series, the most basic protection for individual rights lies in the Constitution’s structure — in its reciprocal checks and balances, as well as in the separation of powers within the national government and between the national government and the states. But these structural protections, though powerful, are incomplete. The Bill of Rights — the first 10 amendments to the Constitution, supplemented and applied against the states by the 14th Amendment — exists to plug the gaps.

On the whole, it has plugged them remarkably well. There have been lapses, to be sure. It took the Supreme Court too long to acknowledge the full promise of the 14th Amendment, and it has taken the country as a whole too long to make Bill of Rights guarantees a reality for all Americans. But focusing too fixedly on past failures or infamous Supreme Court rulings can obscure the powerful if mundane reality of millions of Americans, across multiple generations, speaking what they think, publishing what they please, and worshiping (or not) as the Holy Spirit moves them.

This mundane reality has been possible, however, not only because of what the Bill of Rights says or what the Supreme Court rules, but also and especially because of what the American people have internalized: an unshakable determination to defend and exercise their foremost freedoms, and to allow others to do the same. For most of our modern history, the Bill of Rights has been a source of unity and consensus — the preeminent object of our constitutional patriotism.

Consensus around the Bill of Rights has been broad and deep, so much so that in recent decades the fiercest controversies regarding individual rights have played out at the margins. In 1965, for instance, the Supreme Court found in the “penumbras” of the Bill of Rights an implied right to privacy. Eight years later, this ostensible right to privacy became the basis of the court’s ruling in Roe v. Wade that the Constitution guarantees a robust right to abortion. Roe has proved a source of intense and enduring controversy, as have rulings related to a host of other hot-button issues — from affirmative action to same-sex marriage, from assisted suicide to corporate political speech. These debates have been intense, not only because the societal stakes are so high, but because it is not always clear (to put it mildly) how (or if) the Constitution’s 18th-century text applies to these 21st-century problems.

In some respects, of course, the Bill of Rights does reflect the particular concerns of the 18th century. It’s nice to know, for instance, that the government can’t require us to quarter troops in peacetime, but most of us don’t often think about the Third Amendment. In other respects, however, the Bill of Rights has aged astonishingly well. The Constitution’s protections with respect to criminal procedure remain remarkably relevant, and in my view the essential guarantees of the First Amendment — religious liberty, the freedoms of speech and of the press, the right of peaceful protest — are as timeless as they are priceless.

For a long time, they were also uncontroversial, despite debates at the margins. Running through all the fierce, rights-related debates of the late 20th and early 21st centuries was a set of common convictions about the right of all individuals to think for themselves and to express their thoughts in words of their own choosing. Throughout the stormy constitutional controversies that have attended America’s culture wars, contending factions all steered by the “fixed star” that Justice Jackson extolled back in 1943.

Today we seem to be losing sight of that star. In recent years, too many of us have reneged on our common commitments to allow others to voice unpopular opinions, to live the fundamental tenets of their faith and to gather together to call for reform. Too many have forgotten that protected protests must be peaceful, and that a vigorous press is essential to democratic freedom. Too many have forgotten the wisdom, spoken long ago by Supreme Court Justice Louis Brandeis, that “the remedy to be applied” to false or wicked speech “is more speech, not enforced silence.”

I believe that this applies in principle to private actors as well as to the government. Freedom survives and flourishes through our collective commitment to the principles of freedom. If Americans generally abandon their allegiance to the paramount freedoms of the First Amendment, there will be little the Supreme Court (or anyone else) can do to save us from ourselves. Nor will it help much if we preserve our constitutional liberties against government overreach, only to surrender them to the prying eyes or censorious policies of powerful corporations.

True, the Bill of Rights doesn’t bind private corporations the way it binds the government. But our society will be stronger and our liberty more secure if corporations willingly and eagerly embrace the spirit of our First Amendment freedoms. In an age increasingly dominated by technological titans, the gravest threats to freedom might well stem from private rather than public actors. Perhaps that is already the case. In any event, citizens should be vigilant and legislators active. In a time when one hears much talk of the imperatives of corporate morality, perhaps the most urgent imperative of all is for corporations and citizens alike to re-embrace the spirit of the First Amendment.

The Bill of Rights is now 230 years old. Although its promises have sometimes been flouted or applied selectively, it has served us, on the whole, exceedingly well. But it faces novel challenges and unprecedented strains. In the face of those challenges, it is past time for us to renew our acquaintance with, and our commitment to, these precious guarantees. May all Americans, once more and forever, steer by the fixed star of our first and foremost freedoms.

Justin Collings is a professor at Brigham Young University Law School and a fellow at the Wheatley Institution.

Deseret News

***********************

What ‘rights’ do states really have under the Constitution? Part III   

The dividing line is fiercely contested and always has been. But wherever one draws the line, it makes more sense to talk about ‘federalism’ than ‘states’ rights’

By Justin Collings  Sep 18, 2021

Editor’s note: In his April 4 address at the general conference of The Church of Jesus Christ of Latter-day Saints, President Dallin H. Oaks spoke of his belief that “the United States Constitution contains at least five divinely inspired principles”: popular sovereignty, the separation of powers, federalism, individual rights and the rule of law. This essay is the third in a five-part series that will address each of these principles.

An entire region felt itself besieged. The cards of power seemed stacked against a tiny, beleaguered cluster of states. Federal policy, pursued by a president from another part of the country, was wrecking the region’s interests. Some of the region’s leading statesmen held a convention to coordinate a united response. Firebrands talked of secession. The region’s handful of states, they insisted, might abandon the American Union and forge a regional confederacy all their own.

The year was 1814, not 1860, and the place was Hartford, Connecticut, not Charleston, South Carolina. The aggrieved partisans were New England Federalists enraged by James Madison’s war with England, not Southern Democrats alarmed by Abraham Lincoln’s election. Fortunately for the country, the threat of secession from the Hartford Convention of December 1814 was not serious. The firebrands were swiftly sidelined. Wiser heads prevailed. In time, news of Gen. Andrew Jackson’s victory at the battle of New Orleans vindicated President Madison’s administration of the War of 1812 and made the Hartford delegates look disloyal. But for a brief moment, a band of northern discontents had flown the flag of sovereign states’ rights.

“States’ rights” is a phrase with baggage. For some, it has a dishonorable past and a malodorous smell. Its banners were hoisted by defenders of slavery in the 19th century and by champions of segregation in the 20th century. To many modern ears, talk of states’ rights has the ring of a racist dog whistle.

This response is understandable, but three qualifications are in order.

The first is that “states’ rights,” from the very beginning, was a two-edged sword. Yes, some Southerners invoked states’ rights to protect slavery, but other Southerners — including James Madison and Thomas Jefferson — invoked states’ rights to denounce the Sedition Act of 1798, a flagrant violation of the First Amendment, and Northern abolitionists invoked the principle to protest federal fugitive slave laws.

By contrast, defenders of slavery were only fair-weather proponents of states’ sovereignty. Their invocations of states’ rights were opportunistic and unprincipled. “Whenever a question arose of extending or protecting slavery,” observed the eminent historian Henry Adams, “the slaveholders became friends of centralized power.” States’ rights was then the mantra of the free states; “Massachusetts appealed to this protecting power as often and almost as loudly as South Carolina.” In other words, nothing about states’ rights was ever inherently pro-slavery.

The second qualification is that assertions of states’ rights are least persuasive when individual constitutional rights are at play. Madison presciently predicted that the greatest threat to individual freedoms would come from the states, not the federal government. Our early history proved Madison right, and the framers of the 14th Amendment responded by barring the states from infringing fundamental rights or from treating citizens unequally. Lamentably, subsequent Supreme Court decisions betrayed the 14th Amendment’s original promise. (Plessy v. Ferguson, which approved the odious principle of “separate but equal,” is only the most notorious example.) But by its plain terms, the 14th Amendment already barred the abhorrent practices that 20th-century segregationists defended by spuriously asserting states’ rights.

The final qualification is that “states’ rights” is a misnomer. The Constitution doesn’t grant “rights” to the states in the same way it grants rights to individuals. There is no “states’ rights” clause. Yes, the 10th Amendment makes explicit what the Constitution’s entire structure implies: “The powers not delegated to the United States by the Constitution, nor prohibited by it to the States, are reserved to the States respectively, or to the people.” But this is quite different from an affirmative grant of power or a positive protection of rights. Under the 10th Amendment, states’ powers are residual. The amendment operates by subtraction. By its terms, state governments wield only those powers that the people have neither granted to the federal government nor retained for themselves. State power begins where federal power ends.

The boundary between the two is blurry. The dividing line is fiercely contested and always has been. But wherever one draws the line, it makes more sense to talk about “federalism” (the balance of power between the federal government and the states) or “state autonomy” (constitutional limits on the federal government’s power to curb or constrain states) than to revive a fraught phrase like “states’ rights.”

Fair enough, you might say. But what exactly is the nature of “federalism” under the Constitution? What are the precise contours of “state autonomy”?

Here history helps. In 1789, when George Washington swore the oath of office as president of the United States, the federal government was tiny. Washington oversaw a much larger staff as a planter presiding over Mount Vernon than as president presiding over the executive branch.

Today things look very different. The federal government employs more than 2 million civilian workers and commands a budget that runs into the trillions of dollars. Today’s central government is a colossus of unprecedented scope. It resembles Behemoth and Leviathan, the legendary beasts of the Bible.

Unsurprisingly, the federal government’s activities have expanded with its size. Over time, this growth has raised persistent questions about the scope of federal power. For the most part, federal power has been a one-way ratchet. With the Supreme Court’s (occasionally reluctant) approval, the federal government has penetrated more and more spheres of American life. There are few signs that this expansion will slow soon.

This leads some to cheer and others to jeer. Even skeptics of federal power should acknowledge that the original Constitution created a central government of extensive powers. The Constitution empowered the federal government to tax and to spend, to raise armies and wage war, to regulate commerce and preempt conflicting state laws. It also conferred power to pass all laws “necessary and proper” to the exercise of enumerated powers. Federal powers are thus implied as well as explicit. They reach means as well as ends.

These principles were codified in landmark decisions by Chief Justice John Marshall during the 1810s and 1820s. But it wasn’t until the middle decades of the 20th century that the modern administrative state truly strained all substantive limits on federal power.

The hero (or villain) of this story is the commerce clause, which allows Congress “to regulate Commerce … among the several States.” The Supreme Court has always understood this language expansively, but in the aftermath of Franklin Roosevelt’s New Deal, the commerce power assumed unprecedented scope.

In 1942, in the case of Wickard v. Filburn, the Supreme Court unanimously approved an agricultural regulation that capped how much wheat a farmer could produce — even though the farmer in question grew wheat only to feed his livestock and family. Although farmer Filburn’s wheat never left his home state (or indeed, his own farm), the justices reasoned that any wheat grown anywhere in the country could affect the price of wheat in the interstate market. Even private production for home consumption was therefore part of interstate commerce, and Congress could validly regulate it.

The Wickard case gave Congress and regulators a green light, and they pressed the gas with gusto. The next 50 years witnessed what one scholar called “the rise and rise of the administrative state” — the growth and growth of federal power. I sometimes call it, mixing metaphors, the Death Star Pac-Man Commerce Clause.

For one brief period, the Supreme Court ruled that Congress still couldn’t regulate the states as states — couldn’t, for instance, control how states treat their own employees — but the justices soon changed course, ruling that the states’ recourse against federal overreach lay not with the courts but with the political process. States could resist federal encroachment mostly by electing senators willing to hold the beast at bay.

Around the end of the 20th century, and to the astonishment of many constitutional scholars, the court renewed its commitment to limiting federal power. In a series of landmark judgments, the court ruled that Congress cannot invoke the commerce clause to regulate noneconomic activity (such as gun possession) or to compel economic activity (such as purchasing health insurance). The justices affirmed limits on when states can be sued, and they ruled that Congress cannot require states to pass laws or enforce federal legislation. Nor, the court held, can Congress attach conditions on federal funding to states so drastic that they amount to coercion.

These were all important decisions, though their overall practical impact was modest. Other recent decisions — including expansive readings of the commerce clause to allow federal regulation of private drug consumption, as well as a sweeping interpretation of the necessary and proper clause — have pointed in the opposite direction.

The current court seems sympathetic to concerns about state autonomy, but no court decision is likely to significantly limit federal power. The real check on federal encroachment remains a political check. Concerned citizens should vote for candidates committed to state autonomy. Alarmed state officials should refuse to enable the federal juggernaut — even when it offers them goodies. Worried states should be wary of bureaucrats bearing federal subsidies.


But how worried should we be? Indeed, why should we care about federalism at all?

We should care about federalism, for one thing, because the Constitution commands it. As Chief Justice Marshall observed long ago, “the enumeration (of constitutional powers) presupposes something not enumerated.” Fidelity to the Constitution demands meaningful outer limits on federal power.

More pragmatically, we should care about state autonomy because autonomous states can experiment. Nearly 90 years ago, Supreme Court Justice Louis Brandeis noted how “a single courageous State may … serve as a laboratory; and try novel social and economic experiments without risk to the rest of the country.” Allowing individual states to serve as “laboratories of democracy” would, Brandeis believed, allow the rest of the country to see what works and what doesn’t. As states experiment and learn from one another, governance improves everywhere.

Finally, federalism lowers the stakes of national politics. Although most Americans identify as Americans first and state citizens second, variation among the states — particularly cultural variation — remains significant and sometimes stark. Utah and Connecticut are very different places — as are Massachusetts and Mississippi, Texas and Vermont. Apart from a crucial core of fundamental rights enshrined in the Constitution itself, there is no need for a one-size-fits-all, national solution to every issue under the sun. The scalding temperature of our national politics would drop dramatically if, on a host of issues, the federal government (including the federal judiciary!) would allow the states to live and let live. (States, of course, should allow one another the same privilege.) As things stand, partisans of all stripes scream for a federal response to virtually every divisive issue.

Sometimes, to be sure, partisans rediscover the virtues of federalism after failures in national elections. They like local solutions when they lack national power. “We are all federalists,” I once heard a wise judge say, “when we are losing.”


I believe that we should all be federalists at all times — win or lose, rain or shine, whoever’s foot now bears the boot, whoever’s ox has just been gored. We should be federalists as a matter of constitutional principle and prudent policy. When power is devolved to the government units closest to questions of concern and most capable of resolving them, Americans receive an unparalleled, experiential education in the art of self-government. And the ties that bind us together as a union will be stronger if we don’t strain or snap them in the quixotic pursuit of ideological purity and national conformity. Within proper limits, federalism makes for better governance, calmer national politics, and brighter prospects for government of the people, by the people, and for the people.

Justin Collings is a professor at Brigham Young University Law School and a fellow at the Wheatley Institution.

Deseret News

**************************************

Separation of powers is supposed to stop tyranny. But is it eroding?  Part II

Separation of powers is arguably the defining feature of the U.S. Constitution — the beating heart of our constitutional design

By Justin Collings  Sep 11, 2021, 10:00pm MDT

Editor’s note: In his April 4 address at the general conference of The Church of Jesus Christ of Latter-day Saints, President Dallin H. Oaks spoke of his belief that “the United States Constitution contains at least five divinely inspired principles”: popular sovereignty, the separation of powers, federalism, individual rights, and the rule of law. This essay is the second in a five-part series that will address each of these principles.

Harry Truman had a problem. Two years earlier, he had sent American troops across the Pacific to defend South Korea against a communist invasion from the North. He did so without a formal declaration of war or any other authorization from Congress. This might pose a constitutional problem, but Truman had a concrete problem as well. Thousands of American soldiers, on his watch, were fighting a bloody overseas war. To win, they needed tanks, guns and other weaponry. And to procure such armaments, Truman needed steel.

That was his problem. In the spring of 1952, a national labor strike threatened to halt production at America’s steel mills — a potential disaster, Truman believed, for the war effort in Korea. If the strike went forward as planned, steel production would effectively cease. Congress declined to intervene. Truman found himself facing a crisis.

But Truman was president of the United States — the man whose desktop placard proclaimed, “The Buck Stops Here.” On April 8, 1952, the buck stopped resoundingly when Truman signed an executive order directing the secretary of commerce to seize the nation’s steel mills and ensure that production continue. “These are not normal times,” Truman explained in a nationwide broadcast. “I have to think of our soldiers in Korea … the weapons and ammunition they need.”

To his staff, Truman justified his action with a homespun theory of constitutional power. “The president,” he told them, “has the power to keep the country from going to hell.”

Less than two months later, the Supreme Court corrected the president’s view. In the case of Youngstown Sheet & Tube Co. v. Sawyer, a majority of six justices declared the steel seizure unconstitutional. The justices in the majority offered different reasons for their ruling, but they agreed on two core propositions: First, emergencies don’t create additional executive power; and second, the president can’t seize private property without congressional approval.

Truman was livid. Even Justice Tom Clark, his own former attorney general, had voted against him. The president was somewhat reconciled only when Justice Hugo Black, author of the court’s main opinion, invited the president and the other justices to a party at his elegant antebellum home in Old Town Alexandria. As the evening began, recalled Justice William O. Douglas, the president was “a bit testy.” But when Black passed around the bourbon and canapes, Truman relaxed.

“Hugo,” he said, “I don’t care much for your law, but, by golly, this bourbon is good.”

Separation of powers

The steel seizure controversy highlights fundamental features of American law. It was, at bottom, a dispute about where Congress’ power ends and the president’s begins. With the Supreme Court acting as umpire, the case offers a classic illustration of the separation of powers.

“Separation of powers” is a well-worn label for an old but essential concept. Separation of powers is arguably the defining feature of the U.S. Constitution — the beating heart of our constitutional design. It has done more to secure and maintain American liberty than anything found in the Bill of Rights or anywhere else in the Constitution.

“Every banana republic,” observed the late Justice Antonin Scalia, “has a bill of rights.” What they don’t have is an effective separation of governmental powers, which is why the rights they solemnly proclaim aren’t worth the paper on which they are printed.

Separation of powers was the American founders’ principal response to the critical dilemma they faced. Within their lifetime, the framers had experienced two varieties of misrule: centralized tyranny under George III and something like anarchy under the Articles of Confederation. Prior to the Revolution, the king and Parliament had wielded too much power; after the Revolution, the confederation Congress possessed too little. The Constitution’s framers sought a third way — a happy medium between oppression and chaos; a Goldilocks government between the two extremes.

The pursuit of such a balance had been political philosophers’ holy grail for millennia. It was the cardinal question of constitutional design: How do you grant the government enough power to govern but not enough to oppress? How do you empower majority rule without imperiling individual rights?

“In framing a government which is to be administered by men over men,” wrote James Madison, “the great difficulty lies in this: you must first enable the government to control the governed; and in the next place oblige it to control itself.”

The Constitution’s response was to divide the power it bestowed. It did so in two directions: vertically, between the federal government and the states; and horizontally, among the branches of the federal government. The phrase “separation of powers” usually refers to the horizontal split, which is the subject of this essay. The vertical split is usually called federalism, the topic of the next essay in this series.

Branches of government

The three branches of the federal government — the legislative, the executive and the judicial — play different roles but all have one function in common: Each branch must work to keep the other branches at bay. This was Madison’s core insight.

In perhaps the most famous passage he ever wrote, Madison observed that “the great security against a gradual concentration of the several powers in the same department consists in giving to those who administer each department the necessary constitutional means and personal motives to resist encroachments of the others. … Ambition,” Madison continued, “must be made to counteract ambition. The interest of the man must be connected with the constitutional rights of the place.”

How does that work in practice? The basic mechanism is captured in another familiar phrase: checks and balances. Under the U.S. Constitution, the separation of powers is not complete. Instead, the branches are both independent (such that no one branch can dominate the others) and interdependent (such that each branch constrains the others). The Constitution parcels its powers in such a way that each branch has a vested interest in checking the other branches and preserving its own independence.

Consider, for example, the powers to make war and conduct foreign policy. Under the old British constitution — a hodgepodge of customs and laws rather than a single written document — decisions about war and peace were among the so-called prerogative powers of the king.

Prerogative powers fell within the crown’s special purview, essentially unchecked by Parliament. The U.S. Constitution, by contrast, carefully divided the traditional prerogative powers between Congress and the president, and it assigned the federal courts to police the boundary between the two.

The president, for instance, is commander in chief of the armed forces, but only Congress can declare wars or raise armies to fight them. As commander in chief, the president may conduct military operations, but only Congress can fund them. The president may negotiate treaties, but a Senate supermajority (two-thirds) must ratify them.

That, at least, was the original design. Modern practice looks rather different. At least since Truman’s unilateral actions in Korea, presidents of both parties have repeatedly waged wars without congressional approval (though clever executive branch lawyers often claim to find a legal basis for such wars in obscure corners of musty statutes). Each unilateral adventure overseas supplies a new precedent, which future presidents invoke to justify unilateral adventurism of their own.

A similar dynamic is at work with respect to treaties. Getting two-thirds of the Senate to approve a treaty is hard work. In a hyperpartisan age, some might think it impossible. Here again, presidents of both parties have skirted the two-thirds requirement by signing “executive agreements” with the heads of foreign states. Executive agreements are functionally identical to treaties — they waddle like treaties, they swim like treaties, they quack like treaties — but presidents have gotten away with their gambits by giving such pacts a different name.

Inflated presidential powers are by no means limited to war and foreign affairs. In a recent book, Saikrishna Bangalore Prakash, an expert on presidential power at the University of Virginia Law School, has argued that in one realm after another “our presidents take actions not traceable to any plausible reading of the original Constitution.”

Such actions include unilaterally declaring war and contracting treaties, but also spending money without congressional appropriation, making federal law courtesy of congressional delegations and spurious readings of existing law, ignoring laws that restrict the president’s use of the military, and amending congressional laws or even the Constitution simply by repeatedly violating them. On Prakash’s telling, trivial presidential usurpations have paved the way for increasingly significant usurpations until the modern presidency bears little resemblance to the founders’ design.

Congress, moreover, is complicit in the creep — in part because roughly half of Congress is usually beholden to the president as the leader of their party, in part because many members of Congress nurture presidential ambitions of their own. For nine decades and more, Congress has delegated vast powers to administrative agencies, which do exponentially more lawmaking than Congress ever does.

Most American law is made by agencies that answer to the president rather than the American people. In recent decades, Congress has sat idly by while presidents, by virtue of executive orders, increasingly make laws rather than enforce them. The tide has been swelling for years; it shows no signs of ebbing soon.

Defenders of the inflated modern presidency and its sprawling administrative state insist that they are required by the imperatives of governance in a complex, modern, industrial-technological era. The Constitution’s 18th-century letter must yield, so the argument goes, to 21st-century necessities.

One should view such arguments with caution. A Constitution designed to endure for centuries should, of course, be applied flexibly to changing circumstances. But tyrants in every age have pleaded necessity; authoritarians always invoke emergencies, exigencies or the latest crisis to justify assuming additional powers.

The Constitution’s framers defined tyranny as the concentration of powers in one set of hands, and I see no reason to revisit that definition today. The Constitution’s unique combination of independent branches and interdependent checks remains the most powerful means of preventing such a concentration.

Fortunately, a measure of interbranch independence persists, and many checks remain in place. What’s more, the remedy for most contemporary imbalances lies squarely in Congress’s hands. Modern presidents wield as much power as they do largely because modern Congresses let them get away with it. But in most instances, legislators could reassert their constitutional powers simply by passing laws. Congress retains the power to restrain the presidency, and Congress still answers to voters.

“A dependence on the people,” Madison wrote in Federalist 51, “is no doubt the primary control on the government; but experience has taught mankind the necessity of auxiliary precautions.” Madison was talking about the separation of powers. Perhaps the time has come for We the People to exercise that primary control by insisting with our votes that candidates take such auxiliary precautions seriously.

Justin Collings is a professor at Brigham Young University Law School and a fellow at the Wheatley Institution.

Deseret News

*****************************

What do you do when democracy is too much of a good thing? Part I

The answer is in the U.S. Constitution’s inspired balance between popular rule and centralized power

By Justin Collings  Sep 4, 2021

Editor’s note: In his April 4 address at the general conference of The Church of Jesus Christ of Latter-day Saints, President Dallin H. Oaks spoke of his belief that “the United States Constitution contains at least five divinely inspired principles”: popular sovereignty, the separation of powers, federalism, individual rights, and the rule of law. This essay is the first in a five-part series that will address each of these principles.

December 1789. Thomas Jefferson had been home less than three weeks, and his country was calling again. As United States minister to Paris, Jefferson had relished the sparkle of French wit, the savor of French cuisine and the glories of the Gallic countryside. He had also witnessed the opening scenes of the French Revolution.

Now he was back in his native Virginia, just getting settled in his beloved Monticello, when a letter arrived from New York. It was from a fellow Virginian, George Washington — the newly installed first president of the United States of America. Washington wanted Jefferson to serve as his secretary of state. Reluctantly, Jefferson accepted the chief executive’s summons.

Before he left for New York, the infant nation’s temporary capital, Jefferson received a special salute from his neighbors — the people of Albemarle County, Virginia. In grateful response, Jefferson highlighted in just a few lines what for him had been the core meaning of the American Revolution.

“We have been fellow-laborers and fellow-sufferers,” Jefferson observed, “and heaven has rewarded us with a happy issue from our struggles. It rests now with ourselves alone to enjoy in peace and concord the blessings of self-government, so long denied to mankind: to show by example the sufficiency of human reason for the care of human affairs and that the will of the majority, the natural law of every society, is the only sure guardian of the rights of man.”

The blessings of self-government. That, for Jefferson, was what the colonial rebels had been fighting for during eight long years, from 1775 to 1783; and it was what the recently ratified federal Constitution had been designed to secure.

In the summer of 1776, in the most famous paragraph he ever penned, Jefferson proclaimed the “self-evident” truth that “governments … deriv[e] their just powers from the consent of the governed.” Four score and seven years later, Abraham Lincoln described the country’s deadliest crisis as a test of whether “government of the people, by the people, for the people” could “long endure.”

For both Jefferson and Lincoln, then, “America” was an experiment in self-government — in what political theorists called “popular sovereignty.”

The stakes of that experiment were colossal. It encompassed the fate of freedom, not only in America but around the globe. Jefferson and Lincoln believed with many others that the United States had mounted the stage of human history — and that the entire world was watching.

But what did these high-sounding phrases mean? What is self-government or popular sovereignty? Who are “the People” invoked in the Constitution’s preamble? And how can they govern themselves?

The notion of popular rule has ancient roots. It was a central — and hotly contested — concept for ancient Greek philosophers and politicians alike. Among the Greeks, rule by the people (the demos, from which we get our modern term democracy) meant two fundamental things. It meant that the people could both choose their own rulers and hold those rulers to account. Some Greek thinkers used a chilling term to describe the people’s rule. The people, they said, were a political society’s proper tyrannos. The demos, that is, was a tyrant.

Tyrant, for these writers, was a neutral term. But it had, then as now, a darker side. It was the people of Athens, after all, who condemned and executed Socrates.

For centuries after the golden age of Greece, thinkers and statesmen warned of the risks of popular rule. In the early days of the French Revolution, one writer observed that the people “is essentially credulous; and, in its moments of fury, it uses ostracism against a great man. It wishes the death of Socrates, bewails it the next day, and a few days later dresses altars for him. The people,” he concluded, “does not know how to govern without passion!” (As if to prove his point, the author of these words, Jean-Baptiste Salle, later had his head severed from his shoulders before a gaping crowd in the streets of Bordeaux.)

Did this mean that the people shouldn’t govern at all? Not necessarily. The key was to distinguish between popular sovereignty and day-to-day governance. The people could govern themselves by delegating lawmaking powers to their chosen representatives. Self-government didn’t require actual governing. It proceeded via representation. It required only what Jefferson called “the consent of the governed.”

The rub was how to secure that consent — how to ensure that representatives pursue the people’s interest, rather than their own, and that legislation reflect the people’s considered wishes, not just momentary passions. This was the conundrum, above all others, that perplexed the framers of the U.S. Constitution.


Following the Declaration of Independence in 1776, each American state adopted a constitution of its own. Many state constitutions were strikingly democratic. Pennsylvania was governed almost entirely by a single-house legislature. Most state legislators served for very short terms. “Where annual elections end,” ran a common saying, “tyranny begins.”

Voters thus had frequent occasion to punish legislators who displeased them, and they made the most of that opportunity. Constantly fearing for their jobs, legislators responded swiftly to shifts in the popular mood. This led them to adopt various measures that were politically popular but fiscally reckless. States lavishly printed paper currency, commanded creditors to accept the worthless cash, imposed punitive trade policies against their sister states, and left their wartime debts unpaid.

Some observers thought democracy in America had become too much of a good thing. The states, they believed, had swung, like a pendulum, from one extreme to the other. Before the Revolution, there had been too little official responsiveness to popular pressures; now, perhaps, there was too much. Before the Revolution, there had been too much centralized power; now, it seemed, there was too little.

The delegates to the Philadelphia Convention of 1787 sought to strike a happy balance. They aimed to craft a government that would derive its powers from the people and secure the consent of the governed, but also enjoy some independence from shifting popular passions. It was in this respect that Jefferson’s farewell to his Virginia neighbors was incomplete. The new country would not be ruled by “the will of the majority” at any given moment, but by the will of the majority over time.

Hence the Constitution’s sophisticated clockwork — a staggered sequence of elections and terms of varying length. Representatives would serve for two years; presidents for four; senators for six; judges for life. Biennial elections would keep the government accountable, but no single election would radically change its composition. To effect dramatic change, a political movement would need to win repeatedly over time. It would require the people’s enduring approval.

None of that would matter, of course, unless the proposed Constitution acquired the people’s immediate approval through ratification. By its own terms, the Constitution would enter into force only if conventions in nine states ratified it. The delegates in Philadelphia had spoken in We the People’s name. It was now the People’s turn to speak for themselves.

By modern standards, the process by which the Constitution was ratified was intolerably exclusive. Few Black Americans or other people of color — and no women — voted for delegates to the state ratifying conventions. But modern standards have been shaped by impulses and ideals that ratification helped unleash. By the standards of the late 18th century, the process was astonishingly democratic — almost breathtakingly inclusive. Property qualifications for voting — and for service as delegates — all but evaporated. Up and down the eastern seaboard, more freemen were allowed to vote or stand for election than ever before anywhere in the world.

“True,” writes Akhil Amar, a law professor at Yale, the ratification process “fell far short of universal suffrage as modern Americans understand the idea, but where had anything close to universal suffrage ever existed prior to 1787?”

The answer, of course, is nowhere. More than two centuries later, universal suffrage is an unquestioned premise across much of the planet. But this fact owes much to the example of the U.S. Constitution — including subsequent amendments that banned slavery and gave the vote to men and women of all races. The ratification process had shortcomings, to be sure. But we can identify those shortcomings today largely thanks to standards the original Constitution helped create.

Ratification, of course, involved more than voting. It entailed an unprecedented process of public deliberation. Oceans of ink were spilled on mountains of paper as armies of essayists and orators, poets and pundits made their cases for or against the proposed Constitution.

The most brilliant contributions to this debate — the seven dozen Federalist essays, mostly composed by Alexander Hamilton and James Madison — have become an enduring classic of political philosophy. But there were many lesser lights that, for their time and season, burned brightly and intensely.

It was a remarkable conversation, one that led not only to the adoption of the Constitution in 1789, but to the enactment of the Bill of Rights two years later. The Constitution enshrined the principle of popular sovereignty; but the ratification process embodied it.

That process wasn’t always edifying. The years in which the Constitution was drafted, debated, and ratified were years of searing political conflict and intense polarization. It was an age of bluster and invective, character assassination and fake news. Those years witnessed shenanigans in statehouses and shady tactics at the polls.

They saw libel in the newspapers and violence in the streets. They were years, in short, not entirely unlike our own.

But glimmering through the gloom there glowed a civil and articulate few — dedicated citizens committed to reasoned discourse and public deliberation. These were the forgers of an American dialogue, the founders and framers of a national conversation. Over time, the conversation has become more inclusive. It is the richest heritage of the founding era.

The survival of self-government requires that the conversation continue. We must defend, in our day, the rights of all to engage in that ongoing dialogue, even if they or their views are unpopular. We must collectively carry the conversation forward with unflagging civility and mutual respect, with a passionate democratic gusto, and in an invincible spirit of freedom.

Justin Collings is a professor at Brigham Young University Law School and a fellow at the Wheatley Institution.

Deseret News