History of American Law, 2nd Edition, by Lawrence M. Friedman
THE REPUBLIC OF BEES
In 1776, the colonies declared themselves independent. The bitter war that followed ended in an American victory. Peace, of course, raised as many questions of government as it answered. A plan of government is a plan for distribution of the power and wealth of a society. The choice of system, then, is no idle exercise in political theory. How to plan the new American government was the major policy issue of the late 18th century. The first grand scheme was embodied in the Articles of Confederation. It proved unsatisfactory to powerful circles in the country. After the failure of the Articles, a federal Constitution was drawn up in 1787 and ratified by the states the following year.
Each colony, too, underwent its own revolution. Colonies became states, and embarked on new courses of action with new problems and new programs. First, they had to fight a war and patch up domestic disruptions. All this called for a major outburst of lawmaking. In Pennsylvania, for example, a constitutional convention, in 1776, declared a general amnesty and established a new form of government. Old officials were replaced by men loyal to the Revolution. The ordinary business of government was to continue, where possible; and the emergencies of war had to be coped with. In October 1777, British troops "penetrated into [the] state, and after much devastation and great cruelty in their progress," seized Philadelphia; the state government then created a "council of safety," with vast and summary powers "to promote and provide for the preservation of the Commonwealth." It had power to seize goods "for the army and for the inhabitants," punish traitors, and "regulate the prices of such articles as they may think necessary." But the "ordinary course of justice" was to continue as far as feasible. In the same year, the legislature passed a bill of attainder against a number of men who had "traitorously and wickedly" gone over to the king. The state redefined and punished treason, declared bills of credit of the Continental Congress and the state to be legal tender; and, inevitably, legislated about the militia, army supplies, taxes, and the policy of war.
When the war ended, debates over law continued. The king of England and his government had been successfully overthrown. Should the king's law also be overthrown? Should ordinary private law be radically altered? The first generation seriously argued the question. The common law was badly tarnished; so was the reputation of the lawyers, many of whom had been Tories. It seemed to some men that new democratic states needed new institutions, from top to bottom, including fresh, democratic law. A pamphleteer, who called himself Honestus, asked, in 1786: "Can the monarchical and aristocratical institutions of England be consistent with...republican principles?" It was "melancholy" to see the "numerous volumes" of English law, "brought into our Courts, arranged in formidable order, as the grand artillery to batter down every plain, rational principle of law." Thomas Paine, an old firebrand, spoke for at least some zealots when he denounced, in 1805, the "chicanery of law and lawyers." He complained that Pennsylvania courts, even at that late date, had "not yet arrived at the dignity of independence." The courts, he said, still "hobble along by the stilts and crutches of English and antiquated precedents," which were often not democratic at all, but "tyrannical." During Shays's Rebellion, in Massachusetts (1786), mobs stopped the courts from sitting, and forcibly staved off execution of judgments against debtors. It was easy to attribute class bias to the courts, and attribute this class bias in turn to the antiquated, oppressive, inappropriate common law.
There were two apparent alternatives to the stilts and crutches. The common law could be replaced by some rival system. Or all systems could be abandoned in favor of natural principles of justice. The first alternative had some slight basis, in hope if not in fact. There were other systems of law. After the French Revolution, American liberals were particularly attracted to the French civil law. In the early 19th century, the Napoleonic Code served as a symbol and model of clarity and order. Some civil-law jurists were translated into English during this period: A Treatise on Obligations, Considered in a Moral and Legal View, "translated from the French of [Robert] Pothier," appeared in New Bern, North Carolina, in 1802. To some small extent, French scholars influenced American legal thought. Compared to civil law, common law seemed, to a number of jurists, to be feudal, barbaric, uncouth.
In hindsight, the common law had little to fear. It was as little threatened as the English language. The courts continued to operate, continued to do business; they used the only law that they knew. Few lawyers had any grasp of French. French lawbooks were rare and inaccessible; English authorities flooded the country. To be sure, there were some American jurists who had the education and skill to handle Continental law — James Kent of New York, for example. Joseph Story, who served on the Supreme Court, was a tower of erudition. These men cited and used bits of foreign law in their writings and opinions. But they were not revolutionaries. They believed in purifying and improving the law, not in overthrowing it. They were willing to snatch doctrines and ideas from Continental Europe; but even English law did that. One of the culture heroes of the American legal elite was England's Lord Mansfield, who died in 1793. Mansfield was Scottish by birth and an ardent admirer of Roman-flavored civil law.
And of course the common law had many defenders. Not everybody saw the common law as old and despotic. It was also the birthright of free men, a precious inheritance, perverted by the British under George III, but still a vital reality. One rhetorical pillar of the men of 1776 was that the common law embodied fundamental norms of natural law. The first Continental Congress, in 1774, adopted a Declaration of Rights; it declared that the colonies were "entitled to the common law of England," in particular the right of trial by jury. Americans were also entitled to the benefit of those English statutes which "existed at the time of colonization; and which they have, by experience, respectively found to be applicable to their several local and other circumstances."
Common-law lawyers were among the heroes of the Republic. John Adams was one; Thomas Jefferson, for all his ambivalence toward common law and its judges, another. Lawyers mostly drafted the state and federal constitutions. Courts were increasingly manned by lawyers, who listened to the arguments of other lawyers. Lawyers moved west with the line of settlement; they swarmed into state capitals and county seats. Wherever one looked in political life — in town, city, county, state, and national government — the lawyers were there. Unlike some later revolutions, and some earlier colonial Utopias, the new republic did not try to do business without lawyers. Old lawyers continued to function, training new lawyers in their image, who, like their teachers, turned almost instinctively to common law. The common law was also a weapon of integration. The Northwest Ordinance imposed common law on the lands of the American frontier. In the prairies and forests, where French settlers lived and worked in the path of the American onrush, the common law was an agent of American imperialism.
The common law would have to be Americanized, of course. Now that the states had freedom to choose, what parts of English law would remain in force? This was a tortuous question, not easily solved. Many states passed statutes to define the limits of the law in force. A Virginia law of 1776 declared that the "common law of England, all statutes or acts of Parliament made in aid of the common law prior to the fourth year of the reign of King James the first, and which are of a general nature, not local to that kingdom...shall be considered as in full force." The Delaware constitution of 1776 (art. 25) provided that "The common law of England as well as so much of the statute law as has been heretofore adopted in practice in this State, shall remain in force," except for those parts which were "repugnant to the rights and privileges" expressed in the constitution and in the "declaration of rights."
The New York experience was particularly complex. A law of 1786 declared the common law in force, and such English statutes as were in effect in the colony on April 19, 1775. Later, New York specifically re-enacted some British laws — the Statute of Frauds, for example, a law first passed in 1677, and which had virtually become a part of the common law. In 1788, a New York law, "for the Amendment of the Law, and the better Advancement of Justice," declared that "after the first day of May next," no British statutes "shall operate or be considered as Laws" in the state. The New York Constitution of 1821 (art. VII, sec. 13) stated that "Such parts of the common law, and of the acts of the legislature of the colony of New York, as together did form the law of the said colony" on April 19, 1775, and the resolutions of the colonial Congress, "and of the convention of the State of New York," in force on April 20, 1777, would continue to be law, unless altered or repealed, and unless they were "repugnant" to the constitution. No mention was made of British statutes; for good measure, an act of 1828 specifically pronounced the British statutes dead.
Yet even this flock of New York laws fell short of solving the problem. A New York court later held that some English statutes had become part of the "common law" of the colony. This meant that an undefinable, unknowable group of old laws somehow maintained a ghostly presence. They lived on, of course, only insofar as they were not "repugnant" to the constitution or unsuitable to conditions. One could never, then, be sure if an old law were dead or alive. New York was not the only state whose judges held that some of the old statutes were valid, and thus sentenced the legal public to a certain amount of uncertainty. To this day, an occasional case still turns on whether some statute or doctrine had been "received" as common law in this or that state. The question of "reception" had troubled the colonials too. Independence merely altered the form of the question. And in a broader sense, the question is an abiding one in all common-law jurisdictions. Judges must constantly re-examine the law, to see which parts still suit society's needs, and which parts must be thrown on the ash heap, once and for all.
The reception statutes dealt with the older English law. What about new law? There was, as expected, a strong burst of national pride. To Jesse Root of Connecticut, writing in 1798, it was "unnecessary, and derogatory" for courts of an independent nation to be governed by foreign law. His ideal was "the republic of bees," whose members "resist all foreign influence with their lives," and whose honey, "though extracted from innumerable flowers," was indisputably their own. In pursuit of the republic of bees, New Jersey passed a law, in 1799, that
no adjudication, decision, or opinion, made, had, or given, in any court of law or equity in Great Britain [after July 4, 1776]...nor any printed or written report or statement thereof, nor any compilation, commentary, digest, lecture, treatise, or other explanation or exposition of the common law,...shall be received or read in any court of law or equity in this state, as law or evidence of the law, or elucidation or explanation thereof.
Kentucky prohibited the mere mention of recent British law. Its statute, passed in 1807, declared that "reports and books containing adjudged cases in...Great Britain...since the 4th day of July 1776, shall not be read or considered as authority in...the courts of this Commonwealth." During Spring Term, 1808, Henry Clay, appearing before the court of appeals of Kentucky, "offered to read" a "part of Lord Ellenborough's opinion" in Volume 3 of East's reports; the "chief justice stopped him." Clay's co-counsel argued that the legislature "had no more power to pass" such a law than to "prohibit a judge the use of his spectacles." The court decided, however, that "the book must not be used at all in court."
But Lord Ellenborough was not so easily banished, in New Jersey, or Kentucky, or elsewhere. The New Jersey statute was repealed in 1819. As a practical matter, English law continued to be used by lawyers and courts, throughout the period, throughout the country. England remained the basic source of all law that was not strictly new or strictly American. The habits of a lifetime were not easily thrown over, despite ideology. Indigenous legal literature was weak and derivative. There was no general habit of publishing American decisions; American case reports were not common until a generation or more after Independence. To common-law lawyers, a shortage of cases was crippling. To fill the gap, English materials were used, English reports cited, English judges quoted as authority. In the first generation, more English than American cases were cited in American reports. Ordinary lawyers referred to Blackstone constantly; they used his book as a shortcut to the law; and Blackstone was English to the core. Sometimes curiously old-fashioned bits of law — phrases, old doctrines, old writs — turned up in curious places (for example, the American frontier); the reason was the ubiquity of Blackstone.
American law continued, in short, to borrow. The English overlay was obvious, pervasive — but selective. The English doctrines that were invited to this country were those which were needed and wanted — and only those. Sweeping changes took place in American law in the years between 1776 and the middle of the 19th century. During that time, there developed a true republic of bees, whose flowers were the social and economic institutions that developed in their own way in the country. They, not Lord Ellenborough and Lord Kenyon, were the lawmakers that made American law a distinctive system: a separate language of law within the family founded in England.
The second apparent alternative to the common law was also a mirage. To abolish the tyranny of lawyers and their rules, to reduce law to a common-sense system, at the level of the common man's understanding, a system of simple, "natural" justice: this was an age-old yearning, but it flared up with special vigor after 1776. As one citizen of Kentucky put it, the state needed "a simple and concise code of laws...adopted to the weakest capacity."
In part, the antilaw movement was an outgrowth of radical politics. One current of thought distrusted the common law on the grounds that it was remote from the needs of ordinary people, and was biased toward the rich. Another current of thought distrusted the law because it was archaic, inflexible, irrelevant; it did not suit the needs of merchant or businessman. Both groups could make common cause against lawyers' law, which suited nobody's wants but the lawyers. There was a general interest, then, in a reform of legal institutions, in which rich and poor, radical and conservative could share. In a complex society, however, it was Utopian to imagine that lawyers' law could be overthrown and replaced by natural justice, whatever that might mean. On the contrary, more and more rules, of more and more definite shape, were needed as time went on. The reform urge, as we shall see, did not abate; but it came to mean, not artlessness, but adaptation to the needs of a market economy.
One basic, critical fact of 19th-century law was that the official legal system penetrated, and had to penetrate, deeper and deeper into society. Medieval common law was not the law everywhere in England; nor was it everybody's law. American law was more popular, in a profound sense. It had no archaic or provincial rivals. It had no competitive substratum. Paradoxically, American law, divided into as many subsystems as there were states, was less disjointed than the one "common law" of older England.
Of course, millions were beyond the reach of formal law and indifferent to it. But comparatively speaking, American law had an enormous range. It drew its strength from the work and wealth of millions, and it affected the work and wealth of millions more. In 16th- or 18th-century England, few people owned or dealt in land. Only a small percentage were inside the market economy. Only a few were potential customers for family law, the law of torts, the law of corporations. There was surely less oligarchy in the United States than in the old kingdoms of Europe. A law for the millions, for the middle class, had to develop. And this law, to survive, had to be more pliant and accessible than a law for the wealthy few.
In short, law had to suit the needs of its customers; it had to be at least in a form that lawyers, as brokers of legal information, could use. What happened to American law in the 19th century, basically, was that it underwent tremendous changes, to conform to the vast increase in numbers of consumers. It is dangerous to sum up long periods and great movements in a sentence. But if colonial law had been, in the first place, colonial, and in the second place, paternal, emphasizing community, order, and the struggle against sin, then, gradually, a new set of attitudes developed, in which the primary function of law was not suppression and uniformity, but economic growth and service to its users. In this period, people came to see law, more and more, as a utilitarian tool: a way to protect property and the established order, of course, but beyond that, to further the interests of the middle-class mass, to foster growth, to release and harness the energy latent in the commonwealth: "Dynamic rather than static property, property in motion or at risk rather than property secure and at rest."
It was not only property to which the word dynamic seemed more and more apt. These two polar words — dynamic and static — aptly describe a fundamental change in the concept of law. The source of the change lay not so much in the Revolution as in revolution: the transformation of economy and society that occurred in the machine age and the age of rational thought. A dynamic law is a man-made law. The Constitution talked about natural rights, and meant what it said; but these rights did not define the duties and status of the subject; rather, they served as a framework for the fulfillment of people's needs and desires. Gradually, an instrumental, relativistic theory of law made its mark on the system. It meant a more creative view of precedent. It meant asking for the functions of past law, and measuring these against demands of the present and future. Once, change in law was looked on as rare and treated almost apologetically. But in the 19th century, Americans made law wholesale, without any sense of shame. Basically, this was legislative law, law made by elected representatives, rather than law made by judges. To be sure, the boldness of the judges and the rapidity of social change meant that there was room for both institutions in the house of creative law-making; the judges seized this opportunity, and played a mighty, if secondary, role in making fresh law.
CONSTITUTIONS: FEDERAL AND STATE
The Revolutionary period was, by necessity, an age of innovation in fundamental law. The old ties with England had been snapped. The states and the general government decided to put their basic political decisions in the form of written constitutions. Some states had begun as chartered colonies; they had gotten into the habit of living under these charters, and had even learned to revere them, as guarantees of their liberty. American statesmen tended to look on a written constitution as a kind of social compact — a basic agreement among citizens, and between citizens and state, setting out mutual rights and duties, in permanent form.
The Articles of Confederation (1777) envisioned a loose, low-key grouping of highly sovereign states. It did not provide for a strong executive. It had no provision for a federal judiciary. Congress, however, was given some judicial power; it was "the last resort on appeal in all disputes and differences...between two or more states concerning boundary jurisdiction or any other cause whatever." Congress also had power over matters of admiralty law, with "sole and exclusive right" to establish "rules for deciding, in all cases, what captures on land or water shall be legal," and how prizes might be "divided or appropriated." Congress also had sole right to set up "courts for the trial of piracies and felonies committed on the high seas," and courts "for receiving and determining, finally, appeals in all cases of captures" (art. IX).
The Articles of Confederation, by common consent, were a failure; the Constitution of 1787 was a stronger, more centralizing document. The Northwest Ordinance (1787), which set up a scheme of government for the Western lands, and which was enacted shortly before the Constitution, took it for granted that all future states would have a "permanent constitution." Any new states carved out of the Northwest Territory would have a "republican" constitution, consistent with federal law (Northwest Ordinance, 1787, art. V).
The original states had in theory the option to write or not write constitutions. But most of them quickly chose the way of the new-written word. Within a short time after the war broke out, eleven states had drafted and adopted new constitutions. To some, a constitution was a rallying point, a symbol of unity during war. The New Jersey Constitution (1776) put it this way:
in the present deplorable situation of these colonies, exposed to the fury of a cruel and relentless enemy, some form of government is absolutely necessary, not only for the preservation of good order, but also the more effectually to unite the people, and enable them to exert their whole force in their own necessary defense.
A few states chose to rest upon their original charters. But these, too, were eventually replaced by new documents, of the constitutional type. Connecticut discarded its charter and adopted a constitution in 1818. Eventually, every state in the union came to have a constitution in the strict sense of the word. All, in short, embarked on careers of making, unmaking, and remaking constitutions.
Constitutionalism answered to a deep-seated need, among members of the articulate public, for formal, outward signs of political legitimacy. This urge had driven tiny, isolated colonies in what is now Rhode Island or Massachusetts to express the structure and principles of government in the form of an agreement — a visible, legible bulwark against the lonely disorder of life outside the reach of the mother country. Much later, but by something of the same instinct, the remote Oregon pioneers, in a no-man's land disputed among nations, drew up a frame of government and called it a constitution. So did the residents of the "lost state of Franklin" in the 1780s, in what is now part of eastern Tennessee. So did the handful of citizens of the "Indian Stream Republic," in disputed territory near the border of New Hampshire and Canada. And so did the Mormons of the "State of Deseret." These "constitutions," to be sure, were mostly copycats; they borrowed provisions from existing constitutions, taking a phrase here, a clause there, and making whatever changes were considered appropriate. They were short-lived and of dubious legality. But they illustrate how strong the idea of the written constitution had become in American life.
There have been dozens of state constitutions. Their texts, style, and substance vary considerably. Some of the earliest ones, written before the 1780s, were quite bold and forward-looking for their times. The first Pennsylvania Constitution (1776) represented a sharp victory for the liberals of the state. Virginia pioneered a Declaration of Rights (1776). The idea and content of the Bill of Rights came from sources in the states. The federal Constitution could not have been ratified without the promise of a bill of rights, which took the form of ten amendments. After 1787, the language and organization of the federal Constitution became in turn a powerful model for state constitutions. One feature, however, was not easily transferred to the states: durability. There has been only one federal Constitution. It has been amended from time to time — but never overthrown. A few states (for example, Wisconsin) have also had only one constitution. Other states have followed a more variegated, or chaotic, constitutional career. Louisiana has had nine constitutions, perhaps ten, depending on how one counts. Georgia has had at least six.
The federal Constitution was marvelously supple, put together with great political skill. The stability of the country — Civil War crisis aside — has been the main source of its amazing survival. But the Constitution itself deserves a share of the credit. It turned out to be neither too tight nor too loose. It was in essence a frame, a skeleton, an outline of the form of government; on specifics, it mostly held its tongue. The earlier state constitutions, before 1787 and for some decades after, also guarded themselves against saying too much. There were, of course, some idiosyncratic features, even before 1787, in state constitutions. New Hampshire (1784), in the spirit of Yankee thrift, solemnly declared that "economy" was a "most essential virtue in all states, especially in a young one; no pension shall be granted, but in consideration of actual services, and...with great caution,...and never for more than one year at a time." But most state constitutions began with a bill of rights, described the general frame of government, and left it at that.
A constitution, if at all different from ordinary law, has two functions. First, it provides a terse exposition of the structure of government — its permanent shape, the nature of its organs or parts, and their boundaries and limits. Second, it may contain a list of essential rights, essential limitations on government, essential rules — all those propositions of high or highest law, which the drafters mean to secure against the winds of temporary change. But this second function has no natural boundary. Opinions differ from generation to generation on what rights and duties are most fundamental. Even the federal Constitution was more than mere framework. Imbedded in it were fragments of a code of law. Trial by jury, for example, was guaranteed (art. III, sec. 2, par. 3). The Constitution defined the crime of treason (art. III, sec. 3), and set out minimum requirements for convicting any man of this crime. The Bill of Rights contained a miniature code of criminal procedure.
What existed in embryo, and in reasonable proportions in the federal Constitution, was carried to much greater lengths in the states. The inflation of constitutions reached its high point (or low point) after the Civil War. But the process began long before that. Even the state bills of rights became bloated. The federal Bill of Rights had ten sections; Kentucky, in 1792, had 28. Some of these were quite vague: "elections shall be free and equal" (art. XII, sec. 5); others seemed hardly to warrant their exalted position — for example, that estates of "such persons as shall destroy their own lives shall descend or vest as in case of natural death."
The Delaware constitution of 1792 was another offender. It included many details of court organization and procedure; for example, "No writ of error shall be brought upon any judgment...confessed, entered, or rendered, but within five years after the confessing, entering or rendering thereof." This constitution also specified minutely how courts should handle accounts of executors, administrators, and guardians. The Alabama constitution of 1819 provided that "A competent number of justices of the peace shall be appointed in and for each county, in such mode and for such term of office, as the general assembly may direct." Their civil jurisdiction, however, "shall be limited to causes in which the amount in controversy shall not exceed fifty dollars." Clearly, the fifty-dollar limit was no immutable right of man. The Tennessee constitution of 1796 set a ceiling on the governor's salary ($750 a year) and forbade any change before 1804. No one could deduce these numbers from natural law.
There was a point to every clause in these inflated constitutions. Each one reflected the wishes of some faction or interest group, which tried to make its policies permanent by freezing them into the charter. Constitutions, like treaties, preserved the terms of compromise between warring groups. These sometimes took the form of a clause that postponed the power of the state to enact a given kind of law. The federal Constitution left the slave trade untouchable until 1808; until that year "the Migration or Importation of such Persons as any of the States now existing shall think proper to admit, shall not be prohibited by the Congress" (art. I, sec. 9, par. 1). Ohio (1802) made Chillicothe the "seat of government" until 1808; the legislature was not to build itself any buildings until 1809. For very delicate issues, the tactics of constitutionalism appeared essential. Otherwise, slight changes in political power could upset the compromise. One legislature can swiftly repeal the work of another; a constitution is harder to change.
Between 1790 and 1847, state constitutions became more diverse and diffuse. Some developments, like some problems, were peculiar to one state, or to one group of states; some were common to the country as a whole. The most general problems were apportionment and suffrage. Any change in the electoral map or in the right to vote meant a reallocation of political power. The suffrage was a bottleneck of law; who voted determined who ruled. Hence, constitutional disputes over suffrage and apportionment were widespread and sometimes bitter. In Rhode Island, the franchise was narrow, and the apportionment scheme outdated. Only those men who owned real estate worth $134 were entitled to vote; this excluded perhaps nine out of ten even of white males over 21. Conservatives stubbornly resisted any change. The so-called "rebellion" of Thomas Dorr (1842) was an unsuccessful, mildly violent attempt to force a change. A new constitution, which went into effect in 1843, finally brought about some measure of reform.
In other states, bloodless revolutions overthrew constitutions and reformed the suffrage. The search for permanence was constant, but permanence escaped men's grasp. The Pennsylvania constitution of 1776, a product of advanced 18th-century liberalism, was replaced in 1790 by a much more conservative constitution; and in 1838 by a moderate one. Statute books were supple; new governments changed them as they wished. Constitutions were brittle. They could be patched up at times; but when they were too deeply impregnated with the policies and interests of an old or lost cause, they had to be completely redone. Inflexibility was the vice of constitutions, as well as the virtue.
An observer with nothing in front of him but the texts of these state constitutions could learn a great deal about state politics, state laws, and about social life in America. The Southern constitutions gave more and more attention, over time, to the protection of slavery, and the repression of free blacks. Legislatures were forbidden to emancipate slaves, unless the master agreed and was compensated. In Pennsylvania (1838) any person who fought a duel, or sent a challenge, or aided or abetted in fighting a duel, was "deprived of the right of holding any office of honor or profit in this State." The Connecticut constitution of 1818, though it paid lip service to freedom of worship, froze every resident into his "congregation, church or religious association." A man might withdraw only by leaving a "written notice thereof with the clerk of such [religious] society." Constitutions often dealt with the state militia, a matter of considerable interest to the Revolutionary generation. In Ohio (1802) brigadiers-general were to be "elected by the commissioned officers of their respective brigades." Some states barred practicing clergymen from public office. The Tennessee constitution of 1796 testily remarked that "ministers of the gospel are, by their professions, dedicated to God and the care of souls, and ought not to be diverted from the great duties of their functions." The draft constitution of "Franklin," in 1784, would have extended this ban to lawyers and "doctors of physic." The Georgia constitution of 1777 declared that "Estates shall not be entailed; and when a person dies intestate, his or her estate shall be divided equally among their children; the widow shall have a child's share, or her dower, at her option." 
As early as 1776, North Carolina provided that "the person of a debtor, where there is not a strong presumption of fraud, shall not be confined in prison, after delivering up, bona fide, all his estate real and personal, for the use of his creditors, in such manner as shall be hereafter regulated by law." Many 19th-century constitutions contained provisions of this general type, on the subject of imprisonment for debt.
State constitutions reflected the theories of the day on separation of powers, and on checks and balances. The earlier the constitution, however, the weaker the executive branch. For example, 18th-century constitutions gave only feeble powers to the chief executive (called a governor, like his colonial antecedent). His term of official life was typically brief. The Maryland constitution of 1776 solemnly asserted that "a long continuance" in "executive departments" was "dangerous to liberty"; "rotation" was "one of the best securities of permanent freedom." This constitution practiced what it preached. The governor — a "person of wisdom, experience, and virtue" — was to be chosen each year, on the second Monday of November, for a one-year term, by joint ballot of the two houses of the legislature. But he could not continue his wisdom in office "longer than three years successively, nor be eligible as Governor, until expiration of four years after he shall have been out of that office."
The Pennsylvania constitution of 1776 showed a similar bias. It too called for rotation in office. In England, office tended to depend on the crown or the great grandees; public office was essentially a nice warm udder to be milked. American constitutions firmly rejected this notion. According to the Pennsylvania constitution of 1776, "offices of profit" were not to be established; such offices led officeholders to a state of "dependence and servility unbecoming freemen," and created "faction, contention, corruption, and disorder" in the public (sec. 36). But, as the emphasis on rotation shows, these constitutions also rejected the modern notion of politics as a specific career. Rather, it was a duty, a form of public service, open to the virtuous amateur. This notion, alas, did not long survive.
Early constitutions, as was mentioned, slighted the executive; they preferred to give the lion's share of power to the legislature. In the light of American political history, this was only natural. The colonial governor — and the judiciary, to a certain extent — represented foreign domination. The assemblies, on the other hand, were the voice of local influentials. The Pennsylvania constitution of 1776 gave "supreme legislative power" to a single house of representatives. No upper house or governor's veto checked its power. Over the course of the years, however, the states became disillusioned with legislative supremacy. The governor was one beneficiary of this movement. Typically, he gained a longer term of office, and the power to veto bills. In the federal government, the President had this power from the start.
Judicial power, too, increased at the expense of the legislature. Judicial power took the form of judicial review; the judges, in private litigation, passed on acts of other branches of government; and had the right to declare these acts void if, in the judges' opinion, they were unauthorized by the constitution. Ultimately, judicial review fed on constitutional detail; the more clauses a constitution contained, especially clauses that did something more than set out the basic frame of government, the more potential occasions or excuses for review. But wholesale use of the power was, on the whole, an unsuspected sword in this period. True, in the landmark decision of Marbury v. Madison (1803), John Marshall and the Supreme Court, for the first time, dared to declare an act of Congress unconstitutional. But the Court made no clear use of this power, against Congress, for over 50 years. The weapon was used more frequently against state statutes. State supreme courts, too, began to exercise judicial review. It was an uncommon technique; it was hated by Jeffersonians; some judges resisted it; it made little impact on the ordinary working of government. But when its occasion arose, it was an instrument of unparalleled power.
Legislative supremacy declined because influential citizens were more afraid of too much law than of not enough. In a number of states, scandals tarnished the reputation of the legislatures. Blocs of voters became afraid that landlords, "moneyed corporations," and other wealthy and powerful forces were too strong in the lobbies of these assemblies. A movement arose to limit the power of the legislatures. Rules to control legislation were written into one constitution after another. The process began modestly enough. Georgia's constitution (1798) outlawed the practice of legislative divorce, except if the parties had gone to "fair trial before the superior court," and obtained a verdict upon "legal principles." Even then, it took a "two-thirds vote of each branch of the legislature" to grant a divorce. Indiana, in 1816, forbade the establishment by statute of any "bank or banking company, or monied institution for the purpose of issuing bills of credit, or bills payable to order or bearer."
The Louisiana constitution of 1845 was something of a turning point. More than those that came before, this constitution was a charter of economic rights and obligations, and a code of legislative procedure, as well as a plain frame of government whose economic import was implicit. The constitution sharply restricted the state's power to pledge its credit or to lend its money. The state was not to "become subscriber to the stock of any corporation or joint-stock company." Lotteries, legislative divorces, and special corporate charters were forbidden. Every law was to "embrace but one subject, and that shall be expressed in the title." No new exclusive privileges or monopolies were to be granted for more than twenty years. No banks of any sort were to be chartered. These were not chance notions. They were not, in the eyes of contemporaries, extreme. The state was trying to reach legitimate legislative goals; but what was new (and ominous) was that it was doing it through anti-legislation: by foreclosing whole areas of law to statutory change. Other states enthusiastically followed Louisiana's lead. But when times changed, and conditions changed, these overblown constitutions became all too often embarrassments. They were therefore evaded, or amended, or done over completely.
No two state constitutions were ever exactly alike. But no constitution was pure innovation. Among the states, there was a great deal of copying, of constitutional stare decisis. A clause or provision tended to spread far and wide, when it met some felt need, or caught the fancy of lawmakers and constituents. New states embarked on statehood with constitutions that borrowed heaps of clauses and sections from constitutions in older states. Some new states favored the latest model; others looked to their neighboring states; others to the home state of their settlers. The New York constitution of 1846 left a deep mark on Michigan, and later on Wisconsin. The first constitution of California was heavily indebted to Iowa; Oregon was indebted to Indiana.
Borrowing and influence are not, of course, the same. The states shared a common political culture. Michigan was not a legal satellite of New York; the people in the two states were Americans of the period, and, for the most part, thought alike on political and legal issues. The New York constitution was a recent model; thus people in Michigan used it. When convenient patterns were readily at hand, it was inefficient to start drafting from scratch. Borrowing was always selective. No constitution was swallowed whole. On matters of great importance, conscious choice was never absent. No state was bound to adopt the constitution of another state, or any part of it. They adopted out of expedience and fashion.
In a common-law system, judges make at least some of the law, even though legal theory has often been coy about admitting this fact. American statesmen were not naive; they knew it mattered what judges believed and who they were. How judges were to be chosen and how they were to act was a political issue in the Revolutionary generation, at a pitch of intensity rarely reached before or since. State after state — and the federal government — fought political battles over issues of selection and control of the judges.
The bench was not homogeneous. Judges varied in quality and qualification, from place to place, and according to their position in the judicial pyramid. Local justices of the peace were judges; so were the justices of the United States Supreme Court. English and colonial tradition had allowed for lay judges, as well as for judges learned in law. There were lay judges both at the top and the bottom of the pyramid. In the colonies, the governor frequently served, ex officio, as chancellor. New Jersey continued this system, in its constitution of 1776. This constitution also made the governor and council "the court of appeals in the last resort in all causes of law as heretofore." Since governor and council were or might be laymen, this meant that nonlawyers had final control over the conduct of trials and the administration of justice. In the New York system, too, laymen shared power at the apex of the hierarchy of courts. The constitution of 1777 set up a court "for the trial of impeachments, and the correction of errors," consisting of senators as well as judges. This system lasted well into the 19th century.
The lay judges were not necessarily politicians, though this was ordinarily the case. But they were invariably prominent local men. William E. Nelson has studied the background and careers of the eleven men who served as justices of the superior court of Massachusetts between 1760 and 1774, on the eve, that is, of the Revolution. Nine had never practiced law; six had never even studied law. All, however, of these lay judges had "either been born into prominent families or become men of substance." Stephen Sewall, chief justice in 1760, was the nephew of a former chief justice; he had served thirteen years as a tutor at Harvard College.
The base of the pyramid was even more dominated by laymen. Lay justice did not necessarily mean popular or unlettered justice at the trial court level. The English squires were laymen, but hardly men of the people. Lay justice in America had something of the character of rule by the squires. Nor was lay justice necessarily informal. Laymen, after years on the bench, often soaked up the lawyer's jargon and tone. After all, the difference between lawyers and nonlawyers was not that sharp; frequently, a man came to the bar after the briefest of clerkships and with little more than a smattering of Blackstone. The way lay judges absorbed their law was not much different from the way men in general learned to be "lawyers."
There are many anecdotes in print about the coarseness and stupidity of lay judges. Old lawyers, writing years later, and historians of bench and bar, have tended to drag the reputation of these judges through the mud. For sentimental and other reasons, these lawyers and lawyer-historians wanted to exaggerate the rawness and vulgarity of pioneer judges, and to make the point that laymen who wore the clothing of judges must be incompetent. The actual facts are harder to unearth. Popular feelings against the courts, in the late 18th century, had nothing to do with whether judges were laymen or not. The complaint was not that justice was crude, but that it was biased in favor of creditors, in favor of the rich.
In any event, the lay judge was in slow retreat throughout the period. Eventually, he disappeared entirely from upper courts. The first lawyer on Vermont's supreme court was Nathaniel Chipman (1752-1843). He took office, in 1787, as an assistant judge of the court. Not one of the other four judges was an attorney. On such a court, a lawyer found it easy to take the lead. Chipman later became chief justice, and edited law reports for Vermont. In other states, professionalization came even earlier. All five of the judges of the Virginia Court of Appeals, established in 1788, were lawyers; Edmund Pendleton headed the court.
Historically, judges were appointed from above. But American democracy put strong emphasis on controls from below. This implied some more popular way to choose the judges. The Vermont constitution of 1777 gave to the "freemen in each county" the "liberty of choosing the judges of inferior court of common pleas, sheriff, justices of the peace, and judges of probates." Under the Ohio constitution of 1802, "judges of the supreme court, the presidents and the associate judges of the courts of common pleas" were to be "appointed by a joint ballot of both houses of the general assembly, and shall hold their offices for the term of seven years, if so long they behave well." This gave the electorate at least an indirect voice in judicial selection. Georgia in 1812, and Indiana in 1816, provided for popular election of some judges; Mississippi in 1832 adopted popular election for all. New York followed in 1846, and the rush was on.
The movement, according to Willard Hurst, was "one phase of the general swing toward broadened suffrage and broader popular control of public office which Jacksonian Democracy built on the foundations laid by Jefferson." It was a movement, however, "based on emotion rather than on a deliberate evaluation of experience under the appointive system." The hard facts about judicial behavior were never easy to come by. There is no simple way to compare elected and appointed judges. Still, if judges were not elected, how could they be forced to respond to the will of the people?
There was plenty of evidence from which a jaundiced mind could conclude that judges were political men, and had to be kept in check. Thomas Jefferson and his party members, in particular, were quite convinced of this. Federal judges were appointed for life. Before Jefferson came to power, they were naturally Federalists. Their behavior was quite controversial. To Jefferson's men, the judges seemed "partial, vindictive, and cruel," men who "obeyed the President rather than the law, and made their reason subservient to their passion." As John Adams was leaving office, in 1801, Congress passed a Judiciary Act. The act created a flock of new judgeships, among other things. Adams nominated judges to fill these new posts; they were confirmed by the Senate in the last moments of the Adams regime. Jefferson's party raged at these "midnight judges." It was a final attempt, Jefferson thought, to stock the bench forever with his political enemies.
The law that created the "midnight judges" was repealed; the judges lost their jobs; but the other Federalist judges stayed on serenely in office. These holdover judges threatened Jefferson's program, he felt; there was no easy way to be rid of them: "Few die and none resign." He wanted to limit their power; he wanted to make them more responsive to national policy — as embodied, of course, in Jefferson and in his party. John Marshall, the Chief Justice of the United States, was particularly obnoxious to Jefferson. He was a man of enormous talents, and (as luck would have it) enjoyed good health and long life. He outlived a score of would-be heirs to the office, including Spencer Roane of Virginia, and cheated a whole line of Presidents of the pleasure of replacing him. Equally annoying to Jefferson and his successors was the fact that later justices, who were not Federalists, seemed to fall under Marshall's spell, once they were safely on the bench. Madison appointed Joseph Story, who was at least a lukewarm Democrat. On the bench, he became a rabid fan of the Chief Justice. Roger Brooke Taney, who finally replaced John Marshall, had been a favorite of Andrew Jackson, and a member of his cabinet. But Taney, too, became living proof of the perils of lifetime tenure. He outlived his popularity with dominant opinion, at least in the North. The author of the Dred Scott decision tottered on in office until 1864, when the Civil War was almost over.
The prestige of federal courts stands high today, particularly with liberals and intellectuals. An independent court system is (at least potentially) a tower of strength for the poor, for the downtrodden, for the average person facing big institutions or big government. Hence Jefferson's famous fight against the Federalist judges is one policy of his party that has not done well in the court of history. Yet, undeniably, some federal judges behaved in ways that would be considered disgraceful today. Federal judges did not run for re-election; but they played a more active political role than is standard for judges today. Some Federalists made what were in effect election speeches from the bench. They harangued grand juries in a most partisan way. This gave some point to Jefferson's attacks. Other judges were more discreet, but (in Jefferson's view) equally partisan. One of the sources of Marshall's strength was his tremendous solemnity. His opinions, mellifluous, grandly styled, even pompous, purported to be timeless and non-political. They appealed to principle, to the sacred words of the Constitution. Their tone implied that their true author was the law itself in all its majesty; the judge was a detached, impartial vessel. This attitude inordinately annoyed Jefferson, who saw in it nothing but a subtle, maddening hypocrisy. Either way, it was an effective piece of political theater.
Jefferson's attacks on the judges did not totally fail. Like Roosevelt's plan to pack the court in 1937, Jefferson and his successors may have lost the battle but won the war. In both cases, an extreme tactic — the threat of impeachment, a court-packing plan — ended in failure. But perhaps, in both cases, the real impact of the tactic was to scare the opposition; it served its chief role as a bogeyman, and not without impact.
The Constitution gave federal judges tenure for life. It left only one way open to get rid of judges: the terrible sword of impeachment. Federal judges could be impeached for "Treason, Bribery, or other high Crimes and Misdemeanors" (art. II, sec. 4). The South Carolina constitution of 1790 permitted impeachment for "misdemeanor in office" (art. V, sec. 3). Literally interpreted, then, the law permitted impeachment only in rare and extreme situations. But there were a few notable cases, in the early 19th century, in which impeachment was used to drive enemies of party and state out of office. In 1803, Alexander Addison, presiding judge of the fifth judicial district in Pennsylvania, was impeached and removed from his office. He was a bitter-end Federalist; as a lower-court judge, he harangued grand juries on political subjects. On one occasion, he refused to let an associate judge speak to the grand jury in rebuttal; impeachment was grounded on this incident. A straight party vote removed Addison from the bench. Eighteen Republicans in the Pennsylvania senate voted him guilty; four Federalists voted for acquittal.
On the federal level, the purge drew first blood in 1804. It was a rather shabby victory. John Pickering, a Federalist judge, was impeached and removed from office. He had committed no "high crimes and misdemeanors"; but he was an old man, a drunk, and seriously deranged. It was understandable to want him off the bench; but it was far from clear that the words of the Constitution applied to this kind of case. Pickering's removal was, in fact, a stroke of politics, a dress rehearsal for a far more important assault, the impeachment of Samuel Chase.
This celebrated affair took place soon after Pickering's trial. Chase was a Justice of the United States Supreme Court. He had a long and distinguished career, but he was an uncompromising partisan, and a man with a terrible temper. He too was notorious for grand-jury charges that were savage attacks on the party in power. President Jefferson stayed in the background; but his close associates moved articles of impeachment against Chase. The articles of impeachment made a number of specific charges of misconduct against Chase; among them, that at a circuit court, in Baltimore, in May 1803, he did "pervert his official right and duty to address the grand jury" by delivering "an intemperate and inflammatory political harangue," behavior which was "highly indecent, extra-judicial and tending to prostitute the high judicial character with which he was invested to the low purpose of an electioneering partizan."
But the anti-Chase faction overplayed its hand. In a frank private conversation, Senator Giles of Virginia admitted what was really at stake:
a removal by impeachment [is] nothing more than a declaration by Congress to this effect: You hold dangerous opinions, and if you are suffered to carry them into effect you will work the destruction of the nation. We want your offices, for the purpose of giving them to men who will fill them better.
The trial was long, bitter, sensational. Republicans defected in enough numbers so that Chase was acquitted, by the slimmest of margins. John Marshall and his court were thenceforward "secure." It was the end of what Albert Beveridge claimed was "one of the few really great crises in American history." In January 1805, an attempt was made to impeach all but one of the judges on Pennsylvania's highest court; it failed by a narrow vote. Impeachment was not a serious threat after these years.
Another radical plan to get rid of bad judges was to abolish their offices. Of course, Jefferson could not do away with the Supreme Court, even had he wanted to; that would have meant major Constitutional change, which was clearly impossible. But his administration repealed the Judiciary Act of 1801; that at least put the midnight judges out of business. An ingenious method for removing unpopular judges was tried out in Kentucky. In 1823, the Kentucky court of appeals struck down certain laws for relief of debtors; this decision aroused a storm of protest. Under Kentucky law, the legislature could remove judges by a two-thirds vote; the "relief party" could not muster that percentage. Instead, the legislature abolished the court of appeals and created a new court of four judges, to be appointed by the governor. The old court did not quietly give up its power. For a time, two courts of appeal tried to function in the state, and state politics was dominated by the dispute between "old court" and "new court" factions. Most lower courts obeyed the old court; a few followed the new court; a few tried to recognize both. Ultimately, the "old court" party won control of the legislature, and abolished the new court (over the governor's veto). The old court was awarded back pay; the new court was treated as if it had never existed at all.
As these crises died down, it seemed as if the forces of light had triumphed over darkness — that this country was to have a free, independent judiciary rather than a servile mouthpiece of state. The Chase impeachment failed (it is said) because in the end both parties believed in a strong, independent judiciary; both believed in the separation of powers. Many politicians did in fact have qualms about impeachment; it smacked of overkill. Some of Jefferson's men shared these qualms. They did not feel that a sitting judge should be replaced on political grounds. But the failure of impeachment was not a clear-cut victory for either side. It was rather a kind of social compromise. The judges won independence, but at a price. Their openly political role was reduced; and ultimately most states turned to the elective principle. There would be no more impeachments, but also no more Chases. What carried the day, in a sense, was the John Marshall solution. The judges would take refuge in professional decorum. It would always be part of their job to make and interpret policy; but policy would be divorced from overt, partisan politics. Principles and policy would flow, at least ostensibly, from the logic of law; they would not follow the naked give and take of the courthouse square. Justice would be blind; and it would wear a poker face. This picture of the behavior of judges had enough truth, and enough hypnotic force, to influence the role-playing of judges, and to bring some peace and consensus to issues of tenure, selection, behavior, and removal of judges.
This did not mean that judges could, or would, avoid the mine-fields of politics and policy. High courts faced sensitive issues every term. Some of these issues were so charged with emotion that the veil of objectivity fell or was torn from the judges' faces. Dred Scott, in 1857, was an example of the court overreaching itself — a blatant political act, and more significantly, a wrong-headed one. There were many minor Dred Scotts at lower levels of decision. Prejudice and arrogance in court did not die out in 1810, or 1820, or ever, but they became less overt; in any event, they became harder to document. Long after Chase, there were political trials in the United States, and judges who persecuted unpopular or dissenting men. In 1845, Circuit Judge Amasa J. Parker presided at the trial of the antirent rioters of upstate New York; his behavior was as partisan and prejudiced as any Federalist judge of the early 1800s. But more and more, the judges assumed the outward posture of propriety.
Meanwhile, the power of judges, as makers of doctrine and framers of rules, may actually have grown somewhat after 1800. The courts had to hammer out legal propositions to run the business of the country, to sift what was viable from what was not in the common-law tradition, to handle disputes and problems thrown up in the course of political, social, and technological change. Once case reports began to be published, judges had rich opportunities to mold law, as logic and social sense directed, and as the docket, responsive to outside pressure and the demands of the litigants, required. They did not let the opportunity slide. The legal generation of 1776 had been a generation of political thinkers and statesmen. The Constitution of the United States is their greatest legal monument. In the next generation, the great state papers, in a sense, were such judicial opinions as Marbury v. Madison, or the Dartmouth College Case. The best of the early 19th century judges had a subtle, accurate political sense, and firm economic and social beliefs. In particular, the judges turned their attention to law in one of its prosaic meanings: the workaday rules of American life. They built and molded doctrine — scaffolding (as they saw it) to support the architecture of human affairs.
Perhaps the greatest of the judges was John Marshall, Chief Justice of the United States. He, more than anyone else, gave federal judgeship its meaning. It was, of course, conceded that the judiciary made up a co-ordinate branch of government. They were separate, but were they equal? In Marbury v. Madison (1803), John Marshall invented or affirmed the power of judicial review over acts of Congress. But the Marbury decision was only a single dramatic instance of Marshall's work. His doctrines made constitutional law. He personally transformed the function and meaning of the Supreme Court. When he came on the bench, in 1801, the Supreme Court was a frail and fledgling institution. In 1789, Robert Hanson Harrison turned down a position on the court to become chancellor of Maryland. John Jay resigned in 1795 to run for governor of New York. In the first years, the court heard very few cases; it made little impact on the nation. By the time Marshall died, the court was fateful and great.
Marshall had a sure touch for institutional solidity. Before he became Chief Justice, the judges delivered seriatim (separate) opinions, one after another, in the English style. Marshall, however, put an end to this practice. The habit of "caucusing opinions," and delivering up one unanimous opinion only, as the "opinion of the Court" had been tried by Lord Mansfield in England. It was abandoned there; but Marshall revived the practice. Unanimity was the rule on the Court; for a while, until William Johnson (1771-1834) was appointed by Jefferson, in 1804, the unanimity was absolute, with not a single dissenting opinion. Johnson broke this surface consensus. Yet neither Johnson nor any later justices could or would undo Marshall's work. Doctrines changed; personalities and blocs clashed on the court; power contended with power; but these struggles all took place within the fortress that Marshall had built. The court remained strong and surprisingly independent. Jefferson hoped that Johnson would enter the lists against Marshall. Yet Johnson more than once sided with Marshall, in opposition to Jefferson, his leader and friend. The nature and environment of the court — life tenure, the influence of colleagues — loosened his other allegiances. It was to be a story often told. Joseph Story betrayed Madison; Oliver Wendell Holmes bitterly disappointed Theodore Roosevelt; the Warren Burger court slapped Richard Nixon in the face.
There were strong leaders and builders in the state courts, too. James Kent dominated his court in New York, in the early 19th century. As he remembered it, "The first practice was for each judge to give his portion of opinions, when we all agreed, but that gradually fell off, and, for the last two or three years before I left the Bench, I gave the most of them. I remember that in eighth Johnson all the opinions for one term are 'per curiam.' The fact is I wrote them all...." Kent's pride in his work was justified. But the opportunity for creative work was there. The judges were independent in two senses: free from England, but also free, for the moment, from stifling partisan control. Also, they were published. The colonial judges, who left no monuments behind, are forgotten men. From 1800 on, strong-minded American judges, whose work was recorded, influenced their courts and the law. In New York there was, as we mentioned, Chancellor Kent (1763-1847); in Massachusetts, Theophilus Parsons (1750-1813) and Lemuel Shaw (1781-1861); in Pennsylvania, John B. Gibson (1780-1853); in North Carolina, Thomas Ruffin (1787-1870); in Ohio, Peter Hitchcock (1781-1853); in Louisiana, Francis Xavier Martin (1762-1846).
The spheres of these state judges were less floodlit, of course, than the Supreme Court in its greater moments. Their work had less national significance. But in their states, and in the world of the common law, they made a definite impact. Some were excellent stylists; all wrote in what Karl Llewellyn has called the Grand Style: their opinions were often little treatises, moving from elegant premise to elaborate conclusion, ranging far and wide over subject matter boldly defined. They were, at their best, far-sighted men, impatient with narrow legal logic. Marshall, Gibson, and Shaw could write for pages without citing a shred of "authority." They did not choose to base their decisions on precedent alone; law had to be chiseled out of basic principle; the traditions of the past were merely evidence of principle, and rebuttable. Their grasp of the spirit of the law was tempered by what they understood to be the needs of a living society. Some were conservative men, passionately attached to tradition; but they honored