Too Old to Be President?

Age has been a big factor in this election.

For the first time, two candidates in their 70s are running for the nation’s highest office. And as you’d expect, both parties are claiming the other’s candidate is feeble, disoriented, and making no sense — i.e., too old for the job.

But 70 years no longer means the decrepitude it once did. “Threescore and ten” years was the lifespan the Bible allotted to a human, but today’s 70-year-olds are different. They’re generally healthier, more active, and less mentally impaired than their parents or grandparents were at that age (if they even reached that age). Can an older candidate be less competent because of age? Certainly. But incompetence can be found in candidates of any age.

Perhaps the concern over age is really a concern over health: can a 70-year-old endure the stress that comes with the Oval Office?

The chances are good for either candidate because presidents appear to be unusually hardy.

For example, the Republican Party tried to recruit Dwight Eisenhower as its presidential candidate in 1948. He turned them down, concluding he would be unelectable. The party expected Thomas Dewey — the candidate it chose instead — to serve two terms, which would have made Eisenhower 66 years old if he ran in 1956, and the country, the thinking went, wouldn’t want someone that old.

But Eisenhower ran in 1952 and won. Three years later, he had a heart attack, yet he entered the race again in 1956 and won again. After he left the White House, he continued to play a dominant role in the Republican Party until he died at 78.

Gerald Ford was 61 when he assumed the presidency upon Nixon’s resignation in 1974; he lived another 32 years. Ronald Reagan, 69 at his 1981 inauguration, served two terms and lived 15 years beyond them. George H.W. Bush was 64 when he entered the Oval Office in 1989. He lived another 29 years.

And of course, there’s Jimmy Carter, who was elected at the tender age of 52. Thirty-nine years after leaving office, he’s still with us, building homes for Habitat for Humanity.

It’s significant that, of the six presidents who have celebrated their 90th birthday, four — Jimmy Carter, Gerald Ford, Ronald Reagan, and George H.W. Bush — served in the past 50 years.

But the number of decades is just one way to consider age. We can also judge a president’s age relative to the average lifespan of his time.

Up to the 1930s, Americans could think themselves lucky if they reached their 65th birthday. But our lifespan has continually lengthened; since 1920, the average American has gained 25 years of life.

Historians have estimated that, in the centuries preceding the 1800s, the average human lived just 35 years. The number is surprisingly low because it is calculated from the ages of all deaths within a year. Nearly half of those deaths (46 percent) were among children under the age of five, which dragged the overall average down; the figure tells us little about how long adults actually lived. If half of all deaths in a year occurred at age 2 and half at age 68, for example, the average age at death would be 35, even though every surviving adult lived well past it.

One researcher has concluded that a more realistic average lifespan for an American who reached age 20 in 1800 was 47 years — still not a long life. Which is what makes John Adams so exceptional. Adams became president at the age of 61 — fourteen years beyond his expected lifetime. And he lived 25 years beyond his presidency!

Adams’s son, President John Quincy Adams, lived to 80. Thomas Jefferson reached 83, and James Madison saw his 85th birthday.

Today, the average American lives 78.54 years. But an American male who reaches the age of 65, according to the National Center for Health Statistics, has a good chance of living another 19 years.

Which means either candidate could well live to the age of 84 – or beyond.

It’s possible that presidents in their 70s will be looked on more favorably as the proportion of elders in the population increases. By 2060, a quarter of the U.S. population will be over 65, and the average American lifespan is projected to have risen to 85 years.

Children today may live to hear candidates someday complain that their 100-year-old opponents are too old to be president.

Featured image: John Adams, Dwight Eisenhower, and Andrew Jackson, three of the oldest presidents at the time they assumed office (Adams: National Gallery of Art; Eisenhower: Wikimedia Commons; Jackson: whitehousehistory.org)

Considering History: The Washington Redskins, Oklahoma Territory, and the Myth of the “Vanishing Indian”

This series by American studies professor Ben Railton explores the connections between America’s past and present. 

Recently, we have seen two striking decisions on longstanding issues connected to Native American communities. On July 9, in one of the last decisions announced in its term, the Supreme Court ruled that the eastern half of the state of Oklahoma remains “Indian territory” under 19th century treaties that guaranteed the land to tribes forced west on the Trail of Tears, treaties that have never been amended by Congress and so (the Court ruled) continue to hold today. And on July 13, the NFL franchise the Washington Redskins formally announced that it will be changing its nearly century-old racist name and logo.

Trail of Tears National Historic Trail (Wikimedia Commons / National Park Service)

The details of these two decisions are quite specific to their respective arenas, and reflect evolving conversations in each case. But taken together, they exemplify two longstanding and contrasting national narratives: those which depict Native Americans through reductive, stereotypical images in order to justify attempts to expel and destroy native communities; and those which recognize instead Native Americans’ foundational and continued presence within America.

In order to justify his genocidal Indian Removal policy, which produced the Trail of Tears, President Andrew Jackson depicted Native Americans through exactly such stereotypical and racist imagery, defining them as savages unable to coexist with European-American communities or indeed to exist at all within the expanding Early Republic United States. In his first Annual Message to Congress, in December 1829, Jackson argued that “this fate [disappearance] surely awaits them if they remain within the limits of the States.” And in his fifth Annual Message, in December 1833, Jackson went much further, claiming, “That those tribes cannot exist surrounded by our settlements and in continual contact with our citizens is certain … Established in the midst of another and a superior race, and without appreciating the causes of their inferiority or seeking to control them, they must necessarily yield to the force of circumstances and ere long disappear.”

Jackson’s more aggressively racist depictions of Native Americans were complemented by a gentler and more insidious exclusionary perspective that came to be known as the “Vanishing Indian” narrative. Often advanced by figures and communities sympathetic to native rights, this narrative relied on images like the admirable but doomed “Noble Savage” to depict Native Americans as tragically but inevitably disappearing from the continent. James Fenimore Cooper’s bestselling historical novel The Last of the Mohicans (1826) illustrates this trope, arguing from its title on that its heroic Native-American characters are the “last” of a tribe that in fact still existed in Cooper’s era (and still does in our own). Poet Lydia Huntley Sigourney’s “Indian Names” (1834) is even more striking, as Sigourney uses the persistence of Native-American names on the American landscape to angrily (and inaccurately) lament the destruction of native communities: “Ye say they all have passed away,/That noble race and brave/… But their name is on your waters,/Ye may not wash it out.”

A 1919 illustration by artist N.C. Wyeth for The Last of the Mohicans (Wikimedia Commons)

The “Vanishing Indian” narrative became a dominant trope across the 19th century and into the 20th, and in subtle but significant ways informed numerous prominent cultural works, as illustrated by Wellesley Professor Katharine Lee Bates’s 1893 poem “America” that became the lyrics for the song “America the Beautiful” (1910). That trope is most apparent in Bates’s second verse, where she describes America’s origin point: “O beautiful for pilgrim feet/Whose stern, impassioned stress/A thoroughfare for freedom beat/Across the wilderness!” But Bates likewise disappears Native Americans from the idealized images of the national landscape with which her poem opens, and which were inspired by a cross-country train trip from her Massachusetts home to a summer teaching job in Colorado. Bates originally named her poem “Pike’s Peak,” as she wrote it after a day trip ascending the mountain — but, like that name itself, her poem leaves no place for the Ute and Arapaho tribes who had been part of that landscape long before explorer Zebulon Pike arrived.

Bates’s 1893 trip took place just two-and-a-half years after the December 1890 Wounded Knee massacre in South Dakota, an exemplification of the far more violent exclusionary attacks upon Native-American communities that continued throughout the 19th century. The late 19th and early 20th century also featured another, even more insidious form of cultural genocide designed to facilitate the disappearance of Native-American cultures: the boarding school movement, which (as Carlisle Indian Industrial School founder and former “Indian Wars” Captain Richard Pratt put it) was intended to “kill the Indian, and save the man.” Both these military and educational efforts illustrate that the “Vanishing Indian” was not just a cultural image, but also and most importantly an ongoing white supremacist goal.

Pupils in the Carlisle Indian Industrial School, c. 1900 (Wikimedia Commons)

As I wrote in this October 2018 Considering History column, Native-American activists and communities have consistently used legal and political means to challenge such attacks and advocate for their survival and their rights. In response to Jackson’s Indian Removal policy, the Cherokee nation did so forcefully: the tribe’s leaders drafted a series of “Cherokee Memorials,” published in the tribe’s newspaper The Cherokee Phoenix and submitted to the U.S. Congress to request its assistance. The Cherokee also took their claims to the court system, and the Supreme Court sided with their rights to their land in the Worcester v. Georgia (1832) decision. Jackson famously ignored the Court and proceeded with removal.

Native-American legal and political efforts continued for the next century-and-a-half, including the two prominent moments I highlighted in my earlier column: Ponca Chief Standing Bear’s legal challenge which resulted in the groundbreaking 1879 decision that “an Indian is a person” for purposes of the law (and otherwise); and the numerous activists, including Zitkala-Ša and Nipo Strongheart, whose advocacy led to the watershed 1924 Indian Citizenship Act.

The American Indian Movement, founded in Minneapolis in 1968, continued to utilize the political and legal system to advocate for native lives and rights through the 1960s and 1970s, while fostering the Red Power and Native-American Renaissance cultural movements of late 20th century America.

Flag of the American Indian Movement (Wikimedia Commons)

These destructive myths and their effects have endured, as illustrated by the horrifically high numbers of COVID-19 cases and deaths on Native-American reservations — one more way (among too many, including the kidnappings and murders of native women and the destruction of native lands for pipeline construction) in which native communities continue to face the threat of disappearance. But the Supreme Court decision embodies a perspective in which Native-American rights and presence are seen and supported, and where native communities are defined as foundational and integral parts of America’s history, identity, and future.

Featured image: The delegation of Sioux chiefs to ratify the sale of lands in South Dakota to the U.S. government, December 1889 / photo by C.M. Bell, Washington, D.C. (Library of Congress)

The Long Tradition of the Smear Campaign

"Another Voice for Cleveland"

There’s always the hope, with the start of every presidential campaign, that this time it will be different. This year, maybe, the candidates will offer intelligent, practical solutions to the country’s problems. They’ll emphasize what they intend to do rather than dwell on the many shortcomings of their opponents.

And usually we’re disappointed. No matter how earnest and well-intentioned a presidential campaign is at the outset, by the time it approaches the finish line it usually takes on an atmosphere somewhere between a carnival midway and a bar fight.

We had an intelligent, respectable election once, and the winner was George Washington. By the time the next election came around, the gloves were off and the tar buckets filled, as Jack Anderson pointed out. [The Pulitzer Prize-winning author’s article, “The Dirtiest Campaign Tricks in History,” appeared in the Post in November 1976.]

In the 1796 election, John Adams suffered a blow when the Boston Independent Chronicle alleged that during the Revolution he had publicly supported Washington while surreptitiously attempting to have the General cashiered. In truth, it was Adams’s second cousin, Sam, who had sought Washington’s scalp.

Adams’s opponent, Thomas Jefferson … was accused of being the son of a half-breed Indian and a mulatto father. Voters were warned that Jefferson’s election would result in a civil war and a national orgy of rape, incest, and adultery.

Andrew Jackson's ultimate goal, according to opponents.

Andrew Jackson [was portrayed by his opponents] as a bloodthirsty wild man; a trigger-happy brawler; the son of a prostitute and a black man… his older brother had been sold as a slave [and] Jackson … had put to death soldiers who had offended him. Worst of all, Jackson and his wife were depicted as adulterers. Through a technical mixup, Rachel Jackson had married Andrew before her first husband divorced her. “Ought a convicted adulteress and her paramour husband be placed in the highest offices of this free and Christian land?” screamed the Cincinnati Gazette. Rachel succumbed to a heart attack before the couple could move into the White House, and many of Jackson’s advocates attributed her death to the calumnious campaign of 1828.

In 1839, Martin Van Buren was accused of being too close to the Pope, when, in fact, he had done little more than correspond with the Vatican in his job as Secretary of State under Andrew Jackson. His opponents, nevertheless, spread the canard that a “popish plot” was afoot to ensure Van Buren’s election.

During the Polk-Clay race of 1844 the Ithaca, New York, Chronicle [quoted] … one Baron Roorback … [who] had witnessed the purchase of 43 slaves by James K. Polk. The entire story was a hoax. Polk had purchased no slaves; in fact, there was no Baron Roorback. But that didn’t keep the story from gaining wide attention.

During the campaign of 1864, Lincoln was tagged with every filthy name in the political lexicon, from ape to ghoul to traitor. Midway through his first term, his detractors accused his wife of collaborating with Confederates, a charge which compelled the President to appear, uninvited, before a Senate committee which was secretly considering the allegations [and swear to his wife’s innocence.]

In a rather complicated cartoon, Satan lures James Polk toward war with Britain over the Oregon territory.

The campaign of 1884 held the dubious honor of being the dirtiest in American history. … In July, the Buffalo Evening Telegraph … accused Cleveland of fathering an illegitimate son a decade earlier in Buffalo. It turned out that Cleveland, a bachelor, had dated the child’s mother, as had several other men. The boy, therefore, was of questionable parentage. Yet the inherently decent Cleveland had provided for him. A chant soon arose in Republican ranks: “Ma! Ma! Where’s my pa? Gone to the White House, ha! ha! ha!”

Cleveland’s opponent, James G. Blaine, … [was] involved in a business scandal. A railroad line had permitted him to sell bonds for a generous commission in return for a land grant. “Burn this letter!” Blaine instructed one cohort in a cover-up attempt. Thus evolved the Democratic comeback to Cleveland’s critics: “Blaine, Blaine, James G. Blaine, the continental liar from the State of Maine.”

Warren Harding … became the subject of a whispering campaign about his ancestry. A great-grandmother, it was alleged, had been a Negro, and a great-grandfather had Negro blood.

The dirty tricks don’t end once the ballots have been cast, either.

Candidate Lincoln, according to pro-South Democrats, would lead the country straight into insanity.

In the election of 1876, Democrat Samuel Tilden won the popular election but fell one electoral vote shy of a majority. The electoral tallies in several states were counted and recounted, juggled and changed, until finally the election was thrown into the Congress. A Republican Senate and a Democratic House set up an Electoral Commission to decide the winner. Through some political maneuvering that fairly reeked of scandal, Republican Rutherford B. Hayes was declared the victor.

Lyndon Johnson first won his Senate seat in 1948 by an 87-vote margin when 203 previously unnoticed ballots were miraculously discovered several days after the election. The “voters,” curiously, had approached the polls in alphabetical order, and 202 of them had cast their marks beside the Johnson name. This election gave LBJ his nickname of “Landslide Lyndon.”

Dead men not only vote in American elections; occasionally they are candidates. Philadelphia’s Democratic party bosses, for example, ran a dead man in last April’s primary. The cadaverous candidate was Congressman William Barrett, who departed the scene fifteen days before the election. The party hacks kept Barrett’s name on the ballot in the hope that uninformed voters would select him anyway. Thus the bosses could handpick his replacement. Barrett won.

 

Next: The Big Change in Presidential Campaigns