It’s easy to say that a show redefined television, but it’s much harder to prove. In the case of The Mary Tyler Moore Show, you might say that the proof is all around. The series reinvented the mold of the ensemble comedy. It changed the way comedy shows were directed. It had a cast strong enough to spin off three separate characters into their own series (one of which was a drama!). And it wasn’t afraid to engage with very serious topics, including some that were taboo at the time. The show still shows up on “Best of” lists, including best writing, acting, and direction, as well as Best Finale and Funniest Moment (seriously, the Chuckles funeral). It’s a program where everything still holds up remarkably well five decades later. That’s right: The Mary Tyler Moore Show launched 50 years ago, and TV is much better because of it.
Series co-creator Allan Burns started writing in animation, working on Jay Ward productions like The Rocky and Bullwinkle Show. He co-created The Munsters and worked as a story editor on Get Smart. James L. Brooks broke into TV news in the 1960s on the writing side. Brooks met Burns at a party, and Burns got him TV writing work. After working on several shows, Brooks created Room 222; when he left after the first year to develop other projects, he got Burns to come aboard as producer. Soon after, television executive Grant Tinker hired the duo to create a show for his wife. His wife happened to be Mary Tyler Moore, who was already beloved and famous for her long-running role as Laura Petrie on The Dick Van Dyke Show. Leaning on Brooks’s background, they decided to build the show around the goings-on in a metropolitan TV newsroom, with Moore’s Mary Richards as the associate producer at the center.
The nucleus of the newsroom cast was Moore, Ed Asner, Gavin MacLeod, and Ted Knight; Valerie Harper and Cloris Leachman played Mary’s best friend, Rhoda, and neighbor, Phyllis, respectively. Over the years, as Harper and Leachman left for spin-offs devoted to their characters, the cast would add Georgia Engel and Betty White to great effect. The Mary Tyler Moore Show managed to be both entertaining and relevant. Mary Richards was single throughout the tenure of the show, and not forcing the character to be defined by a man or a relationship was groundbreaking. Similarly, Rhoda grappled with body image issues. No character was one-note; even Ted Knight’s Ted Baxter, for all his incompetent bluster, had moments of humanity and deepened further when Engel’s Georgette was added to the cast as his wife. The shading of each character, rather than simply assigning a type and relying on it, became a sitcom staple.
While the actors made it all work, the writing and directing had a great deal to do with it. Brooks would apply the dynamics of ensemble building to shows like Taxi and The Simpsons. When you watch an episode like “Chuckles Bites the Dust,” you’re witnessing something akin to an eight-sided tennis match; each character is bouncing jokes and responses back and forth, but the jokes are all rooted in that particular actor’s character. When Mary can’t control her giggles during the clown’s funeral, and later bursts into tears, it’s funny because a) it’s funny, b) it’s believable human behavior, c) it’s all true to what we know about Mary, and d) it’s deeply relatable. If you’re thinking that the same principles apply to many episodes of the show, you’d be right.
Another important facet of the show was that it didn’t turn its back on difficult topics. Beyond the personal struggles that Mary and Rhoda faced, characters were involved in plots that brought up issues of the day (and of today). One memorable moment came in the episode “You’ve Got a Friend”: when Mary’s visiting mother told Mary’s father not to forget to take his pill, he and Mary both replied, “I won’t,” implying, of course, that Mary was on birth control. The show also addressed equal pay for women, among many other storylines that remain relevant today.
As the show went on, appreciation for it grew. It pulled in 29 Emmys, including three for Outstanding Comedy Series in 1975, 1976, and 1977. Moore also won three times for Outstanding Lead Actress in a Comedy Series. It was a solid Top 20 entry in the ratings for most of its run, with three years spent in the Top Ten. When the show received a Peabody Award in 1977, it came with the statement that the show had “established the benchmark by which all situation comedies must be judged.” Since the show’s end after its seventh and final season, it has routinely placed on lists recounting the best in television, including lists from TV Guide, Entertainment Weekly, and USA Today. In 2013, the Writers Guild of America put it at number six on its list of the best written television series of all time.
After the series ended in 1977, a third spin-off, Lou Grant, was launched. Ed Asner led the series for five seasons, during which it won 13 Emmys, two Golden Globes, and its own Peabody. Plans were made for Mary and Rhoda to reunite in a sitcom; however, those were later abandoned. Mary and Rhoda did meet up again in the 2000 TV movie Mary and Rhoda. A 2002 reunion special brought the entire surviving cast back together (Knight had passed in 1986) for a retrospective look at the series. Today, all seven seasons are available to watch on Hulu.
After the series, Moore worked continuously across film, television, and theater. She earned an Oscar nomination for her role in 1980’s Ordinary People. She and Tinker divorced in 1981, and she married Robert Levine in 1983. A type 1 diabetic, Moore served for years as the international chairperson for the Juvenile Diabetes Research Foundation. She was also active in animal rights and in the restoration and preservation of Civil War history. Moore passed away in 2017 at the age of 80.
The Mary Tyler Moore Show left an indelible mark on television. You can see its DNA in everything from Cheers to Friends to The Office. Any show with a workplace at the center is bound to be compared to it, and any show that features adults talking to each other like adults recalls its boldness. Few shows are daring enough to make you laugh at a clown’s funeral; far fewer could make it one of the most memorable scenes in TV history. A classic by any measure, the show’s impact will likely never go away. It certainly made it, after all.
During the run of the show, the Post went behind-the-scenes with Moore in a wide-ranging interview from 1974. You can read that story below.
Featured image: Cast photo from the television program The Mary Tyler Moore Show. After the news that most of the WJM-TV staff has been fired, everyone gathers in the newsroom. From left: Betty White (Sue Ann Nivens), Gavin MacLeod (Murray Slaughter), Ed Asner (Lou Grant), Georgia Engel (Georgette Baxter), Ted Knight (Ted Baxter), Mary Tyler Moore (Mary Richards). (Publicity Image from CBS Television; Public Domain via Wikimedia Commons)
Run Time: 1 hour 49 minutes
Stars: Rosamund Pike, Sam Riley, Yvette Feuer, Aneurin Barnard
Writer: Jack Thorne
Director: Marjane Satrapi
Streaming on Amazon Prime
Tackling the life story of pioneering nuclear scientist Marie Curie, Rosamund Pike continues her recent explorations of tough-to-pin-down historic women — women who tossed aside their cultures’ expectations and plunged stubbornly forward, either failing to hear the cries of objection or simply choosing to ignore them.
In A United Kingdom she played Ruth Williams, a London woman who defied the social norms of two countries when she married an African king in the late 1940s. She chain-smoked and growled her way through A Private War, painting an uncompromisingly coarse portrait of war correspondent Marie Colvin. And here in Radioactive, playing the Mother of the Atomic Age, Pike strikes yet another defiant pose as a woman who, despite her obvious brilliance, battles at every turn to make her mark in the male-dominated scientific world of the early 20th century.
It’s a startling performance that commands virtually every moment of the film’s run time, as Pike’s Curie runs into one institutional blockade after another.
Unmarried and fighting to keep her position at a Paris laboratory, in one early scene Curie storms into an all-male (of course) board meeting to demand more lab space — and ends up fired. Facing professional ruin, Pike’s face swims with conflicting emotions: fury, surprise, hurt, and dread fear. But even as her expressions flit from one state of mind to another, she seems to grow in stature to the point where the guys with the cigars — and we — begin to wonder if she’s going to leap at them from across the conference table.
So prickly is Pike’s Curie that we almost gasp in astonishment when she lowers her stoic resolve long enough — but just barely so — to fall in love with and marry Pierre Curie, an uncommonly open-minded fellow scientist (Sam Riley, channeling the same suave charm that made him the perfect alt-world Mr. Darcy in 2016’s Pride and Prejudice and Zombies).
Iranian director Marjane Satrapi — Oscar-nominated for her animated film Persepolis — knows she’s got a good thing going in Pike. When she’s not simply turning the whole film over to her star, she’s taking devilish delight in the world’s turn-of-the-century radiation mania, depicting with dark glee such products as radioactive toothpaste and chewing gum. Of course, the fun is all over when Marie, Pierre, and their fellow scientists start coughing up blood, signaling the awful realities of unchecked radiation.
Screenwriter Jack Thorne (Wonder, The Aeronauts) makes the risky choice of repeatedly flashing forward to the decades following Curie’s death, reminding us of the world her discoveries created, from the atomic bomb to radiation therapy to Chernobyl. Indeed, at times he literally injects Madame Curie herself into these scenes, a ghostly witness to her complicated legacy. It doesn’t always work — Curie’s life is compelling enough without resorting to a tricked-up narrative — but the ploy does serve to remind us that although the events here unfolded more than a century ago, some of the modern world’s most profound dilemmas harken back to that dusty, irradiated laboratory in pre-World War I Paris.
In any case, all is forgiven whenever Pike is on the screen. History is always more fun when filmmakers leave the rough edges intact, and Radioactive does just that — thanks mainly to the superb work of an actor who thrives on showing those edges in stark, supremely human relief.
Featured image: Rosamund Pike in Radioactive (Photo Credit: Laurie Sparham; StudioCanal/Amazon Studios)
In a time before television or Twitter, the traveling circus was a dominant, all-American form of entertainment. The young and old, rich and poor alike would gather to witness the exotic animals and superhuman acrobatics under the big top each summer. Traveling circuses would shut down entire towns with their parades and engulf the country in “sawdust and spangles.”
The larger-than-life showmen behind the country’s great circuses, like P.T. Barnum, James Bailey, and the Ringling Brothers, are still household names, but many of the performers, daredevils, and animal trainers who astounded their audiences and made weekly headlines have been forgotten.
Part of the appeal of circus folk lay in the mystery surrounding them. Who were they and where did they come from, these people who descended on the most provincial towns with strange talents and glamorous costumes? At the turn of the century, the circus spotlight began to shine on women, highlighting their athletic talents juxtaposed with dainty beauty.
Historian Janet M. Davis writes in The Circus Age that “In an era when a majority of women’s roles were still circumscribed by Victorian ideals of domesticity and feminine propriety, circus women’s performances celebrated female power, thereby representing a startling alternative to contemporary social norms.”
According to “The Circus Girl,” an article by L.B. Yates that appeared in the Post 100 years ago, the intrigue of the female circus performer was heightened by press agents who sowed fictional accounts to local papers of either some rich society girl or poor waif who had found a calling as a trick horse rider in the circus. It was usually a fabrication, as Yates claimed, since the vast majority of circus performers were born into the profession.
But the treatment of this new crop of female performers, in the press and in the culture, was a tightrope act of sorts. The circus offered a home for some outsiders and castaways and provided independence and adventure for its female stars. For audiences, it afforded titillating, exotic glimpses of the limits of the human body while promising to uphold decency in its ranks.
The women of the American circus during its golden age worked tirelessly to perfect their acts, stealing shows and breaking stunt records at a unique time in history that straddled the old conventions with the new promise of suffrage and feminism.
“I resent having people come to my tent, stare at me as though I were a freak and then turn away laughing, as if they’d seen some wild animal,” the famous aerialist Lillian Leitzel told the Post in 1920. “They seem to assume that circus people have not got beyond the primitive stage of the cave man and are an aggregation of unlettered louts wholly devoid of the commonest sense of social amenities.”
Leitzel was indeed rich in social amenities, and she was also just plain rich. According to historian Janet M. Davis, the performer was making up to $200 a week in 1917 with Ringling Brothers (worth more than $4,000 in 2020), and it wasn’t uncommon for female circus stars to rake in more than their male counterparts. She had her own train car that contained a piano, and at each stop she would dress in her own private tent. By the 1920s, she was pulling in $500 per week, according to John Culhane in The American Circus.
Known as a brassy prima donna, Leitzel grew up in a Czech circus family and began in the gymnasium at 3 years old. Her signature act was to bound up the rope and hold her hand in a loop while throwing her body up and around, dislocating her shoulder over and over again as she performed her “one-arm planges.” As a percussionist carried on a drumroll, the audience would count her swingovers, one time as many as 249.
“Accidents? Oh, well, they’re liable to happen any time,” Leitzel told the Post. “But I never think of that. Whenever a performer gets to studying about the chances he or she is bound to take, he has outlived his usefulness. An acrobat must have hundred per cent nerves.”
She entertained other elite performers, senators, businessmen, and especially children in her private car. She was known for giving parties for all of the circus children and lavishing gifts and sweets on them, possibly because she was stripped of a childhood herself, according to Culhane.
Leitzel encouraged women to exercise to stay healthy, in spite of lingering Victorian notions that athleticism made women ugly and manlike. “Down with the corset,” she told newspapers in 1923. “Put a brick in the atrocious garment and hurl it into the Niagara River.” She encouraged women to take up swimming or some other sport they enjoyed, promising “you can eat what you want and work off the energy in exercise.”
Leitzel married the Mexican trapeze artist Alfredo Codona, and their passionate celebrity marriage was rife with jealousy and resentment. When Leitzel traveled to Europe in 1931, Codona went separately with an equestrian with whom he was having an affair. In Copenhagen, Leitzel was performing her one-arm planges when the brass ring she was holding snapped, and she fell 20 feet onto her head. Although she insisted on continuing her act, her handlers sent her to the hospital. She died the next morning from a concussion.
Mabel Stark was working as a nurse — with a stint as a burlesque dancer — when she found a calling to work with big cats in Venice, California around 1911. She stumbled onto the grounds of the traveling Al G. Barnes Circus and became obsessed with Bengal tigers. Stark joined the show, and within a few years she was a big cat trainer. Stark was the first American woman to take up the dangerous profession, let alone to achieve such renown.
Over the years, she became one of the most famous tiger (and lion and panther) tamers in the world, joining the Ringling Brothers in 1920. Newspapers gushed over her unique wrestling act. Stark would roll around with any of her 18 tigers, giving the impression of a perilous struggle.
Although Stark spent most of her time with her cats and treasured them dearly, she never lost sight of the risk of working with tigers. She gave some insight to a Public Ledger reporter in 1923: “The tone of the voice, the determination of it, has a great deal to do with subduing wild animals. Don’t let uncertainty or fear creep into your tones, or you’re gone. A great life, so to speak, if you don’t weaken.”
As with many who work with predator animals, Stark sustained serious injuries throughout her career (according to a profile in the St. Louis Globe-Democrat in 1950). In Maine, she was mauled in the ring and required 378 stitches. In Arizona, she was bitten by a tiger named Nellie and finished her act with a limp left arm. In spite of it all, she continued to defend her fierce friends and the connection she had with them.
Stark’s story ultimately ended in tragedy. After retiring from the circus in the late 1930s, she settled in at an amusement park called Jungleland. Stark was let go for insurance reasons, and then one of her tigers escaped and was shot. The devastated near-80-year-old took her own life.
Billed as “The World’s Greatest Rider,” May Wirth was only 17 when she made her first appearance in the Ringling Brothers’ show at Madison Square Garden. The Australian transplant wore a giant hair bow and leapt from horse to horse, completing flips and twists that dazzled audiences and — sometimes more so — other riders.
Wirth was an orphan who performed as a five-year-old contortionist and was eventually adopted by the down-under circus rider Marilyas Wirth Martin. According to the Braathens, two Madison, Wisconsin circus buffs writing in The Capital Times in 1973, Will Rogers witnessed young Wirth riding in her home country and predicted “the day would come when there would be but two types of bareback riders in the world, May Wirth and all the others.”
When the Braathens asked about her legendary forward somersault, Wirth recalled that Mr. Ringling was skeptical that anyone could complete such a trick, and when she tried it for him: “I missed it and fell on my back right in front of Mr. John. I got up like a streak of lightning, ran and vaulted onto the horse again and did the stunt over for him, this time perfectly, and did the somersault feet to feet. That earned me my first season’s contract with John Ringling.”
The sweetheart of the circus made the front page of The New York Times after an unfortunate slip in 1913 caused her to be dragged around the track behind her horse by her feet. Nine years later, the paper reported that she had accepted the challenge of riding a bull, and better yet: “Miss Wirth not only rode the bull, reputed to be a most ferocious beast, 3 years old and weighing 2,400 pounds, but she rode him standing on her hands upon his back.”
Wirth retired from the big top in 1937 and went to live with her mother in New York. She later moved to Sarasota, where the Ringling Museum and Circus Hall of Fame was located. She was inducted in 1964.
Featured image: 1890-1900, Calvert Litho. Co., Library of Congress Prints and Photographs Division Washington, D.C.
The Circus Age by Janet M. Davis
The American Circus by John Culhane
Women of the American Circus, 1880-1940 by Katherine H. Adams and Michael L. Keene
This series by American studies professor Ben Railton explores the connections between America’s past and present.
Late last week, Massachusetts Senator Elizabeth Warren ended her bid for the Democratic presidential nomination, leaving the primary without any of the ground-breakingly high number of women who had once been part of its slate of candidates. (Hawaii Congresswoman Tulsi Gabbard remains in the race but has received only two delegates and has never polled above low single digits.) While Warren’s campaign and exit were both influenced by a number of factors, her departure has occasioned numerous commentaries on the continued, frustrating reality (particularly when compared to most other nations in the world) that the United States has never elected a woman to its highest political office — a reality particularly worth engaging on the occasion of Women’s History Month.
The presidency is a strikingly visible element of American society, making the historic and continued absence of women likewise quite apparent. But that absence also reflects a far wider and deeper, and yet often more difficult to spot, aspect of our collective histories: the ways in which sexism and the glass ceiling have driven so many of our most talented and impressive women out of their chosen professions, leaving our entire society profoundly diminished as a result. Few American figures and stories encapsulate that hidden cost of sexism better than the architect Sophia Hayden Bennett (1868-1953).
Hayden was born in Santiago, Chile, to a Chilean mother (Elezena Fernandez) and an American father (George Hayden), a dentist who had moved to Chile from his native Boston a few years earlier. When she was six, her parents sent her back to the Boston area by herself, to live with her grandparents in Jamaica Plain and attend school. While studying at West Roxbury High School she became interested in architecture, and she would go on to attend MIT, graduating in 1890 as one of the first two female graduates of a collegiate architecture program in American history (her classmate, Lois Lilley Howe, with whom Hayden shared a drafting room at MIT, was the second).
Despite that prestigious degree, Hayden was unable to find an apprentice position at any local architecture firms and took a job teaching mechanical drawing at the Eliot School, a vocational grammar school in Boston. But less than a year later she learned of a strikingly unique new opportunity: the chance to design the Woman’s Building, one of the planned exhibition halls for the 1893 World’s Columbian Exposition in Chicago. After extensive negotiation, women’s rights groups had convinced the exposition directors to create a Board of Lady Managers composed entirely of women, and in February 1891 that board, headed by the socialite and activist Bertha Honoré Palmer, announced a competition for the Woman’s Building design, open only to female architects.
The competition’s $1000 prize/commission was significantly less than what was offered to male architects for the exposition’s other buildings, and so Louise Blanchard Bethune, considered the first professional female architect in America, refused to submit a concept. But Hayden did submit, and out of the 13 proposed designs it was her innovative concept, based in part on concepts from Italian Renaissance classicism, that the Board (along with Chief of Construction Daniel Burnham) selected as the winner. She traveled to Chicago to begin work on turning that design into construction plans as quickly as possible, as construction needed to begin in the summer of 1891 to be ready for the exposition’s May 1893 opening.
That hugely expedited timeline was only one of many challenges that Hayden would face over the next two years. Despite having chosen Hayden’s design, the Board of Lady Managers identified a number of perceived shortcomings and requested many changes, including the addition of an entirely new third story, which required Hayden to overhaul many other aspects as well. Moreover, Board Chair Palmer decided to take control of the building’s interior design away from Hayden entirely after the architect resisted some of the Board’s other substantial proposed changes. Hayden managed to respond to, complete, and work around such extraordinary requests within that very tight schedule and on budget to boot, and the Woman’s Building was formally introduced at the exposition’s October 21, 1892, dedication ceremony. But shortly after, it was reported that Hayden suffered a possible nervous breakdown (in some reports referred to as “melancholia”) in Burnham’s office, and she was confined to a “rest home” for months in order to recover.
It’s impossible to know precisely the role that Hayden’s gender played in all those developments, although it’s certainly important to note that such medical diagnoses and conditions themselves were hugely gendered in the late 19th century (as illustrated by the long history of the illness known as “hysteria,” as well as related contexts like physician S. Weir Mitchell’s popular “rest cure” for women and the depiction of its destructive effects in Charlotte Perkins Gilman’s autobiographical short story “The Yellow Wallpaper”). Moreover, many of the prominent responses to her design were overtly limited by narratives of gender, as illustrated by architect and critic Henry van Brunt’s argument (as part of a long series entitled “Architecture at the World’s Columbian Exposition”) that the building seemed “rather lyric than epic in character, and [that] it takes its proper place on the Exposition grounds with a certain modest grace of manner not inappropriate to its uses and to its authorship.”
Although Hayden was “cured” in time to leave the rest home and attend the exposition before its November 1893 closing (after which the Woman’s Building, like most of the exposition’s structures, was demolished), she would, as far as we know, never design another building. An editorial in the journal American Architect lamented that fate while managing to reinforce sexist narratives at the same time, writing, “Miss Hayden has been victimized by her fellow-women. The unkind strain would have been the same had the work been as unwisely imposed upon a masculine beginner.” Hayden returned to the Boston area, eventually marrying local painter William Bennett in May 1900. Their wedding announcement in the Boston Daily Advertiser noted that “both are highly esteemed and respected . . . versatile as well as talented,” but again as far as we know Hayden never again publicly employed her prodigious talents, living her remaining six decades as a private person in their Winthrop, Massachusetts, home.
What might American architecture, America’s cities, American society have looked like if Sophia Hayden had continued to design buildings? What would American history look like if we had elected a woman president by now (or if women had received the vote before 1920, for another example)? Such questions remain painfully hypothetical and unknowable, reflecting a collective loss that parallels the individual and personal costs of sexism so embodied by a figure like Sophia Hayden.
Featured image: The 24 female MIT students in 1888. Sophia Hayden is in the front row, far left. (MIT Museum)
Today, it’s a time capsule. A Sinatra record playing on my portable record player as I studied for an exam in U.S. History, the little-known “Night We Called It A Day.” Still a favorite. There was a moon out in space/But a cloud drifted over its face/You kissed me and went on your way/The night we called it a day.
The place was my dorm room at Stephens Junior College for Women in Columbia, Missouri, in 1942.
It was a time when men weren’t allowed on dorm floors above the entry lobby except the day of arrival or departure, for hellos and goodbyes and moving trunks. That included fathers. Brothers. Fiancés. It mattered not. And it was standing policy that if you were caught (even seen) in a car with a member of the opposite sex without written proof from home that he was your brother or fiancé, you were subject to immediate and automatic expulsion.
When you went into town for dinner or over to the University of Missouri to see the latest student play, you had to sign out … and back in with a counselor at the door, which would be locked (depending on your year) at 10:00 or 11:00.
My first year, my roommate was from the Lone Star State. We quickly dubbed her “Texas.” Her father had a grocery store, and her packages from home came in big, big, big boxes. One of the first things I learned in college was that Hershey bars come packaged in boxes of 24 — not a good thing for me. (One of the special offerings of Stephens College was a Diet Table, for those who could lose a few pounds while gaining a wealth of new knowledge.)
The social life of Stephens revolved around the “Blue Rooms” — areas in dorms where we gathered to smoke, because we weren’t allowed to smoke in our rooms or anywhere else on campus. The Blue Rooms also had soft drinks and various snacks, as well as light lunches.
I particularly remember my visits on the Sunday afternoons following our trips to town to see the latest movie. One time, it was Now, Voyager, and when I walked into the Blue Room I frequented near my dorm, it was obvious who’d also just seen the film — the girls who were lighting two cigarettes and handing one to a friend, as Paul Henreid had done in the film, one of the more famous moments in his romance with Bette Davis.
Another time, everyone was singing, “You must remember this, a kiss is just a kiss, a sigh is just a sigh ….” But not as well as Dooley Wilson in Casablanca.
I don’t remember any war movies then. But the war was a factor in my being there. Stephens was the choice my parents decided upon (principally, my mother) because I was just 16 when I graduated from high school and the war had started. She apparently foresaw that college life would no longer be a Good News movie; indeed, universities would become training centers for the U.S. Armed Services — Army and Navy Air Force pilots and bombardiers — and she liked Stephens’ approach to female education.
Its president, James Madison Wood, felt that while women were entitled to an education — equal, in principle, to that of men — they had special needs and interests. He addressed this with the creation of a group of clinics for the Stephens girls. The clinics were highly publicized, perhaps because of the novel approach or perhaps because of the nature of some of the clinics.
The Marriage Clinic drew my second-year roommate, who left school to marry her hometown sweetheart, then stationed at a Naval Air Training base. There was a Budget Clinic, to which my father kept pointing me. At a Fashion Clinic, I had a formal designed for me. And a Glamour Clinic — the one that had gained the most national attention. The one to which Stephens girls would go to learn how to make the most of their physical attributes: hair, make-up, lipstick colors.
My mother, for whom the Glamour Clinic was an added attraction to the other lures of Stephens, wanted me to get an appointment shortly after classes started. But, what with one thing and another — getting textbooks, settling in to classes — I had not made an appointment. Then, word spread about those who had.
They came back with their hair cut!
If that doesn’t sound like much today, it was the kiss of death then. We wore our hair long. Mine — even with the upturned curl of an incoming wave — grazed my shoulders. No way was I going to go near a place that wanted to cut my hair! And so I resisted my mother’s increasingly persistent questions in her letters as to when I had an appointment at the Glamour Clinic. That was an important feature she’d noticed in the school catalogue. Why didn’t I take advantage of all my opportunities at Stephens?
As the time for Christmas break approached, I felt I could not go home without going to the Glamour Clinic. It meant so much to her. Could I make clear I didn’t want my hair cut? So I went to make an appointment. The only one available was a Saturday in December. I booked it, not realizing it was also the Saturday of Hell Day for sorority pledges.
We were told by our sororities what costume to prepare and wear that day. And we would march around the main part of campus after lunch in our costumes. As with all things then, World War II affected Hell Day.
My fellow pledges and I were Flying Tigers, the name of a group of American volunteers recruited to fight the Japanese in the China-Burma Theatre before the Japanese bombed Pearl Harbor.
To become a Flying Tiger in Columbia, Missouri, I went into town and bought a suit of men’s long underwear, a package of Rit dye — orange, obviously — and a packet of black crepe paper. I duly dyed the suit and sewed strips of black crepe paper to simulate stripes. I braided some of the strips of black crepe paper for the tail. I made a propeller from the cardboard at the back of a notebook pad and, later, stuck the circular center to my forehead, also front and center, the blades vertical for visibility.
On that December morning I dressed in said outfit and headed across campus to Senior Hall, where all pledges were to dine that morning. Then I returned to my room, ready to head out not to a Saturday class but my appointment at the Glamour Clinic.
I appeared in the doorway as a faculty-type lady opened up for the day. I might be there for my mother but no way was I having my hair cut. Jaw set, words as emphatic as it was in my power to make them, I said, more proclamation than statement: “I won’t let you cut my hair!”
Something about the reaction of the Glamour Clinic lady made me realize that was not my problem. The frozen smile was a clue.
The faux Flying Tiger before her was eye-catching for starters, and I was all the more noticeably overweight in the suit of men’s long underwear dyed a Technicolor-bright orange. I bulged in all the wrong places. Corrugated cellulite comes to mind. Sometimes the bulges synchronized with my tiger stripes, sometimes not. The wet snowfall had loosened my propeller, not securely stuck to my forehead, and it fell off. When I bent to retrieve it, my tail caught on something and almost came off. I quickly secured it.
Suffice to say, I went through the routine/procedure/protocol of the Glamour Clinic, which was designed to help me learn how to make the most of my appearance, in record time.
I don’t remember a thing until the end, when the Glamour Clinic lady had me seated at one of those basins where you get your hair washed at beauty parlors. I was in the chair, tipped back, head in the big metal tray, as she talked about my eyebrows. She may have plucked one or two stragglers. I just remember she was talking about my eyebrows when a voice was heard in the doorway. I say voice because I was flat on my back and the Glamour Clinic lady had her back to the door.
“I know I’m early, but ….”
“No. No,” said the Glamour Clinic lady. “Come right in.”
It would be hard to imagine a more sincere welcome. The Glamour Clinic lady’s heartfelt Thank you! to a merciful god for her deliverance.
No need to imagine the rest. She snapped me up to a sitting position with such speed that it’s a wonder my eyeballs aren’t still spinning.
When I looked to the doorway, I understood the silence that followed. Settled over the otherwise empty expanse of the Glamour Clinic.
A speechless silence.
The next appointment was having trouble getting in.
She had outdone my Flying Tiger. With the help of some well-shaped cardboard and gray paint, she was a battleship.
Returning to my room, I sat down at my desk, pulled out a piece of stationery, and, pen poised, began.
About my appointment at the Glamour Clinic ….
Featured image: College women playing bridge, 1942 (Wikimedia Commons)
The Pilgrims were a central part of Thanksgiving when I was a child, and well into my adulthood. The Pilgrims offering thanks for a bountiful harvest that first fall in the New World — in October 1621, not today’s November — were the centerpiece of the holiday.
It’s different now.
When Thanksgiving rolls around each November, the Pilgrims have so faded from our history that their story might have been written in invisible ink.
If you think 1620 is so long ago — who cares when the Bears are kicking off against the Lions — let me tell you about one of them, especially appropriate this year, when women have been so much in the news.
Susanna Jackson White.
She was one of the passengers on the Mayflower when it sailed out of Plymouth Harbor in England on Sept. 16, 1620. It was a small ship, by today’s standards, 90 feet long and 27 feet wide. A tennis court is 78 feet long, 26 feet wide (for singles play). The Pilgrims didn’t have assigned cabins. They crossed in the cargo area … because they were the cargo. And a larger one than intended. The Pilgrims had started out with two ships, but the second, the Speedwell, developed a leak after sailing — not just once, but twice — and they had to return to England. After the second return to port, almost 40 of the Speedwell passengers were added to the original 65 on the Mayflower. The shortage of food, the rigors of a crossing so rough that one storm caused the ship’s pitching to crack a main beam, must surely have been the more difficult for Susanna, who was seven months pregnant.
The bad weather persisted after they reached America and sighted the “hook” of present-day Cape Cod. They tried repeatedly to sail south to their original destination, the Colony of Virginia, but the winter weather and rough seas forced them back to Cape Cod. The delays in sailing, and the weather that then kept them from going farther south, meant they would have to set up their settlement in New England.
While the ship lay at anchor off Cape Cod at what is now Provincetown, Susanna’s husband, William White, on November 21, 1620, signed the Mayflower Compact with the 40 other men — the document that envisioned a government of laws, not men, a government that drew its authority from the consent of the governed. The document that is considered by many to be the keystone of the Declaration of Independence and the Constitution of the United States.
While the ship lay at anchor, Susanna also produced her contribution to history: the first English child born in New England, a son named Peregrine.
It took a month for the Pilgrims to find a site for their settlement, a cleared area that had once been an Indian village on a good harbor. On December 20, the Mayflower dropped anchor at Plymouth. Whether or not they actually stepped out on Plymouth Rock when the Mayflower shallop (think lifeboat or really big rowboat) reached the beach is not clear. Historians dismiss it, even the story that has come to be legend for some, that the first to do so was Mary Chilton, 12. But anyone who has travelled with children constantly asking, “Are we there yet?” can easily imagine her eagerness to jump out, rock or not.
To give the families safe shelter aboard ship while they built their houses ashore, the captain of the Mayflower delayed his return voyage to England. But the days were cold. New England winter cold. And after the long voyage on short rations, a “General Sickness” began to take its toll. William White was one of the 51 who died that first winter — February 21, 1621. The loss to Susanna, left alone with a 5-year-old son and new baby, was compounded by the danger inherent in their steadily shrinking number; indeed, to keep the Indians from knowing how few there were, the dead were buried at night, in a common grave that was not marked.
With the arrival of spring and warmer weather, the captain of the Mayflower made preparations to sail. He offered to take with him anyone who wished to return to England. Although 51 of the 102 had died — literally, half — not one went back.
Still, one can only begin to imagine Susanna’s feelings as she stood on the desolate shore, even the Whites’ two servants dead, her small son at her side and six-month-old baby in her arms, watching the ship sail away.
Leafing through an old book some years ago — an old-fashioned book printed on paper so thick it might have been used for Tiffany Christmas cards — I came across an illustration. The full-page drawing, with a decidedly romantic quality, as I remember it, showed Susanna looking up, to see Edward Winslow, whose wife had died March 24, watching not only the departing ship but Susanna. Whether this has any basis in fact, I know not. I do know Susanna White and Edward Winslow were married May 12th.
Anyone feeling the marriage showed undue haste should remember that the society as well as the settlement was built around the family; hence, in an age when death was a commonplace event, it was not unusual for widows and widowers to remarry quickly. Just take a stroll through an old cemetery.
The wedding of Susanna White and Edward Winslow made Susanna the first English bride in New England.
The spring also marked better times. A treaty of peace was negotiated with the local Indian Chief Massasoit. And Squanto, a Native American who spoke English because he’d lived in England (long story), taught the Pilgrims how to use locally caught fish to fertilize the land and how to plant corn … five kernels. (One for the blackbird, one for the crow, one for the cutworm, and two to grow.)
That fall, “Our harvest being gotten in,” as Edward Winslow put it, “our governor sent four men on fowling, that so we might after a special manner rejoice together.” The 90 Indians who came contributed five deer. Susanna, one of four adult women to survive the first year, presumably was there, perhaps basting one of the fowl.
When Edward Winslow, long a leader of the colony, became the Governor of the Plymouth Colony, serving in 1633-1634 — also, 1636-1637, 1644-1645 — Susanna became the First Lady of the colony.
In 1633 it was a much larger colony, land grants having been awarded in late 1627. Although it is not known exactly when they did so, Myles Standish and John Alden — of Henry Wadsworth Longfellow fame and countless grade school Thanksgiving pageants — moved north to Duxbury. The Winslows went on to Green Harbor in what is now the town of Marshfield, where they built a handsome residence, “Careswell,” named for a family seat of Winslow’s in England. Their move north was prompted not only by the granting of land but by the fast-growing Massachusetts Bay Colony at Boston, which offered a market for the cattle they could raise.
The growth of the Massachusetts Bay Colony signaled the overshadowing of the Plymouth Colony. By 1643, Plymouth had joined the New England Confederation. Josiah Winslow, the first of Susanna and Edward Winslow’s children who lived to adulthood, followed in his father’s distinguished footsteps. Educated at the new Harvard College, he served as the Plymouth Commissioner to the Confederation.
During one of Edward Winslow’s trips to England on behalf of the colony, he was appointed by Oliver Cromwell to head an expedition against the Spaniards in the West Indies in 1655. He came down with fever on the voyage and died May 8, 1655.
Susanna and Josiah were mentioned in his will. The fact that Edward Winslow made no provision for Susanna in London, where he was living at this time, leads historians to conclude that she had remained at their home in Marshfield, Mass., and was still living.
Josiah Winslow continued to follow in his father’s footsteps.
When in 1673 he became governor of the Plymouth Colony, Susanna became the mother of the first native-born governor of any of the American colonies.
Josiah died December 18, 1680, in Marshfield. Because he made no mention of his mother in his will, which was dated July 2, 1675, it is assumed she was dead.
As there is no record of the year of Susanna’s death, there is no record of where she is buried, although it is thought she rests in the Winslow cemetery in Marshfield.
A year or so ago, though, I chanced upon the possibility that she is actually buried in the Old Granary Burying Ground in Boston. Laid out in 1660, originally part of the Boston Common, it is a popular site on The Freedom Trail today, for it is the final resting place of the victims of the Boston Massacre and three signers of the Declaration of Independence — John Hancock, Samuel Adams, and Robert Treat Paine — not to mention the parents of Benjamin Franklin. If Susanna is buried there, 150 years later, give or take — May 1818 — the empty grave to one side of her final resting place was occupied by another individual who has a place in our history: Paul Revere.
Although this cannot be substantiated, I pass it along, because there is a grave next to Paul Revere that has no gravestone. And John Endicott, the first governor of the Massachusetts Bay Colony, who died March 15, 1665, and was thought for more than 300 years to have been buried elsewhere, is actually in the Old Granary Burying Ground. His gravestone had been destroyed.
Why not Susanna’s?
Why not Susanna in the historic old cemetery on the Freedom Trail?
- The mother of the first English child born in New England.
- The first English bride in New England.
- The First Lady of the Plymouth Colony … on three occasions.
- The mother of the first native-born governor of an American colony.
And Susanna was just one chapter in the Pilgrim Story.
That deserves a place at the table any day. Particularly the Thursday each year that is Thanksgiving.
Featured image: “The First Thanksgiving at Plymouth” (1914) By Jennie A. Brownscombe (Wikimedia Commons)
When news came that the Boer women of South Africa were fighting alongside men in their war against the British, the Post applauded.
In war’s long, dreary hours of waiting, the quality of character that can endure quietly represents the very highest bravery that human nature is capable of, and in this greater heroism woman has almost a monopoly.
In their heroism, women are always better than men.
And it’s not only in great things that woman shows her nerve. The other day in Naples, two Boston ladies were leaving a shop. A man seized the purse of one of them, whereupon she took him by the throat, gave him a good shaking, slammed him upon the ground, recovered her property, and then in her cool New England way told him to move on. We can scarcely pick up any newspaper without finding a story of a woman capturing a burglar, stopping a runaway, or doing something of the instant sort that is the very essence of nerve; and we should not forget in this category the Connecticut widow who, although dreadfully afraid of mice, upon finding a lion from Mr. Barnum’s show in one of the stalls of her stable, deliberately whipped the beast away and sent him cowering down the road.
– “The Heroism of Women,” editorial by Lynn Roby Meekins, April 21, 1900.
Featured image: SEPS.
This series by American studies professor Ben Railton explores the connections between America’s past and present.
On February 14, Senator Kamala Harris introduced legislation into the Senate that would for the first time in American history make lynching a federal crime. The Justice for Victims of Lynching Act, originally drafted by Harris in June 2018, does more than just criminalize lynching—as its name suggests, it seeks to remember and in some small ways make amends for the thousands of lynchings that took place between the end of the Civil War and the 1960s. “With this bill,” Harris said, “we finally have a chance to speak the truth about our past and make clear that these hateful acts should never happen again. We can finally offer some long overdue justice and recognition to the victims of lynching and their families.”
The horrific stories of lynching are intimately intertwined with African American history, a significant factor in Harris’s choice to introduce the bill during Black History Month. Yet, as with any American history, lynching’s connections extend to every national community. For Women’s History Month, Harris’s prominent role in these unfolding 21st-century accounts can also help us remember the fraught, contradictory, and crucial links between American women and the lynching epidemic.
Perhaps the single most jarring defense of lynching was offered by a pioneering feminist activist. Rebecca Ann Latimer Felton, the wife and political partner of longtime Georgia Congressman William Harrell Felton, was one of the Progressive Era’s most prominent and acclaimed women’s rights activists: an advocate of women’s suffrage, equal pay, and many other feminist causes, she became the first woman to serve in the U.S. Senate when, at the age of 87, she was honored with a single-day appointment as Senator from Georgia on November 21, 1922. Yet she was also a white supremacist and racist who openly advocated for the systematic lynching of African Americans.
Felton made her case for lynching most vocally in an August 1897 speech to the Georgia Agricultural Society. While she identified a number of problems facing (white) farm wives in the state, she focused in particular on “the black rapist” and the threat he posed to those women. She repeated the canard that Reconstruction had given African Americans “license to degrade and debauch.” And in response to those imagined terrors, she argued, “When there is not enough religion in the pulpit to organize a crusade against sin; nor justice in the court house to promptly punish crime; nor manhood enough in the nation to put a sheltering arm about innocence and virtue—if it needs lynching to protect woman’s dearest possession from the ravening human beasts—then I say lynch, a thousand times a week if necessary.”
Felton’s bigoted speech reminds us that the era’s progressive white women far too often allied with the forces of segregation and white supremacy, both to further their movement’s goals and (as in Felton’s case to be sure) out of genuine and deeply rooted racism. Yet as historian Martha Jones has recently argued, the under-appreciated contributions of African American women to the women’s suffrage movement played a crucial role in advancing women’s right to vote. One of the most prominent such African American suffrage activists, Ida B. Wells, also happened to be the nation’s leading anti-lynching journalist and crusader.
The opening pages of Wells’s first book, Southern Horrors: Lynch Law in All Its Phases (1892), reflect the intersections of her women’s rights and anti-lynching activism. In her preface, Wells acknowledges the fundraising efforts of New York City women’s rights organizations that allowed her to publish the book, writing, “the noble effort of the ladies of New York and Brooklyn Oct. 5 have enabled me to comply with this request and give the world a true, unvarnished account of the causes of lynch law in the South.” Wells highlights both her own status as a target of white supremacist violence (when her Memphis newspaper office was burned down) and her courageous response to those attacks: “Since my business has been destroyed and I am an exile from home because of that editorial, the issue has been forced, and as the writer of it I feel that the race and the public generally should have a statement of the facts as they exist.” As an African American woman speaking out against these horrors, she likewise revises Felton’s images of race and gender, noting, “[The facts] will serve at the same time as a defense for the Afro-American Sampsons who suffer themselves to be betrayed by white Delilahs.”
While reports of lynching usually involved African American male victims, the epidemic also extended to other communities of color, including Chinese Americans and Mexican Americans. A recent New York Times article that highlights the histories of Mexican American lynchings in particular reveals another role for American women activists: as contributors to expanded collective memories. That includes the historians upon whose work that New York Times article depends: Professors Monica Muñoz Martínez of Brown University and Laura F. Edwards of Duke University. But it also includes women like Arlinda Valencia, the Texas educator and union official whose ancestors were among the victims of the January 1918 Porvenir mass lynching in which Texas Rangers and ranchers destroyed an entire Mexican American village.
Professor Martínez’s educational nonprofit organization Refusing to Forget has in the half-dozen years since its founding done particularly impressive work recovering those histories and sharing them with audiences of all types. Those efforts include historical markers for particular sites such as Porvenir, traveling and permanent museum exhibitions, and public lectures and conversations. Martínez and her colleagues have discovered a pattern of widespread violence directed not only at individual Mexican Americans, but also and especially at entire communities, with the Porvenir massacre sadly not atypical of these outbursts of collective brutality.
Like Senator Harris, these scholarly and civic historians are working to make the lynching epidemic’s histories more consistently present in our 21st-century collective memories. Doing so likewise requires remembering lynching’s complex intersections of race and gender, in both their most destructive and most inspiring forms.
Featured image: The Silent Parade in 1917 in New York City was organized by the NAACP to protest violence toward African Americans.
At the Democratic National Convention in St. Louis in 1916, Martha Gellhorn stood in a protest called the Golden Lane. She was only 8 years old, but her mother, a suffragist, had organized 7,000 women, wearing yellow sashes and carrying parasols, to line Locust Street — the route delegates would take to the St. Louis Coliseum. Toward the end of the line, seven women in white represented states that endorsed women’s suffrage, and many others in gray and black stood for states unwilling to budge. Gellhorn, in a white dress, represented a future woman voter.
This initial act of defiance was only the beginning for Gellhorn. The young girl in white would go on to lead a daring and adventurous career. Throughout her life, as a war correspondent and author, she witnessed some of the most consequential events of the 20th century, from the Great Depression to D-Day to the U.S. invasion of Panama. From her feminist start in St. Louis, Gellhorn travelled the world and became one of the most important journalists of the century. And she did it all without permission.
Like many writers and artists of the time, Gellhorn moved to Paris in 1930. After dropping out of Bryn Mawr College and working briefly for the New Republic, the young reporter was itching to become a foreign correspondent. Gellhorn landed a job with the United Press, but was fired after reporting sexual harassment by a coworker. She spent many years travelling Europe, writing for papers in Paris and St. Louis, and even covering fashion for Vogue. Even as she noticed the rise of fascism in Germany, Gellhorn remained a pacifist, moving in mostly leftist circles. She decided to return to the States in 1934 because, as she said, the poverty she witnessed in Europe was rampant in her own country.
Gellhorn soon earned a spot as an investigator in Roosevelt’s Federal Emergency Relief Administration, travelling around the country and giving firsthand reports of the grim living and working conditions. She trudged around North Carolina in secondhand Parisian couture, interviewing five families a day and growing ever more troubled about the widespread destitution. Gellhorn called on Roosevelt to do more about a syphilis epidemic and lack of birth control. Her boldness gained her the ear of Eleanor Roosevelt, who became a lifelong friend.
While on assignment in Idaho, Gellhorn convinced a group of workers to break the windows of the FERA office to draw attention to their crooked boss. Although their stunt worked, she was fired from FERA: an “honorable discharge” for being “a dangerous Communist,” she wrote wryly to her parents. The Roosevelts then invited Gellhorn to live at the White House, and she spent her evenings there assisting Eleanor Roosevelt with correspondence and the first lady’s “My Day” column in Women’s Home Companion.
She soon moved on, however, feeling a need to publish her experiences with impoverished America. “The material was fresh, dramatic, and intensely present in her mind,” Caroline Moorehead writes in her Gellhorn biography. “It was a question of distilling it. She had been haunted by what she had seen; now, she had to haunt others.” The resulting book, The Trouble I’ve Seen, earned Gellhorn acclaim from reviewers all over the world for its frank writing style and urgently important stories.
Around the same time, she began a relationship with Ernest Hemingway after meeting the acclaimed novelist at a bar called Sloppy Joe’s in Key West. They hopped on a boat to Spain to cover the Spanish Civil War, and Gellhorn found her beat as a war reporter. She began to publish with Collier’s and the New Yorker on the view from Madrid. Gellhorn and Hemingway travelled to China for the Second Sino-Japanese War, then back to Europe to document World War II from London. They married in 1940.
On June 6, 1944, Gellhorn did not receive official clearance to attend the Allied invasion of Normandy (as Hemingway had), but she sneaked onto a hospital ship and locked herself in a bathroom until it began making its way across the Channel. They arrived on Omaha Beach, and Gellhorn became the first female reporter on the scene, disguised as a stretcher-bearer. She arrived in Normandy before Hemingway, wading to the beach to collect casualties and assist medical teams. In her report for Collier’s she observed:
Everyone was violently busy on that crowded, dangerous shore. The pebbles were the size of apples and feet deep, and we stumbled up a road that a huge road shovel was scooping out. We walked with the utmost care between the narrowly placed white tape lines that marked the mine-cleared path, and headed for a tent marked with a red cross… Everyone agreed that the beach was a stinker, and that it would be a great pleasure to get the hell out of here sometime.
Gellhorn followed the war eagerly, straining her relationship with a famous husband who preferred to have her at home. She covered the liberation of Dachau and Paris and wrote for this magazine about the 82nd Airborne Division, utilizing her extensive knowledge of the war’s bigger picture to give context to everyday combat scenes.
After the war, Gellhorn divorced Hemingway and lived for a time in Cuernavaca, Mexico. She continued to write for The Saturday Evening Post, first to explore the toll of the war on the street children of Rome, then to document her adoption of an Italian orphan:
I knew those children in the war. I saw them scraping an existence for themselves in the rubble of Naples; I saw them brought in, bleeding and wild, to battalion aid stations because they’d walked on mines like any soldier; I saw them dumped into trucks and moved, they didn’t know where. I thought them the bravest people possible, and beautiful and quick and cheerful in the center of hell, where they lived… They ought to declare war on grownups, I thought furiously, who killed their childhood.
Her adopted son, George “Sandy” Alexander, required letters from several ambassadors and the Roosevelts before he could leave Florence with Gellhorn.
Gellhorn’s years as a war correspondent weren’t behind her. However, she lost favor with many American publications after declaring herself firmly against the Vietnam War and Lyndon Johnson. The Guardian agreed to buy six articles from Gellhorn in 1966 when she explained her intention to carry on her style of covering everyday life during conflict: “I want to write about the Vietnamese, the civilians, whom everyone has forgotten are people. I want to try, humbly, to give them faces so we know who we are destroying.” She urged American journalists to cover Vietnam fairly instead of catering to military interests.
Gellhorn covered the Nicaraguan contras, the Arab-Israeli conflict, and, in 1989, the U.S. invasion of Panama. A lifelong pacifist, she grew increasingly critical of U.S. foreign policy after World War II, dubbing the U.S. a “colonial power.” Her prose aimed to expose the detriments of war, documenting intimate details of ordinary citizens’ lives. She claimed simply to write what she saw rather than attempting to offer perspectives for both sides of a conflict.
At 89 years old, blind and suffering from ovarian cancer, Gellhorn swallowed a cyanide pill in her London flat. Her New York Times obituary described her as “a cocky, raspy-voiced maverick who saw herself as a champion of ordinary people trapped in conflicts created by the rich and powerful.”
Above all, Gellhorn refused to be remembered as Hemingway’s wife. After all, it was the least interesting fact about her. “I was a writer before I met him, and I have been a writer for 45 years since,” she wrote. “Why should I be a footnote to someone else’s life?”
“Title seven is an unrealistic law.” That’s what an unnamed male corporate executive claimed 50 years ago to a Post reporter when asked about hiring and promoting more women. In 1968, the Civil Rights Act had been federal law for four years, but the tangible results of Title VII’s prohibition of employment discrimination were not cutting it for women workers.
Of the 2,200,000 American earners of more than $10,000 a year in 1968, only 2.5 percent were women, according to Marilyn Mercer’s “Is There Room at the Top?,” published in this magazine. The reasons for the disparity were thought to be systemic and, perhaps, insurmountable. They included outright discrimination, subtle biases, and even a lack of ambition from women themselves.
Mercer found that while almost half of American women were working, they were finding it difficult to advance their careers the same way men could. “Full integration of women into business would mean, ultimately, changing some of our most deep-rooted ideas about sexual roles,” she opined. “And this is something that has never happened before in the civilized — or uncivilized — world.”
Throughout the decades to come, more and more women joined the white-collar workforce each year. The 1980s and ’90s were characterized by powerful women, from Margaret Thatcher and Oprah Winfrey to fictional women like Murphy Brown and Tess McGill in Working Girl. A cultural urgency for equality saw women donning power suits and going to work. The female labor force participation rate rose to 60 percent by the end of the ’90s, having started at around 30 percent after World War II.
Starting in 2000, however, the tide turned. The percentage of women in the labor force began to drop, and participation is now back down to 1990 levels. The reasons for the decline aren’t entirely clear, especially since men have seen a similar trend for years. An aging population, an economic recession, and rising enrollment in secondary and postsecondary education could all be factors, according to an economic brief from the Federal Reserve Bank of Richmond. The gender wage gap has also stagnated instead of narrowing as it did in those years of growth.
Women in the U.S. have been taking more management positions. Most of the 4.5 million management positions created since 1980 have gone to women, but those gains have been concentrated in fields focused on people as opposed to production. Even more alarming, as women come to hold the majority of a management field (like health services, education, and human resources), the gender wage gap of that field increases; the opposite effect held in 1980. Among CEOs and public administrators, the percentage of women has increased by only one percentage point, and, just this year, the number of female CEOs in the Fortune 500 dropped by 25 percent.
The current obstacles to women workers in America are frequently discussed (sexual harassment, lack of paid maternity leave, unconscious bias), and — though a decades-long reexamining of gender roles has led to some progress — there still exists a more than 20 percent wage gap between men’s and women’s earnings.
Mercer’s 1968 look at gender inequality is a deep dive into the sexual psychology behind the American workplace at a time of cultural upheaval and The Feminine Mystique. Despite the disappointing state of female labor, the Post writer found plenty of women executives to celebrate. Katharine Graham had been leading The Washington Post Company for years with excellent results, and Sue Boltz had grown the Detroit industrial firm Goddard & Goddard by millions. In the current era of similar watershed women’s movements and gender revolution, Mercer’s work illuminates the fight for employment equality reaching back over 50 years.
Only within the last 70 years has it become socially acceptable for women to wear pants. Until the mid-1960s, the average American woman wouldn’t dare leave her house wearing dungarees. But as early as the mid-1800s, a few pioneering women had started quite literally making strides toward more practical women’s wear.
Dress Reform in the Mid-1800s
In the early 1800s, men’s and women’s fashion overlapped very little. Few women wore pants. For women, the purpose of clothing was not function but figure: to make them look curvier. The number of layers they wore also meant it took significantly longer to dress each day. The typical style included a dress or a long skirt with a blouse. Beneath the skirts were steel hoops and petticoats to make the skirt rounder. A corset also cinched the woman’s waist.
Because a typical woman’s life focused on her domestic duties, which in theory required less exertion than “man’s work,” the clothing a woman wore each day lacked functionality and made even the simplest tasks more difficult. Sitting down and bending over were hampered by the steel hoop, the layers beneath the dress, and the corset squeezing her middle.
Like other women, Elizabeth Smith Miller submitted to heavy, restrictive, but fashionable caged dresses in her early life. But in 1851, while toiling in her garden in full dress, she grew frustrated with “acceptable attire” and decided the reasonable solution was to change it. So she did.
She took inspiration from a trend she had seen in Europe, where women had taken to wearing “Turkish trousers” under their skirts — a look not yet seen in America. Miller became one of the first women in the United States to wear in public what would eventually be called bloomers, paired with a knee-length skirt.
She wasn’t the only woman who felt trapped in her clothes. Miller’s cousin, Elizabeth Cady Stanton, shared her dissatisfaction and, seeing Miller’s bravery, decided to try out the same look.
Amelia Bloomer, Miller’s neighbor and friend, began promoting the new look in her newspaper, The Lily. At the time, her newspaper wasn’t known for being radical, but Bloomer hoped to spark some kind of change. She became a prominent voice of the women’s movement, using her platform to encourage other women to try out the new look themselves.
To promote this new style, Bloomer and other early feminists decided to take a particularly practical approach to bloomers. Instead of advertising comfort or gender equality or even freedom of movement, they publicized these pants as being better for women’s health: Petticoats, steel hoops, and corsets made healthful outdoor activities like hiking, swimming, and bike riding difficult for women, so they rarely participated in these activities. Bloomers, they argued, opened up these opportunities for exercise and fresh air. Occasionally, these arguments were reinforced with statements by doctors saying that the prevailing women’s fashion contributed to waves of illnesses that afflicted women.
This announcement from the August 1, 1857, issue of the Post points out that corsets and crinolines weren’t the best choices for a healthy lifestyle. Timour, also known as Tamerlane, was a 14th-century Asian conqueror who considered himself the political, if not biological, heir of Genghis Khan.
Though some younger women began wearing bloomers for bike riding, many Americans dismissed or discouraged the European trend. Miller and Bloomer were publicly shamed for their “radical dress.” In a document from the Elizabeth Smith Miller collection of the New York Public Library, Miller recalls enduring “much gaping curiosity and the harmless jeering of street boys.”
The movement did not escape the notice of The Saturday Evening Post, which published a short item on a gathering of the Dress Reform Association.
Miller had her own doubts and admitted to not feeling as beautiful as other women because her style didn’t accentuate the desired features of the time. However, she recalled inspiring words from her cousin, Elizabeth Cady Stanton: “The question is no longer, how do you look, but woman, how do you feel?” These words reminded her of how important this rebellion was to all women. She and other women believed women deserved more opportunities, starting with the simplest of things, like comfortable and functional clothes.
Unfortunately, outside of the bicycling trend, the movement gained little traction, and bloomers failed to become everyday wear as Miller and other feminist activists had hoped. However, the defeat was only temporary.
The fight for a woman’s right to wear pants arose again when French designer Paul Poiret’s “harem pant” hit the scene in 1909. More feminine than bloomers, these pants brought an alternative style that was both functional and flattering. Unlike bloomers, harem pants were made from silkier materials and embroidered and beaded with intricate detail.
These pants and other similarly designed trousers for women became especially popular with celebrities. In 1917, Vogue put a woman wearing pants on its cover for the first time. Many more covers depicting women in different styles of pants followed.
Like bloomers, harem pants garnered a fair amount of backlash. These stylish pants were seen as too sexual for the average woman and remained in the confines of “celebrity fashion.” Like bloomers, the trend came and left, not quite making the jump to everyday wear.
In the 1940s, World War II created a need for women to wear pants. As more than 16 million American soldiers shipped off to Europe and the South Pacific, businesses hired women to fill empty positions. The nature of many of these jobs made wearing dresses not only impractical but dangerous. Thus, thousands of working women found themselves wearing pants every day in support of the war effort.
But with little stable ground for this trend to build upon, it largely faded away again after the war ended. Pants no longer seemed necessary for domestic wives.
Lasting change finally came in the 1960s and early ’70s. For young people, rebellion was a way of life, and it created the perfect opportunity for pants to take center stage again. During the feminist movements of the era, fashion began to cross gender lines. The word unisex made its first appearance in print, and men and women alike sported T-shirts, ponchos, and wide-leg denim pants.
While women in pants became more common in public in the 1960s, acceptance at the highest levels of government was slow in coming. It would be another 30 years before women would be allowed to wear pants in the U.S. Senate. In early 1993, a number of female senators wore pantsuits in protest of a longstanding rule of the official Senate dress code, which was finally amended later that year.
Modern Fashion Statements
These days, Hillary Clinton is practically synonymous with pantsuits. During her 2016 presidential campaign, she wore them to practically every public event and was rarely seen in a skirt. Her attire became a symbol among her devotees, and even spurred the creation of “Pantsuit Nation,” a Facebook group of 3.9 million Clinton supporters.
Today, women from all backgrounds wear trousers daily, and menswear-inspired fashion for women has become a high-demand look embraced by celebrities like former Spice Girl Victoria Beckham and singer Rihanna.
For years, women took hits for even wondering what it would be like to wear pants. Today, many women’s only wonder is why they were denied such basic freedoms of function and fashion in the first place.
Women such as Miller, Bloomer, and Stanton pushed for change that led to the social acceptance that we take for granted today. As insignificant as the right for women to wear pants may seem now, it is a historical symbol of women’s perseverance over adversity and pursuit of equality.
Topping most of our resolutions this year is a repeat from the past: weight loss. But who’s to blame for this obsessive desire to trim and slim our figure? Automation? Hollywood? Feminism? France? Here’s a 1934 doctor’s take on America’s ongoing weight-loss craze:
Originally published in The Saturday Evening Post, September 22, 1934
Before the establishment of our modern knowledge of diet, it was taken for granted that the shape anyone might have had been conferred upon him by providence, and the best one could do would be to make the most of it. There was little to be done in making the least of it. Nature creates human beings and animals in all sorts of forms and sizes. A Great Dane takes many a roll in the dust, but never achieves the slimness of a greyhound; a draft horse of the Percheron type travels many a mile pulling heavy loads, but never gets small enough to be a baby’s pony. Nevertheless, the basic framework can be modified as to the amount of upholstery. Every woman knows that she can, by suitable modification of her diet and by the use of proper exercise, cause the pounds to pass away.
No one has determined certainly the cause of the recent craze for reduction. Perhaps it was the outgrowth of criticism of the female figure that was popular in the late ’90s. The textbooks of the ’90s had much to say about corset livers and hourglass shapes. The preference for the boyish form may have been the result of the gradual change in the amount of clothing worn by women. The multiple petticoats and the heavy underclothing of the late ’90s began to give way to single garments in what was called the empire style. The styles have tended toward the slim figure, covered by less and less clothing. Perhaps the change was the result of the coming of the automobile; that, too, has been a most significant factor in the change of our body weight.
A Matter of Form
Walking, up to 1900, was the accepted mode of transport for the human body in the vast majority of circumstances. Then came the motor car. Today there is in this country one motor car for each five persons, and walking is gradually becoming a lost art. Walking used to be the form of exercise primarily responsible for burning up the excess intake of food. With the gradual elimination of walking and with the coming of the machine in industry, there has been less and less demand for energy in food consumption and more and more tendency toward maintaining a slim figure by a reduction in the consumption of food. The person who takes no exercise and who eats the diet that was prevalent from 1900 to 1905 will put on weight like an Iowa hog in training for a state fair.
The suggestion has even been made that feminism was responsible for the increasing popularity of making women look like men. Within the last quarter century more and more women have come out of the home and into various clerical, manufacturing, promotional, industrial, and statesmanlike occupations. No doubt, the bobbing of the hair and the binding and suppression of the breasts, as well as the thinning of the figure and simplification of the costume, were women’s response to the necessity for greater ease of movement and less encumbrance while engaged in such work. A fat girl gets lots of bumps from office furniture in modern designs.
Then came the war, and with it there was intensification of all these motivations; the war made serious demands on women. The slightly suppressed desires for freedom merged into strong impulses and urges that suddenly seized every feminine mind. What had been merely a somewhat languid interest suddenly became a dominating craving. Reducing became the topic of the hour, and the craze for reduction was upon us.
It has been urged by some that the final stimulus for slenderness was a sudden change in fashions promoted by the modistes of France. Be that as it may, the French women themselves never succumbed to the craze for emaciation as did their American sisters.
The French are far too sound a race from the point of view of feminine psychology to urge the cultivation of manly traits in their women. No doubt, the French fashions did incline toward women of somewhat thinner type, but the modistes did not, like our designers of costumes, adopt an all-or-nothing policy. Individualization in form and costume has more often been the mark of France, whereas standardization and uniformity have dominated the American scene.
American manufacturers of ready-made clothing, with the beginning of the 1920s, began to produce models for slim women, hipless and bustless. As the women went into the department stores to purchase, they found it difficult to obtain anything that would fit. They came out wringing their hands and crying that most famous of all feminine laments, “I can’t get a thing to fit me.” And when a woman cannot get a thing that will fit, she is ready to fix herself to fit what she can get. There were promptly plenty of experts ready to help her through the fixing process.
To Make the Person Personable
Advertisements began to appear for nostrums to speed the activity of the body and to lessen its absorption of food. Phonograph records were sold, giving explicit instructions regarding exercise and diets. The radio poured forth systematic calisthenics and played tunes for the performance of these motions in a rhythmical manner. Plaster fell from many a living-room ceiling while women of copious avoirdupois rolled heavily on the bedroom floor. The springs and frames of many a bed groaned wearily beneath the somersaults of some damsel of 170 pounds. Pugilists who had been smacked into insensibility on the rosined floors of the squared rings became heavily priced consultants for ladies of fashion and of leisure who embarked on programs of weightlifting. Department stores offered, in the sections devoted to cosmetics, strangely distorted rolling pins with which it was claimed fat might be better distributed about the person. Shakers, vibrators, thumpers, bumpers, and rubbers manipulated electrically, by water power, or even by gas, were offered to those who cared to try them.
Out of this turmoil came a demand for a scientific study of overweight, its effects on the human body, its relationship to economics, sociology, psychology, happy marriage, the maintenance of the home, and physical and mental health. In response to this demand, research organizations in many medical institutions began to study the factors responsible for obesity and the most suitable methods for overcoming the condition without injuring the general health. Whereas, in scientific medical indexes of a previous decade, an occasional article only might be devoted to this subject, the indexes of recent years show scores of records and reports in this field.
The Do-or-Diet Spirit
The first response to the craze for reduction, as I have said, was the development of extraordinary systems of exercise, with the idea that a woman could keep right on eating the same amount of food that she formerly took and that she could get rid of the effects of this food by excessive muscular activity. Quite soon the women found out the error of this notion.
Walking 5 miles, playing 18 holes of golf, or even 6 active sets of tennis does not use up enough energy to take off any considerable amount of weight. Even the playing of an excessively severe football game removes from the body relatively little tissue. A football player, it has been reported, may be found to weigh from 5 to 10 pounds less after a football game than he weighed before, but most of this loss of weight is merely due to removal of water from the body, which is promptly restored by the drinking of water after the contest is over. Actually, the terrific strain of one hour of football burns up not more than one-third of a pound of body tissue.
Thus reduction of weight is for most people simply a matter of mathematics, calculating the amount of food taken in against the amount used up. Reduction is a matter of months and years, not of days. The investigators have shown that it is dangerous for the vast majority of people to lose more than two pounds a week. A greater loss than this places such a strain on the organs of elimination and on tissue repair that its effects on the human body may be serious and lasting.
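The "matter of mathematics" the author describes can be made concrete with a modern back-of-the-envelope sketch. Note the assumptions: the roughly 3,500-calories-per-pound figure is a conventional modern rule of thumb that the 1934 article does not itself cite, and the function name is my own.

```python
# A rough sketch of the "simply a matter of mathematics" claim.
# ASSUMPTION: ~3,500 calories per pound of body fat, a conventional
# modern estimate not given in the 1934 article.
CALORIES_PER_POUND_OF_FAT = 3500

def weekly_loss_pounds(daily_deficit_calories: float) -> float:
    """Pounds lost per week at a steady daily calorie deficit."""
    return daily_deficit_calories * 7 / CALORIES_PER_POUND_OF_FAT

# The 500-to-1,000-calorie daily deficit recommended later in the piece:
print(weekly_loss_pounds(500))   # 1.0
print(weekly_loss_pounds(1000))  # 2.0
```

On this estimate, the recommended deficit lands exactly in the one-to-two-pound weekly range, with the larger deficit meeting the two-pounds-a-week ceiling the investigators called safe.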
When women found that weight could not be permanently removed to any considerable extent by excess exercise, they began to try extraordinary diets. The diets first adopted were selections of single elements. They have been characterized as perpendicular rather than horizontal reductions. The phrase refers to the nature of the diet rather than to the effect on the human form. In a perpendicular diet, the partaker eliminates everything except one or two food substances and limits himself exclusively to these. In a horizontal diet, one continues to eat a wide variety of substances, but eats only one-half or one-third as much of each. Perpendicular diets are dangerous because they do not provide essential proteins, vitamins, and mineral salts. These will be found in a properly chosen diet which includes many different foods, but smaller amounts of each.

So women began eating a veal chop alone, pineapple alone, hard-boiled eggs alone, or lettuce alone. The phrase “let us alone” best expresses the proper attitude to assume toward a woman on a perpendicular diet. The constant craving for food and the associated irritability make the woman on such a diet a suitable companion only for herself, and sometimes not even for that. Certainly, she is no pleasure around a home.

Among the first of the books of advice to be published on diet was one concerned only with the calories. No doubt, successful reduction of weight was easily accomplished by the caloric method, but the associated weakness, illness, and craving for food soon brought realization that there was more to scientific diet than merely lowering the calories.
The next extraordinary manifestation was the 18-day diet from Hollywood. The exact origin of this combination does not appear to be known. Perhaps it appeared first in print in the columns of criticism of motion pictures of a well-known Hollywood writer. In her statements on the subject, it was said that the diet was the result of five years of study by French and American physicians, and that the diet would be perfectly harmless for those in normal health. If the French and American doctors spent five years working out the 18-day diet, they wasted a lot of time. Any good American dietitian could have figured out an equally good combination, and probably a much better one, in an afternoon. The vogue of the 18-day diet was phenomenal. Restaurants and hotels featured it in their announcements. Hostesses, anxious to please their dinner guests, called each of them by telephone to know which day of the 18-day diet they had reached and served each guest with the material scheduled for that particular meal. It was said that a Chicago butcher bragged that he had eaten the first nine days for breakfast.
The 18-Day Sentence
The 18-day diet had peculiar psychologic appeal. For the first few days it consisted primarily of grapefruit, orange, egg, and Melba toast. Melba toast, be it said, is a piece of white bread reduced to its smallest possible proportions; then dried and toasted so as to be developed into something that can be chewed. By the second or third day, when the participant had reached the point of acute starvation, she was allowed to gaze briefly on a small piece of steak or a lamb chop from which the fat had been trimmed. Then two or three days of the restricted program followed, and again, when the desire for food reached the breaking point, a small piece of fish, chicken, or steak could be tried. Thus the addict passed the 18 days, during which she lost some 18 pounds. Then, pleased with her svelte lines, she began to eat; three weeks later she could be found at the point from which she had first departed.
For years it has been recognized that human beings need magical stimuli in the form of amulets, powders, or charms to aid in the concentration necessary for success in love, religion, health, or business. The human mind needs some single object to which it may pin its hopes, its faiths, and its aspirations. Moreover, there was the psychological appeal of mob action. There was the desire to be doing what everybody else was doing at the same time. Then there was the thrill of competition. One could hear the addicts of the Hollywood diet asking one another, “What day are you on?” And the answer came back, “I’m on the tenth day and I’ve lost eight pounds.”
With the mystic appeal of Hollywood, land of mystery, with the psychological understanding of human appetite, with the introduction of the Melba toast, the Hollywood diet swept the nation.
The Calorie Gauge
The one thing necessary to reduce weight successfully in the majority of cases is to realize just how many calories are necessary to sustain the life of the person concerned and what the essential substances are that need to be associated with those calories. Most of us enjoy our food. We eat food because we like it, and we eat without thinking what the food will do in the way of depositing fat. The researches in the scientific laboratories that have been made in the past 10 years indicate that we eat more food than we need, particularly at a time when energy consumption is far less than energy production. It has been generally assumed that the weight of the body is definitely related to health. There are standard tables of height and weight at different ages for all of us from birth to death. It must be remembered, however, that these are just averages and that any variation within 10 pounds or even 15 pounds of these averages is not incompatible with the best of health in a person who inclines to be either heavy or light in weight as a result of his constitution and heredity.
There are two types of overweight: One … in which the glands of internal secretion fail to function properly; the other … due to overeating and insufficient exercise. The glands of the body, including particularly the thyroid, the pituitary, and the sex glands, are related to the disposal of sugars and of fat in the body. In cases in which the action of these glands is deficient, a determination of the basal metabolic rate of the body will yield important information.

This determination is a relatively simple matter. One merely goes without breakfast to the office of a physician who has a basal-metabolic machine, or to a hospital, all of which nowadays have these devices. One rests for approximately one hour, then breathes for a few minutes into a tube while the nose is stopped by a pinching device, so that all the air breathed out can be measured. By appropriate calculations, the physician or his technician reaches a figure which represents the rate of chemical action going on in the body.

A rate of anywhere from –7 to +7 is considered to be a normal metabolic rate. A rate of anywhere from –12 to +12 may be within the range of the normal for many people. If no other special disturbance is found, the physician is not likely to be concerned about the metabolic rate within such limitations. Rates well beyond these two figures, however, are considered to be an indication of failure in the chemical activities of the body — namely, either too rapid or too slow — and measures should be taken promptly to overcome the difficulty. If the basal metabolic rate is –20, –25 or –30, the physician will prescribe suitable amounts of efficient glandular substances to hasten the activity. Moreover, he will at this time arrange to repeat his study of the metabolic rate at regular intervals. He will watch the pulse rate and the nervous reaction of the person to make certain that the effects of the glandular products that are administered are kept within reasonable limitations.
If, on the other hand, the basal metabolic rate is found to be +20, +25, or +30, he will make a study of the thyroid gland and will provide suitable rest, mental hygiene, and possibly drugs to diminish this excess action. Rarely, indeed, is a person with a metabolic rate of +25 fat; in most instances, such people are thin, sometimes to the point of emaciation. There are periods in life when the human body tends to put on fat. As women reach maturity, as they have children, as they approach the period at the end of middle age, there is a special tendency to gain in weight. Men are likely to spend more time in the open air, eat more proteins and less sugar than do women, and therefore are less likely to gain weight early. The common period for the beginning of overweight is between 20 and 40 years of age; in women, the average is usually around 30. Among men, the onset of overweight is likely to come on eight to ten years later.
A man doing hard muscular work requires 4,150 calories a day; a moderate worker, 3,400; a desk worker, 2,700, and a person of leisure, 2,400 calories. A child under one year of age requires about 45 calories per pound of body weight, about 900 calories a day. The number is reduced from the age of six to 13 to about 35 calories per pound, or 2,700 a day; from 18 to 25 years, about 25 calories per pound of body weight may be necessary, or 3,800 a day. Thus, a person 30 years old, weighing about 150 pounds, may have 2,700 calories; a person 40 years of age, weighing 150 pounds, may have 2,500 calories; a person 60 years of age, weighing 150 pounds, may have 2,300 calories. A calorie is merely a unit for measuring energy values. In the accompanying table examples are given of the number of calories in various well-established portions of food.
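The figures in the paragraph above can be gathered into a small lookup, purely as a restatement of the article's period numbers; the dictionary and function names are mine, and the values should be read as 1934 advice rather than modern guidance.

```python
# Daily calorie requirements by activity level, per the 1934 article.
DAILY_CALORIES_BY_ACTIVITY = {
    "hard muscular work": 4150,
    "moderate work": 3400,
    "desk work": 2700,
    "leisure": 2400,
}

# Calories per pound of body weight at various ages, per the article.
CALORIES_PER_POUND_BY_AGE = {
    "under 1 year": 45,
    "6 to 13 years": 35,
    "18 to 25 years": 25,
}

def daily_allowance(weight_pounds: float, calories_per_pound: float) -> float:
    """Total daily calories implied by a per-pound figure."""
    return weight_pounds * calories_per_pound

# The article's ~900-calorie infant figure implies a 20-pound baby:
print(daily_allowance(20, CALORIES_PER_POUND_BY_AGE["under 1 year"]))  # 900.0
```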
The overweight child at any age is quite a problem for the doctor. Most times it is the result of a family that tends to eat too much. Children of fat people are likely to be fat because they live under the same conditions as do their parents. If the adults of the family eat too much, the children can hardly be blamed for doing likewise. Investigators at the University of Michigan say that the normal person has a mechanism which notifies him that he has eaten enough. Obese people require stronger notification before they feel satisfied, and many disregard the warning signal because they get so much pleasure out of eating. “Pigs would live a lot longer if they didn’t make hogs of themselves,” said a Hoosier philosopher.
If a physician has determined that excess weight in any person is not due to any deficiency in activity of the glands, but primarily to overeating, it is safe to take a diet that contains a little more than 1,000 calories a day, and that provides all the important ingredients necessary to sustain life and health. A menu like the following, outlined by Miss Geraghty, provides about 1,000 calories as well as suitable proteins, carbohydrates, fats, mineral salts and vitamins:
For those who want to reduce intelligently, here is another menu that includes all the important ingredients:
If you simply must have afternoon tea, add in 150 calories that the sugar and accompanying wafers will contain.
Every woman who has heard of these diets insists that they provide about twice as much food as she usually eats. This merely means that she is talking at random rather than mathematically. These diets do contain a wide variety of ingredients, but they are chosen with exact knowledge of what they provide in the way of calories and important food attributes. Quite likely, the women who protest eat a smaller number of food substances, but it is likely, also, that they eat so much of each of these substances that their calories are far beyond the total. Furthermore, they probably fail to keep account of the occasional malted milks, cookies, chocolates, or ice cream that they have taken on the side.
From the accompanying table of caloric values, it is possible to select a widely varied meal that will provide any number of calories deemed to be necessary; and if the meal is selected to include a considerable number of substances, it will have all the important ingredients.
In taking any diet, it is well to remember that calories are not the only measuring stick for food. A pint of milk taken daily provides many important ingredients. If bread, potatoes, butter, cream, sugar, jams, nuts, and various starchy foods are kept at a minimum, weight reduction will be helped greatly.
Over the radio and in a few periodicals that do not censor their advertising as carefully as they might, there continue to appear claims for all sorts of quack reducing methods. If only most people had some understanding of the elementary facts of digestion and nutrition, the promotion of such methods would yield far fewer shekels to the promoters. It is a simple matter to get rid of excess poundage and, in general, it is quite desirable. One merely finds out first how many calories per day constitute the normal intake, and tries to get some idea of the number necessary to meet the demands of the body for energy. One selects a diet which provides the essential substances and which permits some 500 to 1,000 calories less per day than the amount required.
Under such a regimen, steadily persisted in, the fat will depart from many of the places where it has been deposited, but not always from the places where it is most unsightly. For this purpose, special exercises, massage, and similar routines may be helpful. But persistence more than anything else is required. It is just a matter of pounding away.
Today, the idea of women serving in the military is widely accepted, if not considered downright mundane. But attitudes toward women in the military have changed a lot in the 75 years since the Women’s Army Corps was created in 1942.
Women soldiers were still a strange idea when “Those Wonderful G.I. Janes” was published on September 9, 1944. Ernest O. Hauser, describing members of the newly created Women’s Army Corps, stressed that they had retained their femininity.
“If you ask them what they want to do after the war,” Hauser reassured Post readers, “the majority will reply, ‘Have a home and babies.’”
By 1957, when “This Lady’s Army” appeared, Americans were still adjusting to the idea of women in uniform. Sidney Shalett praised their contribution to the war effort, serving with distinction while enduring vicious slander. Wartime rumors had been spread of their rampant immorality, including shiploads of pregnant WACs being sent back to the States in disgrace and “wolf packs of sex-hungry Wacs roaming the countryside…seducing innocent sailors.”
“While these libels were being broadcast, heroic women were being torpedoed en route to North Africa, undergoing the blitz in England, and hitting the ditch under Japanese strafing in Leyte,” Shalett reported.
Regardless of how they felt about women in service, Americans had to acknowledge that all the WACs had freely volunteered to serve. Unlike the men, they could have sat out the war with no risk of being drafted. And while the military no longer segregates men and women (the Women’s Army Corps was disbanded in 1978), women are still not required to register for the draft, as men are.
But it looks like that may change as well. A women’s draft nearly became law last year.
The idea began as a protest. When the National Defense Authorization Act was introduced in April 2016, it contained a provision for women to serve in combat. Opposing this idea, California Representative Duncan Hunter (R) introduced an amendment that would require women between 18 and 26 to register with the Selective Service, thinking that his progressive colleagues would oppose it.
Instead of drawing opposition to the Defense bill, the amendment gathered unexpected support from both Democrats and Republicans, men and women. The amendment passed the Senate by a vote of 85 to 13.
The amendment was struck from the bill’s final version before it was sent to the House, but the idea of including women in the draft continued to arouse controversy online.
Extending the Selective Service to women will probably be included, and passed, in a future Defense bill. It’s uncertain how much controversy will accompany the change, but it’s clear that seventy-five years after the formation of the Women’s Army Corps, some Americans are still adjusting to the idea of women in uniform.
Featured image: National Archives
This article and other features about the early automobile can be found in the Post’s Special Collector’s Edition: Automobiles in America!
In 1914, the Curtis Publishing Company, parent corporation of the Post, commissioned a study of the automobile market. Designed to help its sales force understand and capture a booming new market, the study also provides valuable insight today into the thinking of the time. A follow-up study was published in 1932. The following selection from that later report documents the fact that women were not just influencing car-buying decisions but actually buying and driving cars. The modern reader may hold her nose at the presumptions made about female character, taste, and (implicitly) intelligence, but for the same reasons this excerpt is a fascinating look not just at car-buying trends but at gender roles and stereotypes of the 1930s.
Our 1914 marketing report said: “Whatever is bought for family use is selected largely by the wife, and the automobile is no exception. Dealers’ estimates of the proportion of sales of pleasure cars in which women are an important factor vary from 50 percent to 95 percent.” But few women owned or drove cars in 1914. Cars were still heavy and hard-steering. Cars used by women were mostly chauffeur driven. For a woman to take a car to a service station was so unusual as to seem out of place. Driving was a novelty and a hardship; it had yet to become a matter-of-fact occurrence in woman’s everyday life.
Today, not only are women in the family the determining influence in the purchase of a car, they are at the wheel, weaving confidently through crowded traffic, driving at express-train speed along the highways, parking with the dexterity of experts, shifting gears noiselessly, and steering with one-finger control.
What has changed? Let’s begin with the self-starter, which gave the first great impetus to women’s use of the motor car. Prior to this time, the car had to be cranked by hand. Whether they would admit it or not, women were afraid of the motor car, and that fear, along with inconvenience, discomfort, and physical inaptitude, gave way only gradually before self-starters, electric lights, cord tires, closed bodies, four-wheel brakes, easier steering, shock absorbers, and easier shifting.
Fear of the Breakdown
Good as motor cars were in 1914, there were still two bugaboos that held back the woman’s market. One was mechanical trouble and the other was tire trouble. Cars still were subject to tantrums on the road, carburetor trouble, spark plug and other ignition trouble, broken springs, leaky radiators, and a host of minor annoyances. To the dyed-in-the-wool motor car fan, these difficulties were interesting challenges to mechanical resourcefulness; to the woman they were fatal to any hope she might have to own or drive her own car. On the road mechanical trouble rendered a woman helpless. She was under obligations to the first Good Samaritan motorist passing by or at the mercy of the nearest repairman.
So, too, with tires. Jacks, rims, patches, tire cement, tire talcum, tire pumps, and tire irons were out of her line. Today with tires that frequently run over 30,000 miles without trouble, it is difficult to realize that 3,000 miles was a lot of service for some tires in the early days. With each improvement mechanically, more and more women were attracted to the utility and pleasures of car use and ownership.
With the introduction of time payments, car sales involving the woman gained new impetus. Until then sales were for cash only. To the woman who consciously or not budgeted the family expenditures, hundreds of dollars could not be spent all at one time for a motor car. It meant drawing on the family savings, or even mortgaging the home. It was beyond most women’s comprehension. Responsible for the family expenditures, women instinctively opposed so great an investment.
Broken up, however, into relatively small monthly payments, with a relatively small payment down, the purchase of a car assumed reasonable proportions. It came within the family’s ability for fulfillment without hardship. Women again came to the aid of an industry approaching “saturation.”
Meanwhile another important factor had been at work. Seventy percent of all cars produced in 1922 had been open models. Good as these open models had been, certain of their features were objectionable to women. They were cold and drafty. Curtains sometimes leaked. If tops were lowered, they had to be raised — a man’s work. Side curtains had to be furled and unfurled, snapped into place and unsnapped, stored away in the summer and brought out in the winter. Their composition windows cracked and scratched easily, became opaque and made for poor visibility.
In 1922, the Fisher Body Corporation started advertising closed bodies nationally. In 1923 the Hudson Motor Car Company announced a closed coach which sold for $5 less than the corresponding open model. This small seed was to bear great fruit. But at first the obstacle was cost. The difference between the cost of a Dodge 1921 Model Touring Car and a Dodge 1921 (closed) Sedan was $865. Few women could see the desirability of paying $865 more for the same car with a little different body on it just to be comfortable. Thus, the high price of early closed cars impeded the growth of the industry and shut out a waiting multitude of year-round women drivers.
Once prices came down, the floodgates of buying were unloosed. Vast new markets opened up. A car which the whole family could use and enjoy winter or summer became its own justification. Women’s support in car purchase could be depended upon increasingly.
Of no little importance in this development were the means by which women’s interest was intrigued by advertising. Car advertising featured fine-looking modishly gowned women at the wheel. In upholstery, finish, body hardware, flower vases, vanity cases, floor coverings, and instrument boards, body designers got away from the utilitarian and went feminine. In effect, the motor car became a drawing-room on wheels. By 1931, 92.2 percent of all cars were closed.
Color and Design
With lacquer finish it was easier to keep cars looking well. Many retained a bright and shining newness for years instead of months. The dull appearing car, to which women were more sensitive than men, was passing. Car manufacturers, meanwhile, were giving their cars better lines. Lines, color harmony, upholstery, body hardware — this was a language a woman understood.
Important as eye appeal became in attracting the favor of the woman, mechanical improvements of value in exploiting this market were not overlooked. Shorter wheel bases and easier steering that overcame woman’s difficulty in parking came in 1924, as did balloon tires with greater safety and riding comfort. Four-wheel brakes overcame the fear many women had of not being able to stop. Automatic windshield wipers that relieved her of the necessity of taking one hand off the wheel as was required by the hand operated wiper, car heaters that maintained room temperature in below zero weather, shock absorbers as regular equipment — all contributed toward increased momentum of sales to women.
The fear that many women drivers had of traffic and cross streets passed with easier handling cars and better traffic control. Widened streets and roads, stop lights, stop streets, one-way streets and highways, under and over passes, and standardized traffic regulations — all these factors contributed toward making driving by women easier and more pleasurable.
Throughout all this period the building of good roads had been removing one more handicap to increased use and ownership of cars by women. In 1914, and long after, road trouble was accepted as part of the game. Tow rope, shovel, and tire chains were still necessary touring equipment. Extricating a car that was stuck in the mud or sand or that had skidded into a ditch was no work for women. Today all that is changed. A woman can drive from Michigan to Florida on an uninterrupted ribbon of paved road over 1,000 miles long. She can drive almost anywhere in the United States or Canada on roads that may be better paved than the streets in her own home city.
Between 1914 and 1932 one of the drawbacks to increasing use and ownership of cars by women was fear of skidding. Muddy, gravelly, and slippery roads; two-wheel brakes; car weight and high center of gravity; hard steering — all contributed to this problem. It took an expert driver to pull himself out of a skid. It required strength of arm as well as quickness of action. The feeling of utter helplessness that came from a skidding car spoiled many a promising woman motorist.
With safe roads, better balanced cars with lower centers of gravity, easy operating four-wheel brakes, low-pressure tires, and easier steering, the skid largely disappeared as a menace to increased women’s use.
Easier Gear Shifting
In learning to drive a car, and in operating it afterward, many women were inclined to have some trouble with the clutch and the gear shift. Here, again, a certain knack was needed. By nature not mechanically inclined and often with no very clear idea as to exactly what these mysterious operations were all about, more women than one gave up the desire to drive at the first lesson. In 1928 easier gear shifting was introduced by Cadillac and LaSalle. Today syncro-mesh, syncro-shift, or some other form of easy gear shift is regular equipment even on low-price cars. At last here was a shift that any woman, no matter how great a novice, could operate without clashing. Another appreciated improvement that met with women’s favor had been made.
Another gradual improvement of considerable influence was due to increasing quietness. Grinding and clashing gears, racing and knocking motors, squealing brakes, squeaking bodies, springs, and shackles were enough to annoy, if not to unnerve, any one inclined to fear that any unfamiliar noise meant the car that cost so much money was being ruined. The fuel knock that sounded as if the car were pounding itself to pieces began to disappear in 1923 with introduction of anti-knock gasoline. Cars that run today almost as quietly as a sewing machine have everlastingly removed that barrier to woman’s use.
Her Car — A Woman’s Necessity
While from 1914 to 1931 we believe that the woman was most important as a passenger and of great influence in directing motor car purchase, from 1932 on it is our belief that her presence to the industry will be most felt as an increasing user of motor cars. In our consumer survey, women were reported as driving in 54.4 percent of the homes interviewed. The motor car today is woman’s mark of social standing.
In conclusion, between 1914 and 1931, with each new improvement in construction, that made woman’s use safer, more comfortable, or more convenient, great numbers enlisted in the ranks of women car owners and drivers. Women handle cars today easily, expertly, and fearlessly. Nervous tension has given way to relaxation and real motoring enjoyment. The road to the woman’s market at last is wide open.
—“Increasing Ownership and Use of Motor Cars by Women,”
The Passenger Car Industry, 1932
A decade before women gained the right to vote in the U.S., the women’s rights movement was working hard to claim incremental victories. Most progress came through quiet grit, relentless recruitment, and tireless organizing.
In 1909, the suffrage movement comprised many factions with varying perspectives. Some wanted only partial suffrage (victory at the municipal level), while others believed they must fight for national suffrage. The National Woman’s Party (founded in 1916) was often confrontational, organizing protests and marches. The National American Woman Suffrage Association focused on lobbying. Most work was done at the grass-roots level, with women holding luncheons, lectures, and letter-writing campaigns and traveling to state capitals to make their case.
An article by May K. Warwick from the June 12, 1909, issue of the Post applauded the headway the women’s rights movement had made. In the article, Warwick enumerates the many benefits of giving women the right to vote:
The suffragists assert that their movement has been related to that of higher education for women, and they point with pride to the seven thousand women doctors, three thousand ministers and one thousand lawyers in our country, to the three hundred occupations open to women and to the thousands of women’s organizations.
The author notes that in the four states where women had the right to vote in 1909, regulations protecting women and children were enacted more quickly. This included laws that raised the age of consent, gave women the right to control their own income and property, and established free kindergarten.
Of course, things were still very different in 1909. While Warwick lobbied eloquently for the right to vote, she assured readers that women were not very interested in running for elected office.
The histories of the enfranchised states show that women have not rushed into office and that those they do hold are mainly educational and charitable. They will state that in all the parties women work in harmony with the men.
Warwick notes that if women were able to show equal pluck and determination after getting the right to vote as they did in petitioning for the right, they would prove a formidable force in American politics. Her article ends on an encouraging note:
The people of America have been stirred from their apathy and are thinking about suffrage. The race is indeed not always to the swift nor the battle to the strong; but some kind of worthy success always follows energy and courage.
It’s a good thing that the women had plenty of that energy and courage; the right to vote was still more than a decade away, and the road ahead would be a bumpy one. Protests continued throughout the 1910s. In 1913, the day before Woodrow Wilson’s inauguration, Alice Paul and Lucy Burns organized a suffrage parade. Opponents turned the event into a near riot, and mounted police were called in.
Film of suffragettes marching from Newark, New Jersey to Washington, 1913.
In 1917, 200 suffragists were arrested, and half were convicted, following a protest at the White House. The harsh treatment of some of the women, including forced feeding in prison, swung public sympathy toward the movement.
After several false starts, the 19th amendment was eventually passed and ratified. The first presidential election in which women were permitted to vote in every state occurred in 1920. Despite this fact, many states took their sweet time ratifying the amendment. Mississippi was the last state to do so — in 1984.
In early 1917, America was still several months away from entering the Great War, but wartime production was already underway. In an article from the January 20, 1917, issue of the Post, the author was astonished to find that, when women stepped into factory jobs normally done by men, they performed as well as the men, if not better. Even as he reported anecdotes from various factory managers, he acknowledged how implausible they sounded, noting that the claims “seemed incredible and altogether too much to ask readers to believe.”
“If that’s too much for you,” exclaimed this man in authority over hundreds of women workers at lathes, punches and presses, “you certainly can’t stand for a statement of some of the things that have happened right here under my own eyes.”
“Well; tell me the worst,” I replied.
The manager went on to tell the tale of a woman who could put out 51 pieces per hour to her male predecessor’s measly six. On top of that, he paid her only 19 cents an hour compared to the man’s 60 cents. (Women also broke fewer tools.)
The need for women in munitions plants created more demand for “typical” women’s jobs: stenographers and bookkeepers. Workers could be choosy, and wages began to rise; the average pay for a stenographer was $13 a week, 60 cents more than a decade earlier.
Demand was great enough that women could be assured that not even age was a detriment. An employment agent pointed out, “Do not think for a moment that gray hairs are a handicap to a woman applicant for a position of this class and character. If anything gray hairs count as a help.”
The article moves on to average wages for other workers of the era, including railroad station agents ($75/month), grocery clerks ($50/month), and bank executives ($200/month). But the larger issue of women entering the workforce – and perhaps staying – loomed large. As one manager observed, “Both American industry and American workingwomen have found out something by this experience that neither is going to forget.”