“In the year of our Lord 1314, patriots of Scotland, starving and outnumbered, charged the fields at Bannockburn. They fought like warrior poets, they fought like Scotsmen, and won their freedom.” I wish I could take credit for this stirring poetic tribute, but any fan of historical films knows that this quote is screenwriter Randall Wallace’s final narration for the 1995 film Braveheart. While the portrayal of Bannockburn in Braveheart bears little resemblance to the historical event, it is highly unlikely that anyone besides Scottish history enthusiasts would know about the battle at all had it not been for the film.
Today marks the seven hundredth anniversary of the Battle of Bannockburn. On June 24, 1314, Scottish forces led by Robert the Bruce dealt a decisive defeat to the armies of King Edward II of England. The battle near the “burn,” or stream, of Bannock was the culmination of a series of intricate maneuvers by Bruce to catch the English forces off guard. The actual site of the battle has been disputed, though descriptions place it within the vicinity of the famous burn and near Stirling, where William Wallace had won his first victory over the English in 1297. The battle began with skirmishes on June 23 and continued into June 24, an unusually long engagement for medieval times. Bruce and his forces were probably outnumbered at least two to one; estimates of the number of soldiers engaged vary, and contemporary sources are not helpful in determining an exact count. Bruce attacked Edward’s army with the goal of breaking the siege of Stirling Castle.
The sight of William Wallace’s claymore majestically arcing over the fields of Bannockburn to embed itself in the ground between the English and Scottish forces is one of the iconic images of Hollywood. Bannockburn in Braveheart is the result of a sudden change of heart on the part of Robert the Bruce, who has been struggling with his failure to support William Wallace earlier.
Randall Wallace and Mel Gibson transformed the Bruce into a Judas figure to support the idea of Wallace as a Christ figure who dies for the national deliverance of Scotland. It has often been observed that Braveheart functions in some ways like a draft version of The Passion of the Christ with kilts. The real Robert the Bruce had been fighting alongside Wallace and other Scottish leaders for over a decade when he stepped onto the battlefield at Bannockburn. He certainly had his inconsistencies, but the Bruce was much more than the second fiddle depicted in Randall Wallace’s screenplay. The film suffers historically not only in its portrayal of Robert the Bruce but also in its omission of several important features of the actual battle. The use of elongated spears to check the rush of English cavalry (shown in Braveheart at the Battle of Stirling sans Bridge) was crucial to the Scottish success at Bannockburn. The stream itself, which never appears in the film, helped the Scots slow the English retreat; the waters of the burn ran red with English blood as the Scots fell upon their adversaries.
Bannockburn’s immediate impact was not as dramatic as Braveheart would lead us to believe. It did break the English ambition to control Scotland for a while, but the Scots would have to continue to fight for recognition and security. It was not until the reign of Edward II’s son and successor, Edward III, that the English officially recognized the independence of Scotland and the Bruce as an equal ruler. William Wallace and Robert the Bruce would have been gratified to know that when the English and Scottish crowns were joined again under one ruler in 1603, it was a Scottish king, James VI and I, who traveled south to take the English throne. Despite its flaws, Braveheart generated a great deal of interest in Scottish history and influenced current events in ways that most filmmakers only hope to achieve.
The film helped spark an appetite for Scottish nationalism in the mid-1990s that has continued to grow. On Thursday, September 18, 2014, Scottish citizens will vote on the question, “Should Scotland be an independent country?” The result of that vote has the potential to dramatically change Scotland’s relationship to the United Kingdom. While I personally hope that Scotland chooses to remain within the United Kingdom, I do sympathize with the desire of the Scots to assert their own culture and identity. That culture and identity have contributed enormous richness to the overall culture the world knows as “British.” The question can be legitimately posed: are we seeing the echoes of Bannockburn or of Braveheart in the upcoming Scottish referendum? Or echoes of both historical reality and the Scottish romanticism of Walter Scott refracted through Randall Wallace? Does it matter? It would definitely seem that history, or at the very least how we view history, continues to matter a great deal.
The fractured state of American memorial observance began to heal as the nineteenth century waned and the twentieth began. A variety of factors likely contributed to a restoration of American nationalism in the aftermath of the dislocations of the Civil War and Reconstruction. The generation that had fought the conflict was dwindling by the beginning of the twentieth century; the Grand Army of the Republic itself would dissolve in 1956 after the death of its last living member. The transition to a new generation held forth the promise of national healing. Two conflicts of the early twentieth century also helped to revive a sense of shared national purpose among Americans. The first, the Spanish-American War, preceded the start of the new century by two years and was the first conflict beyond American borders since the Civil War to include young men serving from both sections of the country. The First World War brought young Americans together again on the battlefields of France in 1917 and 1918. Both sections of the nation began to honor their fallen together as a renewed spirit of nationalism rose in the second decade of the century and continued to grow during and after the First World War. While some Southern states continued to officially observe a separate Confederate Memorial Day, they increasingly conformed to the national observance throughout the twentieth century.
Memorial Day observance was set on the last Monday in May as a provision of the Uniform Monday Holiday Act passed by Congress in 1968; the Act took effect in 1971. The official name of the observance, Memorial Day, had been declared by Congress in 1967. The occasion is typically observed today with parades in some cities, the decoration of military graves and monuments, the lowering of flags to half-staff in the morning, and a national moment of remembrance at three o’clock in the afternoon. While these methods of observing the day are typical, they do not constitute a comprehensive list. People have crafted a variety of ways to commemorate the day and the soldiers whose sacrifice is honored on it.
Our nation is engaged at this moment in fractious mid-term election campaigns framed by the tensions of an economic recession and an increasingly divided American public. Our national rhetoric has not been so uncivil, nor our divisions so violent, since the 1960s and ’70s. Some radical commentators on all sides of the political divide have even spoken of secession as a desirable solution to our social and political disagreements. In times like these, it seems we desperately need reminders of our shared national purpose. The awareness that our young people sacrificed, fought, and fell together has served to remind us in the past that, despite our differences, we can be one in purpose in our times of crisis.
There is a commonality in our grief and our pride. Confederate and Union mothers felt the pain of loss equally keenly. Fathers on both sides took pride in the service of their sons in the pursuit of higher ideals. We come together when we stand to honor the sacrifice of our fallen. By drawing us across the lines of our division in respect for their sacrifice, our brave soldiers perform one last act of service for the nation we all love and the freedoms we all value.
I can think of no more appropriate way to honor our fallen servicemen and women today than to close with the words spoken by Abraham Lincoln at the dedication of the national cemetery at the Gettysburg Battlefield on November 19, 1863. These poignant words are rendered more powerful by the awareness that the contest was still undecided when they were drafted. While written for a specific occasion, they speak universally to the significance of the sacrifice made by American soldiers throughout our history.
“Four score and seven years ago our fathers brought forth on this continent, a new nation, conceived in Liberty, and dedicated to the proposition that all men are created equal. Now we are engaged in a great civil war, testing whether that nation, or any nation so conceived and so dedicated, can long endure. We are met on a great battlefield of that war. We have come to dedicate a portion of that field as a final resting place for those who here gave their lives that that nation might live. It is altogether fitting and proper that we should do this. But, in a larger sense, we cannot dedicate—we cannot consecrate—we cannot hallow—this ground. The brave men, living and dead, who struggled here, have consecrated it, far above our poor power to add or detract. The world will little note, nor long remember what we say here, but it can never forget what they did here. It is for us the living, rather, to be dedicated here to the unfinished work which they who fought here have thus far so nobly advanced. It is rather for us to be here dedicated to the great task remaining before us—that from these honored dead we take increased devotion to that cause for which they gave the last full measure of devotion—that we here highly resolve that these dead shall not have died in vain—that this nation, under God, shall have a new birth of freedom— and that government of the people, by the people, for the people, shall not perish from the earth.”
The United States of America has always been a nation of tremendous diversity. Yet we have managed to draw strength from that diversity by uniting our rich population on the foundation of distinctly American ideals. These ideals include a belief in the importance of individual liberty, human rights, and equal opportunities for our citizens. This principle of unity in diversity is featured prominently on our national seal in the Latin phrase E pluribus unum. Out of many we have sought to forge a national unity. We have succeeded at times, especially during moments of national crisis when our corporate identity as Americans overshadowed our tribal differences. Other moments in our history have been much more divisive. There have been times when our national unity has been tested and strained to the breaking point. From the aftermath of our greatest national test, the American Civil War, there began a national emphasis on the collective recognition of our fallen heroes that emphasized those ideals that bind us together as Americans.
Even the warring Biblical siblings Isaac and Ishmael reunited to bury their father. Americans who had undergone the fiery trial of the American Civil War were not so quick to agree on a common memorial for their fallen. Stories abound of women in the South who decorated the graves of Confederate soldiers as the war came to an end. Many communities in both the North and the South held ceremonies to commemorate their fallen, but these ceremonies were held at different times of the year, and different customs were practiced with no standardized format for what was then known as “Decoration Day.” Yale historian David W. Blight argues in his book Race and Reunion: The Civil War in American Memory that former slaves in Charleston, South Carolina were responsible for observing the first Decoration Day event on May 1, 1865. Yet there is no consensus regarding which town deserves the distinction of being the first municipality to celebrate Decoration Day. President Lyndon B. Johnson settled the question practically, if not necessarily factually, by proclaiming in 1966 that Waterloo, NY was the first community to hold a Memorial Day celebration.
The first official proclamation of the memorial observance was issued on May 5, 1868, by General John Logan. Logan was national commander of the Grand Army of the Republic, a fraternal veterans’ organization created in 1866 to serve as a support system and lobbying arm for Union Army veterans. Logan’s “General Order No. 11” stated that the memorial observance was to be held on May 30, 1868, a date chosen because it did not coincide with the anniversary of any particular battle. Logan wrote of the content of the commemoration:
“Let us, then, at the time appointed gather around their sacred remains and garland the passionless mounds above them with the choicest flowers of spring-time; let us raise above them the dear old flag they saved from dishonor; let us in this solemn presence renew our pledges to aid and assist those whom they have left among us a sacred charge upon a nation’s gratitude, the soldier’s and sailor’s widow and orphan.”
The reasons for the memorial observances were stated to match the gravity of the occasion.
“Their soldier lives were the reveille of freedom to a race in chains, and their deaths the tattoo of rebellious tyranny in arms. We should guard their graves with sacred vigilance. All that the consecrated wealth and taste of the nation can add to their adornment and security is but a fitting tribute to the memory of her slain defenders. Let no wanton foot tread rudely on such hallowed grounds. Let pleasant paths invite the coming and going of reverent visitors and fond mourners. Let no vandalism of avarice or neglect, no ravages of time testify to the present or to the coming generations that we have forgotten as a people the cost of a free and undivided republic.”
Sadly, the republic was still divided in spirit if not physically. While all Northern states, starting with New York in 1873, had officially recognized Memorial Day as a holiday by 1890, the Southern states still resisted uniting their observance with the rest of the nation. Southerners seemed to feel that honoring their dead on the same day as the rest of the nation honored fallen Union soldiers cheapened the cause for which the Confederate soldiers had fought and fallen. Southern states observed a separate Confederate Memorial Day in the late spring, possibly beginning as early as 1866.
On the morning of May 19, 1536, Anne Boleyn lost her head. Literally! Convicted of treason and adultery, Anne was sentenced to death by beheading. The execution had been delayed to accommodate the arrival of the executioner, an expert French swordsman. The selection of a skilled executioner was one of the few mercies Henry VIII showed the woman who had once been his obsession. Her failure to provide a male heir condemned her, as it had her predecessor, to the status of irritating inconvenience in the eyes of her husband. The energetic efforts of Thomas Cromwell, chief minister to the king, elevated court rumors to the status of formal charges against five men accused of having carnal knowledge of the Queen. The most shocking of these was the allegation that Anne had committed incest with her own brother, George Boleyn. Anne was also accused of joking about the king’s death with Henry Norris, Knight of Berkshire and one of Henry’s close friends. Most historians today reject the validity of these charges, which appear to be based on rumor, comments taken out of context, and Anne’s flirtatious demeanor with her favorites, a demeanor that seems to have gone no further than playful banter. Thomas Cromwell exploited Anne’s inconvenience in Henry’s eyes, her general unpopularity with the English people, and the rumors circulating at court to engineer the downfall of his former ally.
Anne was certainly the sort of person who elicited strong responses from both admirers and detractors alike. Her rise to power had come at the expense of Queen Catherine of Aragon, a popular and pious queen. Anne had managed to secure Henry’s affection and to provide him with a second chance to produce a male heir for his throne. The price had been incredibly high. Henry transformed the ecclesiastical and political order of his realm through a series of Parliamentary Acts that made him Supreme Head of the English Church and the primary beneficiary of confiscated monastic wealth.
I recently engaged in a historical simulation with one of my classes at Dordt College in which we recreated the Reformation Parliament and the circumstances leading to Anne’s execution. As my students channeled the fractious personalities and events of the sixteenth century, I was reminded of the radically different interpretations of Anne Boleyn’s legacy that circulated in her own era. Her role as a central figure at the beginning of England’s ecclesiastical transformation meant that she was readily adopted as a heroine by Protestant chroniclers such as John Foxe and demonized by Roman Catholic leaders such as Reginald Pole and Stephen Gardiner. Catholic detractors suggested that she had bewitched Henry with dark magic, an idea that may have originated with Henry himself after her miscarriage in January 1536. Stories circulated that her internal evil was mirrored on the outside by physical deformity, including an extra finger. There is a fascinating and entertaining account of the legends surrounding Anne in Susan Bordo’s excellent book The Creation of Anne Boleyn (Houghton Mifflin, 2013). After the rise of Anne’s daughter, Elizabeth I, writers attempted to craft a more honorable legacy for their new Queen’s mother. John Foxe praised Anne in his famous Acts and Monuments, more popularly known as the Book of Martyrs. He identified her contribution to the English Reformations through her presentation of Protestant writings to Henry and her promotion of Protestant officials such as Thomas Cromwell and Thomas Cranmer. Centuries of historical research have gradually chipped away at some of the more fantastic Catholic legends about Anne while also tempering some of the more heroic images advanced by Protestant writers.
What generally remains after historians have sifted through the ashes is a portrait of a fascinatingly complex woman who was neither witch nor saint. Historians, both academic and popular, have had and will always have a variety of responses to Anne. Generally, they agree that the worst of Anne is not true and that the best of her needs some qualification. Anne probably aided the Protestant faction at Henry’s court, and she was most definitely sympathetic to Protestantism. Her charge to Matthew Parker, her chaplain and later Elizabeth’s first Archbishop of Canterbury, to care for her daughter may have left a Protestant imprint on the future queen, one later cultivated by Elizabeth’s stepmother, Katherine Parr, and by Protestant tutors such as Roger Ascham. She certainly had a genuine motherly affection for her daughter. And she was probably innocent of the charges that resulted in her execution.
But Anne’s status should definitely read “it’s complicated.” Protestant admirers often overlooked the fact that Anne achieved influence through the employment of what Henry referred to admiringly as her “French ways.” Her path to power was paved with the seduction of another woman’s husband, the displacement and humiliation of Queen Catherine, and Henry’s rejection of his own daughter Mary. One has to remember, when admiring her many good qualities, that her dark side could be very dark. She could be explosive and vengeful when things did not go her way. And would we really have it any other way? It is the explosive mix of virtue and vice, calculation and volatility, that makes her story so interesting.
Historians often use the disastrous campaign of the Spanish Armada in 1588 as the perfect illustration of why strictly providential history is so problematic. I think the polarized interpretations of figures like Anne Boleyn illustrate the potential pitfalls of providential history as well. Providential history could be defined as history that operates on the assumption that the interpreter understands divine agency in history and can relate every historical event to their interpretation of God’s plan for the ages. I see this kind of historical interpretation as different from the work that academic historians of faith do. A person can have a faith commitment in their historical studies and a general sense that the will of God or ultimate realities guide historical events in a coherent fashion without necessarily engaging in providential history. Providential history pushes interpretation of historical events to the point of identifying precisely what God was up to in specific historical circumstances. Mature people of faith know that what God is doing is not always obvious, nor does it always make sense from a human perspective. What we judge to be evil God may intend for good, and what we judge to be good could have the unintended consequence of promoting evil in human societies. It is hard enough to see the divine plan in our present circumstances. It is just as hard, if not harder, to discern the hand of divine providence in the past. Historical interpretation, much like Anne, is complicated. And that is what makes it fun! My students are most engaged in my classes when we move beyond the Joe Friday “just the facts, ma’am” approach to history to actually engaging the interesting questions of historical interpretation. Those questions and issues are the stuff that makes history perennially relevant.
The Immortal Sherlock: The Resurrection of Sherlock Holmes and an Author’s Quest to Accept His Place in the Great Story
Arthur Conan Doyle was done. In December 1893, he finally disposed of the burden that had dogged him since 1887. Ironically, that perceived burden was also the means of his economic success and growing fame. Sherlock Holmes had become odious to his creator, and therefore Sherlock had to go. Despite the ravenous appetite of a growing reading public, Arthur Conan Doyle decided to dump his famous detective into the swirling waters at the bottom of the Reichenbach Falls near Meiringen, Switzerland in The Final Problem. The Final Problem was an unusual Holmes story: there was no puzzle or mystery involved. Most of the story follows Holmes and Watson traveling across Europe to elude the vengeance of Professor James Moriarty. Viewers of Sherlock or Elementary who are not familiar with the Holmes literary canon might be surprised to learn that Moriarty was created solely to provide an antagonist worthy of Holmes’ final case. Moriarty appeared only in The Final Problem (1893), in Holmes’ account of his own survival in The Empty House (1903), and as the unseen orchestrator of events in The Valley of Fear (1914). Fans of subsequent Holmes books, movies, and television series who have enjoyed the intellectual rapier thrusts exchanged between the legendary consulting detective and the “Napoleon of Crime” are indebted to Doyle’s frustration with Holmes for the creation of his brilliant opposite.
For those of us who love history and love to write, the drama behind the story is often as compelling as what unfolds on the page or screen. Doyle had filled the empty hours of his struggling medical practice with the composition of his Sherlock Holmes tales. These stories brought him success and public recognition. They eventually came to frustrate Doyle as he began to feel that the persistent deadlines for Holmes stories prevented him from pursuing work on historical novels. He felt that these novels were a greater literary contribution than what he considered to be frivolous detective fiction. His wife’s terminal illness sealed Doyle’s resolution to free himself from the demands of writing Holmes so that he might devote more time to her care and to those literary pursuits that he deemed more serious. Having completed The Final Problem, Doyle tersely observed in his notebook: “Killed Holmes.”
Doyle may have been casual about disposing of his famous creation, but his fans were most definitely not satisfied to leave their hero beneath the swirling depths in Switzerland. Fans expressed emotions ranging from mild disappointment to outrage; one woman famously began her letter to Doyle with the salutation, “You Beast!” Despite the furious clamor to have Holmes return, Doyle resisted for several years. It was not until 1901 that he finally relented with the publication of The Hound of the Baskervilles, arguably the most famous Holmes tale. He stressed that this story was set before The Final Problem and did not constitute a resurrection for Holmes. Having tested the waters, Doyle finally resolved to restore Holmes to life with the publication of The Adventure of the Empty House in 1903.
Public pressure is often credited with pushing Doyle to revive Holmes. While that is probably true to a degree, Doyle also seemed to have reached a point of acceptance in regard to being known as the creator of Sherlock Holmes. Graham Moore’s 2010 novel The Sherlockian offers an entertaining fictional scenario of Doyle working through his frustrations with Holmes in the midst of a real case. Most of Doyle’s other pursuits, while successful in different ways, did not ensure him the visibility or immortality that Holmes gave him. No one living today besides a few specialists and avid Doyle fans has actually read The White Company (1891), Doyle’s gripping tale of the Hundred Years’ War. The only reason anyone besides a few paranormal aficionados cares that Doyle was an avid spiritualist is because he was also the creator of Sherlock Holmes. His chronicle of the Boer War and his service to the British Empire were notable at the time, but it is Holmes who inspires non-Brits to care about those aspects of his career.
Ironically, it is often harder for historians and authors than for others to accept that none of us usually gets to write our own story; we who write everyone else’s stories are accustomed to controlling the narrative. However Doyle saw himself, the world would forever see him as the creator of Sherlock Holmes. Whether or not Doyle ever fully understood that, he did come to accept that he had created something worthwhile. We do not get to choose our times or the challenges and opportunities they yield. Sometimes we are not the best judges of our most lasting contributions. What we assume to be fluff or frivolity may be impacting people in positive ways beyond our understanding.
One hundred twenty years after the first appearance of The Final Problem, American audiences are preparing for the premiere of the third season of the BBC series Sherlock on January 19. The first episode, The Empty Hearse, will unveil the resolution of yet another adaptation of Doyle’s famous plunge into the falls. The next Guy Ritchie Holmes film will also have to explain Holmes’ escape from his plunge with Moriarty at the end of A Game of Shadows (2011). Long after The Final Problem, Sherlock Holmes is alive and well. And his survival ensures the continued cultural immortality of Arthur Conan Doyle, the creator who once sought so eagerly to destroy his creation.
A number of significant historical events will be commemorated this weekend. The assassination of John F. Kennedy and the death of the Christian writer C. S. Lewis occurred on November 22, 1963, with Kennedy’s death overshadowing that of Lewis. Coverage of Kennedy’s assassination was the dominant news item on November 23. Amidst the buzz of history unfolding, a program premiered on the British Broadcasting Corporation. The pilot episode was repeated later the same week because network executives were concerned that the rollout of the show had been adversely affected by extended media coverage of the Kennedy assassination. The new program was a quirky science fiction serial about a time-travelling alien and his companions who hopped across time and space in a time machine disguised as a blue police box. Fifty years later, he continues to traverse the stars and the ages with no sign of stopping any time soon.
Fans of the recent incarnations of Doctor Who often forget that the program was initially launched in 1963 as an educational series. This goal was explicitly indicated by the inclusion of the Doctor’s initial companions in the pilot episode, “An Unearthly Child.” Ian Chesterton and Barbara Wright (portrayed by William Russell and Jacqueline Hill) were both teachers at the fictional Coal Hill School in London; Wright taught history while Chesterton taught science. They were swept into the adventures of the Doctor when his granddaughter, Susan (portrayed by Carole Ann Ford), became one of their pupils. Using the surname Foreman, taken from the name of the junkyard where the Doctor had hidden the TARDIS (Time and Relative Dimension in Space, an acronym we later learn was coined by Susan), Susan enrolled at Coal Hill School in order to satisfy her craving for learning and her desire to experience life on earth. Her two teachers were fascinated by Susan’s advanced knowledge in their areas of expertise, but simultaneously puzzled by her lack of familiarity with basic popular culture. They were also concerned about Susan’s comments that her grandfather was reclusive and did not enjoy visitors. Their concern led them to Foreman’s junkyard, where they discovered the secret of the TARDIS and its occupants. The Doctor, in his first incarnation played by William Hartnell, then carried them into the past against their will to prevent them from sharing the true origins of the Doctor and Susan.
From their first adventure, dealing with a power struggle between Stone Age rivals, to current Doctor Matt Smith’s chummy relationship with Winston Churchill, history has been an integral part of Doctor Who. The series regularly accomplishes one of the greatest challenges facing historians: to educate in such an entertaining way that people barely realize how much they are learning in the process. There are a number of ways in which the series conveys profound lessons about the nature of history, the historical profession, and the wisdom to be gleaned from historical awareness.
The ageless nature of the Doctor provides the opportunity for people to discover the longer perspective that can only be gleaned from historical awareness. Much like one of my other favorite series, the departed but fondly remembered Highlander: The Series, Doctor Who offers viewers a hero whose extended lifespan enables him to experience the grand sweep of historical events in a way that the rest of us, with our shorter lifespans and temporal limitations, can never achieve. We can only do so vicariously through the medium of historical accounts. The Doctor has seen it all and learned profound lessons through what he has seen. Much like historians, one of his greatest delights is to share the lessons he has learned on his historical journeys with others. Associating himself with companions is one way he is able to share his knowledge and experience. The Doctor has an advantage over the rest of us in that he can take his companions into the events themselves, with all the associated sights, smells, and sensations. And all the dangers too. From interactions with Queen Victoria to negotiations with Richard Nixon, the Doctor enjoys the unlimited access to all cultures, all times, and all locations that is the intellectual stock-in-trade of the historian. Though strangely, as has often been noted, this free spirit of time and space does seem to have an unusual affinity for late twentieth- and early twenty-first-century London.
The Doctor’s adventures have demonstrated time and again the temporal nature of our existence and the impermanence of our cultures. The ninth Doctor (Christopher Eccleston) literally takes his companion Rose Tyler (Billie Piper) to the end of the world in the episode fittingly entitled “The End of the World.” A persistent theme of the series is the impermanence of all things and the inevitability of change. Even the Time Lords were “exterminated” in their war with the Daleks, with some help from the Doctor that will hopefully be explained in the fiftieth anniversary episode. This theme is reflected in the Doctor himself, who undergoes a regenerative cycle when his current life ends. This regenerative cycle allows the Doctor to survive, but with a different appearance and personality. In the episode “The Name of the Doctor,” we learn that even the Doctor himself will one day die in the distant future. The rise and fall of human civilizations, as well as the extinction of fictional alien civilizations, testifies to the constancy of change in the unpredictable universe of Doctor Who.
A certain degree of cynicism and detachment is one of the continuing risks of historical awareness. One of my faculty colleagues once remarked that historians are “some of the most cynical people I know.” The Biblical lament of Qoheleth (the Preacher) in the book of Ecclesiastes that “there is nothing new under the sun” easily becomes the weary shibboleth of the cynic whose perspective on the endless tragedies of history has induced a sense of numbed detachment. The Doctor combats this detachment in part through his attachment to his companions. These relationships anchor the Doctor to the importance of particular existence and individual needs as a counter to the sweeping perspective he gains from temporal freedom. This crucial function of the companions was illustrated by Amy Pond (Karen Gillan) when she urged the Doctor to risk the destruction of the futuristic “Starship U.K.” to save a single creature in “The Beast Below.” Her persistence ultimately prompts the Doctor to resist his pragmatic urge to sacrifice one creature to save many others and to find a solution that could benefit all parties. While the Doctor is not “human,” his companions do serve to keep him in touch with his “humanity.” There is a sense in which the student of history must be anchored to both past and present, theory and application, wisdom and compassion. It is exceedingly tragic when someone lives in the past with no reference to the concerns of the present, enamored with grand ideas but disconnected from vital human contact. What an irony when someone appreciates the sweep of history while ignoring the drama of their own times! The sense that particular persons matter even in the endless unfolding of broader events fuels the Doctor’s strong sense of ethics.
Doctor Who has underscored the role of history as a laboratory for ethical issues throughout its run. One early serial, The Aztecs, posed the impossible dilemma of whether we should change history if we could. Barbara Wright is presented with the incredible opportunity to influence Aztec beliefs regarding human sacrifice when the Aztec people mistake her for a deity. Barbara’s compassion and idealism prompt her to consider forbidding the continuation of human sacrifice. The Doctor insists that she does not have the right to change history even for such a noble cause. He proclaims, “What you are trying to do is utterly impossible. Believe me, I know.” Much more is at stake in this debate than just the fictional conceptions of how space and time operate. Idealism and sober experience are colliding in the face of human depravity and atrocity. Doctor Who embraces the notion that human history is an interweaving series of light and darkness, nobility and atrocity. Doing the right thing is never simple. Good intentions often do great damage, while those with the darkest tendencies are often co-opted despite themselves to serve the cause of justice. The historical phenomenon of unintended consequences is frequently on display as the Doctor tackles ethical issues. The greatest example of this theme may be yet to come with the official explanation of the Doctor’s role in ending the Time War, which will be a central story point of the fiftieth anniversary special. John Hurt’s shadow Doctor, identified as the “War Doctor” in the recently released prequel “The Night of the Doctor,” proclaims, “What I did, I did in the name of peace and sanity.” To which Matt Smith’s eleventh Doctor replies, “But not in the name of the Doctor.” Moral choices are never simple, nor are their consequences easily controlled, in either the Doctor’s universe or our own.
C. S. Lewis, who died the day before Doctor Who premiered, advocated the importance of fiction as an “escape to reality.” In Lewis’ fiction, journeying to fantastic worlds became a window for connecting with profound theological concepts that were valid in the real world. Doctor Who fits Lewis’ concept of an escape to reality quite well. For fifty years, fans have been treated to an entertaining escape to a parallel world. In that fantastic reality, the Doctor and his companions engage us with reality in a way that is profound, insightful, and thoroughly enjoyable. The Doctor, in all his incarnations, has been our eccentric yet trustworthy guide for fifty years. Happy Fiftieth, Doctor, and may your journeys continue for at least fifty more!
Few individuals have had a more interesting life and afterlife than Guy Fawkes (1570-1606). Fawkes provided the basis for one of the most interesting annual celebrations in Great Britain when he participated in a plot to destroy the meeting hall of the House of Lords on November 5, 1605. Of course, the plan also involved destroying the Parliamentarians themselves along with King James I. The resulting political vacuum was intended to provide an opportunity for the Catholic faction led by Robert Catesby, which had recruited Fawkes, to seize power through James’ daughter, Elizabeth. The plot was a comedy of errors. It began to capsize when an anonymous note arrived at the home of Lord Monteagle on October 26 warning him not to attend the upcoming Parliamentary session. The plot continued to unravel when Fawkes was apprehended leaving the cellars beneath the House of Lords. Despite Fawkes’ attempts to persuade his captors that his intentions were honorable, the sizable cache of gunpowder he had hidden in the cellar told a different tale. Fawkes suffered torture on the rack and eventually revealed the names of eight of the thirteen conspirators. He was executed alongside several fellow conspirators on January 31, 1606. Some chroniclers wrote that Fawkes leaped from the scaffold after the noose was around his neck in order to avoid more protracted suffering.
The Gunpowder Plot itself is a fascinating episode in British history, but the unusual celebration that arose in its wake is equally intriguing. Parliament passed an act in January of 1606 authorizing annual public celebrations to be held on November 5 each year in commemoration of God’s preserving the life of the king. The idea was proposed by a Puritan M.P., Lord Montague, and the observance remained popular among Puritans during the early seventeenth century. The Gunpowder Plot observance in Britain and her colonies often involved fireworks, bonfires, and the burning of an effigy known by the nineteenth century as “the Guy.” This practice replaced earlier observances in which the effigy represented the Pope. It became customary to burn effigies of controversial political and social figures, and the celebration became increasingly known as Guy Fawkes Day during the eighteenth and nineteenth centuries. Children often canvassed their neighborhoods during the mid-twentieth century asking for “a penny for the Guy.” A ceremonial “search of the cellars” is still part of the festivities that accompany the opening of the British Parliament. The celebrations waxed and waned through the centuries, and the holiday has struggled in the last couple of decades to retain its hold on the British imagination.
The world of imagination has intervened to preserve the memory of Fawkes himself, albeit in a somewhat altered form. Alan Moore’s comic strip and graphic novel V for Vendetta, published between 1982 and 1989, featured a protagonist codenamed “V” who wore a chalky white Guy Fawkes mask. “V” was essentially a terrorist attempting to subvert a totalitarian regime in a future Great Britain. The movie adaptation starring Hugo Weaving and Natalie Portman was released in 2005, and it introduced Moore’s image of Guy Fawkes as a model for resistance to tyranny to a broader audience. “V” is portrayed as a terrorist, but a terrorist who does what he does for noble ends. This generous reading of “V”’s motivations is implicitly extended to Fawkes as well, who, we are invited to imagine, was just trying to blow up that pesky Parliament for the cause of freedom.
One only has to note how the visage of Guy Fawkes has been adopted by movements such as Occupy Wall Street to see the popularity of this interpretation. In fact, the contemporary popular evaluation of Guy Fawkes is a great example of how iconic historical figures can become completely separated from their real-life inspirations. The real Guy Fawkes was acting to secure religious freedom for Catholics, but only so that Catholics could in turn ban Protestantism in England. His methods, which would have caused many deaths, are certainly questionable. This sobering reality is especially important in our troubled times, when religious radicals are all too ready to strap on explosives and broadcast their beliefs with violence. What did the Guy say? What does the Guy say? Not what many people suppose he said.
The Gunpowder Plot and the story of Guy Fawkes are reminders that coercing the human conscience is never the proper path to fostering true belief or morality. It is a reminder that violence is not the solution to our disagreements regarding temporal or ultimate questions. And it is a reminder that we ought not to take ourselves too seriously. For out of conspiracy, folly, and tragedy can arise festivity and celebration. So cast aside your cares. Stoke the fire. Throw your blazing effigies to the flames. And, if you come this way, spare a penny for the Guy.
To learn more about Guy Fawkes and the Gunpowder Plot, check out the resources below.
Antonia Fraser, Faith and Treason: The Story of the Gunpowder Plot. Anchor, 1997.
John Paul Davis, Pity for the Guy: A Biography of Guy Fawkes. Peter Owen Publishers, 2010.
James Sharpe, Remember, Remember: A Cultural History of Guy Fawkes Day. Harvard University Press, 2005.
Protestant Christians observe two important anniversaries the last week of October. Protestants recognize October 31 as the anniversary of the formal beginning of the Protestant Reformation or Reformations in Europe. This annual observance commemorates the public presentation of Martin Luther’s “95 Theses” on October 31, 1517. Some Roman Catholic and Orthodox Christians also note the day and provide their own interpretation of the events that created the third major wing of Christianity. Celebrations of “Reformation Day” have become more elaborate as momentum builds toward the five hundredth anniversary of Luther’s “95 Theses” in 2017. A smaller subset of Protestants is also marking the one hundred seventy-sixth anniversary of the birth of Abraham Kuyper on October 29, 1837. Though Kuyper’s life and career are not as well-known as Luther’s, a significant number of Reformed Protestants and evangelical scholars hold a deep appreciation for Kuyper and his influence.
The close association of these two significant dates serves as a great reminder that Christianity and the academy have had a long relationship in western culture. The relationship between the life of faith and the formation of the intellect has sometimes been mutually supportive and, at other times, extremely contentious. Luther and Kuyper both exemplify the emphasis many Protestants have placed on education over the last five hundred years. That emphasis posits a crucial relationship between vibrant Christian intellectual pursuits and the ongoing need for reform of the church embodied in the Reformation credo Semper Reformanda (Always Reforming).
Martin Luther (1483-1546) was a product of both the Augustinian monastery and the academy. People today so often focus on his career as a polemicist that they forget his daily roles as priest and professor occupied most of his time. Throughout his career he continued to emphasize the life of the mind as a natural extension of the life of faith. His deepest insights regarding the nature of justification by faith and sanctification as gratitude emerged in the context of his preparation to teach the scriptures to Wittenberg students. Education was central to the Reformation project of disseminating scripture to every person.
Abraham Kuyper (1837-1920) lived three centuries after Luther and dealt with the emerging clash between modernism and Christianity in his native Netherlands. Kuyper was a Renaissance man whose pursuits ranged from journalism to theology, education, and political activism. His primary obsession was the application of a Reformed Christian perspective to the totality of human experience. He believed that God had ordained a creational structure composed of separate spheres of human governance and activity. These spheres had different functions and modes of operation, but all were identified by Kuyper as legitimate and necessary arenas for Christian influence. Kuyper expressed his strong support for a Christian role in education through his advocacy for open schooling in the Netherlands, his support for a system of Christian schools, and his help in founding the Free University of Amsterdam in 1880. It was at the opening of the Free University that Kuyper made his famous assertion: “In the total expanse of human life there is not a single square inch of which the Christ, who alone is sovereign, does not declare, ‘That is mine!’”
Luther launched a reformation of the soul and spirit while Kuyper urged a reformation of culture and creation. The differences between those programs are not as pronounced as one might think at first glance. Luther’s spiritual reformation had profound cultural implications and Kuyper’s cultural reformation was firmly grounded in the spiritual and theological tradition of the Protestant Reformers. Both depended on the importance of the essential unity of the life of the mind and the life of faith.
Church and academy, even the Christian academy, are sometimes wary, lukewarm allies in the contemporary world. Rank and file Christians often assume that Christian academics are disconnected ivory tower dwellers while many academics are equally guilty of assuming that the “uninformed masses” are unable to offer anything of true substance in the great conversations of our day. Martin Luther and Abraham Kuyper might suggest that Christians should take better care of their academics and Christian academics should demonstrate a deeper commitment to the church.
Why does it matter? In part because a Christian church that is truly committed to the lived reality of Semper Reformanda is constantly in need of the resources that come from both the heart and the head. The integral unity of faith and reason, credulity and critique, exhortation and accountability is essential to continuous improvement in academic parlance or constant reformation in theological terminology.
The forces that war against Semper Reformanda are legion. The maverick movements that Luther and Kuyper founded have now been the establishment themselves long enough to spawn their own maverick movements. Aspiring young Luthers today often find that the very Christian entities that taught them to revere the German Reformer do not look too kindly on young upstarts who dare to upset the ecclesiastical applecart. American Christians have far too often in recent years discarded Christian models of servant leadership and community accountability in favor of the stark authoritarian pragmatism advocated by Martin Luther’s contemporary, Niccolò Machiavelli.
Despite the challenges, the examples of Martin Luther and Abraham Kuyper are a reminder that the vision is worth the struggle. “The Reformation” can never be a past event. The entire worldview of the Reformers hinged on the reality that humans are saved by grace alone and that we never reach a finished state in this life. We are always in need of further growth. Our institutions are equally imperfect and in need of constant advancement. True Reformation must always be a continuing process requiring the combined energies of church and academy.