What witch-hunters can teach us about today’s world

David Frankfurter, Boston University

It is hardly a new observation that political leaders seeking populist appeal will exacerbate popular fears: about immigrants, terrorists and the other.

President Donald Trump plays to fears of immigrants and Muslims. Benjamin Netanyahu inflames Israeli fears by constantly reminding citizens about the threats around them. And many African leaders bring up fears of satanism and witchcraft. In earlier times, too, American and European leaders invoked threats of communists and Jews.

Such observations explain how leaders use fear to create popular anxiety. But this focus on fear and evil forces, I believe, does something else as well – it could actually contribute to a leader’s charisma. He or she becomes the one person who knows the extent of a threat and also how to address it.

This path to leadership takes place in much smaller-scale situations too, as I have studied in my own work.

In my book “Evil Incarnate,” I analyze this relationship between claims to discern evil and charismatic authority across history, from European and African witch-finders to modern experts in so-called satanic ritual abuse.

How charisma works

In popular parlance, we call a person charismatic because he or she seems to possess some inner force to which people are drawn.

Social scientists have long perceived this ostensible inner force as the product of social interaction: Charisma, in this interpretation, arises in the interplay between leaders and their audiences. The audiences present their own enthusiasms, needs and fears to the leader. The leader, for his part, mirrors these feelings through his talents in gesture, rhetoric, his conviction in his own abilities and his particular messages about danger and hope.

Witch doctors in Africa resting between dances.
Library of Congress Prints and Photographs Division Washington, D.C. 20540

In sub-Saharan Africa, over the course of the 20th century, charismatic witch-finders swept through villages promising the cleansing of evil. In both Africa and Europe, communities had long been familiar with witches and their modes of attack in general. It has been common in many cultures throughout history to attribute misfortune to witches, who are at once part of society and malevolent. Misfortunes can thus seem to be the product of human malevolence rather than of some abstract divine or natural cause.

Witch-finders, as I see it, have offered four new elements to the “basic” image of witches:

  • They proclaimed the immediacy of the threat of witches.
  • They revealed the new methods witches were using to subvert the village or afflict children.
  • They offered new procedures for interrogating and eliminating witches.
  • And most importantly, they proclaimed their own unique capacity to discern the witches and their new techniques to purge them from the community.

The witch-finder could show people material evidence of witches’ activity: grotesque dolls or buried gourds, for example. He – rarely she – could coerce others to testify against an accused witch. Often, he would present himself as the target of witches’ active enmity, detailing the threats they had made against him and the attacks he had suffered.

The witch-finder’s authority over – and indispensability to – the growing crisis of threatening evil shaped his charisma. People came to depend on his capacity to see evil and on his techniques of ridding it from the land. An uncleansed village felt vulnerable, awash in malevolent powers, one’s neighbors all suspect; while a village that a witch-finder had investigated seemed safer, calmer, its paths and alleys swept of evil substances.

Witch hunts, satanic cults

Of course, for a witch-finder to be successful in activating fears, many circumstances, both historical and social, had to work in his favor. These could be catastrophes like the plague, new ways of organizing the world (such as African colonialism), or political tensions – all of which could make his identification of evil people especially useful, even necessary. He also had to come across as professional and have the ability to translate local fears in compelling ways.

Indeed, there were many situations in both Europe and Africa when such claims to authority failed to stimulate a sense of crisis or to legitimate witch-finders’ procedures.

St. Bernardino of Siena.
Fr Lawrence Lew, O.P., CC BY-NC-ND

For example, in 15th-century Europe, the Franciscan friar Bernardino was able to instigate horrific witch-burnings in Rome but failed to persuade the people of Siena of the dangers witches posed.

But there have been times when this pattern came together, producing outright panic and ensuing atrocities. As historians Miri Rubin and Ronald Hsia have described, various such charismatic discerners of evil in medieval and Renaissance Northern Europe (often Christian clergy and friars) promoted false charges that local Jews hungered for stolen Eucharists or for the blood of Christian children.

These charismatic leaders organized hunts through Jewish houses to uncover signs of mutilated Eucharist or children’s bones – hunts that swiftly turned into pogroms, as participants in these hunts felt a conspiracy of evil was emerging before them.

The contemporary West has in no way been immune to these patterns on both large and more restricted scales. During the late 1980s and early 1990s, the United States and the United Kingdom found themselves facing a panic over satanic cults, alleged to be sexually abusing children and adults.

In this case, a number of psychiatrists, child protection officers, police and evangelical clergy styled themselves as experts in discerning the abuses of satanists, both in daycare centers and among psychiatric patients. Many people came to believe in the urgency of the satanic threat. Yet no evidence for the existence of such satanic cults ever came to light.

Needs of an anxious culture

In many ways, we can see a similar interplay between charisma and the discernment of evil in modern leaders who seek populist appeal.

For example, in his campaign Trump insisted that he alone could utter the words “radical Islamic terrorism,” assuring members of his audience that only he was calling out “the terrorist threat.” In the Philippines, President Rodrigo Duterte publicly threatened to eat the livers of the terrorists there. These leaders, I believe, are trying to convey that there is a larger threat out there and, even more, they are assuring people that the leader alone understands the nature of that larger threat. Trump’s several attempts to ban Muslim visitors since his election have made his supporters feel understood and safer.

As my work on witch-finders shows, an anxious culture may invest itself in a leader who, it feels, can discern and eliminate a pervasive and subversive evil. Perhaps, in today’s world, the terrorist has become the new “witch”: a monstrous incarnation of evil, posing a unique threat to our communities and undeserving of normal justice.

Do our leaders provide the charismatic leadership for this current era?

David Frankfurter, Professor of Religion, Boston University

How the hijab has grown into a fashion industry

Faegheh Shirazi, University of Texas at Austin

Nike, the well-known U.S. sportswear company, recently introduced a sports hijab. The reaction to this has been mixed: There are those who are applauding Nike for its inclusiveness of Muslim women who want to cover their hair, and there are those who accuse it of abetting women’s subjugation.

Nike, in fact, is not the first corporate brand to champion the hijab. I am the author of “Brand Islam,” and I have seen how it is commonly assumed, particularly in the West, that Muslim women are indifferent to fashion.

Nothing could be further from the truth: My research shows that Islamic fashion is a rapidly growing industry.

History of sports hijab

The use of an official sports hijab in competition dates back to July 2012 when the International Football Association Board (IFAB), custodians of the rules of soccer, overturned a 2007 ban which had argued that the hijab was “unsafe” for sports persons as it could “increase” the risk of neck injuries.

While overturning the ban, the IFAB noted that there was nothing in “the medical literature concerning injuries as a result of wearing a headscarf.” The sports hijab is secured in place with magnets. If it does get pulled off, another cap remains underneath, to cover the sports person’s hair without causing any injuries.

In 2012, Muslim athletes wearing the hijab received considerable media attention. Wearing the hijab set them apart from other Olympic athletes. Since then, several lesser-known sports hijab companies – well before Nike’s Pro Hijab – have entered the business.

History of Islamic fashion

The marketing of Islamic fashionable clothing, however, is older than the sports hijab.

In my research, I found that it started in the 1980s when ethnic grocery dealers in Western Europe and the United States began importing modest fashion clothing along with other items for the Muslim population. That proved to be a successful business.

Prior to that, most Muslim women would put together their own styles.

These small endeavors ultimately morphed into a competitive and lucrative Muslim fashion industry. Islamic fashion is generally understood as women wearing modest clothing with long sleeves, a hemline descending to the ankle and a high neckline. The outfits are loose-fitting rather than form-hugging, with some form of head covering that can be draped in a variety of styles. Women who prefer to wear pants combine them with a long-sleeved top that covers the buttocks and has a high neckline, along with a head covering.

Over time, national and international designers came to be involved in the sale of chic Islamic fashions. Today, Muslim fashion is a lucrative global industry with countries such as Indonesia, Malaysia and Turkey leading the way outside the Western countries. In 2010 the Turkish newspaper Milliyet estimated the global Islamic clothing market to be worth around US$2.9 billion.

The Global Islamic Economy report for 2014-2015 indicated Muslim consumer spending on clothing and footwear had increased to $266 billion in 2013. This represented a growth of 11.9 percent of global spending over a period of three years. The report predicted this market would reach $488 billion by 2019.

The Islamic brand

This growth has had its share of controversies: Many designers use the term “Islamic” for their clothing. Religious conservatives and Muslim scholars have raised questions about what types of apparel would fit that category and whether defining clothing as “Islamic” was even permitted or lawful by Islamic principles – a concept known as “halal.”

In particular, critics have objected to the fashion catwalk presentations, which draw the gaze and attention of spectators to the bodies of models, while the purpose of a hijab is to deflect the gaze away from the body. In Iran, for example, Islamic fashion is viewed by the ulama (religious scholars) as another Western influence and referred to as “Western Hijab.”

Defining clothing as Islamic has been controversial.
karmakazesal, CC BY

Nonetheless, the Islamic fashion industry has managed to initiate marketing campaigns that capitalize on the very core of Islamic precepts: Sharia, or the Islamic religious law. A Malaysian apparel company, Kivitz, for example, uses the phrase “Syar’i and Stylish.” In Malay, Syar’i is the same as Sharia.

In establishing a nominally Islamic brand, marketers make every effort to align their products with the core value of Islam. So even when they follow trendy seasonal colors and materials, the clothing styles include some sort of head covering.

Who are the consumers?

The question still remains: What led to such rapid growth over a span of just three years?

My research has demonstrated that Muslims are more brand-aware than the general population. However, in the past they were largely ignored by the fashion industry, perhaps due to misconceptions that being Muslim restricts people’s lifestyles.

And now, with a growing Muslim population, there is an increased demand for modest but also fashionable clothing for the youth, who have significant spending power. At the same time, traditional elite and wealthy Middle Eastern consumers who used to shop for fashionable clothing from European nations now prefer to shop from homegrown Muslim fashion designers.

Indeed, the halal logo on food and other products, together with modesty in clothing, has proved to be an effective strategy for creating a global Islamic identity.

As I have seen in my research, consumerism is changing what it means to be modern and Muslim today. As Vali Nasr, a Middle Eastern scholar, explains,

“The great battle for the soul of the Muslim world will be fought not over religion but over market capitalism.”

Faegheh Shirazi, Professor, Department of Middle Eastern Studies, University of Texas at Austin

Blasphemy isn’t just a problem in the Muslim world

Steve Pinkerton, Case Western Reserve University

Ireland’s state police recently concluded their investigation of comedian Stephen Fry, who stood accused of criminal blasphemy.

In an interview that aired on Irish public television, Fry had described God as “capricious, mean-minded, stupid,” and “an utter maniac.” And Ireland’s Defamation Act of 2009 clearly prohibits the “publication or utterance of blasphemous matter.” Yet on May 8 the police closed the case, explaining they’d been “unable to find a substantial number of outraged people.”

The mild resolution to this incident stands in stark contrast to recent news out of Pakistan – which has seen a spike in blasphemy-related violence – and Indonesia, where the outgoing governor of Jakarta was just sentenced to two years in prison for speaking irreverently against Islam.

The Irish case is also a timely reminder, though, that anti-blasphemy laws are hardly unique to the Muslim world. According to the Pew Research Center, nearly one-fifth of European countries and a third of countries in the Americas, notably Canada, have laws against blasphemy.

In my research for a new literary study of blasphemy, I found that these laws may differ in many respects from their better-known counterparts in Muslim nations, but they also share some common features with them.

In particular, they’re all united in regarding blasphemy as a form of “injury” – even as they disagree about what, exactly, blasphemy injures.

The hurt of blasphemy

In dropping their investigation of Stephen Fry, for example, the Irish police noted that the original complainant does not consider himself personally offended. Therefore they’ve determined he is “not an injured party.”

In the Muslim world, such injured parties are often a lot easier to find. Cultural anthropologist Saba Mahmood says that many devout Muslims perceive blasphemy as an almost physical injury: an intolerable offense that hurts both God himself and the whole community of the faithful.

For Mahmood that perception was brought powerfully home in 2005, when a Danish newspaper published cartoons depicting the prophet Muhammad. Interviewing a number of Muslims at the time, Mahmood was “struck,” she writes, “by the sense of personal loss” they conveyed. People she interviewed were very clear on this point:

“The idea that we should just get over this hurt makes me so mad.”

“I would have felt less wounded if the object of ridicule were my own parents.”

The intensity of this “hurt,” “wounding” and “ridicule” helps to explain how blasphemy can remain a capital offense in a theocratic state like Pakistan. The punishment is tailored to the enormity of the perceived crime.

That may sound like a foreign concept to secular ears. The reality, though, is that most Western blasphemy laws are rooted in a similar logic of religious offense.

As historians like Leonard Levy and David Nash have documented, these laws – dating, mostly, from the 1200s to the early 1800s – were designed to protect Christian beliefs and practices from the sort of “hurt” and “ridicule” that animates Islamic blasphemy laws today. But as the West became increasingly secular, religious injury gradually lost much of its power to provoke. By the mid-20th century, most Western blasphemy laws had become virtually dead letters.

That’s certainly true of the U.S., where such laws remain “on the books” in six states but haven’t been invoked since at least the early 1970s. They’re now widely held to be nullified by the First Amendment.

Yet looking beyond the American context, one will find that blasphemy laws are hardly obsolete throughout the West. Instead, they’re acquiring new uses for the 21st century.

Religious offense in a secular world

Consider the case of a Danish man who was charged with blasphemy, in February, for burning a Quran and for posting a video of the act online.

In the past, Denmark’s blasphemy law had only ever been enforced to punish anti-Christian expression. (It was last used in 1946.) Today it serves to highlight an ongoing trend: In an increasingly pluralist, multicultural West, blasphemy laws find fresh purpose in policing intolerance between religious communities.

Instead of preventing injury to God, these laws now seek to prevent injury to the social fabric of avowedly secular states.

That’s true not only of the West’s centuries-old blasphemy laws but also of more recent ones. Ireland’s Defamation Act, for instance, targets any person who “utters matter that is grossly abusive or insulting in relation to matters held sacred by any religion, thereby causing outrage among a substantial number of the adherents of that religion.”

With its emphasis on the “outrage” blasphemy may cause among “any religion,” this measure seems to be aimed less at protecting the sacred than at preventing intolerance among diverse religious groups.

Illustrations of prophecy: particularly the evening and morning visions of Daniel, and the apocalyptical visions of John (1840).
Internet Archive Book Images. Image from page 371.

The law itself has caused outrage of a different sort, however. Advocacy organizations, such as Atheist Ireland, have expressed fierce opposition to the law and to the example it sets internationally. In late 2009, for instance, Pakistan borrowed the exact language of the Irish law in its own proposed statement on blasphemy to the United Nations’ Human Rights Council.

Thus, Atheist Ireland warns on its website that “Islamic States can now point to a modern pluralist Western State passing a new blasphemy law in the 21st century.”

Blasphemy in modernity

That warning resonates with the common Western view of blasphemy as an antiquated concept, a medieval throwback with no relevance to “modern,” “developed” societies.

As Columbia University professor Gauri Viswanathan puts it, blasphemy is often used “to separate cultures of modernity from those of premodernity.” Starting from the assumption that blasphemy can exist only in a backward society, critics point to blasphemy as evidence of the backwardness of entire religious cultures.

I would argue, however, that this Eurocentric view is growing increasingly difficult to sustain. If anything, blasphemy seems to be enjoying a resurgence in many corners of the supposedly secular West.

The real question now is not whether blasphemy counts as a crime. Instead it’s about who, or what – God or the state, religion or pluralism – is the injured party.

Steve Pinkerton, Lecturer in English, Case Western Reserve University

Bible classes in schools can lead to strife among neighbors

Frank S. Ravitch, Michigan State University

A federal lawsuit was filed recently against the Mercer County, West Virginia Board of Education, challenging a Bible program in the elementary schools. The plaintiffs are the Freedom From Religion Foundation and two parents and their children. One parent and both children have kept their names anonymous due to fear of reprisal.

The Bible class was listed as an elective, but almost all students enrolled. The complaint alleges that the few who opted out were harassed and discriminated against. One of the plaintiffs in the case had already suffered harassment.

In my research for the book I wrote in 1999, “School Prayer and Discrimination,” I explored what happens to religious minorities and dissenters when public schools engage in sectarian prayer and Bible reading.

There is a long history of discrimination and even violence linked to Bible reading and school prayer.

What the law says

Students have always been free to pray or read the Bible on their own or with friends during free time at school. In public schools these days, student religious groups have access to school facilities before and after school to the same extent as any other noncurriculum-related student group. Any school that violates these principles would also violate the Constitution.

The Bible.
feryswheel, CC BY-ND

In contrast, school-endorsed Bible courses that promote a religious perspective have been unconstitutional since a 1963 U.S. Supreme Court ruling prohibited school-sponsored prayer and Bible reading.

Over the years, there have been a number of attempts – often supported by state legislatures – to get around the prohibition on Bible reading by offering Bible courses. If offered as electives and taught “objectively,” such classes could be considered constitutional.

What this means is if a public school offers a class that focuses on religion or the Bible, the material would have to be taught without promoting any particular religious position. Those of us in the law and religion field sum it up as, “Teach it, don’t preach it.”

These classes cross the line if they endorse or favor a particular religious view. The Mercer County case will thus examine whether the class, as alleged, is overtly sectarian and promoted by the school, and hence unconstitutional.

Significantly, in many such cases where students opt out or dissent, parents have evidence of discrimination and harassment aimed at their children.

In fact, the United States Supreme Court acknowledged the concern about community members and school officials engaging in harassment of dissenters in an important footnote in a recent school prayer decision, Santa Fe Independent School District v. Doe. The case challenged the practice of having a student deliver a prayer over the public address system before each home varsity football game.

The United States Supreme Court quoted a lower court order prohibiting any attempt “to ferret out the identities of the plaintiffs in this cause, by means of bogus petitions, questionnaires, individual interrogation, or downright ‘snooping.’” The lower court had said it wanted the “proceedings addressed on their merits, and not on the basis of intimidation or harassment…”

As is evident from the above example, there are good reasons why families are often afraid to challenge these practices and want to remain anonymous, even when they raise the issue.

History of persecution

The history related to school-sponsored or endorsed Bible reading and prayer in the public schools is full of harassment, discrimination and even violence.

It goes back to the early 19th century, when Bible reading and prayer, in what were then known as “common schools” (a precursor to today’s public schools), were used to promote anti-Catholic sentiments. At that time, Catholics were persecuted for refusing to participate in Protestant Bible reading in the common schools.

King James version of the Bible.
Sarah Nichols, CC BY-SA

Nineteenth-century Catholic Canon Law specified that Catholics reading the Bible in English could read only the Douay Bible, which was the English translation approved by the Vatican. Schools, however, often required reading from the King James Bible, which, aside from its controversial history and translation concerns, included a dedicatory preface that referred to the pope as the “man of sin.”

Priests were literally tarred and feathered for encouraging their congregants not to participate in Protestant Bible reading and prayer in schools. Conflict over Catholic objections to Protestant Bible reading and prayer in the common schools was used as an excuse by anti-Catholic individuals, who opposed Irish immigration to the United States, to start riots. Several people were killed during such riots in Philadelphia in 1844.

Douay Bible.
Burns Library, Boston College, CC BY-NC-ND

Such instances continued well into the 20th and 21st centuries. In 1995, for example, students in a Mississippi elementary school who refused to take part in prayer that violated their family’s Lutheran faith faced serious discrimination. One such student was forced to wear headphones so he could not hear the prayer.

The family finally filed a lawsuit (which they won). However, they received bomb threats and death threats as a result of filing the lawsuit; the harassment of the children got even worse. In other cases children were beaten up for refusing to take part in school-endorsed religious activities in public schools.

And in what is perhaps one of the worst cases in the last half-century, a family in Little Axe, Oklahoma, who belonged to the Church of the Nazarene, a Protestant Christian church, had their house firebombed after they objected to school-supported religious activities favoring a different Protestant sect.

Why it can hurt religion

The truth is, school-sponsored Bible classes and Bible reading don’t hurt only religious minorities and dissenters, they can also hurt religion generally.

As Roger Williams (the founder of the current state of Rhode Island) noted more than 100 years before the founding of the United States, government support and influence can corrupt religion even if it seeks to promote it.

In several of his most famous writings, he wrote about the corrupting effect government can have on religion:

“God requireth not a uniformity of religion to be enacted and enforced in any civil state; which enforced uniformity (sooner or later) is the greatest occasion of civil war, ravishing of conscience, persecution of Christ Jesus in his servants, and of the hypocrisy and destruction of millions of souls.”

This history shows how such insistence on sectarian Bible teaching could lead a dominant religious group to discriminate and even act violently. From my perspective as a law and religion scholar, these situations do not seem to promote the values of loving one’s neighbor and protecting those in harm’s way that a number of religions, including Christianity, espouse.

Frank S. Ravitch, Professor of Law & Walter H. Stowers Chair of Law and Religion, Michigan State University

The Case for Christ: What’s the evidence for the resurrection?

Brent Landau, University of Texas at Austin

In 1998, Lee Strobel, a reporter for the Chicago Tribune and a graduate of Yale Law School, published “The Case for Christ: A Journalist’s Personal Investigation of the Evidence for Jesus.” Strobel, formerly an atheist, had been prompted by his wife’s conversion to evangelical Christianity to try to refute the key Christian claims about Jesus.

Paramount among these was the historicity of Jesus’ resurrection, but other claims included the belief in Jesus as the literal Son of God and the accuracy of the New Testament writings. Strobel, however, was unable to refute these claims to his satisfaction, and he then converted to Christianity as well. His book became one of the bestselling works of Christian apologetic (that is, a defense of the reasonableness and accuracy of Christianity) of all time.

This Friday, April 7, a motion picture adaptation of “The Case for Christ” is being released. The movie attempts to make a compelling case for the historicity of Jesus’ resurrection. As one character says to Strobel early in the movie, “If the resurrection of Jesus didn’t happen, it’s [i.e., the Christian faith] a house of cards.”

As a religious studies professor specializing in the New Testament and early Christianity, I hold that Strobel’s book and the movie adaptation have not proven the historicity of Jesus’ resurrection for several reasons.

Are all of Strobel’s arguments relevant?

The movie claims that its central focus is on the evidence for the historicity of Jesus’ resurrection. Several of its arguments, however, are not directly relevant to this issue.

For instance, Strobel makes much of the fact that there are over 5,000 Greek manuscripts of the New Testament in existence, far more than any other ancient writings. He does this in order to argue that we can be quite sure that the original forms of the New Testament writings have been transmitted accurately. While this number of manuscripts sounds very impressive, most of these are relatively late, in many cases from the 10th century or later. Fewer than 10 papyrus manuscripts from the second century exist, and many of these are very fragmentary.

I would certainly agree that these early manuscripts provide us with a fairly good idea of what the original form of the New Testament writings might have looked like. Yet even if these second-century copies are accurate, all we then have are first-century writings that claim Jesus was raised from the dead. That in no way proves the historicity of the resurrection.

What do the New Testament writings prove?

One key argument in the movie comes from the New Testament writing known as First Corinthians, written by the Apostle Paul to a group of Christians in Corinth to address controversies that had arisen in their community. Paul is thought to have written this letter around the year 52, about 20 years after Jesus’ death. In 1 Corinthians 15:3-8, Paul gives a list of people to whom the risen Jesus appeared.

New Testament.
Ty Muckler, CC BY-NC-ND

These witnesses to the resurrected Jesus include the Apostle Peter, James the brother of Jesus, and, most intriguingly, a group of more than 500 people at the same time. Many scholars believe that Paul here is quoting from a much earlier Christian creed, which perhaps originated only a few years after Jesus’ death.

This passage helps to demonstrate that the belief that Jesus was raised from the dead originated extremely early in the history of Christianity. Indeed, many New Testament scholars would not dispute that some of Jesus’ followers believed they had seen him alive only weeks or months after his death. For example, Bart Ehrman, a prominent New Testament scholar who is outspoken about his agnosticism, states:

“What is certain is that the earliest followers of Jesus believed that Jesus had come back to life, in the body, and that this was a body that had real bodily characteristics: It could be seen and touched, and it had a voice that could be heard.”

This does not, however, in any way prove that Jesus was resurrected. It is not unusual for people to see loved ones who have died: In a study of nearly 20,000 people, 13 percent reported seeing the dead. There are a range of explanations for this phenomenon, running the gamut from the physical and emotional exhaustion caused by the death of a loved one all the way to the belief that some aspects of human personality are capable of surviving bodily death.

In other words, the sightings of the risen Jesus are not nearly as unique as Strobel would suggest.

A miracle or not?

But what of the 500 people who saw the risen Jesus at the same time?

First of all, biblical scholars have no idea what event Paul is referring to here. Some have suggested that it is a reference to the “day of Pentecost” (Acts 2:1), when the Holy Spirit gave the Christian community in Jerusalem a supernatural ability to speak in languages that were unknown to them. But one leading scholar has suggested that this event was added to the list of resurrection appearances by Paul, and that its origins are uncertain.

Resurrection Chapel mural at the National Cathedral in Washington, D.C.
Tim Evanson, CC BY-SA

Second, even if Paul is reporting accurately, it is no different from large groups of people claiming to see an apparition of the Virgin Mary or a UFO. Although the precise mechanisms for such group hallucinations remain uncertain, I very much doubt that Strobel would regard all such instances as factual.

Strobel also argues that the resurrection is the best explanation for the fact that Jesus’ tomb was empty on Easter morning. Some scholars would question how early the empty tomb story is. There is significant evidence that the Romans did not typically remove victims from crosses after death. Therefore, it is possible that a belief in Jesus’ resurrection emerged first, and that the empty tomb story originated only when early critics of Christianity doubted the veracity of this claim.

But even if we assume that the tomb really was empty that morning, what is there to prove that it was a miracle and not that Christ’s body was moved for uncertain reasons? Miracles are, by definition, extremely improbable events, and I see no reason to assume that one has taken place when other explanations are far more plausible.

Who are the experts?

Apart from all of these other weaknesses in Strobel’s presentation, I believe that Strobel has made no real effort to bring in a diversity of scholarly views.

In the movie, Strobel crisscrosses the country, interviewing scholars and other professionals about the historicity of Jesus’ resurrection. The movie does not explain how Strobel chose which experts to interview, but in his book he characterizes them as “leading scholars and authorities who have impeccable academic credentials.”

Yet the two biblical scholars who feature in the movie, Gary Habermas and William Lane Craig, both teach at institutions (Liberty University and Biola University, respectively) that require their faculty to sign statements affirming that they believe the Bible is inspired by God and is free of any contradictions, historical inaccuracies or moral failings. For example, the Liberty University faculty application requires assent to the following statement:

“We affirm that the Bible, both Old and New Testaments, though written by men, was supernaturally inspired by God so that all its words are the written true revelation of God; it is therefore inerrant in the originals and authoritative in all matters.”

The overwhelming majority of professional biblical scholars teaching in the United States and elsewhere are not required to sign such statements of faith. Many of the other scholars he interviews in his book have similar affiliations. Strobel has thus drawn from a quite narrow range of scholars that are not representative of the field as a whole. (I estimate there are somewhere around 10,000 professional biblical scholars globally.)

In an email reply to my question about whether most professional biblical scholars would find his arguments for the historicity of Jesus’ resurrection to be persuasive, Strobel said,

“As you know, there are plenty of credentialed scholars who would agree that the evidence for the resurrection is sufficient to establish its historicity. Moreover, Dr. Gary Habermas has built a persuasive ‘minimal facts’ case for the resurrection that only uses evidence that virtually all scholars would concede. In the end, though, each person must reach his or her own verdict in the case for Christ. Many things influence how someone views the evidence – including, for instance, whether he or she has an anti-supernatural bias.”

No compelling evidence

Easter Cross.
Art4TheGlryOfGod by Sharon, CC BY-ND

In response to Strobel, I would say that if he had asked scholars teaching at public universities, private colleges and universities (many of which have a religious affiliation) or denominational seminaries, he would get a much different verdict on the historicity of the resurrection.

Christian apologists frequently say that the main reason that secular scholars don’t affirm the historicity of the resurrection is because they have an “anti-supernatural bias,” just as Strobel does in the quote above. In his characterization, secular scholars simply refuse to believe that miracles can happen, and that stance means that they will never accept the historicity of the resurrection, no matter how much evidence is provided.

Yet apologists like Gary Habermas, I argue, are just as anti-supernaturalist when it comes to miraculous claims outside of the beginnings of Christianity, such as those involving later Catholic saints or miracles from non-Christian religious traditions.

I have very little doubt that some of Jesus’ followers believed that they had seen him alive after his death. Yet the world is full of such extraordinary claims, and “The Case for Christ” has provided, in my evaluation, no truly compelling evidence to prove the historicity of Jesus’ resurrection.

Brent Landau, Lecturer in Religious Studies, University of Texas at Austin

Why does Pakistan’s horror pulp fiction stereotype ‘the Hindu’?

Jürgen Schaflechner, University of Heidelberg

A terrible war is going on in the ghost world. Two opposing sides — the Muslim spirits and the Hindu spirits — face each other in a horrific combat. Suddenly, a gigantic fire-spitting demon with three eyes and a snake dangling from his neck appears on the frontline. The terrible creature reinforces the Hindu side and the Muslim ghosts need to retreat.

Thus begins The Wharf of Death (Maut ke Ghat), a ghost story published in the Karachi, Pakistan-based Dar Digest’s January 2015 issue. Such “digests” have a long history in Pakistan’s print media, and these little booklets are widely sold in markets and at train or bus stations, often for about 50 rupees (less than a euro).

Mister Magazine: Surya Mandir ki Devadasi, 2013.
J.S., Author provided

Each digest is devoted to a particular genre, from detective or science-fiction to love and horror stories. Charmingly idiosyncratic and often ending with deus ex machina figures, their colloquial style entertains a wide Urdu audience, and print runs number from 10,000 to 30,000 copies per month.

One particularly intriguing sub-branch of this genre features tales of horror published in magazines such as Dar Digest (The Fear Compendium), which often combine classical gothic motifs with South Asian mythology. Scenes of dissatisfied jinns (spirits) terrorising their kin, evil snake-demons pretending to be innocent virgins, or haunted house situations (out of which many dear protagonists do not emerge alive) all feature.

Horror and ‘the other’

What makes these stories intriguing is their ability to disseminate ideology: the fantastic and the uncanny constitute a smooth canvas for the projection of stereotypes and simplifications.

Free from the fetters of common natural laws, horror stories represent a society’s fears and prejudices. The transgression of the everyday links itself to notions of good and evil and promises creative ways of engaging with what cultural theory calls “the other”. Horror stories are one way of representing evil in a society and the heroes capable of countering these threatening influences.

Hostility (Dushmani), a story from the May 2014 Dar Digest, offers this example:

‘Stop, Dr. Shankar!’ A deep manly voice appeared. Dr Shankar, Nirmala and Mohan turned around. An old man with a radiant face and a white beard stood at the laboratory’s entrance door. He was wearing a long white dress and had prayer beads in his hand.

These Urdu tales, composed by freelance authors living all over Pakistan, commonly portray both Hindus and Muslims, frequently revealing a straightforward division between good and evil.

Evil Hindus may plan world domination, sacrifice young virgins for gaining immortality, or simply terrorise others for no obvious reason; their Muslim counterparts, meanwhile, emerge as noble saviours and wise father figures who righteously guard their religious community and the rest of the world from the claws of “Hindu spiritual imperialism”.

Khuni Rat (The Bloody Night) Dar Digest February 2015.
J S, Author provided

Such recurring stereotypes reflect a certain view of South Asian history. The Islamic Republic’s central founding myth, the two-nation theory, claims that Muslims and Hindus are two distinct nations that can only thrive when separate.

The movement for Pakistan, however, was not the straightforward development that it is often portrayed as today. While the events that led to the partition of India have many layers of complexity, the simplified notion of an incompatible Hindu and Muslim population continues to be widely promoted in both Pakistan and India. This has often served as a retroactive explanation for the division of the subcontinent.

Ideology’s material existence

We know from Louis Althusser that ideology has a tangible and material existence. To study ideology, then, we must begin with institutions, organisations and media outlets, which are crucial in distributing ideological content.

Cover of Dar Digest, May 2013.
J S, Author provided

Ideology in this context does not pertain to true or false consciousness, but rather to how we perceive the world around us, a process that can be either inconspicuous or blatant.

Discussing cultural identities, sociologist Stuart Hall emphasised how a nation’s story needs to be continuously narrated to its members.

A variety of platforms — such as schoolbooks, TV programmes and literature — recount a nation’s history, its position among other nations and its future development. Such forms of media not only describe nations, but also prescribe what it means to be part of them.

Considering Pakistan’s history, the fact that stereotypical depictions of Hindus should resurface in Urdu pulp fiction is unsurprising. In the aforementioned Maut ke Ghat, the spirit world features a variety of constantly warring ghost tribes.

Kala Mandir, Dar Digest, September 2012.
J S, Author provided

The Hindu ghosts, also described as “immoral Satan-worshipping spirits”, are notorious for their aggression; they attempt to convert every ghost in the netherworld to Hinduism. The story’s foundation is an unavoidable struggle in the spirit world, which renders Hindu-Muslim conflicts as metaphysical truth.

To wit, this excerpt from the tale Kala Mandir (Dar Digest, September 2012):

‘Look son, without becoming Muslim you won’t be able to do anything … Lilavati has nine more days to live. You need to act as soon as possible,’ Babaji explained. Mahindra thought for a bit and then said, ‘Ok. I am ready to become Muslim.’ ‘Mashallah! You have taken the right decision. May God help you,’ Babaji said.

Julia Kristeva meticulously analysed the role of the ambivalent and its relation to the uncanny in her 1982 essay Powers of Horror. Drawing on the work of Freud, Lacan and Mary Douglas, Kristeva develops a distinctive approach to the genre through the concept of “the abject”.

Her complex theory is best illustrated by the image of the corpse: it was at one point a living organism, a living character with a certain place in society. But with death this organism becomes a removed object, thwarting its former characteristics. The corpse symbolises a “sudden emergence of uncanniness, which, familiar as it might have been in an opaque and forgotten life, now harries me as radically separate, loathsome”.

Dar Digest January 2011.
J S, Author provided

In my interpretation of Kristeva, the abject is not understood only as a psychoanalytical category, but also as an historical development: it implies a process in which formerly close elements are rejected to solidify one’s own identity.

As such, the abject position of the evil Hindu surfacing in Pakistan’s pulp fiction can be seen as representative of previous relationships within pre-partition India that have today been rejected by many parts of society.

Particularly within the context of Pakistan’s nationalist ideologies (which claim Islam as the raison d’être of the Islamic Republic), “the Hindu” takes on an abject position, which is an ambivalent and thus frightening role.

Hindus are, on one hand, a threatening enemy across the border; and on the other, they are the defining and constitutive foundation of the Islamic Republic: a nation allegedly built on the concept of not being Hindu, rather than on simply being Muslim.

Jürgen Schaflechner, Assistant professor, University of Heidelberg