The Value of a Sermon Critical Edition. Part 4.

[Part 3 is here.]

The terminology used in textual studies has changed as the philosophy of the nature of texts has changed, and it also varies depending on the methodological branch being considered. So, I’m going to avoid the technical language as much as I can. Also, I’m leaving out a lot, believing that brevity is the soul of something or other.

The idea of reconstructing a text from fragmentary sources is an old one. In early biblical studies, the process of attempting to move from fragments of early manuscripts to an autograph, or something near it, became known (1885) as “lower criticism.” The idea could be applied to other ancient texts or to modern ones. But the study of texts since the 19th century, originally obsessed with the idea of actually rebuilding the autograph, or original text, has changed, its methodologies bifurcating in the process.

In biblical criticism, where the number of witnesses (manuscript source texts) may be very large, the result of the process may be a text with readings drawn from these various witnesses. It is not a copy of any particular manuscript, and it could differ from many existing sources. Called an eclectic text, such a construction values no single witness above another in principle. (It’s still true that some sources are more equal than others in biblical studies. The earliest are more highly valued, generally speaking, and this can be true for “modern” texts as well.) The editor forms opinions about individual sources based on various kinds of evidence, usually classified as external or internal. Such a process may seem appropriate in some degree for Joseph Smith’s sermons, where a number of contemporary witnesses exist.
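
To make the mechanics a little more concrete, here is a minimal Python sketch of the bookkeeping behind an eclectic text, with invented witness names, readings, and weights. Real editorial judgment is not a weighted vote – internal and external evidence get argued case by case – but the sketch shows how readings chosen from several witnesses yield a text that matches no single source.

    from collections import defaultdict

    # Hypothetical weights expressing an editor's relative confidence in each
    # witness (external evidence). Names and numbers are invented.
    witness_weight = {"witness_A": 3.0, "witness_B": 2.0, "witness_C": 1.5}

    # Each variation unit records the reading each witness gives at one point of variation.
    variation_units = [
        {"witness_A": "eternal duration", "witness_B": "eternal duration", "witness_C": "endless duration"},
        {"witness_A": "in the beginning", "witness_C": "at the beginning"},
    ]

    def choose_reading(unit):
        """Pick the reading whose supporting witnesses carry the most combined weight."""
        support = defaultdict(float)
        for witness, reading in unit.items():
            support[reading] += witness_weight.get(witness, 1.0)
        return max(support, key=support.get)

    eclectic_text = [choose_reading(unit) for unit in variation_units]
    print(eclectic_text)  # readings may come from different witnesses; the result copies no single source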

Several streams of thought develop naturally from considering the eclectic method, one of which is stemmatics. Here the editor/critic seeks to place manuscripts in a kind of taxonomy, moving from a family of texts back in time to the single autograph (maybe). The essential idea is that shared differences in readings imply a common source. So, if two witnesses have a number of errors in common, this suggests that they were derived from a common intermediate source (called a hyparchetype – say it five times really fast). Relations should exist, in theory, that allow all extant manuscripts to be placed in a genealogy or stemma or tree, coming down to a single archetype.[1] This kind of reasoning could be used in cases where a family of differing imprints exists and one desires to establish dependence. The problem with the method is that it more or less requires us to assume each text has just one predecessor. Even with a rather brief history, that can be difficult to prove (or believe).
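
The core stemmatic inference – shared distinctive errors suggest a shared ancestor – can be caricatured in a few lines of Python. Everything below is invented for illustration; real stemmatics also has to worry about coincidental agreement, contamination, and deciding which readings count as errors in the first place.

    # Toy caricature of stemmatic grouping: witnesses sharing enough distinctive
    # errors are grouped under a posited common source (a hyparchetype).
    # Witness names, error labels, and the threshold are all invented.

    witness_errors = {
        "W1": {"err1", "err2"},
        "W2": {"err1", "err2", "err3"},
        "W3": {"err4"},
    }

    def shared(a, b):
        return witness_errors[a] & witness_errors[b]

    # Pair up witnesses whose overlap in errors is "large enough" (a threshold of 2 is arbitrary).
    pairs = [(a, b) for a in witness_errors for b in witness_errors
             if a < b and len(shared(a, b)) >= 2]

    for a, b in pairs:
        print(f"{a} and {b} share {sorted(shared(a, b))}: posit a hyparchetype above them")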

Another idea. In cases where an author left a manuscript (or manuscripts), or edited his own manuscript, say for publication and for different possible reasons (like mean editors), the editor/critic might select a base text to work from, offering changes or emendations to such a text (copy-text) based on what she may feel are better readings or where the base text is obviously in error. Walter Greg began a refinement of this idea, making a distinction between variant readings which he called substantive, as opposed to, say, changes in capitalization or spelling (accidentals).[2] Greg also offered that the base text itself should not necessarily be favored in terms of substantive readings. Fredson Bowers applied and expanded Greg’s ideas, and the approach became known as the Greg-Bowers method. One result of this was a change in the way the edited text may be presented: edited texts would be shown without much or any reference to the sources of base-text alteration (clear text).[3] The Greg-Bowers ideas (which I have only hinted at) were extremely influential in text studies, even enforced via funding procedures for critical text projects (that is no longer true). More recently, “social constructionism” has offered a competing lens.[4]
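
As a rough illustration of Greg’s distinction (and only that), the toy function below calls a variant “accidental” when two readings differ only in capitalization or punctuation, and “substantive” otherwise. The sample readings are invented, and genuine spelling variants or harder cases would need real editorial judgment, not string comparison.

    import re
    import string

    def normalized(reading):
        """Lowercase, strip punctuation, collapse whitespace: a crude filter for accidentals."""
        r = reading.lower().translate(str.maketrans("", "", string.punctuation))
        return re.sub(r"\s+", " ", r).strip()

    def variant_kind(copy_text_reading, other_reading):
        """Classify a variant, very roughly, in the spirit of Greg's substantive/accidental split."""
        if copy_text_reading == other_reading:
            return "identical"
        if normalized(copy_text_reading) == normalized(other_reading):
            return "accidental (capitalization/punctuation only)"
        return "substantive"

    print(variant_kind("Eternal Duration,", "eternal duration"))  # accidental
    print(variant_kind("eternal duration", "endless duration"))   # substantive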

In sum, there are various opinions with regard to text recovery. Working back to an archetype or earliest text is still one view. For example, this is Royal Skousen’s goal in his Book of Mormon critical text project (even though he presents a clear text version in his Yale UP book – see Grant Hardy’s introduction, esp. pp. xv–xxi). In other cases this kind of thing is viewed with suspicion. Here philosophical terms invade, but the idea is that while a Platonic ideal may be immanent, the particular is what’s going on. The ideal is made of unobtainium.

A recent critical edition of sermon texts is Kenneth Newport’s edition of the sermons of Charles Wesley, brother of Methodist founder John Wesley. Charles Wesley’s case illustrates an important point about sermon sources and textual studies that I will mention in part 5.

One point here is that the clouds of documents related to Joseph Smith’s sermons have varying characters in nearly every case. Methodology is not a religion to be pursued no matter what the state of affairs in a given exemplar. In some cases, it may make sense to pursue something like an archetype.[5] In others, more emphasis on intentional criticism makes sense.[6] Editorial procedure should, in part, go where a text leads. That’s my story, and I’m sticking to it. The tale of text criticism is fragmented, with various brand names multiplying.[7]

What can we ask of a text? How do we address those questions? Even if an archetype (in the sense of a strictly read teleprompter text) were to exist in perfection, it would still be simply a text, not the sermon. A sermon belongs to the realms of sight, sound, and language. Just as the Church at the time Joseph Smith delivered a sermon was not stone or font or office only, but faithful people, acts of ritual, devotion, charity, disaffection, and departure, so Joseph’s sermons are a part of them, and through them, of us. In the end, every method involves editorial opinion, and no edition will ever be the “final” one.

Next time, we look at some examples of “apparatus.” Fun stuff.

————————–
[1] The word may refer to the earliest common source of a surviving tradition, or it could refer to the autograph, if the text admits that.

[2] The following joke is apropos:

Indeed, when a different Reading gives us a different Sense,
or a new Elegance in an Author, the Editor does very well in
taking Notice of it; but when he only entertains us with the
several ways of Spelling the same Word, and gathers together
the various Blunders and Mistakes of twenty or thirty different
Transcribers, they only take up the Time of the learned
Reader, and puzzle the Minds of the Ignorant
[from Thorpe's book in note 4]

Of course, changes in capitalization, punctuation, paragraphing, etc. could be significant in some cases.

[3] The theory is that the editor is to establish the text, under whatever method, and that overt “apparatus” on the page distracts from the text and makes it difficult to quote anyway. Many editors don’t share this ethic, preferring to be a little more modest about the surety of their productions.

[4] A nice example of a modern textual study is Shawn St. Jean et al.’s edition of Charlotte Perkins Gilman, The Yellow Wall-Paper (Athens, Ohio: Ohio UP, 2006). For a readable intermediate view of things, see James Thorpe, Principles of Textual Criticism (Huntington Library, 1972, 1990).

[5] “Authentic” or “authoritative” probably better fits this kind of re-engineering of the eclectic.

[6] Intentional criticism: using what’s available to come to what the author “intended” the text to be, whether this text ever actually existed or not. The idea is to build an authoritative text which expresses the author’s intention. For example, an author may have written a final manuscript based on a publisher’s restrictions of length or language, but left notes or earlier versions that indicate his true intention. That sort of thing. In Smith’s case, there are a very few instances where he may have revamped diary entries reporting a sermon. Still balancing against this is that nagging idea of finding the original expression in the case of an “oral” text. In Skousen’s case, if he was trying for JS’s intention, he might have started with the 1840 imprint.

[7] See, for example, Greetham, Textual Scholarship (Garland, 1994), chap. 9. For more fun, see Mary-Jo Kline, A Guide to Documentary Editing, chap. 6; G. T. Tanselle, “Historicism and Critical Editing,” Studies in Bibliography 34 (1986): 1–46; Robert D. Hume, “The Aims and Uses of Textual Studies,” The Papers of the Bibliographical Society of America 99 (2005): 197–230; also Greetham, “Textual Forensics,” PMLA 111/1 (Jan. 1996): 32–51.

Comments

  1. I’m pleased to see people thinking.

  2. If you’re flattering me Stephen, I’ll take all I can get.

  3. I keep meaning to comment about how much I’m enjoying your posts. I’m currently reading A Guide to Documentary Editing and enjoying it greatly, and it’s interesting to read your musings on the Mormon texts, since I have thought a lot about them while reading the book.

    And, a note on your second footnote (“the various Blunders and Mistakes of twenty or thirty different Transcribers …”) — I’m currently posting the transcription of a mid-20th century Mormon funeral service on my family history blog and have recently corrected some amusing errors in the transcription of a reading from Revelation. In the transcription, John receives his revelation on the Isle of Peebles, and instead of eating manna, those that overcome will be eating mammals. (Etc.)

  4. I prefer mammals to manna, especially bbqed.

    This is another great installment, with a lot to think about. I think sometimes there is a tendency as we have so many good Mormon documents to not think critically about them. I think we are moving into a sort of golden age of documents editing with things like the JSP. But I am completely impressed with the work you are doing, WVS; and this is another example of giving me something to think about.

  5. I’m with you on the mammals (or fish). The JSPP will definitely be the seed bed for a lot of interesting scholarship.
