I have spent a lot of time this week thinking about what I understand is known in artificial intelligence as the hill-climbing problem. (Some in the audience will have heard me talk about this topic before; I apologize for the repetition.)[1]

Imagine some space of variables — two variables are easiest to visualize — and some evaluation function that assigns a value to every position in that space. If the space has two variables, and a third value gives a quality score for each point in the two-dimensional space, we get a surface in space. The challenge in the hill-climbing problem, as AI people formulate it, is to find the best (highest) location in that space.
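In symbols (my notation, nothing more elaborate than the prose above): given a scoring function

    f : \mathbb{R}^2 \to \mathbb{R}

the task is to find

    (x^*, y^*) = \arg\max_{(x,y)} f(x, y)

and in higher-dimensional variants the pair (x, y) simply becomes a longer vector.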

We can of course imagine a lot of variants of the problem, with complications of various sorts. We might need to find the best location given a specific starting point, or to find the best location given specific constraints on where we can go and how much it costs to go there, and so forth. But the simple case will suffice for purposes of this discussion. We imagine that we are in some location in a multi-dimensional space and we want to seek the best location.

Now, if we face a situation like this in real life, we look around us and survey the terrain to see where the highest point in the area can be found. Once we see a point we want to go to, we can plan a route to it. And once there, we can survey the terrain again and find, perhaps, a higher point. The problem is less straightforward from the standpoint of a program, because the only way to assess the height (or quality score) of any point is to go there and evaluate the scoring function. (This is what it means for the program to be at a particular location.) If we put ourselves in the program’s position, we must imagine that we can look at the world only through a periscope, or that we are blindfolded: we cannot see much of anything. We have some way to sense height (we can evaluate the scoring function for our current location), but we have no easy way to see what other location in the neighborhood would have a higher, or lower, score than our current location. Pause for a moment, now, and think of an algorithm. You are blindfolded, with a voice-activated, voice-output altimeter, and your job is to find the top of the highest hill in the neighborhood. How do you go about it?

The simplest algorithm, and the one that I expect three-fourths of the audience just thought of, is something like this: Try moving right one step, and check the altimeter. If we find that we have gone downhill, go back to the starting point (i.e., move left one step). Then try moving one step forward, or in any randomly selected direction. Again, check the altimeter, and again, if we are going downhill, come back. Essentially, the algorithm is: Go uphill, in any direction; never go downhill.
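To keep the blindfold honest, the climber below is allowed to do only one thing with the terrain: evaluate it at a proposed position. Here is a minimal sketch in XQuery 3.0, in one dimension to keep it short; the names (local:climb, the $height parameter) and the dome-shaped terrain are my inventions for illustration, not anyone’s published code.

    xquery version "3.0";

    (: Never go downhill: try a step each way, take any step that gains
       height, and stop when neither neighboring step is an improvement. :)
    declare function local:climb(
      $height as function(xs:double) as xs:double,
      $x      as xs:double,
      $step   as xs:double
    ) as xs:double {
      let $here := $height($x)
      return
        if ($height($x + $step) gt $here)
        then local:climb($height, $x + $step, $step)  (: uphill to the right :)
        else if ($height($x - $step) gt $here)
        then local:climb($height, $x - $step, $step)  (: uphill to the left :)
        else $x  (: no uphill neighbor: we are standing on a peak :)
    };

    (: A single smooth dome with its top at x = 3. :)
    local:climb(
      function($x as xs:double) as xs:double { -(($x - 3) * ($x - 3)) },
      0.0, 0.5
    )
    (: returns 3.0: on a one-hill terrain the simple rule finds the top :)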

If we follow this algorithm, we are assured that we will never end up at a spot that’s worse than our starting point. It’s not hard to see that this approach works fine for some spaces; whenever the space is occupied by a simple surface with a single optimum location, something like a dome, then this simple algorithm — never go downhill — will get us to the global optimum. Also, it’s easy to implement, easy to understand, and its simplicity makes it easy to know that we got the implementation right. In a more complicated space, however — imagine the surface of the moon, or any reasonably realistic mountain landscape on earth, where the surface is pitted with craters or ravines and has many, many peaks — this simple algorithm has a fatal flaw. It traps us in a local optimum. We go uphill, and we will reach the peak of the hill we were on when we started. But we will completely miss the much, much better location we could have reached, if only we had been willing to go downhill just a little bit first. If we had been willing to step across the Pecos River here, we could have climbed El Capitan.[2] So we could have done much better, but our algorithm didn’t allow us to.
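On a terrain with more than one hill, the same climber (reusing local:climb from the sketch above, on another invented landscape) makes the trap concrete:

    (: A little hill of height 1 at x = 0 and a much taller one of height 5
       at x = 6, with a valley between them. :)
    local:climb(
      function($x as xs:double) as xs:double {
        max((1 - $x * $x, 5 - ($x - 6) * ($x - 6)))
      },
      0.0, 0.5
    )
    (: returns 0.0: the top of the little hill. Every route to the big hill
       begins with a downhill step, and the rule forbids ever taking one. :)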

I think about this problem because a few years ago I acquired the unsettling conviction that, when it comes to technology adoption, most users follow what is essentially that simple algorithm: They never want to go downhill. They never want to accept any pain in order to get some benefit. Jean Paoli formulated this idea concisely some years ago, and I think of this as Paoli’s Principle. (And since Jean Paoli, my co-editor on the XML 1.0 specification, managed to sell one product group after another at Microsoft on the use of SGML and XML, when he talks about how to sell things, I think we should listen.) He said if you ask someone to put in five cents of effort in the first fifteen minutes, then they want a nickel back after fifteen minutes. They’ve got to at least break even; it’s much better if they get ten cents back, and better still if they get a quarter or a dollar. But you’ve got fifteen minutes, or you’ve got an hour. You have a very short time for them to break even on the time and effort they have invested. If they can see that they have gotten enough benefit for the effort they have put in, they’ll continue.

It’s easy to be irritated by this, especially when we see it in other people, because in reality we know that if we want to find a global optimum, to find the highest peak in a given land mass, then we have to be willing to cross a few valleys. You cannot climb Everest without crossing some valleys to get to the foot of Everest. An unwillingness ever to go downhill, an unwillingness to invest any sizable amount of time and effort before that time and effort begins to pay off, traps people and organizations in short-term thinking, just as it traps companies in quarterly-balance thinking. If a company insists on seeing a return on any investment within a quarter, then that company will never find the money to pay for long-term investments, which in this case means anything that will pay off in some period greater than three months.

It’s extremely difficult for countries that can’t do long-term investment to produce intellectual products that require thirty years of investment or effort. It is exceptionally difficult in some countries to do critical editions of major authors, because critical editions of major authors don’t happen in six months. It’s a fortunate case if it takes only a decade. Dictionaries are even worse. No one makes a major historical dictionary of any important language in ten years. The great standard dictionary of Middle High German, for example, began as an index to a pre-existing dictionary. The compiler thought it was going to take him two years; it took twelve. And no one has been willing to invest that kind of time in a Middle High German dictionary since, for the very simple reason that academics, in Germany as in the US, have a similar kind of short-term pressure. One can’t start a Middle High German dictionary as an assistant professor, because it cannot be finished in time for the author to get tenure. And if the author doesn’t get tenure, they will never be able to finish the dictionary. But if the scholar waits until they have tenure, then they are too late: now, they will not finish the dictionary before they die.[3]

This kind of short-term thinking seems to be exactly what Matt Patterson was talking about in his talk [Patterson]. He gave a vivid (and I think to some of us deeply disturbing) account of the demographic fact that users of technology want quick payoffs. They want an easy on-ramp; they want to avoid all threshold difficulties. And with regard to the adoption of technologies, that desire plays out in ways that may make some of us unhappy. It can be very upsetting to contemplate the refusal of other people to use technology that we use and that we like and that we think would solve many important problems for them. Their reluctance to expend the effort and invest the time it takes to learn how to use a technology, and then to apply it, can seem unwise and obstinate, and can induce all sorts of bad-tempered behavior on our part.

But if we are honest, some of us at least will realize that with respect to at least some technologies we ourselves also follow that rule: We explore new technology by looking at it for an hour or a day or fifteen minutes or a week, and then we either decide that we’re going to continue or we cut our losses by stopping. I spent an hour once trying to understand better how XML gets processed in conventional programming languages by trying to work through the first examples in a book on XML processing in Python. At the end of the hour my Python interpreter had still not managed to find the SAX interface, so I had not managed to get a single example working. My hour was up and I stopped. And the next time I had time to work on this larger project, I took up the next book on the shelf and I spent an hour working through the first few examples of XML processing in Java, and it worked, and I continued to use Java.[4]

That means I spent time focusing on one programming language rather than another, based not on any intelligent comparison of the strengths of the two programming languages, nor on the quality of their implementations. Essentially, my choice of programming language was based on a completely irrelevant property of one particular implementation running on one particular operating system (and possibly the initial problem only resulted from a misconfiguration of that system). But life is sometimes too short to do all the homework that we should do, or even all of those things that we should do that we also think would be fun to do and that we’d like to do. So sometimes our decisions are based on irrelevant properties and short-term thinking.

That same topic came up again in another context this week. Andreas Tai gave, I thought, an admirably calm and evenhanded account of the dynamics of one instance of this phenomenon, in an area that is important to all of us both for contemporary accessibility and for long-term preservation [Tai]. You can see, perhaps, why the hill-climbing problem and problems of optimization have seemed a useful way to organize my thoughts about the Balisage conference this week.

Another metaphor can also be useful sometimes in thinking about optimization. How can we get a better place to live? If we don’t like our house or we think it might be nice to have a nicer house, we have a variety of options:

  • We can move. We can abandon the house we live in and move to another place.

  • We can change our house a little bit, possibly in simple ways: vacuum the carpet, pick up the mess, make it neat. Now it’s more attractive; how interesting: maybe we don’t have to move after all.

  • Maybe we need to knock out a wall, maybe we need to renovate the bathroom, maybe we need to patch the foundations.

  • In some cases, we may need to lay new foundations and then carefully separate the existing house from its old foundations and move it onto the new foundations and drop it.[5]

  • Or we can add onto the house.

    That means we don’t need to change things in the house we’ve got. We could have a bigger house just by adding a new room. We lay new foundations for that room. And maybe the old part of the house is kind of funky and the new part of the house is different, better insulated for example. Or we could even build outbuildings that are physically not contiguous to the old house but are still in the same place.

All of those are possible.

And all of those have been instantiated in talks at this conference. On Wednesday Sanders Kleinfeld gave us a carefully reasoned argument for moving house, moving to a new place [Kleinfeld]. Just now Tommie Usdin gave us a similarly carefully reasoned explanation of why this conference is going to move to a new place.[6]

But sometimes we just need to clean up a few things, or buy some new furniture. Sometimes, when you stop and think about it, the current house looks pretty good. Brent Nordin’s talk this morning about the history of Canada’s model building codes [Nordin] made me think “Wow! You know, some of those technologies actually work the way we hoped they would. They work well for the things that they were meant for, and for some others. Something went right!” Peter Flynn showed that XML can coexist happily with other markup languages and can be used to solve important and useful literate programming challenges [Flynn (LaTeX)]. There have been several papers that talk in interesting ways about change tracking and textual or informational variation; this was not always their main focus, but they stick in my mind for their commonalities. On Monday at the user-interface symposium, Charles O’Connor and his co-authors presented (among other things) a very nice description of the ways in which standard string comparison functions, and in particular the technique of finding the longest common substring, fall short of what you really need in an editorially oriented change-tracking system [O’Connor, Gnanapiragasam, and Hepp]. On Tuesday, Tristan Mitchell and Nigel Whitaker’s paper on change tracking in ISO standards illustrated the same point and talked about how to record changes in a way that is useful for editors thinking about the text [Mitchell and Whitaker]. This morning Daniel Röwenstrunk talked about the Freischütz Digital Project and the work it has done on encoding variation in the incredibly complicated world of XML encoding of musical material [Kepper, Roland, and Röwenstrunk]. And Ari Nordström showed the continuing relevance of the old rule that many, many problems in computing can be solved if you can just introduce one more layer of indirection, in this case for semantic profiling [Nordström].

And we continue to explore new ways of achieving old goals of reuse and single-source use of data. Eliot Kimber talked this morning specifically about an architecture for generating slide presentations [Kimber]; Alan Bilansky gave us a thoughtful consideration of the issues that arise in slide preservation and the trade-offs between preserving detailed information about some aspects of slides, and discarding that information in the interests of being able to preserve anything at all [Bilansky]. And this morning Jerome McDonough gave us a deceptively calm account of the absolutely terrifying prospects that open up in front of anyone who thinks hard about what it is going to mean to preserve computer games for people who are going to be born two hundred years from now [McDonough]. I’m astonished that he is as calm and collected as he is. Many people faced with prospects like that would have run screaming; you’re a strong man!

Liam Quin talked about a community-building effort of the kind many of us have participated in before, one that is going to be very important in making sure that publishing, and those who are responsible for the commercial propagation of our cultural heritage, can live in this new world. So I thank Liam for his talk about the new W3C Publishing Activity [Quin]. And Tony Graham’s work on decision making in XSL-FO [Graham] seems to illustrate an important case: sometimes all you have to do to make a house a better place to live is to fix the problems that arise, and sometimes it’s just a question of sitting down and doing it.

If you are going to do that, it’s very useful to make sure you understand what is going on. You want to pause for reflection, and you want to begin by gathering the facts. We have had a number of papers that seem to me to make their contribution by gathering the facts without leaping to conclusions about what they mean. Let’s start with the facts. Thank you, David Lee, for dealing with the myths of fat markup [Lee]. Thank you, Peter Flynn, for actually asking authors what it is they think a user interface is going to do [Flynn]. Thank you, Mary Holstege, for showing how we can actually turn our tools to look at themselves (in the way that Gadamer and every other writer in the history of hermeneutics would recognize as impossible), how to allow our tools to examine themselves in a useful way, even for such pragmatic tasks as figuring out where to put our QA effort [Holstege].

Of course, sometimes when we look at the facts, we may decide that we need to change our ways. Sometimes the best way to become happier with our house is not to move or change anything in the house, but to change the way we think. And in this vein Simon St.Laurent gave a characteristically erudite and thought-provoking analysis of two ways of thinking about and building systems [St.Laurent]. As I understood his line of thought, he believes we’ve leaned too far in the direction of industrial standardization and rigidity in our systems, and we have spent too little time building systems to encourage and care for and preserve variation and individuality. We can believe that that leaning toward rigidity and industrial standardization is inherent in our technologies (I think he may be inclined to believe that but in fact he stopped short of actually saying it), or we can believe that it’s a matter of how we use our technologies. Either way, it’s an important thing to think about.

David Dubin and his colleagues talked about what I think is a related difference [Dubin, Senseney, and Jett]. Sometimes the goal of a specification or a specification community is to prescribe common practice, in the hopes of achieving network effects and interoperability. And sometimes, on the other hand, the goal of a specification is to achieve a better understanding of existing practice and better fit for local variations of usage. I take them to be talking about the same dichotomy that Simon St.Laurent was exploring, between a style oriented toward industrial mass production on the one hand and what we can call a more Gothic style on the other.

Or you can just add on to the house.

In technology, instead of just cleaning things up or renovating inside, you can say, in effect: let’s leave the house alone. Let’s build another building. Or: let’s build an extension. Instead of changing technologies that we would like to improve, we can layer something on top of them. And then we don’t have to change those technologies. This can be convenient, since change is (as we keep seeing in various ways) often painful. One of the nice things about layering in technologies is that, depending on how we feel about the lower levels in our layering, we can use layering to exploit properties we like in the lower levels, or to hide a lower level so we no longer have to think about it.

This year I think the papers we’ve heard about XForms provide the single biggest block of examples of judicious technology-layering. As a vocabulary, XForms is designed to be embedded into an appropriate host document language — any host document language. XForms thus avoids the need to change a host vocabulary by revision. There is no need for the designers of the host language to have integrated forms support into the language; XForms can be added on later without disturbing the rest of the language. Just by adding something new, XForms opens up a huge number of possibilities. Steven Pemberton gave an introduction to XForms on Monday [Pemberton (symp.)]. Mustapha Maalej and Anne Brüggemann-Klein from Munich reported on some really beautiful technical work that makes some things possible in XForms that I have often wanted to do and have never known how to do [Maalej and Brüggemann-Klein]. I’m very grateful to them for that. Ari Nordström talked about using XForms in practice [Nordström (symp.)]. Éric Sigaud and Éric van der Vlist also talked about using XForms in practice [Sigaud et al.] — we have this nice balance between theory and practice here; where did that come from?
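To make the flavor of that layering concrete, here is a minimal sketch of my own (not an example from any of the papers). The host vocabulary here happens to be XHTML, but nothing about the form controls depends on that choice; they simply live in their own namespace alongside the host elements.

    <html xmlns="http://www.w3.org/1999/xhtml"
          xmlns:xf="http://www.w3.org/2002/xforms">
      <head>
        <title>Host document, untouched vocabulary</title>
        <!-- the XForms model rides along in its own namespace -->
        <xf:model>
          <xf:instance>
            <data xmlns=""><name/></data>
          </xf:instance>
        </xf:model>
      </head>
      <body>
        <p>Ordinary host-language content continues to work.</p>
        <xf:input ref="name">
          <xf:label>Name:</xf:label>
        </xf:input>
        <xf:output value="concat('Hello, ', name)"/>
      </body>
    </html>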

During the conference itself, Tobias Niedl (again working with Anne Brüggemann-Klein) talked about the possibilities for XForms implementations in an HTML5 environment [Niedl and Brüggemann-Klein]. And the other day Stephen Cameron and William Velásquez outlined approaches to building a Web forms generation framework using XForms [Cameron and Velásquez]. Other forms of layering can also be seen in other talks here this week. On Monday, George Bina talked about building an authoring customization layer on top of Oxygen [Bina]. That technique can be used to build an authoring layer over any editor that is sufficiently sophisticated to be heavily customizable. Different editors will of course offer different opportunities for customization: it’s a good example of the virtues of layering.

Jonathan Robie and his co-authors gave us another good example [Robie et al.]: the RESTful Service Description Language layers on top of XML and HTTP.[7] People sometimes complain about the reinvention of the wheel, but what I always think is: if enough of us spend enough time reinventing the wheel, it increases the likelihood that sometime, some day, somebody will manage to get the axle in the middle! So the wheel works better!

I’m not sure I understood all of the details, but I believe that Michael Sokolov’s paper fits here too [Sokolov]. One way to look at his work is to say that he has layered a form of indexing into, or onto, Saxon’s XQuery processor. Or we can turn it around and say that he has layered an XQuery processor on top of the Lucene full-text indexer. Either way, it is very cool work.

And speaking of cool, Michael Kay and O’Neil Delpratt have layered an XSLT 2.0 processor on top of Javascript [Delpratt and Kay]. How cool is that! Among other things, an XSLT processor layered on top of Javascript will insulate users of XSLT from the people who control the browser, in ways that give us more control over our own fate. I don’t mind my web application running in a Javascript interpreter nearly as much as I would mind having to write the damn thing in Javascript.[8]

It’s unusual, I realize, to conceive of a layering solution as involving the insertion of a lower level beneath existing technology, as opposed to a higher level above it, but we have an example of that inversion here as well. We owe this unusual example to Dimitre Novatchev, who has made a career out of seeing in common technologies properties that others have not perceived [Novatchev]. It may be a good thing that nobody was sitting immediately beside me when I understood his technique for defining recursive functions in XPath 3.0. As many in the audience know quite well, XPath 3.0 only has anonymous functions. And the one thing I understand about recursion is that to make it comprehensible, it requires named functions.[9] It’s a good thing no one was sitting close beside me because when your head explodes that way sometimes shrapnel escapes and other people can get hurt. It’s one of the risks we take, reading Dimitre Novatchev: we are in grave danger of learning something unexpected. Anonymous recursive functions without the Y combinator: we heard it here first.
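For those who missed the talk, the flavor of the thing can be suggested with the oldest trick in this line, self-application; the example is mine, far simpler than anything in the paper, and not necessarily Novatchev’s own formulation. The inline function never refers to itself by name (it has none); it is handed itself as an argument instead.

    (: Valid XPath 3.0: recursion without a named function in sight. :)
    let $fact := function($self as function(*), $n as xs:integer) as xs:integer {
      if ($n le 1) then 1 else $n * $self($self, $n - 1)
    }
    return $fact($fact, 5)  (: evaluates to 120 :)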

And then there is the possibility of laying new foundations for the house, and moving the existing superstructure over onto them. XML technology offers us a very good example of this technique. XPath 1.0, some of you will recall, has expressions which evaluate to node sets. These are defined as sets of nodes, by definition unordered. But the only place most users ever saw node sets was (is) in the context of XSLT implementations, and XSLT specifies that node sets will always be processed in document order. XPath 2.0, by contrast, does not have node sets. Its expressions evaluate to sequences. But all of the expressions that used to work still work. They all have pretty much the same observed behavior. But if you ask formally what exactly is happening, what a given expression means, the foundation has changed. Those changes to the foundations help make other things work better, and — because almost all user-visible behavior is unchanged — they do not inconvenience users, or cause users to shy away from XPath 2.0.
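One small illustration of how little the surface moved (my example, not anything from the specification texts): consider the union operator.

    (: Assume $a and $b are bound to the first and second section elements
       of some document, in that document order. Under XPath 1.0, $b | $a
       denotes a node-set: unordered by definition, and presented by XSLT
       in document order. Under XPath 2.0, the same expression denotes a
       sequence, defined to be in document order with duplicates removed.
       Either way the user sees the first section, then the second. :)
    $b | $a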

There are other examples of the same new-foundations approach, farther back, in more abstract fields. As far as I know,[10] it never occurred to a single mathematician before the middle of the 19th century to think about the possibility of defining axioms for arithmetic. Axioms are for geometry. Arithmetic doesn’t need axioms — arithmetic doesn’t work that way! But David Hilbert and Gottlob Frege and Giuseppe Peano and Alfred North Whitehead and Bertrand Russell poured new foundations for mathematics, and carefully picked up the building, and carried it over, and dropped it onto a logical foundation that was, essentially, not where it had grown. One of the chief criteria for those new foundations was that they should allow as much mathematics as possible to continue to work in a recognizable way. Now, changing the foundations often seems — as in the case of XPath that I just mentioned — to involve defining new semantics for things that already exist, and this is often rather mysterious. As Tommie observed on Tuesday, the word semantics quite often denotes precisely those things that we don’t actually know how to define or to do [Usdin].

Micah Dubinko showed us yesterday some of the difficulties that can arise when you try to provide foundations for certain kinds of things (in his case, semantic applications) [Dubinko]. John Cowan has thought about foundational questions, and they have led him to reintroduce, in an XML context, the idea of architectural forms originally introduced for HyTime in an SGML context [Cowan]. Maybe he has the axle in the right place. That would be interesting. It was certainly interesting that even in such a determinedly simple context as this one, we immediately got both push and pull styles of transformation. I have to think about what that means.
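For readers who do not have the push/pull distinction at their fingertips, here is a minimal sketch of my own in XSLT (nothing to do with Cowan’s paper specifically): in push style, nodes are sent out to whatever template rules catch them; in pull style, the stylesheet reaches into the data and fetches exactly what it wants.

    <xsl:stylesheet version="1.0"
                    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">

      <!-- push: hand the children to whatever template rules match them -->
      <xsl:template match="section">
        <div><xsl:apply-templates/></div>
      </xsl:template>
      <xsl:template match="title">
        <h2><xsl:apply-templates/></h2>
      </xsl:template>

      <!-- pull: reach into the data and fetch the pieces wanted, in the
           order the stylesheet (not the document) dictates -->
      <xsl:template match="figure">
        <p>
          <xsl:text>Figure: </xsl:text>
          <xsl:value-of select="caption"/>
        </p>
      </xsl:template>

    </xsl:stylesheet>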

Sometimes, though, new foundations just feel right, without need for long thought about the new location of the axle. No one who considers the declarations necessary to define DITA using DTDs, and then also considers the declarations shown by George Bina and Eliot Kimber in their fill-in talk on defining Relax NG schemas for DITA, can think that what they describe was anything but a good move. When you go from this big [gesture] to this big [gesture], it suggests very strongly that you have gotten at least some of the basic primitive notions right.

Because one of the characteristics of good fundamental notions is that they allow you to say things concisely.

At Balisage, one of the most obvious places to look for talks about new foundations is the talks considering markup that’s not shaped like trees: standoff markup, overlapping markup, and so forth. Yves Marcoux, who did a remarkable job of making a very abstract theorem about serialization of graphs comprehensible first to his co-authors and then to you, deserves special praise [Marcoux, Sperberg-McQueen, and Huitfeldt]. We’ve had several talks about APIs for standoff annotation. Peter Bouda and his co-authors talked about the Poio API and the GraF-XML format [Blumtritt, Bouda, and Rau]. Nils Diewald reported on work he did with Maik Stührenberg on an API to make it easier to work with the XStandoff format [Diewald and Stührenberg]. Maik Stührenberg himself talked about XStandoff 2.0, in particular the features added to support spatial and temporal annotation [Stührenberg]. I think those are analogous to the kind of annotation that Micah would like to add to RDF triples. The work on XStandoff 2.0 also seemed to me to exemplify the use of schema languages to explicitly license and enable variation in documents and styles, and not to constrain them tightly. If Simon St.Laurent is correct that we have been too rigid in defining many of our vocabularies, Maik Stührenberg’s work on XStandoff 2.0 may be an indication that the fault lies in our system thinking and not necessarily in our schema languages. Another fundamental change, less radical in some ways but extremely radical at the practical level, is the streaming processing that Abel Braaksma talked about on Thursday in another fill-in slot.

But we have heard reports on a number of even more dramatic, even deeper re-foundations. To start, at the very beginning of the conference, Rob Cameron and Nigel Medforth and their colleagues reported on work that turns the basic notions of parsing inside out, or sideways, and shows that it is possible to achieve dramatic speedups by rethinking fundamental questions [Medforth et al.]. What a dynamite, dynamite talk!

And then today, Michael Kay invited us to follow the lines of thought taken by his students in Ftan, about how one might go about building an XML-like system if we were starting in 2013 instead of having started in 1996, or 1986, or 1973 [Kay]. That thought experiment feels to me a little bit like laying out a nice new clean set of foundations and moving the house a fair distance. Later in the day we got from Alain Couthures a different kind of rethinking about foundations, one that felt more like repouring the foundation with the house remaining in situ [Couthures]. His extensions to the Document Object Model felt a little disorienting to me, but perhaps at some level his changes are less disruptive: some of the inhabitants of the house may not even notice that the house has acquired new foundations. If we exploit the gaps in the specification of the DOM as he suggested, we can work with a much broader range of data types. And if you’re not sure why you would want to do that, all you have to do is think about the talk that Hans-Jürgen Rennau gave the next day, that deep and thought-provoking consideration of what it is that is most important in our current technologies and how to preserve and protect and extend those advantages [Rennau]. There is an old joke that says that the programming language Prolog was first implemented in about 1971 by Colmerauer and colleagues in Marseilles, and then designed two years later by Robert Kowalski in Edinburgh.[11] Once the design had been published, people started to understand the language. This week, I felt a little bit as though Hans-Jürgen Rennau had provided the theoretical underpinnings that helped motivate Alain Couthures’s work. Alain Couthures’s suggestions would be one way, although not the only way, to proceed along the lines that Hans-Jürgen Rennau laid out.

Steven Pemberton also found a way to exploit a joint, an open space, in the definitions of our current system [Pemberton]. It is easy for naïve readers to believe that dereferencing a URI invariably requires that we contact the server identified in the URI, via an HTTP request, and take exactly what that server gives us back. But that’s not necessarily true. Web architecture tells us that that server is the authoritative source of information about the resource denoted by that URI, but if the client had to contact that server every time it dereferenced the URI, then proxies would be impossible. On the contrary: the Web is designed to make proxies possible. And a proxy that provides XML lenses and allows us to read arbitrary data as if it were XML sounds to me like a great way to achieve world domination: a kind of XML injection attack. I think I could live with world domination. At the very least I could certainly live with being able to read CSS and XPath and other expression languages and handle them using XSLT templates, as I would be able to do if they had been written in XML angle-bracket syntax from the beginning. But they weren’t: it turns out that not everyone wants to write things in angle-bracket syntax: it requires too much work. It has too high a threshold, too steep an on-ramp. (Remember that theme? This is where we came in.) I notice in passing that the grammatical annotations Steven Pemberton introduces into his van Wijngaarden syntax seem remarkably similar in spirit to the schema annotations that John Cowan introduces into his Examplotron schemas to guide the implicit transformation [Cowan]. And I like that echo, too. I have to think about it, about what it means.

As we think about how to improve our house and where to site it, let us remember a principle attributed to Frank Lloyd Wright. Never, he is supposed to have said, never build a house on the top of a hill. Build the house at the bottom of the hill, or at least a little way down from the top. Save the top of the hill as a place to walk to, occasionally, for the view. The top of the hill, in this account, is a place where we can go to take a wider view of things than we normally do; that makes it a place to think, to reflect on our situation, to make plans for how to improve our situation. You don’t want the house to be at the top of the hill, because then you have no place to go for that wider view that the top of the hill affords. If you see that view all the time, it doesn’t provide the same benefit, because the benefit comes partly from the wider view and partly from the alternation between the wider view at the top of the hill and the narrower view we get when we focus on the work immediately before us. Mary Holstege said the other day that one reason to come to Balisage is to step back a bit from our day jobs. It is appropriate, perhaps, that for some years now we have come to a city with a hill for this exercise in stepping back from things and taking a wider view, in reflecting on our situation, and in making plans to improve our situation. Thank you all for making Balisage a place where we can do all those things.

References

[Bilansky] Bilansky, Alan. “A Proposed Model for the Collection and Curation of Slide Sets.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Bilansky01.

[Bina] Bina, George. “Customizing a general purpose XML editor: oXygen’s authoring environment.” Presented at International Symposium on Native XML User Interfaces, Montréal, Canada, August 5, 2013. In Proceedings of the International Symposium on Native XML User Interfaces. Balisage Series on Markup Technologies, vol. 11 (2013). doi:https://doi.org/10.4242/BalisageVol11.Bina01.

[Blumtritt, Bouda, and Rau] Blumtritt, Jonathan, Peter Bouda and Felix Rau. “Poio API and GraF-XML: A radical stand-off approach in language documentation and language typology.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Bouda01.

[Cameron and Velásquez] Cameron, Stephen, and William David Velásquez. “A Data-Driven Approach using XForms for Building a Web Forms Generation Framework.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Velasquez01.

[Colmerauer and Roussel] Colmerauer, Alain, and Philippe Roussel. “The birth of Prolog.” Presented at History of Programming Languages — II, 1993. In History of Programming Languages — II, ed. Thomas J. Bergin, Jr., and Richard G. Gibson, Jr. New York: ACM Press; Reading, Mass.: Addison-Wesley, 1996, pp. 331-364.

[Couthures] Couthures, Alain. “My document object model can do more than yours: Extending the DOM for data manipulation.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Couthures01.

[Cowan] Cowan, John. “Transforming schemas: Architectural Forms for the 21st Century.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Cowan01.

[Delpratt and Kay] Delpratt, O’Neil, and Michael Kay. “Interactive XSLT in the browser.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Delpratt01.

[Diewald and Stührenberg] Diewald, Nils, and Maik Stührenberg. “An extensible API for documents with multiple annotation layers.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Diewald01.

[Dubin, Senseney, and Jett] Dubin, David, Megan Senseney and Jacob Jett. “What it is vs. how we shall: complementary agendas for data models and architectures.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Dubin01.

[Dubinko] Dubinko, Micah. “Transcending Triples: Modeling semantic applications that go beyond just triples.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Dubinko01.

[Flynn] Flynn, Peter. “Could authors really write in XML one day?” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Flynn02.

[Flynn (LaTeX)] Flynn, Peter. “Markup to generate markup to generate markup: Using XML to create and maintain LaTeX packages and classes.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Flynn01.

[Graham] Graham, Tony. “Decision making in XSL-FO formatting.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Graham01.

[Holstege] Holstege, Mary. “Where Are All The Bugs? Introspection in XQuery.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Holstege01.

[Kay] Kay, Michael. “The FtanML Markup Language.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Kay01.

[Kepper, Roland, and Röwenstrunk] Kepper, Johannes, Perry Roland and Daniel Röwenstrunk. “Musical Variants: Encoding, Analysis and Visualization.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Kepper01.

[Kimber] Kimber, Eliot. “General Architecture for Generation of Slide Presentations, including PowerPoint, from arbitrary XML Documents.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Kimber01.

[Kleinfeld] Kleinfeld, Sanders. “The Case for Authoring and Producing Books in (X)HTML5.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Kleinfeld01.

[Kowalski] Kowalski, Robert. “Predicate Logic as a Programming Language.” In Proc. IFIP Congress, ed. J. Rosenfeld (Amsterdam: North-Holland, 1974), pp. 569-574.

[Lee] Lee, David. “Fat Markup: Trimming the Myth one calorie at a time.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Lee01.

[Maalej and Brüggemann-Klein] Maalej, Mustapha, and Anne Brüggemann-Klein. “Generating Schema-Aware XML Editors in XForms.” Presented at International Symposium on Native XML User Interfaces, Montréal, Canada, August 5, 2013. In Proceedings of the International Symposium on Native XML User Interfaces. Balisage Series on Markup Technologies, vol. 11 (2013). doi:https://doi.org/10.4242/BalisageVol11.Bruggemann-Klein01.

[Marcoux, Sperberg-McQueen, and Huitfeldt] Marcoux, Yves, Michael Sperberg-McQueen and Claus Huitfeldt. “Modeling overlapping structures: Graphs and serializability.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Marcoux01.

[McDonough] McDonough, Jerome. “Some Assembly Required: Reflections on XML Semantics, Digital Preservation and the Construction of Knowledge.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.McDonough01.

[Medforth et al.] Medforth, Nigel, Dan Lin, Kenneth Herdy, Rob Cameron and Arrvindh Shriraman. “icXML: Accelerating a Commercial XML Parser Using SIMD and Multicore Technologies.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Cameron01.

[Mitchell and Whitaker] Mitchell, Tristan, and Nigel Whitaker. “Marking up changes to ISO standards: A case study.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Mitchell01.

[Niedl and Brüggemann-Klein] Niedl, Tobias, and Anne Brüggemann-Klein. “Processing XForms in HTML5-Enabled Browsers.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Niedl01.

[Nordin] Nordin, Brent. “Markup and Canada’s National Model Building Codes.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Nordin01.

[Nordström (symp.)] Nordström, Ari. “ProX: XML for interfacing with XML for processing XML (and an XForm to go with it).” Presented at International Symposium on Native XML User Interfaces, Montréal, Canada, August 5, 2013. In Proceedings of the International Symposium on Native XML User Interfaces. Balisage Series on Markup Technologies, vol. 11 (2013). doi:https://doi.org/10.4242/BalisageVol11.Nordstrom02.

[Nordström] Nordström, Ari. “Semantic Profiling Using Indirection.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Nordstrom01.

[Novatchev] Novatchev, Dimitre. “Programming in XPath 3.0.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Novatchev01.

[O’Connor, Gnanapiragasam, and Hepp] O’Connor, Charles, Antony Gnanapiragasam and Michael Hepp. “ProofExpress: An Online, Browser-Based XML Article Proofing System for STM Journals.” Presented at International Symposium on Native XML User Interfaces, Montréal, Canada, August 5, 2013. In Proceedings of the International Symposium on Native XML User Interfaces. Balisage Series on Markup Technologies, vol. 11 (2013). doi:https://doi.org/10.4242/BalisageVol11.OConnor01.

[Patterson] Patterson, Matt. “Where did all the markup kids go? Open-source, markup, and the casual developer.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Patterson01.

[Pemberton] Pemberton, Steven. “Invisible XML.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Pemberton01.

[Pemberton (symp.)] Pemberton, Steven. “Using XForms for interfaces to XML data.” Presented at International Symposium on Native XML User Interfaces, Montréal, Canada, August 5, 2013. In Proceedings of the International Symposium on Native XML User Interfaces. Balisage Series on Markup Technologies, vol. 11 (2013). doi:https://doi.org/10.4242/BalisageVol11.Pemberton02.

[Quin] Quin, Liam R. E. “The New W3C Publishing Activity.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Quin01.

[Rennau] Rennau, Hans-Jürgen. “The XML info space.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Rennau01.

[Robie et al.] Robie, Jonathan, Rob Cavicchio, Rémon Sinnema and Erik Wilde. “RESTful Service Description Language (RSDL): Describing RESTful Services Without Tight Coupling.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Robie01.

[Sigaud et al.] Sigaud, Éric, Romain Tailhurat, Franck Cotton and Éric van der Vlist. “XForms generation: a real world example.” Presented at International Symposium on Native XML User Interfaces, Montréal, Canada, August 5, 2013. In Proceedings of the International Symposium on Native XML User Interfaces. Balisage Series on Markup Technologies, vol. 11 (2013). doi:https://doi.org/10.4242/BalisageVol11.Cotton01.

[Sokolov] Sokolov, Michael. “Indexing Queries in Lux.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Sokolov01.

[Sperberg-McQueen et al.] Sperberg-McQueen, Michael, Oliver Schonefeld, Marc Kupietz, Harald Lüngen and Andreas Witt. “Igel: Comparing document grammars using XQuery.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Schonefeld01.

[St.Laurent] St.Laurent, Simon. “The Allure of Gothic Markup: Prioritizing Local Adaptation.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.StLaurent01.

[Stührenberg] Stührenberg, Maik. “What, when, where? Spatial and temporal annotations with XStandoff.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Stuhrenberg01.

[Tai] Tai, Andreas. “WebVTT versus TTML: XML considered harmful for web captions?” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Tai01.

[Usdin] Usdin, B. Tommie. “The semantics of ‘semantic’.” Presented at Balisage: The Markup Conference 2013, Montréal, Canada, August 6 - 9, 2013. In Proceedings of Balisage: The Markup Conference 2013. Balisage Series on Markup Technologies, vol. 10 (2013). doi:https://doi.org/10.4242/BalisageVol10.Usdin01.



[1] My thanks to Roger Sperberg and to Tonya Gaylord for their help in preparing this piece for publication. Some sentences have been reformulated for clarity and ease of reading, but for the most part the substance of these remarks remains unchanged. In some cases I have added footnotes to correct factual errors in the text.

[2] I seem to be playing a little fast and loose with Texas geography here. The text is probably correct to suggest that El Capitan would be a better result than we could reach if we were on the other side of the Pecos River, but in fact El Capitan is not the highest point in Texas; that’s Guadalupe Peak, nearby. El Capitan is the eighth highest.

[3] This account is accurate as far as it goes, but incomplete. It took more than a century, but eventually an editor of Middle High German texts, a scholar skilled in the ways of grantsmanship, found a way to arrange for funding and to start a project of which he knows he cannot live to see the completion. We will have a new dictionary of Middle High German, thanks to the work of Kurt Gärtner.

[4] The experience has perhaps made me a little less of an XSLT/XQuery bigot than I would otherwise be, but in the end, my exploration of XML processing in conventional programming languages merely makes me wonder why I would want to use conventional programming languages when I’ve got XSLT and XQuery. But at least the examples worked with Java.

[5] That happened, actually, at a house I was visiting over the weekend before the conference. I thought it was very rare but it appears not to be nearly as rare as I had thought.

[6] Balisage has taken place in Montréal each August from its beginnings through 2013; in 2014 it will move to a new location.

[7] In some ways their talk could also be regarded as an example of moving to a new house entirely: they didn’t try to change the existing Web application description languages, they just said “That one doesn’t work for us, we’ll build a new one.”

[8] Admittedly, even running in XSLT layered over Javascript is likely to make me a little crazy, but that’s nothing compared to the craziness of writing an application directly in that language of twisty little passages all alike, where the book most frequently recommended to serious programmers is a book by Douglas Crockford called JavaScript: The Good Parts (a notably short work), which describes the small, quiet, clean language inside of Javascript, trying to get out. Without such a book, programmers would miss that small clean language, because it’s living incognito and gives very little sign of itself.

[9] It is possible, I have been told by people with Ph.D.s in computer science, to use recursion without naming the function, using the Y Combinator. Since I’m interested in this kind of thing I have asked pretty much every Ph.D. in computer science I’ve run across for several years if they will explain it to me. And I have yet to find a Ph.D. in computer science who claims to understand it themselves, let alone to understand it well enough to be willing to explain it to me over coffee.

[10] I am not, however, a historian of mathematics.

[11] I have been unable to find any written version of this joke suitable for quotation; it appears to be a reflection of the fact that Kowalski’s account of the philosophy of logic programming [Kowalski] was better known to Anglophone computer scientists than the initial report by the Francophone designers of Prolog, who seem to have published more about work done using Prolog than about the development of Prolog itself [Colmerauer and Roussel].

Michael Sperberg-McQueen

Senior consultant

Black Mesa Technologies

C. M. Sperberg-McQueen is the founder of Black Mesa Technologies LLC, a consultancy specializing in the use of descriptive markup to help memory institutions preserve cultural heritage information for the long haul. He has served as co-editor of the XML 1.0 specification, the Guidelines of the Text Encoding Initiative, and the XML Schema Definition Language (XSD) 1.1 specification. He holds a doctorate in comparative literature.