Why I hate LaTeX

Motivation

Most works on LaTeX extol its virtues. This one won't; I hate the thing. Nevertheless, it has become a standard for many scientific journals, so I'm forced to use it. Having finally learned a little about how to avoid its inherent problems, I thought I'd point out some trouble spots, and a few painful lessons, in hopes of saving other writers some of the agony I went through.

Gripes

Unhelpful reference works

LaTeX is one of those complicated computer programs that makes simple things easy — which allows authors of books touting the package to present “toy” problems that are trivial to solve, and then say “See how easy it is?” But when you have real-world publishing problems, books like Lamport's LaTeX: A Document Preparation System don't provide the answers.

And I never can find what I'm looking for in his index, either. There are no index entries for “resetting values” (or “variables”); “numerical registers”; “PostScript”; “setting parameters”; “leaders”; “debugging”; etc., etc. — just to name a few terms I've searched for in vain.

Frequently, Lamport tells you to go read Knuth's TeXbook. But Knuth's book is the same way, for me: nearly unreadable. (In fact, I can't read any of Knuth's books; his mind works differently from mine.)

Off-putting jargon and cutesy terminology

Another irritation is the overly cute jargon, like “glue”. What's the matter with “space”? Or, for that matter, standard typesetting terms like “leading”?

One reason I can't find what I'm looking for in the books mentioned above is the use of idiosyncratic names for things. What anybody would normally call “numerical registers” turns out to be called “counters,” for example.

TeX's \the command has another strange and confusing name. Why can't we have something more descriptive, like “value”?
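
For what it's worth, here is a bare-bones sketch of what's hiding behind those names: a “counter” is just a named integer, and TeX's \the prints out the current contents of a register. (The counter name “widgets” is my own invention.)

    \newcounter{widgets}               % create a numerical register called "widgets"
    \setcounter{widgets}{5}            % set it to 5
    \stepcounter{widgets}              % add 1
    We now have \thewidgets\ widgets.  % \thewidgets prints the value: "6"

    This paragraph is indented by \the\parindent.  % \the prints a register's contents, e.g. "15.0pt"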

While I'm complaining about nomenclature, I may as well complain about verbose command names, too. I realize there's a problem with namespace collisions in a language with hundreds of commands; but a little more terseness would have prevented many of the typing errors that lead to unhelpful error messages. Why have \renewcommand to reset a variable when \reset would have made more sense and invited fewer mistakes? When you see things like \righthyphenmin, \normalbaselineskip, and \interdisplaylinepenalty, it reminds you of Cobol. And how about \texttt, with those three t's? That's asking for typos.
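
For the record, here is what “resetting a variable” actually looks like in practice; a small sketch (the macro \version is just an example of mine), showing that each kind of “variable” gets its own verb:

    \newcommand{\version}{1.0}        % define a "variable"
    \renewcommand{\version}{1.1}      % "reset" it; using \newcommand again here is an error
    \setcounter{page}{1}              % counters are reset with yet another command
    \setlength{\parindent}{0pt}       % and lengths with still another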

Another troublesome name is the inevitable \makeatletter that's required to make any but the most trivial changes to formatting rules. It's essential to LaTeX, but it never appears in Lamport's book. Where are you supposed to find out about this thing?
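
For anyone who hasn't tripped over it yet: LaTeX's internal macros have an @ in their names, and @ doesn't normally count as a letter, so you have to flip a switch before touching them and flip it back afterwards. A bare-bones sketch (the particular internal macro changed here is only an illustration):

    \makeatletter                        % let @ count as a letter, so internal names become reachable
    \renewcommand{\@seccntformat}[1]{}   % e.g. suppress the number printed before section titles
    \makeatother                         % put @ back the way it was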

Furthermore, although you'd think the verbosity would avoid namespace near-misses as well as collisions, there are things like the table, tabbing, and tabular environments that are guaranteed to produce confusion. You always guess the wrong one the first time.
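
For the record, the three do completely different jobs, which only makes the look-alike names worse. A bare-bones sketch of each:

    \begin{table}              % "table": a floating container that can carry a caption
      \begin{tabular}{lrr}     % "tabular": the actual grid of rows and columns
        item & qty & price \\
        bolt & 12  & 0.40  \\
      \end{tabular}
      \caption{A parts list.}
    \end{table}

    \begin{tabbing}            % "tabbing": typewriter-style tab stops, no grid at all
      first \= second \\
      one   \> two
    \end{tabbing}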

Finally, let's not forget Knuth's quirky logo for TeX — an unnecessary nuisance compounded by Lamport's even more complicated logo for LaTeX. How many hours have been wasted by people trying to reproduce them?

Inconsistent language design

How about the capricious uses of optional arguments and alternate forms of commands? There's no consistency in these things; you have to learn a new set of rules for every command. Why not at least deal with the alternate forms by using an optional argument, instead of an asterisk? The matter is compounded by the use of both “star-forms” and optional arguments with some common constructs, like the \\ command — which itself violates the otherwise useful rule that special characters can be inserted in literal text by prefixing the escape character to them.
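
A short sketch of the muddle, with three different mechanisms doing broadly similar jobs:

    \section{Results}                  % numbered section
    \section*{Results}                 % star form: same command, but unnumbered
    First line \\ second line.         % \\ ends a line
    First line \\[6pt] second line.    % optional argument: end the line, add 6pt of space
    First line \\* second line.        % star form again: end the line, forbid a page break there
    100\% of \$5 \& more               % most special characters: just prefix a backslash...
    a backslash: \textbackslash        % ...but \\ is already taken, so \ needs its own command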

Hostile user interface

Some of these gripes are matters of taste. But even TeX's advocates admit that it has

. . . error messages that are mystical, opaque, and vaguely frightening. Experience soon teaches you that the best thing is to ignore the messages . . . .

(to quote from p. 447 of Alan Hoenig's TeX Unbound). Even the MIT website says that

The error messages produced by LaTeX are the most confusing part of using it. Most people have trouble understanding what LaTeX is telling them.

The misleading error messages are a hallmark of immature software. I haven't seen anything so bad since the early days of FORTRAN, circa 1960.

One of the most user-hostile features of the error facility is that beginners can't get out of it easily to fix a problem. Try typing quit or exit or bye — standard responses that work with most programs — and you just get more errors. Even the “kill” key fails to stop this Frankenstein's monster. (How to quit is explained on the introductory page.)

On top of this, the messages are all written in the first person — a style that many people find distracting — and have a glib tone that's doubly offensive, given their frequent inaccuracy. It suggests the programmer didn't take error messages seriously.

Bad program design

And speaking of FORTRAN, the intermixing of high- and low-level statements in LaTeX is reminiscent of FORTRAN III, which allowed machine-language statements to be intermixed with regular high-level statements. (This practice was quickly considered harmful, and banished from FORTRAN; but LaTeX still allows it.)

This invites the common situation where an apparently innocent shortcut gets LaTeX so confused that it produces nonsense TeX code — which in turn elicits an error message from TeX complaining about bad commands that aren't in your input file, because they're buried deep in some broken LaTeX macro. (These interactions are explained further elsewhere.)

Additional irritation is the need to put a \/ pair (the italic-correction command) at the end of every stretch of italicized text. This is so regular that it ought to have been built into TeX from the beginning, or at least into LaTeX. The whole point of having the computer do the work is that it should free the user from such routine tasks. Talk about broken software … .
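
For anyone keeping score, the incantation looks like this; the newer \textit form does insert the correction for you, which rather proves that it could have been automatic all along:

    This is {\itshape important\/} text.   % manual italic correction after the italics
    This is \textit{important} text.       % \textit adds the correction automatically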

A similar aggravation is the need to use machinery that would have baffled Rube Goldberg just to line up numerical data at the decimal point in table columns. It's not that it can't be done; it's that it requires a ridiculous amount of human effort to do it. Even the much-maligned troff can do this very easily with its tbl pre-processor: you just give n as the column-alignment code, instead of l or r or c. Why isn't LaTeX this easy to use?
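
For comparison, the least painful LaTeX route I know of takes an extra package and a home-made column type; hardly tbl's single letter n. A sketch using the dcolumn package (the column letter d is my own choice):

    \usepackage{dcolumn}                       % in the preamble
    \newcolumntype{d}[1]{D{.}{.}{#1}}          % align on ".", allowing #1 digits after it

    \begin{tabular}{l d{2}}
      item  & \multicolumn{1}{c}{price} \\     % a heading has to escape the d column
      bolt  &  0.40 \\
      screw & 12.5  \\
    \end{tabular}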

And how about the continual need to re-run LaTeX? Instead of producing a warning message, why doesn't the damn thing just re-run itself and shut up?
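
In fairness, wrapper scripts exist that will do the re-running until things settle down; latexmk, for one, can be invoked as “latexmk -pdf paper.tex” (the file name is a placeholder) and repeats the run as many times as needed. But it says something that a separate program is required at all.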

Chauvinism

A further problem for astronomers, meteorologists, and many others is that the symbol for “degrees” should naturally be invoked by \deg — but that command is reserved for the mathematicians' “deg” operator (the degree of a polynomial), which most of us have never needed. So even in packages like AASTeX, one is forced to use the unnatural \degr, which is impossible to remember and takes an extra keystroke. I realize Knuth comes from a mathematical background; but why do the rest of us have to put up with mathematical obscurantism? It would be sensible for LaTeX to re-define this command.
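
If you're willing to do the re-defining yourself, it takes one line in the preamble; a sketch (use whatever degree symbol you prefer):

    \renewcommand{\deg}{\ensuremath{^{\circ}}}   % after this, 23\deg{}C prints a degree sign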

Ugly fonts

I think the Computer Modern fonts are really ugly. That means I have to switch to decent-looking PostScript fonts, to make what I grind out bearable to look at; and, as this isn't the default, it causes more problems.
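
For the record, the switch itself is only a line or two in the preamble; a sketch using the standard PSNFSS packages for Times and friends:

    \usepackage{mathptmx}          % Times for both text and math
    \usepackage[scaled]{helvet}    % Helvetica for sans serif, slightly scaled to match
    \usepackage{courier}           % Courier for typewriter text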

Larger problems

In addition to the difficulties of TeX and LaTeX, there is the problem that you can't print out anything they format without a large infrastructure of fonts and helper programs. That's probably as it should be, in accordance with the UNIX philosophy of cooperating programs that each do one thing well. But it means that, when difficulties arise, and what you see on your screen isn't what you wanted, it's hard to tell where the trouble came from: LaTeX, dvips, some font that's wrong, or some underlying misconfiguration of behind-the-scenes directory trees and kpathsea variables that are normally invisible to the user.

The complexity is unavoidable, because formatting and typesetting documents are complex tasks. We've become accustomed to indexes, cross-references, footnotes, tables, figures, displayed equations, and other niceties of modern printing, without thinking about them. But these things all have to be explained to computers, which are incredibly stupid, and can only do what they're told in the minutest detail.

A related reason for the complexity, and the myriad of rules with exceptions (and exceptions to the exceptions), is that documents embody human languages. And human languages aren't really capable of being analyzed in minute detail. (If they were, machine translation would be a reality, and “Optical Character Recognition” wouldn't require meticulous proofreading. The mistaken notion that documents can be forced into a rigidly-specified mold leads to nonsense like pages that have “This page intentionally left blank” printed on them.) You're asking the computer to do something it really isn't well suited to, here. So a lot of human intervention is inevitably required.

But, given the complexity of the task, it behooves designers of software to reduce the burden on users as much as possible. Clear notation and terminology, consistent use of arguments, accurate rather than misleading error messages: these are basic matters that I find overlooked — no; deliberately spurned with contempt — in TeX and LaTeX.

 

Copyright © 2005, 2006, 2010 Andrew T. Young

