On Sat, 17 Aug 2002 books at bofh.com wrote:

#>very particular sort of meaning. But programming languages are
#>very, very strictly defined. Unlike natural language, programming
#>languages very rarely possess symbols that have simultaneous
#>meanings depending on the context. How does a compiler handle
#>ambiguity? It doesn't. We define the rules so there is no
#
#Counterexample:
#
#sendmail.cf -> $:
#(as well as any number of other $ variables)

Is that a counterexample? Variables have, well, variable meanings,
but in any context -- at runtime -- only one of them will be in
effect.

I'm putting this as a question rather than an assertion because I
don't know the language of this construction. But am I right here?

-- Mark A. Mandel
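
[Editorial sketch, not part of the original exchange: the point that a
variable may take different values over time, yet at any single moment
of execution denotes exactly one value, can be illustrated in any
language with mutable bindings. Python is used here purely for
illustration; this says nothing about sendmail's own `$` macro rules.]

```python
# A name can be rebound over the course of a run, but at any given
# point in execution it refers to exactly one value -- there is no
# simultaneous ambiguity the way a natural-language word can carry
# several senses at once.
x = "first meaning"
meaning_then = x      # snapshot of the binding at this moment

x = "second meaning"  # rebinding: the old value is simply replaced
meaning_now = x

assert meaning_then == "first meaning"
assert meaning_now == "second meaning"
```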