From 00537961a05301e245a002110ed9d3da0e0ee2ca Mon Sep 17 00:00:00 2001
From: Peter Verthez
Date: Thu, 10 Jan 2002 19:51:42 +0000
Subject: [PATCH] Small fix in link.

---
 doc/parser.html | 10 +++++-----
 1 file changed, 5 insertions(+), 5 deletions(-)

diff --git a/doc/parser.html b/doc/parser.html
index e6581b7..fbaefdf 100644
--- a/doc/parser.html
+++ b/doc/parser.html
@@ -1,8 +1,6 @@
- The Gedcom parser library
-
-
+ The Gedcom parser library

The Gedcom parser library

@@ -125,7 +123,9 @@ containing a lot of special characters.
test file.  Simply cat the file through the lexer on standard input and you should get all the tokens in the file.  Similar tests can be done using make lexer_hilo and
-make lexer_lohi (for the unicode lexers).  In each of the cases you need to know yourself which of the test files are appropriate to pass through the lexer.
+make lexer_lohi
+ (for the unicode lexers).  In each of the cases you need to know
+yourself which of the test files are appropriate to pass through the lexer.

This concludes the testing setup.  Now for some explanations...


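For convenience, the test flow described in the hunk above boils down to a few shell commands. The following is only a rough sketch: the lexer binary name and the test file name are placeholders, not actual paths from the source tree; only the make targets lexer_hilo and lexer_lohi are taken from the documentation text itself.

    # Sketch of the lexer test flow (placeholder names, see note above)
    cat some_test_file.ged | ./lexer   # hypothetical binary: prints every token it reads from stdin
    make lexer_hilo                    # same kind of test for the unicode (high-low) lexer
    make lexer_lohi                    # same kind of test for the unicode (low-high) lexer
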
@@ -164,7 +164,7 @@ corner) in the old DOS days, to be able to draw nice windows in text mode.
However, these last characters are strictly spoken not part of the ASCII set.  The standard ASCII set contains only the character positions from 0 to 127 (i.e. anything that fits into an integer that is 7 bits wide).  An
-example of this table can be found here.  Anything that has an ASCII code between 128 and 255 is in principle undefined.
+example of this table can be found here.  Anything that has an ASCII code between 128 and 255 is in principle undefined.

Now, several systems (including the old DOS) have defined those character positions anyway, but usually in totally different ways.  Some well
-- 
2.30.2
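
A side note on the ASCII hunk above: the 7-bit boundary it describes is easy to check from a shell. The snippet below is a minimal sketch using the standard od tool and is not part of the patch; it just shows that byte values up to 127 are plain ASCII, while anything from 128 to 255 falls outside that range.

    # Dump byte values as unsigned decimals to see whether they fit in 7 bits (0-127)
    printf 'A' | od -An -tu1      # prints 65  -> inside the standard ASCII range
    printf '\351' | od -An -tu1   # prints 233 -> in 128-255, undefined in plain ASCII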