Subject: Re: Win98 i18n Unicode fonts trouble possibly solved.
From: Mike Nordell (tamlin@algonet.se)
Date: Thu Dec 28 2000 - 14:45:33 CST


Saint-Denis wrote:
> > I just saw Mike Nordell commit for the af/gr/win/gr_Win32Graphics.* files,
> > I will download and try them.
>
> The change for the LOGFONT to OEM_CHARSET did not help. Most unicode
> characters from the unicode1.abw test file are replaced with box characters.

This is expected and exactly the same behaviour I get in NT5 (aka w2k), which
is why I think this part works. Looking at the UC fonts I have installed, they
don't contain the other characters, which is why they are rendered as "the
empty box".

> Greek and Cyrillic letters are correct. Half of the Card suite characters
> are represented.

Same thing here.

Does your font (really) have the other half of the characters? If so, is it
a free True-/OpenType font I can get my hands on? I only have the w2k
default UC fonts available (I just copied one of 'em to a Win98 box to test
the patch, and it seemed to work).

> Insert Symbol dialog characters are shown as boxes for the Symbol font;
> Wingdings and other fonts are shown as letters, numbers and other ASCII
> character representations.

Ouch! That's no good, is it. :-(
Looking at it now, I see my patch screwed it up even in NT. I also think a
larger problem surfaced because of this: we apparently don't have an
attribute telling us whether a font is a "real" font (containing readable
characters) or a "symbol" font (one that just contains "pretty pictures").
I have tested a *really* ugly patch for our current problems, and it seems
to work, but it is IMO only a workaround.

In gr_Win32Graphics.cpp, GR_Win32Graphics::findFont, after the lone "else"
after checking if it's a monotype font, you might try adding the following:

 // Known symbol fonts: keep the requested face name and let Windows
 // pick the charset, instead of forcing the OEM_CHARSET default.
 if (!UT_stricmp(pszFontFamily, "symbol") ||
  !UT_stricmp(pszFontFamily, "wingdings") ||
  !UT_stricmp(pszFontFamily, "webdings") ||
  !UT_stricmp(pszFontFamily, "marlett"))
 {
  lf.lfCharSet = DEFAULT_CHARSET;
  strcpy(lf.lfFaceName, pszFontFamily);
 }

while keeping lf.lfCharSet default to OEM_CHARSET.

As I said, ugly as h*ll, and it *will* probably break in the future, which is
why I think we need another argument to findFont, telling it whether the font
is symbol-ish or not.
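To illustrate what I mean, here is a minimal standalone sketch of that extra
argument. Note this is just an example, not real AbiWord code: the names
pickCharSet and isKnownSymbolFamily are made up, and the real code would use
OEM_CHARSET/DEFAULT_CHARSET from <windows.h> and UT_stricmp instead of the
placeholder constants and strcasecmp used here:

```cpp
#include <cassert>
#include <strings.h>  // strcasecmp; on Windows, _stricmp from <string.h>

// Placeholder constants standing in for the Win32 charset values;
// the real code would use OEM_CHARSET / DEFAULT_CHARSET.
enum CharSet { kOemCharset = 255, kDefaultCharset = 1 };

// The proposed extra argument: the *caller* tells findFont whether the
// family is a symbol font, so findFont no longer has to guess.
CharSet pickCharSet(bool bSymbolFont)
{
    return bSymbolFont ? kDefaultCharset : kOemCharset;
}

// Equivalent of the workaround's hard-coded name test, kept only at
// the call sites that still need to guess (hypothetical helper).
bool isKnownSymbolFamily(const char* pszFontFamily)
{
    static const char* const symbolFamilies[] =
        { "symbol", "wingdings", "webdings", "marlett" };
    for (const char* name : symbolFamilies)
        if (strcasecmp(pszFontFamily, name) == 0)
            return true;
    return false;
}
```

With this, the hard-coded list lives in one place instead of being buried
inside findFont.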

While at it, I might also add that I think the parameters of xxx::findFont
are *really* strange. Could anyone explain the following:

- What's the meaning of a "stretch" parameter (which is a string) getting
the value "normal"? It makes even less sense since it's *never ever* used
*anywhere* in AbiWord.
- What is a "font variant"? This is also never used anywhere in AW.
- Why is the font size argument a string? I can't ATM think of anything
less efficient.
- Why is the font weight a string argument when *the only* values accepted
are the empty string and "bold"? Wouldn't a UT_Bool bBold be correct here?

Perhaps it should be changed to the following declaration:

    xxx::findFont(const FontDescriptor&);

? Does it make sense to anyone but me?

The FontDescriptor class should (of course) be a "lightweight" class.
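For example, something along these lines; all names here are hypothetical,
just to sketch what a lightweight descriptor replacing the current string
arguments might look like:

```cpp
#include <cassert>
#include <string>

// Hypothetical lightweight font descriptor; not existing AbiWord API.
struct FontDescriptor
{
    std::string family;      // e.g. "Times New Roman"
    int         pointSize;   // numeric, instead of a string like "12pt"
    bool        bold;        // replaces the ""/"bold" string weight
    bool        italic;
    bool        symbolFont;  // lets findFont pick the right charset
};

// findFont would then take a single const reference, e.g.:
// GR_Font* GR_Win32Graphics::findFont(const FontDescriptor& fd);
```

A plain struct like this is cheap to copy and construct, and adding a new
attribute later (like symbolFont) doesn't break every caller the way adding
another positional string argument would.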

> Any ideas to make it work for unicode and Symbol, Wingdings and other
> fonts on Win9x?

This patch *should* fix it, short term. I think we will have to come up with
a long-term solution, since the current argument list for xxx::findFont (IMO)
really sux^H^H^H has some problems.

/Mike - please don't cc



This archive was generated by hypermail 2b25 : Thu Dec 28 2000 - 14:46:24 CST