console: Fix rendering of VGA underline

vga_putcharxy()'s underline code sets font_data to 0xffff instead of
0xff. The function then reads dmask16[0xffff >> 4] and
dmask4[0xffff >> 6]. In practice, these out-of-bounds subscripts
"only" put a few garbage bits into the display surface.

For 32-bit pixels there's no array access: font_data's extra bits go
straight into the display surface.
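
A similarly hypothetical, condensed model of the 32 bpp path (reusing
the declarations above; the exact expansion in the real code may
differ) shows one way the extra bits can leak through:

    static void glyph_row_32bpp(uint32_t *d, unsigned font_data,
                                uint32_t xorcol, uint32_t bgcol)
    {
        /* For font_data <= 0xff, font_data >> 7 is 0 or 1, and the
         * negation yields an all-zeros or all-ones pixel mask. With
         * the buggy 0xffff, font_data >> 7 is 0x1ff, the mask becomes
         * 0xfffffe01, and a corrupted color is written to d[0]. */
        d[0] = (-(font_data >> 7) & xorcol) ^ bgcol;
        d[1] = (-((font_data >> 6) & 1) & xorcol) ^ bgcol;
        /* ... bits 5..0 expand to d[2]..d[7] the same way ... */
    }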

Broken since commit 6d6f7c28, which implemented underline.

Spotted by Coverity.

Signed-off-by: Markus Armbruster <armbru@redhat.com>
Signed-off-by: Anthony Liguori <aliguori@us.ibm.com>