The pretree decode table was not declared to be 16-byte aligned, as
expected by make_huffman_decode_table().
This bug had no effect if the compiler happened to align the table on a
16-byte boundary anyway, but otherwise it caused a segmentation fault on
x86 platforms where SSE2 instructions are available, since in that case
stores requiring 16-byte alignment are used to fill in table entries.
* @decode_table: The array in which to create the fast huffman decoding
* table. It must have a length of at least
* (2**table_bits) + 2 * num_syms to guarantee
- * that there is enough space.
+ * that there is enough space. Also must be 16-byte
+ * aligned (at least when USE_SSE2_FILL gets defined).
*
* @num_syms: Number of symbols in the alphabet, including symbols
* that do not appear in this particular input chunk.
{
/* Declare the decoding table and length table for the pretree. */
u16 pretree_decode_table[(1 << LZX_PRETREE_TABLEBITS) +
- (LZX_PRETREE_NUM_SYMBOLS * 2)];
+ (LZX_PRETREE_NUM_SYMBOLS * 2)]
+ _aligned_attribute(DECODE_TABLE_ALIGNMENT);
u8 pretree_lens[LZX_PRETREE_NUM_SYMBOLS];
unsigned i;
unsigned len;