This is the mail archive of the libc-alpha@sourceware.org mailing list for the glibc project.

PATCH: [BZ #14562] Properly handle fencepost with MALLOC_ALIGN_MASK


Hi,

The fencepost is set up at an address aligned to MALLOC_ALIGNMENT:

      /* Setup fencepost and free the old top chunk with a multiple of
         MALLOC_ALIGNMENT in size. */
      /* The fencepost takes at least MINSIZE bytes, because it might
         become the top chunk again later.  Note that a footer is set
         up, too, although the chunk is marked in use. */
      old_size = (old_size - MINSIZE) & ~MALLOC_ALIGN_MASK;
      set_head(chunk_at_offset(old_top, old_size + 2*SIZE_SZ), 0|PREV_INUSE);

We therefore need to align the fencepost to MALLOC_ALIGNMENT in heap_trim
as well, and count the misaligned extra bytes in the new size.  Tested on
ia32, x86-64 and x32.  It fixes all issues on x32.  OK to install?

Thanks.

H.J.
---
2012-09-08  H.J. Lu  <hongjiu.lu@intel.com>

	[BZ #14562]
	* malloc/arena.c (heap_trim): Properly get fencepost and adjust
	new chunk size with MALLOC_ALIGN_MASK.

diff --git a/malloc/arena.c b/malloc/arena.c
index 97c0b90..dd4f3e3 100644
--- a/malloc/arena.c
+++ b/malloc/arena.c
@@ -655,15 +655,18 @@ heap_trim(heap_info *heap, size_t pad)
   unsigned long pagesz = GLRO(dl_pagesize);
   mchunkptr top_chunk = top(ar_ptr), p, bck, fwd;
   heap_info *prev_heap;
-  long new_size, top_size, extra;
+  long new_size, top_size, extra, misalign;
 
   /* Can this heap go away completely? */
   while(top_chunk == chunk_at_offset(heap, sizeof(*heap))) {
     prev_heap = heap->prev;
     p = chunk_at_offset(prev_heap, prev_heap->size - (MINSIZE-2*SIZE_SZ));
+    /* fencepost must be properly aligned.  */
+    misalign = ((long) p) & MALLOC_ALIGN_MASK;
+    p = (mchunkptr)(((unsigned long) p) & ~MALLOC_ALIGN_MASK);
     assert(p->size == (0|PREV_INUSE)); /* must be fencepost */
     p = prev_chunk(p);
-    new_size = chunksize(p) + (MINSIZE-2*SIZE_SZ);
+    new_size = chunksize(p) + (MINSIZE-2*SIZE_SZ) + misalign;
     assert(new_size>0 && new_size<(long)(2*MINSIZE));
     if(!prev_inuse(p))
       new_size += p->prev_size;

