From: David Rientjes
Date: Mon, 15 Feb 2010 21:43:25 +0000 (-0800)
Subject: x86, numa: Fix numa emulation calculation of big nodes
X-Git-Tag: v2.6.34-rc1~253^2~2
X-Git-Url: http://git.openpandora.org/cgi-bin/gitweb.cgi?a=commitdiff_plain;h=68fd111e02b979876359c7b471a8bcbca0628b75;p=pandora-kernel.git

x86, numa: Fix numa emulation calculation of big nodes

numa=fake=N uses split_nodes_interleave() to partition the system into
N fake nodes.  Each node size must be a multiple of FAKE_NODE_MIN_SIZE,
otherwise strange alignments can result.  Because of this, the memory
left over from each node after rounding down to FAKE_NODE_MIN_SIZE is
consolidated into a number of "big nodes" that are larger than the rest.

The calculation of the number of big nodes is incorrect: it uses a
logical AND operator where it should multiply the rounded-off portion
of each node by N.

Signed-off-by: David Rientjes
LKML-Reference:
Signed-off-by: H. Peter Anvin
---