// Requires: using System.DirectoryServices;
using (DirectorySearcher srch = new DirectorySearcher(String.Format("(memberOf={0})", p_Target.DistinguishedName)))
{
    srch.PageSize = 2;
    using (SearchResultCollection results = srch.FindAll())   // dispose the collection to release the underlying resources
    {
        int count = results.Count;
    }
}
count = 3 (THREE) and not 2. Why is that? I don't want all the results in just one page. I know that PageSize = 2 is silly small, but I set that value just for testing purposes (in reality it will be larger).
PageSize sets the number of records returned in one paged search. Paged search is a mechanism at the LDAP protocol level, and it is transparent to you. Even though you set PageSize to 2, DirectorySearcher will still return all the results; in your case they simply arrive in two paged search reply packets.
To do what you want, you should use SizeLimit instead. It controls how many records are returned in total.
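For example, here is a minimal sketch of the same query using SizeLimit instead (the p_Target object is carried over from the question; the final count assumes the group has at least two members):

// Sketch only: SizeLimit caps the total number of records the client will accept.
using (DirectorySearcher srch = new DirectorySearcher(String.Format("(memberOf={0})", p_Target.DistinguishedName)))
{
    srch.SizeLimit = 2;                                       // return at most 2 records in total
    using (SearchResultCollection results = srch.FindAll())
    {
        int count = results.Count;                            // 2, assuming the group has at least 2 members
    }
}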
Here is one more tricky thing. Windows Server has a limit set on the server side: each paged search result can contain at most 1000 entries. So you need to be careful when setting PageSize and SizeLimit if the result set has more than 1000 entries. If you set PageSize = 0 (meaning unlimited) and SizeLimit = 0 (meaning unlimited), you will get an error, because the Windows server cannot return more than 1000 entries in a single page. If you set PageSize = 800 and SizeLimit = 0 (meaning unlimited), you will get all your results, and if you look at a network sniffer you will see a series of LDAP paged search results, each containing 800 entries.
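A sketch of that second combination (the "(objectClass=user)" filter is only a placeholder for any query matching more than 1000 entries):

// Sketch only: settings for a result set larger than the server's 1000-entry page limit.
using (DirectorySearcher srch = new DirectorySearcher("(objectClass=user)"))
{
    srch.PageSize = 800;                                      // each LDAP paged reply carries up to 800 entries
    srch.SizeLimit = 0;                                       // 0 = no client-side cap on the total
    using (SearchResultCollection results = srch.FindAll())
    {
        int total = results.Count;                            // all matching entries, delivered across several pages
    }
}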
EDIT
Here is a more elaborated reply to the question in your comment.
Hm, interesting. Please help me understand this mechanism better: if in AD I have 5000 rows, the PageSize of DirectorySearcher is set to 1000, SizeLimit is set to 0, and the max server limit is 1000, how many calls to directorySearcher.FindAll() do I need in my code to get all 5000 results? 5 or 1?
Regardless of how many records are going to be returned, you always need only one call to DirectorySearcher. DirectorySearcher handles the rest for you: it aggregates the paged search results and presents them to you in a single IEnumerable, even though the data may come from different reply packets. I guess you want to set PageSize because you don't want all 5000 results returned at once, occupying your memory. Don't worry about that. DirectorySearcher won't keep all 5000 results in your memory as long as you don't hold a reference to each returned SearchResult. It won't wait until all the reply packets have come back either. As soon as the first reply packet comes back, FindAll() returns the result to you. If your program is so fast that, after you process the first 1000 results, the second paged search result packet has still not arrived, the call to MoveNext() will block and wait until the second paged search result packet is received.
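As a sketch of that pattern (the filter and the 5000-match assumption are illustrative, not taken from your directory):

// Sketch only: a single FindAll() call; enumeration streams the pages as they arrive,
// and MoveNext() (called implicitly by foreach) blocks until the next page is received.
using (DirectorySearcher srch = new DirectorySearcher("(objectClass=user)"))
{
    srch.PageSize = 1000;                                     // stays within the 1000-entry-per-page server limit
    srch.SizeLimit = 0;                                       // no overall cap; fetch all matches
    using (SearchResultCollection results = srch.FindAll())   // one call, even for 5000 matches
    {
        foreach (SearchResult result in results)              // pages are requested lazily during enumeration
        {
            Console.WriteLine(result.Path);                   // process each entry as it streams in
        }
    }
}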