Saturday, 9 July 2011

Open Source Directory benchmark – part two


This is the final part of a two-part blog entry about an open source directory benchmark we performed. Please find the first part here. In this second part, we focus on the results of our benchmark.

After testing for several hours with our own custom-developed tool, we arrived at the results below, which are the average of 20 test runs. Our final report (available on request) contains more information about how the individual benchmarks were performed and how the directory servers were tuned for performance.
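Our actual tool is not published, but the measurement idea is straightforward. The following is a minimal, hypothetical sketch (not our real client) of how one could measure operations per second and average the rate over repeated runs; all names and parameters here are illustrative:

```python
# Hypothetical benchmark harness sketch; the real tool and its
# parameters are not published, so all names here are illustrative.
import time
from statistics import mean

def ops_per_second(operation, duration=1.0):
    """Run `operation` in a tight loop for `duration` seconds
    and return the achieved rate in operations per second."""
    count = 0
    deadline = time.perf_counter() + duration
    while time.perf_counter() < deadline:
        operation()
        count += 1
    return count / duration

def benchmark(operation, runs=20, duration=1.0):
    """Average the ops/sec rate over several runs, as the
    reported numbers in this post are averages of 20 runs."""
    return mean(ops_per_second(operation, duration) for _ in range(runs))

# Trivial stand-in for an LDAP search, just to exercise the harness:
rate = benchmark(lambda: sum(range(100)), runs=3, duration=0.1)
```

In the real benchmark, the `operation` callable would issue an actual LDAP request (search, bind, add, or modify) against the server under test.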

Search

We started by testing the number of search operations a directory server can perform per second. Since directory servers are designed to be very fast at search operations, we expected these results to be high.

The fastest searches were performed by OpenDJ, previously known as OpenDS. OpenLDAP is a close second, and Oracle Internet Directory comes third. All servers perform very well on search operations; no surprise there, because this is the very reason directory servers exist.

Authentication

After testing how well these directory servers handle search requests, we wanted to know how fast they could handle authentication. We had to take care, however, not to flood the server with too many connections, because that would render it unresponsive.

From the results, we can see that the commercial products (OID and AD) score better than the open-source variants, although Fedora389 comes very close. We noticed that our tool sometimes accidentally performed a successful denial-of-service attack on the directory servers; this did not happen with Oracle Internet Directory or Active Directory. We retested multiple times to obtain stable results.
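One common way to avoid accidentally flooding a server is to cap the number of simultaneous connections on the client side. This is a minimal sketch of that idea using a semaphore; it is not our actual tool, and the cap, names, and stand-in bind function are all illustrative:

```python
# Illustrative sketch of client-side connection throttling;
# not the authors' actual benchmark tool.
import threading

MAX_CONCURRENT = 50  # illustrative cap; would be tuned per server
limiter = threading.BoundedSemaphore(MAX_CONCURRENT)

def authenticate(bind_fn, dn, password):
    """Perform one bind attempt while holding a concurrency slot,
    so no more than MAX_CONCURRENT binds run at once."""
    with limiter:
        return bind_fn(dn, password)

# Stand-in bind function for demonstration; a real client would
# open an LDAP connection and issue a bind request here.
results = []
threads = [
    threading.Thread(
        target=lambda i=i: results.append(
            authenticate(lambda dn, pw: True,
                         f"uid=user{i},dc=example,dc=com", "secret")
        )
    )
    for i in range(10)
]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

With a bounded semaphore, excess worker threads simply wait for a slot instead of piling new connections onto the server.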

Registration

Most directory server implementations are optimized for reading, so we expected some interesting results from the registration benchmark.

The registration benchmark results show that two directory servers are clearly faster than the others: Fedora389 appears to be slightly faster than OpenDJ, but we consider this difference negligible. These two servers are the most capable at writing entries to disk or memory.

Modification

Modification is actually a sequential process that consists of:

  1. looking up the entry (reading),
  2. modifying it,
  3. and updating the directory server.

This is a resource-heavy process, so our expectations for these results were quite low.
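The three steps above can be sketched as follows. This example uses an in-memory dictionary as a stand-in for the directory; a real client would issue LDAP search and modify operations instead, and all names here are illustrative:

```python
# Sketch of the read-modify-write sequence behind a modification,
# using an in-memory dict as a stand-in for the directory server.
directory = {
    "uid=jdoe,dc=example,dc=com": {"mail": "jdoe@example.com",
                                   "title": "Engineer"},
}

def modify_entry(dn, attribute, new_value):
    entry = directory[dn]                     # 1. look up the entry (read)
    entry = {**entry, attribute: new_value}   # 2. modify it
    directory[dn] = entry                     # 3. update the directory
    return entry

modify_entry("uid=jdoe,dc=example,dc=com", "title", "Senior Engineer")
```

Because every modification pays for a lookup plus a write, its throughput is bounded by both the search and the write path, which is why modification rates come out far lower than pure search rates.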

Overall, the average modification speed is very low. OpenDJ scores best with 155 modifications per second, ApacheDS is second with 147 modifications per second, and Fedora389 third with 128 modifications per second.

Hardware performance impact

Every test was performed on the two hardware configurations as described earlier. The following graph indicates the average performance improvement per directory server when tested on the High End server compared to the Mid End server.

Conclusions

The following table lists the best-performing directory server of our benchmark per operation:

Operation        Best performing LDAP server
Search           OpenDJ
Authentication   Oracle Internet Directory
Registration     Fedora389
Modification     OpenDJ

We conclude that the performance of open source directory servers is comparable to that of Active Directory or Oracle Internet Directory for performing the benchmarked operations on relatively modest hardware.

Outro

While performing these benchmarks, we gained a lot of experience configuring and tuning these directory servers. Many enterprises rely on directory server technology, and it has been rewarding for us to show that there are also open-source software packages that are up to the task.

2 comments:

Onur said...

I was looking for a good report. Great work and great reading; thanks for your efforts.

hyc said...

Your results are quite unusual. In all of our tests, not only is OpenLDAP always fastest, but it is usually 5-200x faster than ActiveDirectory.

Your results would be more believable if you published the source code of your test client, along with the complete configurations of the tested servers.