Using Drools in a heavy batch process

bmw0128 · Sep 18, 2008 · Viewed 8.3k times

We used Drools as part of a solution to act as a sort of filter in a very intense processing application, running up to 100 rules against 500,000+ working memory objects. It turns out to be extremely slow. Does anybody else have experience using Drools in a batch-type processing application?

Answer

Michael Neale · Nov 7, 2008

It kind of depends on your rules - 500K objects is reasonable given enough memory (Drools has to populate a RETE network in memory, so memory usage is a multiple of the 500K objects, i.e. space for the objects themselves plus space for the network structure, indexes, etc.). It's also possible you are paging to disk, which would be really slow.
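For context, here is a minimal sketch of how a batch run like this is typically wired up with the current KIE API. The fact class `Trade`, the session name `"filterSession"`, and the fact loader are hypothetical stand-ins, not anything from the original question, and the heap hint is only indicative.

```java
import org.kie.api.KieServices;
import org.kie.api.runtime.KieContainer;
import org.kie.api.runtime.KieSession;

import java.util.List;

public class BatchFilter {

    // Hypothetical fact type standing in for the 500K working-memory objects.
    public static class Trade {
        private final double amount;
        public Trade(double amount) { this.amount = amount; }
        public double getAmount() { return amount; }
    }

    public static void main(String[] args) {
        // Run with a generous heap (e.g. -Xmx4g): the rule network keeps its own
        // join structures and indexes on top of the raw facts, so memory usage
        // is a multiple of the fact count.
        KieServices ks = KieServices.Factory.get();
        KieContainer container = ks.getKieClasspathContainer();

        // "filterSession" would be declared in kmodule.xml; the name is illustrative.
        KieSession session = container.newKieSession("filterSession");
        try {
            for (Trade t : loadFacts()) {
                session.insert(t);              // each insert propagates into the network
            }
            int fired = session.fireAllRules(); // evaluate all rules over the batch
            System.out.println("Rules fired: " + fired);
        } finally {
            session.dispose();                  // release working memory
        }
    }

    private static List<Trade> loadFacts() {
        // Placeholder; a real batch job would stream facts from a database or file.
        return List.of(new Trade(100.0), new Trade(250.0));
    }
}
```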

Of course, if you have rules that match combinations of the same type of fact, that can cause an explosion of combinations to try, which will be really, really slow even if you have only one such rule. If you could share more information about the analysis you are doing, that would probably help with finding possible solutions.
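To make that combination problem concrete, the sketch below shows two versions of a pairwise rule as DRL source held in Java strings (the `Trade` fact, its fields, and the rule names are made up). An unconstrained self-join asks the engine to consider every pair of facts - on the order of 500,000² combinations - while an equality constraint on a shared key lets the join be indexed so only matching pairs are evaluated.

```java
public class RuleSketch {

    // Unconstrained self-join: with ~500,000 Trade facts the engine must consider
    // every (a, b) pair - roughly 500,000^2 combinations - which is the
    // "explosion of combinations" even for a single rule.
    static final String CROSS_PRODUCT_RULE =
          "rule \"flag related trades (slow)\"\n"
        + "when\n"
        + "    $a : Trade()\n"
        + "    $b : Trade( this != $a )\n"
        + "then\n"
        + "    // do something with $a and $b\n"
        + "end\n";

    // Constrained self-join: the equality constraint on a shared key lets the
    // engine index the join, so only facts with the same accountId are paired.
    static final String INDEXED_JOIN_RULE =
          "rule \"flag related trades (indexed)\"\n"
        + "when\n"
        + "    $a : Trade( $acct : accountId )\n"
        + "    $b : Trade( this != $a, accountId == $acct )\n"
        + "then\n"
        + "    // do something with $a and $b\n"
        + "end\n";
}
```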