Solr Out of Memory (OOM): Causes and Solutions

Solr is an open-source enterprise search platform from the Apache Lucene project, written in Java. Like any Java application, Solr can run out of memory, and this is a common issue in Solr deployments.

When Solr runs out of memory, we intuitively expect that the index is too large or the application is overwhelmed by a very high indexing rate. Although these issues are common, they might not be the real or the only reasons.

Below are some additional reasons why your Solr deployment might throw an OutOfMemoryError:

Requesting a large number of rows

Queries requesting a large number of rows can run the system out of memory.

When investigating performance issues in client deployments, we often see that the queries are asking for more than a million rows! Although Solr might not return that many documents, it internally allocates memory for the number of results that the query requested. 

Solution:

You should configure your application to request only the number of rows that you actually display in the search results. Even if you are using faceting, a request for only 10 or 20 rows still computes the facets over the entire result set.
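
As a sketch, a request that displays ten results per page while still faceting over the full result set might look like this (the collection name, query, and facet field are placeholders):

    http://localhost:8983/solr/mycollection/select?q=laptop&rows=10&start=0&facet=true&facet.field=category

Solr computes the facet counts over every matching document, so the small rows value does not affect facet accuracy.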

Queries starting at a large page number 

A similar performance issue occurs when queries paginate deeply by using a large start parameter. To serve a deep page, Solr must collect and rank all results up to the value of the start parameter, which drives up memory utilization.

Solution:

If your application cannot be restructured to avoid deep pagination, or if it legitimately needs to walk a large result set, you can use cursors. You can learn more about them in the Solr pagination documentation.
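
As a rough sketch, cursor-based paging works by adding cursorMark=* to the first request and passing each response's nextCursorMark value into the next request (the collection name is a placeholder; cursors require a sort clause that includes the uniqueKey field, here assumed to be id):

    http://localhost:8983/solr/mycollection/select?q=*:*&rows=100&sort=id+asc&cursorMark=*
    http://localhost:8983/solr/mycollection/select?q=*:*&rows=100&sort=id+asc&cursorMark=<nextCursorMark from previous response>

Because Solr only has to collect rows documents per request instead of start + rows, memory use stays flat no matter how deep you page.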

Faceting, sorting and grouping queries

Faceting, sorting, and grouping are expensive operations with high memory utilization, especially when performed on fields that do not have docValues enabled. Setting docValues=true in the schema field definition reduces Java heap requirements by memory-mapping the field data from disk instead of building it on the heap.

Solution:

If you are having out-of-memory issues, investigate the fields being used for faceting, grouping, and sorting, and make sure that the schema sets docValues=true for them. (If you change a docValues setting in the schema, you will have to reindex your content.)
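
For example, a string field used for faceting or sorting could be declared like this in the schema (the field name and type are illustrative):

    <field name="category" type="string" indexed="true" stored="true" docValues="true"/>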

Large caches – queryResultCache, documentCache, filterCache, fieldCache

Caching makes Solr fast by trading memory for speed, so large caches can be one of the reasons behind your out-of-memory problems.

There are different kinds of caches that are configured in solrconfig.xml:

  1. filterCache: This cache stores unordered sets of document IDs matched by the fq (filter query) parameters of your queries.
  2. queryResultCache: This cache stores ordered lists of document IDs returned by previous searches.
  3. documentCache: This cache stores the field values defined as "stored" in the schema, so that Solr does not have to go back to the index to fetch them for display.
  4. fieldCache: This cache holds all of the values of a field in memory rather than on disk. For a large index, the fieldCache can occupy a lot of memory, especially when many fields are cached.

The settings for each cache define its initial size, its maximum size, and its autowarmCount, which is the number of items copied from the old searcher into the new one.
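
For reference, a cache definition in solrconfig.xml looks roughly like this (the class is solr.CaffeineCache on recent Solr versions; the sizes are illustrative, not recommendations):

    <filterCache class="solr.CaffeineCache"
                 size="512"
                 initialSize="512"
                 autowarmCount="128"/>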

Solution:

Looking at Plugins/Stats in the Solr dashboard, you can check the hit ratio of each cache. If the hit ratio is too low, a cache is not earning its keep, and you can make it smaller to reduce the memory footprint.

Also, if the number of evictions is very large, chances are that cached entries are being discarded before they are ever reused. Here too, reducing the cache sizes may relieve your out-of-memory problems.
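
If you prefer to script the check, the same statistics are available over HTTP from the MBeans handler (the core name is a placeholder):

    http://localhost:8983/solr/mycore/admin/mbeans?cat=CACHE&stats=true&wt=json

The response reports lookups, hits, hitratio, and evictions for each cache.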

You should also note that these caches are configured per core/collection, so the memory requirements are multiplied by the number of collections. If your application uses a large set of collections, the memory impact of caching is magnified accordingly.

If you are using the SearchStax Solr Service to host and run your Solr deployments and need more memory, see here for how to upgrade your SearchStax deployment.

In addition, if you'd like to know whether your Solr process died because of an OutOfMemoryError, click here to find out.
