• Document Up to Date
  • Updated On 4.0.0

Migrating a site from Solr to Elasticsearch

When upgrading to CrafterCMS 4.0, you need to update the code of all existing sites that were built to use Solr so that they use Elasticsearch instead.

Updating to Elasticsearch

To update your site to use Elasticsearch instead of Solr, follow these steps:

  1. Overwrite the target in the Deployer to use Elasticsearch instead of Solr
  2. Index all existing content in Elasticsearch
  3. Find all references to searchService in your FreeMarker templates and replace them with the Elasticsearch client
  4. Find all references to searchService in your Groovy scripts and replace them with the Elasticsearch client
  5. Delete the unused Solr core if needed (this can be done through the Solr Admin UI or by removing it from the data/indexes folder)
  6. Update craftercms-plugin.yaml to use Elasticsearch as the search engine

Overwrite the target

For authoring environments:

curl --request POST \
  --url http://DEPLOYER_HOST:DEPLOYER_PORT/api/1/target/create \
  --header 'content-type: application/json' \
  --data '{
    "env": "preview",
    "site_name": "SITE_NAME",
    "template_name": "local",
    "repo_url": "INSTALL_DIR/data/repos/sites/SITE_NAME/sandbox",
    "disable_deploy_cron": true,
    "replace": true
  }'

For delivery environments:

curl --request POST \
  --url http://DEPLOYER_HOST:DEPLOYER_PORT/api/1/target/create \
  --header 'content-type: application/json' \
  --data '{
    "env": "default",
    "site_name": "SITE_NAME",
    "template_name": "remote",
    "repo_url": "INSTALL_DIR/data/repos/sites/SITE_NAME/published",
    "repo_branch": "live",
    ... any additional settings like git credentials ...
    "replace": true
  }'


For a detailed list of parameters, see Create Target.

The create target operation will also create the new index in Elasticsearch.
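If you want to confirm that the index was created, you can list the Elasticsearch indices that match the site name. This is an optional sanity check: it assumes Elasticsearch is reachable on its default port 9200 (substitute ES_HOST and SITE_NAME with your own values), and the exact index name may vary by CrafterCMS version:

```
curl 'http://ES_HOST:9200/_cat/indices/*SITE_NAME*?v'
```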

Index all site content

To reindex all existing content execute the following command:

curl --request POST \
  --url http://DEPLOYER_HOST:DEPLOYER_PORT/api/1/target/deploy/ENVIRONMENT/SITE_NAME \
  --header 'content-type: application/json' \
  --data '{
    "reprocess_all_files": true
  }'

Update the site code

Because both Solr and Elasticsearch are based on Lucene, you will be able to keep most of your queries unchanged; however, features like sorting, facets, and highlighting will require code changes.


To take full advantage of Elasticsearch, it is recommended to replace query strings with the other types of queries provided by the Elasticsearch DSL.


If you are using any customizations or advanced features of Solr, you will need to find an Elasticsearch alternative.

To update your code there are two possible approaches: keep your existing query strings and update only the code that builds and executes each query, or rewrite your queries using the new query types provided by the Elasticsearch DSL.


This is a basic example of replacing the Crafter Search service with the Elasticsearch client.

Existing Groovy code

def q = "${userTerm}~1 OR *${userTerm}*"

def query = searchService.createQuery()
query.setQuery(q)
// standard Solr paging and highlighting params (adjust to your site's settings)
query.setParam("start", start as String)
query.setParam("rows", rows as String)
query.setParam("sort", "createdDate_dt asc")
query.setParam("hl", "true")
query.setParam("hl.fl", HIGHLIGHT_FIELDS.join(","))

def result = searchService.search(query)
def documents = result.response.documents
def highlighting = result.highlighting

Using the Elasticsearch Client the code will look like this:

Elasticsearch Client

import co.elastic.clients.elasticsearch._types.SortOrder

def q = "${userTerm}~1 OR *${userTerm}*"

// Execute the query
def result = elasticsearchClient(r -> r
  .query(query -> query
    .queryString(s -> s
      .query(q as String)
    )
  )
  .from(start)
  .size(rows)
  .sort(s -> s
    .field(f -> f
      .field("createdDate_dt")
      .order(SortOrder.Asc)
    )
  )
  .highlight(h -> {
    HIGHLIGHT_FIELDS.each { field ->
      h.fields(field, f -> f)
    }
  })
, Map)

// Elasticsearch response (highlight results are part of each hit object)
def documents = result.hits().hits()

For additional information you can read the official Java Client documentation and DSL documentation.

Notice in the given example that the query string didn't change; you only need to update the code that builds and executes the query. However, Elasticsearch provides new query types and features that you can use directly from your Groovy scripts.

If any of your queries include date math for range queries, you will also need to update them to use the Elasticsearch date math syntax described here.


Solr date math expression
createdDate_dt: [ NOW-1MONTH/DAY TO NOW-2DAYS/DAY ]
Elasticsearch date math expression
createdDate_dt: [ now-1M/d TO now-2d/d ]

In Solr there were two special fields, _text_ and _text_main_; during indexing, the values of other fields were copied into them to provide a simple way to run generic queries against all relevant text. Elasticsearch provides a different feature that replaces those fields: the Multi-match query.


Solr query for any field
_text_: some keywords
Elasticsearch query for any field (replacement for _text_)
.multiMatch(m -> m
  .query('some keywords')
)

Elasticsearch also offers the possibility of querying groups of fields by their suffix using wildcards:

Elasticsearch query for specific fields (replacement for _text_main_)
.multiMatch(m -> m
  .query('some keywords')
  .fields('*_t', '*_txt', '*_html')
)

Update “craftercms-plugin.yaml” to use Elasticsearch

Your site has a craftercms-plugin.yaml file that contains information used by CrafterCMS. You'll need to update this file to use Elasticsearch as the search engine.

Edit your craftercms-plugin.yaml, and remove the following property:

searchEngine: CrafterSearch

And make sure to commit your changes to craftercms-plugin.yaml.
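The removal can also be scripted; below is a minimal, self-contained sketch using GNU sed. The sample descriptor contents are illustrative only, so run the sed line against your site's real craftercms-plugin.yaml instead:

```shell
# Create a sample plugin descriptor (illustrative contents only)
printf 'descriptorVersion: 2\nsearchEngine: CrafterSearch\n' > craftercms-plugin.yaml

# Delete the legacy searchEngine property in place (GNU sed syntax)
sed -i '/^searchEngine: CrafterSearch$/d' craftercms-plugin.yaml

# Print the file to confirm the property is gone
cat craftercms-plugin.yaml
```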

Migrating a site from the previous Elasticsearch client

Since 4.0.0

CrafterCMS 4.0 provides two different Elasticsearch clients because Elasticsearch has released a new Java API Client to replace the Rest High Level Client, and both will work during the transition period. If you are upgrading from CrafterCMS 3.1 and your site already uses Elasticsearch, it will continue to work with some small changes, but it is highly recommended to migrate to the new client to avoid issues in future releases, when the Rest High Level Client will be removed completely.

Migrating to the new Elasticsearch client should not require too much effort:

  • If the existing code uses the builder classes, you will need to replace them with their equivalents in the new client
  • If the existing code uses a map DSL, it only needs to be replaced with the new lambda structure
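The map-to-lambda change can be sketched like this (a hypothetical example: the `elasticsearch` variable name and the exact map shape depend on your existing site code, while `elasticsearchClient` is the new client shown in the migration example above):

```groovy
// Before (CrafterCMS 3.1 style, map DSL -- illustrative)
def oldResult = elasticsearch.search([
  query: [
    query_string: [
      query: userTerm as String
    ]
  ]
])

// After (CrafterCMS 4.0, new Java API Client lambda structure)
def newResult = elasticsearchClient(r -> r
  .query(q -> q
    .queryString(s -> s
      .query(userTerm as String)
    )
  )
, Map)
```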

For additional information about the new client, you can read the official documentation.