Any Sitecore website, whether it runs against a Solr Standalone or a Solr Cloud instance, is typically built by multiple developers working across several local instances, test environments, a UAT environment and a production environment.
To keep all of these instances and environments in sync, regardless of how the cores are stored, automating the deployment of new cores is a must.
This article covers a general approach to automating the deployment of custom index cores for both the Standalone and Solr Cloud versions. It assumes that Solr has already been set up and configured on a Sitecore instance for the default Sitecore indexes.
A Standalone instance commonly involves two types of environment:
- Local Environment
- Test Environment
Local Sitecore instances built on Helix that use Gulp for local deployment can contain dedicated Gulp tasks that deploy the Solr index configuration from the solution to the local Solr cores folder. The Solr service also needs to be restarted for the changes to apply. The same approach can be used with other deployment tools, such as Nuke.
An implementation using Gulp would contain the following:
- Ensure that the core configuration is stored in the repository: version each custom core's configuration folder and its core.properties file
- Add the relevant settings to the Gulp config.js file, for example
- Create a Deploy-Solr task, included in the default deployment, that consists of the following:
- Stop-Solr-Service task: easily achieved with a PowerShell script that stops the Solr Windows service (for example via a StopService.ps1 script calling Stop-Service)
- Copy-Solr-Cores task: copies the versioned core configuration from the solution into the local Solr cores folder
- Start-Solr-Service task: a PowerShell script similar to the one above, but calling StartService.ps1 instead
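The tasks above can be sketched roughly as follows; this is an illustration assuming gulp 4, and the service name, paths and PowerShell script names are placeholders for your own solution:

```javascript
// Illustrative gulp 4 tasks; service name and paths are assumptions
const gulp = require('gulp');
const { exec } = require('child_process');

const solrServiceName = 'solr-8.1.1';        // assumed local service name
const solrCoresRoot = 'C:/solr/server/solr'; // assumed local cores folder

function runPowerShell(command, done) {
  exec(`powershell -Command "${command}"`, (err) => done(err));
}

gulp.task('Stop-Solr-Service', (done) =>
  runPowerShell(`Stop-Service -Name '${solrServiceName}'`, done));

gulp.task('Copy-Solr-Cores', () =>
  // Copy each versioned core's configuration folder and core.properties
  gulp.src('./src/**/solr/**/*', { base: './src' })
      .pipe(gulp.dest(solrCoresRoot)));

gulp.task('Start-Solr-Service', (done) =>
  runPowerShell(`Start-Service -Name '${solrServiceName}'`, done));

gulp.task('Deploy-Solr',
  gulp.series('Stop-Solr-Service', 'Copy-Solr-Cores', 'Start-Solr-Service'));
```

Running `gulp Deploy-Solr` then stops the service, copies the cores and starts the service again in sequence.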
For test environments, deploying custom index cores onto a Solr Standalone instance requires a similar approach to the local one:
- Stop the Solr service: easily achieved in a step on your deployment server. For example, in Octopus you can use the Windows Service - Stop community step. You'll also need a deployment target pointing to your Solr server
- Copy the cores into the Solr folder: create a NuGet package with the core contents at build time on your build server (you'll probably want to define a .nuspec file for it), then deploy the package
- Start the Solr service
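The packaging step can be backed by a simple .nuspec; this is a sketch, and the package id, author and file paths below are placeholders for your own solution:

```xml
<?xml version="1.0" encoding="utf-8"?>
<package xmlns="http://schemas.microsoft.com/packaging/2010/07/nuspec.xsd">
  <metadata>
    <id>MyProject.SolrCores</id>
    <version>$version$</version>
    <authors>MyTeam</authors>
    <description>Custom Solr index cores (configuration folder and core.properties per core).</description>
  </metadata>
  <files>
    <!-- Ship each core's configuration folder and core.properties as package content -->
    <file src="src\**\solr\**\*.*" target="cores" />
  </files>
</package>
```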
Solr Cloud can also be configured locally. While this is helpful for testing, because the local environment then more closely matches production, it adds overhead to the local setup compared with the Standalone version that comes with a vanilla Sitecore installation.
To deploy new cores (called collections) on a Solr Cloud instance, the approach differs from the Standalone version: it involves using the Solr Collections API instead.
Developers can easily create a new collection by writing a PowerShell script. For example:
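A sketch of such a script; the Solr URL, collection name, shard and replica counts are placeholders and would normally come from deployment variables:

```powershell
# Illustrative values; in practice these come from deployment variables
$solrUrl        = "https://my-solrcloud:8983/solr"
$collectionName = "sitecore_myproject_custom_index"
$configName     = "sitecore"   # reuse the configset uploaded for the Sitecore indexes

$createUrl = "$solrUrl/admin/collections?action=CREATE" +
             "&name=$collectionName" +
             "&numShards=1&replicationFactor=2" +
             "&collection.configName=$configName"

# The Collections API returns JSON; a non-zero status indicates failure
$response = Invoke-RestMethod -Uri $createUrl -Method Get
if ($response.responseHeader.status -ne 0) {
    throw "Failed to create collection $collectionName"
}
```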
This request needs to be invoked for every custom core that needs to be deployed (the list can be read from the folder in the repository or stored as configuration variables). In the example above, the index uses the same configset as the Sitecore indexes.
It’s important to note that calling the CREATE API for a collection that already exists will return an error. This needs to be handled in the script so that the deployment step does not fail the whole build.
Another thing to note is that for a production environment, developers will also need to deploy a second core, used by Sitecore's switch-on-rebuild Solr functionality. This can easily be scripted along with the main cores.
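Putting both points together, a sketch of a script that loops over the custom cores, adds the rebuild collection for each, and skips collections that already exist; the names and variables are illustrative, and the _rebuild suffix is an assumption, so use the name defined in your index configuration:

```powershell
$solrUrl    = "https://my-solrcloud:8983/solr"
$configName = "sitecore"
# Custom cores, e.g. read from Octopus variables or the repository
$coreNames  = @("sitecore_myproject_custom_index")

# Ask Solr which collections already exist so CREATE is only called once
$existing = (Invoke-RestMethod "$solrUrl/admin/collections?action=LIST").collections

foreach ($core in $coreNames) {
    # Production also needs the switch-on-rebuild collection; the _rebuild
    # suffix is an assumption, not a fixed Sitecore convention
    foreach ($name in @($core, "$($core)_rebuild")) {
        if ($existing -contains $name) {
            Write-Host "Collection $name already exists, skipping"
            continue
        }
        Invoke-RestMethod ("$solrUrl/admin/collections?action=CREATE&name=$name" +
            "&numShards=1&replicationFactor=2&collection.configName=$configName")
    }
}
```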
The example script uses Octopus variables to configure the Solr cores. At Codehouse, our development team has used this script as part of a deployment step targeting the Solr environments.
Our development team has years of experience working on Sitecore projects that require complex and expert development skills. Get in touch to find out more about how automated deployments on Sitecore can be achieved.