Securing Elasticsearch and Kibana
Feature
To enable secure login for Kibana and secure communication between the Elasticsearch data store and Germain Services, follow the steps below. Note that this document assumes Elasticsearch is run using Docker Compose, with an initial (unsecured) docker-compose file similar to the following.
version: '3.3'
services:
  germain-es1:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.17.6
    restart: always
    mem_limit: 58g
    ports:
      - 9200:9200
      - 9300:9300
    environment:
      node.name: "node01"
      cluster.name: "uhc-elastic-prod"
      discovery.seed_hosts: "node01:9300,node02:9300,node03:9300"
      cluster.initial_master_nodes: "node01,node02,node03"
      bootstrap.memory_lock: "true"
      http.max_content_length: "150MB"
      xpack.security.enabled: "false"
      network.publish_host: "node01"
      transport.publish_port: "9300"
      path.repo: "/usr/share/elasticsearch/backup/uhc_repo"
    ulimits:
      memlock:
        soft: -1
        hard: -1
    volumes:
      - /es1/data:/usr/share/elasticsearch/data
      - /es1/logs:/usr/share/elasticsearch/logs
      - /elasticsearch_backup:/usr/share/elasticsearch/backup
      - /opt/elastic/plugins/germainapm-es-plugin:/usr/share/elasticsearch/plugins/germainapm-es-plugin
  germain-kibana:
    image: docker.elastic.co/kibana/kibana:7.17.6
    restart: always
    ports:
      - 5601:5601
    environment:
      ELASTICSEARCH_HOSTS: "http://node01:9200"
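Before making any changes, it can be helpful to confirm that the unsecured cluster still responds over plain HTTP. A minimal check, assuming the node01 hostname from the compose file above is resolvable from where you run it:
# Baseline check against the unsecured cluster
curl http://node01:9200/_cluster/health?pretty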
Pre-Requisites
Get a PFX file for the certificate update. (This document refers to this file as germainkibana-prod.domain.com.pfx.)
Generate the key file and certificate file using the OpenSSL commands below.
openssl pkcs12 -in germainkibana-prod.domain.com.pfx -nocerts -out key.pem -nodes
openssl pkcs12 -in germainkibana-prod.domain.com.pfx -nokeys -out cert.pem
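Optionally, sanity-check the extracted files before using them. This is a generic OpenSSL sketch, assuming the private key is an RSA key; the two modulus digests should match if the key and certificate belong together.
# Inspect the certificate subject and validity dates
openssl x509 -in cert.pem -noout -subject -dates
# Confirm the key matches the certificate (digests should be identical)
openssl x509 -in cert.pem -noout -modulus | openssl md5
openssl rsa -in key.pem -noout -modulus | openssl md5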
Import the certificate into the existing Java cacerts file used by your current Germain services.
keytool -import -trustcacerts -storepass changeit -file /path/to/cert.pem -alias youralias -keystore /path/to/cacerts
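To confirm the import worked, you can list the alias back out of the cacerts file (youralias and the default changeit store password are the placeholders used above):
keytool -list -keystore /path/to/cacerts -storepass changeit -alias youralias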
Alternatively, you can mount the cacerts file into the Docker container running the Germain services.
volumes:
  # Mount cacerts
  - ${PWD}/path/to/cacerts:/etc/ssl/certs/java/cacerts
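If you take the mount approach, a quick way to verify it is to list the alias from inside the running container. The container name below is a placeholder, and this assumes keytool is available in the image:
docker exec <germain-services-container> keytool -list \
  -keystore /etc/ssl/certs/java/cacerts -storepass changeit -alias youralias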
Generate a keystore to hold the key passwords for SSL, using the Elasticsearch Docker image. To generate this keystore, run a Docker container with the config directory mounted to a local directory, so the resulting elasticsearch.keystore file can be taken from that directory for later use.
docker run -it --rm \
  -v /path/to/local-dir/config:/usr/share/elasticsearch/config \
  elasticsearch:7.17.6 \
  bin/elasticsearch-keystore \
  add xpack.security.transport.ssl.keystore.secure_password
docker run -it --rm \
  -v /path/to/local-dir/config:/usr/share/elasticsearch/config \
  elasticsearch:7.17.6 \
  bin/elasticsearch-keystore \
  add xpack.security.transport.ssl.truststore.secure_password
docker run -it --rm \
  -v /path/to/local-dir/config:/usr/share/elasticsearch/config \
  elasticsearch:7.17.6 \
  bin/elasticsearch-keystore \
  add xpack.security.http.ssl.keystore.secure_password
docker run -it --rm \
  -v /path/to/local-dir/config:/usr/share/elasticsearch/config \
  elasticsearch:7.17.6 \
  bin/elasticsearch-keystore \
  add xpack.security.http.ssl.truststore.secure_password
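You can verify that all four secure passwords were stored by listing the keystore entries with the same image and mount (a sketch using the paths from the commands above):
docker run -it --rm \
  -v /path/to/local-dir/config:/usr/share/elasticsearch/config \
  elasticsearch:7.17.6 \
  bin/elasticsearch-keystore list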
Change the Elasticsearch bootstrap password.
docker run -it --rm \
  -v /path/to/local-dir/config:/usr/share/elasticsearch/config \
  elasticsearch:7.17.6 \
  bin/elasticsearch-keystore \
  add bootstrap.password
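For reference, once the secured cluster is running, the password of the built-in elastic user can be rotated through the Elasticsearch security API; the hostname and passwords below are the placeholders used elsewhere in this document.
# Rotate the elastic user's password (current-password is a placeholder)
curl -k -u elastic:current-password -X POST \
  "https://node01:9200/_security/user/elastic/_password" \
  -H 'Content-Type: application/json' \
  -d '{"password": "your-super-secret-password"}'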
Download a local installation of Kibana and get the kibana.yml file to be mounted into Docker, then change or add the properties below.
elasticsearch.username: "elastic"
elasticsearch.password: "your-super-secret-password"
In the same file, uncomment and update the SSL properties.
server.ssl.enabled: true
server.ssl.certificate: /usr/share/kibana/config/cert.pem
server.ssl.key: /usr/share/kibana/config/key.pem
elasticsearch.ssl.certificateAuthorities: [ "/usr/share/kibana/config/cert.pem" ]
During Upgrade
Bring down all the Services and Engines, and then Elasticsearch as well.
Enable Elasticsearch security settings by adding the environment variables below.
#security
http.cors.enabled: "true"
http.cors.allow-origin: "*"
xpack.security.enabled: "true"
Enable SSL on the HTTP layer by adding the environment variables below.
# ssl
xpack.security.http.ssl.enabled: "true"
xpack.security.http.ssl.verification_mode: none
xpack.security.http.ssl.client_authentication: none
xpack.security.http.ssl.keystore.path: /usr/share/elasticsearch/config/germainkibana-prod.domain.com.pfx
xpack.security.http.ssl.truststore.path: /usr/share/elasticsearch/config/germainkibana-prod.domain.com.pfx
Enable transport security settings by adding the environment variables below.
# transport
xpack.security.transport.ssl.enabled: "true"
xpack.security.transport.ssl.verification_mode: none
xpack.security.transport.ssl.client_authentication: none
xpack.security.transport.ssl.keystore.path: /usr/share/elasticsearch/config/germainkibana-prod.domain.com.pfx
xpack.security.transport.ssl.truststore.path: /usr/share/elasticsearch/config/germainkibana-prod.domain.com.pfx
Mount the certificate and keystore files by adding the volumes below.
volumes:
  - ${PWD}/path/to/germainkibana-prod.domain.com.pfx:/usr/share/elasticsearch/config/germainkibana-prod.domain.com.pfx
  - ${PWD}/path/to/elasticsearch.keystore:/usr/share/elasticsearch/config/elasticsearch.keystore
CODEset /config/germain/application/germain.elastic.url https://node01:9200 set /config/germain/application/germain.elastic.username elastic set /config/germain/application/germain.elastic.password your-super-secret-password set /config/germain/application/germain.indexer.url https://node01:9200 set /config/germain/application/germain.indexer.username elastic set /config/germain/application/germain.indexer.username your-super-secret-password
The final docker-compose file should look like this.
version: '3.3'
services:
  germain-es1:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.17.6
    restart: always
    mem_limit: 58g
    ports:
      - 9200:9200
      - 9300:9300
    environment:
      node.name: "node01.domain.com"
      cluster.name: "uhc-elastic-prod"
      discovery.seed_hosts: "node01.domain.com:9300,node02.domain.com:9300,node03.domain.com:9300"
      cluster.initial_master_nodes: "node01.domain.com,node02.domain.com,node03.domain.com"
      bootstrap.memory_lock: "true"
      http.max_content_length: "150MB"
      network.publish_host: "node01.domain.com"
      transport.publish_port: "9300"
      path.repo: "/usr/share/elasticsearch/backup/uhc_repo"
      #security
      http.cors.enabled: "true"
      http.cors.allow-origin: "*"
      xpack.security.enabled: "true"
      # ssl
      xpack.security.http.ssl.enabled: "true"
      xpack.security.http.ssl.verification_mode: none
      xpack.security.http.ssl.client_authentication: none
      xpack.security.http.ssl.keystore.path: /usr/share/elasticsearch/config/germainkibana-prod.domain.com.pfx
      xpack.security.http.ssl.truststore.path: /usr/share/elasticsearch/config/germainkibana-prod.domain.com.pfx
      # transport
      xpack.security.transport.ssl.enabled: "true"
      xpack.security.transport.ssl.verification_mode: none
      xpack.security.transport.ssl.client_authentication: none
      xpack.security.transport.ssl.keystore.path: /usr/share/elasticsearch/config/germainkibana-prod.domain.com.pfx
      xpack.security.transport.ssl.truststore.path: /usr/share/elasticsearch/config/germainkibana-prod.domain.com.pfx
    ulimits:
      memlock:
        soft: -1
        hard: -1
    volumes:
      - /es1/data:/usr/share/elasticsearch/data
      - /es1/logs:/usr/share/elasticsearch/logs
      - /elasticsearch_backup:/usr/share/elasticsearch/backup
      - /opt/elastic/plugins/germainapm-es-plugin:/usr/share/elasticsearch/plugins/germainapm-es-plugin
      - ${PWD}/path/to/germainkibana-prod.domain.com.pfx:/usr/share/elasticsearch/config/germainkibana-prod.domain.com.pfx
      - ${PWD}/path/to/elasticsearch.keystore:/usr/share/elasticsearch/config/elasticsearch.keystore
  germain-kibana:
    image: docker.elastic.co/kibana/kibana:7.17.6
    restart: always
    ports:
      - 5601:5601
    environment:
      SERVER_HOST: 0.0.0.0
      ELASTICSEARCH_HOSTS: "https://node01:9200"
    volumes:
      - /opt/elastic/ssl/kibana/kibana.yml:/usr/share/kibana/config/kibana.yml
      - /opt/elastic/ssl/kibana/cert.pem:/usr/share/kibana/config/cert.pem
      - /opt/elastic/ssl/kibana/key.pem:/usr/share/kibana/config/key.pem
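Before bringing the stack back up, the updated compose file can be syntax-checked without starting anything; a quick sketch, assuming the docker-compose v1 CLI:
# Validates the compose file and prints nothing when it is well-formed
docker-compose config --quiet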
After Upgrade
Start the Elasticsearch cluster and validate the following (example checks after this list):
SSL Enabled for Kibana
SSL Enabled for API
Cert is valid
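A sketch of these checks from the command line; the Kibana hostname is assumed to match the certificate name used in this document, and -k is needed while the certificate chain is not trusted by the calling host.
# Bring the Elasticsearch and Kibana containers up from the compose file above
docker-compose up -d
# Elasticsearch API over HTTPS with basic auth
curl -k -u elastic:your-super-secret-password "https://node01:9200/_cluster/health?pretty"
# Kibana over HTTPS (expect an HTTP response rather than a connection error)
curl -k -I https://germainkibana-prod.domain.com:5601/
# Inspect the certificate Kibana presents
openssl s_client -connect germainkibana-prod.domain.com:5601 </dev/null 2>/dev/null | openssl x509 -noout -subject -dates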
Start ActiveMQ and Zookeeper, then validate the Zookeeper config.
Start the Storage service and confirm the connection to Elasticsearch is established successfully.
Once confirmed, start all other services.
Validate service status in the logs and in the APM State section.
Start Engines
Validate data reception and dashboards (see the example check below).
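One way to confirm data is being received is to list the indices and watch document counts increase; hostname and credentials are the placeholders used earlier in this document.
curl -k -u elastic:your-super-secret-password "https://node01:9200/_cat/indices?v&s=index"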
Rollback Plan
Bring down all the Services and Engines, and then Elasticsearch as well.
Revert all the docker-compose files back to the original.
Start Elasticsearch
Start Services
Validate Services
Start Engines
Validate data reception and dashboards