Setting up AWS Elasticsearch

Rushabh Trivedi
4 min read · Feb 5, 2021

Modern applications require modern solutions for performance optimization, logging and log analysis, scalability, cost optimization, and more.

Many real-world applications are required to retain log data for a long time for audit or regulatory purposes. In this article I will show how to configure Elasticsearch on AWS for capturing logs.

This article focuses on three main aspects of setting up Elasticsearch:

  • Create an Elasticsearch domain on AWS
  • Install the Logstash service on EC2
  • Connect to Kibana

Let’s dive straight into configuring the Elasticsearch service on AWS.

Create the Elasticsearch service on AWS

From the AWS console, go to the Elasticsearch service to configure it.

  • Select the deployment type as per your need (choose production for real workloads).
  • On the next page of the wizard, select the number of nodes in each Availability Zone and the instance size. For production environments it is recommended to distribute the instances across AZs for failover.
  • The next page gives you options to restrict access to the ES cluster. For production environments, it is highly recommended not to allow public access to the cluster; users have to tunnel through to connect (we will see this in the later part of this article).
  • To connect to the cluster you need user access to the console. We will create a user with a user name and password, which we will use in the last step to log in to the Kibana console. Also, if you create a master user, you may not need a Cognito user identity.
  • The cluster will also need an access policy, which may look something like the one shown below.
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "*"
      },
      "Action": "es:*",
      "Resource": "arn:aws:es:us-east-1:1234567XXXXX:domain/domain-name/*"
    }
  ]
}

We do not want to change any encryption-related settings for now.

And you are done with your ES cluster on AWS!
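The same console steps can also be scripted. Below is a sketch using the AWS CLI; the domain name, instance type and count, EBS sizing, and subnet/security-group IDs are all assumptions — substitute your own, and save the access policy shown above as access-policy.json first.

```shell
# Sketch only: domain name, instance settings, and IDs are placeholders.
# Creates a VPC-only Elasticsearch domain with the access policy from above.
aws es create-elasticsearch-domain \
  --domain-name my-log-domain \
  --elasticsearch-version 6.8 \
  --elasticsearch-cluster-config \
      InstanceType=r5.large.elasticsearch,InstanceCount=2,ZoneAwarenessEnabled=true \
  --ebs-options EBSEnabled=true,VolumeType=gp2,VolumeSize=20 \
  --vpc-options SubnetIds=subnet-XXXX,SecurityGroupIds=sg-XXXX \
  --access-policies file://access-policy.json
```

Zone awareness here mirrors the multi-AZ recommendation from the wizard steps above.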

Installing Logstash on EC2

Logstash is a service that is generally not included on a Linux server, so you have to install it on the EC2 instance yourself. Logstash works in conjunction with Elasticsearch to feed log data into the cluster.

To install the service, create a repo file for Logstash on the server in /etc/yum.repos.d/ (for example, logstash.repo) and put the following in it.

[logstash-6.x]
name=Elastic repository for 6.x packages
baseurl=https://artifacts.elastic.co/packages/6.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
autorefresh=1
type=rpm-md

Install it using

sudo yum install logstash
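If yum complains about package signatures, note that the repo file above sets gpgcheck=1, so the Elastic signing key (the same gpgkey URL from the repo file) has to be imported before installing:

```shell
# Import the Elastic GPG signing key required by gpgcheck=1 in the repo file
sudo rpm --import https://artifacts.elastic.co/GPG-KEY-elasticsearch
# Then install Logstash from the repo configured above
sudo yum install -y logstash
```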

Update the Logstash pipeline configuration and point it at your log file and your Elasticsearch domain endpoint. Note that the amazon_es output used below comes from the logstash-output-amazon_es plugin, which may need to be installed separately with the logstash-plugin tool.

input {
  file {
    path => "<your_project_dir>/logfile.log"
    start_position => "beginning"
  }
}
output {
  amazon_es {
    hosts => ["ES-DOMAIN-ENDPOINT-URL"]
    region => "<region-name>"
    # index names must be lowercase in Elasticsearch
    index => "index-%{+YYYY.MM.dd}"
    aws_access_key_id => 'KEY'
    aws_secret_access_key => 'SECRET'
  }
}
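The %{+YYYY.MM.dd} sprintf in the index option makes Logstash write to one index per day, derived from each event's timestamp (and remember that Elasticsearch index names must be lowercase). For events stamped today, the resulting index name is equivalent to:

```shell
# Logstash expands %{+YYYY.MM.dd} from the event's @timestamp;
# for a "today" event this shell command produces the same name:
date +"index-%Y.%m.%d"
```

Daily indices keep individual indices small and make it easy to delete old log data for retention purposes.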

Start the Logstash service with

sudo service logstash start

This should start your Logstash service, which will ship your logs to Elasticsearch; they will then be visible in Kibana.
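If the service fails to start, it saves time to let Logstash validate the pipeline file first; --config.test_and_exit parses the config and reports errors without starting the pipeline. The config path below is an assumption — adjust it to wherever you saved yours.

```shell
# Parse the pipeline config and exit; reports "Configuration OK" on success
sudo /usr/share/logstash/bin/logstash \
  -f /etc/logstash/conf.d/es.conf --config.test_and_exit
```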

Connecting to Kibana

We created our ES cluster with VPC-only access, so you will not be able to connect to the cluster directly from your machine. You will need a jump server (an EC2 instance inside the VPC) to reach the cluster.

You should now be able to connect through an SSH tunnel from your machine:

ssh -i ~/.ssh/your-key.pem ec2-user@your-ec2-instance-public-ip -N -L 9200:vpc-your-amazon-es-domain.region.es.amazonaws.com:443

You can reach the Kibana login screen by opening

https://localhost:9200/_plugin/kibana
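With the tunnel from the previous step still open, you can sanity-check the connection before opening Kibana. A quick curl against the tunnelled endpoint should return the cluster's JSON banner (-k skips certificate verification, since the domain's certificate will not match localhost):

```shell
# Cluster info via the SSH tunnel
curl -sk https://localhost:9200
# Cluster health summary (status: green / yellow / red)
curl -sk "https://localhost:9200/_cluster/health?pretty"
```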

All this looks great!
You have just set up an Elasticsearch cluster on AWS.

Next actions:
Create index patterns in Kibana to organize your logs. You can also filter the logs using queries in the Kibana console.
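As a starting point for filtering, the daily indices can also be queried directly over the tunnel with the standard _search API. The index pattern and field name below are assumptions — match them to what your Logstash config actually writes.

```shell
# Search all daily indices for lines containing ERROR
# (Lucene query-string syntax; "message" is Logstash's default field for file input)
curl -sk "https://localhost:9200/index-*/_search?q=message:ERROR&pretty"
```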

You can also find the official documentation for Elasticsearch here.

Looking forward to your feedback and comments!

Rushabh Trivedi

AWS Certified Associate Architect, Cloud Solutions Lead, Angular Developer