We use Elasticsearch for many purposes within our organisation, and thanks to Kibana, we sometimes also use it to store data that isn't necessarily time-series data, but simply NoSQL data that we want to visualize. We had a business requirement to keep multiple backups for one of these data sets. So, we decided to take a dump of these indices once every day and store it in an Amazon S3 bucket, in case the worst should happen.
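As a sketch of what such a daily dump can look like, Elasticsearch's snapshot API can write directly to S3 via the repository-s3 plugin. The repository name, bucket name, and index pattern below are placeholders, not the actual values we use:

```shell
# Register an S3 snapshot repository once (assumes the repository-s3
# plugin is installed; "my-es-backups" is a hypothetical bucket name).
curl -X PUT "localhost:9200/_snapshot/s3_backup" \
  -H 'Content-Type: application/json' \
  -d '{"type": "s3", "settings": {"bucket": "my-es-backups"}}'

# Snapshot the indices we care about, with a date-stamped name.
# Running this from a daily cron job gives one backup per day.
curl -X PUT "localhost:9200/_snapshot/s3_backup/snapshot-$(date +%F)" \
  -H 'Content-Type: application/json' \
  -d '{"indices": "my-index-*"}'
```

Snapshots are incremental, so daily runs only upload segments that changed since the previous snapshot.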
Continuing from the last post, where we created a logging pipeline for storing and visualizing access logs in real time: that pipeline still had some issues. In this post, I'll build on it and get rid of some of them.
So I work for an ad-tech company, and in this industry we receive a huge volume of requests. At our peak, we serve around 35,000 requests per second, which is a bigger number than you might initially realize.
On a network with a large number of clients, DHCP is a must: it dynamically assigns IP addresses to all connected clients to avoid address conflicts, and automates other required configuration such as the subnet mask, default gateway, and DNS servers. Nowadays, almost every router, even a small home router, has a built-in DHCP server that assigns addresses to connected clients. Still, why not learn how to set one up for yourself?
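To give a taste of what's involved, a minimal ISC DHCP server (dhcpd) configuration covering the pieces mentioned above might look like this. All addresses and ranges here are placeholders for your own network:

```conf
# /etc/dhcp/dhcpd.conf -- minimal sketch, not a production config
subnet 192.168.1.0 netmask 255.255.255.0 {
  range 192.168.1.100 192.168.1.200;            # pool of dynamically leased addresses
  option routers 192.168.1.1;                   # default gateway handed to clients
  option domain-name-servers 8.8.8.8, 8.8.4.4;  # DNS servers handed to clients
  option subnet-mask 255.255.255.0;
  default-lease-time 600;                       # lease length in seconds
  max-lease-time 7200;
}
```

The `range` directive is what prevents address conflicts: the server hands out leases only from that pool and tracks which addresses are taken.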
So, in the last post, we finished setting up the infrastructure for our blog. In this post, we'll deploy the blog to our server and take it live!
So, after a long time and very little deliberation, I finally decided to stop using Google Blogger and move to a "self-hosted" solution. The idea was to set up most things from scratch and have much more control over my own blog. So, I decided to give static site generators a try. I looked at a few different ones and in the end went with Hugo, because it could be version controlled, and it was fast, secure, easy to understand, and flexible…