Author Archives: Rudy Amid

Re-Index With Elasticsearch


When dealing with indices, it’s inevitable that a mapped field will eventually need to change. For example, in a firewall log, the default dynamic mapping stored a field like “RepeatCount” as text instead of an integer. To fix this, first write an ingest pipeline (using Kibana) to convert the field from text to integer:

PUT _ingest/pipeline/string-to-long
{
  "description": "convert RepeatCount field from string into long",
  "processors": [
    {
      "convert": {
        "field": "RepeatCount",
        "type": "long",
        "ignore_missing": true
      }
    }
  ]
}
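
Before reindexing anything, the pipeline can be tested with the simulate API. The sample document below is made up just to confirm that the string "3" comes out as a long:

POST _ingest/pipeline/string-to-long/_simulate
{
  "docs": [
    {
      "_source": {
        "RepeatCount": "3"
      }
    }
  ]
}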

Next, run a POST _reindex command to copy the old index into a new one, running the pipeline to convert the field along the way:

POST _reindex
{
  "source": {
    "index": "fwlogs-2019.02.01"
  },
  "dest": {
    "index": "fwlogs-2019.02.01-v2",
    "pipeline": "string-to-long"
  }
}
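
Once the reindex finishes, the field mapping on the new index can be checked to confirm RepeatCount now comes back as long (assuming no index template forces it back to text):

GET fwlogs-2019.02.01-v2/_mapping/field/RepeatCount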

If there are multiple indices to convert, such as “fwlogs-2019.02.01”, “fwlogs-2019.02.02”, and so on, it’s easier to use a shell script that reindexes each one in turn:

#!/bin/sh
# Reindex each index listed (one name per line) in rlist.txt into a "-v2"
# copy, applying the string-to-long pipeline during the copy.
LIST=$(cat rlist.txt)
for index in $LIST; do
  curl -H "Content-Type: application/json" --user elastic:password \
    -XPOST "https://mysearch.domain.net:9200/_reindex?pretty" -d'{
    "source": {
      "index": "'$index'"
    },
    "dest": {
      "index": "'$index'-v2",
      "pipeline": "string-to-long"
    }
  }'
done
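
For reference, rlist.txt is just a plain list of the old index names, one per line. The entries below are made-up examples following the naming pattern above:

fwlogs-2019.02.01
fwlogs-2019.02.02
fwlogs-2019.02.03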

Finally, clean up the old indices by deleting them. It’s tempting to use Kibana to DELETE fwlogs-2019.02*, but beware: the new indices also match that pattern (they differ only by the “-v2” suffix) and would be deleted along with the old ones if the wildcard is used. Instead, use a shell script that deletes only the names explicitly listed in the txt file:

#!/bin/sh
# Delete only the old (pre-reindex) index names listed in rlist.txt
LIST=$(cat rlist.txt)
for index in $LIST; do
  curl --user elastic:password -XDELETE "https://mysearch.domain.net:9200/$index"
done
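
Before running that delete script, a quick sanity check with the _cat/indices API confirms each old index has a “-v2” copy with a matching document count:

GET _cat/indices/fwlogs-2019.02*?v&h=index,docs.count&s=index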

Hype Cycle 2018 For Web Applications

Gartner Hype Cycle diagram (by Jeremykemp at English Wikipedia, CC BY-SA 3.0)

Technology changes quickly. This is especially true in web development. With companies such as Google, Facebook, Amazon, and Netflix leading the way, there will always be a “next big thing” every IT professional has to pay attention to. Depending on size and budget, not every company can invest in the latest technology trend. The question always asked is: “What can we invest in?” As a guideline, Gartner annually publishes its famous Hype Cycle, which charts the rise (or decline) of technologies. Those on the cutting edge will chase anything climbing toward the “Peak of Inflated Expectations”, where the technology is hottest. However, the most interesting set are the technologies sliding into the “Trough of Disillusionment”. In 2018, for web applications, those were:

  • Point-of-Decision HTAP
  • Cloud-Native Application Architecture
  • Reactive Programming
  • Microservices
  • Mesh App and Service Architecture
  • Public Web APIs
  • Miniservices

Enterprises have already started to invest in those declining trendy ideas. However, to get to full adoption, IT professionals have to familiarize themselves with (and embrace) the new technology. It’ll be a difficult journey, but it may be worth the investment. At this point a great deal of material is available, since these concepts have been around for a few years already; this stage is known as the “Slope of Enlightenment”. To get started, here are some suggestions on which presentations to listen to:

After listening to the presentations, one can gauge the trend and decide where and how to take enterprise environments to the next level. It will take more time to reach the “Plateau of Productivity”, where value is realized by streamlining execution for long-term production use.

Enterprise sure has plenty of work to do!

Are the Russian (Hackers) Still Coming?

The headlines in the news these days are about hackers attempting to infiltrate sites, mostly from Russia or China. The targets include many American sites, both government and private. How do IT cybersecurity folks know if they’re coming? Going through the application logs for all attempts is a start. However, the best source of knowledge is the first line of defense: the firewall. So it’s best to have a tool like Elasticsearch to turn the firewall logs into a readable report and figure out which ports are being probed.
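
As a sketch, a terms aggregation over the denied firewall events surfaces the most-probed ports. The index pattern and field names below ("Action", "DestinationPort") are assumptions; the real names depend on how the firewall logs were ingested:

GET fwlogs-*/_search
{
  "size": 0,
  "query": {
    "match": { "Action": "deny" }
  },
  "aggs": {
    "probed_ports": {
      "terms": {
        "field": "DestinationPort",
        "size": 10
      }
    }
  }
}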

It’s imperative that any exposed ports are denied on the firewall side to prevent a successful hack. In a real-world example from the past 7 days, hackers were scanning for commonly targeted services such as telnet, RDP (Windows Remote Desktop), Microsoft SQL Server, and SMTP.

Thankfully, those ports are being blocked on the firewall. Unfortunately, this does not deter the attackers from trying again and again. Network and system admins must do their due diligence in controlling access and patching applications. No matter the business requirements, security must take precedence, and IT professionals must have the tools to detect, analyze, and protect.