一尘不染

How to send the contents of a queue to an Elasticsearch index using Logstash

elasticsearch

I have a working Logstash instance that consumes two RabbitMQ queues and sends the events to Elasticsearch. Here is my logstash.conf file:

input {
  rabbitmq {
    host => 'rabbit'
    durable => true
    user => 'user'
    queue => 'dev-user_trace'
    password => 'pass'
  }
  rabbitmq {
    host => 'rabbit'
    durable => true
    user => 'user'
    queue => 'min-price-queue'
    password => 'pass'
  }

}
filter{
}
output {
  stdout { codec => json }
  elasticsearch {
    hosts => ["elasticsearch"]
    index => "eventss-%{+YYYY.MM.dd}"
  }
}

Now I have another queue, and I want to send its contents to a different Elasticsearch index. My question is: how do I route specific entries to a specific index? Or do I need another Logstash instance?

Thanks in advance.


2020-06-22

1 Answer

一尘不染

A good start. Now you just need to add a `type` to each input, and then route events of a given type to the appropriate output, like this:

input {
  rabbitmq {
    host => 'rabbit'
    durable => true
    user => 'user'
    queue => 'dev-user_trace'
    password => 'pass'
    type => 'traces'               # <-- add this
  }
  rabbitmq {
    host => 'rabbit'
    durable => true
    user => 'user'
    queue => 'min-price-queue'
    password => 'pass'
    type => 'prices'               # <-- add this
  }

}
filter{
}
output {
  stdout { codec => json }

  if [type] == 'traces' {          # <-- check type
     elasticsearch{
       hosts => ["host1:9200"]
       index => "index1-%{+YYYY.MM.dd}"
     }
  }

  if [type] == 'prices' {          # <-- check type
     elasticsearch{
       hosts => ["host2:9200"]
       index => "index2-%{+YYYY.MM.dd}"
     }
  }
}
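The conditional routing in the output section above can be sketched as a tiny dispatcher in Python (a hedged illustration of the idea, not Logstash internals; the hosts and index names simply mirror the config above):

```python
# Map each event type to its (host, index prefix) target, mirroring
# the `if [type] == ...` conditionals in the Logstash output section.
TARGETS = {
    "traces": ("host1:9200", "index1"),
    "prices": ("host2:9200", "index2"),
}

def route(event: dict) -> tuple:
    """Return the (host, index prefix) an event would be sent to,
    based on the `type` field set by its RabbitMQ input."""
    return TARGETS[event["type"]]

print(route({"type": "traces"}))  # ('host1:9200', 'index1')
print(route({"type": "prices"}))  # ('host2:9200', 'index2')
```

Each input tags its events, and the output section dispatches on that tag, so both queues can be served by a single Logstash instance.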

Update

The approach above is the most general one, since it lets you configure the two outputs differently. As @pandaadb suggested, you can also use a single output and set a `type` that serves as the target index name:

input {
  rabbitmq {
    host => 'rabbit'
    durable => true
    user => 'user'
    queue => 'dev-user_trace'
    password => 'pass'
    type => 'index1'                    # <-- add this
  }
  rabbitmq {
    host => 'rabbit'
    durable => true
    user => 'user'
    queue => 'min-price-queue'
    password => 'pass'
    type => 'index2'                    # <-- add this
  }

}
filter{
}
output {
  stdout { codec => json }

  elasticsearch{
    hosts => ["localhost:9200"]
    index => "%{type}-%{+YYYY.MM.dd}"   # <-- use type here
  }
}
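For illustration, the way Logstash expands `%{type}-%{+YYYY.MM.dd}` into an index name can be approximated in Python (a sketch assuming the event's timestamp is "now" in UTC; `index_for` is a hypothetical helper, not a Logstash API):

```python
from datetime import datetime, timezone

def index_for(event: dict) -> str:
    """Substitute the event's `type` field and append the UTC date
    formatted as YYYY.MM.dd, like the sprintf pattern in the config."""
    date = datetime.now(timezone.utc).strftime("%Y.%m.%d")
    return "%s-%s" % (event["type"], date)

# An event from 'dev-user_trace' (type => 'index1') goes to index1-<date>;
# one from 'min-price-queue' (type => 'index2') goes to index2-<date>.
print(index_for({"type": "index1"}))
print(index_for({"type": "index2"}))
```

With this variant a single elasticsearch output suffices, at the cost of both event streams sharing the same output settings.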
2020-06-22