一尘不染

Using shingles and stop words with Elasticsearch and Lucene 4.4

elasticsearch

In an index I have built, I am interested in running a query and then (using facets) returning the shingles of that query. Here is the analyzer I am using on the text:

{
  "settings": {
    "analysis": {
      "analyzer": {
        "shingleAnalyzer": {
          "tokenizer": "standard",
          "filter": [
            "standard",
            "lowercase",
            "custom_stop",
            "custom_shingle",
            "custom_stemmer"
          ]
        }
      },
      "filter": {
        "custom_stemmer" : {
            "type": "stemmer",
            "name": "english"
        },
        "custom_stop": {
            "type": "stop",
            "stopwords": "_english_"
        },
        "custom_shingle": {
            "type": "shingle",
            "min_shingle_size": "2",
            "max_shingle_size": "3"
        }
      }
    }
  }
}
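
To check which terms this analyzer actually emits, the _analyze API can be pointed at it. The sketch below assumes an index named my_index created with the settings above, and an Elasticsearch release from the Lucene 4.4 era (0.90.x/1.x), where _analyze takes the sample text as the raw request body:

# Create the index with the analysis settings above (stored in settings.json)
curl -XPUT 'localhost:9200/my_index' -d @settings.json

# Run a sample phrase through the custom analyzer and inspect the emitted tokens
curl -XGET 'localhost:9200/my_index/_analyze?analyzer=shingleAnalyzer' -d 'red and yellow'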

The main problem is that, as of Lucene 4.4, the stop filter no longer supports the enable_position_increments parameter that was used to eliminate shingles containing stop words.
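
For reference, the pre-4.4 configuration this refers to looked roughly like the following (my reconstruction of the old setting; it is ignored or rejected on newer versions):

"custom_stop": {
    "type": "stop",
    "stopwords": "_english_",
    "enable_position_increments": false
}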

Instead, for the text "red and yellow", I now get results like:

"terms": [
    {
        "term": "red",
        "count": 43
    },
    {
        "term": "red _",
        "count": 43
    },
    {
        "term": "red _ yellow",
        "count": 43
    },
    {
        "term": "_ yellow",
        "count": 42
    },
    {
        "term": "yellow",
        "count": 42
    }
]

Naturally, this greatly skews the number of shingles returned. Is there a way, post-Lucene 4.4, to manage this without post-processing the results?
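
For context, counts like the ones above would come from a terms facet over the shingled field. A request along these lines is presumably what produces them (the field name my_shingle_field is a placeholder, and this uses the pre-2.0 facets API the question refers to):

{
  "query": {
    "match": { "my_shingle_field": "red and yellow" }
  },
  "facets": {
    "shingles": {
      "terms": {
        "field": "my_shingle_field",
        "size": 20
      }
    }
  }
}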



1 Answer

一尘不染

Probably not the most elegant solution, but the most straightforward one is to add another filter to the analyzer that kills the "_" filler tokens. In the example below I call it "kill_fillers":

   "shingleAnalyzer": {
      "tokenizer": "standard",
      "filter": [
        "standard",
        "lowercase",
        "custom_stop",
        "custom_shingle",
        "custom_stemmer",
        "kill_fillers"
       ],
       ...

Then add the "kill_fillers" filter to your list of filters:

"filters":{
...
  "kill_fillers": {
    "type": "pattern_replace",
    "pattern": ".*_.*",
    "replace": "",
  },
...
}
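
For completeness, merging the answer's two fragments with the question's original settings gives roughly the following analysis block (an untested sketch):

{
  "settings": {
    "analysis": {
      "analyzer": {
        "shingleAnalyzer": {
          "tokenizer": "standard",
          "filter": [
            "standard",
            "lowercase",
            "custom_stop",
            "custom_shingle",
            "custom_stemmer",
            "kill_fillers"
          ]
        }
      },
      "filter": {
        "custom_stemmer": {
          "type": "stemmer",
          "name": "english"
        },
        "custom_stop": {
          "type": "stop",
          "stopwords": "_english_"
        },
        "custom_shingle": {
          "type": "shingle",
          "min_shingle_size": "2",
          "max_shingle_size": "3"
        },
        "kill_fillers": {
          "type": "pattern_replace",
          "pattern": ".*_.*",
          "replacement": ""
        }
      }
    }
  }
}

Note that pattern_replace rewrites matching tokens to empty strings rather than dropping them outright, so an empty term may still appear in the token stream; the intent here is simply to keep the "_" fillers out of the visible shingles.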