Problem description
I received the following error when trying to connect a Dockerized fscrawler to a Dockerized elasticsearch:
[f.p.e.c.f.c.ElasticsearchClientManager] failed to create elasticsearch client, disabling crawler...
[f.p.e.c.f.FsCrawler] Fatal error received while running the crawler: [Connection refused]
Solution
The first time you run fscrawler (i.e. docker-compose run fscrawler), it creates /config/{fscrawler_job}/_settings.yml with the following default settings:
elasticsearch:
  nodes:
    - url: "http://127.0.0.1:9200"
This causes fscrawler to try to connect to localhost (i.e. 127.0.0.1). When fscrawler runs inside a Docker container, however, this fails, because it is trying to connect to the CONTAINER's localhost. In my case this was especially confusing because Elasticsearch WAS reachable on localhost, but on my physical machine's localhost rather than the container's. After changing the URL, fscrawler could connect to the network address where Elasticsearch actually lives:
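To see the difference for yourself, you can compare reachability from the host with reachability from inside the crawler container (a diagnostic sketch; it assumes the compose file below with its published port 9200 and shared esnet network, that the fscrawler service is running, and that curl is available in the fscrawler image):

```shell
# From the HOST: the published port answers, because compose maps 9200:9200.
curl -s http://127.0.0.1:9200

# From INSIDE the fscrawler container: 127.0.0.1 is the container itself,
# so the connection is refused -- nothing listens on 9200 in that container.
docker-compose exec fscrawler curl -s http://127.0.0.1:9200

# The service name "elasticsearch" resolves via Docker's embedded DNS on the
# shared network, so this reaches the Elasticsearch node.
docker-compose exec fscrawler curl -s http://elasticsearch:9200
```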
elasticsearch:
  nodes:
    - url: "http://elasticsearch:9200"
I used the following Docker image: https://hub.docker.com/r/toto1310/fscrawler
# FILE: docker-compose.yml
version: '2.2'
services:
  # FSCrawler
  fscrawler:
    image: toto1310/fscrawler
    container_name: fscrawler
    volumes:
      - ${PWD}/config:/root/.fscrawler
      - ${PWD}/data:/tmp/es
    networks:
      - esnet
    command: fscrawler job_name
  # Elasticsearch Cluster
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.3.2
    container_name: elasticsearch
    environment:
      - node.name=elasticsearch
      - discovery.seed_hosts=elasticsearch2
      - cluster.initial_master_nodes=elasticsearch,elasticsearch2
      - cluster.name=docker-cluster
      - bootstrap.memory_lock=true
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
    ulimits:
      memlock:
        soft: -1
        hard: -1
    volumes:
      - esdata01:/usr/share/elasticsearch/data
    ports:
      - 9200:9200
    networks:
      - esnet
  elasticsearch2:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.3.2
    container_name: elasticsearch2
    environment:
      - node.name=elasticsearch2
      - discovery.seed_hosts=elasticsearch
      - cluster.initial_master_nodes=elasticsearch,elasticsearch2
      - cluster.name=docker-cluster
      - bootstrap.memory_lock=true
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
    ulimits:
      memlock:
        soft: -1
        hard: -1
    volumes:
      - esdata02:/usr/share/elasticsearch/data
    networks:
      - esnet
volumes:
  esdata01:
    driver: local
  esdata02:
    driver: local
networks:
  esnet:
Ran docker-compose up elasticsearch elasticsearch2 to bring up the elasticsearch nodes.
Ran docker-compose run fscrawler to create _settings.yml.
Edited _settings.yml to:
elasticsearch:
  nodes:
    - url: "http://elasticsearch:9200"
Started fscrawler with docker-compose up fscrawler.
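The steps above can be run end to end as the following shell session (a sketch; the sed edit is just one way to rewrite the generated URL, and the config path assumes the ${PWD}/config volume mapping and the job_name job from the compose file):

```shell
# 1. Bring up the two Elasticsearch nodes in the background.
docker-compose up -d elasticsearch elasticsearch2

# 2. First run of fscrawler generates the default settings file
#    (it will fail to connect, which is expected at this point).
docker-compose run fscrawler

# 3. Point the generated settings at the elasticsearch service name
#    instead of the container's own localhost.
sed -i 's|http://127.0.0.1:9200|http://elasticsearch:9200|' \
  config/job_name/_settings.yml

# 4. Start the crawler with the corrected settings.
docker-compose up fscrawler
```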