How to get an Airflow Docker container to connect to a Postgres Docker container when both are created with Terraform
Versions:
Terraform == 0.12
docker == 19.03.8
python == 3.8
postgreSQL == 9.6
apache-airflow == 1.10.10
Description:
The goal is to create containers for local development with Apache Airflow. The Airflow container uses the puckel/docker-airflow Docker image, but with my own entrypoint.sh (see below). I am using a local Postgres database as the backend that stores Airflow's metadata, since one is required by the Airflow LocalExecutor. Finally, I am using Terraform to create the local Postgres database as well as the Airflow and Postgres containers.
Problem:
The error is raised in the Airflow container's entrypoint.sh; specifically, airflow initdb
raises an error related to the sql_alchemy_conn string. The sql_alchemy_conn string is exported as an environment variable in the container's entrypoint.sh before airflow initdb is run.
I have tried the following sql_alchemy_conn strings (the actual sensitive values are not shown):
sql_alchemy_conn="postgresql://postgres_user:password@postgres:5432/postgres_db"
sql_alchemy_conn="postgresql+psycopg2://postgres_user:password@postgres:5432/postgres_db"
This resulted in the error:
sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) could not translate host name "postgres" to address: Name or service not known
I have also tried the following sql_alchemy_conn strings:
sql_alchemy_conn="postgresql://postgres_user:password@localhost:5432/postgres_db"
sql_alchemy_conn="postgresql+psycopg2://postgres_user:password@localhost:5432/postgres_db"
This resulted in the error:
sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) could not connect to server: Connection refused
Is the server running on host "localhost" (127.0.0.1) and accepting
TCP/IP connections on port 5432?
could not connect to server: Cannot assign requested address
Is the server running on host "localhost" (::1) and accepting
TCP/IP connections on port 5432?
Relevant files:
main.tf: the Terraform file used to create the resources
locals {
  cwd = path.cwd
}

provider "postgresql" {
  host     = "localhost"
  username = var.postgres_username
  password = var.postgres_password
  port     = var.port
  sslmode  = "disable"
}

resource "postgresql_database" "airflow_backend" {
  name  = var.airflow_backend_db
  owner = var.postgres_username
}

provider "docker" {}

resource "docker_image" "postgres" {
  name = "postgres:latest"
}

resource "docker_image" "airflow" {
  name = "puckel/docker-airflow:latest"
}

resource "docker_container" "postgres" {
  image = docker_image.postgres.name
  name  = "postgres"

  env = [
    format("POSTGRES_USER=%s", var.postgres_user),
    format("POSTGRES_PASSWORD=%s", var.postgres_password),
    format("POSTGRES_DB=%s", var.postgres_db),
  ]

  log_opts = {
    max-size = "10m"
    max-file = "3"
  }

  depends_on = [postgresql_database.airflow_backend]

  ports {
    internal = var.postgres_port
  }
}

resource "docker_container" "airflow" {
  name    = "airflow"
  image   = docker_image.airflow.name
  restart = "always"

  depends_on = [docker_container.postgres]

  env = [
    format("AWS_ACCESS_KEY=%s", var.aws_access_key),
    format("AWS_SECRET_KEY=%s", var.aws_secret_key),
    format("POSTGRES_USER=%s", var.postgres_db),
    format("POSTGRES_HOST=%s", "postgres"),
    format("POSTGRES_PORT=%s", var.postgres_port),
    format("AIRFLOW_HOME=%s", var.airflow_home),
    format("AIRFLOW__CORE__EXECUTOR=%s", var.executor),
    format("AIRFLOW__CORE__LOAD_EXAMPLES=%s", var.load_examples),
    format("AIRFLOW__CORE__DAGS_FOLDER=%s", var.container_dags_path),
    format("AIRFLOW__CORE__PLUGINS_FOLDER=%s", var.container_plugins_path),
  ]

  log_opts = {
    max-size = "10m"
    max-file = "3"
  }

  volumes {
    host_path      = format(var.host_volumes_path, local.cwd)
    container_path = var.container_volumes_path
  }

  ports {
    internal = 8080
    external = 8080
  }

  command    = ["webserver"]
  entrypoint = ["bash", "entrypoint.sh"]

  healthcheck {
    test     = ["CMD-SHELL", "[ -f /usr/local/airflow/airflow-webserver.pid ]"]
    interval = "30s"
    timeout  = "30s"
    retries  = 3
  }
}
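For context, the two errors above are consistent with Docker networking defaults: containers created without an explicit network land on Docker's default bridge, which does not resolve container names, so the hostname "postgres" is unknown; and "localhost" inside the Airflow container refers to that container itself, where nothing listens on 5432. A hedged sketch of a possible fix (the network name airflow_net is my own, not from the original configuration) is to create a user-defined network in Terraform and attach both containers to it, e.g. via the Docker provider's networks_advanced block (older provider versions use a networks argument instead):

```hcl
# Sketch only: a user-defined bridge network gives the containers on it
# DNS-based name resolution, so "postgres" becomes reachable by name.
resource "docker_network" "airflow_net" {
  name = "airflow_net"
}

# Then, inside both docker_container blocks ("postgres" and "airflow"):
#   networks_advanced {
#     name = docker_network.airflow_net.name
#   }
```

With both containers attached, the original sql_alchemy_conn string using the host "postgres" should resolve as intended.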
Relevant section of the Airflow container's entrypoint.sh:
AIRFLOW__CORE__SQL_ALCHEMY_CONN="postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@${POSTGRES_HOST}:${POSTGRES_PORT}/${POSTGRES_DB}"
export AIRFLOW__CORE__SQL_ALCHEMY_CONN

case "$1" in
  webserver)
    # where the error arises
    airflow initdb
    airflow upgradedb
    if [ "$AIRFLOW__CORE__EXECUTOR" = "LocalExecutor" ]; then
      airflow scheduler &
    fi
    exec airflow webserver -p 8080
    ;;
  worker|scheduler)
    sleep 10
    exec airflow "$@"
    ;;
  flower)
    sleep 10
    exec airflow "$@"
    ;;
  version)
    exec airflow "$@"
    ;;
  *)
    exec "$@"
    ;;
esac
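One more note: even with name resolution working, depends_on only orders container creation; it does not wait for Postgres to accept connections, so airflow initdb can still race the database's startup. A minimal sketch of a wait loop that entrypoint.sh could run before airflow initdb (the wait_for_port helper is my own name, not part of puckel's image; it relies on bash's /dev/tcp pseudo-device):

```shell
#!/usr/bin/env bash
# Sketch: block until a TCP port accepts connections, or give up.
# Usage: wait_for_port HOST PORT RETRIES  -> returns 0 on success, 1 on timeout
wait_for_port() {
  local host="$1" port="$2" retries="$3" i=0
  while [ "$i" -lt "$retries" ]; do
    # bash's /dev/tcp attempts a TCP connection when the fd is opened
    if (exec 3<> "/dev/tcp/${host}/${port}") 2>/dev/null; then
      return 0
    fi
    i=$((i + 1))
    sleep 1
  done
  return 1
}

# e.g. before "airflow initdb":
# wait_for_port "${POSTGRES_HOST}" "${POSTGRES_PORT}" 30 || exit 1
```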