Try 6 out of 6
Exception:
Cannot execute:
['/usr/lib/spark2/bin/spark-submit',
 '--master', 'yarn',
 '--conf', 'spark.yarn.maxAppAttempts=1',
 '--conf', 'spark.pyspark.python=venv/bin/python3.7',
 '--conf', 'spark.jars.ivySettings=/etc/maven/ivysettings.xml',
 '--conf', 'spark.jars.ivy=/tmp/airflow_ivy2',
 '--conf', 'spark.dynamicAllocation.maxExecutors=20',
 '--conf', 'spark.executor.memoryOverhead=1g',
 '--conf', 'spark.executor.memory=4g',
 '--files', '/etc/mysql/conf.d/analytics-search-client.cnf#mysql.cnf,/srv/mediawiki-config/dblists/s1.dblist,/srv/mediawiki-config/dblists/s2.dblist,/srv/mediawiki-config/dblists/s3.dblist,/srv/mediawiki-config/dblists/s4.dblist,/srv/mediawiki-config/dblists/s5.dblist,/srv/mediawiki-config/dblists/s6.dblist,/srv/mediawiki-config/dblists/s7.dblist,/srv/mediawiki-config/dblists/s8.dblist',
 '--py-files', '/srv/deployment/wikimedia/discovery/analytics/spark/wmf_spark.py',
 '--archives', 'hdfs://analytics-hadoop/wmf/discovery/current/environments/mw_sql_to_hive/venv.zip#venv',
 '--packages', 'mysql:mysql-connector-java:8.0.19',
 '--name', 'airflow-spark',
 '--queue', 'root.default',
 '--deploy-mode', 'cluster',
 '/srv/deployment/wikimedia/discovery/analytics/spark/mw_sql_to_hive.py',
 '--mysql-defaults-file', 'mysql.cnf',
 '--dblists', 's1.dblist,s2.dblist,s3.dblist,s4.dblist,s5.dblist,s6.dblist,s7.dblist,s8.dblist',
 '--query', '\n SELECT pp_page as page_id, page_namespace, pp_value as wikibase_item\n FROM page_props\n JOIN page ON page_id = pp_page\n WHERE pp_propname="wikibase_item"\n ',
 '--output-partition', 'discovery.wikibase_item/date=20201011'].
Error code is: 1.
Host: an-airflow1001.eqiad.wmnet
Log file: /var/log/airflow/ores_predictions_weekly/extract_wikibase_item/2020-10-11T00:00:00+00:00.log
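For reference, this is the SQL passed via --query above, with the \n escapes expanded (text otherwise verbatim):

    SELECT pp_page as page_id, page_namespace, pp_value as wikibase_item
    FROM page_props
    JOIN page ON page_id = pp_page
    WHERE pp_propname="wikibase_item"

Judging from the script name and arguments, the job appears to run this query against each wiki listed in the dblists and write the combined result to the Hive partition discovery.wikibase_item/date=20201011.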