2018-02-06 09:35:48,449 [salt.loaded.int.states.file][WARNING ][1437] State for file: /etc/ssl/certs/ca-salt_master_ca.crt - Neither 'source' nor 'contents' nor 'contents_pillar' nor 'contents_grains' was defined, yet 'replace' was set to 'True'. As there is no source to replace the file with, 'replace' has been set to 'False' to avoid reading the file unnecessarily.
2018-02-06 09:35:48,476 [salt.loaded.int.module.cmdmod][ERROR   ][1437] Command 'while true; do salt-call saltutil.running|grep fun: && continue; salt-call --local service.restart salt-minion; break; done' failed with return code: None
2018-02-06 09:35:51,444 [salt.loaded.int.module.cmdmod][INFO    ][2067] Executing command ['systemctl', 'status', 'salt-minion.service', '-n', '0'] in directory '/root'
2018-02-06 09:35:51,461 [salt.loaded.int.module.cmdmod][INFO    ][2067] Executing command ['systemctl', 'is-enabled', 'salt-minion.service'] in directory '/root'
2018-02-06 09:35:51,491 [salt.loaded.int.module.cmdmod][INFO    ][2067] Executing command ['systemd-run', '--scope', 'systemctl', 'restart', 'salt-minion.service'] in directory '/root'
2018-02-06 09:35:51,529 [salt.utils.parsers][WARNING ][1203] Minion received a SIGTERM. Exiting.
2018-02-06 09:35:51,926 [salt.cli.daemons ][INFO    ][2116] Setting up the Salt Minion "prx02.mcp-pike-ovs-ha.local"
2018-02-06 09:35:52,044 [salt.cli.daemons ][INFO    ][2116] Starting up the Salt Minion
2018-02-06 09:35:52,045 [salt.utils.event ][INFO    ][2116] Starting pull socket on /var/run/salt/minion/minion_event_02eab499dd_pull.ipc
2018-02-06 09:35:52,641 [salt.minion      ][INFO    ][2116] Creating minion process manager
2018-02-06 09:35:53,870 [salt.loader.192.168.11.2.int.module.cmdmod][INFO    ][2116] Executing command ['date', '+%z'] in directory '/root'
2018-02-06 09:35:53,892 [salt.utils.schedule][INFO    ][2116] Updating job settings for scheduled job: __mine_interval
2018-02-06 09:35:53,895 [salt.minion      ][INFO    ][2116] Added mine.update to scheduler
2018-02-06 09:35:53,901 [salt.minion      ][INFO    ][2116] Minion is starting as user 'root'
2018-02-06 09:35:53,920 [salt.minion      ][INFO    ][2116] Minion is ready to receive requests!
2018-02-06 09:35:54,922 [salt.utils.schedule][INFO    ][2116] Running scheduled job: __mine_interval
2018-02-06 09:35:56,838 [salt.minion      ][INFO    ][2116] User sudo_ubuntu Executing command state.apply with jid 20180206093556267240
2018-02-06 09:35:56,861 [salt.minion      ][INFO    ][2207] Starting a new job with PID 2207
2018-02-06 09:36:00,623 [salt.state       ][INFO    ][2207] Loading fresh modules for state activity
2018-02-06 09:36:00,669 [salt.fileclient  ][INFO    ][2207] Fetching file from saltenv 'base', ** done ** 'linux/init.sls'
2018-02-06 09:36:01,137 [salt.fileclient  ][INFO    ][2207] Fetching file from saltenv 'base', ** done ** 'linux/system/init.sls'
2018-02-06 09:36:01,272 [salt.fileclient  ][INFO    ][2207] Fetching file from saltenv 'base', ** done ** 'linux/system/env.sls'
2018-02-06 09:36:01,383 [salt.fileclient  ][INFO    ][2207] Fetching file from saltenv 'base', ** done ** 'linux/system/profile.sls'
2018-02-06 09:36:01,492 [salt.fileclient  ][INFO    ][2207] Fetching file from saltenv 'base', ** done ** 'linux/system/repo.sls'
2018-02-06 09:36:01,683 [salt.fileclient  ][INFO    ][2207] Fetching file from saltenv 'base', ** done ** 'linux/system/package.sls'
2018-02-06 09:36:01,800 [salt.fileclient  ][INFO    ][2207] Fetching file from saltenv 'base', ** done ** 'linux/system/timezone.sls'
2018-02-06 09:36:01,907 [salt.fileclient  ][INFO    ][2207] Fetching file from saltenv 'base', ** done ** 'linux/system/kernel.sls'
2018-02-06 09:36:02,060 [salt.fileclient  ][INFO    ][2207] Fetching file from saltenv 'base', ** done ** 'linux/system/cpu.sls'
2018-02-06 09:36:02,163 [salt.fileclient  ][INFO    ][2207] Fetching file from saltenv 'base', ** done ** 'linux/system/sysfs.sls'
2018-02-06 09:36:02,275 [salt.fileclient  ][INFO    ][2207] Fetching file from saltenv 'base', ** done ** 'linux/system/locale.sls'
2018-02-06 09:36:02,380 [salt.fileclient  ][INFO    ][2207] Fetching file from saltenv 'base', ** done ** 'linux/system/user.sls'
2018-02-06 09:36:02,511 [salt.fileclient  ][INFO    ][2207] Fetching file from saltenv 'base', ** done ** 'linux/system/group.sls'
2018-02-06 09:36:02,624 [salt.fileclient  ][INFO    ][2207] Fetching file from saltenv 'base', ** done ** 'linux/system/limit.sls'
2018-02-06 09:36:02,723 [salt.fileclient  ][INFO    ][2207] Fetching file from saltenv 'base', ** done ** 'linux/system/systemd.sls'
2018-02-06 09:36:02,831 [salt.fileclient  ][INFO    ][2207] Fetching file from saltenv 'base', ** done ** 'linux/system/apt.sls'
2018-02-06 09:36:03,759 [salt.fileclient  ][INFO    ][2207] Fetching file from saltenv 'base', ** done ** 'linux/network/init.sls'
2018-02-06 09:36:03,864 [salt.fileclient  ][INFO    ][2207] Fetching file from saltenv 'base', ** done ** 'linux/network/hostname.sls'
2018-02-06 09:36:03,971 [salt.fileclient  ][INFO    ][2207] Fetching file from saltenv 'base', ** done ** 'linux/network/host.sls'
2018-02-06 09:36:04,144 [salt.fileclient  ][INFO    ][2207] Fetching file from saltenv 'base', ** done ** 'linux/network/interface.sls'
2018-02-06 09:36:04,347 [salt.fileclient  ][INFO    ][2207] Fetching file from saltenv 'base', ** done ** 'linux/network/proxy.sls'
2018-02-06 09:36:04,464 [salt.fileclient  ][INFO    ][2207] Fetching file from saltenv 'base', ** done ** 'linux/storage/init.sls'
2018-02-06 09:36:04,581 [salt.fileclient  ][INFO    ][2207] Fetching file from saltenv 'base', ** done ** 'ntp/init.sls'
2018-02-06 09:36:04,604 [salt.fileclient  ][INFO    ][2207] Fetching file from saltenv 'base', ** done ** 'ntp/client.sls'
2018-02-06 09:36:04,652 [salt.fileclient  ][INFO    ][2207] Fetching file from saltenv 'base', ** done ** 'ntp/server.sls'
2018-02-06 09:36:04,692 [salt.state       ][INFO    ][2207] Running state [/etc/environment] at time 09:36:04.692852
2018-02-06 09:36:04,693 [salt.state       ][INFO    ][2207] Executing state file.blockreplace for /etc/environment
2018-02-06 09:36:04,702 [salt.state       ][INFO    ][2207] File changed:
--- 
+++ 
@@ -1 +1,4 @@
 PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games"
+# SALT MANAGED VARIABLES - DO NOT EDIT - START
+#
+# SALT MANAGED VARIABLES - END

2018-02-06 09:36:04,703 [salt.state       ][INFO    ][2207] Completed state [/etc/environment] at time 09:36:04.703510 duration_in_ms=10.658
2018-02-06 09:36:04,704 [salt.state       ][INFO    ][2207] Running state [/etc/profile.d] at time 09:36:04.703978
2018-02-06 09:36:04,704 [salt.state       ][INFO    ][2207] Executing state file.directory for /etc/profile.d
2018-02-06 09:36:04,706 [salt.state       ][INFO    ][2207] Directory /etc/profile.d is in the correct state
2018-02-06 09:36:04,706 [salt.state       ][INFO    ][2207] Completed state [/etc/profile.d] at time 09:36:04.706753 duration_in_ms=2.775
2018-02-06 09:36:05,247 [salt.state       ][INFO    ][2207] Running state [/etc/apt/apt.conf.d/99compression-workaround-salt] at time 09:36:05.247687
2018-02-06 09:36:05,248 [salt.state       ][INFO    ][2207] Executing state file.managed for /etc/apt/apt.conf.d/99compression-workaround-salt
2018-02-06 09:36:05,275 [salt.fileclient  ][INFO    ][2207] Fetching file from saltenv 'base', ** done ** 'linux/files/apt.conf'
2018-02-06 09:36:05,283 [salt.state       ][INFO    ][2207] File changed:
New file
2018-02-06 09:36:05,284 [salt.state       ][INFO    ][2207] Completed state [/etc/apt/apt.conf.d/99compression-workaround-salt] at time 09:36:05.284552 duration_in_ms=36.866
2018-02-06 09:36:05,285 [salt.state       ][INFO    ][2207] Running state [/etc/apt/apt.conf.d/99prefer_ipv4-salt] at time 09:36:05.285047
2018-02-06 09:36:05,285 [salt.state       ][INFO    ][2207] Executing state file.managed for /etc/apt/apt.conf.d/99prefer_ipv4-salt
2018-02-06 09:36:05,302 [salt.state       ][INFO    ][2207] File changed:
New file
2018-02-06 09:36:05,302 [salt.state       ][INFO    ][2207] Completed state [/etc/apt/apt.conf.d/99prefer_ipv4-salt] at time 09:36:05.302879 duration_in_ms=17.832
2018-02-06 09:36:05,304 [salt.state       ][INFO    ][2207] Running state [linux_repo_prereq_pkgs] at time 09:36:05.304195
2018-02-06 09:36:05,304 [salt.state       ][INFO    ][2207] Executing state pkg.installed for linux_repo_prereq_pkgs
2018-02-06 09:36:05,305 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-02-06 09:36:05,674 [salt.state       ][INFO    ][2207] All specified packages are already installed
2018-02-06 09:36:05,674 [salt.state       ][INFO    ][2207] Completed state [linux_repo_prereq_pkgs] at time 09:36:05.674793 duration_in_ms=370.596
2018-02-06 09:36:05,675 [salt.state       ][INFO    ][2207] Running state [/etc/apt/apt.conf.d/99proxies-salt-uca] at time 09:36:05.675456
2018-02-06 09:36:05,676 [salt.state       ][INFO    ][2207] Executing state file.absent for /etc/apt/apt.conf.d/99proxies-salt-uca
2018-02-06 09:36:05,676 [salt.state       ][INFO    ][2207] File /etc/apt/apt.conf.d/99proxies-salt-uca is not present
2018-02-06 09:36:05,677 [salt.state       ][INFO    ][2207] Completed state [/etc/apt/apt.conf.d/99proxies-salt-uca] at time 09:36:05.677283 duration_in_ms=1.827
2018-02-06 09:36:05,677 [salt.state       ][INFO    ][2207] Running state [/etc/apt/preferences.d/uca] at time 09:36:05.677752
2018-02-06 09:36:05,678 [salt.state       ][INFO    ][2207] Executing state file.absent for /etc/apt/preferences.d/uca
2018-02-06 09:36:05,678 [salt.state       ][INFO    ][2207] File /etc/apt/preferences.d/uca is not present
2018-02-06 09:36:05,679 [salt.state       ][INFO    ][2207] Completed state [/etc/apt/preferences.d/uca] at time 09:36:05.679163 duration_in_ms=1.411
2018-02-06 09:36:05,681 [salt.state       ][INFO    ][2207] Running state [apt-key adv --keyserver keyserver.ubuntu.com --recv EC4926EA] at time 09:36:05.681574
2018-02-06 09:36:05,682 [salt.state       ][INFO    ][2207] Executing state cmd.run for apt-key adv --keyserver keyserver.ubuntu.com --recv EC4926EA
2018-02-06 09:36:05,683 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'apt-key adv --keyserver keyserver.ubuntu.com --recv EC4926EA' in directory '/root'
2018-02-06 09:36:05,928 [salt.state       ][INFO    ][2207] {'pid': 2265, 'retcode': 0, 'stderr': 'gpg: requesting key EC4926EA from hkp server keyserver.ubuntu.com\ngpg: key EC4926EA: public key "Canonical Cloud Archive Signing Key <ftpmaster@canonical.com>" imported\ngpg: Total number processed: 1\ngpg:               imported: 1  (RSA: 1)', 'stdout': 'Executing: /tmp/tmp.PiZhqzlG7j/gpg.1.sh --keyserver\nkeyserver.ubuntu.com\n--recv\nEC4926EA'}
2018-02-06 09:36:05,929 [salt.state       ][INFO    ][2207] Completed state [apt-key adv --keyserver keyserver.ubuntu.com --recv EC4926EA] at time 09:36:05.929571 duration_in_ms=247.995
2018-02-06 09:36:05,935 [salt.state       ][INFO    ][2207] Running state [deb http://ubuntu-cloud.archive.canonical.com/ubuntu xenial-updates/pike main] at time 09:36:05.935812
2018-02-06 09:36:05,936 [salt.state       ][INFO    ][2207] Executing state pkgrepo.managed for deb http://ubuntu-cloud.archive.canonical.com/ubuntu xenial-updates/pike main
2018-02-06 09:36:06,041 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['apt-get', '-q', 'update'] in directory '/root'
2018-02-06 09:36:06,924 [salt.minion      ][INFO    ][2116] User sudo_ubuntu Executing command saltutil.find_job with jid 20180206093606350538
2018-02-06 09:36:06,949 [salt.minion      ][INFO    ][2686] Starting a new job with PID 2686
2018-02-06 09:36:06,964 [salt.minion      ][INFO    ][2686] Returning information for job: 20180206093606350538
2018-02-06 09:36:09,600 [salt.state       ][INFO    ][2207] {'repo': 'deb http://ubuntu-cloud.archive.canonical.com/ubuntu xenial-updates/pike main'}
2018-02-06 09:36:09,600 [salt.state       ][INFO    ][2207] Completed state [deb http://ubuntu-cloud.archive.canonical.com/ubuntu xenial-updates/pike main] at time 09:36:09.600675 duration_in_ms=3664.862
2018-02-06 09:36:09,601 [salt.state       ][INFO    ][2207] Running state [linux_extra_packages_purged] at time 09:36:09.601076
2018-02-06 09:36:09,601 [salt.state       ][INFO    ][2207] Executing state pkg.purged for linux_extra_packages_purged
2018-02-06 09:36:09,632 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', 'purge', 'cloud-init', 'unattended-upgrades'] in directory '/root'
2018-02-06 09:36:14,230 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-02-06 09:36:14,274 [salt.state       ][INFO    ][2207] {'removed': {}, 'installed': {'cloud-init': {'new': '', 'old': '17.1-46-g7acc9e68-0ubuntu1~16.04.1'}, 'ec2-init': {'new': '', 'old': '1'}, 'unattended-upgrades': {'new': '', 'old': '0.90ubuntu0.9'}}}
2018-02-06 09:36:14,295 [salt.state       ][INFO    ][2207] Loading fresh modules for state activity
2018-02-06 09:36:14,333 [salt.state       ][INFO    ][2207] Completed state [linux_extra_packages_purged] at time 09:36:14.333587 duration_in_ms=4732.509
2018-02-06 09:36:14,339 [salt.state       ][INFO    ][2207] Running state [linux_extra_packages_latest] at time 09:36:14.339546
2018-02-06 09:36:14,339 [salt.state       ][INFO    ][2207] Executing state pkg.latest for linux_extra_packages_latest
2018-02-06 09:36:14,794 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['apt-cache', '-q', 'policy', 'libapache2-mod-wsgi'] in directory '/root'
2018-02-06 09:36:14,844 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['apt-cache', '-q', 'policy', 'mcelog'] in directory '/root'
2018-02-06 09:36:14,903 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-02-06 09:36:14,934 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'libapache2-mod-wsgi', 'mcelog'] in directory '/root'
2018-02-06 09:36:17,033 [salt.minion      ][INFO    ][2116] User sudo_ubuntu Executing command saltutil.find_job with jid 20180206093616461708
2018-02-06 09:36:17,056 [salt.minion      ][INFO    ][3763] Starting a new job with PID 3763
2018-02-06 09:36:17,076 [salt.minion      ][INFO    ][3763] Returning information for job: 20180206093616461708
2018-02-06 09:36:19,130 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-02-06 09:36:19,177 [salt.state       ][INFO    ][2207] Made the following changes:
'libaprutil1-ldap' changed from 'absent' to '1.5.4-1build1'
'libapr1' changed from 'absent' to '1.5.2-3'
'libpython2.7' changed from 'absent' to '2.7.12-1ubuntu0~16.04.3'
'libapache2-mod-wsgi' changed from 'absent' to '4.3.0-1.1build1'
'apache2-api-20120211' changed from 'absent' to '1'
'libaprutil1' changed from 'absent' to '1.5.4-1build1'
'liblua5.1-0' changed from 'absent' to '5.1.5-8ubuntu1'
'libaprutil1-dbd-sqlite3' changed from 'absent' to '1.5.4-1build1'
'mcelog' changed from 'absent' to '128+dfsg-1'
'httpd-wsgi' changed from 'absent' to '1'
'apache2-bin' changed from 'absent' to '2.4.18-2ubuntu3.5'

2018-02-06 09:36:19,196 [salt.state       ][INFO    ][2207] Loading fresh modules for state activity
2018-02-06 09:36:19,281 [salt.state       ][INFO    ][2207] Completed state [linux_extra_packages_latest] at time 09:36:19.281611 duration_in_ms=4942.063
2018-02-06 09:36:19,285 [salt.state       ][INFO    ][2207] Running state [UTC] at time 09:36:19.285338
2018-02-06 09:36:19,285 [salt.state       ][INFO    ][2207] Executing state timezone.system for UTC
2018-02-06 09:36:19,288 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['timedatectl'] in directory '/root'
2018-02-06 09:36:19,363 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['timedatectl'] in directory '/root'
2018-02-06 09:36:19,380 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'timedatectl set-timezone UTC' in directory '/root'
2018-02-06 09:36:19,401 [salt.state       ][INFO    ][2207] {'timezone': 'UTC'}
2018-02-06 09:36:19,402 [salt.state       ][INFO    ][2207] Completed state [UTC] at time 09:36:19.402408 duration_in_ms=117.069
2018-02-06 09:36:19,407 [salt.state       ][INFO    ][2207] Running state [nf_conntrack] at time 09:36:19.407071
2018-02-06 09:36:19,407 [salt.state       ][INFO    ][2207] Executing state kmod.present for nf_conntrack
2018-02-06 09:36:19,408 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'lsmod' in directory '/root'
2018-02-06 09:36:19,502 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'lsmod' in directory '/root'
2018-02-06 09:36:19,525 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'modprobe nf_conntrack' in directory '/root'
2018-02-06 09:36:19,558 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'lsmod' in directory '/root'
2018-02-06 09:36:19,613 [salt.state       ][INFO    ][2207] {'nf_conntrack': 'loaded'}
2018-02-06 09:36:19,613 [salt.state       ][INFO    ][2207] Completed state [nf_conntrack] at time 09:36:19.613782 duration_in_ms=206.711
2018-02-06 09:36:19,617 [salt.state       ][INFO    ][2207] Running state [net.ipv4.tcp_keepalive_probes] at time 09:36:19.617951
2018-02-06 09:36:19,618 [salt.state       ][INFO    ][2207] Executing state sysctl.present for net.ipv4.tcp_keepalive_probes
2018-02-06 09:36:19,619 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:36:19,679 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -w net.ipv4.tcp_keepalive_probes="8"' in directory '/root'
2018-02-06 09:36:19,695 [salt.state       ][INFO    ][2207] {'net.ipv4.tcp_keepalive_probes': 8}
2018-02-06 09:36:19,696 [salt.state       ][INFO    ][2207] Completed state [net.ipv4.tcp_keepalive_probes] at time 09:36:19.695968 duration_in_ms=78.016
2018-02-06 09:36:19,696 [salt.state       ][INFO    ][2207] Running state [fs.file-max] at time 09:36:19.696397
2018-02-06 09:36:19,696 [salt.state       ][INFO    ][2207] Executing state sysctl.present for fs.file-max
2018-02-06 09:36:19,697 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:36:19,736 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -w fs.file-max="124165"' in directory '/root'
2018-02-06 09:36:19,751 [salt.state       ][INFO    ][2207] {'fs.file-max': 124165}
2018-02-06 09:36:19,751 [salt.state       ][INFO    ][2207] Completed state [fs.file-max] at time 09:36:19.751775 duration_in_ms=55.377
2018-02-06 09:36:19,752 [salt.state       ][INFO    ][2207] Running state [net.core.somaxconn] at time 09:36:19.752192
2018-02-06 09:36:19,752 [salt.state       ][INFO    ][2207] Executing state sysctl.present for net.core.somaxconn
2018-02-06 09:36:19,753 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:36:19,792 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -w net.core.somaxconn="4096"' in directory '/root'
2018-02-06 09:36:19,811 [salt.state       ][INFO    ][2207] {'net.core.somaxconn': 4096}
2018-02-06 09:36:19,811 [salt.state       ][INFO    ][2207] Completed state [net.core.somaxconn] at time 09:36:19.811632 duration_in_ms=59.44
2018-02-06 09:36:19,812 [salt.state       ][INFO    ][2207] Running state [net.ipv4.tcp_max_syn_backlog] at time 09:36:19.812160
2018-02-06 09:36:19,812 [salt.state       ][INFO    ][2207] Executing state sysctl.present for net.ipv4.tcp_max_syn_backlog
2018-02-06 09:36:19,813 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:36:19,851 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -w net.ipv4.tcp_max_syn_backlog="8192"' in directory '/root'
2018-02-06 09:36:19,869 [salt.state       ][INFO    ][2207] {'net.ipv4.tcp_max_syn_backlog': 8192}
2018-02-06 09:36:19,870 [salt.state       ][INFO    ][2207] Completed state [net.ipv4.tcp_max_syn_backlog] at time 09:36:19.870357 duration_in_ms=58.196
2018-02-06 09:36:19,870 [salt.state       ][INFO    ][2207] Running state [net.ipv4.tcp_tw_reuse] at time 09:36:19.870828
2018-02-06 09:36:19,871 [salt.state       ][INFO    ][2207] Executing state sysctl.present for net.ipv4.tcp_tw_reuse
2018-02-06 09:36:19,872 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:36:19,912 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -w net.ipv4.tcp_tw_reuse="1"' in directory '/root'
2018-02-06 09:36:19,930 [salt.state       ][INFO    ][2207] {'net.ipv4.tcp_tw_reuse': 1}
2018-02-06 09:36:19,930 [salt.state       ][INFO    ][2207] Completed state [net.ipv4.tcp_tw_reuse] at time 09:36:19.930474 duration_in_ms=59.645
2018-02-06 09:36:19,931 [salt.state       ][INFO    ][2207] Running state [net.ipv4.tcp_congestion_control] at time 09:36:19.930963
2018-02-06 09:36:19,931 [salt.state       ][INFO    ][2207] Executing state sysctl.present for net.ipv4.tcp_congestion_control
2018-02-06 09:36:19,932 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:36:19,965 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -w net.ipv4.tcp_congestion_control="yeah"' in directory '/root'
2018-02-06 09:36:19,985 [salt.state       ][INFO    ][2207] {'net.ipv4.tcp_congestion_control': 'yeah'}
2018-02-06 09:36:19,985 [salt.state       ][INFO    ][2207] Completed state [net.ipv4.tcp_congestion_control] at time 09:36:19.985912 duration_in_ms=54.949
2018-02-06 09:36:19,986 [salt.state       ][INFO    ][2207] Running state [net.nf_conntrack_max] at time 09:36:19.986408
2018-02-06 09:36:19,986 [salt.state       ][INFO    ][2207] Executing state sysctl.present for net.nf_conntrack_max
2018-02-06 09:36:19,988 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:36:20,025 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -w net.nf_conntrack_max="1048576"' in directory '/root'
2018-02-06 09:36:20,044 [salt.state       ][INFO    ][2207] {'net.nf_conntrack_max': 1048576}
2018-02-06 09:36:20,045 [salt.state       ][INFO    ][2207] Completed state [net.nf_conntrack_max] at time 09:36:20.045066 duration_in_ms=58.658
2018-02-06 09:36:20,045 [salt.state       ][INFO    ][2207] Running state [net.ipv4.tcp_retries2] at time 09:36:20.045616
2018-02-06 09:36:20,046 [salt.state       ][INFO    ][2207] Executing state sysctl.present for net.ipv4.tcp_retries2
2018-02-06 09:36:20,046 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:36:20,088 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -w net.ipv4.tcp_retries2="5"' in directory '/root'
2018-02-06 09:36:20,104 [salt.state       ][INFO    ][2207] {'net.ipv4.tcp_retries2': 5}
2018-02-06 09:36:20,105 [salt.state       ][INFO    ][2207] Completed state [net.ipv4.tcp_retries2] at time 09:36:20.104949 duration_in_ms=59.332
2018-02-06 09:36:20,105 [salt.state       ][INFO    ][2207] Running state [net.ipv4.tcp_fin_timeout] at time 09:36:20.105502
2018-02-06 09:36:20,105 [salt.state       ][INFO    ][2207] Executing state sysctl.present for net.ipv4.tcp_fin_timeout
2018-02-06 09:36:20,106 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:36:20,149 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -w net.ipv4.tcp_fin_timeout="30"' in directory '/root'
2018-02-06 09:36:20,171 [salt.state       ][INFO    ][2207] {'net.ipv4.tcp_fin_timeout': 30}
2018-02-06 09:36:20,171 [salt.state       ][INFO    ][2207] Completed state [net.ipv4.tcp_fin_timeout] at time 09:36:20.171659 duration_in_ms=66.156
2018-02-06 09:36:20,172 [salt.state       ][INFO    ][2207] Running state [net.ipv4.tcp_slow_start_after_idle] at time 09:36:20.172201
2018-02-06 09:36:20,172 [salt.state       ][INFO    ][2207] Executing state sysctl.present for net.ipv4.tcp_slow_start_after_idle
2018-02-06 09:36:20,173 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:36:20,213 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -w net.ipv4.tcp_slow_start_after_idle="0"' in directory '/root'
2018-02-06 09:36:20,235 [salt.state       ][INFO    ][2207] {'net.ipv4.tcp_slow_start_after_idle': 0}
2018-02-06 09:36:20,236 [salt.state       ][INFO    ][2207] Completed state [net.ipv4.tcp_slow_start_after_idle] at time 09:36:20.236383 duration_in_ms=64.181
2018-02-06 09:36:20,237 [salt.state       ][INFO    ][2207] Running state [vm.swappiness] at time 09:36:20.237284
2018-02-06 09:36:20,237 [salt.state       ][INFO    ][2207] Executing state sysctl.present for vm.swappiness
2018-02-06 09:36:20,240 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:36:20,277 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -w vm.swappiness="10"' in directory '/root'
2018-02-06 09:36:20,294 [salt.state       ][INFO    ][2207] {'vm.swappiness': 10}
2018-02-06 09:36:20,295 [salt.state       ][INFO    ][2207] Completed state [vm.swappiness] at time 09:36:20.295485 duration_in_ms=58.2
2018-02-06 09:36:20,296 [salt.state       ][INFO    ][2207] Running state [net.ipv4.tcp_keepalive_intvl] at time 09:36:20.296168
2018-02-06 09:36:20,296 [salt.state       ][INFO    ][2207] Executing state sysctl.present for net.ipv4.tcp_keepalive_intvl
2018-02-06 09:36:20,297 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:36:20,334 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -w net.ipv4.tcp_keepalive_intvl="3"' in directory '/root'
2018-02-06 09:36:20,352 [salt.state       ][INFO    ][2207] {'net.ipv4.tcp_keepalive_intvl': 3}
2018-02-06 09:36:20,353 [salt.state       ][INFO    ][2207] Completed state [net.ipv4.tcp_keepalive_intvl] at time 09:36:20.353299 duration_in_ms=57.13
2018-02-06 09:36:20,354 [salt.state       ][INFO    ][2207] Running state [net.ipv4.neigh.default.gc_thresh1] at time 09:36:20.353952
2018-02-06 09:36:20,354 [salt.state       ][INFO    ][2207] Executing state sysctl.present for net.ipv4.neigh.default.gc_thresh1
2018-02-06 09:36:20,355 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:36:20,392 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -w net.ipv4.neigh.default.gc_thresh1="4096"' in directory '/root'
2018-02-06 09:36:20,412 [salt.state       ][INFO    ][2207] {'net.ipv4.neigh.default.gc_thresh1': 4096}
2018-02-06 09:36:20,413 [salt.state       ][INFO    ][2207] Completed state [net.ipv4.neigh.default.gc_thresh1] at time 09:36:20.413113 duration_in_ms=59.16
2018-02-06 09:36:20,413 [salt.state       ][INFO    ][2207] Running state [net.ipv4.neigh.default.gc_thresh2] at time 09:36:20.413914
2018-02-06 09:36:20,414 [salt.state       ][INFO    ][2207] Executing state sysctl.present for net.ipv4.neigh.default.gc_thresh2
2018-02-06 09:36:20,415 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:36:20,462 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -w net.ipv4.neigh.default.gc_thresh2="8192"' in directory '/root'
2018-02-06 09:36:20,478 [salt.state       ][INFO    ][2207] {'net.ipv4.neigh.default.gc_thresh2': 8192}
2018-02-06 09:36:20,478 [salt.state       ][INFO    ][2207] Completed state [net.ipv4.neigh.default.gc_thresh2] at time 09:36:20.478715 duration_in_ms=64.801
2018-02-06 09:36:20,479 [salt.state       ][INFO    ][2207] Running state [net.ipv4.neigh.default.gc_thresh3] at time 09:36:20.479835
2018-02-06 09:36:20,480 [salt.state       ][INFO    ][2207] Executing state sysctl.present for net.ipv4.neigh.default.gc_thresh3
2018-02-06 09:36:20,481 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:36:20,516 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -w net.ipv4.neigh.default.gc_thresh3="16384"' in directory '/root'
2018-02-06 09:36:20,532 [salt.state       ][INFO    ][2207] {'net.ipv4.neigh.default.gc_thresh3': 16384}
2018-02-06 09:36:20,532 [salt.state       ][INFO    ][2207] Completed state [net.ipv4.neigh.default.gc_thresh3] at time 09:36:20.532692 duration_in_ms=52.856
2018-02-06 09:36:20,533 [salt.state       ][INFO    ][2207] Running state [net.core.netdev_max_backlog] at time 09:36:20.533261
2018-02-06 09:36:20,533 [salt.state       ][INFO    ][2207] Executing state sysctl.present for net.core.netdev_max_backlog
2018-02-06 09:36:20,534 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:36:20,570 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -w net.core.netdev_max_backlog="261144"' in directory '/root'
2018-02-06 09:36:20,586 [salt.state       ][INFO    ][2207] {'net.core.netdev_max_backlog': 261144}
2018-02-06 09:36:20,587 [salt.state       ][INFO    ][2207] Completed state [net.core.netdev_max_backlog] at time 09:36:20.586983 duration_in_ms=53.721
2018-02-06 09:36:20,588 [salt.state       ][INFO    ][2207] Running state [net.ipv4.tcp_keepalive_time] at time 09:36:20.587953
2018-02-06 09:36:20,588 [salt.state       ][INFO    ][2207] Executing state sysctl.present for net.ipv4.tcp_keepalive_time
2018-02-06 09:36:20,589 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:36:20,625 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -w net.ipv4.tcp_keepalive_time="30"' in directory '/root'
2018-02-06 09:36:20,640 [salt.state       ][INFO    ][2207] {'net.ipv4.tcp_keepalive_time': 30}
2018-02-06 09:36:20,641 [salt.state       ][INFO    ][2207] Completed state [net.ipv4.tcp_keepalive_time] at time 09:36:20.640943 duration_in_ms=52.989
2018-02-06 09:36:20,641 [salt.state       ][INFO    ][2207] Running state [kernel.panic] at time 09:36:20.641503
2018-02-06 09:36:20,641 [salt.state       ][INFO    ][2207] Executing state sysctl.present for kernel.panic
2018-02-06 09:36:20,642 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:36:20,677 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'sysctl -w kernel.panic="60"' in directory '/root'
2018-02-06 09:36:20,693 [salt.state       ][INFO    ][2207] {'kernel.panic': 60}
2018-02-06 09:36:20,694 [salt.state       ][INFO    ][2207] Completed state [kernel.panic] at time 09:36:20.694372 duration_in_ms=52.868
2018-02-06 09:36:20,702 [salt.state       ][INFO    ][2207] Running state [linux_sysfs_package] at time 09:36:20.702535
2018-02-06 09:36:20,702 [salt.state       ][INFO    ][2207] Executing state pkg.installed for linux_sysfs_package
2018-02-06 09:36:21,139 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['apt-cache', '-q', 'policy', 'sysfsutils'] in directory '/root'
2018-02-06 09:36:21,206 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['apt-get', '-q', 'update'] in directory '/root'
2018-02-06 09:36:22,948 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-02-06 09:36:22,986 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'sysfsutils'] in directory '/root'
2018-02-06 09:36:25,798 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-02-06 09:36:25,844 [salt.state       ][INFO    ][2207] Made the following changes:
'libsysfs2' changed from 'absent' to '2.1.0+repack-4'
'sysfsutils' changed from 'absent' to '2.1.0+repack-4'

2018-02-06 09:36:25,865 [salt.state       ][INFO    ][2207] Loading fresh modules for state activity
2018-02-06 09:36:25,901 [salt.state       ][INFO    ][2207] Completed state [linux_sysfs_package] at time 09:36:25.901790 duration_in_ms=5199.253
2018-02-06 09:36:25,907 [salt.state       ][INFO    ][2207] Running state [/etc/sysfs.d] at time 09:36:25.907048
2018-02-06 09:36:25,907 [salt.state       ][INFO    ][2207] Executing state file.directory for /etc/sysfs.d
2018-02-06 09:36:25,911 [salt.state       ][INFO    ][2207] Directory /etc/sysfs.d is in the correct state
2018-02-06 09:36:25,911 [salt.state       ][INFO    ][2207] Completed state [/etc/sysfs.d] at time 09:36:25.911200 duration_in_ms=4.152
2018-02-06 09:36:26,084 [salt.state       ][INFO    ][2207] Running state [ondemand] at time 09:36:26.084129
2018-02-06 09:36:26,084 [salt.state       ][INFO    ][2207] Executing state service.dead for ondemand
2018-02-06 09:36:26,086 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['systemctl', 'status', 'ondemand.service', '-n', '0'] in directory '/root'
2018-02-06 09:36:26,115 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['systemctl', 'is-active', 'ondemand.service'] in directory '/root'
2018-02-06 09:36:26,140 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['systemctl', 'is-enabled', 'ondemand.service'] in directory '/root'
2018-02-06 09:36:26,168 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['systemd-run', '--scope', 'systemctl', 'stop', 'ondemand.service'] in directory '/root'
2018-02-06 09:36:26,260 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['systemctl', 'is-active', 'ondemand.service'] in directory '/root'
2018-02-06 09:36:26,282 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['systemctl', 'is-enabled', 'ondemand.service'] in directory '/root'
2018-02-06 09:36:26,305 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['systemctl', 'is-enabled', 'ondemand.service'] in directory '/root'
2018-02-06 09:36:26,331 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['systemd-run', '--scope', '/usr/sbin/update-rc.d', '-f', 'ondemand', 'remove'] in directory '/root'
2018-02-06 09:36:26,501 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['systemctl', 'is-enabled', 'ondemand.service'] in directory '/root'
2018-02-06 09:36:26,524 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'runlevel' in directory '/root'
2018-02-06 09:36:26,542 [salt.state       ][INFO    ][2207] {'ondemand': True}
2018-02-06 09:36:26,543 [salt.state       ][INFO    ][2207] Completed state [ondemand] at time 09:36:26.543278 duration_in_ms=459.148
2018-02-06 09:36:26,546 [salt.state       ][INFO    ][2207] Running state [cs_CZ.UTF-8] at time 09:36:26.546897
2018-02-06 09:36:26,547 [salt.state       ][INFO    ][2207] Executing state locale.present for cs_CZ.UTF-8
2018-02-06 09:36:26,548 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'locale -a' in directory '/root'
2018-02-06 09:36:26,570 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['locale-gen', 'cs_CZ.utf8'] in directory '/root'
2018-02-06 09:36:27,223 [salt.minion      ][INFO    ][2116] User sudo_ubuntu Executing command saltutil.find_job with jid 20180206093626651478
2018-02-06 09:36:27,256 [salt.minion      ][INFO    ][4782] Starting a new job with PID 4782
2018-02-06 09:36:27,276 [salt.minion      ][INFO    ][4782] Returning information for job: 20180206093626651478
2018-02-06 09:36:27,379 [salt.state       ][INFO    ][2207] {'locale': 'cs_CZ.UTF-8'}
2018-02-06 09:36:27,380 [salt.state       ][INFO    ][2207] Completed state [cs_CZ.UTF-8] at time 09:36:27.380324 duration_in_ms=833.426
2018-02-06 09:36:27,380 [salt.state       ][INFO    ][2207] Running state [en_US.UTF-8] at time 09:36:27.380792
2018-02-06 09:36:27,381 [salt.state       ][INFO    ][2207] Executing state locale.present for en_US.UTF-8
2018-02-06 09:36:27,382 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'locale -a' in directory '/root'
2018-02-06 09:36:27,396 [salt.state       ][INFO    ][2207] Locale en_US.UTF-8 is already present
2018-02-06 09:36:27,396 [salt.state       ][INFO    ][2207] Completed state [en_US.UTF-8] at time 09:36:27.396454 duration_in_ms=15.662
2018-02-06 09:36:27,398 [salt.state       ][INFO    ][2207] Running state [en_US.UTF-8] at time 09:36:27.398425
2018-02-06 09:36:27,398 [salt.state       ][INFO    ][2207] Executing state locale.system for en_US.UTF-8
2018-02-06 09:36:27,399 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'localectl' in directory '/root'
2018-02-06 09:36:27,454 [salt.state       ][INFO    ][2207] System locale en_US.UTF-8 already set
2018-02-06 09:36:27,455 [salt.state       ][INFO    ][2207] Completed state [en_US.UTF-8] at time 09:36:27.454950 duration_in_ms=56.523
2018-02-06 09:36:27,461 [salt.state       ][INFO    ][2207] Running state [root] at time 09:36:27.461313
2018-02-06 09:36:27,461 [salt.state       ][INFO    ][2207] Executing state user.present for root
2018-02-06 09:36:27,466 [salt.state       ][INFO    ][2207] User root is present and up to date
2018-02-06 09:36:27,466 [salt.state       ][INFO    ][2207] Completed state [root] at time 09:36:27.466851 duration_in_ms=5.538
2018-02-06 09:36:27,469 [salt.state       ][INFO    ][2207] Running state [/root] at time 09:36:27.469309
2018-02-06 09:36:27,469 [salt.state       ][INFO    ][2207] Executing state file.directory for /root
2018-02-06 09:36:27,470 [salt.state       ][INFO    ][2207] Directory /root is in the correct state
2018-02-06 09:36:27,471 [salt.state       ][INFO    ][2207] Completed state [/root] at time 09:36:27.471295 duration_in_ms=1.986
2018-02-06 09:36:27,471 [salt.state       ][INFO    ][2207] Running state [/etc/sudoers.d/90-salt-user-root] at time 09:36:27.471795
2018-02-06 09:36:27,472 [salt.state       ][INFO    ][2207] Executing state file.absent for /etc/sudoers.d/90-salt-user-root
2018-02-06 09:36:27,472 [salt.state       ][INFO    ][2207] File /etc/sudoers.d/90-salt-user-root is not present
2018-02-06 09:36:27,473 [salt.state       ][INFO    ][2207] Completed state [/etc/sudoers.d/90-salt-user-root] at time 09:36:27.473237 duration_in_ms=1.442
2018-02-06 09:36:27,473 [salt.state       ][INFO    ][2207] Running state [ubuntu] at time 09:36:27.473709
2018-02-06 09:36:27,474 [salt.state       ][INFO    ][2207] Executing state user.present for ubuntu
2018-02-06 09:36:27,477 [salt.state       ][INFO    ][2207] {'passwd': 'XXX-REDACTED-XXX'}
2018-02-06 09:36:27,478 [salt.state       ][INFO    ][2207] Completed state [ubuntu] at time 09:36:27.478368 duration_in_ms=4.659
2018-02-06 09:36:27,479 [salt.state       ][INFO    ][2207] Running state [/home/ubuntu] at time 09:36:27.479640
2018-02-06 09:36:27,480 [salt.state       ][INFO    ][2207] Executing state file.directory for /home/ubuntu
2018-02-06 09:36:27,481 [salt.state       ][INFO    ][2207] {'mode': '0700'}
2018-02-06 09:36:27,481 [salt.state       ][INFO    ][2207] Completed state [/home/ubuntu] at time 09:36:27.481798 duration_in_ms=2.158
2018-02-06 09:36:27,482 [salt.state       ][INFO    ][2207] Running state [/etc/sudoers.d/90-salt-user-ubuntu] at time 09:36:27.482885
2018-02-06 09:36:27,483 [salt.state       ][INFO    ][2207] Executing state file.managed for /etc/sudoers.d/90-salt-user-ubuntu
2018-02-06 09:36:27,503 [salt.fileclient  ][INFO    ][2207] Fetching file from saltenv 'base', ** done ** 'linux/files/sudoer'
2018-02-06 09:36:27,508 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command '/usr/sbin/visudo -c -f /tmp/tmptorUEw' in directory '/root'
2018-02-06 09:36:27,531 [salt.state       ][INFO    ][2207] File changed:
New file
2018-02-06 09:36:27,532 [salt.state       ][INFO    ][2207] Completed state [/etc/sudoers.d/90-salt-user-ubuntu] at time 09:36:27.532100 duration_in_ms=49.215
2018-02-06 09:36:27,532 [salt.state       ][INFO    ][2207] Running state [/etc/security/limits.d/90-salt-default.conf] at time 09:36:27.532728
2018-02-06 09:36:27,533 [salt.state       ][INFO    ][2207] Executing state file.managed for /etc/security/limits.d/90-salt-default.conf
2018-02-06 09:36:27,561 [salt.fileclient  ][INFO    ][2207] Fetching file from saltenv 'base', ** done ** 'linux/files/limits.conf'
2018-02-06 09:36:27,656 [salt.state       ][INFO    ][2207] File changed:
New file
2018-02-06 09:36:27,657 [salt.state       ][INFO    ][2207] Completed state [/etc/security/limits.d/90-salt-default.conf] at time 09:36:27.657381 duration_in_ms=124.653
2018-02-06 09:36:27,657 [salt.state       ][INFO    ][2207] Running state [/etc/systemd/system.conf.d/90-salt.conf] at time 09:36:27.657688
2018-02-06 09:36:27,657 [salt.state       ][INFO    ][2207] Executing state file.managed for /etc/systemd/system.conf.d/90-salt.conf
2018-02-06 09:36:27,673 [salt.fileclient  ][INFO    ][2207] Fetching file from saltenv 'base', ** done ** 'linux/files/systemd.conf'
2018-02-06 09:36:27,761 [salt.state       ][INFO    ][2207] File changed:
New file
2018-02-06 09:36:27,761 [salt.state       ][INFO    ][2207] Completed state [/etc/systemd/system.conf.d/90-salt.conf] at time 09:36:27.761773 duration_in_ms=104.085
2018-02-06 09:36:27,763 [salt.state       ][INFO    ][2207] Running state [service.systemctl_reload] at time 09:36:27.763519
2018-02-06 09:36:27,763 [salt.state       ][INFO    ][2207] Executing state module.wait for service.systemctl_reload
2018-02-06 09:36:27,764 [salt.state       ][INFO    ][2207] No changes made for service.systemctl_reload
2018-02-06 09:36:27,764 [salt.state       ][INFO    ][2207] Completed state [service.systemctl_reload] at time 09:36:27.764234 duration_in_ms=0.715
2018-02-06 09:36:27,764 [salt.state       ][INFO    ][2207] Running state [service.systemctl_reload] at time 09:36:27.764437
2018-02-06 09:36:27,764 [salt.state       ][INFO    ][2207] Executing state module.mod_watch for service.systemctl_reload
2018-02-06 09:36:27,765 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['systemctl', '--system', 'daemon-reload'] in directory '/root'
2018-02-06 09:36:27,857 [salt.state       ][INFO    ][2207] {'ret': True}
2018-02-06 09:36:27,858 [salt.state       ][INFO    ][2207] Completed state [service.systemctl_reload] at time 09:36:27.858444 duration_in_ms=94.005
2018-02-06 09:36:27,859 [salt.state       ][INFO    ][2207] Running state [/etc/hostname] at time 09:36:27.858973
2018-02-06 09:36:27,859 [salt.state       ][INFO    ][2207] Executing state file.managed for /etc/hostname
2018-02-06 09:36:27,878 [salt.fileclient  ][INFO    ][2207] Fetching file from saltenv 'base', ** done ** 'linux/files/hostname'
2018-02-06 09:36:27,883 [salt.state       ][INFO    ][2207] File changed:
--- 
+++ 
@@ -1 +1 @@
-ubuntu
+prx02

2018-02-06 09:36:27,884 [salt.state       ][INFO    ][2207] Completed state [/etc/hostname] at time 09:36:27.884180 duration_in_ms=25.207
2018-02-06 09:36:27,886 [salt.state       ][INFO    ][2207] Running state [hostname prx02] at time 09:36:27.886506
2018-02-06 09:36:27,886 [salt.state       ][INFO    ][2207] Executing state cmd.wait for hostname prx02
2018-02-06 09:36:27,887 [salt.state       ][INFO    ][2207] No changes made for hostname prx02
2018-02-06 09:36:27,887 [salt.state       ][INFO    ][2207] Completed state [hostname prx02] at time 09:36:27.887833 duration_in_ms=1.328
2018-02-06 09:36:27,888 [salt.state       ][INFO    ][2207] Running state [hostname prx02] at time 09:36:27.888047
2018-02-06 09:36:27,888 [salt.state       ][INFO    ][2207] Executing state cmd.mod_watch for hostname prx02
2018-02-06 09:36:27,889 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command 'hostname prx02' in directory '/root'
2018-02-06 09:36:27,907 [salt.state       ][INFO    ][2207] {'pid': 4809, 'retcode': 0, 'stderr': '', 'stdout': ''}
2018-02-06 09:36:27,908 [salt.state       ][INFO    ][2207] Completed state [hostname prx02] at time 09:36:27.908429 duration_in_ms=20.382
2018-02-06 09:36:27,909 [salt.state       ][INFO    ][2207] Running state [mdb02] at time 09:36:27.909676
2018-02-06 09:36:27,910 [salt.state       ][INFO    ][2207] Executing state host.present for mdb02
2018-02-06 09:36:27,913 [salt.state       ][INFO    ][2207] {'host': 'mdb02'}
2018-02-06 09:36:27,913 [salt.state       ][INFO    ][2207] Completed state [mdb02] at time 09:36:27.913373 duration_in_ms=3.696
2018-02-06 09:36:27,913 [salt.state       ][INFO    ][2207] Running state [mdb02.mcp-pike-ovs-ha.local] at time 09:36:27.913678
2018-02-06 09:36:27,913 [salt.state       ][INFO    ][2207] Executing state host.present for mdb02.mcp-pike-ovs-ha.local
2018-02-06 09:36:27,914 [salt.state       ][INFO    ][2207] {'host': 'mdb02.mcp-pike-ovs-ha.local'}
2018-02-06 09:36:27,915 [salt.state       ][INFO    ][2207] Completed state [mdb02.mcp-pike-ovs-ha.local] at time 09:36:27.915000 duration_in_ms=1.322
2018-02-06 09:36:27,915 [salt.state       ][INFO    ][2207] Running state [mdb03] at time 09:36:27.915534
2018-02-06 09:36:27,915 [salt.state       ][INFO    ][2207] Executing state host.present for mdb03
2018-02-06 09:36:27,916 [salt.state       ][INFO    ][2207] {'host': 'mdb03'}
2018-02-06 09:36:27,916 [salt.state       ][INFO    ][2207] Completed state [mdb03] at time 09:36:27.916805 duration_in_ms=1.271
2018-02-06 09:36:27,917 [salt.state       ][INFO    ][2207] Running state [mdb03.mcp-pike-ovs-ha.local] at time 09:36:27.917065
2018-02-06 09:36:27,917 [salt.state       ][INFO    ][2207] Executing state host.present for mdb03.mcp-pike-ovs-ha.local
2018-02-06 09:36:27,918 [salt.state       ][INFO    ][2207] {'host': 'mdb03.mcp-pike-ovs-ha.local'}
2018-02-06 09:36:27,918 [salt.state       ][INFO    ][2207] Completed state [mdb03.mcp-pike-ovs-ha.local] at time 09:36:27.918478 duration_in_ms=1.413
2018-02-06 09:36:27,918 [salt.state       ][INFO    ][2207] Running state [mdb01] at time 09:36:27.918741
2018-02-06 09:36:27,918 [salt.state       ][INFO    ][2207] Executing state host.present for mdb01
2018-02-06 09:36:27,920 [salt.state       ][INFO    ][2207] {'host': 'mdb01'}
2018-02-06 09:36:27,920 [salt.state       ][INFO    ][2207] Completed state [mdb01] at time 09:36:27.920626 duration_in_ms=1.885
2018-02-06 09:36:27,920 [salt.state       ][INFO    ][2207] Running state [mdb01.mcp-pike-ovs-ha.local] at time 09:36:27.920855
2018-02-06 09:36:27,921 [salt.state       ][INFO    ][2207] Executing state host.present for mdb01.mcp-pike-ovs-ha.local
2018-02-06 09:36:27,922 [salt.state       ][INFO    ][2207] {'host': 'mdb01.mcp-pike-ovs-ha.local'}
2018-02-06 09:36:27,923 [salt.state       ][INFO    ][2207] Completed state [mdb01.mcp-pike-ovs-ha.local] at time 09:36:27.923039 duration_in_ms=2.184
2018-02-06 09:36:27,923 [salt.state       ][INFO    ][2207] Running state [mdb] at time 09:36:27.923577
2018-02-06 09:36:27,923 [salt.state       ][INFO    ][2207] Executing state host.present for mdb
2018-02-06 09:36:27,924 [salt.state       ][INFO    ][2207] {'host': 'mdb'}
2018-02-06 09:36:27,924 [salt.state       ][INFO    ][2207] Completed state [mdb] at time 09:36:27.924868 duration_in_ms=1.291
2018-02-06 09:36:27,925 [salt.state       ][INFO    ][2207] Running state [mdb.mcp-pike-ovs-ha.local] at time 09:36:27.925100
2018-02-06 09:36:27,925 [salt.state       ][INFO    ][2207] Executing state host.present for mdb.mcp-pike-ovs-ha.local
2018-02-06 09:36:27,926 [salt.state       ][INFO    ][2207] {'host': 'mdb.mcp-pike-ovs-ha.local'}
2018-02-06 09:36:27,926 [salt.state       ][INFO    ][2207] Completed state [mdb.mcp-pike-ovs-ha.local] at time 09:36:27.926446 duration_in_ms=1.346
2018-02-06 09:36:27,926 [salt.state       ][INFO    ][2207] Running state [cfg01] at time 09:36:27.926675
2018-02-06 09:36:27,926 [salt.state       ][INFO    ][2207] Executing state host.present for cfg01
2018-02-06 09:36:27,927 [salt.state       ][INFO    ][2207] {'host': 'cfg01'}
2018-02-06 09:36:27,928 [salt.state       ][INFO    ][2207] Completed state [cfg01] at time 09:36:27.928001 duration_in_ms=1.326
2018-02-06 09:36:27,928 [salt.state       ][INFO    ][2207] Running state [cfg01.mcp-pike-ovs-ha.local] at time 09:36:27.928231
2018-02-06 09:36:27,928 [salt.state       ][INFO    ][2207] Executing state host.present for cfg01.mcp-pike-ovs-ha.local
2018-02-06 09:36:27,929 [salt.state       ][INFO    ][2207] {'host': 'cfg01.mcp-pike-ovs-ha.local'}
2018-02-06 09:36:27,929 [salt.state       ][INFO    ][2207] Completed state [cfg01.mcp-pike-ovs-ha.local] at time 09:36:27.929580 duration_in_ms=1.349
2018-02-06 09:36:27,929 [salt.state       ][INFO    ][2207] Running state [prx01] at time 09:36:27.929806
2018-02-06 09:36:27,930 [salt.state       ][INFO    ][2207] Executing state host.present for prx01
2018-02-06 09:36:27,930 [salt.state       ][INFO    ][2207] {'host': 'prx01'}
2018-02-06 09:36:27,931 [salt.state       ][INFO    ][2207] Completed state [prx01] at time 09:36:27.931424 duration_in_ms=1.618
2018-02-06 09:36:27,931 [salt.state       ][INFO    ][2207] Running state [prx01.mcp-pike-ovs-ha.local] at time 09:36:27.931667
2018-02-06 09:36:27,931 [salt.state       ][INFO    ][2207] Executing state host.present for prx01.mcp-pike-ovs-ha.local
2018-02-06 09:36:27,932 [salt.state       ][INFO    ][2207] {'host': 'prx01.mcp-pike-ovs-ha.local'}
2018-02-06 09:36:27,933 [salt.state       ][INFO    ][2207] Completed state [prx01.mcp-pike-ovs-ha.local] at time 09:36:27.932995 duration_in_ms=1.328
2018-02-06 09:36:27,933 [salt.state       ][INFO    ][2207] Running state [kvm01] at time 09:36:27.933253
2018-02-06 09:36:27,933 [salt.state       ][INFO    ][2207] Executing state host.present for kvm01
2018-02-06 09:36:27,934 [salt.state       ][INFO    ][2207] {'host': 'kvm01'}
2018-02-06 09:36:27,934 [salt.state       ][INFO    ][2207] Completed state [kvm01] at time 09:36:27.934573 duration_in_ms=1.32
2018-02-06 09:36:27,934 [salt.state       ][INFO    ][2207] Running state [kvm01.mcp-pike-ovs-ha.local] at time 09:36:27.934798
2018-02-06 09:36:27,935 [salt.state       ][INFO    ][2207] Executing state host.present for kvm01.mcp-pike-ovs-ha.local
2018-02-06 09:36:27,936 [salt.state       ][INFO    ][2207] {'host': 'kvm01.mcp-pike-ovs-ha.local'}
2018-02-06 09:36:27,937 [salt.state       ][INFO    ][2207] Completed state [kvm01.mcp-pike-ovs-ha.local] at time 09:36:27.936959 duration_in_ms=2.161
2018-02-06 09:36:27,937 [salt.state       ][INFO    ][2207] Running state [kvm03] at time 09:36:27.937209
2018-02-06 09:36:27,937 [salt.state       ][INFO    ][2207] Executing state host.present for kvm03
2018-02-06 09:36:27,938 [salt.state       ][INFO    ][2207] {'host': 'kvm03'}
2018-02-06 09:36:27,938 [salt.state       ][INFO    ][2207] Completed state [kvm03] at time 09:36:27.938832 duration_in_ms=1.623
2018-02-06 09:36:27,939 [salt.state       ][INFO    ][2207] Running state [kvm03.mcp-pike-ovs-ha.local] at time 09:36:27.939059
2018-02-06 09:36:27,939 [salt.state       ][INFO    ][2207] Executing state host.present for kvm03.mcp-pike-ovs-ha.local
2018-02-06 09:36:27,940 [salt.state       ][INFO    ][2207] {'host': 'kvm03.mcp-pike-ovs-ha.local'}
2018-02-06 09:36:27,941 [salt.state       ][INFO    ][2207] Completed state [kvm03.mcp-pike-ovs-ha.local] at time 09:36:27.941155 duration_in_ms=2.074
2018-02-06 09:36:27,941 [salt.state       ][INFO    ][2207] Running state [kvm02] at time 09:36:27.941390
2018-02-06 09:36:27,941 [salt.state       ][INFO    ][2207] Executing state host.present for kvm02
2018-02-06 09:36:27,942 [salt.state       ][INFO    ][2207] {'host': 'kvm02'}
2018-02-06 09:36:27,942 [salt.state       ][INFO    ][2207] Completed state [kvm02] at time 09:36:27.942725 duration_in_ms=1.334
2018-02-06 09:36:27,942 [salt.state       ][INFO    ][2207] Running state [kvm02.mcp-pike-ovs-ha.local] at time 09:36:27.942951
2018-02-06 09:36:27,943 [salt.state       ][INFO    ][2207] Executing state host.present for kvm02.mcp-pike-ovs-ha.local
2018-02-06 09:36:27,944 [salt.state       ][INFO    ][2207] {'host': 'kvm02.mcp-pike-ovs-ha.local'}
2018-02-06 09:36:27,944 [salt.state       ][INFO    ][2207] Completed state [kvm02.mcp-pike-ovs-ha.local] at time 09:36:27.944863 duration_in_ms=1.912
2018-02-06 09:36:27,945 [salt.state       ][INFO    ][2207] Running state [dbs] at time 09:36:27.945101
2018-02-06 09:36:27,945 [salt.state       ][INFO    ][2207] Executing state host.present for dbs
2018-02-06 09:36:27,946 [salt.state       ][INFO    ][2207] {'host': 'dbs'}
2018-02-06 09:36:27,946 [salt.state       ][INFO    ][2207] Completed state [dbs] at time 09:36:27.946457 duration_in_ms=1.356
2018-02-06 09:36:27,946 [salt.state       ][INFO    ][2207] Running state [dbs.mcp-pike-ovs-ha.local] at time 09:36:27.946678
2018-02-06 09:36:27,946 [salt.state       ][INFO    ][2207] Executing state host.present for dbs.mcp-pike-ovs-ha.local
2018-02-06 09:36:27,948 [salt.state       ][INFO    ][2207] {'host': 'dbs.mcp-pike-ovs-ha.local'}
2018-02-06 09:36:27,948 [salt.state       ][INFO    ][2207] Completed state [dbs.mcp-pike-ovs-ha.local] at time 09:36:27.948313 duration_in_ms=1.634
2018-02-06 09:36:27,948 [salt.state       ][INFO    ][2207] Running state [prx] at time 09:36:27.948545
2018-02-06 09:36:27,948 [salt.state       ][INFO    ][2207] Executing state host.present for prx
2018-02-06 09:36:27,949 [salt.state       ][INFO    ][2207] {'host': 'prx'}
2018-02-06 09:36:27,949 [salt.state       ][INFO    ][2207] Completed state [prx] at time 09:36:27.949922 duration_in_ms=1.377
2018-02-06 09:36:27,950 [salt.state       ][INFO    ][2207] Running state [prx.mcp-pike-ovs-ha.local] at time 09:36:27.950144
2018-02-06 09:36:27,950 [salt.state       ][INFO    ][2207] Executing state host.present for prx.mcp-pike-ovs-ha.local
2018-02-06 09:36:27,951 [salt.state       ][INFO    ][2207] {'host': 'prx.mcp-pike-ovs-ha.local'}
2018-02-06 09:36:27,951 [salt.state       ][INFO    ][2207] Completed state [prx.mcp-pike-ovs-ha.local] at time 09:36:27.951795 duration_in_ms=1.651
2018-02-06 09:36:27,952 [salt.state       ][INFO    ][2207] Running state [prx02] at time 09:36:27.952030
2018-02-06 09:36:27,952 [salt.state       ][INFO    ][2207] Executing state host.present for prx02
2018-02-06 09:36:27,953 [salt.state       ][INFO    ][2207] {'host': 'prx02'}
2018-02-06 09:36:27,953 [salt.state       ][INFO    ][2207] Completed state [prx02] at time 09:36:27.953416 duration_in_ms=1.386
2018-02-06 09:36:27,953 [salt.state       ][INFO    ][2207] Running state [prx02.mcp-pike-ovs-ha.local] at time 09:36:27.953645
2018-02-06 09:36:27,953 [salt.state       ][INFO    ][2207] Executing state host.present for prx02.mcp-pike-ovs-ha.local
2018-02-06 09:36:27,954 [salt.state       ][INFO    ][2207] {'host': 'prx02.mcp-pike-ovs-ha.local'}
2018-02-06 09:36:27,955 [salt.state       ][INFO    ][2207] Completed state [prx02.mcp-pike-ovs-ha.local] at time 09:36:27.955031 duration_in_ms=1.386
2018-02-06 09:36:27,956 [salt.state       ][INFO    ][2207] Running state [msg02] at time 09:36:27.956103
2018-02-06 09:36:27,956 [salt.state       ][INFO    ][2207] Executing state host.present for msg02
2018-02-06 09:36:27,957 [salt.state       ][INFO    ][2207] {'host': 'msg02'}
2018-02-06 09:36:27,957 [salt.state       ][INFO    ][2207] Completed state [msg02] at time 09:36:27.957536 duration_in_ms=1.434
2018-02-06 09:36:27,957 [salt.state       ][INFO    ][2207] Running state [msg02.mcp-pike-ovs-ha.local] at time 09:36:27.957765
2018-02-06 09:36:27,958 [salt.state       ][INFO    ][2207] Executing state host.present for msg02.mcp-pike-ovs-ha.local
2018-02-06 09:36:27,959 [salt.state       ][INFO    ][2207] {'host': 'msg02.mcp-pike-ovs-ha.local'}
2018-02-06 09:36:27,959 [salt.state       ][INFO    ][2207] Completed state [msg02.mcp-pike-ovs-ha.local] at time 09:36:27.959196 duration_in_ms=1.431
2018-02-06 09:36:27,959 [salt.state       ][INFO    ][2207] Running state [msg03] at time 09:36:27.959435
2018-02-06 09:36:27,959 [salt.state       ][INFO    ][2207] Executing state host.present for msg03
2018-02-06 09:36:27,960 [salt.state       ][INFO    ][2207] {'host': 'msg03'}
2018-02-06 09:36:27,960 [salt.state       ][INFO    ][2207] Completed state [msg03] at time 09:36:27.960860 duration_in_ms=1.425
2018-02-06 09:36:27,961 [salt.state       ][INFO    ][2207] Running state [msg03.mcp-pike-ovs-ha.local] at time 09:36:27.961100
2018-02-06 09:36:27,961 [salt.state       ][INFO    ][2207] Executing state host.present for msg03.mcp-pike-ovs-ha.local
2018-02-06 09:36:27,962 [salt.state       ][INFO    ][2207] {'host': 'msg03.mcp-pike-ovs-ha.local'}
2018-02-06 09:36:27,962 [salt.state       ][INFO    ][2207] Completed state [msg03.mcp-pike-ovs-ha.local] at time 09:36:27.962716 duration_in_ms=1.615
2018-02-06 09:36:27,962 [salt.state       ][INFO    ][2207] Running state [msg01] at time 09:36:27.962955
2018-02-06 09:36:27,963 [salt.state       ][INFO    ][2207] Executing state host.present for msg01
2018-02-06 09:36:27,964 [salt.state       ][INFO    ][2207] {'host': 'msg01'}
2018-02-06 09:36:27,964 [salt.state       ][INFO    ][2207] Completed state [msg01] at time 09:36:27.964390 duration_in_ms=1.434
2018-02-06 09:36:27,964 [salt.state       ][INFO    ][2207] Running state [msg01.mcp-pike-ovs-ha.local] at time 09:36:27.964622
2018-02-06 09:36:27,964 [salt.state       ][INFO    ][2207] Executing state host.present for msg01.mcp-pike-ovs-ha.local
2018-02-06 09:36:27,965 [salt.state       ][INFO    ][2207] {'host': 'msg01.mcp-pike-ovs-ha.local'}
2018-02-06 09:36:27,966 [salt.state       ][INFO    ][2207] Completed state [msg01.mcp-pike-ovs-ha.local] at time 09:36:27.966097 duration_in_ms=1.475
2018-02-06 09:36:27,966 [salt.state       ][INFO    ][2207] Running state [msg] at time 09:36:27.966332
2018-02-06 09:36:27,966 [salt.state       ][INFO    ][2207] Executing state host.present for msg
2018-02-06 09:36:27,968 [salt.state       ][INFO    ][2207] {'host': 'msg'}
2018-02-06 09:36:27,968 [salt.state       ][INFO    ][2207] Completed state [msg] at time 09:36:27.968599 duration_in_ms=2.267
2018-02-06 09:36:27,968 [salt.state       ][INFO    ][2207] Running state [msg.mcp-pike-ovs-ha.local] at time 09:36:27.968828
2018-02-06 09:36:27,969 [salt.state       ][INFO    ][2207] Executing state host.present for msg.mcp-pike-ovs-ha.local
2018-02-06 09:36:27,970 [salt.state       ][INFO    ][2207] {'host': 'msg.mcp-pike-ovs-ha.local'}
2018-02-06 09:36:27,970 [salt.state       ][INFO    ][2207] Completed state [msg.mcp-pike-ovs-ha.local] at time 09:36:27.970314 duration_in_ms=1.485
2018-02-06 09:36:27,970 [salt.state       ][INFO    ][2207] Running state [cfg01] at time 09:36:27.970549
2018-02-06 09:36:27,970 [salt.state       ][INFO    ][2207] Executing state host.present for cfg01
2018-02-06 09:36:27,971 [salt.state       ][INFO    ][2207] Host cfg01 (192.168.10.100) already present
2018-02-06 09:36:27,971 [salt.state       ][INFO    ][2207] Completed state [cfg01] at time 09:36:27.971689 duration_in_ms=1.14
2018-02-06 09:36:27,971 [salt.state       ][INFO    ][2207] Running state [cfg01.mcp-pike-ovs-ha.local] at time 09:36:27.971913
2018-02-06 09:36:27,972 [salt.state       ][INFO    ][2207] Executing state host.present for cfg01.mcp-pike-ovs-ha.local
2018-02-06 09:36:27,972 [salt.state       ][INFO    ][2207] Host cfg01.mcp-pike-ovs-ha.local (192.168.10.100) already present
2018-02-06 09:36:27,972 [salt.state       ][INFO    ][2207] Completed state [cfg01.mcp-pike-ovs-ha.local] at time 09:36:27.972746 duration_in_ms=0.833
2018-02-06 09:36:27,973 [salt.state       ][INFO    ][2207] Running state [cmp002] at time 09:36:27.972974
2018-02-06 09:36:27,973 [salt.state       ][INFO    ][2207] Executing state host.present for cmp002
2018-02-06 09:36:27,974 [salt.state       ][INFO    ][2207] {'host': 'cmp002'}
2018-02-06 09:36:27,974 [salt.state       ][INFO    ][2207] Completed state [cmp002] at time 09:36:27.974448 duration_in_ms=1.474
2018-02-06 09:36:27,974 [salt.state       ][INFO    ][2207] Running state [cmp002.mcp-pike-ovs-ha.local] at time 09:36:27.974689
2018-02-06 09:36:27,974 [salt.state       ][INFO    ][2207] Executing state host.present for cmp002.mcp-pike-ovs-ha.local
2018-02-06 09:36:27,975 [salt.state       ][INFO    ][2207] {'host': 'cmp002.mcp-pike-ovs-ha.local'}
2018-02-06 09:36:27,976 [salt.state       ][INFO    ][2207] Completed state [cmp002.mcp-pike-ovs-ha.local] at time 09:36:27.976150 duration_in_ms=1.46
2018-02-06 09:36:27,976 [salt.state       ][INFO    ][2207] Running state [cmp001] at time 09:36:27.976378
2018-02-06 09:36:27,976 [salt.state       ][INFO    ][2207] Executing state host.present for cmp001
2018-02-06 09:36:27,977 [salt.state       ][INFO    ][2207] {'host': 'cmp001'}
2018-02-06 09:36:27,977 [salt.state       ][INFO    ][2207] Completed state [cmp001] at time 09:36:27.977869 duration_in_ms=1.491
2018-02-06 09:36:27,978 [salt.state       ][INFO    ][2207] Running state [cmp001.mcp-pike-ovs-ha.local] at time 09:36:27.978100
2018-02-06 09:36:27,978 [salt.state       ][INFO    ][2207] Executing state host.present for cmp001.mcp-pike-ovs-ha.local
2018-02-06 09:36:27,979 [salt.state       ][INFO    ][2207] {'host': 'cmp001.mcp-pike-ovs-ha.local'}
2018-02-06 09:36:27,979 [salt.state       ][INFO    ][2207] Completed state [cmp001.mcp-pike-ovs-ha.local] at time 09:36:27.979588 duration_in_ms=1.488
2018-02-06 09:36:27,979 [salt.state       ][INFO    ][2207] Running state [dbs01] at time 09:36:27.979828
2018-02-06 09:36:27,980 [salt.state       ][INFO    ][2207] Executing state host.present for dbs01
2018-02-06 09:36:27,981 [salt.state       ][INFO    ][2207] {'host': 'dbs01'}
2018-02-06 09:36:27,981 [salt.state       ][INFO    ][2207] Completed state [dbs01] at time 09:36:27.981464 duration_in_ms=1.636
2018-02-06 09:36:27,981 [salt.state       ][INFO    ][2207] Running state [dbs01.mcp-pike-ovs-ha.local] at time 09:36:27.981703
2018-02-06 09:36:27,981 [salt.state       ][INFO    ][2207] Executing state host.present for dbs01.mcp-pike-ovs-ha.local
2018-02-06 09:36:27,983 [salt.state       ][INFO    ][2207] {'host': 'dbs01.mcp-pike-ovs-ha.local'}
2018-02-06 09:36:27,984 [salt.state       ][INFO    ][2207] Completed state [dbs01.mcp-pike-ovs-ha.local] at time 09:36:27.984014 duration_in_ms=2.311
2018-02-06 09:36:27,984 [salt.state       ][INFO    ][2207] Running state [dbs02] at time 09:36:27.984244
2018-02-06 09:36:27,984 [salt.state       ][INFO    ][2207] Executing state host.present for dbs02
2018-02-06 09:36:27,985 [salt.state       ][INFO    ][2207] {'host': 'dbs02'}
2018-02-06 09:36:27,985 [salt.state       ][INFO    ][2207] Completed state [dbs02] at time 09:36:27.985771 duration_in_ms=1.527
2018-02-06 09:36:27,986 [salt.state       ][INFO    ][2207] Running state [dbs02.mcp-pike-ovs-ha.local] at time 09:36:27.985999
2018-02-06 09:36:27,986 [salt.state       ][INFO    ][2207] Executing state host.present for dbs02.mcp-pike-ovs-ha.local
2018-02-06 09:36:27,987 [salt.state       ][INFO    ][2207] {'host': 'dbs02.mcp-pike-ovs-ha.local'}
2018-02-06 09:36:27,987 [salt.state       ][INFO    ][2207] Completed state [dbs02.mcp-pike-ovs-ha.local] at time 09:36:27.987807 duration_in_ms=1.808
2018-02-06 09:36:27,988 [salt.state       ][INFO    ][2207] Running state [dbs03] at time 09:36:27.988044
2018-02-06 09:36:27,988 [salt.state       ][INFO    ][2207] Executing state host.present for dbs03
2018-02-06 09:36:27,989 [salt.state       ][INFO    ][2207] {'host': 'dbs03'}
2018-02-06 09:36:27,989 [salt.state       ][INFO    ][2207] Completed state [dbs03] at time 09:36:27.989642 duration_in_ms=1.598
2018-02-06 09:36:27,989 [salt.state       ][INFO    ][2207] Running state [dbs03.mcp-pike-ovs-ha.local] at time 09:36:27.989872
2018-02-06 09:36:27,990 [salt.state       ][INFO    ][2207] Executing state host.present for dbs03.mcp-pike-ovs-ha.local
2018-02-06 09:36:27,992 [salt.state       ][INFO    ][2207] {'host': 'dbs03.mcp-pike-ovs-ha.local'}
2018-02-06 09:36:27,992 [salt.state       ][INFO    ][2207] Completed state [dbs03.mcp-pike-ovs-ha.local] at time 09:36:27.992481 duration_in_ms=2.609
2018-02-06 09:36:27,992 [salt.state       ][INFO    ][2207] Running state [mas01] at time 09:36:27.992711
2018-02-06 09:36:27,992 [salt.state       ][INFO    ][2207] Executing state host.present for mas01
2018-02-06 09:36:27,994 [salt.state       ][INFO    ][2207] {'host': 'mas01'}
2018-02-06 09:36:27,994 [salt.state       ][INFO    ][2207] Completed state [mas01] at time 09:36:27.994268 duration_in_ms=1.557
2018-02-06 09:36:27,994 [salt.state       ][INFO    ][2207] Running state [mas01.mcp-pike-ovs-ha.local] at time 09:36:27.994501
2018-02-06 09:36:27,994 [salt.state       ][INFO    ][2207] Executing state host.present for mas01.mcp-pike-ovs-ha.local
2018-02-06 09:36:27,996 [salt.state       ][INFO    ][2207] {'host': 'mas01.mcp-pike-ovs-ha.local'}
2018-02-06 09:36:27,996 [salt.state       ][INFO    ][2207] Completed state [mas01.mcp-pike-ovs-ha.local] at time 09:36:27.996325 duration_in_ms=1.824
2018-02-06 09:36:27,996 [salt.state       ][INFO    ][2207] Running state [ctl02] at time 09:36:27.996564
2018-02-06 09:36:27,996 [salt.state       ][INFO    ][2207] Executing state host.present for ctl02
2018-02-06 09:36:28,000 [salt.state       ][INFO    ][2207] {'host': 'ctl02'}
2018-02-06 09:36:28,000 [salt.state       ][INFO    ][2207] Completed state [ctl02] at time 09:36:28.000848 duration_in_ms=4.284
2018-02-06 09:36:28,001 [salt.state       ][INFO    ][2207] Running state [ctl02.mcp-pike-ovs-ha.local] at time 09:36:28.001087
2018-02-06 09:36:28,001 [salt.state       ][INFO    ][2207] Executing state host.present for ctl02.mcp-pike-ovs-ha.local
2018-02-06 09:36:28,002 [salt.state       ][INFO    ][2207] {'host': 'ctl02.mcp-pike-ovs-ha.local'}
2018-02-06 09:36:28,002 [salt.state       ][INFO    ][2207] Completed state [ctl02.mcp-pike-ovs-ha.local] at time 09:36:28.002682 duration_in_ms=1.595
2018-02-06 09:36:28,002 [salt.state       ][INFO    ][2207] Running state [ctl03] at time 09:36:28.002923
2018-02-06 09:36:28,004 [salt.state       ][INFO    ][2207] Executing state host.present for ctl03
2018-02-06 09:36:28,005 [salt.state       ][INFO    ][2207] {'host': 'ctl03'}
2018-02-06 09:36:28,005 [salt.state       ][INFO    ][2207] Completed state [ctl03] at time 09:36:28.005336 duration_in_ms=2.413
2018-02-06 09:36:28,005 [salt.state       ][INFO    ][2207] Running state [ctl03.mcp-pike-ovs-ha.local] at time 09:36:28.005586
2018-02-06 09:36:28,005 [salt.state       ][INFO    ][2207] Executing state host.present for ctl03.mcp-pike-ovs-ha.local
2018-02-06 09:36:28,006 [salt.state       ][INFO    ][2207] {'host': 'ctl03.mcp-pike-ovs-ha.local'}
2018-02-06 09:36:28,008 [salt.state       ][INFO    ][2207] Completed state [ctl03.mcp-pike-ovs-ha.local] at time 09:36:28.007975 duration_in_ms=2.389
2018-02-06 09:36:28,008 [salt.state       ][INFO    ][2207] Running state [ctl01] at time 09:36:28.008203
2018-02-06 09:36:28,008 [salt.state       ][INFO    ][2207] Executing state host.present for ctl01
2018-02-06 09:36:28,009 [salt.state       ][INFO    ][2207] {'host': 'ctl01'}
2018-02-06 09:36:28,009 [salt.state       ][INFO    ][2207] Completed state [ctl01] at time 09:36:28.009777 duration_in_ms=1.573
2018-02-06 09:36:28,010 [salt.state       ][INFO    ][2207] Running state [ctl01.mcp-pike-ovs-ha.local] at time 09:36:28.010007
2018-02-06 09:36:28,010 [salt.state       ][INFO    ][2207] Executing state host.present for ctl01.mcp-pike-ovs-ha.local
2018-02-06 09:36:28,011 [salt.state       ][INFO    ][2207] {'host': 'ctl01.mcp-pike-ovs-ha.local'}
2018-02-06 09:36:28,011 [salt.state       ][INFO    ][2207] Completed state [ctl01.mcp-pike-ovs-ha.local] at time 09:36:28.011572 duration_in_ms=1.564
2018-02-06 09:36:28,011 [salt.state       ][INFO    ][2207] Running state [ctl] at time 09:36:28.011801
2018-02-06 09:36:28,012 [salt.state       ][INFO    ][2207] Executing state host.present for ctl
2018-02-06 09:36:28,013 [salt.state       ][INFO    ][2207] {'host': 'ctl'}
2018-02-06 09:36:28,013 [salt.state       ][INFO    ][2207] Completed state [ctl] at time 09:36:28.013385 duration_in_ms=1.584
2018-02-06 09:36:28,013 [salt.state       ][INFO    ][2207] Running state [ctl.mcp-pike-ovs-ha.local] at time 09:36:28.013616
2018-02-06 09:36:28,013 [salt.state       ][INFO    ][2207] Executing state host.present for ctl.mcp-pike-ovs-ha.local
2018-02-06 09:36:28,015 [salt.state       ][INFO    ][2207] {'host': 'ctl.mcp-pike-ovs-ha.local'}
2018-02-06 09:36:28,015 [salt.state       ][INFO    ][2207] Completed state [ctl.mcp-pike-ovs-ha.local] at time 09:36:28.015508 duration_in_ms=1.892
2018-02-06 09:36:28,015 [salt.state       ][INFO    ][2207] Running state [/etc/network/interfaces.d/50-cloud-init.cfg] at time 09:36:28.015742
2018-02-06 09:36:28,016 [salt.state       ][INFO    ][2207] Executing state file.absent for /etc/network/interfaces.d/50-cloud-init.cfg
2018-02-06 09:36:28,016 [salt.state       ][INFO    ][2207] {'removed': '/etc/network/interfaces.d/50-cloud-init.cfg'}
2018-02-06 09:36:28,016 [salt.state       ][INFO    ][2207] Completed state [/etc/network/interfaces.d/50-cloud-init.cfg] at time 09:36:28.016565 duration_in_ms=0.823
2018-02-06 09:36:28,026 [salt.state       ][INFO    ][2207] Running state [ens2] at time 09:36:28.026895
2018-02-06 09:36:28,027 [salt.state       ][INFO    ][2207] Executing state network.managed for ens2
2018-02-06 09:36:28,186 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['ifdown', 'ens2'] in directory '/root'
2018-02-06 09:36:29,333 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['ifup', 'ens2'] in directory '/root'
2018-02-06 09:36:30,689 [salt.state       ][INFO    ][2207] {'interface': 'Added network interface.', 'status': 'Interface ens2 restart to validate'}
2018-02-06 09:36:30,690 [salt.state       ][INFO    ][2207] Completed state [ens2] at time 09:36:30.690699 duration_in_ms=2663.803
2018-02-06 09:36:30,692 [salt.state       ][INFO    ][2207] Running state [ens3] at time 09:36:30.692832
2018-02-06 09:36:30,693 [salt.state       ][INFO    ][2207] Executing state network.managed for ens3
2018-02-06 09:36:30,724 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['ifup', 'ens3'] in directory '/root'
2018-02-06 09:36:31,479 [salt.state       ][INFO    ][2207] {'interface': 'Added network interface.', 'status': 'Interface ens3 is up'}
2018-02-06 09:36:31,480 [salt.state       ][INFO    ][2207] Completed state [ens3] at time 09:36:31.480879 duration_in_ms=788.046
2018-02-06 09:36:31,481 [salt.state       ][INFO    ][2207] Running state [ens3] at time 09:36:31.481518
2018-02-06 09:36:31,482 [salt.state       ][INFO    ][2207] Executing state network.routes for ens3
2018-02-06 09:36:31,492 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['systemctl', 'status', 'networking.service', '-n', '0'] in directory '/root'
2018-02-06 09:36:31,513 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['systemd-run', '--scope', 'systemctl', 'stop', 'networking.service'] in directory '/root'
2018-02-06 09:36:34,728 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['systemctl', 'is-enabled', 'networking.service'] in directory '/root'
2018-02-06 09:36:34,750 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['systemd-run', '--scope', 'systemctl', 'start', 'networking.service'] in directory '/root'
2018-02-06 09:36:35,135 [salt.loaded.int.module.cmdmod][ERROR   ][2207] Command '['systemd-run', '--scope', 'systemctl', 'start', 'networking.service']' failed with return code: 1
2018-02-06 09:36:35,136 [salt.loaded.int.module.cmdmod][ERROR   ][2207] output: Running scope as unit run-ra490f2e77b35455d9e41b0e8147cfa9c.scope.
Job for networking.service failed because the control process exited with error code. See "systemctl status networking.service" and "journalctl -xe" for details.
2018-02-06 09:36:35,136 [salt.state       ][INFO    ][2207] {'network_routes': 'Added interface ens3 routes.'}
2018-02-06 09:36:35,136 [salt.state       ][INFO    ][2207] Completed state [ens3] at time 09:36:35.136794 duration_in_ms=3655.275
2018-02-06 09:36:35,137 [salt.state       ][INFO    ][2207] Running state [ens4] at time 09:36:35.137263
2018-02-06 09:36:35,137 [salt.state       ][INFO    ][2207] Executing state network.managed for ens4
2018-02-06 09:36:35,172 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['ifup', 'ens4'] in directory '/root'
2018-02-06 09:36:35,256 [salt.loaded.int.module.cmdmod][ERROR   ][2207] Command '['ifup', 'ens4']' failed with return code: 1
2018-02-06 09:36:35,257 [salt.loaded.int.module.cmdmod][ERROR   ][2207] output: SIOCADDRT: File exists
run-parts: /etc/network/if-up.d/route-ens3 exited with return code 7
Failed to bring up ens4.
2018-02-06 09:36:35,877 [salt.state       ][INFO    ][2207] {'interface': 'Added network interface.', 'status': 'Interface ens4 is up'}
2018-02-06 09:36:35,878 [salt.state       ][INFO    ][2207] Completed state [ens4] at time 09:36:35.878285 duration_in_ms=741.022
2018-02-06 09:36:35,878 [salt.state       ][INFO    ][2207] Running state [/etc/profile.d/proxy.sh] at time 09:36:35.878912
2018-02-06 09:36:35,880 [salt.state       ][INFO    ][2207] Executing state file.absent for /etc/profile.d/proxy.sh
2018-02-06 09:36:35,881 [salt.state       ][INFO    ][2207] File /etc/profile.d/proxy.sh is not present
2018-02-06 09:36:35,881 [salt.state       ][INFO    ][2207] Completed state [/etc/profile.d/proxy.sh] at time 09:36:35.881583 duration_in_ms=2.671
2018-02-06 09:36:35,882 [salt.state       ][INFO    ][2207] Running state [/etc/apt/apt.conf.d/95proxies] at time 09:36:35.882071
2018-02-06 09:36:35,882 [salt.state       ][INFO    ][2207] Executing state file.absent for /etc/apt/apt.conf.d/95proxies
2018-02-06 09:36:35,883 [salt.state       ][INFO    ][2207] File /etc/apt/apt.conf.d/95proxies is not present
2018-02-06 09:36:35,883 [salt.state       ][INFO    ][2207] Completed state [/etc/apt/apt.conf.d/95proxies] at time 09:36:35.883822 duration_in_ms=1.751
2018-02-06 09:36:35,885 [salt.state       ][INFO    ][2207] Running state [ntp] at time 09:36:35.885649
2018-02-06 09:36:35,886 [salt.state       ][INFO    ][2207] Executing state pkg.installed for ntp
2018-02-06 09:36:36,196 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-02-06 09:36:36,231 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'ntp'] in directory '/root'
2018-02-06 09:36:37,334 [salt.minion      ][INFO    ][2116] User sudo_ubuntu Executing command saltutil.find_job with jid 20180206093636761245
2018-02-06 09:36:37,356 [salt.minion      ][INFO    ][6000] Starting a new job with PID 6000
2018-02-06 09:36:37,376 [salt.minion      ][INFO    ][6000] Returning information for job: 20180206093636761245
2018-02-06 09:36:39,606 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-02-06 09:36:39,652 [salt.state       ][INFO    ][2207] Made the following changes:
'ntp' changed from 'absent' to '1:4.2.8p4+dfsg-3ubuntu5.7'
'libopts25' changed from 'absent' to '1:5.18.7-3'

2018-02-06 09:36:39,674 [salt.state       ][INFO    ][2207] Loading fresh modules for state activity
2018-02-06 09:36:39,709 [salt.state       ][INFO    ][2207] Completed state [ntp] at time 09:36:39.709583 duration_in_ms=3823.933
2018-02-06 09:36:39,713 [salt.state       ][INFO    ][2207] Running state [/etc/ntp.conf] at time 09:36:39.713910
2018-02-06 09:36:39,714 [salt.state       ][INFO    ][2207] Executing state file.managed for /etc/ntp.conf
2018-02-06 09:36:39,740 [salt.fileclient  ][INFO    ][2207] Fetching file from saltenv 'base', ** done ** 'ntp/files/ntp.conf'
2018-02-06 09:36:39,796 [salt.state       ][INFO    ][2207] File changed:
--- 
+++ 
@@ -1,66 +1,24 @@
-# /etc/ntp.conf, configuration for ntpd; see ntp.conf(5) for help
 
-driftfile /var/lib/ntp/ntp.drift
 
-# Enable this if you want statistics to be logged.
-#statsdir /var/log/ntpstats/
+# ntpd will only synchronize your clock.
 
-statistics loopstats peerstats clockstats
-filegen loopstats file loopstats type day enable
-filegen peerstats file peerstats type day enable
-filegen clockstats file clockstats type day enable
+# For details, see:
+# - the ntp.conf man page
+# - http://support.ntp.org/bin/view/Support/GettingStarted
+# - https://wiki.archlinux.org/index.php/Network_Time_Protocol_daemon
 
-# Specify one or more NTP servers.
+# Associate to cloud NTP pool servers
+server 1.pool.ntp.org iburst
+server 0.pool.ntp.org
 
-# Use servers from the NTP Pool Project. Approved by Ubuntu Technical Board
-# on 2011-02-08 (LP: #104525). See http://www.pool.ntp.org/join.html for
-# more information.
-pool 0.ubuntu.pool.ntp.org iburst
-pool 1.ubuntu.pool.ntp.org iburst
-pool 2.ubuntu.pool.ntp.org iburst
-pool 3.ubuntu.pool.ntp.org iburst
-
-# Use Ubuntu's ntp server as a fallback.
-pool ntp.ubuntu.com
-
-# Access control configuration; see /usr/share/doc/ntp-doc/html/accopt.html for
-# details.  The web page <http://support.ntp.org/bin/view/Support/AccessRestrictions>
-# might also be helpful.
-#
-# Note that "restrict" applies to both servers and clients, so a configuration
-# that might be intended to block requests from certain clients could also end
-# up blocking replies from your own upstream servers.
-
-# By default, exchange time with everybody, but don't allow configuration.
-restrict -4 default kod notrap nomodify nopeer noquery limited
-restrict -6 default kod notrap nomodify nopeer noquery limited
-
-# Local users may interrogate the ntp server more closely.
+# Only allow read-only access from localhost
+restrict default noquery nopeer
 restrict 127.0.0.1
 restrict ::1
 
-# Needed for adding pool entries
-restrict source notrap nomodify noquery
-
-# Clients from this (example!) subnet have unlimited access, but only if
-# cryptographically authenticated.
-#restrict 192.168.123.0 mask 255.255.255.0 notrust
+# mode7 is required for collectd monitoring
 
 
-# If you want to provide time to your local subnet, change the next line.
-# (Again, the address is an example only.)
-#broadcast 192.168.123.255
-
-# If you want to listen to time broadcasts on your local subnet, de-comment the
-# next lines.  Please do this only if you trust everybody on the network!
-#disable auth
-#broadcastclient
-
-#Changes recquired to use pps synchonisation as explained in documentation:
-#http://www.ntp.org/ntpfaq/NTP-s-config-adv.htm#AEN3918
-
-#server 127.127.8.1 mode 135 prefer    # Meinberg GPS167 with PPS
-#fudge 127.127.8.1 time1 0.0042        # relative to PPS for my hardware
-
-#server 127.127.22.1                   # ATOM(PPS)
-#fudge 127.127.22.1 flag3 1            # enable PPS API
+# Location of drift file
+driftfile /var/lib/ntp/ntp.drift
+logfile /var/log/ntp.log
2018-02-06 09:36:39,796 [salt.state       ][INFO    ][2207] Completed state [/etc/ntp.conf] at time 09:36:39.796645 duration_in_ms=82.734
2018-02-06 09:36:39,938 [salt.state       ][INFO    ][2207] Running state [ntp] at time 09:36:39.938435
2018-02-06 09:36:39,938 [salt.state       ][INFO    ][2207] Executing state service.running for ntp
2018-02-06 09:36:39,941 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['systemctl', 'status', 'ntp.service', '-n', '0'] in directory '/root'
2018-02-06 09:36:39,961 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['systemctl', 'is-active', 'ntp.service'] in directory '/root'
2018-02-06 09:36:39,979 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['systemctl', 'is-enabled', 'ntp.service'] in directory '/root'
2018-02-06 09:36:39,999 [salt.state       ][INFO    ][2207] The service ntp is already running
2018-02-06 09:36:39,999 [salt.state       ][INFO    ][2207] Completed state [ntp] at time 09:36:39.999703 duration_in_ms=61.268
2018-02-06 09:36:40,000 [salt.state       ][INFO    ][2207] Running state [ntp] at time 09:36:40.000038
2018-02-06 09:36:40,000 [salt.state       ][INFO    ][2207] Executing state service.mod_watch for ntp
2018-02-06 09:36:40,001 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['systemctl', 'is-active', 'ntp.service'] in directory '/root'
2018-02-06 09:36:40,018 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['systemctl', 'is-enabled', 'ntp.service'] in directory '/root'
2018-02-06 09:36:40,037 [salt.loaded.int.module.cmdmod][INFO    ][2207] Executing command ['systemd-run', '--scope', 'systemctl', 'restart', 'ntp.service'] in directory '/root'
2018-02-06 09:36:40,158 [salt.state       ][INFO    ][2207] {'ntp': True}
2018-02-06 09:36:40,159 [salt.state       ][INFO    ][2207] Completed state [ntp] at time 09:36:40.159011 duration_in_ms=158.972
2018-02-06 09:36:40,170 [salt.minion      ][INFO    ][2207] Returning information for job: 20180206093556267240
2018-02-06 09:37:07,757 [salt.minion      ][INFO    ][2116] User sudo_ubuntu Executing command state.apply with jid 20180206093707738134
2018-02-06 09:37:07,784 [salt.minion      ][INFO    ][6285] Starting a new job with PID 6285
2018-02-06 09:37:11,534 [salt.state       ][INFO    ][6285] Loading fresh modules for state activity
2018-02-06 09:37:14,746 [salt.state       ][INFO    ][6285] Running state [/etc/environment] at time 09:37:14.746660
2018-02-06 09:37:14,748 [salt.state       ][INFO    ][6285] Executing state file.blockreplace for /etc/environment
2018-02-06 09:37:14,756 [salt.state       ][INFO    ][6285] File changed:
--- 
+++ 
@@ -1,3 +1,4 @@
 PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games"
 # SALT MANAGED VARIABLES - DO NOT EDIT - START
+# 
 # # SALT MANAGED VARIABLES - END

2018-02-06 09:37:14,758 [salt.state       ][INFO    ][6285] Completed state [/etc/environment] at time 09:37:14.758244 duration_in_ms=11.584
2018-02-06 09:37:14,758 [salt.state       ][INFO    ][6285] Running state [/etc/profile.d] at time 09:37:14.758743
2018-02-06 09:37:14,759 [salt.state       ][INFO    ][6285] Executing state file.directory for /etc/profile.d
2018-02-06 09:37:14,761 [salt.state       ][INFO    ][6285] Directory /etc/profile.d is in the correct state
2018-02-06 09:37:14,761 [salt.state       ][INFO    ][6285] Completed state [/etc/profile.d] at time 09:37:14.761497 duration_in_ms=2.754
2018-02-06 09:37:15,303 [salt.state       ][INFO    ][6285] Running state [/etc/apt/apt.conf.d/99compression-workaround-salt] at time 09:37:15.303852
2018-02-06 09:37:15,304 [salt.state       ][INFO    ][6285] Executing state file.managed for /etc/apt/apt.conf.d/99compression-workaround-salt
2018-02-06 09:37:15,334 [salt.state       ][INFO    ][6285] File /etc/apt/apt.conf.d/99compression-workaround-salt is in the correct state
2018-02-06 09:37:15,334 [salt.state       ][INFO    ][6285] Completed state [/etc/apt/apt.conf.d/99compression-workaround-salt] at time 09:37:15.334531 duration_in_ms=30.679
2018-02-06 09:37:15,334 [salt.state       ][INFO    ][6285] Running state [/etc/apt/apt.conf.d/99prefer_ipv4-salt] at time 09:37:15.334795
2018-02-06 09:37:15,335 [salt.state       ][INFO    ][6285] Executing state file.managed for /etc/apt/apt.conf.d/99prefer_ipv4-salt
2018-02-06 09:37:15,355 [salt.state       ][INFO    ][6285] File /etc/apt/apt.conf.d/99prefer_ipv4-salt is in the correct state
2018-02-06 09:37:15,355 [salt.state       ][INFO    ][6285] Completed state [/etc/apt/apt.conf.d/99prefer_ipv4-salt] at time 09:37:15.355448 duration_in_ms=20.652
2018-02-06 09:37:15,356 [salt.state       ][INFO    ][6285] Running state [linux_repo_prereq_pkgs] at time 09:37:15.356659
2018-02-06 09:37:15,356 [salt.state       ][INFO    ][6285] Executing state pkg.installed for linux_repo_prereq_pkgs
2018-02-06 09:37:15,357 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-02-06 09:37:15,780 [salt.state       ][INFO    ][6285] All specified packages are already installed
2018-02-06 09:37:15,781 [salt.state       ][INFO    ][6285] Completed state [linux_repo_prereq_pkgs] at time 09:37:15.781117 duration_in_ms=424.457
2018-02-06 09:37:15,781 [salt.state       ][INFO    ][6285] Running state [/etc/apt/apt.conf.d/99proxies-salt-uca] at time 09:37:15.781706
2018-02-06 09:37:15,782 [salt.state       ][INFO    ][6285] Executing state file.absent for /etc/apt/apt.conf.d/99proxies-salt-uca
2018-02-06 09:37:15,783 [salt.state       ][INFO    ][6285] File /etc/apt/apt.conf.d/99proxies-salt-uca is not present
2018-02-06 09:37:15,783 [salt.state       ][INFO    ][6285] Completed state [/etc/apt/apt.conf.d/99proxies-salt-uca] at time 09:37:15.783530 duration_in_ms=1.824
2018-02-06 09:37:15,784 [salt.state       ][INFO    ][6285] Running state [/etc/apt/preferences.d/uca] at time 09:37:15.784021
2018-02-06 09:37:15,784 [salt.state       ][INFO    ][6285] Executing state file.absent for /etc/apt/preferences.d/uca
2018-02-06 09:37:15,785 [salt.state       ][INFO    ][6285] File /etc/apt/preferences.d/uca is not present
2018-02-06 09:37:15,785 [salt.state       ][INFO    ][6285] Completed state [/etc/apt/preferences.d/uca] at time 09:37:15.785487 duration_in_ms=1.466
2018-02-06 09:37:15,788 [salt.state       ][INFO    ][6285] Running state [apt-key adv --keyserver keyserver.ubuntu.com --recv EC4926EA] at time 09:37:15.788275
2018-02-06 09:37:15,788 [salt.state       ][INFO    ][6285] Executing state cmd.run for apt-key adv --keyserver keyserver.ubuntu.com --recv EC4926EA
2018-02-06 09:37:15,789 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command 'apt-key adv --keyserver keyserver.ubuntu.com --recv EC4926EA' in directory '/root'
2018-02-06 09:37:15,972 [salt.state       ][INFO    ][6285] {'pid': 6343, 'retcode': 0, 'stderr': 'gpg: requesting key EC4926EA from hkp server keyserver.ubuntu.com\ngpg: key EC4926EA: "Canonical Cloud Archive Signing Key <ftpmaster@canonical.com>" not changed\ngpg: Total number processed: 1\ngpg:              unchanged: 1', 'stdout': 'Executing: /tmp/tmp.Omugc4kNWF/gpg.1.sh --keyserver\nkeyserver.ubuntu.com\n--recv\nEC4926EA'}
2018-02-06 09:37:15,973 [salt.state       ][INFO    ][6285] Completed state [apt-key adv --keyserver keyserver.ubuntu.com --recv EC4926EA] at time 09:37:15.973252 duration_in_ms=184.976
2018-02-06 09:37:15,978 [salt.state       ][INFO    ][6285] Running state [deb http://ubuntu-cloud.archive.canonical.com/ubuntu xenial-updates/pike main] at time 09:37:15.978335
2018-02-06 09:37:15,978 [salt.state       ][INFO    ][6285] Executing state pkgrepo.managed for deb http://ubuntu-cloud.archive.canonical.com/ubuntu xenial-updates/pike main
2018-02-06 09:37:16,082 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command ['apt-get', '-q', 'update'] in directory '/root'
2018-02-06 09:37:17,852 [salt.minion      ][INFO    ][2116] User sudo_ubuntu Executing command saltutil.find_job with jid 20180206093717830158
2018-02-06 09:37:17,886 [salt.minion      ][INFO    ][6737] Starting a new job with PID 6737
2018-02-06 09:37:17,971 [salt.minion      ][INFO    ][6737] Returning information for job: 20180206093717830158
2018-02-06 09:37:19,693 [salt.state       ][INFO    ][6285] Configured package repo 'deb http://ubuntu-cloud.archive.canonical.com/ubuntu xenial-updates/pike main'
2018-02-06 09:37:19,694 [salt.state       ][INFO    ][6285] Completed state [deb http://ubuntu-cloud.archive.canonical.com/ubuntu xenial-updates/pike main] at time 09:37:19.694295 duration_in_ms=3715.96
2018-02-06 09:37:19,694 [salt.state       ][INFO    ][6285] Running state [linux_extra_packages_purged] at time 09:37:19.694946
2018-02-06 09:37:19,696 [salt.state       ][INFO    ][6285] Executing state pkg.purged for linux_extra_packages_purged
2018-02-06 09:37:19,711 [salt.state       ][INFO    ][6285] All specified packages are already absent
2018-02-06 09:37:19,712 [salt.state       ][INFO    ][6285] Completed state [linux_extra_packages_purged] at time 09:37:19.712265 duration_in_ms=17.318
2018-02-06 09:37:19,712 [salt.state       ][INFO    ][6285] Running state [linux_extra_packages_latest] at time 09:37:19.712781
2018-02-06 09:37:19,713 [salt.state       ][INFO    ][6285] Executing state pkg.latest for linux_extra_packages_latest
2018-02-06 09:37:19,727 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command ['apt-cache', '-q', 'policy', 'libapache2-mod-wsgi'] in directory '/root'
2018-02-06 09:37:19,779 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command ['apt-cache', '-q', 'policy', 'mcelog'] in directory '/root'
2018-02-06 09:37:19,833 [salt.state       ][INFO    ][6285] All packages are up-to-date (libapache2-mod-wsgi, mcelog).
2018-02-06 09:37:19,834 [salt.state       ][INFO    ][6285] Completed state [linux_extra_packages_latest] at time 09:37:19.833978 duration_in_ms=121.196
2018-02-06 09:37:19,835 [salt.state       ][INFO    ][6285] Running state [UTC] at time 09:37:19.835231
2018-02-06 09:37:19,836 [salt.state       ][INFO    ][6285] Executing state timezone.system for UTC
2018-02-06 09:37:19,837 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command ['timedatectl'] in directory '/root'
2018-02-06 09:37:19,881 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command ['timedatectl'] in directory '/root'
2018-02-06 09:37:19,898 [salt.state       ][INFO    ][6285] Timezone UTC already set, UTC already set to UTC
2018-02-06 09:37:19,899 [salt.state       ][INFO    ][6285] Completed state [UTC] at time 09:37:19.899132 duration_in_ms=63.9
2018-02-06 09:37:19,901 [salt.state       ][INFO    ][6285] Running state [nf_conntrack] at time 09:37:19.901277
2018-02-06 09:37:19,901 [salt.state       ][INFO    ][6285] Executing state kmod.present for nf_conntrack
2018-02-06 09:37:19,902 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command 'lsmod' in directory '/root'
2018-02-06 09:37:19,921 [salt.state       ][INFO    ][6285] Kernel module nf_conntrack is already present
2018-02-06 09:37:19,921 [salt.state       ][INFO    ][6285] Completed state [nf_conntrack] at time 09:37:19.921639 duration_in_ms=20.362
2018-02-06 09:37:19,922 [salt.state       ][INFO    ][6285] Running state [net.ipv4.tcp_keepalive_probes] at time 09:37:19.922700
2018-02-06 09:37:19,923 [salt.state       ][INFO    ][6285] Executing state sysctl.present for net.ipv4.tcp_keepalive_probes
2018-02-06 09:37:19,939 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:37:19,979 [salt.state       ][INFO    ][6285] Sysctl value net.ipv4.tcp_keepalive_probes = 8 is already set
2018-02-06 09:37:19,980 [salt.state       ][INFO    ][6285] Completed state [net.ipv4.tcp_keepalive_probes] at time 09:37:19.980156 duration_in_ms=57.455
2018-02-06 09:37:19,980 [salt.state       ][INFO    ][6285] Running state [fs.file-max] at time 09:37:19.980686
2018-02-06 09:37:19,981 [salt.state       ][INFO    ][6285] Executing state sysctl.present for fs.file-max
2018-02-06 09:37:19,981 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:37:20,022 [salt.state       ][INFO    ][6285] Sysctl value fs.file-max = 124165 is already set
2018-02-06 09:37:20,023 [salt.state       ][INFO    ][6285] Completed state [fs.file-max] at time 09:37:20.023376 duration_in_ms=42.689
2018-02-06 09:37:20,023 [salt.state       ][INFO    ][6285] Running state [net.core.somaxconn] at time 09:37:20.023793
2018-02-06 09:37:20,024 [salt.state       ][INFO    ][6285] Executing state sysctl.present for net.core.somaxconn
2018-02-06 09:37:20,025 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:37:20,062 [salt.state       ][INFO    ][6285] Sysctl value net.core.somaxconn = 4096 is already set
2018-02-06 09:37:20,062 [salt.state       ][INFO    ][6285] Completed state [net.core.somaxconn] at time 09:37:20.062690 duration_in_ms=38.896
2018-02-06 09:37:20,063 [salt.state       ][INFO    ][6285] Running state [net.ipv4.tcp_max_syn_backlog] at time 09:37:20.063103
2018-02-06 09:37:20,064 [salt.state       ][INFO    ][6285] Executing state sysctl.present for net.ipv4.tcp_max_syn_backlog
2018-02-06 09:37:20,065 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:37:20,105 [salt.state       ][INFO    ][6285] Sysctl value net.ipv4.tcp_max_syn_backlog = 8192 is already set
2018-02-06 09:37:20,105 [salt.state       ][INFO    ][6285] Completed state [net.ipv4.tcp_max_syn_backlog] at time 09:37:20.105546 duration_in_ms=42.442
2018-02-06 09:37:20,105 [salt.state       ][INFO    ][6285] Running state [net.ipv4.tcp_tw_reuse] at time 09:37:20.105949
2018-02-06 09:37:20,106 [salt.state       ][INFO    ][6285] Executing state sysctl.present for net.ipv4.tcp_tw_reuse
2018-02-06 09:37:20,107 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:37:20,144 [salt.state       ][INFO    ][6285] Sysctl value net.ipv4.tcp_tw_reuse = 1 is already set
2018-02-06 09:37:20,145 [salt.state       ][INFO    ][6285] Completed state [net.ipv4.tcp_tw_reuse] at time 09:37:20.145348 duration_in_ms=39.398
2018-02-06 09:37:20,146 [salt.state       ][INFO    ][6285] Running state [net.ipv4.tcp_congestion_control] at time 09:37:20.145990
2018-02-06 09:37:20,146 [salt.state       ][INFO    ][6285] Executing state sysctl.present for net.ipv4.tcp_congestion_control
2018-02-06 09:37:20,148 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:37:20,187 [salt.state       ][INFO    ][6285] Sysctl value net.ipv4.tcp_congestion_control = yeah is already set
2018-02-06 09:37:20,188 [salt.state       ][INFO    ][6285] Completed state [net.ipv4.tcp_congestion_control] at time 09:37:20.188119 duration_in_ms=42.128
2018-02-06 09:37:20,188 [salt.state       ][INFO    ][6285] Running state [net.nf_conntrack_max] at time 09:37:20.188817
2018-02-06 09:37:20,189 [salt.state       ][INFO    ][6285] Executing state sysctl.present for net.nf_conntrack_max
2018-02-06 09:37:20,190 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:37:20,230 [salt.state       ][INFO    ][6285] Sysctl value net.nf_conntrack_max = 1048576 is already set
2018-02-06 09:37:20,232 [salt.state       ][INFO    ][6285] Completed state [net.nf_conntrack_max] at time 09:37:20.231391 duration_in_ms=42.573
2018-02-06 09:37:20,233 [salt.state       ][INFO    ][6285] Running state [net.ipv4.tcp_retries2] at time 09:37:20.233169
2018-02-06 09:37:20,233 [salt.state       ][INFO    ][6285] Executing state sysctl.present for net.ipv4.tcp_retries2
2018-02-06 09:37:20,234 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:37:20,276 [salt.state       ][INFO    ][6285] Sysctl value net.ipv4.tcp_retries2 = 5 is already set
2018-02-06 09:37:20,277 [salt.state       ][INFO    ][6285] Completed state [net.ipv4.tcp_retries2] at time 09:37:20.276938 duration_in_ms=43.768
2018-02-06 09:37:20,277 [salt.state       ][INFO    ][6285] Running state [net.ipv4.tcp_fin_timeout] at time 09:37:20.277576
2018-02-06 09:37:20,278 [salt.state       ][INFO    ][6285] Executing state sysctl.present for net.ipv4.tcp_fin_timeout
2018-02-06 09:37:20,279 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:37:20,317 [salt.state       ][INFO    ][6285] Sysctl value net.ipv4.tcp_fin_timeout = 30 is already set
2018-02-06 09:37:20,317 [salt.state       ][INFO    ][6285] Completed state [net.ipv4.tcp_fin_timeout] at time 09:37:20.317869 duration_in_ms=40.292
2018-02-06 09:37:20,318 [salt.state       ][INFO    ][6285] Running state [net.ipv4.tcp_slow_start_after_idle] at time 09:37:20.318374
2018-02-06 09:37:20,318 [salt.state       ][INFO    ][6285] Executing state sysctl.present for net.ipv4.tcp_slow_start_after_idle
2018-02-06 09:37:20,319 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:37:20,361 [salt.state       ][INFO    ][6285] Sysctl value net.ipv4.tcp_slow_start_after_idle = 0 is already set
2018-02-06 09:37:20,361 [salt.state       ][INFO    ][6285] Completed state [net.ipv4.tcp_slow_start_after_idle] at time 09:37:20.361899 duration_in_ms=43.524
2018-02-06 09:37:20,362 [salt.state       ][INFO    ][6285] Running state [vm.swappiness] at time 09:37:20.362427
2018-02-06 09:37:20,362 [salt.state       ][INFO    ][6285] Executing state sysctl.present for vm.swappiness
2018-02-06 09:37:20,364 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:37:20,405 [salt.state       ][INFO    ][6285] Sysctl value vm.swappiness = 10 is already set
2018-02-06 09:37:20,405 [salt.state       ][INFO    ][6285] Completed state [vm.swappiness] at time 09:37:20.405451 duration_in_ms=43.023
2018-02-06 09:37:20,405 [salt.state       ][INFO    ][6285] Running state [net.ipv4.tcp_keepalive_intvl] at time 09:37:20.405947
2018-02-06 09:37:20,406 [salt.state       ][INFO    ][6285] Executing state sysctl.present for net.ipv4.tcp_keepalive_intvl
2018-02-06 09:37:20,407 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:37:20,447 [salt.state       ][INFO    ][6285] Sysctl value net.ipv4.tcp_keepalive_intvl = 3 is already set
2018-02-06 09:37:20,448 [salt.state       ][INFO    ][6285] Completed state [net.ipv4.tcp_keepalive_intvl] at time 09:37:20.448472 duration_in_ms=42.524
2018-02-06 09:37:20,449 [salt.state       ][INFO    ][6285] Running state [net.ipv4.neigh.default.gc_thresh1] at time 09:37:20.448977
2018-02-06 09:37:20,449 [salt.state       ][INFO    ][6285] Executing state sysctl.present for net.ipv4.neigh.default.gc_thresh1
2018-02-06 09:37:20,450 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:37:20,490 [salt.state       ][INFO    ][6285] Sysctl value net.ipv4.neigh.default.gc_thresh1 = 4096 is already set
2018-02-06 09:37:20,490 [salt.state       ][INFO    ][6285] Completed state [net.ipv4.neigh.default.gc_thresh1] at time 09:37:20.490710 duration_in_ms=41.732
2018-02-06 09:37:20,491 [salt.state       ][INFO    ][6285] Running state [net.ipv4.neigh.default.gc_thresh2] at time 09:37:20.491186
2018-02-06 09:37:20,492 [salt.state       ][INFO    ][6285] Executing state sysctl.present for net.ipv4.neigh.default.gc_thresh2
2018-02-06 09:37:20,493 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:37:20,537 [salt.state       ][INFO    ][6285] Sysctl value net.ipv4.neigh.default.gc_thresh2 = 8192 is already set
2018-02-06 09:37:20,538 [salt.state       ][INFO    ][6285] Completed state [net.ipv4.neigh.default.gc_thresh2] at time 09:37:20.538368 duration_in_ms=47.181
2018-02-06 09:37:20,538 [salt.state       ][INFO    ][6285] Running state [net.ipv4.neigh.default.gc_thresh3] at time 09:37:20.538883
2018-02-06 09:37:20,539 [salt.state       ][INFO    ][6285] Executing state sysctl.present for net.ipv4.neigh.default.gc_thresh3
2018-02-06 09:37:20,540 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:37:20,581 [salt.state       ][INFO    ][6285] Sysctl value net.ipv4.neigh.default.gc_thresh3 = 16384 is already set
2018-02-06 09:37:20,582 [salt.state       ][INFO    ][6285] Completed state [net.ipv4.neigh.default.gc_thresh3] at time 09:37:20.582013 duration_in_ms=43.128
2018-02-06 09:37:20,582 [salt.state       ][INFO    ][6285] Running state [net.core.netdev_max_backlog] at time 09:37:20.582535
2018-02-06 09:37:20,582 [salt.state       ][INFO    ][6285] Executing state sysctl.present for net.core.netdev_max_backlog
2018-02-06 09:37:20,583 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:37:20,622 [salt.state       ][INFO    ][6285] Sysctl value net.core.netdev_max_backlog = 261144 is already set
2018-02-06 09:37:20,622 [salt.state       ][INFO    ][6285] Completed state [net.core.netdev_max_backlog] at time 09:37:20.622648 duration_in_ms=40.112
2018-02-06 09:37:20,623 [salt.state       ][INFO    ][6285] Running state [net.ipv4.tcp_keepalive_time] at time 09:37:20.623129
2018-02-06 09:37:20,623 [salt.state       ][INFO    ][6285] Executing state sysctl.present for net.ipv4.tcp_keepalive_time
2018-02-06 09:37:20,624 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:37:20,661 [salt.state       ][INFO    ][6285] Sysctl value net.ipv4.tcp_keepalive_time = 30 is already set
2018-02-06 09:37:20,662 [salt.state       ][INFO    ][6285] Completed state [net.ipv4.tcp_keepalive_time] at time 09:37:20.662322 duration_in_ms=39.192
2018-02-06 09:37:20,662 [salt.state       ][INFO    ][6285] Running state [kernel.panic] at time 09:37:20.662938
2018-02-06 09:37:20,664 [salt.state       ][INFO    ][6285] Executing state sysctl.present for kernel.panic
2018-02-06 09:37:20,665 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command 'sysctl -a' in directory '/root'
2018-02-06 09:37:20,705 [salt.state       ][INFO    ][6285] Sysctl value kernel.panic = 60 is already set
2018-02-06 09:37:20,705 [salt.state       ][INFO    ][6285] Completed state [kernel.panic] at time 09:37:20.705829 duration_in_ms=42.891
2018-02-06 09:37:20,706 [salt.state       ][INFO    ][6285] Running state [linux_sysfs_package] at time 09:37:20.706464
2018-02-06 09:37:20,707 [salt.state       ][INFO    ][6285] Executing state pkg.installed for linux_sysfs_package
2018-02-06 09:37:20,716 [salt.state       ][INFO    ][6285] All specified packages are already installed
2018-02-06 09:37:20,716 [salt.state       ][INFO    ][6285] Completed state [linux_sysfs_package] at time 09:37:20.716646 duration_in_ms=10.182
2018-02-06 09:37:20,718 [salt.state       ][INFO    ][6285] Running state [/etc/sysfs.d] at time 09:37:20.718386
2018-02-06 09:37:20,718 [salt.state       ][INFO    ][6285] Executing state file.directory for /etc/sysfs.d
2018-02-06 09:37:20,719 [salt.state       ][INFO    ][6285] Directory /etc/sysfs.d is in the correct state
2018-02-06 09:37:20,720 [salt.state       ][INFO    ][6285] Completed state [/etc/sysfs.d] at time 09:37:20.720179 duration_in_ms=1.793
2018-02-06 09:37:20,721 [salt.state       ][INFO    ][6285] Running state [ondemand] at time 09:37:20.721237
2018-02-06 09:37:20,721 [salt.state       ][INFO    ][6285] Executing state service.dead for ondemand
2018-02-06 09:37:20,722 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command ['systemctl', 'status', 'ondemand.service', '-n', '0'] in directory '/root'
2018-02-06 09:37:20,743 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command ['systemctl', 'is-active', 'ondemand.service'] in directory '/root'
2018-02-06 09:37:20,761 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command ['systemctl', 'is-enabled', 'ondemand.service'] in directory '/root'
2018-02-06 09:37:20,788 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command 'runlevel' in directory '/root'
2018-02-06 09:37:20,808 [salt.state       ][INFO    ][6285] The service ondemand is already dead
2018-02-06 09:37:20,809 [salt.state       ][INFO    ][6285] Completed state [ondemand] at time 09:37:20.809421 duration_in_ms=88.183
2018-02-06 09:37:20,810 [salt.state       ][INFO    ][6285] Running state [cs_CZ.UTF-8] at time 09:37:20.810726
2018-02-06 09:37:20,811 [salt.state       ][INFO    ][6285] Executing state locale.present for cs_CZ.UTF-8
2018-02-06 09:37:20,812 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command 'locale -a' in directory '/root'
2018-02-06 09:37:20,827 [salt.state       ][INFO    ][6285] Locale cs_CZ.UTF-8 is already present
2018-02-06 09:37:20,828 [salt.state       ][INFO    ][6285] Completed state [cs_CZ.UTF-8] at time 09:37:20.828279 duration_in_ms=17.551
2018-02-06 09:37:20,828 [salt.state       ][INFO    ][6285] Running state [en_US.UTF-8] at time 09:37:20.828954
2018-02-06 09:37:20,829 [salt.state       ][INFO    ][6285] Executing state locale.present for en_US.UTF-8
2018-02-06 09:37:20,830 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command 'locale -a' in directory '/root'
2018-02-06 09:37:20,846 [salt.state       ][INFO    ][6285] Locale en_US.UTF-8 is already present
2018-02-06 09:37:20,846 [salt.state       ][INFO    ][6285] Completed state [en_US.UTF-8] at time 09:37:20.846803 duration_in_ms=17.848
2018-02-06 09:37:20,848 [salt.state       ][INFO    ][6285] Running state [en_US.UTF-8] at time 09:37:20.848810
2018-02-06 09:37:20,849 [salt.state       ][INFO    ][6285] Executing state locale.system for en_US.UTF-8
2018-02-06 09:37:20,850 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command 'localectl' in directory '/root'
2018-02-06 09:37:20,887 [salt.state       ][INFO    ][6285] System locale en_US.UTF-8 already set
2018-02-06 09:37:20,888 [salt.state       ][INFO    ][6285] Completed state [en_US.UTF-8] at time 09:37:20.888702 duration_in_ms=39.891
2018-02-06 09:37:20,891 [salt.state       ][INFO    ][6285] Running state [root] at time 09:37:20.891035
2018-02-06 09:37:20,893 [salt.state       ][INFO    ][6285] Executing state user.present for root
2018-02-06 09:37:20,894 [salt.state       ][INFO    ][6285] User root is present and up to date
2018-02-06 09:37:20,895 [salt.state       ][INFO    ][6285] Completed state [root] at time 09:37:20.895143 duration_in_ms=4.108
2018-02-06 09:37:20,897 [salt.state       ][INFO    ][6285] Running state [/root] at time 09:37:20.897004
2018-02-06 09:37:20,897 [salt.state       ][INFO    ][6285] Executing state file.directory for /root
2018-02-06 09:37:20,898 [salt.state       ][INFO    ][6285] Directory /root is in the correct state
2018-02-06 09:37:20,898 [salt.state       ][INFO    ][6285] Completed state [/root] at time 09:37:20.898912 duration_in_ms=1.908
2018-02-06 09:37:20,899 [salt.state       ][INFO    ][6285] Running state [/etc/sudoers.d/90-salt-user-root] at time 09:37:20.899377
2018-02-06 09:37:20,899 [salt.state       ][INFO    ][6285] Executing state file.absent for /etc/sudoers.d/90-salt-user-root
2018-02-06 09:37:20,900 [salt.state       ][INFO    ][6285] File /etc/sudoers.d/90-salt-user-root is not present
2018-02-06 09:37:20,900 [salt.state       ][INFO    ][6285] Completed state [/etc/sudoers.d/90-salt-user-root] at time 09:37:20.900875 duration_in_ms=1.498
2018-02-06 09:37:20,901 [salt.state       ][INFO    ][6285] Running state [ubuntu] at time 09:37:20.901340
2018-02-06 09:37:20,901 [salt.state       ][INFO    ][6285] Executing state user.present for ubuntu
2018-02-06 09:37:20,903 [salt.state       ][INFO    ][6285] User ubuntu is present and up to date
2018-02-06 09:37:20,904 [salt.state       ][INFO    ][6285] Completed state [ubuntu] at time 09:37:20.904279 duration_in_ms=2.936
2018-02-06 09:37:20,905 [salt.state       ][INFO    ][6285] Running state [/home/ubuntu] at time 09:37:20.905538
2018-02-06 09:37:20,906 [salt.state       ][INFO    ][6285] Executing state file.directory for /home/ubuntu
2018-02-06 09:37:20,906 [salt.state       ][INFO    ][6285] Directory /home/ubuntu is in the correct state
2018-02-06 09:37:20,907 [salt.state       ][INFO    ][6285] Completed state [/home/ubuntu] at time 09:37:20.907387 duration_in_ms=1.849
2018-02-06 09:37:20,908 [salt.state       ][INFO    ][6285] Running state [/etc/sudoers.d/90-salt-user-ubuntu] at time 09:37:20.908775
2018-02-06 09:37:20,909 [salt.state       ][INFO    ][6285] Executing state file.managed for /etc/sudoers.d/90-salt-user-ubuntu
2018-02-06 09:37:20,929 [salt.state       ][INFO    ][6285] File /etc/sudoers.d/90-salt-user-ubuntu is in the correct state
2018-02-06 09:37:20,930 [salt.state       ][INFO    ][6285] Completed state [/etc/sudoers.d/90-salt-user-ubuntu] at time 09:37:20.930210 duration_in_ms=21.435
2018-02-06 09:37:20,930 [salt.state       ][INFO    ][6285] Running state [/etc/security/limits.d/90-salt-default.conf] at time 09:37:20.930700
2018-02-06 09:37:20,931 [salt.state       ][INFO    ][6285] Executing state file.managed for /etc/security/limits.d/90-salt-default.conf
2018-02-06 09:37:21,037 [salt.state       ][INFO    ][6285] File /etc/security/limits.d/90-salt-default.conf is in the correct state
2018-02-06 09:37:21,038 [salt.state       ][INFO    ][6285] Completed state [/etc/security/limits.d/90-salt-default.conf] at time 09:37:21.038188 duration_in_ms=107.488
2018-02-06 09:37:21,038 [salt.state       ][INFO    ][6285] Running state [/etc/systemd/system.conf.d/90-salt.conf] at time 09:37:21.038699
2018-02-06 09:37:21,039 [salt.state       ][INFO    ][6285] Executing state file.managed for /etc/systemd/system.conf.d/90-salt.conf
2018-02-06 09:37:21,141 [salt.state       ][INFO    ][6285] File /etc/systemd/system.conf.d/90-salt.conf is in the correct state
2018-02-06 09:37:21,142 [salt.state       ][INFO    ][6285] Completed state [/etc/systemd/system.conf.d/90-salt.conf] at time 09:37:21.142349 duration_in_ms=103.649
2018-02-06 09:37:21,144 [salt.state       ][INFO    ][6285] Running state [service.systemctl_reload] at time 09:37:21.144318
2018-02-06 09:37:21,144 [salt.state       ][INFO    ][6285] Executing state module.wait for service.systemctl_reload
2018-02-06 09:37:21,145 [salt.state       ][INFO    ][6285] No changes made for service.systemctl_reload
2018-02-06 09:37:21,145 [salt.state       ][INFO    ][6285] Completed state [service.systemctl_reload] at time 09:37:21.145726 duration_in_ms=1.408
2018-02-06 09:37:21,146 [salt.state       ][INFO    ][6285] Running state [/etc/hostname] at time 09:37:21.146198
2018-02-06 09:37:21,146 [salt.state       ][INFO    ][6285] Executing state file.managed for /etc/hostname
2018-02-06 09:37:21,162 [salt.state       ][INFO    ][6285] File /etc/hostname is in the correct state
2018-02-06 09:37:21,163 [salt.state       ][INFO    ][6285] Completed state [/etc/hostname] at time 09:37:21.163398 duration_in_ms=17.2
2018-02-06 09:37:21,164 [salt.state       ][INFO    ][6285] Running state [hostname prx02] at time 09:37:21.164564
2018-02-06 09:37:21,165 [salt.state       ][INFO    ][6285] Executing state cmd.wait for hostname prx02
2018-02-06 09:37:21,165 [salt.state       ][INFO    ][6285] No changes made for hostname prx02
2018-02-06 09:37:21,166 [salt.state       ][INFO    ][6285] Completed state [hostname prx02] at time 09:37:21.165982 duration_in_ms=1.418
2018-02-06 09:37:21,166 [salt.state       ][INFO    ][6285] Running state [mdb02] at time 09:37:21.166754
2018-02-06 09:37:21,167 [salt.state       ][INFO    ][6285] Executing state host.present for mdb02
2018-02-06 09:37:21,168 [salt.state       ][INFO    ][6285] Host mdb02 (192.168.10.77) already present
2018-02-06 09:37:21,168 [salt.state       ][INFO    ][6285] Completed state [mdb02] at time 09:37:21.168715 duration_in_ms=1.961
2018-02-06 09:37:21,169 [salt.state       ][INFO    ][6285] Running state [mdb02.mcp-pike-ovs-ha.local] at time 09:37:21.169177
2018-02-06 09:37:21,169 [salt.state       ][INFO    ][6285] Executing state host.present for mdb02.mcp-pike-ovs-ha.local
2018-02-06 09:37:21,170 [salt.state       ][INFO    ][6285] Host mdb02.mcp-pike-ovs-ha.local (192.168.10.77) already present
2018-02-06 09:37:21,170 [salt.state       ][INFO    ][6285] Completed state [mdb02.mcp-pike-ovs-ha.local] at time 09:37:21.170754 duration_in_ms=1.578
2018-02-06 09:37:21,171 [salt.state       ][INFO    ][6285] Running state [mdb03] at time 09:37:21.171215
2018-02-06 09:37:21,172 [salt.state       ][INFO    ][6285] Executing state host.present for mdb03
2018-02-06 09:37:21,173 [salt.state       ][INFO    ][6285] Host mdb03 (192.168.10.78) already present
2018-02-06 09:37:21,173 [salt.state       ][INFO    ][6285] Completed state [mdb03] at time 09:37:21.173388 duration_in_ms=2.173
2018-02-06 09:37:21,173 [salt.state       ][INFO    ][6285] Running state [mdb03.mcp-pike-ovs-ha.local] at time 09:37:21.173849
2018-02-06 09:37:21,174 [salt.state       ][INFO    ][6285] Executing state host.present for mdb03.mcp-pike-ovs-ha.local
2018-02-06 09:37:21,175 [salt.state       ][INFO    ][6285] Host mdb03.mcp-pike-ovs-ha.local (192.168.10.78) already present
2018-02-06 09:37:21,175 [salt.state       ][INFO    ][6285] Completed state [mdb03.mcp-pike-ovs-ha.local] at time 09:37:21.175451 duration_in_ms=1.602
2018-02-06 09:37:21,175 [salt.state       ][INFO    ][6285] Running state [mdb01] at time 09:37:21.175925
2018-02-06 09:37:21,176 [salt.state       ][INFO    ][6285] Executing state host.present for mdb01
2018-02-06 09:37:21,177 [salt.state       ][INFO    ][6285] Host mdb01 (192.168.10.76) already present
2018-02-06 09:37:21,177 [salt.state       ][INFO    ][6285] Completed state [mdb01] at time 09:37:21.177558 duration_in_ms=1.633
2018-02-06 09:37:21,178 [salt.state       ][INFO    ][6285] Running state [mdb01.mcp-pike-ovs-ha.local] at time 09:37:21.178024
2018-02-06 09:37:21,178 [salt.state       ][INFO    ][6285] Executing state host.present for mdb01.mcp-pike-ovs-ha.local
2018-02-06 09:37:21,179 [salt.state       ][INFO    ][6285] Host mdb01.mcp-pike-ovs-ha.local (192.168.10.76) already present
2018-02-06 09:37:21,180 [salt.state       ][INFO    ][6285] Completed state [mdb01.mcp-pike-ovs-ha.local] at time 09:37:21.180741 duration_in_ms=2.717
2018-02-06 09:37:21,181 [salt.state       ][INFO    ][6285] Running state [mdb] at time 09:37:21.181016
2018-02-06 09:37:21,181 [salt.state       ][INFO    ][6285] Executing state host.present for mdb
2018-02-06 09:37:21,181 [salt.state       ][INFO    ][6285] Host mdb (192.168.10.75) already present
2018-02-06 09:37:21,181 [salt.state       ][INFO    ][6285] Completed state [mdb] at time 09:37:21.181950 duration_in_ms=0.934
2018-02-06 09:37:21,182 [salt.state       ][INFO    ][6285] Running state [mdb.mcp-pike-ovs-ha.local] at time 09:37:21.182175
2018-02-06 09:37:21,182 [salt.state       ][INFO    ][6285] Executing state host.present for mdb.mcp-pike-ovs-ha.local
2018-02-06 09:37:21,182 [salt.state       ][INFO    ][6285] Host mdb.mcp-pike-ovs-ha.local (192.168.10.75) already present
2018-02-06 09:37:21,183 [salt.state       ][INFO    ][6285] Completed state [mdb.mcp-pike-ovs-ha.local] at time 09:37:21.183073 duration_in_ms=0.898
2018-02-06 09:37:21,183 [salt.state       ][INFO    ][6285] Running state [cfg01] at time 09:37:21.183296
2018-02-06 09:37:21,183 [salt.state       ][INFO    ][6285] Executing state host.present for cfg01
2018-02-06 09:37:21,184 [salt.state       ][INFO    ][6285] Host cfg01 (192.168.10.100) already present
2018-02-06 09:37:21,184 [salt.state       ][INFO    ][6285] Completed state [cfg01] at time 09:37:21.184533 duration_in_ms=1.237
2018-02-06 09:37:21,184 [salt.state       ][INFO    ][6285] Running state [cfg01.mcp-pike-ovs-ha.local] at time 09:37:21.184755
2018-02-06 09:37:21,184 [salt.state       ][INFO    ][6285] Executing state host.present for cfg01.mcp-pike-ovs-ha.local
2018-02-06 09:37:21,185 [salt.state       ][INFO    ][6285] Host cfg01.mcp-pike-ovs-ha.local (192.168.10.100) already present
2018-02-06 09:37:21,185 [salt.state       ][INFO    ][6285] Completed state [cfg01.mcp-pike-ovs-ha.local] at time 09:37:21.185637 duration_in_ms=0.882
2018-02-06 09:37:21,185 [salt.state       ][INFO    ][6285] Running state [prx01] at time 09:37:21.185857
2018-02-06 09:37:21,186 [salt.state       ][INFO    ][6285] Executing state host.present for prx01
2018-02-06 09:37:21,186 [salt.state       ][INFO    ][6285] Host prx01 (192.168.10.104) already present
2018-02-06 09:37:21,186 [salt.state       ][INFO    ][6285] Completed state [prx01] at time 09:37:21.186729 duration_in_ms=0.871
2018-02-06 09:37:21,186 [salt.state       ][INFO    ][6285] Running state [prx01.mcp-pike-ovs-ha.local] at time 09:37:21.186946
2018-02-06 09:37:21,187 [salt.state       ][INFO    ][6285] Executing state host.present for prx01.mcp-pike-ovs-ha.local
2018-02-06 09:37:21,187 [salt.state       ][INFO    ][6285] Host prx01.mcp-pike-ovs-ha.local (192.168.10.104) already present
2018-02-06 09:37:21,188 [salt.state       ][INFO    ][6285] Completed state [prx01.mcp-pike-ovs-ha.local] at time 09:37:21.188102 duration_in_ms=1.156
2018-02-06 09:37:21,188 [salt.state       ][INFO    ][6285] Running state [kvm01] at time 09:37:21.188342
2018-02-06 09:37:21,188 [salt.state       ][INFO    ][6285] Executing state host.present for kvm01
2018-02-06 09:37:21,189 [salt.state       ][INFO    ][6285] Host kvm01 (192.168.10.141) already present
2018-02-06 09:37:21,189 [salt.state       ][INFO    ][6285] Completed state [kvm01] at time 09:37:21.189206 duration_in_ms=0.864
2018-02-06 09:37:21,189 [salt.state       ][INFO    ][6285] Running state [kvm01.mcp-pike-ovs-ha.local] at time 09:37:21.189425
2018-02-06 09:37:21,189 [salt.state       ][INFO    ][6285] Executing state host.present for kvm01.mcp-pike-ovs-ha.local
2018-02-06 09:37:21,190 [salt.state       ][INFO    ][6285] Host kvm01.mcp-pike-ovs-ha.local (192.168.10.141) already present
2018-02-06 09:37:21,190 [salt.state       ][INFO    ][6285] Completed state [kvm01.mcp-pike-ovs-ha.local] at time 09:37:21.190288 duration_in_ms=0.863
2018-02-06 09:37:21,190 [salt.state       ][INFO    ][6285] Running state [kvm03] at time 09:37:21.190507
2018-02-06 09:37:21,190 [salt.state       ][INFO    ][6285] Executing state host.present for kvm03
2018-02-06 09:37:21,191 [salt.state       ][INFO    ][6285] Host kvm03 (192.168.10.143) already present
2018-02-06 09:37:21,191 [salt.state       ][INFO    ][6285] Completed state [kvm03] at time 09:37:21.191373 duration_in_ms=0.866
2018-02-06 09:37:21,192 [salt.state       ][INFO    ][6285] Running state [kvm03.mcp-pike-ovs-ha.local] at time 09:37:21.192720
2018-02-06 09:37:21,192 [salt.state       ][INFO    ][6285] Executing state host.present for kvm03.mcp-pike-ovs-ha.local
2018-02-06 09:37:21,193 [salt.state       ][INFO    ][6285] Host kvm03.mcp-pike-ovs-ha.local (192.168.10.143) already present
2018-02-06 09:37:21,193 [salt.state       ][INFO    ][6285] Completed state [kvm03.mcp-pike-ovs-ha.local] at time 09:37:21.193585 duration_in_ms=0.865
2018-02-06 09:37:21,193 [salt.state       ][INFO    ][6285] Running state [kvm02] at time 09:37:21.193804
2018-02-06 09:37:21,194 [salt.state       ][INFO    ][6285] Executing state host.present for kvm02
2018-02-06 09:37:21,194 [salt.state       ][INFO    ][6285] Host kvm02 (192.168.10.142) already present
2018-02-06 09:37:21,194 [salt.state       ][INFO    ][6285] Completed state [kvm02] at time 09:37:21.194659 duration_in_ms=0.856
2018-02-06 09:37:21,194 [salt.state       ][INFO    ][6285] Running state [kvm02.mcp-pike-ovs-ha.local] at time 09:37:21.194881
2018-02-06 09:37:21,195 [salt.state       ][INFO    ][6285] Executing state host.present for kvm02.mcp-pike-ovs-ha.local
2018-02-06 09:37:21,196 [salt.state       ][INFO    ][6285] Host kvm02.mcp-pike-ovs-ha.local (192.168.10.142) already present
2018-02-06 09:37:21,196 [salt.state       ][INFO    ][6285] Completed state [kvm02.mcp-pike-ovs-ha.local] at time 09:37:21.196332 duration_in_ms=1.45
2018-02-06 09:37:21,196 [salt.state       ][INFO    ][6285] Running state [dbs] at time 09:37:21.196555
2018-02-06 09:37:21,196 [salt.state       ][INFO    ][6285] Executing state host.present for dbs
2018-02-06 09:37:21,197 [salt.state       ][INFO    ][6285] Host dbs (192.168.10.50) already present
2018-02-06 09:37:21,197 [salt.state       ][INFO    ][6285] Completed state [dbs] at time 09:37:21.197423 duration_in_ms=0.869
2018-02-06 09:37:21,197 [salt.state       ][INFO    ][6285] Running state [dbs.mcp-pike-ovs-ha.local] at time 09:37:21.197644
2018-02-06 09:37:21,197 [salt.state       ][INFO    ][6285] Executing state host.present for dbs.mcp-pike-ovs-ha.local
2018-02-06 09:37:21,198 [salt.state       ][INFO    ][6285] Host dbs.mcp-pike-ovs-ha.local (192.168.10.50) already present
2018-02-06 09:37:21,198 [salt.state       ][INFO    ][6285] Completed state [dbs.mcp-pike-ovs-ha.local] at time 09:37:21.198711 duration_in_ms=1.067
2018-02-06 09:37:21,198 [salt.state       ][INFO    ][6285] Running state [prx] at time 09:37:21.198932
2018-02-06 09:37:21,199 [salt.state       ][INFO    ][6285] Executing state host.present for prx
2018-02-06 09:37:21,199 [salt.state       ][INFO    ][6285] Host prx (192.168.10.103) already present
2018-02-06 09:37:21,199 [salt.state       ][INFO    ][6285] Completed state [prx] at time 09:37:21.199793 duration_in_ms=0.861
2018-02-06 09:37:21,200 [salt.state       ][INFO    ][6285] Running state [prx.mcp-pike-ovs-ha.local] at time 09:37:21.200010
2018-02-06 09:37:21,200 [salt.state       ][INFO    ][6285] Executing state host.present for prx.mcp-pike-ovs-ha.local
2018-02-06 09:37:21,200 [salt.state       ][INFO    ][6285] Host prx.mcp-pike-ovs-ha.local (192.168.10.103) already present
2018-02-06 09:37:21,200 [salt.state       ][INFO    ][6285] Completed state [prx.mcp-pike-ovs-ha.local] at time 09:37:21.200889 duration_in_ms=0.879
2018-02-06 09:37:21,201 [salt.state       ][INFO    ][6285] Running state [prx02] at time 09:37:21.201108
2018-02-06 09:37:21,201 [salt.state       ][INFO    ][6285] Executing state host.present for prx02
2018-02-06 09:37:21,201 [salt.state       ][INFO    ][6285] Host prx02 (192.168.10.105) already present
2018-02-06 09:37:21,202 [salt.state       ][INFO    ][6285] Completed state [prx02] at time 09:37:21.201967 duration_in_ms=0.859
2018-02-06 09:37:21,202 [salt.state       ][INFO    ][6285] Running state [prx02.mcp-pike-ovs-ha.local] at time 09:37:21.202190
2018-02-06 09:37:21,202 [salt.state       ][INFO    ][6285] Executing state host.present for prx02.mcp-pike-ovs-ha.local
2018-02-06 09:37:21,203 [salt.state       ][INFO    ][6285] Host prx02.mcp-pike-ovs-ha.local (192.168.10.105) already present
2018-02-06 09:37:21,203 [salt.state       ][INFO    ][6285] Completed state [prx02.mcp-pike-ovs-ha.local] at time 09:37:21.203324 duration_in_ms=1.133
2018-02-06 09:37:21,205 [salt.state       ][INFO    ][6285] Running state [file.replace] at time 09:37:21.205240
2018-02-06 09:37:21,205 [salt.state       ][INFO    ][6285] Executing state module.run for file.replace
2018-02-06 09:37:21,300 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command ['git', '--version'] in directory '/root'
2018-02-06 09:37:21,397 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command 'grep -q "prx02 prx02.mcp-pike-ovs-ha.local" /etc/hosts' in directory '/root'
2018-02-06 09:37:21,419 [salt.state       ][INFO    ][6285] {'ret': '--- \n+++ \n@@ -17,7 +17,7 @@\n 192.168.10.142\t\tkvm02 kvm02.mcp-pike-ovs-ha.local\n 192.168.10.50\t\tdbs dbs.mcp-pike-ovs-ha.local\n 192.168.10.103\t\tprx prx.mcp-pike-ovs-ha.local\n-192.168.10.105\t\tprx02 prx02.mcp-pike-ovs-ha.local\n+192.168.10.105\t\tprx02.mcp-pike-ovs-ha.local prx02\n 192.168.10.42\t\tmsg02 msg02.mcp-pike-ovs-ha.local\n 192.168.10.43\t\tmsg03 msg03.mcp-pike-ovs-ha.local\n 192.168.10.41\t\tmsg01 msg01.mcp-pike-ovs-ha.local\n'}
2018-02-06 09:37:21,420 [salt.state       ][INFO    ][6285] Completed state [file.replace] at time 09:37:21.420260 duration_in_ms=215.019
2018-02-06 09:37:21,420 [salt.state       ][INFO    ][6285] Running state [msg02] at time 09:37:21.420941
2018-02-06 09:37:21,421 [salt.state       ][INFO    ][6285] Executing state host.present for msg02
2018-02-06 09:37:21,422 [salt.state       ][INFO    ][6285] Host msg02 (192.168.10.42) already present
2018-02-06 09:37:21,422 [salt.state       ][INFO    ][6285] Completed state [msg02] at time 09:37:21.422900 duration_in_ms=1.959
2018-02-06 09:37:21,423 [salt.state       ][INFO    ][6285] Running state [msg02.mcp-pike-ovs-ha.local] at time 09:37:21.423381
2018-02-06 09:37:21,423 [salt.state       ][INFO    ][6285] Executing state host.present for msg02.mcp-pike-ovs-ha.local
2018-02-06 09:37:21,424 [salt.state       ][INFO    ][6285] Host msg02.mcp-pike-ovs-ha.local (192.168.10.42) already present
2018-02-06 09:37:21,425 [salt.state       ][INFO    ][6285] Completed state [msg02.mcp-pike-ovs-ha.local] at time 09:37:21.425093 duration_in_ms=1.712
2018-02-06 09:37:21,425 [salt.state       ][INFO    ][6285] Running state [msg03] at time 09:37:21.425563
2018-02-06 09:37:21,426 [salt.state       ][INFO    ][6285] Executing state host.present for msg03
2018-02-06 09:37:21,426 [salt.state       ][INFO    ][6285] Host msg03 (192.168.10.43) already present
2018-02-06 09:37:21,427 [salt.state       ][INFO    ][6285] Completed state [msg03] at time 09:37:21.427137 duration_in_ms=1.574
2018-02-06 09:37:21,427 [salt.state       ][INFO    ][6285] Running state [msg03.mcp-pike-ovs-ha.local] at time 09:37:21.427605
2018-02-06 09:37:21,428 [salt.state       ][INFO    ][6285] Executing state host.present for msg03.mcp-pike-ovs-ha.local
2018-02-06 09:37:21,428 [salt.state       ][INFO    ][6285] Host msg03.mcp-pike-ovs-ha.local (192.168.10.43) already present
2018-02-06 09:37:21,429 [salt.state       ][INFO    ][6285] Completed state [msg03.mcp-pike-ovs-ha.local] at time 09:37:21.429230 duration_in_ms=1.624
2018-02-06 09:37:21,429 [salt.state       ][INFO    ][6285] Running state [msg01] at time 09:37:21.429690
2018-02-06 09:37:21,430 [salt.state       ][INFO    ][6285] Executing state host.present for msg01
2018-02-06 09:37:21,430 [salt.state       ][INFO    ][6285] Host msg01 (192.168.10.41) already present
2018-02-06 09:37:21,431 [salt.state       ][INFO    ][6285] Completed state [msg01] at time 09:37:21.431281 duration_in_ms=1.591
2018-02-06 09:37:21,431 [salt.state       ][INFO    ][6285] Running state [msg01.mcp-pike-ovs-ha.local] at time 09:37:21.431770
2018-02-06 09:37:21,432 [salt.state       ][INFO    ][6285] Executing state host.present for msg01.mcp-pike-ovs-ha.local
2018-02-06 09:37:21,433 [salt.state       ][INFO    ][6285] Host msg01.mcp-pike-ovs-ha.local (192.168.10.41) already present
2018-02-06 09:37:21,433 [salt.state       ][INFO    ][6285] Completed state [msg01.mcp-pike-ovs-ha.local] at time 09:37:21.433440 duration_in_ms=1.67
2018-02-06 09:37:21,433 [salt.state       ][INFO    ][6285] Running state [msg] at time 09:37:21.433901
2018-02-06 09:37:21,434 [salt.state       ][INFO    ][6285] Executing state host.present for msg
2018-02-06 09:37:21,435 [salt.state       ][INFO    ][6285] Host msg (192.168.10.40) already present
2018-02-06 09:37:21,435 [salt.state       ][INFO    ][6285] Completed state [msg] at time 09:37:21.435490 duration_in_ms=1.589
2018-02-06 09:37:21,435 [salt.state       ][INFO    ][6285] Running state [msg.mcp-pike-ovs-ha.local] at time 09:37:21.435953
2018-02-06 09:37:21,436 [salt.state       ][INFO    ][6285] Executing state host.present for msg.mcp-pike-ovs-ha.local
2018-02-06 09:37:21,437 [salt.state       ][INFO    ][6285] Host msg.mcp-pike-ovs-ha.local (192.168.10.40) already present
2018-02-06 09:37:21,437 [salt.state       ][INFO    ][6285] Completed state [msg.mcp-pike-ovs-ha.local] at time 09:37:21.437585 duration_in_ms=1.632
2018-02-06 09:37:21,438 [salt.state       ][INFO    ][6285] Running state [cfg01] at time 09:37:21.438053
2018-02-06 09:37:21,438 [salt.state       ][INFO    ][6285] Executing state host.present for cfg01
2018-02-06 09:37:21,439 [salt.state       ][INFO    ][6285] Host cfg01 (192.168.10.100) already present
2018-02-06 09:37:21,440 [salt.state       ][INFO    ][6285] Completed state [cfg01] at time 09:37:21.439971 duration_in_ms=1.918
2018-02-06 09:37:21,440 [salt.state       ][INFO    ][6285] Running state [cfg01.mcp-pike-ovs-ha.local] at time 09:37:21.440476
2018-02-06 09:37:21,440 [salt.state       ][INFO    ][6285] Executing state host.present for cfg01.mcp-pike-ovs-ha.local
2018-02-06 09:37:21,441 [salt.state       ][INFO    ][6285] Host cfg01.mcp-pike-ovs-ha.local (192.168.10.100) already present
2018-02-06 09:37:21,442 [salt.state       ][INFO    ][6285] Completed state [cfg01.mcp-pike-ovs-ha.local] at time 09:37:21.442252 duration_in_ms=1.776
2018-02-06 09:37:21,442 [salt.state       ][INFO    ][6285] Running state [cmp002] at time 09:37:21.442722
2018-02-06 09:37:21,443 [salt.state       ][INFO    ][6285] Executing state host.present for cmp002
2018-02-06 09:37:21,443 [salt.state       ][INFO    ][6285] Host cmp002 (192.168.10.102) already present
2018-02-06 09:37:21,444 [salt.state       ][INFO    ][6285] Completed state [cmp002] at time 09:37:21.444350 duration_in_ms=1.628
2018-02-06 09:37:21,444 [salt.state       ][INFO    ][6285] Running state [cmp002.mcp-pike-ovs-ha.local] at time 09:37:21.444828
2018-02-06 09:37:21,445 [salt.state       ][INFO    ][6285] Executing state host.present for cmp002.mcp-pike-ovs-ha.local
2018-02-06 09:37:21,446 [salt.state       ][INFO    ][6285] Host cmp002.mcp-pike-ovs-ha.local (192.168.10.102) already present
2018-02-06 09:37:21,446 [salt.state       ][INFO    ][6285] Completed state [cmp002.mcp-pike-ovs-ha.local] at time 09:37:21.446413 duration_in_ms=1.585
2018-02-06 09:37:21,446 [salt.state       ][INFO    ][6285] Running state [cmp001] at time 09:37:21.446872
2018-02-06 09:37:21,447 [salt.state       ][INFO    ][6285] Executing state host.present for cmp001
2018-02-06 09:37:21,448 [salt.state       ][INFO    ][6285] Host cmp001 (192.168.10.101) already present
2018-02-06 09:37:21,448 [salt.state       ][INFO    ][6285] Completed state [cmp001] at time 09:37:21.448491 duration_in_ms=1.619
2018-02-06 09:37:21,448 [salt.state       ][INFO    ][6285] Running state [cmp001.mcp-pike-ovs-ha.local] at time 09:37:21.448960
2018-02-06 09:37:21,449 [salt.state       ][INFO    ][6285] Executing state host.present for cmp001.mcp-pike-ovs-ha.local
2018-02-06 09:37:21,450 [salt.state       ][INFO    ][6285] Host cmp001.mcp-pike-ovs-ha.local (192.168.10.101) already present
2018-02-06 09:37:21,450 [salt.state       ][INFO    ][6285] Completed state [cmp001.mcp-pike-ovs-ha.local] at time 09:37:21.450548 duration_in_ms=1.588
2018-02-06 09:37:21,451 [salt.state       ][INFO    ][6285] Running state [dbs01] at time 09:37:21.451004
2018-02-06 09:37:21,451 [salt.state       ][INFO    ][6285] Executing state host.present for dbs01
2018-02-06 09:37:21,452 [salt.state       ][INFO    ][6285] Host dbs01 (192.168.10.51) already present
2018-02-06 09:37:21,452 [salt.state       ][INFO    ][6285] Completed state [dbs01] at time 09:37:21.452578 duration_in_ms=1.574
2018-02-06 09:37:21,453 [salt.state       ][INFO    ][6285] Running state [dbs01.mcp-pike-ovs-ha.local] at time 09:37:21.453280
2018-02-06 09:37:21,453 [salt.state       ][INFO    ][6285] Executing state host.present for dbs01.mcp-pike-ovs-ha.local
2018-02-06 09:37:21,454 [salt.state       ][INFO    ][6285] Host dbs01.mcp-pike-ovs-ha.local (192.168.10.51) already present
2018-02-06 09:37:21,454 [salt.state       ][INFO    ][6285] Completed state [dbs01.mcp-pike-ovs-ha.local] at time 09:37:21.454762 duration_in_ms=1.481
2018-02-06 09:37:21,455 [salt.state       ][INFO    ][6285] Running state [dbs02] at time 09:37:21.455186
2018-02-06 09:37:21,455 [salt.state       ][INFO    ][6285] Executing state host.present for dbs02
2018-02-06 09:37:21,456 [salt.state       ][INFO    ][6285] Host dbs02 (192.168.10.52) already present
2018-02-06 09:37:21,457 [salt.state       ][INFO    ][6285] Completed state [dbs02] at time 09:37:21.457006 duration_in_ms=1.82
2018-02-06 09:37:21,457 [salt.state       ][INFO    ][6285] Running state [dbs02.mcp-pike-ovs-ha.local] at time 09:37:21.457433
2018-02-06 09:37:21,457 [salt.state       ][INFO    ][6285] Executing state host.present for dbs02.mcp-pike-ovs-ha.local
2018-02-06 09:37:21,458 [salt.state       ][INFO    ][6285] Host dbs02.mcp-pike-ovs-ha.local (192.168.10.52) already present
2018-02-06 09:37:21,458 [salt.state       ][INFO    ][6285] Completed state [dbs02.mcp-pike-ovs-ha.local] at time 09:37:21.458897 duration_in_ms=1.464
2018-02-06 09:37:21,459 [salt.state       ][INFO    ][6285] Running state [dbs03] at time 09:37:21.459330
2018-02-06 09:37:21,459 [salt.state       ][INFO    ][6285] Executing state host.present for dbs03
2018-02-06 09:37:21,460 [salt.state       ][INFO    ][6285] Host dbs03 (192.168.10.53) already present
2018-02-06 09:37:21,460 [salt.state       ][INFO    ][6285] Completed state [dbs03] at time 09:37:21.460860 duration_in_ms=1.53
2018-02-06 09:37:21,461 [salt.state       ][INFO    ][6285] Running state [dbs03.mcp-pike-ovs-ha.local] at time 09:37:21.461288
2018-02-06 09:37:21,461 [salt.state       ][INFO    ][6285] Executing state host.present for dbs03.mcp-pike-ovs-ha.local
2018-02-06 09:37:21,462 [salt.state       ][INFO    ][6285] Host dbs03.mcp-pike-ovs-ha.local (192.168.10.53) already present
2018-02-06 09:37:21,462 [salt.state       ][INFO    ][6285] Completed state [dbs03.mcp-pike-ovs-ha.local] at time 09:37:21.462767 duration_in_ms=1.479
2018-02-06 09:37:21,463 [salt.state       ][INFO    ][6285] Running state [mas01] at time 09:37:21.463195
2018-02-06 09:37:21,463 [salt.state       ][INFO    ][6285] Executing state host.present for mas01
2018-02-06 09:37:21,464 [salt.state       ][INFO    ][6285] Host mas01 (192.168.10.3) already present
2018-02-06 09:37:21,464 [salt.state       ][INFO    ][6285] Completed state [mas01] at time 09:37:21.464746 duration_in_ms=1.551
2018-02-06 09:37:21,465 [salt.state       ][INFO    ][6285] Running state [mas01.mcp-pike-ovs-ha.local] at time 09:37:21.465170
2018-02-06 09:37:21,465 [salt.state       ][INFO    ][6285] Executing state host.present for mas01.mcp-pike-ovs-ha.local
2018-02-06 09:37:21,466 [salt.state       ][INFO    ][6285] Host mas01.mcp-pike-ovs-ha.local (192.168.10.3) already present
2018-02-06 09:37:21,466 [salt.state       ][INFO    ][6285] Completed state [mas01.mcp-pike-ovs-ha.local] at time 09:37:21.466663 duration_in_ms=1.493
2018-02-06 09:37:21,467 [salt.state       ][INFO    ][6285] Running state [ctl02] at time 09:37:21.467106
2018-02-06 09:37:21,468 [salt.state       ][INFO    ][6285] Executing state host.present for ctl02
2018-02-06 09:37:21,469 [salt.state       ][INFO    ][6285] Host ctl02 (192.168.10.12) already present
2018-02-06 09:37:21,469 [salt.state       ][INFO    ][6285] Completed state [ctl02] at time 09:37:21.469470 duration_in_ms=2.364
2018-02-06 09:37:21,469 [salt.state       ][INFO    ][6285] Running state [ctl02.mcp-pike-ovs-ha.local] at time 09:37:21.469890
2018-02-06 09:37:21,470 [salt.state       ][INFO    ][6285] Executing state host.present for ctl02.mcp-pike-ovs-ha.local
2018-02-06 09:37:21,471 [salt.state       ][INFO    ][6285] Host ctl02.mcp-pike-ovs-ha.local (192.168.10.12) already present
2018-02-06 09:37:21,471 [salt.state       ][INFO    ][6285] Completed state [ctl02.mcp-pike-ovs-ha.local] at time 09:37:21.471370 duration_in_ms=1.48
2018-02-06 09:37:21,471 [salt.state       ][INFO    ][6285] Running state [ctl03] at time 09:37:21.471813
2018-02-06 09:37:21,472 [salt.state       ][INFO    ][6285] Executing state host.present for ctl03
2018-02-06 09:37:21,472 [salt.state       ][INFO    ][6285] Host ctl03 (192.168.10.13) already present
2018-02-06 09:37:21,473 [salt.state       ][INFO    ][6285] Completed state [ctl03] at time 09:37:21.473338 duration_in_ms=1.525
2018-02-06 09:37:21,473 [salt.state       ][INFO    ][6285] Running state [ctl03.mcp-pike-ovs-ha.local] at time 09:37:21.473759
2018-02-06 09:37:21,474 [salt.state       ][INFO    ][6285] Executing state host.present for ctl03.mcp-pike-ovs-ha.local
2018-02-06 09:37:21,474 [salt.state       ][INFO    ][6285] Host ctl03.mcp-pike-ovs-ha.local (192.168.10.13) already present
2018-02-06 09:37:21,475 [salt.state       ][INFO    ][6285] Completed state [ctl03.mcp-pike-ovs-ha.local] at time 09:37:21.475258 duration_in_ms=1.499
2018-02-06 09:37:21,476 [salt.state       ][INFO    ][6285] Running state [ctl01] at time 09:37:21.475977
2018-02-06 09:37:21,476 [salt.state       ][INFO    ][6285] Executing state host.present for ctl01
2018-02-06 09:37:21,477 [salt.state       ][INFO    ][6285] Host ctl01 (192.168.10.11) already present
2018-02-06 09:37:21,477 [salt.state       ][INFO    ][6285] Completed state [ctl01] at time 09:37:21.477494 duration_in_ms=1.517
2018-02-06 09:37:21,477 [salt.state       ][INFO    ][6285] Running state [ctl01.mcp-pike-ovs-ha.local] at time 09:37:21.477917
2018-02-06 09:37:21,478 [salt.state       ][INFO    ][6285] Executing state host.present for ctl01.mcp-pike-ovs-ha.local
2018-02-06 09:37:21,479 [salt.state       ][INFO    ][6285] Host ctl01.mcp-pike-ovs-ha.local (192.168.10.11) already present
2018-02-06 09:37:21,480 [salt.state       ][INFO    ][6285] Completed state [ctl01.mcp-pike-ovs-ha.local] at time 09:37:21.479412 duration_in_ms=1.495
2018-02-06 09:37:21,480 [salt.state       ][INFO    ][6285] Running state [ctl] at time 09:37:21.480708
2018-02-06 09:37:21,481 [salt.state       ][INFO    ][6285] Executing state host.present for ctl
2018-02-06 09:37:21,481 [salt.state       ][INFO    ][6285] Host ctl (192.168.10.10) already present
2018-02-06 09:37:21,482 [salt.state       ][INFO    ][6285] Completed state [ctl] at time 09:37:21.482206 duration_in_ms=1.498
2018-02-06 09:37:21,482 [salt.state       ][INFO    ][6285] Running state [ctl.mcp-pike-ovs-ha.local] at time 09:37:21.482625
2018-02-06 09:37:21,483 [salt.state       ][INFO    ][6285] Executing state host.present for ctl.mcp-pike-ovs-ha.local
2018-02-06 09:37:21,484 [salt.state       ][INFO    ][6285] Host ctl.mcp-pike-ovs-ha.local (192.168.10.10) already present
2018-02-06 09:37:21,484 [salt.state       ][INFO    ][6285] Completed state [ctl.mcp-pike-ovs-ha.local] at time 09:37:21.484547 duration_in_ms=1.921
2018-02-06 09:37:21,485 [salt.state       ][INFO    ][6285] Running state [/etc/network/interfaces.d/50-cloud-init.cfg] at time 09:37:21.484976
2018-02-06 09:37:21,485 [salt.state       ][INFO    ][6285] Executing state file.absent for /etc/network/interfaces.d/50-cloud-init.cfg
2018-02-06 09:37:21,485 [salt.state       ][INFO    ][6285] File /etc/network/interfaces.d/50-cloud-init.cfg is not present
2018-02-06 09:37:21,486 [salt.state       ][INFO    ][6285] Completed state [/etc/network/interfaces.d/50-cloud-init.cfg] at time 09:37:21.486302 duration_in_ms=1.326
2018-02-06 09:37:21,486 [salt.state       ][INFO    ][6285] Running state [ens2] at time 09:37:21.486723
2018-02-06 09:37:21,487 [salt.state       ][INFO    ][6285] Executing state network.managed for ens2
2018-02-06 09:37:22,280 [salt.state       ][INFO    ][6285] Interface ens2 is up to date.
2018-02-06 09:37:22,281 [salt.state       ][INFO    ][6285] Completed state [ens2] at time 09:37:22.281154 duration_in_ms=794.43
2018-02-06 09:37:22,281 [salt.state       ][INFO    ][6285] Running state [ens3] at time 09:37:22.281716
2018-02-06 09:37:22,282 [salt.state       ][INFO    ][6285] Executing state network.managed for ens3
2018-02-06 09:37:22,944 [salt.state       ][INFO    ][6285] Interface ens3 is up to date.
2018-02-06 09:37:22,946 [salt.state       ][INFO    ][6285] Completed state [ens3] at time 09:37:22.946381 duration_in_ms=664.667
2018-02-06 09:37:22,947 [salt.state       ][INFO    ][6285] Running state [ens3] at time 09:37:22.947081
2018-02-06 09:37:22,947 [salt.state       ][INFO    ][6285] Executing state network.routes for ens3
2018-02-06 09:37:22,957 [salt.state       ][INFO    ][6285] Interface ens3 routes are up to date.
2018-02-06 09:37:22,957 [salt.state       ][INFO    ][6285] Completed state [ens3] at time 09:37:22.957797 duration_in_ms=10.715
2018-02-06 09:37:22,958 [salt.state       ][INFO    ][6285] Running state [ens4] at time 09:37:22.958287
2018-02-06 09:37:22,958 [salt.state       ][INFO    ][6285] Executing state network.managed for ens4
2018-02-06 09:37:23,675 [salt.state       ][INFO    ][6285] Interface ens4 is up to date.
2018-02-06 09:37:23,677 [salt.state       ][INFO    ][6285] Completed state [ens4] at time 09:37:23.677042 duration_in_ms=718.754
2018-02-06 09:37:23,677 [salt.state       ][INFO    ][6285] Running state [/etc/profile.d/proxy.sh] at time 09:37:23.677625
2018-02-06 09:37:23,678 [salt.state       ][INFO    ][6285] Executing state file.absent for /etc/profile.d/proxy.sh
2018-02-06 09:37:23,678 [salt.state       ][INFO    ][6285] File /etc/profile.d/proxy.sh is not present
2018-02-06 09:37:23,679 [salt.state       ][INFO    ][6285] Completed state [/etc/profile.d/proxy.sh] at time 09:37:23.679353 duration_in_ms=1.728
2018-02-06 09:37:23,680 [salt.state       ][INFO    ][6285] Running state [/etc/apt/apt.conf.d/95proxies] at time 09:37:23.680463
2018-02-06 09:37:23,680 [salt.state       ][INFO    ][6285] Executing state file.absent for /etc/apt/apt.conf.d/95proxies
2018-02-06 09:37:23,681 [salt.state       ][INFO    ][6285] File /etc/apt/apt.conf.d/95proxies is not present
2018-02-06 09:37:23,681 [salt.state       ][INFO    ][6285] Completed state [/etc/apt/apt.conf.d/95proxies] at time 09:37:23.681878 duration_in_ms=1.415
2018-02-06 09:37:23,682 [salt.state       ][INFO    ][6285] Running state [ntp] at time 09:37:23.682352
2018-02-06 09:37:23,682 [salt.state       ][INFO    ][6285] Executing state pkg.installed for ntp
2018-02-06 09:37:23,693 [salt.state       ][INFO    ][6285] All specified packages are already installed
2018-02-06 09:37:23,694 [salt.state       ][INFO    ][6285] Completed state [ntp] at time 09:37:23.694076 duration_in_ms=11.724
2018-02-06 09:37:23,695 [salt.state       ][INFO    ][6285] Running state [/etc/ntp.conf] at time 09:37:23.695730
2018-02-06 09:37:23,696 [salt.state       ][INFO    ][6285] Executing state file.managed for /etc/ntp.conf
2018-02-06 09:37:23,754 [salt.state       ][INFO    ][6285] File /etc/ntp.conf is in the correct state
2018-02-06 09:37:23,754 [salt.state       ][INFO    ][6285] Completed state [/etc/ntp.conf] at time 09:37:23.754772 duration_in_ms=59.042
2018-02-06 09:37:23,756 [salt.state       ][INFO    ][6285] Running state [ntp] at time 09:37:23.756774
2018-02-06 09:37:23,757 [salt.state       ][INFO    ][6285] Executing state service.running for ntp
2018-02-06 09:37:23,758 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command ['systemctl', 'status', 'ntp.service', '-n', '0'] in directory '/root'
2018-02-06 09:37:23,780 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command ['systemctl', 'is-active', 'ntp.service'] in directory '/root'
2018-02-06 09:37:23,802 [salt.loaded.int.module.cmdmod][INFO    ][6285] Executing command ['systemctl', 'is-enabled', 'ntp.service'] in directory '/root'
2018-02-06 09:37:23,825 [salt.state       ][INFO    ][6285] The service ntp is already running
2018-02-06 09:37:23,826 [salt.state       ][INFO    ][6285] Completed state [ntp] at time 09:37:23.826326 duration_in_ms=69.551
2018-02-06 09:37:23,831 [salt.minion      ][INFO    ][6285] Returning information for job: 20180206093707738134
2018-02-06 09:37:50,901 [salt.minion      ][INFO    ][2116] User sudo_ubuntu Executing command ssh.set_auth_key with jid 20180206093750880209
2018-02-06 09:37:50,931 [salt.minion      ][INFO    ][7009] Starting a new job with PID 7009
2018-02-06 09:37:50,944 [salt.loader.192.168.11.2.int.module.ssh][WARNING ][7009] Public Key hashing currently defaults to "md5". This will change to "sha256" in the Nitrogen release.
2018-02-06 09:37:50,946 [salt.minion      ][INFO    ][7009] Returning information for job: 20180206093750880209
2018-02-06 09:37:51,654 [salt.minion      ][INFO    ][2116] User sudo_ubuntu Executing command file.write with jid 20180206093751636713
2018-02-06 09:37:51,677 [salt.minion      ][INFO    ][7018] Starting a new job with PID 7018
2018-02-06 09:37:51,690 [salt.minion      ][INFO    ][7018] Returning information for job: 20180206093751636713
2018-02-06 09:37:52,370 [salt.minion      ][INFO    ][2116] User sudo_ubuntu Executing command system.reboot with jid 20180206093752357850
2018-02-06 09:37:52,391 [salt.minion      ][INFO    ][7023] Starting a new job with PID 7023
2018-02-06 09:37:52,397 [salt.loader.192.168.11.2.int.module.cmdmod][INFO    ][7023] Executing command ['shutdown', '-r', 'now'] in directory '/root'
2018-02-06 09:37:52,507 [salt.utils.parsers][WARNING ][2116] Minion received a SIGTERM. Exiting.
2018-02-06 09:37:52,507 [salt.cli.daemons ][INFO    ][2116] Shutting down the Salt Minion
2018-02-06 09:38:02,883 [salt.cli.daemons ][INFO    ][1359] Setting up the Salt Minion "prx02.mcp-pike-ovs-ha.local"
2018-02-06 09:38:03,095 [salt.cli.daemons ][INFO    ][1359] Starting up the Salt Minion
2018-02-06 09:38:03,097 [salt.utils.event ][INFO    ][1359] Starting pull socket on /var/run/salt/minion/minion_event_02eab499dd_pull.ipc
2018-02-06 09:38:03,731 [salt.minion      ][INFO    ][1359] Creating minion process manager
2018-02-06 09:38:04,960 [salt.loader.192.168.11.2.int.module.cmdmod][INFO    ][1359] Executing command ['date', '+%z'] in directory '/root'
2018-02-06 09:38:04,988 [salt.utils.schedule][INFO    ][1359] Updating job settings for scheduled job: __mine_interval
2018-02-06 09:38:04,991 [salt.minion      ][INFO    ][1359] Added mine.update to scheduler
2018-02-06 09:38:04,996 [salt.minion      ][INFO    ][1359] Minion is starting as user 'root'
2018-02-06 09:38:05,021 [salt.minion      ][INFO    ][1359] Minion is ready to receive requests!
2018-02-06 09:38:06,023 [salt.utils.schedule][INFO    ][1359] Running scheduled job: __mine_interval
2018-02-06 09:38:14,080 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command test.ping with jid 20180206093814064352
2018-02-06 09:38:14,105 [salt.minion      ][INFO    ][1455] Starting a new job with PID 1455
2018-02-06 09:38:14,178 [salt.minion      ][INFO    ][1455] Returning information for job: 20180206093814064352
2018-02-06 09:38:14,895 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command pkg.upgrade with jid 20180206093814876966
2018-02-06 09:38:14,920 [salt.minion      ][INFO    ][1460] Starting a new job with PID 1460
2018-02-06 09:38:14,985 [salt.loader.192.168.11.2.int.module.cmdmod][INFO    ][1460] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-02-06 09:38:15,466 [salt.loader.192.168.11.2.int.module.cmdmod][INFO    ][1460] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'upgrade'] in directory '/root'
2018-02-06 09:38:25,012 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command saltutil.find_job with jid 20180206093824989454
2018-02-06 09:38:25,041 [salt.minion      ][INFO    ][3978] Starting a new job with PID 3978
2018-02-06 09:38:25,084 [salt.minion      ][INFO    ][3978] Returning information for job: 20180206093824989454
2018-02-06 09:38:35,065 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command saltutil.find_job with jid 20180206093835046936
2018-02-06 09:38:35,084 [salt.minion      ][INFO    ][8146] Starting a new job with PID 8146
2018-02-06 09:38:35,102 [salt.minion      ][INFO    ][8146] Returning information for job: 20180206093835046936
2018-02-06 09:38:39,676 [salt.loader.192.168.11.2.int.module.cmdmod][INFO    ][1460] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-02-06 09:38:39,715 [salt.minion      ][INFO    ][1460] Returning information for job: 20180206093814876966
2018-02-06 09:38:41,672 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command test.ping with jid 20180206093841658168
2018-02-06 09:38:41,697 [salt.minion      ][INFO    ][9188] Starting a new job with PID 9188
2018-02-06 09:38:41,767 [salt.minion      ][INFO    ][9188] Returning information for job: 20180206093841658168
2018-02-06 09:38:52,301 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command state.sls with jid 20180206093852290485
2018-02-06 09:38:52,324 [salt.minion      ][INFO    ][9193] Starting a new job with PID 9193
2018-02-06 09:38:52,943 [salt.state       ][INFO    ][9193] Loading fresh modules for state activity
2018-02-06 09:38:53,006 [salt.fileclient  ][INFO    ][9193] Fetching file from saltenv 'base', ** done ** 'keepalived/init.sls'
2018-02-06 09:38:53,038 [salt.fileclient  ][INFO    ][9193] Fetching file from saltenv 'base', ** done ** 'keepalived/cluster.sls'
2018-02-06 09:38:53,841 [salt.state       ][INFO    ][9193] Running state [keepalived] at time 09:38:53.841699
2018-02-06 09:38:53,842 [salt.state       ][INFO    ][9193] Executing state pkg.installed for keepalived
2018-02-06 09:38:53,843 [salt.loaded.int.module.cmdmod][INFO    ][9193] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-02-06 09:38:54,190 [salt.loaded.int.module.cmdmod][INFO    ][9193] Executing command ['apt-cache', '-q', 'policy', 'keepalived'] in directory '/root'
2018-02-06 09:38:54,268 [salt.loaded.int.module.cmdmod][INFO    ][9193] Executing command ['apt-get', '-q', 'update'] in directory '/root'
2018-02-06 09:38:55,957 [salt.loaded.int.module.cmdmod][INFO    ][9193] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-02-06 09:38:55,990 [salt.loaded.int.module.cmdmod][INFO    ][9193] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'keepalived'] in directory '/root'
2018-02-06 09:38:59,945 [salt.loaded.int.module.cmdmod][INFO    ][9193] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-02-06 09:38:59,985 [salt.state       ][INFO    ][9193] Made the following changes:
'libsnmp30' changed from 'absent' to '5.7.3+dfsg-1ubuntu4'
'libnl-3-200' changed from 'absent' to '3.2.27-1ubuntu0.16.04.1'
'libsensors4' changed from 'absent' to '1:3.4.0-2'
'libsnmp-base' changed from 'absent' to '5.7.3+dfsg-1ubuntu4'
'keepalived' changed from 'absent' to '1:1.2.19-1ubuntu0.2'
'ipvsadm' changed from 'absent' to '1:1.28-3'
'libnl-genl-3-200' changed from 'absent' to '3.2.27-1ubuntu0.16.04.1'

2018-02-06 09:39:00,015 [salt.state       ][INFO    ][9193] Loading fresh modules for state activity
2018-02-06 09:39:00,046 [salt.state       ][INFO    ][9193] Completed state [keepalived] at time 09:39:00.045970 duration_in_ms=6204.271
2018-02-06 09:39:00,051 [salt.state       ][INFO    ][9193] Running state [lsof] at time 09:39:00.051401
2018-02-06 09:39:00,051 [salt.state       ][INFO    ][9193] Executing state pkg.installed for lsof
2018-02-06 09:39:00,478 [salt.state       ][INFO    ][9193] All specified packages are already installed
2018-02-06 09:39:00,479 [salt.state       ][INFO    ][9193] Completed state [lsof] at time 09:39:00.479298 duration_in_ms=427.896
2018-02-06 09:39:00,483 [salt.state       ][INFO    ][9193] Running state [/etc/keepalived/keepalived.conf] at time 09:39:00.483501
2018-02-06 09:39:00,484 [salt.state       ][INFO    ][9193] Executing state file.managed for /etc/keepalived/keepalived.conf
2018-02-06 09:39:00,521 [salt.fileclient  ][INFO    ][9193] Fetching file from saltenv 'base', ** done ** 'keepalived/files/keepalived.conf'
2018-02-06 09:39:00,580 [salt.state       ][INFO    ][9193] File changed:
New file
2018-02-06 09:39:00,581 [salt.state       ][INFO    ][9193] Completed state [/etc/keepalived/keepalived.conf] at time 09:39:00.581161 duration_in_ms=97.66
2018-02-06 09:39:00,583 [salt.state       ][INFO    ][9193] Running state [keepalived] at time 09:39:00.583171
2018-02-06 09:39:00,583 [salt.state       ][INFO    ][9193] Executing state service.running for keepalived
2018-02-06 09:39:00,584 [salt.loaded.int.module.cmdmod][INFO    ][9193] Executing command ['systemctl', 'status', 'keepalived.service', '-n', '0'] in directory '/root'
2018-02-06 09:39:00,613 [salt.loaded.int.module.cmdmod][INFO    ][9193] Executing command ['systemctl', 'is-active', 'keepalived.service'] in directory '/root'
2018-02-06 09:39:00,633 [salt.loaded.int.module.cmdmod][INFO    ][9193] Executing command ['systemctl', 'is-enabled', 'keepalived.service'] in directory '/root'
2018-02-06 09:39:00,654 [salt.loaded.int.module.cmdmod][INFO    ][9193] Executing command ['systemctl', 'is-enabled', 'keepalived.service'] in directory '/root'
2018-02-06 09:39:00,674 [salt.loaded.int.module.cmdmod][INFO    ][9193] Executing command ['systemd-run', '--scope', 'systemctl', 'start', 'keepalived.service'] in directory '/root'
2018-02-06 09:39:00,760 [salt.loaded.int.module.cmdmod][INFO    ][9193] Executing command ['systemctl', 'is-active', 'keepalived.service'] in directory '/root'
2018-02-06 09:39:00,787 [salt.loaded.int.module.cmdmod][INFO    ][9193] Executing command ['systemctl', 'is-enabled', 'keepalived.service'] in directory '/root'
2018-02-06 09:39:00,804 [salt.loaded.int.module.cmdmod][INFO    ][9193] Executing command ['systemctl', 'is-enabled', 'keepalived.service'] in directory '/root'
2018-02-06 09:39:00,822 [salt.state       ][INFO    ][9193] {'keepalived': True}
2018-02-06 09:39:00,823 [salt.state       ][INFO    ][9193] Completed state [keepalived] at time 09:39:00.823232 duration_in_ms=240.061
2018-02-06 09:39:00,825 [salt.minion      ][INFO    ][9193] Returning information for job: 20180206093852290485
2018-02-06 09:41:16,476 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command pillar.get with jid 20180206094116463310
2018-02-06 09:41:16,502 [salt.minion      ][INFO    ][10374] Starting a new job with PID 10374
2018-02-06 09:41:16,507 [salt.minion      ][INFO    ][10374] Returning information for job: 20180206094116463310
2018-02-06 10:10:19,022 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command state.sls with jid 20180206101019004303
2018-02-06 10:10:19,041 [salt.minion      ][INFO    ][10454] Starting a new job with PID 10454
2018-02-06 10:10:21,469 [salt.state       ][INFO    ][10454] Loading fresh modules for state activity
2018-02-06 10:10:21,518 [salt.fileclient  ][INFO    ][10454] Fetching file from saltenv 'base', ** done ** 'memcached/init.sls'
2018-02-06 10:10:21,543 [salt.fileclient  ][INFO    ][10454] Fetching file from saltenv 'base', ** done ** 'memcached/server.sls'
2018-02-06 10:10:21,569 [salt.fileclient  ][INFO    ][10454] Fetching file from saltenv 'base', ** done ** 'memcached/map.jinja'
2018-02-06 10:10:22,081 [salt.state       ][INFO    ][10454] Running state [memcached] at time 10:10:22.081131
2018-02-06 10:10:22,082 [salt.state       ][INFO    ][10454] Executing state pkg.installed for memcached
2018-02-06 10:10:22,082 [salt.loaded.int.module.cmdmod][INFO    ][10454] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-02-06 10:10:22,495 [salt.loaded.int.module.cmdmod][INFO    ][10454] Executing command ['apt-cache', '-q', 'policy', 'memcached'] in directory '/root'
2018-02-06 10:10:22,575 [salt.loaded.int.module.cmdmod][INFO    ][10454] Executing command ['apt-get', '-q', 'update'] in directory '/root'
2018-02-06 10:10:26,139 [salt.loaded.int.module.cmdmod][INFO    ][10454] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-02-06 10:10:26,167 [salt.loaded.int.module.cmdmod][INFO    ][10454] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'memcached'] in directory '/root'
2018-02-06 10:10:29,135 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command saltutil.find_job with jid 20180206101029114558
2018-02-06 10:10:29,159 [salt.minion      ][INFO    ][11131] Starting a new job with PID 11131
2018-02-06 10:10:29,179 [salt.loaded.int.module.cmdmod][INFO    ][10454] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-02-06 10:10:29,180 [salt.minion      ][INFO    ][11131] Returning information for job: 20180206101029114558
2018-02-06 10:10:29,223 [salt.state       ][INFO    ][10454] Made the following changes:
'memcached' changed from 'absent' to '1.4.25-2ubuntu1.2'

2018-02-06 10:10:29,245 [salt.state       ][INFO    ][10454] Loading fresh modules for state activity
2018-02-06 10:10:29,279 [salt.state       ][INFO    ][10454] Completed state [memcached] at time 10:10:29.279019 duration_in_ms=7197.889
2018-02-06 10:10:29,285 [salt.state       ][INFO    ][10454] Running state [python-memcache] at time 10:10:29.284993
2018-02-06 10:10:29,285 [salt.state       ][INFO    ][10454] Executing state pkg.installed for python-memcache
2018-02-06 10:10:29,716 [salt.loaded.int.module.cmdmod][INFO    ][10454] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-02-06 10:10:29,748 [salt.loaded.int.module.cmdmod][INFO    ][10454] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'python-memcache'] in directory '/root'
2018-02-06 10:10:31,439 [salt.loaded.int.module.cmdmod][INFO    ][10454] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-02-06 10:10:31,487 [salt.state       ][INFO    ][10454] Made the following changes:
'python-memcache' changed from 'absent' to '1.57-1'

2018-02-06 10:10:31,507 [salt.state       ][INFO    ][10454] Loading fresh modules for state activity
2018-02-06 10:10:31,541 [salt.state       ][INFO    ][10454] Completed state [python-memcache] at time 10:10:31.541620 duration_in_ms=2256.627
2018-02-06 10:10:31,545 [salt.state       ][INFO    ][10454] Running state [/etc/memcached.conf] at time 10:10:31.545933
2018-02-06 10:10:31,546 [salt.state       ][INFO    ][10454] Executing state file.managed for /etc/memcached.conf
2018-02-06 10:10:31,592 [salt.fileclient  ][INFO    ][10454] Fetching file from saltenv 'base', ** done ** 'memcached/files/memcached.conf'
2018-02-06 10:10:31,631 [salt.state       ][INFO    ][10454] File changed:
--- 
+++ 
@@ -1,11 +1,10 @@
+
 # memcached default config file
 # 2003 - Jay Bonci <jaybonci@debian.org>
-# This configuration file is read by the start-memcached script provided as
-# part of the Debian GNU/Linux distribution.
+# This configuration file is read by the start-memcached script provided as part of the Debian GNU/Linux distribution. 
 
 # Run memcached as a daemon. This command is implied, and is not needed for the
-# daemon to run. See the README.Debian that comes with this package for more
-# information.
+# daemon to run. See the README.Debian that comes with this package for more information.
 -d
 
 # Log memcached's output to /var/log/memcached
@@ -18,13 +17,13 @@
 # -vv
 
 # Start with a cap of 64 megs of memory. It's reasonable, and the daemon default
-# Note that the daemon will grow to this size, but does not start out holding this much
-# memory
+# Note that the daemon will grow to this size, but does not start out holding this much memory
 -m 64
 
 # Default connection port is 11211
 -p 11211
 
+-U 11211
 # Run the daemon as root. The start-memcached will default to running as root if no
 # -u command is present in this config file
 -u memcache
@@ -32,10 +31,12 @@
 # Specify which IP address to listen on. The default is to listen on all IP addresses
 # This parameter is one of the only security measures that memcached has, so make sure
 # it's listening on a firewalled interface.
--l 127.0.0.1
+-l 0.0.0.0
 
 # Limit the number of simultaneous incoming connections. The daemon default is 1024
 # -c 1024
+# Mirantis
+-c 8192
 
 # Lock down all paged memory. Consult with the README and homepage before you do this
 # -k
@@ -45,3 +46,6 @@
 
 # Maximize core file limit
 # -r
+
+# Number of threads to use to process incoming requests.
+-t 1
2018-02-06 10:10:31,631 [salt.state       ][INFO    ][10454] Completed state [/etc/memcached.conf] at time 10:10:31.631831 duration_in_ms=85.898
2018-02-06 10:10:31,790 [salt.state       ][INFO    ][10454] Running state [memcached] at time 10:10:31.790610
2018-02-06 10:10:31,790 [salt.state       ][INFO    ][10454] Executing state service.running for memcached
2018-02-06 10:10:31,793 [salt.loaded.int.module.cmdmod][INFO    ][10454] Executing command ['systemctl', 'status', 'memcached.service', '-n', '0'] in directory '/root'
2018-02-06 10:10:31,814 [salt.loaded.int.module.cmdmod][INFO    ][10454] Executing command ['systemctl', 'is-active', 'memcached.service'] in directory '/root'
2018-02-06 10:10:31,830 [salt.loaded.int.module.cmdmod][INFO    ][10454] Executing command ['systemctl', 'is-enabled', 'memcached.service'] in directory '/root'
2018-02-06 10:10:31,846 [salt.state       ][INFO    ][10454] The service memcached is already running
2018-02-06 10:10:31,846 [salt.state       ][INFO    ][10454] Completed state [memcached] at time 10:10:31.846632 duration_in_ms=56.022
2018-02-06 10:10:31,846 [salt.state       ][INFO    ][10454] Running state [memcached] at time 10:10:31.846946
2018-02-06 10:10:31,847 [salt.state       ][INFO    ][10454] Executing state service.mod_watch for memcached
2018-02-06 10:10:31,848 [salt.loaded.int.module.cmdmod][INFO    ][10454] Executing command ['systemctl', 'is-active', 'memcached.service'] in directory '/root'
2018-02-06 10:10:31,864 [salt.loaded.int.module.cmdmod][INFO    ][10454] Executing command ['systemctl', 'is-enabled', 'memcached.service'] in directory '/root'
2018-02-06 10:10:31,879 [salt.loaded.int.module.cmdmod][INFO    ][10454] Executing command ['systemd-run', '--scope', 'systemctl', 'restart', 'memcached.service'] in directory '/root'
2018-02-06 10:10:31,926 [salt.state       ][INFO    ][10454] {'memcached': True}
2018-02-06 10:10:31,926 [salt.state       ][INFO    ][10454] Completed state [memcached] at time 10:10:31.926625 duration_in_ms=79.679
2018-02-06 10:10:31,928 [salt.minion      ][INFO    ][10454] Returning information for job: 20180206101019004303
2018-02-06 10:37:52,975 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command state.sls with jid 20180206103752951148
2018-02-06 10:37:52,997 [salt.minion      ][INFO    ][11352] Starting a new job with PID 11352
2018-02-06 10:37:53,628 [salt.state       ][INFO    ][11352] Loading fresh modules for state activity
2018-02-06 10:37:53,677 [salt.fileclient  ][INFO    ][11352] Fetching file from saltenv 'base', ** done ** 'horizon/init.sls'
2018-02-06 10:37:53,711 [salt.fileclient  ][INFO    ][11352] Fetching file from saltenv 'base', ** done ** 'horizon/server/init.sls'
2018-02-06 10:37:53,737 [salt.fileclient  ][INFO    ][11352] Fetching file from saltenv 'base', ** done ** 'horizon/server/service.sls'
2018-02-06 10:37:53,846 [salt.fileclient  ][INFO    ][11352] Fetching file from saltenv 'base', ** done ** 'horizon/server/plugin.sls'
2018-02-06 10:37:54,425 [salt.state       ][INFO    ][11352] Running state [apache2] at time 10:37:54.425066
2018-02-06 10:37:54,425 [salt.state       ][INFO    ][11352] Executing state pkg.installed for apache2
2018-02-06 10:37:54,426 [salt.loaded.int.module.cmdmod][INFO    ][11352] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-02-06 10:37:54,788 [salt.loaded.int.module.cmdmod][INFO    ][11352] Executing command ['apt-cache', '-q', 'policy', 'apache2'] in directory '/root'
2018-02-06 10:37:54,870 [salt.loaded.int.module.cmdmod][INFO    ][11352] Executing command ['apt-get', '-q', 'update'] in directory '/root'
2018-02-06 10:37:58,332 [salt.loaded.int.module.cmdmod][INFO    ][11352] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-02-06 10:37:58,369 [salt.loaded.int.module.cmdmod][INFO    ][11352] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'apache2'] in directory '/root'
2018-02-06 10:38:03,073 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command saltutil.find_job with jid 20180206103803054346
2018-02-06 10:38:03,091 [salt.minion      ][INFO    ][12574] Starting a new job with PID 12574
2018-02-06 10:38:03,111 [salt.minion      ][INFO    ][12574] Returning information for job: 20180206103803054346
2018-02-06 10:38:04,680 [salt.loaded.int.module.cmdmod][INFO    ][11352] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-02-06 10:38:04,727 [salt.state       ][INFO    ][11352] Made the following changes:
'apache2-data' changed from 'absent' to '2.4.18-2ubuntu3.5'
'httpd-cgi' changed from 'absent' to '1'
'apache2-utils' changed from 'absent' to '2.4.18-2ubuntu3.5'
'httpd' changed from 'absent' to '1'
'ssl-cert' changed from 'absent' to '1.0.37'
'apache2' changed from 'absent' to '2.4.18-2ubuntu3.5'

2018-02-06 10:38:04,748 [salt.state       ][INFO    ][11352] Loading fresh modules for state activity
2018-02-06 10:38:04,785 [salt.state       ][INFO    ][11352] Completed state [apache2] at time 10:38:04.785353 duration_in_ms=10360.288
2018-02-06 10:38:04,791 [salt.state       ][INFO    ][11352] Running state [libapache2-mod-wsgi] at time 10:38:04.791132
2018-02-06 10:38:04,791 [salt.state       ][INFO    ][11352] Executing state pkg.installed for libapache2-mod-wsgi
2018-02-06 10:38:05,207 [salt.state       ][INFO    ][11352] All specified packages are already installed
2018-02-06 10:38:05,208 [salt.state       ][INFO    ][11352] Completed state [libapache2-mod-wsgi] at time 10:38:05.207928 duration_in_ms=416.795
2018-02-06 10:38:05,208 [salt.state       ][INFO    ][11352] Running state [openstack-dashboard] at time 10:38:05.208492
2018-02-06 10:38:05,209 [salt.state       ][INFO    ][11352] Executing state pkg.installed for openstack-dashboard
2018-02-06 10:38:05,229 [salt.loaded.int.module.cmdmod][INFO    ][11352] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-02-06 10:38:05,266 [salt.loaded.int.module.cmdmod][INFO    ][11352] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'openstack-dashboard'] in directory '/root'
2018-02-06 10:38:06,023 [salt.utils.schedule][INFO    ][1359] Running scheduled job: __mine_interval
2018-02-06 10:38:13,307 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command saltutil.find_job with jid 20180206103813279659
2018-02-06 10:38:13,329 [salt.minion      ][INFO    ][13248] Starting a new job with PID 13248
2018-02-06 10:38:13,344 [salt.minion      ][INFO    ][13248] Returning information for job: 20180206103813279659
2018-02-06 10:38:23,339 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command saltutil.find_job with jid 20180206103823314459
2018-02-06 10:38:23,360 [salt.minion      ][INFO    ][14831] Starting a new job with PID 14831
2018-02-06 10:38:23,376 [salt.minion      ][INFO    ][14831] Returning information for job: 20180206103823314459
2018-02-06 10:38:33,363 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command saltutil.find_job with jid 20180206103833341428
2018-02-06 10:38:33,384 [salt.minion      ][INFO    ][15412] Starting a new job with PID 15412
2018-02-06 10:38:33,403 [salt.minion      ][INFO    ][15412] Returning information for job: 20180206103833341428
2018-02-06 10:38:43,415 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command saltutil.find_job with jid 20180206103843390557
2018-02-06 10:38:43,440 [salt.minion      ][INFO    ][15611] Starting a new job with PID 15611
2018-02-06 10:38:43,454 [salt.minion      ][INFO    ][15611] Returning information for job: 20180206103843390557
2018-02-06 10:38:53,440 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command saltutil.find_job with jid 20180206103853419169
2018-02-06 10:38:53,466 [salt.minion      ][INFO    ][15616] Starting a new job with PID 15616
2018-02-06 10:38:53,479 [salt.minion      ][INFO    ][15616] Returning information for job: 20180206103853419169
2018-02-06 10:39:03,470 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command saltutil.find_job with jid 20180206103903449419
2018-02-06 10:39:03,495 [salt.minion      ][INFO    ][15621] Starting a new job with PID 15621
2018-02-06 10:39:03,510 [salt.minion      ][INFO    ][15621] Returning information for job: 20180206103903449419
2018-02-06 10:39:13,563 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command saltutil.find_job with jid 20180206103913537189
2018-02-06 10:39:13,589 [salt.minion      ][INFO    ][15626] Starting a new job with PID 15626
2018-02-06 10:39:13,602 [salt.minion      ][INFO    ][15626] Returning information for job: 20180206103913537189
2018-02-06 10:39:23,585 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command saltutil.find_job with jid 20180206103923563779
2018-02-06 10:39:23,608 [salt.minion      ][INFO    ][15696] Starting a new job with PID 15696
2018-02-06 10:39:23,622 [salt.minion      ][INFO    ][15696] Returning information for job: 20180206103923563779
2018-02-06 10:39:24,109 [salt.loaded.int.module.cmdmod][INFO    ][11352] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-02-06 10:39:24,161 [salt.state       ][INFO    ][11352] Made the following changes:
'python-routes' changed from 'absent' to '2.4.1-1~cloud0'
'python-retrying' changed from 'absent' to '1.3.3-1'
'python-kombu' changed from 'absent' to '4.0.2+really4.0.2+dfsg-2ubuntu1~cloud0'
'python-oslo.concurrency' changed from 'absent' to '3.21.0-0ubuntu2~cloud0'
'python-sqlparse' changed from 'absent' to '0.1.18-1'
'python-pint' changed from 'absent' to '0.6-1ubuntu1'
'python-monotonic' changed from 'absent' to '0.6-2'
'python2.7-pymongo' changed from 'absent' to '1'
'python2.7-bson' changed from 'absent' to '1'
'libtiff5' changed from 'absent' to '4.0.6-1ubuntu0.2'
'python-secretstorage' changed from 'absent' to '2.1.3-1'
'python-glanceclient' changed from 'absent' to '1:2.8.0-0ubuntu1~cloud0'
'python-formencode' changed from 'absent' to '1.3.0-0ubuntu5'
'python-functools32' changed from 'absent' to '3.2.3.2-2'
'python-cachetools' changed from 'absent' to '1.1.6-1~cloud0'
'python-semantic-version' changed from 'absent' to '2.3.1-1'
'python-blinker' changed from 'absent' to '1.3.dfsg2-1build1'
'python-roman' changed from 'absent' to '2.0.0-2'
'python-pastescript' changed from 'absent' to '1.7.5-3build1'
'python-bs4' changed from 'absent' to '4.4.1-1'
'python2.7-pymongo-ext' changed from 'absent' to '1'
'python-tenacity' changed from 'absent' to '3.3.0-0ubuntu1~cloud0'
'python-unittest2' changed from 'absent' to '1.1.0-6.1'
'python-setuptools' changed from 'absent' to '36.2.7-2~cloud0'
'python2.7-django-appconf' changed from 'absent' to '1'
'docutils-doc' changed from 'absent' to '0.12+dfsg-1'
'python-dbus' changed from 'absent' to '1.2.0-3'
'python-gridfs' changed from 'absent' to '3.2-1build1'
'python-fixtures' changed from 'absent' to '3.0.0-2~cloud0'
'python-testtools' changed from 'absent' to '1.8.1-0ubuntu1'
'python-anyjson' changed from 'absent' to '0.3.3-1build1'
'python-jsonschema' changed from 'absent' to '2.5.1-4'
'python-prettytable' changed from 'absent' to '0.7.2-3'
'python-compressor' changed from 'absent' to '2.0-1ubuntu1'
'python-netaddr' changed from 'absent' to '0.7.18-1'
'python-dnspython' changed from 'absent' to '1.15.0-1~cloud0'
'python-babel' changed from 'absent' to '2.4.0+dfsg.1-2ubuntu1~cloud0'
'python-requests' changed from '2.9.1-3' to '2.18.1-1~cloud0'
'python-certifi' changed from 'absent' to '2015.11.20.1-2'
'python-pil' changed from 'absent' to '3.1.2-0ubuntu1.1'
'docutils-common' changed from 'absent' to '0.12+dfsg-1'
'python2.7-lxml' changed from 'absent' to '1'
'python-pika' changed from 'absent' to '0.10.0-1'
'python-osc-lib' changed from 'absent' to '1.7.0-0ubuntu1~cloud0'
'python-keystoneclient' changed from 'absent' to '1:3.13.0-0ubuntu1~cloud0'
'python2.7-simplejson' changed from 'absent' to '1'
'python-extras' changed from 'absent' to '0.0.3-3'
'python2.7-django-openstack-auth' changed from 'absent' to '1'
'python-funcsigs' changed from 'absent' to '1.0.2-3~cloud0'
'python-bson-ext' changed from 'absent' to '3.2-1build1'
'python-scgi' changed from 'absent' to '1.13-1.1build1'
'python2.7-pil' changed from 'absent' to '1'
'python-repoze.lru' changed from 'absent' to '0.6-6'
'python-posix-ipc' changed from 'absent' to '0.9.8-2build2'
'formencode-i18n' changed from 'absent' to '1.3.0-0ubuntu5'
'python2.7-testtools' changed from 'absent' to '1'
'docutils' changed from 'absent' to '1'
'python-django-pyscss' changed from 'absent' to '2.0.2-4'
'ieee-data' changed from 'absent' to '20150531.1'
'python2.7-dbus' changed from 'absent' to '1'
'python-oslo.middleware' changed from 'absent' to '3.30.0-0ubuntu1.1~cloud0'
'python-pygments' changed from 'absent' to '2.2.0+dfsg-1~cloud0'
'python-pillow' changed from 'absent' to '1'
'python2.7-cinderclient' changed from 'absent' to '1'
'libpaperg' changed from 'absent' to '1'
'python2.7-netifaces' changed from 'absent' to '1'
'liblcms2-2' changed from 'absent' to '2.6-3ubuntu2'
'python-oslo.context' changed from 'absent' to '1:2.17.0-0ubuntu1~cloud0'
'python-neutronclient' changed from 'absent' to '1:6.5.0-0ubuntu1.1~cloud0'
'python-pymongo-ext' changed from 'absent' to '3.2-1build1'
'python-urllib3' changed from '1.13.1-2ubuntu0.16.04.1' to '1.21.1-1~cloud0'
'python2.7-pyinotify' changed from 'absent' to '1'
'python-webob' changed from 'absent' to '1:1.7.2-0ubuntu1~cloud0'
'python-pyparsing' changed from 'absent' to '2.1.10+dfsg1-1~cloud0'
'python-babel-localedata' changed from 'absent' to '2.4.0+dfsg.1-2ubuntu1~cloud0'
'python-positional' changed from 'absent' to '1.1.1-3~cloud0'
'python-appconf' changed from 'absent' to '1'
'python-cmd2' changed from 'absent' to '0.6.8-1'
'python-distribute' changed from 'absent' to '1'
'python-oslo-log' changed from 'absent' to '1'
'python-rjsmin' changed from 'absent' to '1.0.12+dfsg1-2ubuntu1'
'python-django-openstack-auth' changed from 'absent' to '3.5.0-0ubuntu1~cloud0'
'python-pathlib' changed from 'absent' to '1.0.1-2'
'python-iso8601' changed from 'absent' to '0.1.11-1'
'python-jsonpatch' changed from 'absent' to '1.19-3'
'python-cinderclient' changed from 'absent' to '1:3.1.0-0ubuntu1~cloud0'
'libwebpmux1' changed from 'absent' to '0.4.4-1'
'python-heatclient' changed from 'absent' to '1.11.0-0ubuntu1~cloud0'
'python-oslo.policy' changed from 'absent' to '1.25.1-0ubuntu1~cloud0'
'python-stevedore' changed from 'absent' to '1:1.25.0-0ubuntu1~cloud0'
'python-paste' changed from 'absent' to '1.7.5.1-6ubuntu3'
'python-openstack-auth' changed from 'absent' to '3.5.0-0ubuntu1~cloud0'
'python-lxml' changed from 'absent' to '3.5.0-1build1'
'python-oslo.config' changed from 'absent' to '1:4.11.0-0ubuntu1~cloud0'
'python-futurist' changed from 'absent' to '0.13.0-2'
'libpaper1' changed from 'absent' to '1.1.24+nmu4ubuntu1'
'python-fasteners' changed from 'absent' to '0.12.0-2ubuntu1'
'python2.7-gi' changed from 'absent' to '1'
'python-linecache2' changed from 'absent' to '1.0.0-2'
'python-mimeparse' changed from 'absent' to '0.1.4-1build1'
'python-pastedeploy-tpl' changed from 'absent' to '1.5.2-1'
'python-oauthlib' changed from 'absent' to '1.0.3-1'
'python2.7-django-compressor' changed from 'absent' to '1'
'python-gi' changed from 'absent' to '3.20.0-0ubuntu1'
'python-contextlib2' changed from 'absent' to '0.5.1-1'
'python2.7-pathlib' changed from 'absent' to '1'
'python-oslo.serialization' changed from 'absent' to '2.20.0-0ubuntu1~cloud0'
'python-oslo.utils' changed from 'absent' to '3.28.0-0ubuntu1~cloud0'
'python-pika-pool' changed from 'absent' to '0.1.3-1ubuntu1'
'python-django' changed from 'absent' to '1.8.7-1ubuntu5.5'
'python-warlock' changed from 'absent' to '1.1.0-1'
'python-debtcollector' changed from 'absent' to '1.3.0-2'
'python2.7-gridfs' changed from 'absent' to '1'
'python-bson' changed from 'absent' to '3.2-1build1'
'python-simplejson' changed from 'absent' to '3.8.1-1ubuntu2'
'python-wrapt' changed from 'absent' to '1.8.0-5build2'
'python-docutils' changed from 'absent' to '0.12+dfsg-1'
'python-openid' changed from 'absent' to '2.2.5-6'
'python-pastedeploy' changed from 'absent' to '1.5.2-1'
'python2.7-cmd2' changed from 'absent' to '1'
'python-tz' changed from 'absent' to '2014.10~dfsg1-0ubuntu2'
'libpaper-utils' changed from 'absent' to '1.1.24+nmu4ubuntu1'
'python-cliff' changed from 'absent' to '2.8.0-0ubuntu1~cloud0'
'python-oslo.i18n' changed from 'absent' to '3.17.0-0ubuntu1~cloud0'
'python-appdirs' changed from 'absent' to '1.4.0-2'
'libjpeg8' changed from 'absent' to '8c-2ubuntu8'
'python-statsd' changed from 'absent' to '3.2.1-2~cloud0'
'libxslt1.1' changed from 'absent' to '1.1.28-2.1ubuntu0.1'
'python-keyring' changed from 'absent' to '7.3-1ubuntu1'
'python-django-appconf' changed from 'absent' to '1.0.1-4'
'python-oslo-utils' changed from 'absent' to '1'
'python-novaclient' changed from 'absent' to '2:9.1.0-0ubuntu1~cloud0'
'python-unicodecsv' changed from 'absent' to '0.14.1-1'
'python-mock' changed from 'absent' to '2.0.0-3~cloud0'
'python-rfc3986' changed from 'absent' to '0.3.1-2~cloud0'
'python-eventlet' changed from 'absent' to '0.18.4-1ubuntu1'
'python-django-horizon' changed from 'absent' to '3:12.0.1-0ubuntu1~cloud0'
'python2.7-pyparsing' changed from 'absent' to '1'
'python-oslo.log' changed from 'absent' to '3.30.0-0ubuntu1~cloud0'
'python-pyscss' changed from 'absent' to '1.3.4-5'
'python-pyinotify' changed from 'absent' to '0.9.6-0fakesync1'
'libjpeg-turbo8' changed from 'absent' to '1.4.2-0ubuntu3'
'python-amqp' changed from 'absent' to '2.1.4-1~cloud0'
'python-pbr' changed from 'absent' to '2.0.0-0ubuntu1~cloud0'
'libwebp5' changed from 'absent' to '0.4.4-1'
'python-vine' changed from 'absent' to '1.1.3+dfsg-2~cloud0'
'python-django-compressor' changed from 'absent' to '2.0-1ubuntu1'
'python-netifaces' changed from 'absent' to '0.10.4-0.1build2'
'python-osprofiler' changed from 'absent' to '1.11.0-0ubuntu1~cloud0'
'python-os-client-config' changed from 'absent' to '1.28.0-0ubuntu1~cloud0'
'python-oslo.messaging' changed from 'absent' to '5.30.0-0ubuntu2~cloud0'
'python-django-common' changed from 'absent' to '1.8.7-1ubuntu5.5'
'python-tempita' changed from 'absent' to '0.5.2-1build1'
'openstack-dashboard' changed from 'absent' to '3:12.0.1-0ubuntu1~cloud0'
'python-json-pointer' changed from 'absent' to '1.9-3'
'python-html5lib' changed from 'absent' to '0.999-4'
'python-swiftclient' changed from 'absent' to '1:3.4.0-0ubuntu1~cloud0'
'python-jwt' changed from 'absent' to '1.3.0-1ubuntu0.1'
'python2.7-iso8601' changed from 'absent' to '1'
'python-greenlet' changed from 'absent' to '0.4.9-2fakesync1'
'python-oslo.service' changed from 'absent' to '1.25.0-0ubuntu1~cloud0'
'python-rcssmin' changed from 'absent' to '1.0.6-1ubuntu1'
'python-ceilometerclient' changed from 'absent' to '2.9.0-0ubuntu1~cloud0'
'python-csscompressor' changed from 'absent' to '0.9.4-2'
'python-traceback2' changed from 'absent' to '1.4.0-3'
'python-keystoneauth1' changed from 'absent' to '3.1.0-0ubuntu2~cloud0'
'python-pymongo' changed from 'absent' to '3.2-1build1'
'python-requestsexceptions' changed from 'absent' to '1.1.2-0ubuntu1'
'python-oslo-context' changed from 'absent' to '1'
'python2.7-bson-ext' changed from 'absent' to '1'
'libjbig0' changed from 'absent' to '2.1-3.1'

2018-02-06 10:39:24,182 [salt.state       ][INFO    ][11352] Loading fresh modules for state activity
2018-02-06 10:39:24,217 [salt.state       ][INFO    ][11352] Completed state [openstack-dashboard] at time 10:39:24.217350 duration_in_ms=79008.858
2018-02-06 10:39:24,222 [salt.state       ][INFO    ][11352] Running state [python-lesscpy] at time 10:39:24.222508
2018-02-06 10:39:24,223 [salt.state       ][INFO    ][11352] Executing state pkg.installed for python-lesscpy
2018-02-06 10:39:25,620 [salt.loaded.int.module.cmdmod][INFO    ][11352] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-02-06 10:39:25,662 [salt.loaded.int.module.cmdmod][INFO    ][11352] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'python-lesscpy'] in directory '/root'
2018-02-06 10:39:27,689 [salt.loaded.int.module.cmdmod][INFO    ][11352] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-02-06 10:39:27,737 [salt.state       ][INFO    ][11352] Made the following changes:
'python-ply' changed from 'absent' to '3.7-1'
'python-lesscpy' changed from 'absent' to '0.10-1'
'python-ply-yacc-3.5' changed from 'absent' to '1'
'python2.7-ply' changed from 'absent' to '1'
'python-ply-lex-3.5' changed from 'absent' to '1'

2018-02-06 10:39:27,765 [salt.state       ][INFO    ][11352] Loading fresh modules for state activity
2018-02-06 10:39:27,934 [salt.state       ][INFO    ][11352] Completed state [python-lesscpy] at time 10:39:27.934169 duration_in_ms=3711.661
2018-02-06 10:39:27,939 [salt.state       ][INFO    ][11352] Running state [python-memcache] at time 10:39:27.939713
2018-02-06 10:39:27,940 [salt.state       ][INFO    ][11352] Executing state pkg.installed for python-memcache
2018-02-06 10:39:28,374 [salt.state       ][INFO    ][11352] All specified packages are already installed
2018-02-06 10:39:28,375 [salt.state       ][INFO    ][11352] Completed state [python-memcache] at time 10:39:28.375270 duration_in_ms=435.556
2018-02-06 10:39:28,375 [salt.state       ][INFO    ][11352] Running state [gettext-base] at time 10:39:28.375851
2018-02-06 10:39:28,376 [salt.state       ][INFO    ][11352] Executing state pkg.installed for gettext-base
2018-02-06 10:39:28,383 [salt.state       ][INFO    ][11352] All specified packages are already installed
2018-02-06 10:39:28,384 [salt.state       ][INFO    ][11352] Completed state [gettext-base] at time 10:39:28.384259 duration_in_ms=8.408
2018-02-06 10:39:28,385 [salt.state       ][INFO    ][11352] Running state [openstack-dashboard-apache] at time 10:39:28.385420
2018-02-06 10:39:28,385 [salt.state       ][INFO    ][11352] Executing state pkg.purged for openstack-dashboard-apache
2018-02-06 10:39:28,398 [salt.state       ][INFO    ][11352] All specified packages are already absent
2018-02-06 10:39:28,399 [salt.state       ][INFO    ][11352] Completed state [openstack-dashboard-apache] at time 10:39:28.399351 duration_in_ms=13.931
2018-02-06 10:39:28,401 [salt.state       ][INFO    ][11352] Running state [/etc/openstack-dashboard/local_settings.py] at time 10:39:28.401725
2018-02-06 10:39:28,402 [salt.state       ][INFO    ][11352] Executing state file.managed for /etc/openstack-dashboard/local_settings.py
2018-02-06 10:39:28,434 [salt.fileclient  ][INFO    ][11352] Fetching file from saltenv 'base', ** done ** 'horizon/files/local_settings/pike_settings.py'
2018-02-06 10:39:28,508 [salt.fileclient  ][INFO    ][11352] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_local_settings.py'
2018-02-06 10:39:28,577 [salt.fileclient  ][INFO    ][11352] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_horizon_settings.py'
2018-02-06 10:39:28,608 [salt.fileclient  ][INFO    ][11352] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_keystone_settings.py'
2018-02-06 10:39:28,653 [salt.fileclient  ][INFO    ][11352] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_nova_settings.py'
2018-02-06 10:39:28,678 [salt.fileclient  ][INFO    ][11352] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_glance_settings.py'
2018-02-06 10:39:28,704 [salt.fileclient  ][INFO    ][11352] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_neutron_settings.py'
2018-02-06 10:39:28,733 [salt.fileclient  ][INFO    ][11352] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_heat_settings.py'
2018-02-06 10:39:28,760 [salt.fileclient  ][INFO    ][11352] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_websso_settings.py'
2018-02-06 10:39:28,788 [salt.fileclient  ][INFO    ][11352] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_ssl_settings.py'
2018-02-06 10:39:28,800 [salt.state       ][INFO    ][11352] File changed:
--- 
+++ 
@@ -1,169 +1,66 @@
-# -*- coding: utf-8 -*-
-
 import os
-
 from django.utils.translation import ugettext_lazy as _
-
-from horizon.utils import secret_key
-
-from openstack_dashboard.settings import HORIZON_CONFIG
+from openstack_dashboard import exceptions
+
+HORIZON_CONFIG = {
+    'user_home': 'openstack_dashboard.views.get_user_home',
+    'ajax_queue_limit': 10,
+    'auto_fade_alerts': {
+        'delay': 3000,
+        'fade_duration': 1500,
+        'types': ['alert-success', 'alert-info']
+    },
+    'help_url': "http://docs.openstack.org",
+    'exceptions': {'recoverable': exceptions.RECOVERABLE,
+                   'not_found': exceptions.NOT_FOUND,
+                   'unauthorized': exceptions.UNAUTHORIZED},
+    'modal_backdrop': 'static',
+    'angular_modules': [],
+    'js_files': [],
+    'js_spec_files': [],
+    'disable_password_reveal': True,
+    'password_autocomplete': 'off'
+}
+
+INSTALLED_APPS = (
+    'openstack_dashboard',
+    'django.contrib.contenttypes',
+    'django.contrib.auth',
+    'django.contrib.sessions',
+    'django.contrib.messages',
+    'django.contrib.staticfiles',
+    'django.contrib.humanize',
+    'compressor',
+    'horizon',
+    'openstack_auth',
+)
+
+
 
 DEBUG = False
 
-# This setting controls whether or not compression is enabled. Disabling
-# compression makes Horizon considerably slower, but makes it much easier
-# to debug JS and CSS changes
-#COMPRESS_ENABLED = not DEBUG
-
-# This setting controls whether compression happens on the fly, or offline
-# with `python manage.py compress`
-# See https://django-compressor.readthedocs.io/en/latest/usage/#offline-compression
-# for more information
-#COMPRESS_OFFLINE = not DEBUG
-
-# WEBROOT is the location relative to Webserver root
-# should end with a slash.
-WEBROOT = '/'
-#LOGIN_URL = WEBROOT + 'auth/login/'
-#LOGOUT_URL = WEBROOT + 'auth/logout/'
-#
-# LOGIN_REDIRECT_URL can be used as an alternative for
-# HORIZON_CONFIG.user_home, if user_home is not set.
-# Do not set it to '/home/', as this will cause circular redirect loop
-#LOGIN_REDIRECT_URL = WEBROOT
-
-# If horizon is running in production (DEBUG is False), set this
-# with the list of host/domain names that the application can serve.
-# For more information see:
-# https://docs.djangoproject.com/en/dev/ref/settings/#allowed-hosts
-#ALLOWED_HOSTS = ['horizon.example.com', ]
-
-# Set SSL proxy settings:
-# Pass this header from the proxy after terminating the SSL,
-# and don't forget to strip it from the client's request.
-# For more information see:
-# https://docs.djangoproject.com/en/dev/ref/settings/#secure-proxy-ssl-header
-#SECURE_PROXY_SSL_HEADER = ('HTTP_X_FORWARDED_PROTO', 'https')
-
-# If Horizon is being served through SSL, then uncomment the following two
-# settings to better secure the cookies from security exploits
-#CSRF_COOKIE_SECURE = True
-#SESSION_COOKIE_SECURE = True
-
-# The absolute path to the directory where message files are collected.
-# The message file must have a .json file extension. When the user logins to
-# horizon, the message files collected are processed and displayed to the user.
-#MESSAGES_PATH=None
-
-# Overrides for OpenStack API versions. Use this setting to force the
-# OpenStack dashboard to use a specific API version for a given service API.
-# Versions specified here should be integers or floats, not strings.
-# NOTE: The version should be formatted as it appears in the URL for the
-# service API. For example, The identity service APIs have inconsistent
-# use of the decimal point, so valid options would be 2.0 or 3.
-# Minimum compute version to get the instance locked status is 2.9.
-#OPENSTACK_API_VERSIONS = {
-#    "data-processing": 1.1,
-#    "identity": 3,
-#    "image": 2,
-#    "volume": 2,
-#    "compute": 2,
-#}
-
-# Set this to True if running on a multi-domain model. When this is enabled, it
-# will require the user to enter the Domain name in addition to the username
-# for login.
-#OPENSTACK_KEYSTONE_MULTIDOMAIN_SUPPORT = False
-
-# Set this to True if you want available domains displayed as a dropdown menu
-# on the login screen. It is strongly advised NOT to enable this for public
-# clouds, as advertising enabled domains to unauthenticated customers
-# irresponsibly exposes private information. This should only be used for
-# private clouds where the dashboard sits behind a corporate firewall.
-#OPENSTACK_KEYSTONE_DOMAIN_DROPDOWN = False
-
-# If OPENSTACK_KEYSTONE_DOMAIN_DROPDOWN is enabled, this option can be used to
-# set the available domains to choose from. This is a list of pairs whose first
-# value is the domain name and the second is the display name.
-#OPENSTACK_KEYSTONE_DOMAIN_CHOICES = (
-#  ('Default', 'Default'),
-#)
-
-# Overrides the default domain used when running on single-domain model
-# with Keystone V3. All entities will be created in the default domain.
-# NOTE: This value must be the name of the default domain, NOT the ID.
-# Also, you will most likely have a value in the keystone policy file like this
-#    "cloud_admin": "rule:admin_required and domain_id:<your domain id>"
-# This value must be the name of the domain whose ID is specified there.
-#OPENSTACK_KEYSTONE_DEFAULT_DOMAIN = 'Default'
-
-# Set this to True to enable panels that provide the ability for users to
-# manage Identity Providers (IdPs) and establish a set of rules to map
-# federation protocol attributes to Identity API attributes.
-# This extension requires v3.0+ of the Identity API.
-#OPENSTACK_KEYSTONE_FEDERATION_MANAGEMENT = False
-
-# Set Console type:
-# valid options are "AUTO"(default), "VNC", "SPICE", "RDP", "SERIAL" or None
-# Set to None explicitly if you want to deactivate the console.
-#CONSOLE_TYPE = "AUTO"
-
-# If provided, a "Report Bug" link will be displayed in the site header
-# which links to the value of this setting (ideally a URL containing
-# information on how to report issues).
-#HORIZON_CONFIG["bug_url"] = "http://bug-report.example.com"
-
-# Show backdrop element outside the modal, do not close the modal
-# after clicking on backdrop.
-#HORIZON_CONFIG["modal_backdrop"] = "static"
-
-# Specify a regular expression to validate user passwords.
-#HORIZON_CONFIG["password_validator"] = {
-#    "regex": '.*',
-#    "help_text": _("Your password does not meet the requirements."),
-#}
-
-# Disable simplified floating IP address management for deployments with
-# multiple floating IP pools or complex network requirements.
-#HORIZON_CONFIG["simple_ip_management"] = False
-
-# Turn off browser autocompletion for forms including the login form and
-# the database creation workflow if so desired.
-#HORIZON_CONFIG["password_autocomplete"] = "off"
-
-# Setting this to True will disable the reveal button for password fields,
-# including on the login form.
-#HORIZON_CONFIG["disable_password_reveal"] = False
+TEMPLATE_DEBUG = DEBUG
+
+ALLOWED_HOSTS = ['*']
+
+AUTHENTICATION_URLS = ['openstack_auth.urls']
 
 LOCAL_PATH = os.path.dirname(os.path.abspath(__file__))
 
-# Set custom secret key:
-# You can either set it to a specific value or you can let horizon generate a
-# default secret key that is unique on this machine, e.i. regardless of the
-# amount of Python WSGI workers (if used behind Apache+mod_wsgi): However,
-# there may be situations where you would want to set this explicitly, e.g.
-# when multiple dashboard instances are distributed on different machines
-# (usually behind a load-balancer). Either you have to make sure that a session
-# gets all requests routed to the same dashboard instance or you set the same
-# SECRET_KEY for all of them.
-SECRET_KEY = secret_key.generate_or_read_from_file('/var/lib/openstack-dashboard/secret_key')
-
-# We recommend you use memcached for development; otherwise after every reload
-# of the django development server, you will have to login again. To use
-# memcached set CACHES to something like
+SECRET_KEY = 'opaesee8Que2yahJoh9fo0eefo1Aeyo6ahyei8zeiboh3aeth5loth7ieNa5xi5e'
 
 CACHES = {
     'default': {
+
+        'OPTIONS': {
+                'DEAD_RETRY': 1,
+                'SERVER_RETRIES': 1,
+                'SOCKET_TIMEOUT': 1,
+        },
         'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
-        'LOCATION': '127.0.0.1:11211',
-    },
-}
-
-#CACHES = {
-#    'default': {
-#        'BACKEND': 'django.core.cache.backends.locmem.LocMemCache',
-#    }
-#}
+        'LOCATION': "100.64.200.105:11211"
+    }
+}
 
 # Send email to the console by default
 EMAIL_BACKEND = 'django.core.mail.backends.console.EmailBackend'
@@ -171,75 +68,238 @@
 #EMAIL_BACKEND = 'django.core.mail.backends.dummy.EmailBackend'
 
 # Configure these for your outgoing email host
-#EMAIL_HOST = 'smtp.my-company.com'
-#EMAIL_PORT = 25
-#EMAIL_HOST_USER = 'djangomail'
-#EMAIL_HOST_PASSWORD = 'top-secret!'
+# EMAIL_HOST = 'smtp.my-company.com'
+# EMAIL_PORT = 25
+# EMAIL_HOST_USER = 'djangomail'
+# EMAIL_HOST_PASSWORD = 'top-secret!'
+
+# The number of objects (Swift containers/objects or images) to display
+# on a single page before providing a paging element (a "more" link)
+# to paginate results.
+API_RESULT_LIMIT = 1000
+API_RESULT_PAGE_SIZE = 20
+
+# The timezone of the server. This should correspond with the timezone
+# of your entire OpenStack installation, and hopefully be in UTC.
+TIME_ZONE = "UTC"
+
+COMPRESS_OFFLINE = True
+
+# Trove user and database extension support. By default support for
+# creating users and databases on database instances is turned on.
+# To disable these extensions set the permission here to something
+# unusable such as ["!"].
+# TROVE_ADD_USER_PERMS = []
+# TROVE_ADD_DATABASE_PERMS = []
+
+SITE_BRANDING = 'OpenStack Dashboard'
+SESSION_COOKIE_HTTPONLY = True
+BOOT_ONLY_FROM_VOLUME = True
+
+REST_API_REQUIRED_SETTINGS = ['OPENSTACK_HYPERVISOR_FEATURES',
+                              'LAUNCH_INSTANCE_DEFAULTS',
+                              'OPENSTACK_IMAGE_FORMATS']
+
+
+# Specify a regular expression to validate user passwords.
+# HORIZON_CONFIG["password_validator"] = {
+#     "regex": '.*',
+#     "help_text": _("Your password does not meet the requirements.")
+# }
+
+# Turn off browser autocompletion for the login form if so desired.
+# HORIZON_CONFIG["password_autocomplete"] = "off"
+
+# The Horizon Policy Enforcement engine uses these values to load per service
+# policy rule files. The content of these files should match the files the
+# OpenStack services are using to determine role based access control in the
+# target installation.
+
+SESSION_TIMEOUT = 43200
+SESSION_ENGINE = "django.contrib.sessions.backends.cache"
+DROPDOWN_MAX_ITEMS = 30
+
+# Path to directory containing policy.json files
+POLICY_FILES_PATH = "/usr/share/openstack-dashboard/openstack_dashboard/conf"
+# Map of local copy of service policy files
+POLICY_FILES = {
+    "compute": "nova_policy.json",
+    "network": "neutron_policy.json",
+    "image": "glance_policy.json",
+    "telemetry": "ceilometer_policy.json",
+    "volume": "cinder_policy.json",
+    "orchestration": "heat_policy.json",
+    "identity": "keystone_policy.json",
+}
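For reference, each entry in the mapping above is resolved by joining it onto POLICY_FILES_PATH. A minimal sketch of that lookup using only the standard library (the `policy_path` helper is ours for illustration, not Horizon's loader API):

```python
import os

# Values copied from the settings above (subset of the full mapping).
POLICY_FILES_PATH = "/usr/share/openstack-dashboard/openstack_dashboard/conf"
POLICY_FILES = {
    "compute": "nova_policy.json",
    "identity": "keystone_policy.json",
}

def policy_path(service):
    # Join the configured directory with the per-service policy file name.
    return os.path.join(POLICY_FILES_PATH, POLICY_FILES[service])

print(policy_path("compute"))
```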
+
+LOGGING = {
+    'version': 1,
+    # When set to True this will disable all logging except
+    # for loggers specified in this configuration dictionary. Note that
+    # if nothing is specified here and disable_existing_loggers is True,
+    # django.db.backends will still log unless it is disabled explicitly.
+    'disable_existing_loggers': False,
+    'handlers': {
+        'null': {
+            'level': 'DEBUG',
+            'class': 'logging.NullHandler',
+        },
+        'console': {
+            # Set the level to "DEBUG" for verbose output logging.
+            'level': 'INFO',
+            'class': 'logging.StreamHandler',
+        },
+        'file': {
+            'level': 'DEBUG',
+            'class': 'logging.FileHandler',
+            'filename': '/var/log/horizon/horizon.log',
+        },
+    },
+    'loggers': {
+        # Logging from django.db.backends is VERY verbose, send to null
+        # by default.
+        'django.db.backends': {
+            'handlers': ['null'],
+            'propagate': False,
+        },
+        'requests': {
+            'handlers': ['null'],
+            'propagate': False,
+        },
+        'horizon': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'openstack_dashboard': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'novaclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'cinderclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'keystoneclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'glanceclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'neutronclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'heatclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'ceilometerclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'troveclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'mistralclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'swiftclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'openstack_auth': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'scss.expression': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'nose.plugins.manager': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'django': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'iso8601': {
+            'handlers': ['null'],
+            'propagate': False,
+        },
+    }
+}
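The LOGGING dict above follows Python's standard dictConfig schema, so its routing can be exercised outside Django. A small sketch of the same pattern (an in-memory stream stands in for the file handler, and the stdlib NullHandler is used; logger names match the config above but the rest is illustrative):

```python
import io
import logging.config

stream = io.StringIO()  # stands in for the FileHandler above

logging.config.dictConfig({
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        'null': {'class': 'logging.NullHandler'},
        'buf': {'class': 'logging.StreamHandler', 'stream': stream},
    },
    'loggers': {
        # Same pattern as above: the noisy logger is swallowed,
        # the application logger goes to its own handler.
        'django.db.backends': {'handlers': ['null'], 'propagate': False},
        'horizon': {'handlers': ['buf'], 'level': 'DEBUG', 'propagate': False},
    },
})

logging.getLogger('django.db.backends').debug('dropped')
logging.getLogger('horizon').debug('kept')
print(stream.getvalue())  # only the horizon record arrives
```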
+
+
+# Overrides for OpenStack API versions. Use this setting to force the
+# OpenStack dashboard to use a specific API version for a given service API.
+# NOTE: The version should be formatted as it appears in the URL for the
+# service API. For example, the identity service APIs have inconsistent
+# use of the decimal point, so valid options would be "2.0" or "3".
+OPENSTACK_API_VERSIONS = {
+    "identity": 3
+}
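As the note above says, the configured value is spliced into the endpoint URL path verbatim; a throwaway illustration (the helper name is ours, not part of Horizon):

```python
def identity_endpoint(host, version):
    # The version value is used as-is in the URL path,
    # hence "3" or "2.0", never "v3".
    return "http://%s:5000/v%s" % (host, version)

print(identity_endpoint("192.168.10.10", 3))
```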
+# Set this to True if running on multi-domain model. When this is enabled, it
+# will require user to enter the Domain name in addition to username for login.
+# OPENSTACK_KEYSTONE_MULTIDOMAIN_SUPPORT = False
+
+# Overrides the default domain used when running on single-domain model
+# with Keystone V3. All entities will be created in the default domain.
+# OPENSTACK_KEYSTONE_DEFAULT_DOMAIN = 'Default'
 
 # For multiple regions uncomment this configuration, and add (endpoint, title).
-#AVAILABLE_REGIONS = [
-#    ('http://cluster1.example.com:5000/v2.0', 'cluster1'),
-#    ('http://cluster2.example.com:5000/v2.0', 'cluster2'),
-#]
-
-OPENSTACK_HOST = "127.0.0.1"
-OPENSTACK_KEYSTONE_URL = "http://%s:5000/v2.0" % OPENSTACK_HOST
-OPENSTACK_KEYSTONE_DEFAULT_ROLE = "_member_"
-
-# For setting the default service region on a per-endpoint basis. Note that the
-# default value for this setting is {}, and below is just an example of how it
-# should be specified.
-#DEFAULT_SERVICE_REGIONS = {
-#    OPENSTACK_KEYSTONE_URL: 'RegionOne'
-#}
-
-# Enables keystone web single-sign-on if set to True.
-#WEBSSO_ENABLED = False
-
-# Determines which authentication choice to show as default.
-#WEBSSO_INITIAL_CHOICE = "credentials"
-
-# The list of authentication mechanisms which include keystone
-# federation protocols and identity provider/federation protocol
-# mapping keys (WEBSSO_IDP_MAPPING). Current supported protocol
-# IDs are 'saml2' and 'oidc'  which represent SAML 2.0, OpenID
-# Connect respectively.
-# Do not remove the mandatory credentials mechanism.
-# Note: The last two tuples are sample mapping keys to a identity provider
-# and federation protocol combination (WEBSSO_IDP_MAPPING).
-#WEBSSO_CHOICES = (
-#    ("credentials", _("Keystone Credentials")),
-#    ("oidc", _("OpenID Connect")),
-#    ("saml2", _("Security Assertion Markup Language")),
-#    ("acme_oidc", "ACME - OpenID Connect"),
-#    ("acme_saml2", "ACME - SAML2"),
-#)
-
-# A dictionary of specific identity provider and federation protocol
-# combinations. From the selected authentication mechanism, the value
-# will be looked up as keys in the dictionary. If a match is found,
-# it will redirect the user to a identity provider and federation protocol
-# specific WebSSO endpoint in keystone, otherwise it will use the value
-# as the protocol_id when redirecting to the WebSSO by protocol endpoint.
-# NOTE: The value is expected to be a tuple formatted as: (<idp_id>, <protocol_id>).
-#WEBSSO_IDP_MAPPING = {
-#    "acme_oidc": ("acme", "oidc"),
-#    "acme_saml2": ("acme", "saml2"),
-#}
-
-# The Keystone Provider drop down uses Keystone to Keystone federation
-# to switch between Keystone service providers.
-# Set display name for Identity Provider (dropdown display name)
-#KEYSTONE_PROVIDER_IDP_NAME = "Local Keystone"
-# This id is used for only for comparison with the service provider IDs. This ID
-# should not match any service provider IDs.
-#KEYSTONE_PROVIDER_IDP_ID = "localkeystone"
+# AVAILABLE_REGIONS = [
+#     ('http://cluster1.example.com:5000/v2.0', 'cluster1'),
+#     ('http://cluster2.example.com:5000/v2.0', 'cluster2'),
+# ]
+
+OPENSTACK_HOST = "192.168.10.10"
+OPENSTACK_KEYSTONE_URL = "http://%s:5000/v3" % OPENSTACK_HOST
+
+OPENSTACK_KEYSTONE_MULTIDOMAIN_SUPPORT = False
+OPENSTACK_KEYSTONE_DEFAULT_DOMAIN = "default"
+
+OPENSTACK_KEYSTONE_DEFAULT_ROLE = "Member"
 
 # Disable SSL certificate checks (useful for self-signed certificates):
-#OPENSTACK_SSL_NO_VERIFY = True
+# OPENSTACK_SSL_NO_VERIFY = True
 
 # The CA certificate to use to verify SSL connections
-#OPENSTACK_SSL_CACERT = '/path/to/cacert.pem'
+# OPENSTACK_SSL_CACERT = '/path/to/cacert.pem'
+
+# OPENSTACK_ENDPOINT_TYPE specifies the endpoint type to use for the endpoints
+# in the Keystone service catalog. Use this setting when Horizon is running
+# external to the OpenStack environment. The default is 'publicURL'.
+OPENSTACK_ENDPOINT_TYPE = "internalURL"
+
+# SECONDARY_ENDPOINT_TYPE specifies the fallback endpoint type to use in the
+# case that OPENSTACK_ENDPOINT_TYPE is not present in the endpoints
+# in the Keystone service catalog. Use this setting when Horizon is running
+# external to the OpenStack environment. The default is None. This
+# value should differ from OPENSTACK_ENDPOINT_TYPE if used.
+# SECONDARY_ENDPOINT_TYPE = "publicURL"
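The primary/secondary selection described above is effectively a two-step lookup over a catalog entry's interfaces. A simplified sketch under that assumption (not Horizon's actual code; the sample URLs are made up):

```python
def pick_endpoint(endpoints, primary="internalURL", secondary="publicURL"):
    # Try the configured endpoint type first, then the fallback.
    for interface in (primary, secondary):
        if interface in endpoints:
            return endpoints[interface]
    raise KeyError("no usable endpoint")

catalog_entry = {"publicURL": "http://203.0.113.5:9292"}  # no internalURL here
print(pick_endpoint(catalog_entry))
```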
 
 # The OPENSTACK_KEYSTONE_BACKEND settings can be used to identify the
 # capabilities of the auth backend for Keystone.
@@ -253,43 +313,13 @@
     'can_edit_group': True,
     'can_edit_project': True,
     'can_edit_domain': True,
-    'can_edit_role': True,
-}
-
-# Setting this to True, will add a new "Retrieve Password" action on instance,
-# allowing Admin session password retrieval/decryption.
-#OPENSTACK_ENABLE_PASSWORD_RETRIEVE = False
-
-# This setting allows deployers to control whether a token is deleted on log
-# out. This can be helpful when there are often long running processes being
-# run in the Horizon environment.
-#TOKEN_DELETION_DISABLED = False
-
-# The Launch Instance user experience has been significantly enhanced.
-# You can choose whether to enable the new launch instance experience,
-# the legacy experience, or both. The legacy experience will be removed
-# in a future release, but is available as a temporary backup setting to ensure
-# compatibility with existing deployments. Further development will not be
-# done on the legacy experience. Please report any problems with the new
-# experience via the Launchpad tracking system.
-#
-# Toggle LAUNCH_INSTANCE_LEGACY_ENABLED and LAUNCH_INSTANCE_NG_ENABLED to
-# determine the experience to enable.  Set them both to true to enable
-# both.
-#LAUNCH_INSTANCE_LEGACY_ENABLED = True
-#LAUNCH_INSTANCE_NG_ENABLED = False
-
-# A dictionary of settings which can be used to provide the default values for
-# properties found in the Launch Instance modal.
-#LAUNCH_INSTANCE_DEFAULTS = {
-#    'config_drive': False,
-#    'enable_scheduler_hints': True,
-#    'disable_image': False,
-#    'disable_instance_snapshot': False,
-#    'disable_volume': False,
-#    'disable_volume_snapshot': False,
-#    'create_volume': True,
-#}
+    'can_edit_role': True
+}
+
+
+# Set Console type:
+# valid options would be "AUTO", "VNC" or "SPICE"
+# CONSOLE_TYPE = "AUTO"
 
 # The Xen Hypervisor has the ability to set the mount point for volumes
 # attached to instances (other Hypervisors currently do not). Setting
@@ -298,97 +328,52 @@
 OPENSTACK_HYPERVISOR_FEATURES = {
     'can_set_mount_point': False,
     'can_set_password': False,
-    'requires_keypair': False,
-    'enable_quotas': True
-}
-
-# The OPENSTACK_CINDER_FEATURES settings can be used to enable optional
-# services provided by cinder that is not exposed by its extension API.
-OPENSTACK_CINDER_FEATURES = {
-    'enable_backup': False,
-}
-
-# The OPENSTACK_NEUTRON_NETWORK settings can be used to enable optional
-# services provided by neutron. Options currently available are load
-# balancer service, security groups, quotas, VPN service.
-OPENSTACK_NEUTRON_NETWORK = {
-    'enable_router': True,
-    'enable_quotas': True,
-    'enable_ipv6': True,
-    'enable_distributed_router': False,
-    'enable_ha_router': False,
-    'enable_fip_topology_check': True,
-
-    # Default dns servers you would like to use when a subnet is
-    # created.  This is only a default, users can still choose a different
-    # list of dns servers when creating a new subnet.
-    # The entries below are examples only, and are not appropriate for
-    # real deployments
-    # 'default_dns_nameservers': ["8.8.8.8", "8.8.4.4", "208.67.222.222"],
-
-    # Set which provider network types are supported. Only the network types
-    # in this list will be available to choose from when creating a network.
-    # Network types include local, flat, vlan, gre, vxlan and geneve.
-    # 'supported_provider_types': ['*'],
-
-    # You can configure available segmentation ID range per network type
-    # in your deployment.
-    # 'segmentation_id_range': {
-    #     'vlan': [1024, 2048],
-    #     'vxlan': [4094, 65536],
-    # },
-
-    # You can define additional provider network types here.
-    # 'extra_provider_types': {
-    #     'awesome_type': {
-    #         'display_name': 'Awesome New Type',
-    #         'require_physical_network': False,
-    #         'require_segmentation_id': True,
-    #     }
-    # },
-
-    # Set which VNIC types are supported for port binding. Only the VNIC
-    # types in this list will be available to choose from when creating a
-    # port.
-    # VNIC types include 'normal', 'direct', 'direct-physical', 'macvtap',
-    # 'baremetal' and 'virtio-forwarder'
-    # Set to empty list or None to disable VNIC type selection.
-    'supported_vnic_types': ['*'],
-
-    # Set list of available physical networks to be selected in the physical
-    # network field on the admin create network modal. If it's set to an empty
-    # list, the field will be a regular input field.
-    # e.g. ['default', 'test']
-    'physical_networks': [],
-
-}
-
-# The OPENSTACK_HEAT_STACK settings can be used to disable password
-# field required while launching the stack.
-OPENSTACK_HEAT_STACK = {
-    'enable_user_pass': True,
-}
+}
+
+# When set, enables the instance action "Retrieve password"
+# allowing password retrieval
+OPENSTACK_ENABLE_PASSWORD_RETRIEVE = False
+
+# When launching an instance, the menu of available flavors is
+# sorted by RAM usage, ascending.  Provide a callback method here
+# (and/or a flag for reverse sort) for the sorted() method if you'd
+# like a different behaviour.  For more info, see
+# http://docs.python.org/2/library/functions.html#sorted
+# CREATE_INSTANCE_FLAVOR_SORT = {
+#     'key': my_awesome_callback_method,
+#     'reverse': False,
+# }
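The CREATE_INSTANCE_FLAVOR_SORT hook described above plugs straight into Python's sorted(); a self-contained illustration with made-up flavor data:

```python
flavors = [
    {"name": "m1.large", "ram": 8192},
    {"name": "m1.tiny", "ram": 512},
    {"name": "m1.small", "ram": 2048},
]

# Default behaviour: ascending by RAM; 'reverse': True would flip the order.
by_ram = sorted(flavors, key=lambda f: f["ram"])
print([f["name"] for f in by_ram])
```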
+
+FLAVOR_EXTRA_KEYS = {
+    'flavor_keys': [
+        ('quota:read_bytes_sec', _('Quota: Read bytes')),
+        ('quota:write_bytes_sec', _('Quota: Write bytes')),
+        ('quota:cpu_quota', _('Quota: CPU')),
+        ('quota:cpu_period', _('Quota: CPU period')),
+        ('quota:inbound_average', _('Quota: Inbound average')),
+        ('quota:outbound_average', _('Quota: Outbound average')),
+    ]
+}
+
 
 # The OPENSTACK_IMAGE_BACKEND settings can be used to customize features
 # in the OpenStack Dashboard related to the Image service, such as the list
 # of supported image formats.
-#OPENSTACK_IMAGE_BACKEND = {
-#    'image_formats': [
-#        ('', _('Select format')),
-#        ('aki', _('AKI - Amazon Kernel Image')),
-#        ('ami', _('AMI - Amazon Machine Image')),
-#        ('ari', _('ARI - Amazon Ramdisk Image')),
-#        ('docker', _('Docker')),
-#        ('iso', _('ISO - Optical Disk Image')),
-#        ('ova', _('OVA - Open Virtual Appliance')),
-#        ('qcow2', _('QCOW2 - QEMU Emulator')),
-#        ('raw', _('Raw')),
-#        ('vdi', _('VDI - Virtual Disk Image')),
-#        ('vhd', _('VHD - Virtual Hard Disk')),
-#        ('vhdx', _('VHDX - Large Virtual Hard Disk')),
-#        ('vmdk', _('VMDK - Virtual Machine Disk')),
-#    ],
-#}
+OPENSTACK_IMAGE_BACKEND = {
+    'image_formats': [
+        ('', ''),
+        ('aki', _('AKI - Amazon Kernel Image')),
+        ('ami', _('AMI - Amazon Machine Image')),
+        ('ari', _('ARI - Amazon Ramdisk Image')),
+        ('iso', _('ISO - Optical Disk Image')),
+        ('qcow2', _('QCOW2 - QEMU Emulator')),
+        ('raw', _('Raw')),
+        ('vdi', _('VDI')),
+        ('vhd', _('VHD')),
+        ('vmdk', _('VMDK')),
+        ('docker', _('Docker Container'))
+    ]
+}
 
 # The IMAGE_CUSTOM_PROPERTY_TITLES settings is used to customize the titles for
 # image custom property attributes that appear on image detail pages.
@@ -398,273 +383,53 @@
     "ramdisk_id": _("Ramdisk ID"),
     "image_state": _("Euca2ools state"),
     "project_id": _("Project ID"),
-    "image_type": _("Image Type"),
-}
-
-# The IMAGE_RESERVED_CUSTOM_PROPERTIES setting is used to specify which image
-# custom properties should not be displayed in the Image Custom Properties
-# table.
-IMAGE_RESERVED_CUSTOM_PROPERTIES = []
-
-# Set to 'legacy' or 'direct' to allow users to upload images to glance via
-# Horizon server. When enabled, a file form field will appear on the create
-# image form. If set to 'off', there will be no file form field on the create
-# image form. See documentation for deployment considerations.
-#HORIZON_IMAGES_UPLOAD_MODE = 'legacy'
-
-# Allow a location to be set when creating or updating Glance images.
-# If using Glance V2, this value should be False unless the Glance
-# configuration and policies allow setting locations.
-#IMAGES_ALLOW_LOCATION = False
-
-# A dictionary of default settings for create image modal.
-#CREATE_IMAGE_DEFAULTS = {
-#    'image_visibility': "public",
-#}
-
-# OPENSTACK_ENDPOINT_TYPE specifies the endpoint type to use for the endpoints
-# in the Keystone service catalog. Use this setting when Horizon is running
-# external to the OpenStack environment. The default is 'publicURL'.
-#OPENSTACK_ENDPOINT_TYPE = "publicURL"
-
-# SECONDARY_ENDPOINT_TYPE specifies the fallback endpoint type to use in the
-# case that OPENSTACK_ENDPOINT_TYPE is not present in the endpoints
-# in the Keystone service catalog. Use this setting when Horizon is running
-# external to the OpenStack environment. The default is None. This
-# value should differ from OPENSTACK_ENDPOINT_TYPE if used.
-#SECONDARY_ENDPOINT_TYPE = None
-
-# The number of objects (Swift containers/objects or images) to display
-# on a single page before providing a paging element (a "more" link)
-# to paginate results.
-API_RESULT_LIMIT = 1000
-API_RESULT_PAGE_SIZE = 20
-
-# The size of chunk in bytes for downloading objects from Swift
-SWIFT_FILE_TRANSFER_CHUNK_SIZE = 512 * 1024
-
-# The default number of lines displayed for instance console log.
-INSTANCE_LOG_LENGTH = 35
-
-# Specify a maximum number of items to display in a dropdown.
-DROPDOWN_MAX_ITEMS = 30
-
-# The timezone of the server. This should correspond with the timezone
-# of your entire OpenStack installation, and hopefully be in UTC.
-TIME_ZONE = "UTC"
-
-# When launching an instance, the menu of available flavors is
-# sorted by RAM usage, ascending. If you would like a different sort order,
-# you can provide another flavor attribute as sorting key. Alternatively, you
-# can provide a custom callback method to use for sorting. You can also provide
-# a flag for reverse sort. For more info, see
-# http://docs.python.org/2/library/functions.html#sorted
-#CREATE_INSTANCE_FLAVOR_SORT = {
-#    'key': 'name',
-#     # or
-#    'key': my_awesome_callback_method,
-#    'reverse': False,
-#}
-
-# Set this to True to display an 'Admin Password' field on the Change Password
-# form to verify that it is indeed the admin logged-in who wants to change
-# the password.
-#ENFORCE_PASSWORD_CHECK = False
-
-# Modules that provide /auth routes that can be used to handle different types
-# of user authentication. Add auth plugins that require extra route handling to
-# this list.
-#AUTHENTICATION_URLS = [
-#    'openstack_auth.urls',
-#]
-
-# The Horizon Policy Enforcement engine uses these values to load per service
-# policy rule files. The content of these files should match the files the
-# OpenStack services are using to determine role based access control in the
-# target installation.
-
-# Path to directory containing policy.json files
-#POLICY_FILES_PATH = os.path.join(ROOT_PATH, "conf")
-
-# Map of local copy of service policy files.
-# Please insure that your identity policy file matches the one being used on
-# your keystone servers. There is an alternate policy file that may be used
-# in the Keystone v3 multi-domain case, policy.v3cloudsample.json.
-# This file is not included in the Horizon repository by default but can be
-# found at
-# http://git.openstack.org/cgit/openstack/keystone/tree/etc/ \
-# policy.v3cloudsample.json
-# Having matching policy files on the Horizon and Keystone servers is essential
-# for normal operation. This holds true for all services and their policy files.
-#POLICY_FILES = {
-#    'identity': 'keystone_policy.json',
-#    'compute': 'nova_policy.json',
-#    'volume': 'cinder_policy.json',
-#    'image': 'glance_policy.json',
-#    'orchestration': 'heat_policy.json',
-#    'network': 'neutron_policy.json',
-#}
-
-# TODO: (david-lyle) remove when plugins support adding settings.
-# Note: Only used when trove-dashboard plugin is configured to be used by
-# Horizon.
-# Trove user and database extension support. By default support for
-# creating users and databases on database instances is turned on.
-# To disable these extensions set the permission here to something
-# unusable such as ["!"].
-#TROVE_ADD_USER_PERMS = []
-#TROVE_ADD_DATABASE_PERMS = []
-
-# Change this patch to the appropriate list of tuples containing
-# a key, label and static directory containing two files:
-# _variables.scss and _styles.scss
-#AVAILABLE_THEMES = [
-#    ('default', 'Default', 'themes/default'),
-#    ('material', 'Material', 'themes/material'),
-#]
-
-LOGGING = {
-    'version': 1,
-    # When set to True this will disable all logging except
-    # for loggers specified in this configuration dictionary. Note that
-    # if nothing is specified here and disable_existing_loggers is True,
-    # django.db.backends will still log unless it is disabled explicitly.
-    'disable_existing_loggers': False,
-    # If apache2 mod_wsgi is used to deploy OpenStack dashboard
-    # timestamp is output by mod_wsgi. If WSGI framework you use does not
-    # output timestamp for logging, add %(asctime)s in the following
-    # format definitions.
-    'formatters': {
-        'console': {
-            'format': '%(levelname)s %(name)s %(message)s'
-        },
-        'operation': {
-            # The format of "%(message)s" is defined by
-            # OPERATION_LOG_OPTIONS['format']
-            'format': '%(message)s'
-        },
-    },
-    'handlers': {
-        'null': {
-            'level': 'DEBUG',
-            'class': 'logging.NullHandler',
-        },
-        'console': {
-            # Set the level to "DEBUG" for verbose output logging.
-            'level': 'INFO',
-            'class': 'logging.StreamHandler',
-            'formatter': 'console',
-        },
-        'operation': {
-            'level': 'INFO',
-            'class': 'logging.StreamHandler',
-            'formatter': 'operation',
-        },
-    },
-    'loggers': {
-        # Logging from django.db.backends is VERY verbose, send to null
-        # by default.
-        'django.db.backends': {
-            'handlers': ['null'],
-            'propagate': False,
-        },
-        'requests': {
-            'handlers': ['null'],
-            'propagate': False,
-        },
-        'horizon': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'horizon.operation_log': {
-            'handlers': ['operation'],
-            'level': 'INFO',
-            'propagate': False,
-        },
-        'openstack_dashboard': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'novaclient': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'cinderclient': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'keystoneclient': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'glanceclient': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'neutronclient': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'heatclient': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'swiftclient': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'openstack_auth': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'nose.plugins.manager': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'django': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'iso8601': {
-            'handlers': ['null'],
-            'propagate': False,
-        },
-        'scss': {
-            'handlers': ['null'],
-            'propagate': False,
-        },
-    },
+    "image_type": _("Image Type")
+}
+
+HORIZON_IMAGES_UPLOAD_MODE = "legacy"
+IMAGES_ALLOW_LOCATION = True
+
+
+# Disable simplified floating IP address management for deployments with
+# multiple floating IP pools or complex network requirements.
+# HORIZON_CONFIG["simple_ip_management"] = False
+
+# The OPENSTACK_NEUTRON_NETWORK settings can be used to enable optional
+# services provided by neutron. Options currently available are load
+# balancer service, security groups, quotas, VPN service.
+
+OPENSTACK_NEUTRON_NETWORK = {
+    'enable_lb': True,
+    'enable_firewall': False,
+    'enable_quotas': True,
+    'enable_security_group': True,
+    'enable_vpn': False,
+    # The profile_support option is used to detect if an external router can be
+    # configured via the dashboard. When using specific plugins the
+    # profile_support can be turned on if needed.
+    'profile_support': None,
+    'enable_fip_topology_check': True,
+
+    #'profile_support': 'cisco',
 }
 
 # 'direction' should not be specified for all_tcp/udp/icmp.
 # It is specified in the form.
 SECURITY_GROUP_RULES = {
     'all_tcp': {
-        'name': _('All TCP'),
+        'name': 'ALL TCP',
         'ip_protocol': 'tcp',
         'from_port': '1',
         'to_port': '65535',
     },
     'all_udp': {
-        'name': _('All UDP'),
+        'name': 'ALL UDP',
         'ip_protocol': 'udp',
         'from_port': '1',
         'to_port': '65535',
     },
     'all_icmp': {
-        'name': _('All ICMP'),
+        'name': 'ALL ICMP',
         'ip_protocol': 'icmp',
         'from_port': '-1',
         'to_port': '-1',
@@ -755,160 +520,12 @@
     },
 }
 
-# Deprecation Notice:
-#
-# The setting FLAVOR_EXTRA_KEYS has been deprecated.
-# Please load extra spec metadata into the Glance Metadata Definition Catalog.
-#
-# The sample quota definitions can be found in:
-# <glance_source>/etc/metadefs/compute-quota.json
-#
-# The metadata definition catalog supports CLI and API:
-#  $glance --os-image-api-version 2 help md-namespace-import
-#  $glance-manage db_load_metadefs <directory_with_definition_files>
-#
-# See Metadata Definitions on: http://docs.openstack.org/developer/glance/
-
-# TODO: (david-lyle) remove when plugins support settings natively
-# Note: This is only used when the Sahara plugin is configured and enabled
-# for use in Horizon.
-# Indicate to the Sahara data processing service whether or not
-# automatic floating IP allocation is in effect.  If it is not
-# in effect, the user will be prompted to choose a floating IP
-# pool for use in their cluster.  False by default.  You would want
-# to set this to True if you were running Nova Networking with
-# auto_assign_floating_ip = True.
-#SAHARA_AUTO_IP_ALLOCATION_ENABLED = False
-
-# The hash algorithm to use for authentication tokens. This must
-# match the hash algorithm that the identity server and the
-# auth_token middleware are using. Allowed values are the
-# algorithms supported by Python's hashlib library.
-#OPENSTACK_TOKEN_HASH_ALGORITHM = 'md5'
-
-# AngularJS requires some settings to be made available to
-# the client side. Some settings are required by in-tree / built-in horizon
-# features. These settings must be added to REST_API_REQUIRED_SETTINGS in the
-# form of ['SETTING_1','SETTING_2'], etc.
-#
-# You may remove settings from this list for security purposes, but do so at
-# the risk of breaking a built-in horizon feature. These settings are required
-# for horizon to function properly. Only remove them if you know what you
-# are doing. These settings may in the future be moved to be defined within
-# the enabled panel configuration.
-# You should not add settings to this list for out of tree extensions.
-# See: https://wiki.openstack.org/wiki/Horizon/RESTAPI
-REST_API_REQUIRED_SETTINGS = ['OPENSTACK_HYPERVISOR_FEATURES',
-                              'LAUNCH_INSTANCE_DEFAULTS',
-                              'OPENSTACK_IMAGE_FORMATS',
-                              'OPENSTACK_KEYSTONE_DEFAULT_DOMAIN',
-                              'CREATE_IMAGE_DEFAULTS']
-
-# Additional settings can be made available to the client side for
-# extensibility by specifying them in REST_API_ADDITIONAL_SETTINGS
-# !! Please use extreme caution as the settings are transferred via HTTP/S
-# and are not encrypted on the browser. This is an experimental API and
-# may be deprecated in the future without notice.
-#REST_API_ADDITIONAL_SETTINGS = []
-
-###############################################################################
-# Ubuntu Settings
-###############################################################################
-
- # The default theme if no cookie is present
-DEFAULT_THEME = 'ubuntu'
-
-# Default Ubuntu apache configuration uses /horizon as the application root.
-WEBROOT='/horizon/'
-
-# By default, validation of the HTTP Host header is disabled.  Production
-# installations should have this set accordingly.  For more information
-# see https://docs.djangoproject.com/en/dev/ref/settings/.
-ALLOWED_HOSTS = '*'
-
-# Compress all assets offline as part of packaging installation
-COMPRESS_OFFLINE = True
-
-# DISALLOW_IFRAME_EMBED can be used to prevent Horizon from being embedded
-# within an iframe. Legacy browsers are still vulnerable to a Cross-Frame
-# Scripting (XFS) vulnerability, so this option allows extra security hardening
-# where iframes are not used in deployment. Default setting is True.
-# For more information see:
-# http://tinyurl.com/anticlickjack
-#DISALLOW_IFRAME_EMBED = True
-
-# Help URL can be made available for the client. To provide a help URL, edit the
-# following attribute to the URL of your choice.
-#HORIZON_CONFIG["help_url"] = "http://openstack.mycompany.org"
-
-# Settings for OperationLogMiddleware
-# OPERATION_LOG_ENABLED is flag to use the function to log an operation on
-# Horizon.
-# mask_fields is a list of field names to mask in the logged parameters.
-# target_methods is a list of HTTP methods for which operations are logged.
-# format defines the contents of the log message.
-#OPERATION_LOG_ENABLED = False
-#OPERATION_LOG_OPTIONS = {
-#    'mask_fields': ['password'],
-#    'target_methods': ['POST'],
-#    'ignored_urls': ['/js/', '/static/', '^/api/'],
-#    'format': ("[%(client_ip)s] [%(domain_name)s]"
-#        " [%(domain_id)s] [%(project_name)s]"
-#        " [%(project_id)s] [%(user_name)s] [%(user_id)s] [%(request_scheme)s]"
-#        " [%(referer_url)s] [%(request_url)s] [%(message)s] [%(method)s]"
-#        " [%(http_status)s] [%(param)s]"),
-#}
-
-# The default date range in the Overview panel meters - either <today> minus N
-# days (if the value is integer N), or from the beginning of the current month
-# until today (if set to None). This setting should be used to limit the amount
-# of data fetched by default when rendering the Overview panel.
-#OVERVIEW_DAYS_RANGE = 1
-
-# To allow operators to require users provide a search criteria first
-# before loading any data into the views, set the following dict
-# attributes to True in each one of the panels you want to enable this feature.
-# Follow the convention <dashboard>.<view>
-#FILTER_DATA_FIRST = {
-#    'admin.instances': False,
-#    'admin.images': False,
-#    'admin.networks': False,
-#    'admin.routers': False,
-#    'admin.volumes': False,
-#    'identity.users': False,
-#    'identity.projects': False,
-#    'identity.groups': False,
-#    'identity.roles': False
-#}
-
-# Dict used to restrict user private subnet cidr range.
-# An empty list means that user input will not be restricted
-# for a corresponding IP version. By default, there is
-# no restriction for IPv4 or IPv6. To restrict
-# user private subnet cidr range set ALLOWED_PRIVATE_SUBNET_CIDR
-# to something like
-#ALLOWED_PRIVATE_SUBNET_CIDR = {
-#    'ipv4': ['10.0.0.0/8', '192.168.0.0/16'],
-#    'ipv6': ['fc00::/7']
-#}
-ALLOWED_PRIVATE_SUBNET_CIDR = {'ipv4': [], 'ipv6': []}
-
-# Projects and users can have extra attributes as defined by keystone v3.
-# Horizon has the ability to display these extra attributes via this setting.
-# If you'd like to display extra data in the project or user tables, set the
-# corresponding dict key to the attribute name, followed by the display name.
-# For more information, see horizon's customization (http://docs.openstack.org/developer/horizon/topics/customizing.html#horizon-customization-module-overrides)
-#PROJECT_TABLE_EXTRA_INFO = {
-#   'phone_num': _('Phone Number'),
-#}
-#USER_TABLE_EXTRA_INFO = {
-#   'phone_num': _('Phone Number'),
-#}
-
-# Password will have an expiration date when using keystone v3 and enabling the
-# feature.
-# This setting allows you to set the number of days that the user will be alerted
-# prior to the password expiration.
-# Once the password expires keystone will deny the access and users must
-# contact an admin to change their password.
-#PASSWORD_EXPIRES_WARNING_THRESHOLD_DAYS = 0
+
+
+
+
+USE_SSL = True
+CSRF_COOKIE_SECURE = True
+SESSION_COOKIE_HTTPONLY = True

2018-02-06 10:39:28,834 [salt.state       ][INFO    ][11352] Loading fresh modules for state activity
2018-02-06 10:39:28,878 [salt.state       ][INFO    ][11352] Completed state [/etc/openstack-dashboard/local_settings.py] at time 10:39:28.878027 duration_in_ms=476.302
2018-02-06 10:39:28,884 [salt.state       ][INFO    ][11352] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/nova_policy.json] at time 10:39:28.884887
2018-02-06 10:39:28,885 [salt.state       ][INFO    ][11352] Executing state file.managed for /usr/share/openstack-dashboard/openstack_dashboard/conf/nova_policy.json
2018-02-06 10:39:28,926 [salt.fileclient  ][INFO    ][11352] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/pike/nova_policy.json'
2018-02-06 10:39:28,933 [salt.state       ][INFO    ][11352] File changed:
--- 
+++ 
@@ -2,175 +2,436 @@
     "context_is_admin":  "role:admin",
     "admin_or_owner":  "is_admin:True or project_id:%(project_id)s",
     "default": "rule:admin_or_owner",
+
+    "cells_scheduler_filter:TargetCellFilter": "is_admin:True",
+
+    "compute:create": "rule:admin_or_owner",
+    "compute:create:attach_network": "rule:admin_or_owner",
+    "compute:create:attach_volume": "rule:admin_or_owner",
+    "compute:create:forced_host": "is_admin:True",
+
+    "compute:get": "rule:admin_or_owner",
+    "compute:get_all": "rule:admin_or_owner",
+    "compute:get_all_tenants": "is_admin:True",
+
+    "compute:update": "rule:admin_or_owner",
+
+    "compute:get_instance_metadata": "rule:admin_or_owner",
+    "compute:get_all_instance_metadata": "rule:admin_or_owner",
+    "compute:get_all_instance_system_metadata": "rule:admin_or_owner",
+    "compute:update_instance_metadata": "rule:admin_or_owner",
+    "compute:delete_instance_metadata": "rule:admin_or_owner",
+
+    "compute:get_diagnostics": "rule:admin_or_owner",
+    "compute:get_instance_diagnostics": "rule:admin_or_owner",
+
+    "compute:start": "rule:admin_or_owner",
+    "compute:stop": "rule:admin_or_owner",
+
+    "compute:lock": "rule:admin_or_owner",
+    "compute:unlock": "rule:admin_or_owner",
+    "compute:unlock_override": "rule:admin_api",
+
+    "compute:get_vnc_console": "rule:admin_or_owner",
+    "compute:get_spice_console": "rule:admin_or_owner",
+    "compute:get_rdp_console": "rule:admin_or_owner",
+    "compute:get_serial_console": "rule:admin_or_owner",
+    "compute:get_mks_console": "rule:admin_or_owner",
+    "compute:get_console_output": "rule:admin_or_owner",
+
+    "compute:reset_network": "rule:admin_or_owner",
+    "compute:inject_network_info": "rule:admin_or_owner",
+    "compute:add_fixed_ip": "rule:admin_or_owner",
+    "compute:remove_fixed_ip": "rule:admin_or_owner",
+
+    "compute:attach_volume": "rule:admin_or_owner",
+    "compute:detach_volume": "rule:admin_or_owner",
+    "compute:swap_volume": "rule:admin_api",
+
+    "compute:attach_interface": "rule:admin_or_owner",
+    "compute:detach_interface": "rule:admin_or_owner",
+
+    "compute:set_admin_password": "rule:admin_or_owner",
+
+    "compute:rescue": "rule:admin_or_owner",
+    "compute:unrescue": "rule:admin_or_owner",
+
+    "compute:suspend": "rule:admin_or_owner",
+    "compute:resume": "rule:admin_or_owner",
+
+    "compute:pause": "rule:admin_or_owner",
+    "compute:unpause": "rule:admin_or_owner",
+
+    "compute:shelve": "rule:admin_or_owner",
+    "compute:shelve_offload": "rule:admin_or_owner",
+    "compute:unshelve": "rule:admin_or_owner",
+
+    "compute:snapshot": "rule:admin_or_owner",
+    "compute:snapshot_volume_backed": "rule:admin_or_owner",
+    "compute:backup": "rule:admin_or_owner",
+
+    "compute:resize": "rule:admin_or_owner",
+    "compute:confirm_resize": "rule:admin_or_owner",
+    "compute:revert_resize": "rule:admin_or_owner",
+
+    "compute:rebuild": "rule:admin_or_owner",
+    "compute:reboot": "rule:admin_or_owner",
+    "compute:delete": "rule:admin_or_owner",
+    "compute:soft_delete": "rule:admin_or_owner",
+    "compute:force_delete": "rule:admin_or_owner",
+
+    "compute:security_groups:add_to_instance": "rule:admin_or_owner",
+    "compute:security_groups:remove_from_instance": "rule:admin_or_owner",
+
+    "compute:restore": "rule:admin_or_owner",
+
+    "compute:volume_snapshot_create": "rule:admin_or_owner",
+    "compute:volume_snapshot_delete": "rule:admin_or_owner",
+
     "admin_api": "is_admin:True",
-
+    "compute_extension:accounts": "rule:admin_api",
+    "compute_extension:admin_actions": "rule:admin_api",
+    "compute_extension:admin_actions:pause": "rule:admin_or_owner",
+    "compute_extension:admin_actions:unpause": "rule:admin_or_owner",
+    "compute_extension:admin_actions:suspend": "rule:admin_or_owner",
+    "compute_extension:admin_actions:resume": "rule:admin_or_owner",
+    "compute_extension:admin_actions:lock": "rule:admin_or_owner",
+    "compute_extension:admin_actions:unlock": "rule:admin_or_owner",
+    "compute_extension:admin_actions:resetNetwork": "rule:admin_api",
+    "compute_extension:admin_actions:injectNetworkInfo": "rule:admin_api",
+    "compute_extension:admin_actions:createBackup": "rule:admin_or_owner",
+    "compute_extension:admin_actions:migrateLive": "rule:admin_api",
+    "compute_extension:admin_actions:resetState": "rule:admin_api",
+    "compute_extension:admin_actions:migrate": "rule:admin_api",
+    "compute_extension:aggregates": "rule:admin_api",
+    "compute_extension:agents": "rule:admin_api",
+    "compute_extension:attach_interfaces": "rule:admin_or_owner",
+    "compute_extension:baremetal_nodes": "rule:admin_api",
+    "compute_extension:cells": "rule:admin_api",
+    "compute_extension:cells:create": "rule:admin_api",
+    "compute_extension:cells:delete": "rule:admin_api",
+    "compute_extension:cells:update": "rule:admin_api",
+    "compute_extension:cells:sync_instances": "rule:admin_api",
+    "compute_extension:certificates": "rule:admin_or_owner",
+    "compute_extension:cloudpipe": "rule:admin_api",
+    "compute_extension:cloudpipe_update": "rule:admin_api",
+    "compute_extension:config_drive": "rule:admin_or_owner",
+    "compute_extension:console_output": "rule:admin_or_owner",
+    "compute_extension:consoles": "rule:admin_or_owner",
+    "compute_extension:createserverext": "rule:admin_or_owner",
+    "compute_extension:deferred_delete": "rule:admin_or_owner",
+    "compute_extension:disk_config": "rule:admin_or_owner",
+    "compute_extension:evacuate": "rule:admin_api",
+    "compute_extension:extended_server_attributes": "rule:admin_api",
+    "compute_extension:extended_status": "rule:admin_or_owner",
+    "compute_extension:extended_availability_zone": "rule:admin_or_owner",
+    "compute_extension:extended_ips": "rule:admin_or_owner",
+    "compute_extension:extended_ips_mac": "rule:admin_or_owner",
+    "compute_extension:extended_vif_net": "rule:admin_or_owner",
+    "compute_extension:extended_volumes": "rule:admin_or_owner",
+    "compute_extension:fixed_ips": "rule:admin_api",
+    "compute_extension:flavor_access": "rule:admin_or_owner",
+    "compute_extension:flavor_access:addTenantAccess": "rule:admin_api",
+    "compute_extension:flavor_access:removeTenantAccess": "rule:admin_api",
+    "compute_extension:flavor_disabled": "rule:admin_or_owner",
+    "compute_extension:flavor_rxtx": "rule:admin_or_owner",
+    "compute_extension:flavor_swap": "rule:admin_or_owner",
+    "compute_extension:flavorextradata": "rule:admin_or_owner",
+    "compute_extension:flavorextraspecs:index": "rule:admin_or_owner",
+    "compute_extension:flavorextraspecs:show": "rule:admin_or_owner",
+    "compute_extension:flavorextraspecs:create": "rule:admin_api",
+    "compute_extension:flavorextraspecs:update": "rule:admin_api",
+    "compute_extension:flavorextraspecs:delete": "rule:admin_api",
+    "compute_extension:flavormanage": "rule:admin_api",
+    "compute_extension:floating_ip_dns": "rule:admin_or_owner",
+    "compute_extension:floating_ip_pools": "rule:admin_or_owner",
+    "compute_extension:floating_ips": "rule:admin_or_owner",
+    "compute_extension:floating_ips_bulk": "rule:admin_api",
+    "compute_extension:fping": "rule:admin_or_owner",
+    "compute_extension:fping:all_tenants": "rule:admin_api",
+    "compute_extension:hide_server_addresses": "is_admin:False",
+    "compute_extension:hosts": "rule:admin_api",
+    "compute_extension:hypervisors": "rule:admin_api",
+    "compute_extension:image_size": "rule:admin_or_owner",
+    "compute_extension:instance_actions": "rule:admin_or_owner",
+    "compute_extension:instance_actions:events": "rule:admin_api",
+    "compute_extension:instance_usage_audit_log": "rule:admin_api",
+    "compute_extension:keypairs": "rule:admin_or_owner",
+    "compute_extension:keypairs:index": "rule:admin_or_owner",
+    "compute_extension:keypairs:show": "rule:admin_or_owner",
+    "compute_extension:keypairs:create": "rule:admin_or_owner",
+    "compute_extension:keypairs:delete": "rule:admin_or_owner",
+    "compute_extension:multinic": "rule:admin_or_owner",
+    "compute_extension:networks": "rule:admin_api",
+    "compute_extension:networks:view": "rule:admin_or_owner",
+    "compute_extension:networks_associate": "rule:admin_api",
+    "compute_extension:os-tenant-networks": "rule:admin_or_owner",
+    "compute_extension:quotas:show": "rule:admin_or_owner",
+    "compute_extension:quotas:update": "rule:admin_api",
+    "compute_extension:quotas:delete": "rule:admin_api",
+    "compute_extension:quota_classes": "rule:admin_or_owner",
+    "compute_extension:rescue": "rule:admin_or_owner",
+    "compute_extension:security_group_default_rules": "rule:admin_api",
+    "compute_extension:security_groups": "rule:admin_or_owner",
+    "compute_extension:server_diagnostics": "rule:admin_api",
+    "compute_extension:server_groups": "rule:admin_or_owner",
+    "compute_extension:server_password": "rule:admin_or_owner",
+    "compute_extension:server_usage": "rule:admin_or_owner",
+    "compute_extension:services": "rule:admin_api",
+    "compute_extension:shelve": "rule:admin_or_owner",
+    "compute_extension:shelveOffload": "rule:admin_api",
+    "compute_extension:simple_tenant_usage:show": "rule:admin_or_owner",
+    "compute_extension:simple_tenant_usage:list": "rule:admin_api",
+    "compute_extension:unshelve": "rule:admin_or_owner",
+    "compute_extension:users": "rule:admin_api",
+    "compute_extension:virtual_interfaces": "rule:admin_or_owner",
+    "compute_extension:virtual_storage_arrays": "rule:admin_or_owner",
+    "compute_extension:volumes": "rule:admin_or_owner",
+    "compute_extension:volume_attachments:index": "rule:admin_or_owner",
+    "compute_extension:volume_attachments:show": "rule:admin_or_owner",
+    "compute_extension:volume_attachments:create": "rule:admin_or_owner",
+    "compute_extension:volume_attachments:update": "rule:admin_api",
+    "compute_extension:volume_attachments:delete": "rule:admin_or_owner",
+    "compute_extension:volumetypes": "rule:admin_or_owner",
+    "compute_extension:availability_zone:list": "rule:admin_or_owner",
+    "compute_extension:availability_zone:detail": "rule:admin_api",
+    "compute_extension:used_limits_for_admin": "rule:admin_api",
+    "compute_extension:migrations:index": "rule:admin_api",
+    "compute_extension:os-assisted-volume-snapshots:create": "rule:admin_api",
+    "compute_extension:os-assisted-volume-snapshots:delete": "rule:admin_api",
+    "compute_extension:console_auth_tokens": "rule:admin_api",
+    "compute_extension:os-server-external-events:create": "rule:admin_api",
+
+    "network:get_all": "rule:admin_or_owner",
+    "network:get": "rule:admin_or_owner",
+    "network:create": "rule:admin_or_owner",
+    "network:delete": "rule:admin_or_owner",
+    "network:associate": "rule:admin_or_owner",
+    "network:disassociate": "rule:admin_or_owner",
+    "network:get_vifs_by_instance": "rule:admin_or_owner",
+    "network:allocate_for_instance": "rule:admin_or_owner",
+    "network:deallocate_for_instance": "rule:admin_or_owner",
+    "network:validate_networks": "rule:admin_or_owner",
+    "network:get_instance_uuids_by_ip_filter": "rule:admin_or_owner",
+    "network:get_instance_id_by_floating_address": "rule:admin_or_owner",
+    "network:setup_networks_on_host": "rule:admin_or_owner",
+    "network:get_backdoor_port": "rule:admin_or_owner",
+
+    "network:get_floating_ip": "rule:admin_or_owner",
+    "network:get_floating_ip_pools": "rule:admin_or_owner",
+    "network:get_floating_ip_by_address": "rule:admin_or_owner",
+    "network:get_floating_ips_by_project": "rule:admin_or_owner",
+    "network:get_floating_ips_by_fixed_address": "rule:admin_or_owner",
+    "network:allocate_floating_ip": "rule:admin_or_owner",
+    "network:associate_floating_ip": "rule:admin_or_owner",
+    "network:disassociate_floating_ip": "rule:admin_or_owner",
+    "network:release_floating_ip": "rule:admin_or_owner",
+    "network:migrate_instance_start": "rule:admin_or_owner",
+    "network:migrate_instance_finish": "rule:admin_or_owner",
+
+    "network:get_fixed_ip": "rule:admin_or_owner",
+    "network:get_fixed_ip_by_address": "rule:admin_or_owner",
+    "network:add_fixed_ip_to_instance": "rule:admin_or_owner",
+    "network:remove_fixed_ip_from_instance": "rule:admin_or_owner",
+    "network:add_network_to_project": "rule:admin_or_owner",
+    "network:get_instance_nw_info": "rule:admin_or_owner",
+
+    "network:get_dns_domains": "rule:admin_or_owner",
+    "network:add_dns_entry": "rule:admin_or_owner",
+    "network:modify_dns_entry": "rule:admin_or_owner",
+    "network:delete_dns_entry": "rule:admin_or_owner",
+    "network:get_dns_entries_by_address": "rule:admin_or_owner",
+    "network:get_dns_entries_by_name": "rule:admin_or_owner",
+    "network:create_private_dns_domain": "rule:admin_or_owner",
+    "network:create_public_dns_domain": "rule:admin_or_owner",
+    "network:delete_dns_domain": "rule:admin_or_owner",
+    "network:attach_external_network": "rule:admin_api",
+    "network:get_vif_by_mac_address": "rule:admin_or_owner",
+
+    "os_compute_api:servers:detail:get_all_tenants": "is_admin:True",
+    "os_compute_api:servers:index:get_all_tenants": "is_admin:True",
+    "os_compute_api:servers:confirm_resize": "rule:admin_or_owner",
+    "os_compute_api:servers:create": "rule:admin_or_owner",
+    "os_compute_api:servers:create:attach_network": "rule:admin_or_owner",
+    "os_compute_api:servers:create:attach_volume": "rule:admin_or_owner",
+    "os_compute_api:servers:create:forced_host": "rule:admin_api",
+    "os_compute_api:servers:delete": "rule:admin_or_owner",
+    "os_compute_api:servers:update": "rule:admin_or_owner",
+    "os_compute_api:servers:detail": "rule:admin_or_owner",
+    "os_compute_api:servers:index": "rule:admin_or_owner",
+    "os_compute_api:servers:reboot": "rule:admin_or_owner",
+    "os_compute_api:servers:rebuild": "rule:admin_or_owner",
+    "os_compute_api:servers:resize": "rule:admin_or_owner",
+    "os_compute_api:servers:revert_resize": "rule:admin_or_owner",
+    "os_compute_api:servers:show": "rule:admin_or_owner",
+    "os_compute_api:servers:show:host_status": "rule:admin_api",
+    "os_compute_api:servers:create_image": "rule:admin_or_owner",
+    "os_compute_api:servers:create_image:allow_volume_backed": "rule:admin_or_owner",
+    "os_compute_api:servers:start": "rule:admin_or_owner",
+    "os_compute_api:servers:stop": "rule:admin_or_owner",
+    "os_compute_api:servers:trigger_crash_dump": "rule:admin_or_owner",
+    "os_compute_api:servers:migrations:force_complete": "rule:admin_api",
+    "os_compute_api:servers:migrations:delete": "rule:admin_api",
+    "os_compute_api:servers:discoverable": "@",
+    "os_compute_api:servers:migrations:index": "rule:admin_api",
+    "os_compute_api:servers:migrations:show": "rule:admin_api",
+    "os_compute_api:os-access-ips:discoverable": "@",
+    "os_compute_api:os-access-ips": "rule:admin_or_owner",
+    "os_compute_api:os-admin-actions": "rule:admin_api",
     "os_compute_api:os-admin-actions:discoverable": "@",
+    "os_compute_api:os-admin-actions:reset_network": "rule:admin_api",
+    "os_compute_api:os-admin-actions:inject_network_info": "rule:admin_api",
     "os_compute_api:os-admin-actions:reset_state": "rule:admin_api",
-    "os_compute_api:os-admin-actions:inject_network_info": "rule:admin_api",
-    "os_compute_api:os-admin-actions": "rule:admin_api",
-    "os_compute_api:os-admin-actions:reset_network": "rule:admin_api",
+    "os_compute_api:os-admin-password": "rule:admin_or_owner",
     "os_compute_api:os-admin-password:discoverable": "@",
-    "os_compute_api:os-admin-password": "rule:admin_or_owner",
+    "os_compute_api:os-aggregates:discoverable": "@",
+    "os_compute_api:os-aggregates:index": "rule:admin_api",
+    "os_compute_api:os-aggregates:create": "rule:admin_api",
+    "os_compute_api:os-aggregates:show": "rule:admin_api",
+    "os_compute_api:os-aggregates:update": "rule:admin_api",
+    "os_compute_api:os-aggregates:delete": "rule:admin_api",
+    "os_compute_api:os-aggregates:add_host": "rule:admin_api",
+    "os_compute_api:os-aggregates:remove_host": "rule:admin_api",
+    "os_compute_api:os-aggregates:set_metadata": "rule:admin_api",
     "os_compute_api:os-agents": "rule:admin_api",
     "os_compute_api:os-agents:discoverable": "@",
-    "os_compute_api:os-aggregates:set_metadata": "rule:admin_api",
-    "os_compute_api:os-aggregates:add_host": "rule:admin_api",
-    "os_compute_api:os-aggregates:discoverable": "@",
-    "os_compute_api:os-aggregates:create": "rule:admin_api",
-    "os_compute_api:os-aggregates:remove_host": "rule:admin_api",
-    "os_compute_api:os-aggregates:update": "rule:admin_api",
-    "os_compute_api:os-aggregates:index": "rule:admin_api",
-    "os_compute_api:os-aggregates:delete": "rule:admin_api",
-    "os_compute_api:os-aggregates:show": "rule:admin_api",
-    "os_compute_api:os-assisted-volume-snapshots:create": "rule:admin_api",
-    "os_compute_api:os-assisted-volume-snapshots:delete": "rule:admin_api",
-    "os_compute_api:os-assisted-volume-snapshots:discoverable": "@",
     "os_compute_api:os-attach-interfaces": "rule:admin_or_owner",
     "os_compute_api:os-attach-interfaces:discoverable": "@",
-    "os_compute_api:os-availability-zone:list": "rule:admin_or_owner",
-    "os_compute_api:os-availability-zone:discoverable": "@",
-    "os_compute_api:os-availability-zone:detail": "rule:admin_api",
+    "os_compute_api:os-baremetal-nodes": "rule:admin_api",
     "os_compute_api:os-baremetal-nodes:discoverable": "@",
-    "os_compute_api:os-baremetal-nodes": "rule:admin_api",
-    "network:attach_external_network": "is_admin:True",
-    "os_compute_api:os-block-device-mapping:discoverable": "@",
     "os_compute_api:os-block-device-mapping-v1:discoverable": "@",
+    "os_compute_api:os-cells": "rule:admin_api",
+    "os_compute_api:os-cells:create": "rule:admin_api",
+    "os_compute_api:os-cells:delete": "rule:admin_api",
+    "os_compute_api:os-cells:update": "rule:admin_api",
+    "os_compute_api:os-cells:sync_instances": "rule:admin_api",
     "os_compute_api:os-cells:discoverable": "@",
-    "os_compute_api:os-cells:update": "rule:admin_api",
-    "os_compute_api:os-cells:create": "rule:admin_api",
-    "os_compute_api:os-cells": "rule:admin_api",
-    "os_compute_api:os-cells:sync_instances": "rule:admin_api",
-    "os_compute_api:os-cells:delete": "rule:admin_api",
-    "cells_scheduler_filter:DifferentCellFilter": "is_admin:True",
-    "cells_scheduler_filter:TargetCellFilter": "is_admin:True",
-    "os_compute_api:os-certificates:discoverable": "@",
     "os_compute_api:os-certificates:create": "rule:admin_or_owner",
     "os_compute_api:os-certificates:show": "rule:admin_or_owner",
+    "os_compute_api:os-certificates:discoverable": "@",
     "os_compute_api:os-cloudpipe": "rule:admin_api",
     "os_compute_api:os-cloudpipe:discoverable": "@",
+    "os_compute_api:os-config-drive": "rule:admin_or_owner",
     "os_compute_api:os-config-drive:discoverable": "@",
-    "os_compute_api:os-config-drive": "rule:admin_or_owner",
-    "os_compute_api:os-console-auth-tokens:discoverable": "@",
-    "os_compute_api:os-console-auth-tokens": "rule:admin_api",
+    "os_compute_api:os-consoles:discoverable": "@",
+    "os_compute_api:os-consoles:create": "rule:admin_or_owner",
+    "os_compute_api:os-consoles:delete": "rule:admin_or_owner",
+    "os_compute_api:os-consoles:index": "rule:admin_or_owner",
+    "os_compute_api:os-consoles:show": "rule:admin_or_owner",
     "os_compute_api:os-console-output:discoverable": "@",
     "os_compute_api:os-console-output": "rule:admin_or_owner",
-    "os_compute_api:os-consoles:create": "rule:admin_or_owner",
-    "os_compute_api:os-consoles:show": "rule:admin_or_owner",
-    "os_compute_api:os-consoles:delete": "rule:admin_or_owner",
-    "os_compute_api:os-consoles:discoverable": "@",
-    "os_compute_api:os-consoles:index": "rule:admin_or_owner",
+    "os_compute_api:os-remote-consoles": "rule:admin_or_owner",
+    "os_compute_api:os-remote-consoles:discoverable": "@",
     "os_compute_api:os-create-backup:discoverable": "@",
     "os_compute_api:os-create-backup": "rule:admin_or_owner",
+    "os_compute_api:os-deferred-delete": "rule:admin_or_owner",
     "os_compute_api:os-deferred-delete:discoverable": "@",
-    "os_compute_api:os-deferred-delete": "rule:admin_or_owner",
+    "os_compute_api:os-disk-config": "rule:admin_or_owner",
+    "os_compute_api:os-disk-config:discoverable": "@",
+    "os_compute_api:os-evacuate": "rule:admin_api",
     "os_compute_api:os-evacuate:discoverable": "@",
-    "os_compute_api:os-evacuate": "rule:admin_api",
+    "os_compute_api:os-extended-server-attributes": "rule:admin_api",
+    "os_compute_api:os-extended-server-attributes:discoverable": "@",
+    "os_compute_api:os-extended-status": "rule:admin_or_owner",
+    "os_compute_api:os-extended-status:discoverable": "@",
     "os_compute_api:os-extended-availability-zone": "rule:admin_or_owner",
     "os_compute_api:os-extended-availability-zone:discoverable": "@",
-    "os_compute_api:os-extended-server-attributes": "rule:admin_api",
-    "os_compute_api:os-extended-server-attributes:discoverable": "@",
-    "os_compute_api:os-extended-status:discoverable": "@",
-    "os_compute_api:os-extended-status": "rule:admin_or_owner",
+    "os_compute_api:extensions": "rule:admin_or_owner",
+    "os_compute_api:extensions:discoverable": "@",
+    "os_compute_api:extension_info:discoverable": "@",
     "os_compute_api:os-extended-volumes": "rule:admin_or_owner",
     "os_compute_api:os-extended-volumes:discoverable": "@",
-    "os_compute_api:extension_info:discoverable": "@",
-    "os_compute_api:extensions": "rule:admin_or_owner",
-    "os_compute_api:extensions:discoverable": "@",
+    "os_compute_api:os-fixed-ips": "rule:admin_api",
     "os_compute_api:os-fixed-ips:discoverable": "@",
-    "os_compute_api:os-fixed-ips": "rule:admin_api",
-    "os_compute_api:os-flavor-access:add_tenant_access": "rule:admin_api",
+    "os_compute_api:os-flavor-access": "rule:admin_or_owner",
     "os_compute_api:os-flavor-access:discoverable": "@",
     "os_compute_api:os-flavor-access:remove_tenant_access": "rule:admin_api",
-    "os_compute_api:os-flavor-access": "rule:admin_or_owner",
+    "os_compute_api:os-flavor-access:add_tenant_access": "rule:admin_api",
+    "os_compute_api:os-flavor-rxtx": "rule:admin_or_owner",
+    "os_compute_api:os-flavor-rxtx:discoverable": "@",
+    "os_compute_api:flavors": "rule:admin_or_owner",
+    "os_compute_api:flavors:discoverable": "@",
+    "os_compute_api:os-flavor-extra-specs:discoverable": "@",
+    "os_compute_api:os-flavor-extra-specs:index": "rule:admin_or_owner",
     "os_compute_api:os-flavor-extra-specs:show": "rule:admin_or_owner",
     "os_compute_api:os-flavor-extra-specs:create": "rule:admin_api",
-    "os_compute_api:os-flavor-extra-specs:discoverable": "@",
     "os_compute_api:os-flavor-extra-specs:update": "rule:admin_api",
     "os_compute_api:os-flavor-extra-specs:delete": "rule:admin_api",
-    "os_compute_api:os-flavor-extra-specs:index": "rule:admin_or_owner",
+    "os_compute_api:os-flavor-manage:discoverable": "@",
     "os_compute_api:os-flavor-manage": "rule:admin_api",
-    "os_compute_api:os-flavor-manage:discoverable": "@",
-    "os_compute_api:os-flavor-rxtx": "rule:admin_or_owner",
-    "os_compute_api:os-flavor-rxtx:discoverable": "@",
-    "os_compute_api:flavors:discoverable": "@",
-    "os_compute_api:flavors": "rule:admin_or_owner",
     "os_compute_api:os-floating-ip-dns": "rule:admin_or_owner",
+    "os_compute_api:os-floating-ip-dns:discoverable": "@",
     "os_compute_api:os-floating-ip-dns:domain:update": "rule:admin_api",
-    "os_compute_api:os-floating-ip-dns:discoverable": "@",
     "os_compute_api:os-floating-ip-dns:domain:delete": "rule:admin_api",
+    "os_compute_api:os-floating-ip-pools": "rule:admin_or_owner",
     "os_compute_api:os-floating-ip-pools:discoverable": "@",
-    "os_compute_api:os-floating-ip-pools": "rule:admin_or_owner",
     "os_compute_api:os-floating-ips": "rule:admin_or_owner",
     "os_compute_api:os-floating-ips:discoverable": "@",
+    "os_compute_api:os-floating-ips-bulk": "rule:admin_api",
     "os_compute_api:os-floating-ips-bulk:discoverable": "@",
-    "os_compute_api:os-floating-ips-bulk": "rule:admin_api",
+    "os_compute_api:os-fping": "rule:admin_or_owner",
+    "os_compute_api:os-fping:discoverable": "@",
     "os_compute_api:os-fping:all_tenants": "rule:admin_api",
-    "os_compute_api:os-fping:discoverable": "@",
-    "os_compute_api:os-fping": "rule:admin_or_owner",
+    "os_compute_api:os-hide-server-addresses": "is_admin:False",
     "os_compute_api:os-hide-server-addresses:discoverable": "@",
-    "os_compute_api:os-hide-server-addresses": "is_admin:False",
+    "os_compute_api:os-hosts": "rule:admin_api",
     "os_compute_api:os-hosts:discoverable": "@",
-    "os_compute_api:os-hosts": "rule:admin_api",
+    "os_compute_api:os-hypervisors": "rule:admin_api",
     "os_compute_api:os-hypervisors:discoverable": "@",
-    "os_compute_api:os-hypervisors": "rule:admin_api",
-    "os_compute_api:image-metadata:discoverable": "@",
+    "os_compute_api:images:discoverable": "@",
+    "os_compute_api:image-size": "rule:admin_or_owner",
     "os_compute_api:image-size:discoverable": "@",
-    "os_compute_api:image-size": "rule:admin_or_owner",
-    "os_compute_api:images:discoverable": "@",
-    "os_compute_api:os-instance-actions:events": "rule:admin_api",
     "os_compute_api:os-instance-actions": "rule:admin_or_owner",
     "os_compute_api:os-instance-actions:discoverable": "@",
+    "os_compute_api:os-instance-actions:events": "rule:admin_api",
     "os_compute_api:os-instance-usage-audit-log": "rule:admin_api",
     "os_compute_api:os-instance-usage-audit-log:discoverable": "@",
     "os_compute_api:ips:discoverable": "@",
+    "os_compute_api:ips:index": "rule:admin_or_owner",
     "os_compute_api:ips:show": "rule:admin_or_owner",
-    "os_compute_api:ips:index": "rule:admin_or_owner",
     "os_compute_api:os-keypairs:discoverable": "@",
+    "os_compute_api:os-keypairs": "rule:admin_or_owner",
     "os_compute_api:os-keypairs:index": "rule:admin_api or user_id:%(user_id)s",
+    "os_compute_api:os-keypairs:show": "rule:admin_api or user_id:%(user_id)s",
     "os_compute_api:os-keypairs:create": "rule:admin_api or user_id:%(user_id)s",
     "os_compute_api:os-keypairs:delete": "rule:admin_api or user_id:%(user_id)s",
-    "os_compute_api:os-keypairs:show": "rule:admin_api or user_id:%(user_id)s",
-    "os_compute_api:os-keypairs": "rule:admin_or_owner",
     "os_compute_api:limits:discoverable": "@",
     "os_compute_api:limits": "rule:admin_or_owner",
     "os_compute_api:os-lock-server:discoverable": "@",
     "os_compute_api:os-lock-server:lock": "rule:admin_or_owner",
+    "os_compute_api:os-lock-server:unlock": "rule:admin_or_owner",
     "os_compute_api:os-lock-server:unlock:unlock_override": "rule:admin_api",
-    "os_compute_api:os-lock-server:unlock": "rule:admin_or_owner",
+    "os_compute_api:os-migrate-server:discoverable": "@",
     "os_compute_api:os-migrate-server:migrate": "rule:admin_api",
-    "os_compute_api:os-migrate-server:discoverable": "@",
     "os_compute_api:os-migrate-server:migrate_live": "rule:admin_api",
-    "os_compute_api:os-migrations:index": "rule:admin_api",
-    "os_compute_api:os-migrations:discoverable": "@",
     "os_compute_api:os-multinic": "rule:admin_or_owner",
     "os_compute_api:os-multinic:discoverable": "@",
-    "os_compute_api:os-multiple-create:discoverable": "@",
-    "os_compute_api:os-networks:discoverable": "@",
     "os_compute_api:os-networks": "rule:admin_api",
     "os_compute_api:os-networks:view": "rule:admin_or_owner",
+    "os_compute_api:os-networks:discoverable": "@",
     "os_compute_api:os-networks-associate": "rule:admin_api",
     "os_compute_api:os-networks-associate:discoverable": "@",
-    "os_compute_api:os-pause-server:unpause": "rule:admin_or_owner",
     "os_compute_api:os-pause-server:discoverable": "@",
     "os_compute_api:os-pause-server:pause": "rule:admin_or_owner",
+    "os_compute_api:os-pause-server:unpause": "rule:admin_or_owner",
+    "os_compute_api:os-pci:pci_servers": "rule:admin_or_owner",
+    "os_compute_api:os-pci:discoverable": "@",
     "os_compute_api:os-pci:index": "rule:admin_api",
     "os_compute_api:os-pci:detail": "rule:admin_api",
-    "os_compute_api:os-pci:pci_servers": "rule:admin_or_owner",
     "os_compute_api:os-pci:show": "rule:admin_api",
-    "os_compute_api:os-pci:discoverable": "@",
+    "os_compute_api:os-personality:discoverable": "@",
+    "os_compute_api:os-preserve-ephemeral-rebuild:discoverable": "@",
+    "os_compute_api:os-quota-sets:discoverable": "@",
+    "os_compute_api:os-quota-sets:show": "rule:admin_or_owner",
+    "os_compute_api:os-quota-sets:defaults": "@",
+    "os_compute_api:os-quota-sets:update": "rule:admin_api",
+    "os_compute_api:os-quota-sets:delete": "rule:admin_api",
+    "os_compute_api:os-quota-sets:detail": "rule:admin_api",
+    "os_compute_api:os-quota-class-sets:update": "rule:admin_api",
     "os_compute_api:os-quota-class-sets:show": "is_admin:True or quota_class:%(quota_class)s",
     "os_compute_api:os-quota-class-sets:discoverable": "@",
-    "os_compute_api:os-quota-class-sets:update": "rule:admin_api",
-    "os_compute_api:os-quota-sets:update": "rule:admin_api",
-    "os_compute_api:os-quota-sets:defaults": "@",
-    "os_compute_api:os-quota-sets:show": "rule:admin_or_owner",
-    "os_compute_api:os-quota-sets:delete": "rule:admin_api",
-    "os_compute_api:os-quota-sets:discoverable": "@",
-    "os_compute_api:os-quota-sets:detail": "rule:admin_api",
-    "os_compute_api:os-remote-consoles": "rule:admin_or_owner",
-    "os_compute_api:os-remote-consoles:discoverable": "@",
+    "os_compute_api:os-rescue": "rule:admin_or_owner",
     "os_compute_api:os-rescue:discoverable": "@",
-    "os_compute_api:os-rescue": "rule:admin_or_owner",
     "os_compute_api:os-scheduler-hints:discoverable": "@",
     "os_compute_api:os-security-group-default-rules:discoverable": "@",
     "os_compute_api:os-security-group-default-rules": "rule:admin_api",
@@ -178,82 +439,62 @@
     "os_compute_api:os-security-groups:discoverable": "@",
     "os_compute_api:os-server-diagnostics": "rule:admin_api",
     "os_compute_api:os-server-diagnostics:discoverable": "@",
-    "os_compute_api:os-server-external-events:create": "rule:admin_api",
-    "os_compute_api:os-server-external-events:discoverable": "@",
+    "os_compute_api:os-server-password": "rule:admin_or_owner",
+    "os_compute_api:os-server-password:discoverable": "@",
+    "os_compute_api:os-server-usage": "rule:admin_or_owner",
+    "os_compute_api:os-server-usage:discoverable": "@",
+    "os_compute_api:os-server-groups": "rule:admin_or_owner",
     "os_compute_api:os-server-groups:discoverable": "@",
-    "os_compute_api:os-server-groups": "rule:admin_or_owner",
+    "os_compute_api:os-server-tags:index": "@",
+    "os_compute_api:os-server-tags:show": "@",
+    "os_compute_api:os-server-tags:update": "@",
+    "os_compute_api:os-server-tags:update_all": "@",
+    "os_compute_api:os-server-tags:delete": "@",
+    "os_compute_api:os-server-tags:delete_all": "@",
+    "os_compute_api:os-services": "rule:admin_api",
+    "os_compute_api:os-services:discoverable": "@",
+    "os_compute_api:server-metadata:discoverable": "@",
     "os_compute_api:server-metadata:index": "rule:admin_or_owner",
     "os_compute_api:server-metadata:show": "rule:admin_or_owner",
+    "os_compute_api:server-metadata:delete": "rule:admin_or_owner",
     "os_compute_api:server-metadata:create": "rule:admin_or_owner",
-    "os_compute_api:server-metadata:discoverable": "@",
+    "os_compute_api:server-metadata:update": "rule:admin_or_owner",
     "os_compute_api:server-metadata:update_all": "rule:admin_or_owner",
-    "os_compute_api:server-metadata:delete": "rule:admin_or_owner",
-    "os_compute_api:server-metadata:update": "rule:admin_or_owner",
-    "os_compute_api:os-server-password": "rule:admin_or_owner",
-    "os_compute_api:os-server-password:discoverable": "@",
-    "os_compute_api:os-server-tags:delete_all": "@",
-    "os_compute_api:os-server-tags:index": "@",
-    "os_compute_api:os-server-tags:update_all": "@",
-    "os_compute_api:os-server-tags:delete": "@",
-    "os_compute_api:os-server-tags:update": "@",
-    "os_compute_api:os-server-tags:show": "@",
-    "os_compute_api:os-server-tags:discoverable": "@",
-    "os_compute_api:os-server-usage": "rule:admin_or_owner",
-    "os_compute_api:os-server-usage:discoverable": "@",
-    "os_compute_api:servers:index": "rule:admin_or_owner",
-    "os_compute_api:servers:detail": "rule:admin_or_owner",
-    "os_compute_api:servers:detail:get_all_tenants": "rule:admin_api",
-    "os_compute_api:servers:index:get_all_tenants": "rule:admin_api",
-    "os_compute_api:servers:show": "rule:admin_or_owner",
-    "os_compute_api:servers:show:host_status": "rule:admin_api",
-    "os_compute_api:servers:create": "rule:admin_or_owner",
-    "os_compute_api:servers:create:forced_host": "rule:admin_or_owner",
-    "os_compute_api:servers:create:attach_volume": "rule:admin_or_owner",
-    "os_compute_api:servers:create:attach_network": "rule:admin_or_owner",
-    "os_compute_api:servers:delete": "rule:admin_or_owner",
-    "os_compute_api:servers:update": "rule:admin_or_owner",
-    "os_compute_api:servers:confirm_resize": "rule:admin_or_owner",
-    "os_compute_api:servers:revert_resize": "rule:admin_or_owner",
-    "os_compute_api:servers:reboot": "rule:admin_or_owner",
-    "os_compute_api:servers:resize": "rule:admin_or_owner",
-    "os_compute_api:servers:rebuild": "rule:admin_or_owner",
-    "os_compute_api:servers:create_image": "rule:admin_or_owner",
-    "os_compute_api:servers:create_image:allow_volume_backed": "rule:admin_or_owner",
-    "os_compute_api:servers:start": "rule:admin_or_owner",
-    "os_compute_api:servers:stop": "rule:admin_or_owner",
-    "os_compute_api:servers:trigger_crash_dump": "rule:admin_or_owner",
-    "os_compute_api:servers:discoverable": "@",
-    "os_compute_api:servers:migrations:show": "rule:admin_api",
-    "os_compute_api:servers:migrations:force_complete": "rule:admin_api",
-    "os_compute_api:servers:migrations:delete": "rule:admin_api",
-    "os_compute_api:servers:migrations:index": "rule:admin_api",
-    "os_compute_api:server-migrations:discoverable": "@",
-    "os_compute_api:os-services": "rule:admin_api",
-    "os_compute_api:os-services:discoverable": "@",
     "os_compute_api:os-shelve:shelve": "rule:admin_or_owner",
-    "os_compute_api:os-shelve:unshelve": "rule:admin_or_owner",
+    "os_compute_api:os-shelve:shelve:discoverable": "@",
     "os_compute_api:os-shelve:shelve_offload": "rule:admin_api",
-    "os_compute_api:os-shelve:discoverable": "@",
+    "os_compute_api:os-simple-tenant-usage:discoverable": "@",
     "os_compute_api:os-simple-tenant-usage:show": "rule:admin_or_owner",
     "os_compute_api:os-simple-tenant-usage:list": "rule:admin_api",
-    "os_compute_api:os-simple-tenant-usage:discoverable": "@",
+    "os_compute_api:os-suspend-server:discoverable": "@",
+    "os_compute_api:os-suspend-server:suspend": "rule:admin_or_owner",
     "os_compute_api:os-suspend-server:resume": "rule:admin_or_owner",
-    "os_compute_api:os-suspend-server:suspend": "rule:admin_or_owner",
-    "os_compute_api:os-suspend-server:discoverable": "@",
     "os_compute_api:os-tenant-networks": "rule:admin_or_owner",
     "os_compute_api:os-tenant-networks:discoverable": "@",
+    "os_compute_api:os-shelve:unshelve": "rule:admin_or_owner",
+    "os_compute_api:os-user-data:discoverable": "@",
+    "os_compute_api:os-virtual-interfaces": "rule:admin_or_owner",
+    "os_compute_api:os-virtual-interfaces:discoverable": "@",
+    "os_compute_api:os-volumes": "rule:admin_or_owner",
+    "os_compute_api:os-volumes:discoverable": "@",
+    "os_compute_api:os-volumes-attachments:index": "rule:admin_or_owner",
+    "os_compute_api:os-volumes-attachments:show": "rule:admin_or_owner",
+    "os_compute_api:os-volumes-attachments:create": "rule:admin_or_owner",
+    "os_compute_api:os-volumes-attachments:update": "rule:admin_api",
+    "os_compute_api:os-volumes-attachments:delete": "rule:admin_or_owner",
+    "os_compute_api:os-volumes-attachments:discoverable": "@",
+    "os_compute_api:os-availability-zone:list": "rule:admin_or_owner",
+    "os_compute_api:os-availability-zone:discoverable": "@",
+    "os_compute_api:os-availability-zone:detail": "rule:admin_api",
+    "os_compute_api:os-used-limits": "rule:admin_api",
     "os_compute_api:os-used-limits:discoverable": "@",
-    "os_compute_api:os-used-limits": "rule:admin_api",
-    "os_compute_api:os-user-data:discoverable": "@",
-    "os_compute_api:versions:discoverable": "@",
-    "os_compute_api:os-virtual-interfaces:discoverable": "@",
-    "os_compute_api:os-virtual-interfaces": "rule:admin_or_owner",
-    "os_compute_api:os-volumes:discoverable": "@",
-    "os_compute_api:os-volumes": "rule:admin_or_owner",
-    "os_compute_api:os-volumes-attachments:index": "rule:admin_or_owner",
-    "os_compute_api:os-volumes-attachments:create": "rule:admin_or_owner",
-    "os_compute_api:os-volumes-attachments:show": "rule:admin_or_owner",
-    "os_compute_api:os-volumes-attachments:discoverable": "@",
-    "os_compute_api:os-volumes-attachments:update": "rule:admin_api",
-    "os_compute_api:os-volumes-attachments:delete": "rule:admin_or_owner"
+    "os_compute_api:os-migrations:index": "rule:admin_api",
+    "os_compute_api:os-migrations:discoverable": "@",
+    "os_compute_api:os-assisted-volume-snapshots:create": "rule:admin_api",
+    "os_compute_api:os-assisted-volume-snapshots:delete": "rule:admin_api",
+    "os_compute_api:os-assisted-volume-snapshots:discoverable": "@",
+    "os_compute_api:os-console-auth-tokens": "rule:admin_api",
+    "os_compute_api:os-console-auth-tokens:discoverable": "@",
+    "os_compute_api:os-server-external-events:create": "rule:admin_api",
+    "os_compute_api:os-server-external-events:discoverable": "@"
 }
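
The nova policy fragment above is mostly a reordering of existing keys, which makes an accidental duplicate easy to miss: `json.loads` silently keeps only the last value for a repeated key. A minimal sketch of a duplicate-key audit for such a policy file (the helper name is ours, not part of the Salt run):

```python
import json

def find_duplicate_keys(text):
    """Return policy keys that appear more than once in a JSON document.

    json.loads() keeps only the last value for a duplicated key, so an
    object_pairs_hook is needed to observe every key/value pair.
    """
    dupes = []

    def hook(pairs):
        seen = set()
        for key, _ in pairs:
            if key in seen:
                dupes.append(key)
            seen.add(key)
        return dict(pairs)

    json.loads(text, object_pairs_hook=hook)
    return dupes

sample = '{"a": 1, "b": 2, "a": 3}'
print(find_duplicate_keys(sample))  # ['a']
```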

2018-02-06 10:39:28,934 [salt.state       ][INFO    ][11352] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/nova_policy.json] at time 10:39:28.934512 duration_in_ms=49.624
2018-02-06 10:39:28,935 [salt.state       ][INFO    ][11352] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/neutron_policy.json] at time 10:39:28.935052
2018-02-06 10:39:28,935 [salt.state       ][INFO    ][11352] Executing state file.managed for /usr/share/openstack-dashboard/openstack_dashboard/conf/neutron_policy.json
2018-02-06 10:39:28,962 [salt.fileclient  ][INFO    ][11352] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/pike/neutron_policy.json'
2018-02-06 10:39:28,971 [salt.state       ][INFO    ][11352] File changed:
--- 
+++ 
@@ -8,6 +8,8 @@
     "admin_only": "rule:context_is_admin",
     "regular_user": "",
     "shared": "field:networks:shared=True",
+    "shared_firewalls": "field:firewalls:shared=True",
+    "shared_firewall_policies": "field:firewall_policies:shared=True",
     "shared_subnetpools": "field:subnetpools:shared=True",
     "shared_address_scopes": "field:address_scopes:shared=True",
     "external": "field:networks:router:external=True",
@@ -111,8 +113,27 @@
     "create_router:external_gateway_info:external_fixed_ips": "rule:admin_only",
     "update_router:external_gateway_info:external_fixed_ips": "rule:admin_only",
 
+    "create_firewall": "",
+    "get_firewall": "rule:admin_or_owner",
+    "create_firewall:shared": "rule:admin_only",
+    "get_firewall:shared": "rule:admin_only",
+    "update_firewall": "rule:admin_or_owner",
+    "update_firewall:shared": "rule:admin_only",
+    "delete_firewall": "rule:admin_or_owner",
+
+    "create_firewall_policy": "",
+    "get_firewall_policy": "rule:admin_or_owner or rule:shared_firewall_policies",
+    "create_firewall_policy:shared": "rule:admin_or_owner",
+    "update_firewall_policy": "rule:admin_or_owner",
+    "delete_firewall_policy": "rule:admin_or_owner",
+
     "insert_rule": "rule:admin_or_owner",
     "remove_rule": "rule:admin_or_owner",
+
+    "create_firewall_rule": "",
+    "get_firewall_rule": "rule:admin_or_owner or rule:shared_firewalls",
+    "update_firewall_rule": "rule:admin_or_owner",
+    "delete_firewall_rule": "rule:admin_or_owner",
 
     "create_qos_queue": "rule:admin_only",
     "get_qos_queue": "rule:admin_only",
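
The firewall rules added in the neutron diff reference aliases such as `rule:shared_firewalls`, which only work because the alias is defined in the same file (the `shared_firewalls` line near the top of the diff). A quick consistency sketch that flags dangling `rule:` references, using a toy policy dict rather than the real file:

```python
import json
import re

# Toy excerpt standing in for the managed neutron_policy.json.
policy = json.loads("""{
    "shared_firewalls": "field:firewalls:shared=True",
    "context_is_admin": "role:admin",
    "admin_or_owner": "rule:context_is_admin or tenant_id:%(tenant_id)s",
    "get_firewall_rule": "rule:admin_or_owner or rule:shared_firewalls"
}""")

# Every "rule:<name>" reference should resolve to a key in the same file.
referenced = set()
for value in policy.values():
    referenced.update(re.findall(r"rule:([\w-]+)", value))

missing = sorted(referenced - policy.keys())
print(missing)  # [] when all aliases are defined
```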

2018-02-06 10:39:28,972 [salt.state       ][INFO    ][11352] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/neutron_policy.json] at time 10:39:28.972101 duration_in_ms=37.048
2018-02-06 10:39:28,973 [salt.state       ][INFO    ][11352] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/glance_policy.json] at time 10:39:28.973755
2018-02-06 10:39:28,974 [salt.state       ][INFO    ][11352] Executing state file.managed for /usr/share/openstack-dashboard/openstack_dashboard/conf/glance_policy.json
2018-02-06 10:39:29,002 [salt.fileclient  ][INFO    ][11352] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/pike/glance_policy.json'
2018-02-06 10:39:29,004 [salt.state       ][INFO    ][11352] File /usr/share/openstack-dashboard/openstack_dashboard/conf/glance_policy.json is in the correct state
2018-02-06 10:39:29,005 [salt.state       ][INFO    ][11352] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/glance_policy.json] at time 10:39:29.005004 duration_in_ms=31.249
2018-02-06 10:39:29,006 [salt.state       ][INFO    ][11352] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/ceilometer_policy.json] at time 10:39:29.006791
2018-02-06 10:39:29,007 [salt.state       ][INFO    ][11352] Executing state file.managed for /usr/share/openstack-dashboard/openstack_dashboard/conf/ceilometer_policy.json
2018-02-06 10:39:29,028 [salt.fileclient  ][INFO    ][11352] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/pike/ceilometer_policy.json'
2018-02-06 10:39:29,029 [salt.state       ][INFO    ][11352] File changed:
New file
2018-02-06 10:39:29,030 [salt.state       ][INFO    ][11352] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/ceilometer_policy.json] at time 10:39:29.030282 duration_in_ms=23.491
2018-02-06 10:39:29,030 [salt.state       ][INFO    ][11352] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/cinder_policy.json] at time 10:39:29.030811
2018-02-06 10:39:29,031 [salt.state       ][INFO    ][11352] Executing state file.managed for /usr/share/openstack-dashboard/openstack_dashboard/conf/cinder_policy.json
2018-02-06 10:39:29,056 [salt.fileclient  ][INFO    ][11352] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/pike/cinder_policy.json'
2018-02-06 10:39:29,058 [salt.state       ][INFO    ][11352] File changed:
--- 
+++ 
@@ -95,16 +95,16 @@
     "snapshot_extension:snapshot_manage": "rule:admin_api",
     "snapshot_extension:snapshot_unmanage": "rule:admin_api",
 
-    "consistencygroup:create" : "",
-    "consistencygroup:delete": "",
-    "consistencygroup:update": "",
-    "consistencygroup:get": "",
-    "consistencygroup:get_all": "",
+    "consistencygroup:create" : "group:nobody",
+    "consistencygroup:delete": "group:nobody",
+    "consistencygroup:update": "group:nobody",
+    "consistencygroup:get": "group:nobody",
+    "consistencygroup:get_all": "group:nobody",
 
-    "consistencygroup:create_cgsnapshot" : "",
-    "consistencygroup:delete_cgsnapshot": "",
-    "consistencygroup:get_cgsnapshot": "",
-    "consistencygroup:get_all_cgsnapshots": "",
+    "consistencygroup:create_cgsnapshot" : "group:nobody",
+    "consistencygroup:delete_cgsnapshot": "group:nobody",
+    "consistencygroup:get_cgsnapshot": "group:nobody",
+    "consistencygroup:get_all_cgsnapshots": "group:nobody",
 
     "scheduler_extension:scheduler_stats:get_pools" : "rule:admin_api",
     "message:delete": "rule:admin_or_owner",
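
The cinder diff flips the consistency-group rules from `""` (allow any authenticated user) to `"group:nobody"`, a common convention for effectively disabling an API, since no real user should belong to a group literally named `nobody`. A small sketch that lists the rules disabled this way, again over a toy excerpt:

```python
import json

# Toy excerpt standing in for the managed cinder_policy.json.
policy = json.loads("""{
    "consistencygroup:create": "group:nobody",
    "consistencygroup:get": "group:nobody",
    "volume:get": "rule:admin_or_owner"
}""")

# "" would match everyone; "group:nobody" matches (almost) no one.
disabled = sorted(k for k, v in policy.items() if v == "group:nobody")
print(disabled)  # the effectively-disabled policy targets
```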

2018-02-06 10:39:29,060 [salt.state       ][INFO    ][11352] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/cinder_policy.json] at time 10:39:29.060457 duration_in_ms=29.645
2018-02-06 10:39:29,061 [salt.state       ][INFO    ][11352] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/heat_policy.json] at time 10:39:29.061099
2018-02-06 10:39:29,061 [salt.state       ][INFO    ][11352] Executing state file.managed for /usr/share/openstack-dashboard/openstack_dashboard/conf/heat_policy.json
2018-02-06 10:39:29,085 [salt.fileclient  ][INFO    ][11352] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/pike/heat_policy.json'
2018-02-06 10:39:29,086 [salt.state       ][INFO    ][11352] File /usr/share/openstack-dashboard/openstack_dashboard/conf/heat_policy.json is in the correct state
2018-02-06 10:39:29,086 [salt.state       ][INFO    ][11352] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/heat_policy.json] at time 10:39:29.086312 duration_in_ms=25.213
2018-02-06 10:39:29,086 [salt.state       ][INFO    ][11352] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/keystone_policy.json] at time 10:39:29.086821
2018-02-06 10:39:29,087 [salt.state       ][INFO    ][11352] Executing state file.managed for /usr/share/openstack-dashboard/openstack_dashboard/conf/keystone_policy.json
2018-02-06 10:39:29,110 [salt.fileclient  ][INFO    ][11352] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/pike/keystone_policy.json'
2018-02-06 10:39:29,114 [salt.state       ][INFO    ][11352] File changed:
--- 
+++ 
@@ -28,7 +28,7 @@
     "identity:update_endpoint": "rule:admin_required",
     "identity:delete_endpoint": "rule:admin_required",
 
-    "identity:get_domain": "rule:admin_required or token.project.domain.id:%(target.domain.id)s",
+    "identity:get_domain": "rule:admin_required",
     "identity:list_domains": "rule:admin_required",
     "identity:create_domain": "rule:admin_required",
     "identity:update_domain": "rule:admin_required",
@@ -41,7 +41,7 @@
     "identity:update_project": "rule:admin_required",
     "identity:delete_project": "rule:admin_required",
 
-    "identity:get_user": "rule:admin_or_owner",
+    "identity:get_user": "rule:admin_required",
     "identity:list_users": "rule:admin_required",
     "identity:create_user": "rule:admin_required",
     "identity:update_user": "rule:admin_required",
@@ -173,10 +173,10 @@
     "identity:get_auth_projects": "",
     "identity:get_auth_domains": "",
 
-    "identity:list_projects_for_user": "",
-    "identity:list_domains_for_user": "",
+    "identity:list_projects_for_groups": "",
+    "identity:list_domains_for_groups": "",
 
-    "identity:list_revoke_events": "rule:service_or_admin",
+    "identity:list_revoke_events": "",
 
     "identity:create_policy_association_for_endpoint": "rule:admin_required",
     "identity:check_policy_association_for_endpoint": "rule:admin_required",
@@ -192,7 +192,6 @@
 
     "identity:create_domain_config": "rule:admin_required",
     "identity:get_domain_config": "rule:admin_required",
-    "identity:get_security_compliance_domain_config": "",
     "identity:update_domain_config": "rule:admin_required",
     "identity:delete_domain_config": "rule:admin_required",
     "identity:get_domain_config_default": "rule:admin_required"

2018-02-06 10:39:29,116 [salt.state       ][INFO    ][11352] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/keystone_policy.json] at time 10:39:29.116588 duration_in_ms=29.766
2018-02-06 10:39:29,117 [salt.state       ][INFO    ][11352] Running state [/etc/apache2/ports.conf] at time 10:39:29.117255
2018-02-06 10:39:29,117 [salt.state       ][INFO    ][11352] Executing state file.managed for /etc/apache2/ports.conf
2018-02-06 10:39:29,141 [salt.fileclient  ][INFO    ][11352] Fetching file from saltenv 'base', ** done ** 'horizon/files/ports.conf'
2018-02-06 10:39:29,190 [salt.state       ][INFO    ][11352] File changed:
--- 
+++ 
@@ -1,15 +1,16 @@
+
 # If you just change the port or add more ports here, you will likely also
 # have to change the VirtualHost statement in
 # /etc/apache2/sites-enabled/000-default.conf
 
-Listen 80
+Listen 0.0.0.0:8078
 
 <IfModule ssl_module>
-	Listen 443
+        Listen 0.0.0.0:443
 </IfModule>
 
 <IfModule mod_gnutls.c>
-	Listen 443
+        Listen 0.0.0.0:443
 </IfModule>
 
 # vim: syntax=apache ts=4 sw=4 sts=4 sr noet
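
After the `Listen` directive moves Apache from port 80 to `0.0.0.0:8078`, a quick way to verify the new bind from the minion is a plain TCP connect. The host and port below are illustrative, taken from the diff, not from any check the Salt run actually performs:

```python
import socket

def port_open(host, port, timeout=1.0):
    """Best-effort check that a service accepts TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical check against the new bind address from the diff:
# port_open("100.64.200.103", 8078)
```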

2018-02-06 10:39:29,191 [salt.state       ][INFO    ][11352] Completed state [/etc/apache2/ports.conf] at time 10:39:29.191285 duration_in_ms=74.03
2018-02-06 10:39:29,191 [salt.state       ][INFO    ][11352] Running state [/etc/apache2/conf-available/openstack-dashboard.conf] at time 10:39:29.191828
2018-02-06 10:39:29,192 [salt.state       ][INFO    ][11352] Executing state file.managed for /etc/apache2/conf-available/openstack-dashboard.conf
2018-02-06 10:39:29,214 [salt.fileclient  ][INFO    ][11352] Fetching file from saltenv 'base', ** done ** 'horizon/files/openstack-dashboard.conf.Debian'
2018-02-06 10:39:29,265 [salt.state       ][INFO    ][11352] File changed:
--- 
+++ 
@@ -1,14 +1,36 @@
-WSGIScriptAlias /horizon /usr/share/openstack-dashboard/openstack_dashboard/wsgi/django.wsgi process-group=horizon
-WSGIDaemonProcess horizon user=horizon group=horizon processes=3 threads=10 display-name=%{GROUP}
-WSGIProcessGroup horizon
 
-Alias /static /var/lib/openstack-dashboard/static/
-Alias /horizon/static /var/lib/openstack-dashboard/static/
 
-<Directory /usr/share/openstack-dashboard/openstack_dashboard/wsgi>
-  Require all granted
-</Directory>
+<VirtualHost 0.0.0.0:8078>
+  ServerName openstack-dashboard
 
-<Directory /var/lib/openstack-dashboard/static>
-  Require all granted
-</Directory>
+  WSGIScriptAlias / /usr/share/openstack-dashboard/openstack_dashboard/wsgi/django.wsgi
+  WSGIDaemonProcess horizon user=horizon group=horizon processes=3 threads=10
+  WSGIProcessGroup horizon
+
+  Alias /static /usr/share/openstack-dashboard/static
+
+  <Directory /usr/share/openstack-dashboard/openstack_dashboard/wsgi>
+    Order allow,deny
+    Allow from all
+  </Directory>
+
+  <Directory /usr/share/openstack-dashboard/static>
+    <IfModule mod_expires.c>
+      ExpiresActive On
+      ExpiresDefault "access 6 month"
+    </IfModule>
+    <IfModule mod_deflate.c>
+      SetOutputFilter DEFLATE
+    </IfModule>
+
+    Require all granted
+  </Directory>
+  ServerSignature Off
+  LogFormat "%h %t %m \"%U%q\" %H %>s %O %D \"%{Referer}i\" \"%{User-Agent}i\"" horizon
+  ErrorLog "/var/log/apache2/openstack_dashboard_error.log"
+  CustomLog "/var/log/apache2/openstack_dashboard_access.log" horizon
+  SetEnvIf X-Forwarded-Proto https HTTPS=1
+
+</VirtualHost>
+
+

2018-02-06 10:39:29,269 [salt.state       ][INFO    ][11352] Completed state [/etc/apache2/conf-available/openstack-dashboard.conf] at time 10:39:29.269329 duration_in_ms=77.501
2018-02-06 10:39:29,273 [salt.state       ][INFO    ][11352] Running state [wsgi] at time 10:39:29.273212
2018-02-06 10:39:29,273 [salt.state       ][INFO    ][11352] Executing state apache_module.enabled for wsgi
2018-02-06 10:39:29,275 [salt.loaded.int.module.cmdmod][INFO    ][11352] Executing command ['a2enmod', 'wsgi'] in directory '/root'
2018-02-06 10:39:29,367 [salt.state       ][INFO    ][11352] {'new': 'wsgi', 'old': None}
2018-02-06 10:39:29,367 [salt.state       ][INFO    ][11352] Completed state [wsgi] at time 10:39:29.367812 duration_in_ms=94.6
2018-02-06 10:39:29,370 [salt.state       ][INFO    ][11352] Running state [openstack-dashboard] at time 10:39:29.370755
2018-02-06 10:39:29,371 [salt.state       ][INFO    ][11352] Executing state apache_conf.enabled for openstack-dashboard
2018-02-06 10:39:29,371 [salt.state       ][INFO    ][11352] openstack-dashboard already enabled.
2018-02-06 10:39:29,372 [salt.state       ][INFO    ][11352] Completed state [openstack-dashboard] at time 10:39:29.372067 duration_in_ms=1.313
2018-02-06 10:39:29,568 [salt.state       ][INFO    ][11352] Running state [/var/log/horizon] at time 10:39:29.568551
2018-02-06 10:39:29,569 [salt.state       ][INFO    ][11352] Executing state file.directory for /var/log/horizon
2018-02-06 10:39:29,570 [salt.state       ][INFO    ][11352] {'/var/log/horizon': 'New Dir'}
2018-02-06 10:39:29,571 [salt.state       ][INFO    ][11352] Completed state [/var/log/horizon] at time 10:39:29.571155 duration_in_ms=2.604
2018-02-06 10:39:29,571 [salt.state       ][INFO    ][11352] Running state [/var/log/horizon/horizon.log] at time 10:39:29.571662
2018-02-06 10:39:29,572 [salt.state       ][INFO    ][11352] Executing state file.managed for /var/log/horizon/horizon.log
2018-02-06 10:39:29,572 [salt.loaded.int.states.file][WARNING ][11352] State for file: /var/log/horizon/horizon.log - Neither 'source' nor 'contents' nor 'contents_pillar' nor 'contents_grains' was defined, yet 'replace' was set to 'True'. As there is no source to replace the file with, 'replace' has been set to 'False' to avoid reading the file unnecessarily.
2018-02-06 10:39:29,573 [salt.state       ][INFO    ][11352] {'new': 'file /var/log/horizon/horizon.log created', 'group': 'adm', 'mode': '0640', 'user': 'horizon'}
2018-02-06 10:39:29,574 [salt.state       ][INFO    ][11352] Completed state [/var/log/horizon/horizon.log] at time 10:39:29.574056 duration_in_ms=2.394
2018-02-06 10:39:29,575 [salt.state       ][INFO    ][11352] Running state [apache2] at time 10:39:29.575002
2018-02-06 10:39:29,575 [salt.state       ][INFO    ][11352] Executing state service.running for apache2
2018-02-06 10:39:29,576 [salt.loaded.int.module.cmdmod][INFO    ][11352] Executing command ['systemctl', 'status', 'apache2.service', '-n', '0'] in directory '/root'
2018-02-06 10:39:29,601 [salt.loaded.int.module.cmdmod][INFO    ][11352] Executing command ['systemctl', 'is-active', 'apache2.service'] in directory '/root'
2018-02-06 10:39:29,626 [salt.loaded.int.module.cmdmod][INFO    ][11352] Executing command ['systemctl', 'is-enabled', 'apache2.service'] in directory '/root'
2018-02-06 10:39:29,655 [salt.state       ][INFO    ][11352] The service apache2 is already running
2018-02-06 10:39:29,655 [salt.state       ][INFO    ][11352] Completed state [apache2] at time 10:39:29.655870 duration_in_ms=80.868
2018-02-06 10:39:29,656 [salt.state       ][INFO    ][11352] Running state [apache2] at time 10:39:29.656305
2018-02-06 10:39:29,656 [salt.state       ][INFO    ][11352] Executing state service.mod_watch for apache2
2018-02-06 10:39:29,657 [salt.loaded.int.module.cmdmod][INFO    ][11352] Executing command ['systemctl', 'is-active', 'apache2.service'] in directory '/root'
2018-02-06 10:39:29,678 [salt.loaded.int.module.cmdmod][INFO    ][11352] Executing command ['systemctl', 'is-enabled', 'apache2.service'] in directory '/root'
2018-02-06 10:39:29,708 [salt.loaded.int.module.cmdmod][INFO    ][11352] Executing command ['systemd-run', '--scope', 'systemctl', 'restart', 'apache2.service'] in directory '/root'
2018-02-06 10:39:32,009 [salt.state       ][INFO    ][11352] {'apache2': True}
2018-02-06 10:39:32,010 [salt.state       ][INFO    ][11352] Completed state [apache2] at time 10:39:32.010405 duration_in_ms=2354.099
2018-02-06 10:39:32,016 [salt.minion      ][INFO    ][11352] Returning information for job: 20180206103752951148
2018-02-06 10:39:32,822 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command state.sls with jid 20180206103932801853
2018-02-06 10:39:32,844 [salt.minion      ][INFO    ][16243] Starting a new job with PID 16243
2018-02-06 10:39:35,246 [salt.state       ][INFO    ][16243] Loading fresh modules for state activity
2018-02-06 10:39:35,327 [salt.fileclient  ][INFO    ][16243] Fetching file from saltenv 'base', ** done ** 'nginx/init.sls'
2018-02-06 10:39:35,355 [salt.fileclient  ][INFO    ][16243] Fetching file from saltenv 'base', ** done ** 'nginx/server.sls'
2018-02-06 10:39:35,429 [salt.fileclient  ][INFO    ][16243] Fetching file from saltenv 'base', ** done ** 'nginx/server/users.sls'
2018-02-06 10:39:35,475 [salt.fileclient  ][INFO    ][16243] Fetching file from saltenv 'base', ** done ** 'nginx/server/sites.sls'
2018-02-06 10:39:35,642 [salt.fileclient  ][INFO    ][16243] Fetching file from saltenv 'base', ** done ** 'sphinx/init.sls'
2018-02-06 10:39:35,664 [salt.fileclient  ][INFO    ][16243] Fetching file from saltenv 'base', ** done ** 'sphinx/server.sls'
2018-02-06 10:39:35,698 [salt.fileclient  ][INFO    ][16243] Fetching file from saltenv 'base', ** done ** 'sphinx/_salt.sls'
2018-02-06 10:39:38,282 [salt.fileclient  ][INFO    ][16243] Fetching file from saltenv 'base', ** done ** 'git/init.sls'
2018-02-06 10:39:38,307 [salt.fileclient  ][INFO    ][16243] Fetching file from saltenv 'base', ** done ** 'git/client.sls'
2018-02-06 10:39:38,345 [salt.fileclient  ][INFO    ][16243] Fetching file from saltenv 'base', ** done ** 'git/map.jinja'
2018-02-06 10:39:38,823 [salt.state       ][INFO    ][16243] Running state [cat /etc/ssl/certs/100.64.200.103.crt /etc/ssl/certs/ca-salt_master_ca.crt > /etc/ssl/certs/100.64.200.103-with-chain.crt] at time 10:39:38.823198
2018-02-06 10:39:38,824 [salt.state       ][INFO    ][16243] Executing state cmd.run for cat /etc/ssl/certs/100.64.200.103.crt /etc/ssl/certs/ca-salt_master_ca.crt > /etc/ssl/certs/100.64.200.103-with-chain.crt
2018-02-06 10:39:38,826 [salt.loaded.int.module.cmdmod][INFO    ][16243] Executing command 'cat /etc/ssl/certs/100.64.200.103.crt /etc/ssl/certs/ca-salt_master_ca.crt > /etc/ssl/certs/100.64.200.103-with-chain.crt' in directory '/root'
2018-02-06 10:39:38,851 [salt.state       ][INFO    ][16243] {'pid': 16264, 'retcode': 0, 'stderr': '', 'stdout': ''}
2018-02-06 10:39:38,853 [salt.state       ][INFO    ][16243] Completed state [cat /etc/ssl/certs/100.64.200.103.crt /etc/ssl/certs/ca-salt_master_ca.crt > /etc/ssl/certs/100.64.200.103-with-chain.crt] at time 10:39:38.853195 duration_in_ms=29.997
2018-02-06 10:39:40,535 [salt.state       ][INFO    ][16243] Running state [nginx] at time 10:39:40.535164
2018-02-06 10:39:40,535 [salt.state       ][INFO    ][16243] Executing state pkg.installed for nginx
2018-02-06 10:39:40,536 [salt.loaded.int.module.cmdmod][INFO    ][16243] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-02-06 10:39:40,952 [salt.loaded.int.module.cmdmod][INFO    ][16243] Executing command ['apt-cache', '-q', 'policy', 'nginx'] in directory '/root'
2018-02-06 10:39:41,044 [salt.loaded.int.module.cmdmod][INFO    ][16243] Executing command ['apt-get', '-q', 'update'] in directory '/root'
2018-02-06 10:39:42,722 [salt.loaded.int.module.cmdmod][INFO    ][16243] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-02-06 10:39:42,762 [salt.loaded.int.module.cmdmod][INFO    ][16243] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'nginx'] in directory '/root'
2018-02-06 10:39:42,925 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command saltutil.find_job with jid 20180206103942901196
2018-02-06 10:39:42,948 [salt.minion      ][INFO    ][16572] Starting a new job with PID 16572
2018-02-06 10:39:42,962 [salt.minion      ][INFO    ][16572] Returning information for job: 20180206103942901196
2018-02-06 10:39:47,690 [salt.loaded.int.module.cmdmod][INFO    ][16243] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-02-06 10:39:47,743 [salt.state       ][INFO    ][16243] Made the following changes:
'libgd3' changed from 'absent' to '2.1.1-4ubuntu0.16.04.8'
'nginx-core' changed from 'absent' to '1.10.3-0ubuntu0.16.04.2'
'libxpm4' changed from 'absent' to '1:3.5.11-1ubuntu0.16.04.1'
'nginx' changed from 'absent' to '1.10.3-0ubuntu0.16.04.2'
'nginx-common' changed from 'absent' to '1.10.3-0ubuntu0.16.04.2'
'libfontconfig' changed from 'absent' to '1'
'fonts-dejavu-core' changed from 'absent' to '2.35-1'
'fontconfig-config' changed from 'absent' to '2.11.94-0ubuntu1.1'
'libvpx3' changed from 'absent' to '1.5.0-2ubuntu1'
'libfontconfig1' changed from 'absent' to '2.11.94-0ubuntu1.1'

2018-02-06 10:39:47,774 [salt.state       ][INFO    ][16243] Loading fresh modules for state activity
2018-02-06 10:39:47,949 [salt.state       ][INFO    ][16243] Completed state [nginx] at time 10:39:47.949361 duration_in_ms=7414.197
2018-02-06 10:39:47,955 [salt.state       ][INFO    ][16243] Running state [apache2-utils] at time 10:39:47.955317
2018-02-06 10:39:47,955 [salt.state       ][INFO    ][16243] Executing state pkg.installed for apache2-utils
2018-02-06 10:39:48,447 [salt.state       ][INFO    ][16243] All specified packages are already installed
2018-02-06 10:39:48,448 [salt.state       ][INFO    ][16243] Completed state [apache2-utils] at time 10:39:48.448535 duration_in_ms=493.217
2018-02-06 10:39:48,451 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_keystone_private.conf] at time 10:39:48.451358
2018-02-06 10:39:48,451 [salt.state       ][INFO    ][16243] Executing state file.managed for /etc/nginx/sites-available/nginx_proxy_openstack_api_keystone_private.conf
2018-02-06 10:39:48,493 [salt.fileclient  ][INFO    ][16243] Fetching file from saltenv 'base', ** done ** 'nginx/files/proxy.conf'
2018-02-06 10:39:48,556 [salt.fileclient  ][INFO    ][16243] Fetching file from saltenv 'base', ** done ** 'nginx/files/_name.conf'
2018-02-06 10:39:48,588 [salt.fileclient  ][INFO    ][16243] Fetching file from saltenv 'base', ** done ** 'nginx/files/_ssl.conf'
2018-02-06 10:39:48,630 [salt.fileclient  ][INFO    ][16243] Fetching file from saltenv 'base', ** done ** 'nginx/files/_ssl_secure.conf'
2018-02-06 10:39:48,661 [salt.fileclient  ][INFO    ][16243] Fetching file from saltenv 'base', ** done ** 'nginx/files/_auth.conf'
2018-02-06 10:39:48,691 [salt.fileclient  ][INFO    ][16243] Fetching file from saltenv 'base', ** done ** 'nginx/files/_access_policy.conf'
2018-02-06 10:39:48,699 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:39:48,700 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_keystone_private.conf] at time 10:39:48.700192 duration_in_ms=248.833
2018-02-06 10:39:48,700 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone_private.conf] at time 10:39:48.700601
2018-02-06 10:39:48,701 [salt.state       ][INFO    ][16243] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone_private.conf
2018-02-06 10:39:48,702 [salt.state       ][INFO    ][16243] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone_private.conf'}
2018-02-06 10:39:48,702 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone_private.conf] at time 10:39:48.702890 duration_in_ms=2.29
2018-02-06 10:39:48,703 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_nova.conf] at time 10:39:48.703838
2018-02-06 10:39:48,704 [salt.state       ][INFO    ][16243] Executing state file.managed for /etc/nginx/sites-available/nginx_proxy_openstack_api_nova.conf
2018-02-06 10:39:48,851 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:39:48,852 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_nova.conf] at time 10:39:48.852346 duration_in_ms=148.508
2018-02-06 10:39:48,852 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_nova.conf] at time 10:39:48.852705
2018-02-06 10:39:48,853 [salt.state       ][INFO    ][16243] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_proxy_openstack_api_nova.conf
2018-02-06 10:39:48,854 [salt.state       ][INFO    ][16243] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_nova.conf'}
2018-02-06 10:39:48,854 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_nova.conf] at time 10:39:48.854785 duration_in_ms=2.08
2018-02-06 10:39:48,855 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_aodh.conf] at time 10:39:48.855665
2018-02-06 10:39:48,856 [salt.state       ][INFO    ][16243] Executing state file.managed for /etc/nginx/sites-available/nginx_proxy_openstack_api_aodh.conf
2018-02-06 10:39:48,998 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:39:48,999 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_aodh.conf] at time 10:39:48.999355 duration_in_ms=143.689
2018-02-06 10:39:49,000 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_aodh.conf] at time 10:39:49.000093
2018-02-06 10:39:49,000 [salt.state       ][INFO    ][16243] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_proxy_openstack_api_aodh.conf
2018-02-06 10:39:49,002 [salt.state       ][INFO    ][16243] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_aodh.conf'}
2018-02-06 10:39:49,002 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_aodh.conf] at time 10:39:49.002614 duration_in_ms=2.52
2018-02-06 10:39:49,003 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_neutron.conf] at time 10:39:49.003955
2018-02-06 10:39:49,004 [salt.state       ][INFO    ][16243] Executing state file.managed for /etc/nginx/sites-available/nginx_proxy_openstack_api_neutron.conf
2018-02-06 10:39:49,150 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:39:49,151 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_neutron.conf] at time 10:39:49.151080 duration_in_ms=147.124
2018-02-06 10:39:49,151 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_neutron.conf] at time 10:39:49.151907
2018-02-06 10:39:49,152 [salt.state       ][INFO    ][16243] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_proxy_openstack_api_neutron.conf
2018-02-06 10:39:49,154 [salt.state       ][INFO    ][16243] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_neutron.conf'}
2018-02-06 10:39:49,154 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_neutron.conf] at time 10:39:49.154319 duration_in_ms=2.412
2018-02-06 10:39:49,155 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_web.conf] at time 10:39:49.155162
2018-02-06 10:39:49,155 [salt.state       ][INFO    ][16243] Executing state file.managed for /etc/nginx/sites-available/nginx_proxy_openstack_web.conf
2018-02-06 10:39:49,310 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:39:49,311 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_web.conf] at time 10:39:49.311171 duration_in_ms=156.008
2018-02-06 10:39:49,312 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_web.conf] at time 10:39:49.312183
2018-02-06 10:39:49,312 [salt.state       ][INFO    ][16243] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_proxy_openstack_web.conf
2018-02-06 10:39:49,314 [salt.state       ][INFO    ][16243] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_web.conf'}
2018-02-06 10:39:49,314 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_web.conf] at time 10:39:49.314873 duration_in_ms=2.69
2018-02-06 10:39:49,316 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_glance.conf] at time 10:39:49.316019
2018-02-06 10:39:49,316 [salt.state       ][INFO    ][16243] Executing state file.managed for /etc/nginx/sites-available/nginx_proxy_openstack_api_glance.conf
2018-02-06 10:39:49,463 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:39:49,463 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_glance.conf] at time 10:39:49.463922 duration_in_ms=147.904
2018-02-06 10:39:49,464 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_glance.conf] at time 10:39:49.464285
2018-02-06 10:39:49,464 [salt.state       ][INFO    ][16243] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_proxy_openstack_api_glance.conf
2018-02-06 10:39:49,466 [salt.state       ][INFO    ][16243] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_glance.conf'}
2018-02-06 10:39:49,466 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_glance.conf] at time 10:39:49.466361 duration_in_ms=2.076
2018-02-06 10:39:49,467 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-available/nginx_proxy_novnc.conf] at time 10:39:49.467229
2018-02-06 10:39:49,467 [salt.state       ][INFO    ][16243] Executing state file.managed for /etc/nginx/sites-available/nginx_proxy_novnc.conf
2018-02-06 10:39:49,603 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:39:49,603 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-available/nginx_proxy_novnc.conf] at time 10:39:49.603893 duration_in_ms=136.663
2018-02-06 10:39:49,604 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-enabled/nginx_proxy_novnc.conf] at time 10:39:49.604438
2018-02-06 10:39:49,605 [salt.state       ][INFO    ][16243] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_proxy_novnc.conf
2018-02-06 10:39:49,607 [salt.state       ][INFO    ][16243] {'new': '/etc/nginx/sites-enabled/nginx_proxy_novnc.conf'}
2018-02-06 10:39:49,608 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-enabled/nginx_proxy_novnc.conf] at time 10:39:49.608702 duration_in_ms=4.264
2018-02-06 10:39:49,609 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_keystone.conf] at time 10:39:49.609575
2018-02-06 10:39:49,609 [salt.state       ][INFO    ][16243] Executing state file.managed for /etc/nginx/sites-available/nginx_proxy_openstack_api_keystone.conf
2018-02-06 10:39:49,751 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:39:49,752 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_keystone.conf] at time 10:39:49.752182 duration_in_ms=142.606
2018-02-06 10:39:49,752 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone.conf] at time 10:39:49.752578
2018-02-06 10:39:49,753 [salt.state       ][INFO    ][16243] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone.conf
2018-02-06 10:39:49,754 [salt.state       ][INFO    ][16243] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone.conf'}
2018-02-06 10:39:49,754 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone.conf] at time 10:39:49.754893 duration_in_ms=2.315
2018-02-06 10:39:49,755 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_cinder.conf] at time 10:39:49.755882
2018-02-06 10:39:49,756 [salt.state       ][INFO    ][16243] Executing state file.managed for /etc/nginx/sites-available/nginx_proxy_openstack_api_cinder.conf
2018-02-06 10:39:49,896 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:39:49,897 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_cinder.conf] at time 10:39:49.897152 duration_in_ms=141.269
2018-02-06 10:39:49,897 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_cinder.conf] at time 10:39:49.897550
2018-02-06 10:39:49,897 [salt.state       ][INFO    ][16243] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_proxy_openstack_api_cinder.conf
2018-02-06 10:39:49,899 [salt.state       ][INFO    ][16243] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_cinder.conf'}
2018-02-06 10:39:49,900 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_cinder.conf] at time 10:39:49.900296 duration_in_ms=2.745
2018-02-06 10:39:49,901 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat_cfn.conf] at time 10:39:49.901378
2018-02-06 10:39:49,901 [salt.state       ][INFO    ][16243] Executing state file.managed for /etc/nginx/sites-available/nginx_proxy_openstack_api_heat_cfn.conf
2018-02-06 10:39:50,038 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:39:50,038 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat_cfn.conf] at time 10:39:50.038672 duration_in_ms=137.294
2018-02-06 10:39:50,039 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cfn.conf] at time 10:39:50.039105
2018-02-06 10:39:50,040 [salt.state       ][INFO    ][16243] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cfn.conf
2018-02-06 10:39:50,042 [salt.state       ][INFO    ][16243] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cfn.conf'}
2018-02-06 10:39:50,042 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cfn.conf] at time 10:39:50.042585 duration_in_ms=3.48
2018-02-06 10:39:50,043 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_nova_ec2.conf] at time 10:39:50.043558
2018-02-06 10:39:50,043 [salt.state       ][INFO    ][16243] Executing state file.managed for /etc/nginx/sites-available/nginx_proxy_openstack_api_nova_ec2.conf
2018-02-06 10:39:50,176 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:39:50,176 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_nova_ec2.conf] at time 10:39:50.176827 duration_in_ms=133.268
2018-02-06 10:39:50,177 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_nova_ec2.conf] at time 10:39:50.177312
2018-02-06 10:39:50,177 [salt.state       ][INFO    ][16243] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_proxy_openstack_api_nova_ec2.conf
2018-02-06 10:39:50,179 [salt.state       ][INFO    ][16243] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_nova_ec2.conf'}
2018-02-06 10:39:50,179 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_nova_ec2.conf] at time 10:39:50.179924 duration_in_ms=2.611
2018-02-06 10:39:50,181 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-available/nginx_redirect_openstack_web_redirect.conf] at time 10:39:50.181033
2018-02-06 10:39:50,181 [salt.state       ][INFO    ][16243] Executing state file.managed for /etc/nginx/sites-available/nginx_redirect_openstack_web_redirect.conf
2018-02-06 10:39:50,204 [salt.fileclient  ][INFO    ][16243] Fetching file from saltenv 'base', ** done ** 'nginx/files/redirect.conf'
2018-02-06 10:39:50,216 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:39:50,216 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-available/nginx_redirect_openstack_web_redirect.conf] at time 10:39:50.216624 duration_in_ms=35.591
2018-02-06 10:39:50,217 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-enabled/nginx_redirect_openstack_web_redirect.conf] at time 10:39:50.217180
2018-02-06 10:39:50,217 [salt.state       ][INFO    ][16243] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_redirect_openstack_web_redirect.conf
2018-02-06 10:39:50,220 [salt.state       ][INFO    ][16243] {'new': '/etc/nginx/sites-enabled/nginx_redirect_openstack_web_redirect.conf'}
2018-02-06 10:39:50,220 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-enabled/nginx_redirect_openstack_web_redirect.conf] at time 10:39:50.220323 duration_in_ms=3.144
2018-02-06 10:39:50,221 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-available/nginx_static_reclass_doc.conf] at time 10:39:50.221204
2018-02-06 10:39:50,221 [salt.state       ][INFO    ][16243] Executing state file.managed for /etc/nginx/sites-available/nginx_static_reclass_doc.conf
2018-02-06 10:39:50,244 [salt.fileclient  ][INFO    ][16243] Fetching file from saltenv 'base', ** done ** 'nginx/files/static.conf'
2018-02-06 10:39:50,305 [salt.fileclient  ][INFO    ][16243] Fetching file from saltenv 'base', ** done ** 'nginx/files/_log.conf'
2018-02-06 10:39:50,364 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:39:50,365 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-available/nginx_static_reclass_doc.conf] at time 10:39:50.365304 duration_in_ms=144.099
2018-02-06 10:39:50,365 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-enabled/nginx_static_reclass_doc.conf] at time 10:39:50.365661
2018-02-06 10:39:50,366 [salt.state       ][INFO    ][16243] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_static_reclass_doc.conf
2018-02-06 10:39:50,367 [salt.state       ][INFO    ][16243] {'new': '/etc/nginx/sites-enabled/nginx_static_reclass_doc.conf'}
2018-02-06 10:39:50,367 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-enabled/nginx_static_reclass_doc.conf] at time 10:39:50.367705 duration_in_ms=2.044
2018-02-06 10:39:50,368 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat.conf] at time 10:39:50.368558
2018-02-06 10:39:50,368 [salt.state       ][INFO    ][16243] Executing state file.managed for /etc/nginx/sites-available/nginx_proxy_openstack_api_heat.conf
2018-02-06 10:39:50,508 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:39:50,508 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat.conf] at time 10:39:50.508397 duration_in_ms=139.839
2018-02-06 10:39:50,508 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat.conf] at time 10:39:50.508758
2018-02-06 10:39:50,509 [salt.state       ][INFO    ][16243] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat.conf
2018-02-06 10:39:50,510 [salt.state       ][INFO    ][16243] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat.conf'}
2018-02-06 10:39:50,510 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat.conf] at time 10:39:50.510828 duration_in_ms=2.07
2018-02-06 10:39:50,511 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-available/nginx_stats_stats.conf] at time 10:39:50.511696
2018-02-06 10:39:50,512 [salt.state       ][INFO    ][16243] Executing state file.managed for /etc/nginx/sites-available/nginx_stats_stats.conf
2018-02-06 10:39:50,532 [salt.fileclient  ][INFO    ][16243] Fetching file from saltenv 'base', ** done ** 'nginx/files/stats.conf'
2018-02-06 10:39:50,536 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:39:50,536 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-available/nginx_stats_stats.conf] at time 10:39:50.536857 duration_in_ms=25.161
2018-02-06 10:39:50,537 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-enabled/nginx_stats_stats.conf] at time 10:39:50.537240
2018-02-06 10:39:50,537 [salt.state       ][INFO    ][16243] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_stats_stats.conf
2018-02-06 10:39:50,538 [salt.state       ][INFO    ][16243] {'new': '/etc/nginx/sites-enabled/nginx_stats_stats.conf'}
2018-02-06 10:39:50,539 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-enabled/nginx_stats_stats.conf] at time 10:39:50.539237 duration_in_ms=1.997
2018-02-06 10:39:50,540 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_ceilometer.conf] at time 10:39:50.540094
2018-02-06 10:39:50,540 [salt.state       ][INFO    ][16243] Executing state file.managed for /etc/nginx/sites-available/nginx_proxy_openstack_api_ceilometer.conf
2018-02-06 10:39:50,678 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:39:50,679 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_ceilometer.conf] at time 10:39:50.679286 duration_in_ms=139.191
2018-02-06 10:39:50,679 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_ceilometer.conf] at time 10:39:50.679806
2018-02-06 10:39:50,680 [salt.state       ][INFO    ][16243] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_proxy_openstack_api_ceilometer.conf
2018-02-06 10:39:50,682 [salt.state       ][INFO    ][16243] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_ceilometer.conf'}
2018-02-06 10:39:50,682 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_ceilometer.conf] at time 10:39:50.682698 duration_in_ms=2.891
2018-02-06 10:39:50,683 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat_cloudwatch.conf] at time 10:39:50.683938
2018-02-06 10:39:50,684 [salt.state       ][INFO    ][16243] Executing state file.managed for /etc/nginx/sites-available/nginx_proxy_openstack_api_heat_cloudwatch.conf
2018-02-06 10:39:50,830 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:39:50,830 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat_cloudwatch.conf] at time 10:39:50.830695 duration_in_ms=146.756
2018-02-06 10:39:50,831 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cloudwatch.conf] at time 10:39:50.831154
2018-02-06 10:39:50,831 [salt.state       ][INFO    ][16243] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cloudwatch.conf
2018-02-06 10:39:50,833 [salt.state       ][INFO    ][16243] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cloudwatch.conf'}
2018-02-06 10:39:50,833 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cloudwatch.conf] at time 10:39:50.833884 duration_in_ms=2.73
2018-02-06 10:39:50,834 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-enabled/default] at time 10:39:50.834844
2018-02-06 10:39:50,835 [salt.state       ][INFO    ][16243] Executing state file.absent for /etc/nginx/sites-enabled/default
2018-02-06 10:39:50,836 [salt.state       ][INFO    ][16243] {'removed': '/etc/nginx/sites-enabled/default'}
2018-02-06 10:39:50,836 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-enabled/default] at time 10:39:50.836407 duration_in_ms=1.563
2018-02-06 10:39:50,837 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/sites-available/default] at time 10:39:50.837370
2018-02-06 10:39:50,837 [salt.state       ][INFO    ][16243] Executing state file.absent for /etc/nginx/sites-available/default
2018-02-06 10:39:50,838 [salt.state       ][INFO    ][16243] {'removed': '/etc/nginx/sites-available/default'}
2018-02-06 10:39:50,838 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/sites-available/default] at time 10:39:50.838576 duration_in_ms=1.205
2018-02-06 10:39:50,839 [salt.state       ][INFO    ][16243] Running state [/etc/nginx/nginx.conf] at time 10:39:50.839526
2018-02-06 10:39:50,839 [salt.state       ][INFO    ][16243] Executing state file.managed for /etc/nginx/nginx.conf
2018-02-06 10:39:50,864 [salt.fileclient  ][INFO    ][16243] Fetching file from saltenv 'base', ** done ** 'nginx/files/nginx.conf'
2018-02-06 10:39:50,904 [salt.state       ][INFO    ][16243] File changed:
--- 
+++ 
@@ -1,85 +1,100 @@
 user www-data;
 worker_processes auto;
+worker_rlimit_nofile 20000;
 pid /run/nginx.pid;
 
+
 events {
-	worker_connections 768;
-	# multi_accept on;
+        worker_connections 1024;
+        # multi_accept on;
 }
 
 http {
 
-	##
-	# Basic Settings
-	##
+        ##
+        # Basic Settings
+        ##
 
-	sendfile on;
-	tcp_nopush on;
-	tcp_nodelay on;
-	keepalive_timeout 65;
-	types_hash_max_size 2048;
-	# server_tokens off;
+        sendfile on;
+        tcp_nopush on;
+        tcp_nodelay on;
+        keepalive_timeout 65;
+        types_hash_max_size 2048;
+        server_tokens off;
 
-	# server_names_hash_bucket_size 64;
-	# server_name_in_redirect off;
+        server_names_hash_bucket_size 128;
+        # server_name_in_redirect off;
 
-	include /etc/nginx/mime.types;
-	default_type application/octet-stream;
+        include /etc/nginx/mime.types;
+        default_type application/octet-stream;
 
-	##
-	# SSL Settings
-	##
+        ##
+        # Logging Settings
+        ##
 
-	ssl_protocols TLSv1 TLSv1.1 TLSv1.2; # Dropping SSLv3, ref: POODLE
-	ssl_prefer_server_ciphers on;
+        access_log /var/log/nginx/access.log;
+        error_log /var/log/nginx/error.log;
 
-	##
-	# Logging Settings
-	##
+        ##
+        # Gzip Settings
+        ##
 
-	access_log /var/log/nginx/access.log;
-	error_log /var/log/nginx/error.log;
+        gzip on;
+        gzip_disable "msie6";
 
-	##
-	# Gzip Settings
-	##
+        # gzip_vary on;
+        # gzip_proxied any;
+        # gzip_comp_level 6;
+        # gzip_buffers 16 8k;
+        # gzip_http_version 1.1;
+        # gzip_types text/plain text/css application/json application/x-javascript text/xml application/xml application/xml+rss text/javascript;
 
-	gzip on;
-	gzip_disable "msie6";
+        ##
+        # nginx-naxsi config
+        ##
+        # Uncomment it if you installed nginx-naxsi
+        ##
 
-	# gzip_vary on;
-	# gzip_proxied any;
-	# gzip_comp_level 6;
-	# gzip_buffers 16 8k;
-	# gzip_http_version 1.1;
-	# gzip_types text/plain text/css application/json application/javascript text/xml application/xml application/xml+rss text/javascript;
+        #include /etc/nginx/naxsi_core.rules;
 
-	##
-	# Virtual Host Configs
-	##
+        ##
+        # nginx-passenger config
+        ##
+        # Uncomment it if you installed nginx-passenger
+        ##
 
-	include /etc/nginx/conf.d/*.conf;
-	include /etc/nginx/sites-enabled/*;
+        #passenger_root /usr;
+        #passenger_ruby /usr/bin/ruby;
+
+
+
+        ##
+        # Virtual Host Configs
+        ##
+
+        include /etc/nginx/conf.d/*.conf;
+        include /etc/nginx/sites-enabled/*.conf;
 }
 
 
+
 #mail {
-#	# See sample authentication script at:
-#	# http://wiki.nginx.org/ImapAuthenticateWithApachePhpScript
-# 
-#	# auth_http localhost/auth.php;
-#	# pop3_capabilities "TOP" "USER";
-#	# imap_capabilities "IMAP4rev1" "UIDPLUS";
-# 
-#	server {
-#		listen     localhost:110;
-#		protocol   pop3;
-#		proxy      on;
-#	}
-# 
-#	server {
-#		listen     localhost:143;
-#		protocol   imap;
-#		proxy      on;
-#	}
+#       # See sample authentication script at:
+#       # http://wiki.nginx.org/ImapAuthenticateWithApachePhpScript
+#
+#       # auth_http localhost/auth.php;
+#       # pop3_capabilities "TOP" "USER";
+#       # imap_capabilities "IMAP4rev1" "UIDPLUS";
+#
+#       server {
+#               listen     localhost:110;
+#               protocol   pop3;
+#               proxy      on;
+#       }
+#
+#       server {
+#               listen     localhost:143;
+#               protocol   imap;
+#               proxy      on;
+#       }
 #}

2018-02-06 10:39:50,905 [salt.state       ][INFO    ][16243] Completed state [/etc/nginx/nginx.conf] at time 10:39:50.905357 duration_in_ms=65.831
2018-02-06 10:39:50,906 [salt.state       ][INFO    ][16243] Running state [/etc/ssl/private] at time 10:39:50.906122
2018-02-06 10:39:50,906 [salt.state       ][INFO    ][16243] Executing state file.directory for /etc/ssl/private
2018-02-06 10:39:50,906 [salt.state       ][INFO    ][16243] Directory /etc/ssl/private is in the correct state
2018-02-06 10:39:50,907 [salt.state       ][INFO    ][16243] Completed state [/etc/ssl/private] at time 10:39:50.907129 duration_in_ms=1.007
2018-02-06 10:39:50,932 [salt.state       ][INFO    ][16243] Running state [openssl dhparam -out /etc/ssl/dhparams.pem 2048] at time 10:39:50.932320
2018-02-06 10:39:50,932 [salt.state       ][INFO    ][16243] Executing state cmd.run for openssl dhparam -out /etc/ssl/dhparams.pem 2048
2018-02-06 10:39:50,933 [salt.loaded.int.module.cmdmod][INFO    ][16243] Executing command 'openssl dhparam -out /etc/ssl/dhparams.pem 2048' in directory '/root'
2018-02-06 10:39:53,157 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command saltutil.find_job with jid 20180206103953132212
2018-02-06 10:39:53,178 [salt.minion      ][INFO    ][17026] Starting a new job with PID 17026
2018-02-06 10:39:53,193 [salt.minion      ][INFO    ][17026] Returning information for job: 20180206103953132212
2018-02-06 10:40:03,191 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command saltutil.find_job with jid 20180206104003168636
2018-02-06 10:40:03,213 [salt.minion      ][INFO    ][17031] Starting a new job with PID 17031
2018-02-06 10:40:03,229 [salt.minion      ][INFO    ][17031] Returning information for job: 20180206104003168636
2018-02-06 10:40:13,212 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command saltutil.find_job with jid 20180206104013194818
2018-02-06 10:40:13,235 [salt.minion      ][INFO    ][17036] Starting a new job with PID 17036
2018-02-06 10:40:13,250 [salt.minion      ][INFO    ][17036] Returning information for job: 20180206104013194818
2018-02-06 10:40:15,994 [salt.state       ][INFO    ][16243] {'pid': 17022, 'retcode': 0, 'stderr': "Generating DH parameters, 2048 bit long safe prime, generator 2\nThis is going to take a long time\n.......................................................................................................................................................................................................+....................+.........................................+..................................................................+..........+............................+...........................................................+.......................................................................................................................................................+...........................................+...........................................................................................................................................................................................................................................................................................+......................+..............................................................................................................................................................................................................+.......................................................................................................................................................................+..+...................................................................................................+..............................................................................................................................................................+............................................................................+...............+.......+...............................................................................................................................+
..................................................................................................................................................+.............+...................................................................+...............................................+..........................................+.........................................................................................................................................................................................................................................................................................+......................+.............................................................................................................................................................................................................+...............................................................+....................++*++*\nunable to write 'random state'", 'stdout': ''}
2018-02-06 10:40:16,001 [salt.state       ][INFO    ][16243] Completed state [openssl dhparam -out /etc/ssl/dhparams.pem 2048] at time 10:40:16.001725 duration_in_ms=25069.403
2018-02-06 10:40:16,013 [salt.state       ][INFO    ][16243] Running state [nginx] at time 10:40:16.013663
2018-02-06 10:40:16,014 [salt.state       ][INFO    ][16243] Executing state service.running for nginx
2018-02-06 10:40:16,015 [salt.loaded.int.module.cmdmod][INFO    ][16243] Executing command ['systemctl', 'status', 'nginx.service', '-n', '0'] in directory '/root'
2018-02-06 10:40:16,043 [salt.loaded.int.module.cmdmod][INFO    ][16243] Executing command ['systemctl', 'is-active', 'nginx.service'] in directory '/root'
2018-02-06 10:40:16,066 [salt.loaded.int.module.cmdmod][INFO    ][16243] Executing command ['systemctl', 'is-enabled', 'nginx.service'] in directory '/root'
2018-02-06 10:40:16,089 [salt.state       ][INFO    ][16243] The service nginx is already running
2018-02-06 10:40:16,090 [salt.state       ][INFO    ][16243] Completed state [nginx] at time 10:40:16.090194 duration_in_ms=76.53
2018-02-06 10:40:16,091 [salt.state       ][INFO    ][16243] Running state [nginx] at time 10:40:16.091101
2018-02-06 10:40:16,092 [salt.state       ][INFO    ][16243] Executing state service.mod_watch for nginx
2018-02-06 10:40:16,093 [salt.loaded.int.module.cmdmod][INFO    ][16243] Executing command ['systemctl', 'is-active', 'nginx.service'] in directory '/root'
2018-02-06 10:40:16,114 [salt.loaded.int.module.cmdmod][INFO    ][16243] Executing command ['systemctl', 'is-enabled', 'nginx.service'] in directory '/root'
2018-02-06 10:40:16,136 [salt.loaded.int.module.cmdmod][INFO    ][16243] Executing command ['systemd-run', '--scope', 'systemctl', 'restart', 'nginx.service'] in directory '/root'
2018-02-06 10:40:16,285 [salt.state       ][INFO    ][16243] {'nginx': True}
2018-02-06 10:40:16,285 [salt.state       ][INFO    ][16243] Completed state [nginx] at time 10:40:16.285858 duration_in_ms=194.757
2018-02-06 10:40:16,289 [salt.state       ][INFO    ][16243] Running state [root] at time 10:40:16.289846
2018-02-06 10:40:16,290 [salt.state       ][INFO    ][16243] Executing state user.present for root
2018-02-06 10:40:16,292 [salt.state       ][INFO    ][16243] User root is present and up to date
2018-02-06 10:40:16,293 [salt.state       ][INFO    ][16243] Completed state [root] at time 10:40:16.293326 duration_in_ms=3.48
2018-02-06 10:40:16,294 [salt.state       ][INFO    ][16243] Running state [/root] at time 10:40:16.294934
2018-02-06 10:40:16,296 [salt.state       ][INFO    ][16243] Executing state file.directory for /root
2018-02-06 10:40:16,297 [salt.state       ][INFO    ][16243] Directory /root is in the correct state
2018-02-06 10:40:16,298 [salt.state       ][INFO    ][16243] Completed state [/root] at time 10:40:16.298316 duration_in_ms=3.382
2018-02-06 10:40:16,298 [salt.state       ][INFO    ][16243] Running state [/etc/sudoers.d/90-salt-user-root] at time 10:40:16.298796
2018-02-06 10:40:16,299 [salt.state       ][INFO    ][16243] Executing state file.absent for /etc/sudoers.d/90-salt-user-root
2018-02-06 10:40:16,300 [salt.state       ][INFO    ][16243] File /etc/sudoers.d/90-salt-user-root is not present
2018-02-06 10:40:16,300 [salt.state       ][INFO    ][16243] Completed state [/etc/sudoers.d/90-salt-user-root] at time 10:40:16.300815 duration_in_ms=2.019
2018-02-06 10:40:16,301 [salt.state       ][INFO    ][16243] Running state [ubuntu] at time 10:40:16.301331
2018-02-06 10:40:16,301 [salt.state       ][INFO    ][16243] Executing state user.present for ubuntu
2018-02-06 10:40:16,303 [salt.state       ][INFO    ][16243] User ubuntu is present and up to date
2018-02-06 10:40:16,304 [salt.state       ][INFO    ][16243] Completed state [ubuntu] at time 10:40:16.304119 duration_in_ms=2.788
2018-02-06 10:40:16,305 [salt.state       ][INFO    ][16243] Running state [/home/ubuntu] at time 10:40:16.305319
2018-02-06 10:40:16,305 [salt.state       ][INFO    ][16243] Executing state file.directory for /home/ubuntu
2018-02-06 10:40:16,306 [salt.state       ][INFO    ][16243] Directory /home/ubuntu is in the correct state
2018-02-06 10:40:16,307 [salt.state       ][INFO    ][16243] Completed state [/home/ubuntu] at time 10:40:16.307198 duration_in_ms=1.879
2018-02-06 10:40:16,308 [salt.state       ][INFO    ][16243] Running state [/etc/sudoers.d/90-salt-user-ubuntu] at time 10:40:16.308847
2018-02-06 10:40:16,309 [salt.state       ][INFO    ][16243] Executing state file.managed for /etc/sudoers.d/90-salt-user-ubuntu
2018-02-06 10:40:16,337 [salt.state       ][INFO    ][16243] File /etc/sudoers.d/90-salt-user-ubuntu is in the correct state
2018-02-06 10:40:16,338 [salt.state       ][INFO    ][16243] Completed state [/etc/sudoers.d/90-salt-user-ubuntu] at time 10:40:16.338084 duration_in_ms=29.236
2018-02-06 10:40:16,338 [salt.state       ][INFO    ][16243] Running state [git-core] at time 10:40:16.338789
2018-02-06 10:40:16,339 [salt.state       ][INFO    ][16243] Executing state pkg.installed for git-core
2018-02-06 10:40:16,349 [salt.state       ][INFO    ][16243] All specified packages are already installed
2018-02-06 10:40:16,349 [salt.state       ][INFO    ][16243] Completed state [git-core] at time 10:40:16.349939 duration_in_ms=11.15
2018-02-06 10:40:16,350 [salt.state       ][INFO    ][16243] Running state [python-sphinx] at time 10:40:16.350412
2018-02-06 10:40:16,350 [salt.state       ][INFO    ][16243] Executing state pkg.installed for python-sphinx
2018-02-06 10:40:16,371 [salt.loaded.int.module.cmdmod][INFO    ][16243] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-02-06 10:40:16,414 [salt.loaded.int.module.cmdmod][INFO    ][16243] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'python-sphinx'] in directory '/root'
2018-02-06 10:40:19,741 [salt.loaded.int.module.cmdmod][INFO    ][16243] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-02-06 10:40:19,816 [salt.state       ][INFO    ][16243] Made the following changes:
'python-sphinx' changed from 'absent' to '1.5.6-2~cloud0'
'python-imagesize' changed from 'absent' to '0.7.1-1~cloud0'
'python-alabaster' changed from 'absent' to '0.7.7-1'
'sphinx-common' changed from 'absent' to '1.5.6-2~cloud0'
'python2.7-alabaster' changed from 'absent' to '1'

2018-02-06 10:40:19,847 [salt.state       ][INFO    ][16243] Loading fresh modules for state activity
2018-02-06 10:40:19,889 [salt.state       ][INFO    ][16243] Completed state [python-sphinx] at time 10:40:19.889271 duration_in_ms=3538.858
2018-02-06 10:40:19,892 [salt.state       ][INFO    ][16243] Running state [/srv/static/sites] at time 10:40:19.891974
2018-02-06 10:40:19,892 [salt.state       ][INFO    ][16243] Executing state file.directory for /srv/static/sites
2018-02-06 10:40:19,896 [salt.state       ][INFO    ][16243] {'/srv/static/sites': 'New Dir'}
2018-02-06 10:40:19,896 [salt.state       ][INFO    ][16243] Completed state [/srv/static/sites] at time 10:40:19.896663 duration_in_ms=4.689
2018-02-06 10:40:19,900 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern] at time 10:40:19.896893
2018-02-06 10:40:19,901 [salt.state       ][INFO    ][16243] Executing state file.directory for /srv/static/extern
2018-02-06 10:40:19,901 [salt.state       ][INFO    ][16243] {'/srv/static/extern': 'New Dir'}
2018-02-06 10:40:19,902 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern] at time 10:40:19.902092 duration_in_ms=5.199
2018-02-06 10:40:19,903 [salt.state       ][INFO    ][16243] Running state [/srv/static/sites/reclass_doc] at time 10:40:19.903734
2018-02-06 10:40:19,904 [salt.state       ][INFO    ][16243] Executing state file.directory for /srv/static/sites/reclass_doc
2018-02-06 10:40:19,904 [salt.state       ][INFO    ][16243] {'/srv/static/sites/reclass_doc': 'New Dir'}
2018-02-06 10:40:19,908 [salt.state       ][INFO    ][16243] Completed state [/srv/static/sites/reclass_doc] at time 10:40:19.904919 duration_in_ms=1.185
2018-02-06 10:40:19,909 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern/salt/source/_static] at time 10:40:19.909207
2018-02-06 10:40:19,909 [salt.state       ][INFO    ][16243] Executing state file.directory for /srv/static/extern/salt/source/_static
2018-02-06 10:40:19,911 [salt.state       ][INFO    ][16243] {'/srv/static/extern/salt/source/_static': 'New Dir'}
2018-02-06 10:40:19,911 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern/salt/source/_static] at time 10:40:19.911521 duration_in_ms=2.314
2018-02-06 10:40:19,911 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern/salt/source/services] at time 10:40:19.911751
2018-02-06 10:40:19,912 [salt.state       ][INFO    ][16243] Executing state file.directory for /srv/static/extern/salt/source/services
2018-02-06 10:40:19,913 [salt.state       ][INFO    ][16243] {'/srv/static/extern/salt/source/services': 'New Dir'}
2018-02-06 10:40:19,913 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern/salt/source/services] at time 10:40:19.913201 duration_in_ms=1.45
2018-02-06 10:40:19,913 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern/salt/source/nodes] at time 10:40:19.913424
2018-02-06 10:40:19,913 [salt.state       ][INFO    ][16243] Executing state file.directory for /srv/static/extern/salt/source/nodes
2018-02-06 10:40:19,914 [salt.state       ][INFO    ][16243] {'/srv/static/extern/salt/source/nodes': 'New Dir'}
2018-02-06 10:40:19,914 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern/salt/source/nodes] at time 10:40:19.914893 duration_in_ms=1.469
2018-02-06 10:40:19,915 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern/salt/source/devices] at time 10:40:19.915127
2018-02-06 10:40:19,915 [salt.state       ][INFO    ][16243] Executing state file.directory for /srv/static/extern/salt/source/devices
2018-02-06 10:40:19,916 [salt.state       ][INFO    ][16243] {'/srv/static/extern/salt/source/devices': 'New Dir'}
2018-02-06 10:40:19,916 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern/salt/source/devices] at time 10:40:19.916579 duration_in_ms=1.452
2018-02-06 10:40:19,917 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern/salt/Makefile] at time 10:40:19.917606
2018-02-06 10:40:19,917 [salt.state       ][INFO    ][16243] Executing state file.managed for /srv/static/extern/salt/Makefile
2018-02-06 10:40:19,949 [salt.fileclient  ][INFO    ][16243] Fetching file from saltenv 'base', ** done ** 'sphinx/files/salt/Makefile'
2018-02-06 10:40:19,951 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:40:19,952 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern/salt/Makefile] at time 10:40:19.951957 duration_in_ms=34.351
2018-02-06 10:40:19,952 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern/salt/source/conf.py] at time 10:40:19.952793
2018-02-06 10:40:19,953 [salt.state       ][INFO    ][16243] Executing state file.managed for /srv/static/extern/salt/source/conf.py
2018-02-06 10:40:19,984 [salt.fileclient  ][INFO    ][16243] Fetching file from saltenv 'base', ** done ** 'sphinx/files/salt/source/conf.py'
2018-02-06 10:40:19,996 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:40:20,019 [salt.state       ][INFO    ][16243] Loading fresh modules for state activity
2018-02-06 10:40:20,055 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern/salt/source/conf.py] at time 10:40:20.055112 duration_in_ms=102.318
2018-02-06 10:40:20,058 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern/salt/source/index.rst] at time 10:40:20.058121
2018-02-06 10:40:20,058 [salt.state       ][INFO    ][16243] Executing state file.managed for /srv/static/extern/salt/source/index.rst
2018-02-06 10:40:20,093 [salt.fileclient  ][INFO    ][16243] Fetching file from saltenv 'base', ** done ** 'sphinx/files/salt/source/index.rst'
2018-02-06 10:40:20,106 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:40:20,106 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern/salt/source/index.rst] at time 10:40:20.106788 duration_in_ms=48.667
2018-02-06 10:40:20,107 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern/salt/source/services/index.rst] at time 10:40:20.107873
2018-02-06 10:40:20,108 [salt.state       ][INFO    ][16243] Executing state file.managed for /srv/static/extern/salt/source/services/index.rst
2018-02-06 10:40:20,133 [salt.fileclient  ][INFO    ][16243] Fetching file from saltenv 'base', ** done ** 'sphinx/files/salt/source/services/index.rst'
2018-02-06 10:40:20,137 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:40:20,138 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern/salt/source/services/index.rst] at time 10:40:20.138287 duration_in_ms=30.414
2018-02-06 10:40:20,140 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern/salt/source/services/monitoring.rst] at time 10:40:20.140066
2018-02-06 10:40:20,140 [salt.state       ][INFO    ][16243] Executing state file.managed for /srv/static/extern/salt/source/services/monitoring.rst
2018-02-06 10:40:20,166 [salt.fileclient  ][INFO    ][16243] Fetching file from saltenv 'base', ** done ** 'sphinx/files/salt/source/services/monitoring.rst'
2018-02-06 10:40:20,236 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:40:20,237 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern/salt/source/services/monitoring.rst] at time 10:40:20.237635 duration_in_ms=97.567
2018-02-06 10:40:20,238 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern/salt/source/services/endpoints.rst] at time 10:40:20.238724
2018-02-06 10:40:20,239 [salt.state       ][INFO    ][16243] Executing state file.managed for /srv/static/extern/salt/source/services/endpoints.rst
2018-02-06 10:40:20,260 [salt.fileclient  ][INFO    ][16243] Fetching file from saltenv 'base', ** done ** 'sphinx/files/salt/source/services/endpoints.rst'
2018-02-06 10:40:20,325 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:40:20,326 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern/salt/source/services/endpoints.rst] at time 10:40:20.325976 duration_in_ms=87.252
2018-02-06 10:40:20,327 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern/salt/source/services/catalog.rst] at time 10:40:20.327037
2018-02-06 10:40:20,327 [salt.state       ][INFO    ][16243] Executing state file.managed for /srv/static/extern/salt/source/services/catalog.rst
2018-02-06 10:40:20,347 [salt.fileclient  ][INFO    ][16243] Fetching file from saltenv 'base', ** done ** 'sphinx/files/salt/source/services/catalog.rst'
2018-02-06 10:40:20,394 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:40:20,394 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern/salt/source/services/catalog.rst] at time 10:40:20.394670 duration_in_ms=67.633
2018-02-06 10:40:20,395 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern/salt/source/nodes/index.rst] at time 10:40:20.395699
2018-02-06 10:40:20,396 [salt.state       ][INFO    ][16243] Executing state file.managed for /srv/static/extern/salt/source/nodes/index.rst
2018-02-06 10:40:20,415 [salt.fileclient  ][INFO    ][16243] Fetching file from saltenv 'base', ** done ** 'sphinx/files/salt/source/nodes/index.rst'
2018-02-06 10:40:20,466 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:40:20,467 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern/salt/source/nodes/index.rst] at time 10:40:20.466986 duration_in_ms=71.286
2018-02-06 10:40:20,468 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern/salt/source/nodes/ctl03.mcp-pike-ovs-ha.local.rst] at time 10:40:20.467982
2018-02-06 10:40:20,468 [salt.state       ][INFO    ][16243] Executing state file.managed for /srv/static/extern/salt/source/nodes/ctl03.mcp-pike-ovs-ha.local.rst
2018-02-06 10:40:20,486 [salt.fileclient  ][INFO    ][16243] Fetching file from saltenv 'base', ** done ** 'sphinx/files/salt/source/nodes/node.rst'
2018-02-06 10:40:20,510 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:40:20,511 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern/salt/source/nodes/ctl03.mcp-pike-ovs-ha.local.rst] at time 10:40:20.511210 duration_in_ms=43.228
2018-02-06 10:40:20,512 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern/salt/source/nodes/dbs02.mcp-pike-ovs-ha.local.rst] at time 10:40:20.512621
2018-02-06 10:40:20,513 [salt.state       ][INFO    ][16243] Executing state file.managed for /srv/static/extern/salt/source/nodes/dbs02.mcp-pike-ovs-ha.local.rst
2018-02-06 10:40:20,547 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:40:20,547 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern/salt/source/nodes/dbs02.mcp-pike-ovs-ha.local.rst] at time 10:40:20.547722 duration_in_ms=35.1
2018-02-06 10:40:20,548 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern/salt/source/nodes/ctl01.mcp-pike-ovs-ha.local.rst] at time 10:40:20.548789
2018-02-06 10:40:20,549 [salt.state       ][INFO    ][16243] Executing state file.managed for /srv/static/extern/salt/source/nodes/ctl01.mcp-pike-ovs-ha.local.rst
2018-02-06 10:40:20,583 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:40:20,584 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern/salt/source/nodes/ctl01.mcp-pike-ovs-ha.local.rst] at time 10:40:20.584207 duration_in_ms=35.418
2018-02-06 10:40:20,585 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern/salt/source/nodes/msg03.mcp-pike-ovs-ha.local.rst] at time 10:40:20.585337
2018-02-06 10:40:20,585 [salt.state       ][INFO    ][16243] Executing state file.managed for /srv/static/extern/salt/source/nodes/msg03.mcp-pike-ovs-ha.local.rst
2018-02-06 10:40:20,621 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:40:20,622 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern/salt/source/nodes/msg03.mcp-pike-ovs-ha.local.rst] at time 10:40:20.622387 duration_in_ms=37.05
2018-02-06 10:40:20,623 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern/salt/source/nodes/prx01.mcp-pike-ovs-ha.local.rst] at time 10:40:20.623483
2018-02-06 10:40:20,624 [salt.state       ][INFO    ][16243] Executing state file.managed for /srv/static/extern/salt/source/nodes/prx01.mcp-pike-ovs-ha.local.rst
2018-02-06 10:40:20,658 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:40:20,659 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern/salt/source/nodes/prx01.mcp-pike-ovs-ha.local.rst] at time 10:40:20.659250 duration_in_ms=35.767
2018-02-06 10:40:20,660 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern/salt/source/nodes/mdb03.mcp-pike-ovs-ha.local.rst] at time 10:40:20.660865
2018-02-06 10:40:20,661 [salt.state       ][INFO    ][16243] Executing state file.managed for /srv/static/extern/salt/source/nodes/mdb03.mcp-pike-ovs-ha.local.rst
2018-02-06 10:40:20,694 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:40:20,694 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern/salt/source/nodes/mdb03.mcp-pike-ovs-ha.local.rst] at time 10:40:20.694939 duration_in_ms=34.074
2018-02-06 10:40:20,696 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern/salt/source/nodes/mdb02.mcp-pike-ovs-ha.local.rst] at time 10:40:20.696007
2018-02-06 10:40:20,696 [salt.state       ][INFO    ][16243] Executing state file.managed for /srv/static/extern/salt/source/nodes/mdb02.mcp-pike-ovs-ha.local.rst
2018-02-06 10:40:20,729 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:40:20,729 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern/salt/source/nodes/mdb02.mcp-pike-ovs-ha.local.rst] at time 10:40:20.729902 duration_in_ms=33.895
2018-02-06 10:40:20,731 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern/salt/source/nodes/ctl02.mcp-pike-ovs-ha.local.rst] at time 10:40:20.731004
2018-02-06 10:40:20,731 [salt.state       ][INFO    ][16243] Executing state file.managed for /srv/static/extern/salt/source/nodes/ctl02.mcp-pike-ovs-ha.local.rst
2018-02-06 10:40:20,767 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:40:20,767 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern/salt/source/nodes/ctl02.mcp-pike-ovs-ha.local.rst] at time 10:40:20.767844 duration_in_ms=36.84
2018-02-06 10:40:20,768 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern/salt/source/nodes/kvm03.mcp-pike-ovs-ha.local.rst] at time 10:40:20.768876
2018-02-06 10:40:20,769 [salt.state       ][INFO    ][16243] Executing state file.managed for /srv/static/extern/salt/source/nodes/kvm03.mcp-pike-ovs-ha.local.rst
2018-02-06 10:40:20,803 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:40:20,804 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern/salt/source/nodes/kvm03.mcp-pike-ovs-ha.local.rst] at time 10:40:20.804261 duration_in_ms=35.385
2018-02-06 10:40:20,805 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern/salt/source/nodes/msg01.mcp-pike-ovs-ha.local.rst] at time 10:40:20.805373
2018-02-06 10:40:20,805 [salt.state       ][INFO    ][16243] Executing state file.managed for /srv/static/extern/salt/source/nodes/msg01.mcp-pike-ovs-ha.local.rst
2018-02-06 10:40:20,843 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:40:20,844 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern/salt/source/nodes/msg01.mcp-pike-ovs-ha.local.rst] at time 10:40:20.844588 duration_in_ms=39.214
2018-02-06 10:40:20,845 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern/salt/source/nodes/kvm01.mcp-pike-ovs-ha.local.rst] at time 10:40:20.845699
2018-02-06 10:40:20,846 [salt.state       ][INFO    ][16243] Executing state file.managed for /srv/static/extern/salt/source/nodes/kvm01.mcp-pike-ovs-ha.local.rst
2018-02-06 10:40:20,883 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:40:20,883 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern/salt/source/nodes/kvm01.mcp-pike-ovs-ha.local.rst] at time 10:40:20.883902 duration_in_ms=38.202
2018-02-06 10:40:20,885 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern/salt/source/nodes/dbs01.mcp-pike-ovs-ha.local.rst] at time 10:40:20.884996
2018-02-06 10:40:20,885 [salt.state       ][INFO    ][16243] Executing state file.managed for /srv/static/extern/salt/source/nodes/dbs01.mcp-pike-ovs-ha.local.rst
2018-02-06 10:40:20,923 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:40:20,923 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern/salt/source/nodes/dbs01.mcp-pike-ovs-ha.local.rst] at time 10:40:20.923644 duration_in_ms=38.648
2018-02-06 10:40:20,924 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern/salt/source/nodes/kvm02.mcp-pike-ovs-ha.local.rst] at time 10:40:20.924648
2018-02-06 10:40:20,925 [salt.state       ][INFO    ][16243] Executing state file.managed for /srv/static/extern/salt/source/nodes/kvm02.mcp-pike-ovs-ha.local.rst
2018-02-06 10:40:20,955 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:40:20,956 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern/salt/source/nodes/kvm02.mcp-pike-ovs-ha.local.rst] at time 10:40:20.956016 duration_in_ms=31.368
2018-02-06 10:40:20,957 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern/salt/source/nodes/cmp001.mcp-pike-ovs-ha.local.rst] at time 10:40:20.957021
2018-02-06 10:40:20,957 [salt.state       ][INFO    ][16243] Executing state file.managed for /srv/static/extern/salt/source/nodes/cmp001.mcp-pike-ovs-ha.local.rst
2018-02-06 10:40:20,992 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:40:20,993 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern/salt/source/nodes/cmp001.mcp-pike-ovs-ha.local.rst] at time 10:40:20.992961 duration_in_ms=35.939
2018-02-06 10:40:20,993 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern/salt/source/nodes/msg02.mcp-pike-ovs-ha.local.rst] at time 10:40:20.993943
2018-02-06 10:40:20,994 [salt.state       ][INFO    ][16243] Executing state file.managed for /srv/static/extern/salt/source/nodes/msg02.mcp-pike-ovs-ha.local.rst
2018-02-06 10:40:21,026 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:40:21,028 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern/salt/source/nodes/msg02.mcp-pike-ovs-ha.local.rst] at time 10:40:21.028161 duration_in_ms=34.218
2018-02-06 10:40:21,029 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern/salt/source/nodes/mas01.mcp-pike-ovs-ha.local.rst] at time 10:40:21.029171
2018-02-06 10:40:21,029 [salt.state       ][INFO    ][16243] Executing state file.managed for /srv/static/extern/salt/source/nodes/mas01.mcp-pike-ovs-ha.local.rst
2018-02-06 10:40:21,063 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:40:21,064 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern/salt/source/nodes/mas01.mcp-pike-ovs-ha.local.rst] at time 10:40:21.064571 duration_in_ms=35.4
2018-02-06 10:40:21,065 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern/salt/source/nodes/cfg01.mcp-pike-ovs-ha.local.rst] at time 10:40:21.065569
2018-02-06 10:40:21,066 [salt.state       ][INFO    ][16243] Executing state file.managed for /srv/static/extern/salt/source/nodes/cfg01.mcp-pike-ovs-ha.local.rst
2018-02-06 10:40:21,106 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:40:21,107 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern/salt/source/nodes/cfg01.mcp-pike-ovs-ha.local.rst] at time 10:40:21.106989 duration_in_ms=41.42
2018-02-06 10:40:21,108 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern/salt/source/nodes/mdb01.mcp-pike-ovs-ha.local.rst] at time 10:40:21.108083
2018-02-06 10:40:21,108 [salt.state       ][INFO    ][16243] Executing state file.managed for /srv/static/extern/salt/source/nodes/mdb01.mcp-pike-ovs-ha.local.rst
2018-02-06 10:40:21,147 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:40:21,148 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern/salt/source/nodes/mdb01.mcp-pike-ovs-ha.local.rst] at time 10:40:21.147983 duration_in_ms=39.9
2018-02-06 10:40:21,149 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern/salt/source/nodes/prx02.mcp-pike-ovs-ha.local.rst] at time 10:40:21.149046
2018-02-06 10:40:21,149 [salt.state       ][INFO    ][16243] Executing state file.managed for /srv/static/extern/salt/source/nodes/prx02.mcp-pike-ovs-ha.local.rst
2018-02-06 10:40:21,185 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:40:21,185 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern/salt/source/nodes/prx02.mcp-pike-ovs-ha.local.rst] at time 10:40:21.185822 duration_in_ms=36.776
2018-02-06 10:40:21,186 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern/salt/source/nodes/dbs03.mcp-pike-ovs-ha.local.rst] at time 10:40:21.186810
2018-02-06 10:40:21,187 [salt.state       ][INFO    ][16243] Executing state file.managed for /srv/static/extern/salt/source/nodes/dbs03.mcp-pike-ovs-ha.local.rst
2018-02-06 10:40:21,219 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:40:21,220 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern/salt/source/nodes/dbs03.mcp-pike-ovs-ha.local.rst] at time 10:40:21.220549 duration_in_ms=33.739
2018-02-06 10:40:21,221 [salt.state       ][INFO    ][16243] Running state [/srv/static/extern/salt/source/nodes/cmp002.mcp-pike-ovs-ha.local.rst] at time 10:40:21.221571
2018-02-06 10:40:21,222 [salt.state       ][INFO    ][16243] Executing state file.managed for /srv/static/extern/salt/source/nodes/cmp002.mcp-pike-ovs-ha.local.rst
2018-02-06 10:40:21,259 [salt.state       ][INFO    ][16243] File changed:
New file
2018-02-06 10:40:21,259 [salt.state       ][INFO    ][16243] Completed state [/srv/static/extern/salt/source/nodes/cmp002.mcp-pike-ovs-ha.local.rst] at time 10:40:21.259952 duration_in_ms=38.381
2018-02-06 10:40:21,261 [salt.state       ][INFO    ][16243] Running state [sphinx-build -b singlehtml /srv/static/extern/salt/source /srv/static/sites/reclass_doc] at time 10:40:21.261791
2018-02-06 10:40:21,262 [salt.state       ][INFO    ][16243] Executing state cmd.run for sphinx-build -b singlehtml /srv/static/extern/salt/source /srv/static/sites/reclass_doc
2018-02-06 10:40:21,265 [salt.loaded.int.module.cmdmod][INFO    ][16243] Executing command 'sphinx-build -b singlehtml /srv/static/extern/salt/source /srv/static/sites/reclass_doc' in directory '/root'
2018-02-06 10:40:23,240 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command saltutil.find_job with jid 20180206104023219838
2018-02-06 10:40:23,273 [salt.minion      ][INFO    ][17438] Starting a new job with PID 17438
2018-02-06 10:40:23,289 [salt.minion      ][INFO    ][17438] Returning information for job: 20180206104023219838
2018-02-06 10:40:26,613 [salt.state       ][INFO    ][16243] {'pid': 17431, 'retcode': 0, 'stderr': '/srv/static/extern/salt/source/nodes/cmp001.mcp-pike-ovs-ha.local.rst:142: ERROR: Insufficient data supplied (1 row(s)); no data remaining for table body, required by "list-table" directive.\n\n.. list-table::\n   :widths: 15 15 70\n   :header-rows: 1\n\n   *  - **Service Role**\n      - **Parameter**\n      - **Value**\n/srv/static/extern/salt/source/nodes/cmp002.mcp-pike-ovs-ha.local.rst:142: ERROR: Insufficient data supplied (1 row(s)); no data remaining for table body, required by "list-table" directive.\n\n.. list-table::\n   :widths: 15 15 70\n   :header-rows: 1\n\n   *  - **Service Role**\n      - **Parameter**\n      - **Value**\n/srv/static/extern/salt/source/nodes/dbs01.mcp-pike-ovs-ha.local.rst:12: ERROR: Insufficient data supplied (1 row(s)); no data remaining for table body, required by "list-table" directive.\n\n.. list-table::\n   :widths: 15 15 70\n   :header-rows: 1\n\n   *  - **Service Role**\n      - **Parameter**\n      - **Value**\n/srv/static/extern/salt/source/nodes/dbs02.mcp-pike-ovs-ha.local.rst:12: ERROR: Insufficient data supplied (1 row(s)); no data remaining for table body, required by "list-table" directive.\n\n.. list-table::\n   :widths: 15 15 70\n   :header-rows: 1\n\n   *  - **Service Role**\n      - **Parameter**\n      - **Value**\n/srv/static/extern/salt/source/nodes/dbs03.mcp-pike-ovs-ha.local.rst:12: ERROR: Insufficient data supplied (1 row(s)); no data remaining for table body, required by "list-table" directive.\n\n.. list-table::\n   :widths: 15 15 70\n   :header-rows: 1\n\n   *  - **Service Role**\n      - **Parameter**\n      - **Value**\n/srv/static/extern/salt/source/services/catalog.rst:16: WARNING: Explicit markup ends without a blank line; unexpected unindent.\n/srv/static/extern/salt/source/services/monitoring.rst:48: ERROR: Insufficient data supplied (1 row(s)); no data remaining for table body, required by "list-table" directive.\n\n.. list-table::\n   :header-rows: 1\n\n   *  - **Node**\n      - **Alarm**\n      - **Trigger**\n      - **Metric**\n      - **Operator**\n      - **Threshold**\n      - **Function**\n      - **Duration**', 'stdout': 'Running Sphinx v1.5.6\nloading pickled environment... not yet created\nbuilding [mo]: targets for 0 po files that are out of date\nbuilding [singlehtml]: all documents\nupdating environment: 27 added, 0 changed, 0 removed\nreading sources... [  3%] index\nreading sources... [  7%] nodes/cfg01.mcp-pike-ovs-ha.local\nreading sources... [ 11%] nodes/cmp001.mcp-pike-ovs-ha.local\nreading sources... [ 14%] nodes/cmp002.mcp-pike-ovs-ha.local\nreading sources... [ 18%] nodes/ctl01.mcp-pike-ovs-ha.local\nreading sources... [ 22%] nodes/ctl02.mcp-pike-ovs-ha.local\nreading sources... [ 25%] nodes/ctl03.mcp-pike-ovs-ha.local\nreading sources... [ 29%] nodes/dbs01.mcp-pike-ovs-ha.local\nreading sources... [ 33%] nodes/dbs02.mcp-pike-ovs-ha.local\nreading sources... [ 37%] nodes/dbs03.mcp-pike-ovs-ha.local\nreading sources... [ 40%] nodes/index\nreading sources... [ 44%] nodes/kvm01.mcp-pike-ovs-ha.local\nreading sources... [ 48%] nodes/kvm02.mcp-pike-ovs-ha.local\nreading sources... [ 51%] nodes/kvm03.mcp-pike-ovs-ha.local\nreading sources... [ 55%] nodes/mas01.mcp-pike-ovs-ha.local\nreading sources... [ 59%] nodes/mdb01.mcp-pike-ovs-ha.local\nreading sources... [ 62%] nodes/mdb02.mcp-pike-ovs-ha.local\nreading sources... [ 66%] nodes/mdb03.mcp-pike-ovs-ha.local\nreading sources... [ 70%] nodes/msg01.mcp-pike-ovs-ha.local\nreading sources... [ 74%] nodes/msg02.mcp-pike-ovs-ha.local\nreading sources... [ 77%] nodes/msg03.mcp-pike-ovs-ha.local\nreading sources... [ 81%] nodes/prx01.mcp-pike-ovs-ha.local\nreading sources... [ 85%] nodes/prx02.mcp-pike-ovs-ha.local\nreading sources... [ 88%] services/catalog\nreading sources... [ 92%] services/endpoints\nreading sources... [ 96%] services/index\nreading sources... [100%] services/monitoring\n\nlooking for now-outdated files... none found\npickling environment... done\nchecking consistency... done\npreparing documents... done\nassembling single document... services/index services/endpoints services/catalog services/monitoring nodes/index nodes/cfg01.mcp-pike-ovs-ha.local nodes/cmp001.mcp-pike-ovs-ha.local nodes/cmp002.mcp-pike-ovs-ha.local nodes/ctl01.mcp-pike-ovs-ha.local nodes/ctl02.mcp-pike-ovs-ha.local nodes/ctl03.mcp-pike-ovs-ha.local nodes/dbs01.mcp-pike-ovs-ha.local nodes/dbs02.mcp-pike-ovs-ha.local nodes/dbs03.mcp-pike-ovs-ha.local nodes/kvm01.mcp-pike-ovs-ha.local nodes/kvm02.mcp-pike-ovs-ha.local nodes/kvm03.mcp-pike-ovs-ha.local nodes/mas01.mcp-pike-ovs-ha.local nodes/mdb01.mcp-pike-ovs-ha.local nodes/mdb02.mcp-pike-ovs-ha.local nodes/mdb03.mcp-pike-ovs-ha.local nodes/msg01.mcp-pike-ovs-ha.local nodes/msg02.mcp-pike-ovs-ha.local nodes/msg03.mcp-pike-ovs-ha.local nodes/prx01.mcp-pike-ovs-ha.local nodes/prx02.mcp-pike-ovs-ha.local \nwriting... done\nwriting additional files...\ncopying static files... done\ncopying extra files... done\ndumping object inventory... done\nbuild succeeded, 7 warnings.'}
2018-02-06 10:40:26,617 [salt.state       ][INFO    ][16243] Completed state [sphinx-build -b singlehtml /srv/static/extern/salt/source /srv/static/sites/reclass_doc] at time 10:40:26.616962 duration_in_ms=5355.17
2018-02-06 10:40:26,638 [salt.minion      ][INFO    ][16243] Returning information for job: 20180206103932801853
2018-02-06 10:41:17,253 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command file.symlink with jid 20180206104117236693
2018-02-06 10:41:17,277 [salt.minion      ][INFO    ][17445] Starting a new job with PID 17445
2018-02-06 10:41:17,291 [salt.minion      ][INFO    ][17445] Returning information for job: 20180206104117236693
2018-02-06 10:41:18,008 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command cmd.run with jid 20180206104117990261
2018-02-06 10:41:18,030 [salt.minion      ][INFO    ][17450] Starting a new job with PID 17450
2018-02-06 10:41:18,036 [salt.loader.192.168.11.2.int.module.cmdmod][INFO    ][17450] Executing command '/usr/share/openstack-dashboard/manage.py collectstatic --noinput' in directory '/root'
2018-02-06 10:41:19,830 [salt.minion      ][INFO    ][17450] Returning information for job: 20180206104117990261
2018-02-06 10:41:20,673 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command cmd.run with jid 20180206104120653091
2018-02-06 10:41:20,695 [salt.minion      ][INFO    ][17462] Starting a new job with PID 17462
2018-02-06 10:41:20,701 [salt.loader.192.168.11.2.int.module.cmdmod][INFO    ][17462] Executing command '/usr/share/openstack-dashboard/manage.py compress --force' in directory '/root'
2018-02-06 10:41:30,783 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command saltutil.find_job with jid 20180206104130763998
2018-02-06 10:41:30,808 [salt.minion      ][INFO    ][17472] Starting a new job with PID 17472
2018-02-06 10:41:30,822 [salt.minion      ][INFO    ][17472] Returning information for job: 20180206104130763998
2018-02-06 10:41:40,812 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command saltutil.find_job with jid 20180206104140787471
2018-02-06 10:41:40,838 [salt.minion      ][INFO    ][17477] Starting a new job with PID 17477
2018-02-06 10:41:40,853 [salt.minion      ][INFO    ][17477] Returning information for job: 20180206104140787471
2018-02-06 10:41:50,834 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command saltutil.find_job with jid 20180206104150815427
2018-02-06 10:41:50,867 [salt.minion      ][INFO    ][17482] Starting a new job with PID 17482
2018-02-06 10:41:50,883 [salt.minion      ][INFO    ][17482] Returning information for job: 20180206104150815427
2018-02-06 10:42:00,865 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command saltutil.find_job with jid 20180206104200848404
2018-02-06 10:42:00,892 [salt.minion      ][INFO    ][17487] Starting a new job with PID 17487
2018-02-06 10:42:00,908 [salt.minion      ][INFO    ][17487] Returning information for job: 20180206104200848404
2018-02-06 10:42:01,795 [salt.minion      ][INFO    ][17462] Returning information for job: 20180206104120653091
2018-02-06 10:42:05,515 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command file.append with jid 20180206104205498069
2018-02-06 10:42:05,537 [salt.minion      ][INFO    ][17494] Starting a new job with PID 17494
2018-02-06 10:42:05,550 [salt.minion      ][INFO    ][17494] Returning information for job: 20180206104205498069
2018-02-06 10:42:06,305 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command service.reload with jid 20180206104206285673
2018-02-06 10:42:06,330 [salt.minion      ][INFO    ][17499] Starting a new job with PID 17499
2018-02-06 10:42:07,347 [salt.loader.192.168.11.2.int.module.cmdmod][INFO    ][17499] Executing command ['systemctl', 'status', 'apache2.service', '-n', '0'] in directory '/root'
2018-02-06 10:42:07,370 [salt.loader.192.168.11.2.int.module.cmdmod][INFO    ][17499] Executing command ['systemctl', 'is-enabled', 'apache2.service'] in directory '/root'
2018-02-06 10:42:07,412 [salt.loader.192.168.11.2.int.module.cmdmod][INFO    ][17499] Executing command ['systemd-run', '--scope', 'systemctl', 'reload', 'apache2.service'] in directory '/root'
2018-02-06 10:42:07,613 [salt.minion      ][INFO    ][17499] Returning information for job: 20180206104206285673
2018-02-06 10:43:07,950 [salt.minion      ][INFO    ][1359] User sudo_ubuntu Executing command cp.push_dir with jid 20180206104307926525
2018-02-06 10:43:07,978 [salt.minion      ][INFO    ][17642] Starting a new job with PID 17642
