2018-03-30 05:51:16,246 [salt.loaded.int.states.file][WARNING ][1444] State for file: /etc/ssl/certs/ca-salt_master_ca.crt - Neither 'source' nor 'contents' nor 'contents_pillar' nor 'contents_grains' was defined, yet 'replace' was set to 'True'. As there is no source to replace the file with, 'replace' has been set to 'False' to avoid reading the file unnecessarily.
2018-03-30 05:51:16,271 [salt.loaded.int.module.cmdmod][ERROR   ][1444] Command 'while true; do salt-call saltutil.running|grep fun: && continue; salt-call --local service.restart salt-minion; break; done' failed with return code: None
2018-03-30 05:51:18,836 [salt.loaded.int.module.cmdmod][INFO    ][2085] Executing command ['systemctl', 'status', 'salt-minion.service', '-n', '0'] in directory '/root'
2018-03-30 05:51:18,859 [salt.loaded.int.module.cmdmod][INFO    ][2085] Executing command ['systemctl', 'is-enabled', 'salt-minion.service'] in directory '/root'
2018-03-30 05:51:18,894 [salt.loaded.int.module.cmdmod][INFO    ][2085] Executing command ['systemd-run', '--scope', 'systemctl', 'restart', 'salt-minion.service'] in directory '/root'
2018-03-30 05:51:18,937 [salt.utils.parsers][WARNING ][1240] Minion received a SIGTERM. Exiting.
2018-03-30 05:51:19,376 [salt.cli.daemons ][INFO    ][2133] Setting up the Salt Minion "prx01.mcp-pike-ovs-dpdk-ha.local"
2018-03-30 05:51:19,466 [salt.cli.daemons ][INFO    ][2133] Starting up the Salt Minion
2018-03-30 05:51:19,466 [salt.utils.event ][INFO    ][2133] Starting pull socket on /var/run/salt/minion/minion_event_bd4454dbe8_pull.ipc
2018-03-30 05:51:19,959 [salt.minion      ][INFO    ][2133] Creating minion process manager
2018-03-30 05:51:21,180 [salt.loader.192.168.11.2.int.module.cmdmod][INFO    ][2133] Executing command ['date', '+%z'] in directory '/root'
2018-03-30 05:51:21,216 [salt.utils.schedule][INFO    ][2133] Updating job settings for scheduled job: __mine_interval
2018-03-30 05:51:21,324 [salt.minion      ][INFO    ][2133] Added mine.update to scheduler
2018-03-30 05:51:21,369 [salt.minion      ][INFO    ][2133] Minion is starting as user 'root'
2018-03-30 05:51:21,388 [salt.minion      ][INFO    ][2133] Minion is ready to receive requests!
2018-03-30 05:51:22,390 [salt.utils.schedule][INFO    ][2133] Running scheduled job: __mine_interval
2018-03-30 05:52:41,422 [salt.minion      ][INFO    ][2133] User sudo_ubuntu Executing command state.apply with jid 20180330055241432131
2018-03-30 05:52:41,454 [salt.minion      ][INFO    ][2224] Starting a new job with PID 2224
2018-03-30 05:52:44,407 [salt.state       ][INFO    ][2224] Loading fresh modules for state activity
2018-03-30 05:52:44,461 [salt.fileclient  ][INFO    ][2224] Fetching file from saltenv 'base', ** done ** 'linux/init.sls'
2018-03-30 05:52:44,501 [salt.fileclient  ][INFO    ][2224] Fetching file from saltenv 'base', ** done ** 'linux/system/init.sls'
2018-03-30 05:52:44,701 [salt.fileclient  ][INFO    ][2224] Fetching file from saltenv 'base', ** done ** 'linux/system/env.sls'
2018-03-30 05:52:44,797 [salt.fileclient  ][INFO    ][2224] Fetching file from saltenv 'base', ** done ** 'linux/system/profile.sls'
2018-03-30 05:52:44,883 [salt.fileclient  ][INFO    ][2224] Fetching file from saltenv 'base', ** done ** 'linux/system/repo.sls'
2018-03-30 05:52:45,010 [salt.fileclient  ][INFO    ][2224] Fetching file from saltenv 'base', ** done ** 'linux/system/package.sls'
2018-03-30 05:52:45,086 [salt.fileclient  ][INFO    ][2224] Fetching file from saltenv 'base', ** done ** 'linux/system/timezone.sls'
2018-03-30 05:52:45,155 [salt.fileclient  ][INFO    ][2224] Fetching file from saltenv 'base', ** done ** 'linux/system/kernel.sls'
2018-03-30 05:52:45,251 [salt.fileclient  ][INFO    ][2224] Fetching file from saltenv 'base', ** done ** 'linux/system/cpu.sls'
2018-03-30 05:52:45,321 [salt.fileclient  ][INFO    ][2224] Fetching file from saltenv 'base', ** done ** 'linux/system/sysfs.sls'
2018-03-30 05:52:45,383 [salt.fileclient  ][INFO    ][2224] Fetching file from saltenv 'base', ** done ** 'linux/system/locale.sls'
2018-03-30 05:52:45,446 [salt.fileclient  ][INFO    ][2224] Fetching file from saltenv 'base', ** done ** 'linux/system/user.sls'
2018-03-30 05:52:45,524 [salt.fileclient  ][INFO    ][2224] Fetching file from saltenv 'base', ** done ** 'linux/system/group.sls'
2018-03-30 05:52:45,591 [salt.fileclient  ][INFO    ][2224] Fetching file from saltenv 'base', ** done ** 'linux/system/limit.sls'
2018-03-30 05:52:45,689 [salt.fileclient  ][INFO    ][2224] Fetching file from saltenv 'base', ** done ** 'linux/system/service.sls'
2018-03-30 05:52:45,782 [salt.fileclient  ][INFO    ][2224] Fetching file from saltenv 'base', ** done ** 'linux/system/systemd.sls'
2018-03-30 05:52:45,859 [salt.fileclient  ][INFO    ][2224] Fetching file from saltenv 'base', ** done ** 'linux/system/apt.sls'
2018-03-30 05:52:45,937 [salt.fileclient  ][INFO    ][2224] Fetching file from saltenv 'base', ** done ** 'linux/network/init.sls'
2018-03-30 05:52:46,000 [salt.fileclient  ][INFO    ][2224] Fetching file from saltenv 'base', ** done ** 'linux/network/hostname.sls'
2018-03-30 05:52:46,070 [salt.fileclient  ][INFO    ][2224] Fetching file from saltenv 'base', ** done ** 'linux/network/host.sls'
2018-03-30 05:52:46,176 [salt.fileclient  ][INFO    ][2224] Fetching file from saltenv 'base', ** done ** 'linux/network/interface.sls'
2018-03-30 05:52:46,309 [salt.fileclient  ][INFO    ][2224] Fetching file from saltenv 'base', ** done ** 'linux/network/proxy.sls'
2018-03-30 05:52:46,376 [salt.fileclient  ][INFO    ][2224] Fetching file from saltenv 'base', ** done ** 'linux/storage/init.sls'
2018-03-30 05:52:46,576 [salt.fileclient  ][INFO    ][2224] Fetching file from saltenv 'base', ** done ** 'ntp/init.sls'
2018-03-30 05:52:46,602 [salt.fileclient  ][INFO    ][2224] Fetching file from saltenv 'base', ** done ** 'ntp/client.sls'
2018-03-30 05:52:46,658 [salt.fileclient  ][INFO    ][2224] Fetching file from saltenv 'base', ** done ** 'ntp/server.sls'
2018-03-30 05:52:46,698 [salt.state       ][INFO    ][2224] Running state [/etc/environment] at time 05:52:46.698699
2018-03-30 05:52:46,699 [salt.state       ][INFO    ][2224] Executing state file.blockreplace for /etc/environment
2018-03-30 05:52:46,709 [salt.state       ][INFO    ][2224] File changed:
--- 
+++ 
@@ -1 +1,4 @@
 PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games"
+# SALT MANAGED VARIABLES - DO NOT EDIT - START
+#
+# SALT MANAGED VARIABLES - END

2018-03-30 05:52:46,710 [salt.state       ][INFO    ][2224] Completed state [/etc/environment] at time 05:52:46.710635 duration_in_ms=11.936
2018-03-30 05:52:46,711 [salt.state       ][INFO    ][2224] Running state [/etc/profile.d] at time 05:52:46.711266
2018-03-30 05:52:46,711 [salt.state       ][INFO    ][2224] Executing state file.directory for /etc/profile.d
2018-03-30 05:52:46,729 [salt.state       ][INFO    ][2224] Directory /etc/profile.d is in the correct state
2018-03-30 05:52:46,730 [salt.state       ][INFO    ][2224] Completed state [/etc/profile.d] at time 05:52:46.730102 duration_in_ms=18.836
2018-03-30 05:52:47,144 [salt.state       ][INFO    ][2224] Running state [/etc/apt/apt.conf.d/99prefer_ipv4-salt] at time 05:52:47.144699
2018-03-30 05:52:47,145 [salt.state       ][INFO    ][2224] Executing state file.managed for /etc/apt/apt.conf.d/99prefer_ipv4-salt
2018-03-30 05:52:47,170 [salt.fileclient  ][INFO    ][2224] Fetching file from saltenv 'base', ** done ** 'linux/files/apt.conf'
2018-03-30 05:52:47,177 [salt.state       ][INFO    ][2224] File changed:
New file
2018-03-30 05:52:47,177 [salt.state       ][INFO    ][2224] Completed state [/etc/apt/apt.conf.d/99prefer_ipv4-salt] at time 05:52:47.177941 duration_in_ms=33.242
2018-03-30 05:52:47,178 [salt.state       ][INFO    ][2224] Running state [linux_repo_prereq_pkgs] at time 05:52:47.178687
2018-03-30 05:52:47,179 [salt.state       ][INFO    ][2224] Executing state pkg.installed for linux_repo_prereq_pkgs
2018-03-30 05:52:47,179 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-03-30 05:52:47,477 [salt.state       ][INFO    ][2224] All specified packages are already installed
2018-03-30 05:52:47,478 [salt.state       ][INFO    ][2224] Completed state [linux_repo_prereq_pkgs] at time 05:52:47.478090 duration_in_ms=299.403
2018-03-30 05:52:47,478 [salt.state       ][INFO    ][2224] Running state [/etc/apt/apt.conf.d/99proxies-salt-uca] at time 05:52:47.478630
2018-03-30 05:52:47,479 [salt.state       ][INFO    ][2224] Executing state file.absent for /etc/apt/apt.conf.d/99proxies-salt-uca
2018-03-30 05:52:47,479 [salt.state       ][INFO    ][2224] File /etc/apt/apt.conf.d/99proxies-salt-uca is not present
2018-03-30 05:52:47,480 [salt.state       ][INFO    ][2224] Completed state [/etc/apt/apt.conf.d/99proxies-salt-uca] at time 05:52:47.480122 duration_in_ms=1.491
2018-03-30 05:52:47,480 [salt.state       ][INFO    ][2224] Running state [/etc/apt/preferences.d/uca] at time 05:52:47.480511
2018-03-30 05:52:47,480 [salt.state       ][INFO    ][2224] Executing state file.absent for /etc/apt/preferences.d/uca
2018-03-30 05:52:47,481 [salt.state       ][INFO    ][2224] File /etc/apt/preferences.d/uca is not present
2018-03-30 05:52:47,481 [salt.state       ][INFO    ][2224] Completed state [/etc/apt/preferences.d/uca] at time 05:52:47.481700 duration_in_ms=1.189
2018-03-30 05:52:47,483 [salt.state       ][INFO    ][2224] Running state [apt-key adv --keyserver keyserver.ubuntu.com --recv EC4926EA] at time 05:52:47.483648
2018-03-30 05:52:47,484 [salt.state       ][INFO    ][2224] Executing state cmd.run for apt-key adv --keyserver keyserver.ubuntu.com --recv EC4926EA
2018-03-30 05:52:47,484 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'apt-key adv --keyserver keyserver.ubuntu.com --recv EC4926EA' in directory '/root'
2018-03-30 05:52:48,064 [salt.state       ][INFO    ][2224] {'pid': 2284, 'retcode': 0, 'stderr': 'gpg: requesting key EC4926EA from hkp server keyserver.ubuntu.com\ngpg: key EC4926EA: public key "Canonical Cloud Archive Signing Key <ftpmaster@canonical.com>" imported\ngpg: Total number processed: 1\ngpg:               imported: 1  (RSA: 1)', 'stdout': 'Executing: /tmp/tmp.FOr8R24LXS/gpg.1.sh --keyserver\nkeyserver.ubuntu.com\n--recv\nEC4926EA'}
2018-03-30 05:52:48,066 [salt.state       ][INFO    ][2224] Completed state [apt-key adv --keyserver keyserver.ubuntu.com --recv EC4926EA] at time 05:52:48.065969 duration_in_ms=582.32
2018-03-30 05:52:48,072 [salt.state       ][INFO    ][2224] Running state [deb http://ubuntu-cloud.archive.canonical.com/ubuntu xenial-updates/pike main] at time 05:52:48.072301
2018-03-30 05:52:48,073 [salt.state       ][INFO    ][2224] Executing state pkgrepo.managed for deb http://ubuntu-cloud.archive.canonical.com/ubuntu xenial-updates/pike main
2018-03-30 05:52:48,159 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['apt-get', '-q', 'update'] in directory '/root'
2018-03-30 05:52:51,450 [salt.minion      ][INFO    ][2133] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330055251450287
2018-03-30 05:52:51,491 [salt.minion      ][INFO    ][2710] Starting a new job with PID 2710
2018-03-30 05:52:51,521 [salt.minion      ][INFO    ][2710] Returning information for job: 20180330055251450287
2018-03-30 05:52:52,263 [salt.state       ][INFO    ][2224] {'repo': 'deb http://ubuntu-cloud.archive.canonical.com/ubuntu xenial-updates/pike main'}
2018-03-30 05:52:52,264 [salt.state       ][INFO    ][2224] Completed state [deb http://ubuntu-cloud.archive.canonical.com/ubuntu xenial-updates/pike main] at time 05:52:52.264534 duration_in_ms=4192.234
2018-03-30 05:52:52,265 [salt.state       ][INFO    ][2224] Running state [linux_extra_packages_latest] at time 05:52:52.264989
2018-03-30 05:52:52,265 [salt.state       ][INFO    ][2224] Executing state pkg.latest for linux_extra_packages_latest
2018-03-30 05:52:52,275 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['apt-cache', '-q', 'policy', 'libapache2-mod-wsgi'] in directory '/root'
2018-03-30 05:52:52,331 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-03-30 05:52:52,351 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'libapache2-mod-wsgi'] in directory '/root'
2018-03-30 05:53:01,515 [salt.minion      ][INFO    ][2133] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330055301511717
2018-03-30 05:53:01,538 [salt.minion      ][INFO    ][3030] Starting a new job with PID 3030
2018-03-30 05:53:01,551 [salt.minion      ][INFO    ][3030] Returning information for job: 20180330055301511717
2018-03-30 05:53:04,936 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-03-30 05:53:04,971 [salt.state       ][INFO    ][2224] Made the following changes:
'libaprutil1-ldap' changed from 'absent' to '1.5.4-1build1'
'libapr1' changed from 'absent' to '1.5.2-3'
'libpython2.7' changed from 'absent' to '2.7.12-1ubuntu0~16.04.3'
'libapache2-mod-wsgi' changed from 'absent' to '4.3.0-1.1build1'
'apache2-api-20120211' changed from 'absent' to '1'
'libaprutil1' changed from 'absent' to '1.5.4-1build1'
'liblua5.1-0' changed from 'absent' to '5.1.5-8ubuntu1'
'libaprutil1-dbd-sqlite3' changed from 'absent' to '1.5.4-1build1'
'httpd-wsgi' changed from 'absent' to '1'
'apache2-bin' changed from 'absent' to '2.4.18-2ubuntu3.5'

2018-03-30 05:53:04,999 [salt.state       ][INFO    ][2224] Loading fresh modules for state activity
2018-03-30 05:53:05,033 [salt.state       ][INFO    ][2224] Completed state [linux_extra_packages_latest] at time 05:53:05.033241 duration_in_ms=12768.251
2018-03-30 05:53:05,036 [salt.state       ][INFO    ][2224] Running state [UTC] at time 05:53:05.036365
2018-03-30 05:53:05,036 [salt.state       ][INFO    ][2224] Executing state timezone.system for UTC
2018-03-30 05:53:05,039 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['timedatectl'] in directory '/root'
2018-03-30 05:53:05,374 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['timedatectl'] in directory '/root'
2018-03-30 05:53:05,416 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'timedatectl set-timezone UTC' in directory '/root'
2018-03-30 05:53:05,441 [salt.state       ][INFO    ][2224] {'timezone': 'UTC'}
2018-03-30 05:53:05,442 [salt.state       ][INFO    ][2224] Completed state [UTC] at time 05:53:05.442446 duration_in_ms=406.081
2018-03-30 05:53:05,445 [salt.state       ][INFO    ][2224] Running state [nf_conntrack] at time 05:53:05.445125
2018-03-30 05:53:05,445 [salt.state       ][INFO    ][2224] Executing state kmod.present for nf_conntrack
2018-03-30 05:53:05,446 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'lsmod' in directory '/root'
2018-03-30 05:53:05,660 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'lsmod' in directory '/root'
2018-03-30 05:53:05,682 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'modprobe nf_conntrack' in directory '/root'
2018-03-30 05:53:05,753 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'lsmod' in directory '/root'
2018-03-30 05:53:05,813 [salt.state       ][INFO    ][2224] {'nf_conntrack': 'loaded'}
2018-03-30 05:53:05,814 [salt.state       ][INFO    ][2224] Completed state [nf_conntrack] at time 05:53:05.814368 duration_in_ms=369.242
2018-03-30 05:53:05,816 [salt.state       ][INFO    ][2224] Running state [kernel.panic] at time 05:53:05.816912
2018-03-30 05:53:05,817 [salt.state       ][INFO    ][2224] Executing state sysctl.present for kernel.panic
2018-03-30 05:53:05,818 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:53:06,108 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -w kernel.panic="60"' in directory '/root'
2018-03-30 05:53:06,134 [salt.state       ][INFO    ][2224] {'kernel.panic': 60}
2018-03-30 05:53:06,136 [salt.state       ][INFO    ][2224] Completed state [kernel.panic] at time 05:53:06.135872 duration_in_ms=318.959
2018-03-30 05:53:06,137 [salt.state       ][INFO    ][2224] Running state [net.ipv4.tcp_keepalive_probes] at time 05:53:06.136931
2018-03-30 05:53:06,137 [salt.state       ][INFO    ][2224] Executing state sysctl.present for net.ipv4.tcp_keepalive_probes
2018-03-30 05:53:06,139 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:53:06,262 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -w net.ipv4.tcp_keepalive_probes="8"' in directory '/root'
2018-03-30 05:53:06,285 [salt.state       ][INFO    ][2224] {'net.ipv4.tcp_keepalive_probes': 8}
2018-03-30 05:53:06,287 [salt.state       ][INFO    ][2224] Completed state [net.ipv4.tcp_keepalive_probes] at time 05:53:06.287209 duration_in_ms=150.279
2018-03-30 05:53:06,288 [salt.state       ][INFO    ][2224] Running state [fs.file-max] at time 05:53:06.288297
2018-03-30 05:53:06,289 [salt.state       ][INFO    ][2224] Executing state sysctl.present for fs.file-max
2018-03-30 05:53:06,290 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:53:06,339 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -w fs.file-max="124165"' in directory '/root'
2018-03-30 05:53:06,356 [salt.state       ][INFO    ][2224] {'fs.file-max': 124165}
2018-03-30 05:53:06,357 [salt.state       ][INFO    ][2224] Completed state [fs.file-max] at time 05:53:06.357390 duration_in_ms=69.093
2018-03-30 05:53:06,358 [salt.state       ][INFO    ][2224] Running state [net.core.somaxconn] at time 05:53:06.358249
2018-03-30 05:53:06,359 [salt.state       ][INFO    ][2224] Executing state sysctl.present for net.core.somaxconn
2018-03-30 05:53:06,360 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:53:06,412 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -w net.core.somaxconn="4096"' in directory '/root'
2018-03-30 05:53:06,432 [salt.state       ][INFO    ][2224] {'net.core.somaxconn': 4096}
2018-03-30 05:53:06,433 [salt.state       ][INFO    ][2224] Completed state [net.core.somaxconn] at time 05:53:06.433457 duration_in_ms=75.208
2018-03-30 05:53:06,434 [salt.state       ][INFO    ][2224] Running state [net.ipv4.tcp_max_syn_backlog] at time 05:53:06.434551
2018-03-30 05:53:06,435 [salt.state       ][INFO    ][2224] Executing state sysctl.present for net.ipv4.tcp_max_syn_backlog
2018-03-30 05:53:06,437 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:53:06,490 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -w net.ipv4.tcp_max_syn_backlog="8192"' in directory '/root'
2018-03-30 05:53:06,514 [salt.state       ][INFO    ][2224] {'net.ipv4.tcp_max_syn_backlog': 8192}
2018-03-30 05:53:06,515 [salt.state       ][INFO    ][2224] Completed state [net.ipv4.tcp_max_syn_backlog] at time 05:53:06.515690 duration_in_ms=81.139
2018-03-30 05:53:06,516 [salt.state       ][INFO    ][2224] Running state [net.ipv4.tcp_tw_reuse] at time 05:53:06.516562
2018-03-30 05:53:06,517 [salt.state       ][INFO    ][2224] Executing state sysctl.present for net.ipv4.tcp_tw_reuse
2018-03-30 05:53:06,518 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:53:06,566 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -w net.ipv4.tcp_tw_reuse="1"' in directory '/root'
2018-03-30 05:53:06,587 [salt.state       ][INFO    ][2224] {'net.ipv4.tcp_tw_reuse': 1}
2018-03-30 05:53:06,587 [salt.state       ][INFO    ][2224] Completed state [net.ipv4.tcp_tw_reuse] at time 05:53:06.587893 duration_in_ms=71.33
2018-03-30 05:53:06,588 [salt.state       ][INFO    ][2224] Running state [net.ipv4.tcp_congestion_control] at time 05:53:06.588662
2018-03-30 05:53:06,589 [salt.state       ][INFO    ][2224] Executing state sysctl.present for net.ipv4.tcp_congestion_control
2018-03-30 05:53:06,590 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:53:06,634 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -w net.ipv4.tcp_congestion_control="yeah"' in directory '/root'
2018-03-30 05:53:07,015 [salt.state       ][INFO    ][2224] {'net.ipv4.tcp_congestion_control': 'yeah'}
2018-03-30 05:53:07,016 [salt.state       ][INFO    ][2224] Completed state [net.ipv4.tcp_congestion_control] at time 05:53:07.016108 duration_in_ms=427.444
2018-03-30 05:53:07,017 [salt.state       ][INFO    ][2224] Running state [net.ipv4.tcp_retries2] at time 05:53:07.017246
2018-03-30 05:53:07,018 [salt.state       ][INFO    ][2224] Executing state sysctl.present for net.ipv4.tcp_retries2
2018-03-30 05:53:07,019 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:53:07,084 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -w net.ipv4.tcp_retries2="5"' in directory '/root'
2018-03-30 05:53:07,112 [salt.state       ][INFO    ][2224] {'net.ipv4.tcp_retries2': 5}
2018-03-30 05:53:07,113 [salt.state       ][INFO    ][2224] Completed state [net.ipv4.tcp_retries2] at time 05:53:07.113012 duration_in_ms=95.839
2018-03-30 05:53:07,113 [salt.state       ][INFO    ][2224] Running state [net.core.netdev_max_backlog] at time 05:53:07.113652
2018-03-30 05:53:07,114 [salt.state       ][INFO    ][2224] Executing state sysctl.present for net.core.netdev_max_backlog
2018-03-30 05:53:07,115 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:53:07,230 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -w net.core.netdev_max_backlog="261144"' in directory '/root'
2018-03-30 05:53:07,258 [salt.state       ][INFO    ][2224] {'net.core.netdev_max_backlog': 261144}
2018-03-30 05:53:07,259 [salt.state       ][INFO    ][2224] Completed state [net.core.netdev_max_backlog] at time 05:53:07.258970 duration_in_ms=145.316
2018-03-30 05:53:07,259 [salt.state       ][INFO    ][2224] Running state [net.ipv4.tcp_slow_start_after_idle] at time 05:53:07.259698
2018-03-30 05:53:07,260 [salt.state       ][INFO    ][2224] Executing state sysctl.present for net.ipv4.tcp_slow_start_after_idle
2018-03-30 05:53:07,261 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:53:07,322 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -w net.ipv4.tcp_slow_start_after_idle="0"' in directory '/root'
2018-03-30 05:53:07,348 [salt.state       ][INFO    ][2224] {'net.ipv4.tcp_slow_start_after_idle': 0}
2018-03-30 05:53:07,349 [salt.state       ][INFO    ][2224] Completed state [net.ipv4.tcp_slow_start_after_idle] at time 05:53:07.348903 duration_in_ms=89.205
2018-03-30 05:53:07,349 [salt.state       ][INFO    ][2224] Running state [vm.swappiness] at time 05:53:07.349524
2018-03-30 05:53:07,350 [salt.state       ][INFO    ][2224] Executing state sysctl.present for vm.swappiness
2018-03-30 05:53:07,351 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:53:07,439 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -w vm.swappiness="10"' in directory '/root'
2018-03-30 05:53:07,467 [salt.state       ][INFO    ][2224] {'vm.swappiness': 10}
2018-03-30 05:53:07,468 [salt.state       ][INFO    ][2224] Completed state [vm.swappiness] at time 05:53:07.468386 duration_in_ms=118.862
2018-03-30 05:53:07,469 [salt.state       ][INFO    ][2224] Running state [net.ipv4.tcp_keepalive_intvl] at time 05:53:07.469189
2018-03-30 05:53:07,469 [salt.state       ][INFO    ][2224] Executing state sysctl.present for net.ipv4.tcp_keepalive_intvl
2018-03-30 05:53:07,471 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:53:07,531 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -w net.ipv4.tcp_keepalive_intvl="3"' in directory '/root'
2018-03-30 05:53:07,557 [salt.state       ][INFO    ][2224] {'net.ipv4.tcp_keepalive_intvl': 3}
2018-03-30 05:53:07,558 [salt.state       ][INFO    ][2224] Completed state [net.ipv4.tcp_keepalive_intvl] at time 05:53:07.557892 duration_in_ms=88.703
2018-03-30 05:53:07,558 [salt.state       ][INFO    ][2224] Running state [net.ipv4.neigh.default.gc_thresh1] at time 05:53:07.558583
2018-03-30 05:53:07,559 [salt.state       ][INFO    ][2224] Executing state sysctl.present for net.ipv4.neigh.default.gc_thresh1
2018-03-30 05:53:07,560 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:53:07,649 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -w net.ipv4.neigh.default.gc_thresh1="4096"' in directory '/root'
2018-03-30 05:53:07,676 [salt.state       ][INFO    ][2224] {'net.ipv4.neigh.default.gc_thresh1': 4096}
2018-03-30 05:53:07,677 [salt.state       ][INFO    ][2224] Completed state [net.ipv4.neigh.default.gc_thresh1] at time 05:53:07.677348 duration_in_ms=118.764
2018-03-30 05:53:07,678 [salt.state       ][INFO    ][2224] Running state [net.ipv4.neigh.default.gc_thresh2] at time 05:53:07.678027
2018-03-30 05:53:07,678 [salt.state       ][INFO    ][2224] Executing state sysctl.present for net.ipv4.neigh.default.gc_thresh2
2018-03-30 05:53:07,679 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:53:07,737 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -w net.ipv4.neigh.default.gc_thresh2="8192"' in directory '/root'
2018-03-30 05:53:07,760 [salt.state       ][INFO    ][2224] {'net.ipv4.neigh.default.gc_thresh2': 8192}
2018-03-30 05:53:07,761 [salt.state       ][INFO    ][2224] Completed state [net.ipv4.neigh.default.gc_thresh2] at time 05:53:07.761355 duration_in_ms=83.328
2018-03-30 05:53:07,761 [salt.state       ][INFO    ][2224] Running state [net.ipv4.neigh.default.gc_thresh3] at time 05:53:07.761885
2018-03-30 05:53:07,762 [salt.state       ][INFO    ][2224] Executing state sysctl.present for net.ipv4.neigh.default.gc_thresh3
2018-03-30 05:53:07,763 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:53:07,853 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -w net.ipv4.neigh.default.gc_thresh3="16384"' in directory '/root'
2018-03-30 05:53:07,880 [salt.state       ][INFO    ][2224] {'net.ipv4.neigh.default.gc_thresh3': 16384}
2018-03-30 05:53:07,881 [salt.state       ][INFO    ][2224] Completed state [net.ipv4.neigh.default.gc_thresh3] at time 05:53:07.881173 duration_in_ms=119.287
2018-03-30 05:53:07,881 [salt.state       ][INFO    ][2224] Running state [net.ipv4.tcp_fin_timeout] at time 05:53:07.881908
2018-03-30 05:53:07,882 [salt.state       ][INFO    ][2224] Executing state sysctl.present for net.ipv4.tcp_fin_timeout
2018-03-30 05:53:07,883 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:53:08,002 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -w net.ipv4.tcp_fin_timeout="30"' in directory '/root'
2018-03-30 05:53:08,029 [salt.state       ][INFO    ][2224] {'net.ipv4.tcp_fin_timeout': 30}
2018-03-30 05:53:08,030 [salt.state       ][INFO    ][2224] Completed state [net.ipv4.tcp_fin_timeout] at time 05:53:08.030389 duration_in_ms=148.481
2018-03-30 05:53:08,031 [salt.state       ][INFO    ][2224] Running state [net.ipv4.tcp_keepalive_time] at time 05:53:08.031060
2018-03-30 05:53:08,031 [salt.state       ][INFO    ][2224] Executing state sysctl.present for net.ipv4.tcp_keepalive_time
2018-03-30 05:53:08,033 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:53:08,088 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -w net.ipv4.tcp_keepalive_time="30"' in directory '/root'
2018-03-30 05:53:08,114 [salt.state       ][INFO    ][2224] {'net.ipv4.tcp_keepalive_time': 30}
2018-03-30 05:53:08,114 [salt.state       ][INFO    ][2224] Completed state [net.ipv4.tcp_keepalive_time] at time 05:53:08.114877 duration_in_ms=83.817
2018-03-30 05:53:08,115 [salt.state       ][INFO    ][2224] Running state [net.nf_conntrack_max] at time 05:53:08.115409
2018-03-30 05:53:08,115 [salt.state       ][INFO    ][2224] Executing state sysctl.present for net.nf_conntrack_max
2018-03-30 05:53:08,116 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:53:08,224 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'sysctl -w net.nf_conntrack_max="1048576"' in directory '/root'
2018-03-30 05:53:08,253 [salt.state       ][INFO    ][2224] {'net.nf_conntrack_max': 1048576}
2018-03-30 05:53:08,253 [salt.state       ][INFO    ][2224] Completed state [net.nf_conntrack_max] at time 05:53:08.253727 duration_in_ms=138.317
2018-03-30 05:53:08,262 [salt.state       ][INFO    ][2224] Running state [linux_sysfs_package] at time 05:53:08.262352
2018-03-30 05:53:08,262 [salt.state       ][INFO    ][2224] Executing state pkg.installed for linux_sysfs_package
2018-03-30 05:53:08,825 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['apt-cache', '-q', 'policy', 'sysfsutils'] in directory '/root'
2018-03-30 05:53:08,886 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['apt-get', '-q', 'update'] in directory '/root'
2018-03-30 05:53:10,956 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-03-30 05:53:10,997 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'sysfsutils'] in directory '/root'
2018-03-30 05:53:11,744 [salt.minion      ][INFO    ][2133] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330055311746378
2018-03-30 05:53:11,768 [salt.minion      ][INFO    ][3487] Starting a new job with PID 3487
2018-03-30 05:53:11,796 [salt.minion      ][INFO    ][3487] Returning information for job: 20180330055311746378
2018-03-30 05:53:19,725 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-03-30 05:53:19,836 [salt.state       ][INFO    ][2224] Made the following changes:
'libsysfs2' changed from 'absent' to '2.1.0+repack-4'
'sysfsutils' changed from 'absent' to '2.1.0+repack-4'

2018-03-30 05:53:19,859 [salt.state       ][INFO    ][2224] Loading fresh modules for state activity
2018-03-30 05:53:19,897 [salt.state       ][INFO    ][2224] Completed state [linux_sysfs_package] at time 05:53:19.897031 duration_in_ms=11634.678
2018-03-30 05:53:19,901 [salt.state       ][INFO    ][2224] Running state [/etc/sysfs.d] at time 05:53:19.901602
2018-03-30 05:53:19,902 [salt.state       ][INFO    ][2224] Executing state file.directory for /etc/sysfs.d
2018-03-30 05:53:19,906 [salt.state       ][INFO    ][2224] Directory /etc/sysfs.d is in the correct state
2018-03-30 05:53:19,906 [salt.state       ][INFO    ][2224] Completed state [/etc/sysfs.d] at time 05:53:19.906496 duration_in_ms=4.893
2018-03-30 05:53:20,155 [salt.state       ][INFO    ][2224] Running state [ondemand] at time 05:53:20.155856
2018-03-30 05:53:20,156 [salt.state       ][INFO    ][2224] Executing state service.dead for ondemand
2018-03-30 05:53:20,157 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['systemctl', 'status', 'ondemand.service', '-n', '0'] in directory '/root'
2018-03-30 05:53:20,179 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['systemctl', 'is-active', 'ondemand.service'] in directory '/root'
2018-03-30 05:53:20,199 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['systemctl', 'is-enabled', 'ondemand.service'] in directory '/root'
2018-03-30 05:53:20,216 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['systemd-run', '--scope', 'systemctl', 'stop', 'ondemand.service'] in directory '/root'
2018-03-30 05:53:20,280 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['systemctl', 'is-active', 'ondemand.service'] in directory '/root'
2018-03-30 05:53:20,298 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['systemctl', 'is-enabled', 'ondemand.service'] in directory '/root'
2018-03-30 05:53:20,315 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['systemctl', 'is-enabled', 'ondemand.service'] in directory '/root'
2018-03-30 05:53:20,333 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['systemd-run', '--scope', '/usr/sbin/update-rc.d', '-f', 'ondemand', 'remove'] in directory '/root'
2018-03-30 05:53:20,553 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['systemctl', 'is-enabled', 'ondemand.service'] in directory '/root'
2018-03-30 05:53:20,590 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'runlevel' in directory '/root'
2018-03-30 05:53:20,614 [salt.state       ][INFO    ][2224] {'ondemand': True}
2018-03-30 05:53:20,615 [salt.state       ][INFO    ][2224] Completed state [ondemand] at time 05:53:20.615230 duration_in_ms=459.373
2018-03-30 05:53:20,620 [salt.state       ][INFO    ][2224] Running state [cs_CZ.UTF-8] at time 05:53:20.620177
2018-03-30 05:53:20,620 [salt.state       ][INFO    ][2224] Executing state locale.present for cs_CZ.UTF-8
2018-03-30 05:53:20,621 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'locale -a' in directory '/root'
2018-03-30 05:53:20,731 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['locale-gen', 'cs_CZ.utf8'] in directory '/root'
2018-03-30 05:53:21,930 [salt.state       ][INFO    ][2224] {'locale': 'cs_CZ.UTF-8'}
2018-03-30 05:53:21,931 [salt.state       ][INFO    ][2224] Completed state [cs_CZ.UTF-8] at time 05:53:21.931332 duration_in_ms=1311.155
2018-03-30 05:53:21,931 [salt.state       ][INFO    ][2224] Running state [en_US.UTF-8] at time 05:53:21.931656
2018-03-30 05:53:21,931 [salt.state       ][INFO    ][2224] Executing state locale.present for en_US.UTF-8
2018-03-30 05:53:21,932 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'locale -a' in directory '/root'
2018-03-30 05:53:21,941 [salt.minion      ][INFO    ][2133] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330055321945025
2018-03-30 05:53:21,966 [salt.state       ][INFO    ][2224] Locale en_US.UTF-8 is already present
2018-03-30 05:53:21,967 [salt.state       ][INFO    ][2224] Completed state [en_US.UTF-8] at time 05:53:21.967138 duration_in_ms=35.479
2018-03-30 05:53:21,971 [salt.state       ][INFO    ][2224] Running state [en_US.UTF-8] at time 05:53:21.971137
2018-03-30 05:53:21,972 [salt.state       ][INFO    ][2224] Executing state locale.system for en_US.UTF-8
2018-03-30 05:53:21,977 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'localectl' in directory '/root'
2018-03-30 05:53:21,982 [salt.minion      ][INFO    ][3955] Starting a new job with PID 3955
2018-03-30 05:53:22,004 [salt.minion      ][INFO    ][3955] Returning information for job: 20180330055321945025
2018-03-30 05:53:22,288 [salt.state       ][INFO    ][2224] System locale en_US.UTF-8 already set
2018-03-30 05:53:22,289 [salt.state       ][INFO    ][2224] Completed state [en_US.UTF-8] at time 05:53:22.288857 duration_in_ms=317.721
2018-03-30 05:53:22,293 [salt.state       ][INFO    ][2224] Running state [root] at time 05:53:22.292928
2018-03-30 05:53:22,293 [salt.state       ][INFO    ][2224] Executing state user.present for root
2018-03-30 05:53:22,305 [salt.state       ][INFO    ][2224] User root is present and up to date
2018-03-30 05:53:22,305 [salt.state       ][INFO    ][2224] Completed state [root] at time 05:53:22.305749 duration_in_ms=12.82
2018-03-30 05:53:22,307 [salt.state       ][INFO    ][2224] Running state [/root] at time 05:53:22.307800
2018-03-30 05:53:22,308 [salt.state       ][INFO    ][2224] Executing state file.directory for /root
2018-03-30 05:53:22,309 [salt.state       ][INFO    ][2224] Directory /root is in the correct state
2018-03-30 05:53:22,309 [salt.state       ][INFO    ][2224] Completed state [/root] at time 05:53:22.309658 duration_in_ms=1.858
2018-03-30 05:53:22,310 [salt.state       ][INFO    ][2224] Running state [/etc/sudoers.d/90-salt-user-root] at time 05:53:22.309980
2018-03-30 05:53:22,310 [salt.state       ][INFO    ][2224] Executing state file.absent for /etc/sudoers.d/90-salt-user-root
2018-03-30 05:53:22,310 [salt.state       ][INFO    ][2224] File /etc/sudoers.d/90-salt-user-root is not present
2018-03-30 05:53:22,311 [salt.state       ][INFO    ][2224] Completed state [/etc/sudoers.d/90-salt-user-root] at time 05:53:22.311011 duration_in_ms=1.031
2018-03-30 05:53:22,311 [salt.state       ][INFO    ][2224] Running state [ubuntu] at time 05:53:22.311334
2018-03-30 05:53:22,311 [salt.state       ][INFO    ][2224] Executing state user.present for ubuntu
2018-03-30 05:53:22,316 [salt.state       ][INFO    ][2224] {'passwd': 'XXX-REDACTED-XXX'}
2018-03-30 05:53:22,316 [salt.state       ][INFO    ][2224] Completed state [ubuntu] at time 05:53:22.316351 duration_in_ms=5.016
2018-03-30 05:53:22,317 [salt.state       ][INFO    ][2224] Running state [/home/ubuntu] at time 05:53:22.317551
2018-03-30 05:53:22,317 [salt.state       ][INFO    ][2224] Executing state file.directory for /home/ubuntu
2018-03-30 05:53:22,319 [salt.state       ][INFO    ][2224] {'mode': '0700'}
2018-03-30 05:53:22,319 [salt.state       ][INFO    ][2224] Completed state [/home/ubuntu] at time 05:53:22.319408 duration_in_ms=1.857
2018-03-30 05:53:22,320 [salt.state       ][INFO    ][2224] Running state [/etc/sudoers.d/90-salt-user-ubuntu] at time 05:53:22.320392
2018-03-30 05:53:22,320 [salt.state       ][INFO    ][2224] Executing state file.managed for /etc/sudoers.d/90-salt-user-ubuntu
2018-03-30 05:53:22,346 [salt.fileclient  ][INFO    ][2224] Fetching file from saltenv 'base', ** done ** 'linux/files/sudoer'
2018-03-30 05:53:22,352 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command '/usr/sbin/visudo -c -f /tmp/tmpy7XV6i' in directory '/root'
2018-03-30 05:53:22,406 [salt.state       ][INFO    ][2224] File changed:
New file
2018-03-30 05:53:22,407 [salt.state       ][INFO    ][2224] Completed state [/etc/sudoers.d/90-salt-user-ubuntu] at time 05:53:22.407301 duration_in_ms=86.908
2018-03-30 05:53:22,408 [salt.state       ][INFO    ][2224] Running state [/etc/security/limits.d/90-salt-default.conf] at time 05:53:22.407945
2018-03-30 05:53:22,408 [salt.state       ][INFO    ][2224] Executing state file.managed for /etc/security/limits.d/90-salt-default.conf
2018-03-30 05:53:22,441 [salt.fileclient  ][INFO    ][2224] Fetching file from saltenv 'base', ** done ** 'linux/files/limits.conf'
2018-03-30 05:53:22,516 [salt.state       ][INFO    ][2224] File changed:
New file
2018-03-30 05:53:22,516 [salt.state       ][INFO    ][2224] Completed state [/etc/security/limits.d/90-salt-default.conf] at time 05:53:22.516444 duration_in_ms=108.499
2018-03-30 05:53:22,516 [salt.state       ][INFO    ][2224] Running state [apt-daily.timer] at time 05:53:22.516839
2018-03-30 05:53:22,517 [salt.state       ][INFO    ][2224] Executing state service.dead for apt-daily.timer
2018-03-30 05:53:22,518 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['systemctl', 'status', 'apt-daily.timer', '-n', '0'] in directory '/root'
2018-03-30 05:53:22,589 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['systemctl', 'is-active', 'apt-daily.timer'] in directory '/root'
2018-03-30 05:53:22,622 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['systemctl', 'is-enabled', 'apt-daily.timer'] in directory '/root'
2018-03-30 05:53:22,648 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['systemd-run', '--scope', 'systemctl', 'stop', 'apt-daily.timer'] in directory '/root'
2018-03-30 05:53:22,691 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['systemctl', 'is-active', 'apt-daily.timer'] in directory '/root'
2018-03-30 05:53:22,716 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['systemctl', 'is-enabled', 'apt-daily.timer'] in directory '/root'
2018-03-30 05:53:22,740 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['systemctl', 'is-enabled', 'apt-daily.timer'] in directory '/root'
2018-03-30 05:53:22,766 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['systemd-run', '--scope', 'systemctl', 'disable', 'apt-daily.timer'] in directory '/root'
2018-03-30 05:53:22,873 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['systemctl', 'is-enabled', 'apt-daily.timer'] in directory '/root'
2018-03-30 05:53:22,901 [salt.state       ][INFO    ][2224] {'apt-daily.timer': True}
2018-03-30 05:53:22,902 [salt.state       ][INFO    ][2224] Completed state [apt-daily.timer] at time 05:53:22.902538 duration_in_ms=385.699
2018-03-30 05:53:22,903 [salt.state       ][INFO    ][2224] Running state [/etc/systemd/system.conf.d/90-salt.conf] at time 05:53:22.903350
2018-03-30 05:53:22,904 [salt.state       ][INFO    ][2224] Executing state file.managed for /etc/systemd/system.conf.d/90-salt.conf
2018-03-30 05:53:22,937 [salt.fileclient  ][INFO    ][2224] Fetching file from saltenv 'base', ** done ** 'linux/files/systemd.conf'
2018-03-30 05:53:23,015 [salt.state       ][INFO    ][2224] File changed:
New file
2018-03-30 05:53:23,016 [salt.state       ][INFO    ][2224] Completed state [/etc/systemd/system.conf.d/90-salt.conf] at time 05:53:23.016458 duration_in_ms=113.108
2018-03-30 05:53:23,018 [salt.state       ][INFO    ][2224] Running state [service.systemctl_reload] at time 05:53:23.018402
2018-03-30 05:53:23,018 [salt.state       ][INFO    ][2224] Executing state module.wait for service.systemctl_reload
2018-03-30 05:53:23,019 [salt.state       ][INFO    ][2224] No changes made for service.systemctl_reload
2018-03-30 05:53:23,019 [salt.state       ][INFO    ][2224] Completed state [service.systemctl_reload] at time 05:53:23.019857 duration_in_ms=1.455
2018-03-30 05:53:23,020 [salt.state       ][INFO    ][2224] Running state [service.systemctl_reload] at time 05:53:23.020312
2018-03-30 05:53:23,020 [salt.state       ][INFO    ][2224] Executing state module.mod_watch for service.systemctl_reload
2018-03-30 05:53:23,021 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['systemctl', '--system', 'daemon-reload'] in directory '/root'
2018-03-30 05:53:23,135 [salt.state       ][INFO    ][2224] {'ret': True}
2018-03-30 05:53:23,136 [salt.state       ][INFO    ][2224] Completed state [service.systemctl_reload] at time 05:53:23.136549 duration_in_ms=116.236
2018-03-30 05:53:23,137 [salt.state       ][INFO    ][2224] Running state [/etc/hostname] at time 05:53:23.137091
2018-03-30 05:53:23,137 [salt.state       ][INFO    ][2224] Executing state file.managed for /etc/hostname
2018-03-30 05:53:23,244 [salt.fileclient  ][INFO    ][2224] Fetching file from saltenv 'base', ** done ** 'linux/files/hostname'
2018-03-30 05:53:23,252 [salt.state       ][INFO    ][2224] File changed:
--- 
+++ 
@@ -1 +1 @@
-ubuntu
+prx01

2018-03-30 05:53:23,254 [salt.state       ][INFO    ][2224] Completed state [/etc/hostname] at time 05:53:23.254911 duration_in_ms=117.819
2018-03-30 05:53:23,259 [salt.state       ][INFO    ][2224] Running state [hostname prx01] at time 05:53:23.258922
2018-03-30 05:53:23,259 [salt.state       ][INFO    ][2224] Executing state cmd.wait for hostname prx01
2018-03-30 05:53:23,260 [salt.state       ][INFO    ][2224] No changes made for hostname prx01
2018-03-30 05:53:23,261 [salt.state       ][INFO    ][2224] Completed state [hostname prx01] at time 05:53:23.261624 duration_in_ms=2.702
2018-03-30 05:53:23,262 [salt.state       ][INFO    ][2224] Running state [hostname prx01] at time 05:53:23.262492
2018-03-30 05:53:23,263 [salt.state       ][INFO    ][2224] Executing state cmd.mod_watch for hostname prx01
2018-03-30 05:53:23,265 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command 'hostname prx01' in directory '/root'
2018-03-30 05:53:23,289 [salt.state       ][INFO    ][2224] {'pid': 4021, 'retcode': 0, 'stderr': '', 'stdout': ''}
2018-03-30 05:53:23,290 [salt.state       ][INFO    ][2224] Completed state [hostname prx01] at time 05:53:23.290799 duration_in_ms=28.306
2018-03-30 05:53:23,292 [salt.state       ][INFO    ][2224] Running state [mdb02] at time 05:53:23.292767
2018-03-30 05:53:23,293 [salt.state       ][INFO    ][2224] Executing state host.present for mdb02
2018-03-30 05:53:23,295 [salt.state       ][INFO    ][2224] {'host': 'mdb02'}
2018-03-30 05:53:23,296 [salt.state       ][INFO    ][2224] Completed state [mdb02] at time 05:53:23.296336 duration_in_ms=3.569
2018-03-30 05:53:23,297 [salt.state       ][INFO    ][2224] Running state [mdb02.mcp-pike-ovs-dpdk-ha.local] at time 05:53:23.297138
2018-03-30 05:53:23,297 [salt.state       ][INFO    ][2224] Executing state host.present for mdb02.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:53:23,335 [salt.state       ][INFO    ][2224] {'host': 'mdb02.mcp-pike-ovs-dpdk-ha.local'}
2018-03-30 05:53:23,336 [salt.state       ][INFO    ][2224] Completed state [mdb02.mcp-pike-ovs-dpdk-ha.local] at time 05:53:23.336103 duration_in_ms=38.965
2018-03-30 05:53:23,337 [salt.state       ][INFO    ][2224] Running state [mdb03] at time 05:53:23.337029
2018-03-30 05:53:23,337 [salt.state       ][INFO    ][2224] Executing state host.present for mdb03
2018-03-30 05:53:23,497 [salt.state       ][INFO    ][2224] {'host': 'mdb03'}
2018-03-30 05:53:23,498 [salt.state       ][INFO    ][2224] Completed state [mdb03] at time 05:53:23.497984 duration_in_ms=160.955
2018-03-30 05:53:23,498 [salt.state       ][INFO    ][2224] Running state [mdb03.mcp-pike-ovs-dpdk-ha.local] at time 05:53:23.498903
2018-03-30 05:53:23,499 [salt.state       ][INFO    ][2224] Executing state host.present for mdb03.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:53:23,503 [salt.state       ][INFO    ][2224] {'host': 'mdb03.mcp-pike-ovs-dpdk-ha.local'}
2018-03-30 05:53:23,504 [salt.state       ][INFO    ][2224] Completed state [mdb03.mcp-pike-ovs-dpdk-ha.local] at time 05:53:23.503917 duration_in_ms=5.014
2018-03-30 05:53:23,505 [salt.state       ][INFO    ][2224] Running state [mdb01] at time 05:53:23.504979
2018-03-30 05:53:23,505 [salt.state       ][INFO    ][2224] Executing state host.present for mdb01
2018-03-30 05:53:23,509 [salt.state       ][INFO    ][2224] {'host': 'mdb01'}
2018-03-30 05:53:23,510 [salt.state       ][INFO    ][2224] Completed state [mdb01] at time 05:53:23.509942 duration_in_ms=4.963
2018-03-30 05:53:23,510 [salt.state       ][INFO    ][2224] Running state [mdb01.mcp-pike-ovs-dpdk-ha.local] at time 05:53:23.510851
2018-03-30 05:53:23,511 [salt.state       ][INFO    ][2224] Executing state host.present for mdb01.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:53:23,515 [salt.state       ][INFO    ][2224] {'host': 'mdb01.mcp-pike-ovs-dpdk-ha.local'}
2018-03-30 05:53:23,515 [salt.state       ][INFO    ][2224] Completed state [mdb01.mcp-pike-ovs-dpdk-ha.local] at time 05:53:23.515903 duration_in_ms=5.053
2018-03-30 05:53:23,516 [salt.state       ][INFO    ][2224] Running state [mdb] at time 05:53:23.516898
2018-03-30 05:53:23,517 [salt.state       ][INFO    ][2224] Executing state host.present for mdb
2018-03-30 05:53:23,521 [salt.state       ][INFO    ][2224] {'host': 'mdb'}
2018-03-30 05:53:23,522 [salt.state       ][INFO    ][2224] Completed state [mdb] at time 05:53:23.521947 duration_in_ms=5.05
2018-03-30 05:53:23,522 [salt.state       ][INFO    ][2224] Running state [mdb.mcp-pike-ovs-dpdk-ha.local] at time 05:53:23.522867
2018-03-30 05:53:23,523 [salt.state       ][INFO    ][2224] Executing state host.present for mdb.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:53:23,527 [salt.state       ][INFO    ][2224] {'host': 'mdb.mcp-pike-ovs-dpdk-ha.local'}
2018-03-30 05:53:23,528 [salt.state       ][INFO    ][2224] Completed state [mdb.mcp-pike-ovs-dpdk-ha.local] at time 05:53:23.527931 duration_in_ms=5.065
2018-03-30 05:53:23,529 [salt.state       ][INFO    ][2224] Running state [cfg01] at time 05:53:23.528945
2018-03-30 05:53:23,529 [salt.state       ][INFO    ][2224] Executing state host.present for cfg01
2018-03-30 05:53:23,533 [salt.state       ][INFO    ][2224] {'host': 'cfg01'}
2018-03-30 05:53:23,533 [salt.state       ][INFO    ][2224] Completed state [cfg01] at time 05:53:23.533875 duration_in_ms=4.93
2018-03-30 05:53:23,534 [salt.state       ][INFO    ][2224] Running state [cfg01.mcp-pike-ovs-dpdk-ha.local] at time 05:53:23.534746
2018-03-30 05:53:23,535 [salt.state       ][INFO    ][2224] Executing state host.present for cfg01.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:53:23,539 [salt.state       ][INFO    ][2224] {'host': 'cfg01.mcp-pike-ovs-dpdk-ha.local'}
2018-03-30 05:53:23,539 [salt.state       ][INFO    ][2224] Completed state [cfg01.mcp-pike-ovs-dpdk-ha.local] at time 05:53:23.539911 duration_in_ms=5.164
2018-03-30 05:53:23,540 [salt.state       ][INFO    ][2224] Running state [prx01] at time 05:53:23.540834
2018-03-30 05:53:23,541 [salt.state       ][INFO    ][2224] Executing state host.present for prx01
2018-03-30 05:53:23,544 [salt.state       ][INFO    ][2224] {'host': 'prx01'}
2018-03-30 05:53:23,545 [salt.state       ][INFO    ][2224] Completed state [prx01] at time 05:53:23.544981 duration_in_ms=4.147
2018-03-30 05:53:23,545 [salt.state       ][INFO    ][2224] Running state [prx01.mcp-pike-ovs-dpdk-ha.local] at time 05:53:23.545880
2018-03-30 05:53:23,546 [salt.state       ][INFO    ][2224] Executing state host.present for prx01.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:53:23,549 [salt.state       ][INFO    ][2224] {'host': 'prx01.mcp-pike-ovs-dpdk-ha.local'}
2018-03-30 05:53:23,550 [salt.state       ][INFO    ][2224] Completed state [prx01.mcp-pike-ovs-dpdk-ha.local] at time 05:53:23.550706 duration_in_ms=4.827
2018-03-30 05:53:23,551 [salt.state       ][INFO    ][2224] Running state [kvm01] at time 05:53:23.551569
2018-03-30 05:53:23,552 [salt.state       ][INFO    ][2224] Executing state host.present for kvm01
2018-03-30 05:53:23,555 [salt.state       ][INFO    ][2224] {'host': 'kvm01'}
2018-03-30 05:53:23,556 [salt.state       ][INFO    ][2224] Completed state [kvm01] at time 05:53:23.556650 duration_in_ms=5.081
2018-03-30 05:53:23,557 [salt.state       ][INFO    ][2224] Running state [kvm01.mcp-pike-ovs-dpdk-ha.local] at time 05:53:23.557619
2018-03-30 05:53:23,558 [salt.state       ][INFO    ][2224] Executing state host.present for kvm01.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:53:23,561 [salt.state       ][INFO    ][2224] {'host': 'kvm01.mcp-pike-ovs-dpdk-ha.local'}
2018-03-30 05:53:23,562 [salt.state       ][INFO    ][2224] Completed state [kvm01.mcp-pike-ovs-dpdk-ha.local] at time 05:53:23.562686 duration_in_ms=5.067
2018-03-30 05:53:23,563 [salt.state       ][INFO    ][2224] Running state [kvm03] at time 05:53:23.563561
2018-03-30 05:53:23,564 [salt.state       ][INFO    ][2224] Executing state host.present for kvm03
2018-03-30 05:53:23,567 [salt.state       ][INFO    ][2224] {'host': 'kvm03'}
2018-03-30 05:53:23,568 [salt.state       ][INFO    ][2224] Completed state [kvm03] at time 05:53:23.568627 duration_in_ms=5.066
2018-03-30 05:53:23,569 [salt.state       ][INFO    ][2224] Running state [kvm03.mcp-pike-ovs-dpdk-ha.local] at time 05:53:23.569540
2018-03-30 05:53:23,570 [salt.state       ][INFO    ][2224] Executing state host.present for kvm03.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:53:23,578 [salt.state       ][INFO    ][2224] {'host': 'kvm03.mcp-pike-ovs-dpdk-ha.local'}
2018-03-30 05:53:23,579 [salt.state       ][INFO    ][2224] Completed state [kvm03.mcp-pike-ovs-dpdk-ha.local] at time 05:53:23.579209 duration_in_ms=9.668
2018-03-30 05:53:23,580 [salt.state       ][INFO    ][2224] Running state [kvm02] at time 05:53:23.580003
2018-03-30 05:53:23,580 [salt.state       ][INFO    ][2224] Executing state host.present for kvm02
2018-03-30 05:53:23,711 [salt.state       ][INFO    ][2224] {'host': 'kvm02'}
2018-03-30 05:53:23,712 [salt.state       ][INFO    ][2224] Completed state [kvm02] at time 05:53:23.712602 duration_in_ms=132.598
2018-03-30 05:53:23,713 [salt.state       ][INFO    ][2224] Running state [kvm02.mcp-pike-ovs-dpdk-ha.local] at time 05:53:23.713709
2018-03-30 05:53:23,714 [salt.state       ][INFO    ][2224] Executing state host.present for kvm02.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:53:23,717 [salt.state       ][INFO    ][2224] {'host': 'kvm02.mcp-pike-ovs-dpdk-ha.local'}
2018-03-30 05:53:23,718 [salt.state       ][INFO    ][2224] Completed state [kvm02.mcp-pike-ovs-dpdk-ha.local] at time 05:53:23.718532 duration_in_ms=4.823
2018-03-30 05:53:23,719 [salt.state       ][INFO    ][2224] Running state [dbs] at time 05:53:23.719467
2018-03-30 05:53:23,720 [salt.state       ][INFO    ][2224] Executing state host.present for dbs
2018-03-30 05:53:23,723 [salt.state       ][INFO    ][2224] {'host': 'dbs'}
2018-03-30 05:53:23,724 [salt.state       ][INFO    ][2224] Completed state [dbs] at time 05:53:23.724524 duration_in_ms=5.057
2018-03-30 05:53:23,725 [salt.state       ][INFO    ][2224] Running state [dbs.mcp-pike-ovs-dpdk-ha.local] at time 05:53:23.725517
2018-03-30 05:53:23,726 [salt.state       ][INFO    ][2224] Executing state host.present for dbs.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:53:23,729 [salt.state       ][INFO    ][2224] {'host': 'dbs.mcp-pike-ovs-dpdk-ha.local'}
2018-03-30 05:53:23,730 [salt.state       ][INFO    ][2224] Completed state [dbs.mcp-pike-ovs-dpdk-ha.local] at time 05:53:23.730482 duration_in_ms=4.966
2018-03-30 05:53:23,731 [salt.state       ][INFO    ][2224] Running state [prx] at time 05:53:23.731346
2018-03-30 05:53:23,732 [salt.state       ][INFO    ][2224] Executing state host.present for prx
2018-03-30 05:53:23,735 [salt.state       ][INFO    ][2224] {'host': 'prx'}
2018-03-30 05:53:23,736 [salt.state       ][INFO    ][2224] Completed state [prx] at time 05:53:23.736529 duration_in_ms=5.183
2018-03-30 05:53:23,737 [salt.state       ][INFO    ][2224] Running state [prx.mcp-pike-ovs-dpdk-ha.local] at time 05:53:23.737472
2018-03-30 05:53:23,738 [salt.state       ][INFO    ][2224] Executing state host.present for prx.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:53:23,741 [salt.state       ][INFO    ][2224] {'host': 'prx.mcp-pike-ovs-dpdk-ha.local'}
2018-03-30 05:53:23,742 [salt.state       ][INFO    ][2224] Completed state [prx.mcp-pike-ovs-dpdk-ha.local] at time 05:53:23.742508 duration_in_ms=5.036
2018-03-30 05:53:23,743 [salt.state       ][INFO    ][2224] Running state [prx02] at time 05:53:23.743378
2018-03-30 05:53:23,744 [salt.state       ][INFO    ][2224] Executing state host.present for prx02
2018-03-30 05:53:23,747 [salt.state       ][INFO    ][2224] {'host': 'prx02'}
2018-03-30 05:53:23,748 [salt.state       ][INFO    ][2224] Completed state [prx02] at time 05:53:23.748537 duration_in_ms=5.159
2018-03-30 05:53:23,749 [salt.state       ][INFO    ][2224] Running state [prx02.mcp-pike-ovs-dpdk-ha.local] at time 05:53:23.749456
2018-03-30 05:53:23,750 [salt.state       ][INFO    ][2224] Executing state host.present for prx02.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:53:23,753 [salt.state       ][INFO    ][2224] {'host': 'prx02.mcp-pike-ovs-dpdk-ha.local'}
2018-03-30 05:53:23,754 [salt.state       ][INFO    ][2224] Completed state [prx02.mcp-pike-ovs-dpdk-ha.local] at time 05:53:23.754491 duration_in_ms=5.036
2018-03-30 05:53:23,755 [salt.state       ][INFO    ][2224] Running state [msg02] at time 05:53:23.755318
2018-03-30 05:53:23,756 [salt.state       ][INFO    ][2224] Executing state host.present for msg02
2018-03-30 05:53:23,759 [salt.state       ][INFO    ][2224] {'host': 'msg02'}
2018-03-30 05:53:23,760 [salt.state       ][INFO    ][2224] Completed state [msg02] at time 05:53:23.760512 duration_in_ms=5.195
2018-03-30 05:53:23,761 [salt.state       ][INFO    ][2224] Running state [msg02.mcp-pike-ovs-dpdk-ha.local] at time 05:53:23.761389
2018-03-30 05:53:23,762 [salt.state       ][INFO    ][2224] Executing state host.present for msg02.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:53:23,765 [salt.state       ][INFO    ][2224] {'host': 'msg02.mcp-pike-ovs-dpdk-ha.local'}
2018-03-30 05:53:23,766 [salt.state       ][INFO    ][2224] Completed state [msg02.mcp-pike-ovs-dpdk-ha.local] at time 05:53:23.766532 duration_in_ms=5.142
2018-03-30 05:53:23,767 [salt.state       ][INFO    ][2224] Running state [msg03] at time 05:53:23.767373
2018-03-30 05:53:23,768 [salt.state       ][INFO    ][2224] Executing state host.present for msg03
2018-03-30 05:53:23,956 [salt.state       ][INFO    ][2224] {'host': 'msg03'}
2018-03-30 05:53:23,957 [salt.state       ][INFO    ][2224] Completed state [msg03] at time 05:53:23.957571 duration_in_ms=190.198
2018-03-30 05:53:23,958 [salt.state       ][INFO    ][2224] Running state [msg03.mcp-pike-ovs-dpdk-ha.local] at time 05:53:23.958530
2018-03-30 05:53:23,959 [salt.state       ][INFO    ][2224] Executing state host.present for msg03.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:53:24,136 [salt.state       ][INFO    ][2224] {'host': 'msg03.mcp-pike-ovs-dpdk-ha.local'}
2018-03-30 05:53:24,138 [salt.state       ][INFO    ][2224] Completed state [msg03.mcp-pike-ovs-dpdk-ha.local] at time 05:53:24.137999 duration_in_ms=179.468
2018-03-30 05:53:24,139 [salt.state       ][INFO    ][2224] Running state [msg01] at time 05:53:24.139051
2018-03-30 05:53:24,140 [salt.state       ][INFO    ][2224] Executing state host.present for msg01
2018-03-30 05:53:24,151 [salt.state       ][INFO    ][2224] {'host': 'msg01'}
2018-03-30 05:53:24,152 [salt.state       ][INFO    ][2224] Completed state [msg01] at time 05:53:24.152315 duration_in_ms=13.264
2018-03-30 05:53:24,153 [salt.state       ][INFO    ][2224] Running state [msg01.mcp-pike-ovs-dpdk-ha.local] at time 05:53:24.153300
2018-03-30 05:53:24,154 [salt.state       ][INFO    ][2224] Executing state host.present for msg01.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:53:24,157 [salt.state       ][INFO    ][2224] {'host': 'msg01.mcp-pike-ovs-dpdk-ha.local'}
2018-03-30 05:53:24,158 [salt.state       ][INFO    ][2224] Completed state [msg01.mcp-pike-ovs-dpdk-ha.local] at time 05:53:24.158265 duration_in_ms=4.965
2018-03-30 05:53:24,159 [salt.state       ][INFO    ][2224] Running state [msg] at time 05:53:24.159178
2018-03-30 05:53:24,160 [salt.state       ][INFO    ][2224] Executing state host.present for msg
2018-03-30 05:53:24,211 [salt.state       ][INFO    ][2224] {'host': 'msg'}
2018-03-30 05:53:24,212 [salt.state       ][INFO    ][2224] Completed state [msg] at time 05:53:24.212312 duration_in_ms=53.134
2018-03-30 05:53:24,213 [salt.state       ][INFO    ][2224] Running state [msg.mcp-pike-ovs-dpdk-ha.local] at time 05:53:24.213356
2018-03-30 05:53:24,214 [salt.state       ][INFO    ][2224] Executing state host.present for msg.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:53:24,271 [salt.state       ][INFO    ][2224] {'host': 'msg.mcp-pike-ovs-dpdk-ha.local'}
2018-03-30 05:53:24,273 [salt.state       ][INFO    ][2224] Completed state [msg.mcp-pike-ovs-dpdk-ha.local] at time 05:53:24.273031 duration_in_ms=59.676
2018-03-30 05:53:24,274 [salt.state       ][INFO    ][2224] Running state [cfg01] at time 05:53:24.273967
2018-03-30 05:53:24,274 [salt.state       ][INFO    ][2224] Executing state host.present for cfg01
2018-03-30 05:53:24,276 [salt.state       ][INFO    ][2224] Host cfg01 (10.167.4.11) already present
2018-03-30 05:53:24,276 [salt.state       ][INFO    ][2224] Completed state [cfg01] at time 05:53:24.276836 duration_in_ms=2.87
2018-03-30 05:53:24,277 [salt.state       ][INFO    ][2224] Running state [cfg01.mcp-pike-ovs-dpdk-ha.local] at time 05:53:24.277688
2018-03-30 05:53:24,278 [salt.state       ][INFO    ][2224] Executing state host.present for cfg01.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:53:24,279 [salt.state       ][INFO    ][2224] Host cfg01.mcp-pike-ovs-dpdk-ha.local (10.167.4.11) already present
2018-03-30 05:53:24,280 [salt.state       ][INFO    ][2224] Completed state [cfg01.mcp-pike-ovs-dpdk-ha.local] at time 05:53:24.280456 duration_in_ms=2.768
2018-03-30 05:53:24,281 [salt.state       ][INFO    ][2224] Running state [cmp002] at time 05:53:24.281331
2018-03-30 05:53:24,282 [salt.state       ][INFO    ][2224] Executing state host.present for cmp002
2018-03-30 05:53:24,284 [salt.state       ][INFO    ][2224] {'host': 'cmp002'}
2018-03-30 05:53:24,285 [salt.state       ][INFO    ][2224] Completed state [cmp002] at time 05:53:24.284926 duration_in_ms=3.596
2018-03-30 05:53:24,285 [salt.state       ][INFO    ][2224] Running state [cmp002.mcp-pike-ovs-dpdk-ha.local] at time 05:53:24.285727
2018-03-30 05:53:24,286 [salt.state       ][INFO    ][2224] Executing state host.present for cmp002.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:53:24,289 [salt.state       ][INFO    ][2224] {'host': 'cmp002.mcp-pike-ovs-dpdk-ha.local'}
2018-03-30 05:53:24,289 [salt.state       ][INFO    ][2224] Completed state [cmp002.mcp-pike-ovs-dpdk-ha.local] at time 05:53:24.289828 duration_in_ms=4.101
2018-03-30 05:53:24,290 [salt.state       ][INFO    ][2224] Running state [cmp001] at time 05:53:24.290541
2018-03-30 05:53:24,291 [salt.state       ][INFO    ][2224] Executing state host.present for cmp001
2018-03-30 05:53:24,295 [salt.state       ][INFO    ][2224] {'host': 'cmp001'}
2018-03-30 05:53:24,295 [salt.state       ][INFO    ][2224] Completed state [cmp001] at time 05:53:24.295848 duration_in_ms=5.308
2018-03-30 05:53:24,296 [salt.state       ][INFO    ][2224] Running state [cmp001.mcp-pike-ovs-dpdk-ha.local] at time 05:53:24.296585
2018-03-30 05:53:24,297 [salt.state       ][INFO    ][2224] Executing state host.present for cmp001.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:53:24,301 [salt.state       ][INFO    ][2224] {'host': 'cmp001.mcp-pike-ovs-dpdk-ha.local'}
2018-03-30 05:53:24,301 [salt.state       ][INFO    ][2224] Completed state [cmp001.mcp-pike-ovs-dpdk-ha.local] at time 05:53:24.301799 duration_in_ms=5.213
2018-03-30 05:53:24,302 [salt.state       ][INFO    ][2224] Running state [dbs01] at time 05:53:24.302531
2018-03-30 05:53:24,303 [salt.state       ][INFO    ][2224] Executing state host.present for dbs01
2018-03-30 05:53:24,307 [salt.state       ][INFO    ][2224] {'host': 'dbs01'}
2018-03-30 05:53:24,307 [salt.state       ][INFO    ][2224] Completed state [dbs01] at time 05:53:24.307823 duration_in_ms=5.292
2018-03-30 05:53:24,308 [salt.state       ][INFO    ][2224] Running state [dbs01.mcp-pike-ovs-dpdk-ha.local] at time 05:53:24.308553
2018-03-30 05:53:24,309 [salt.state       ][INFO    ][2224] Executing state host.present for dbs01.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:53:24,313 [salt.state       ][INFO    ][2224] {'host': 'dbs01.mcp-pike-ovs-dpdk-ha.local'}
2018-03-30 05:53:24,313 [salt.state       ][INFO    ][2224] Completed state [dbs01.mcp-pike-ovs-dpdk-ha.local] at time 05:53:24.313782 duration_in_ms=5.23
2018-03-30 05:53:24,314 [salt.state       ][INFO    ][2224] Running state [dbs02] at time 05:53:24.314463
2018-03-30 05:53:24,315 [salt.state       ][INFO    ][2224] Executing state host.present for dbs02
2018-03-30 05:53:24,319 [salt.state       ][INFO    ][2224] {'host': 'dbs02'}
2018-03-30 05:53:24,319 [salt.state       ][INFO    ][2224] Completed state [dbs02] at time 05:53:24.319835 duration_in_ms=5.373
2018-03-30 05:53:24,320 [salt.state       ][INFO    ][2224] Running state [dbs02.mcp-pike-ovs-dpdk-ha.local] at time 05:53:24.320545
2018-03-30 05:53:24,321 [salt.state       ][INFO    ][2224] Executing state host.present for dbs02.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:53:24,325 [salt.state       ][INFO    ][2224] {'host': 'dbs02.mcp-pike-ovs-dpdk-ha.local'}
2018-03-30 05:53:24,325 [salt.state       ][INFO    ][2224] Completed state [dbs02.mcp-pike-ovs-dpdk-ha.local] at time 05:53:24.325780 duration_in_ms=5.236
2018-03-30 05:53:24,326 [salt.state       ][INFO    ][2224] Running state [dbs03] at time 05:53:24.326472
2018-03-30 05:53:24,327 [salt.state       ][INFO    ][2224] Executing state host.present for dbs03
2018-03-30 05:53:24,331 [salt.state       ][INFO    ][2224] {'host': 'dbs03'}
2018-03-30 05:53:24,331 [salt.state       ][INFO    ][2224] Completed state [dbs03] at time 05:53:24.331873 duration_in_ms=5.401
2018-03-30 05:53:24,332 [salt.state       ][INFO    ][2224] Running state [dbs03.mcp-pike-ovs-dpdk-ha.local] at time 05:53:24.332576
2018-03-30 05:53:24,333 [salt.state       ][INFO    ][2224] Executing state host.present for dbs03.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:53:24,337 [salt.state       ][INFO    ][2224] {'host': 'dbs03.mcp-pike-ovs-dpdk-ha.local'}
2018-03-30 05:53:24,337 [salt.state       ][INFO    ][2224] Completed state [dbs03.mcp-pike-ovs-dpdk-ha.local] at time 05:53:24.337813 duration_in_ms=5.236
2018-03-30 05:53:24,338 [salt.state       ][INFO    ][2224] Running state [mas01] at time 05:53:24.338477
2018-03-30 05:53:24,339 [salt.state       ][INFO    ][2224] Executing state host.present for mas01
2018-03-30 05:53:24,346 [salt.state       ][INFO    ][2224] {'host': 'mas01'}
2018-03-30 05:53:24,347 [salt.state       ][INFO    ][2224] Completed state [mas01] at time 05:53:24.347369 duration_in_ms=8.891
2018-03-30 05:53:24,348 [salt.state       ][INFO    ][2224] Running state [mas01.mcp-pike-ovs-dpdk-ha.local] at time 05:53:24.348082
2018-03-30 05:53:24,348 [salt.state       ][INFO    ][2224] Executing state host.present for mas01.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:53:24,352 [salt.state       ][INFO    ][2224] {'host': 'mas01.mcp-pike-ovs-dpdk-ha.local'}
2018-03-30 05:53:24,353 [salt.state       ][INFO    ][2224] Completed state [mas01.mcp-pike-ovs-dpdk-ha.local] at time 05:53:24.353194 duration_in_ms=5.111
2018-03-30 05:53:24,354 [salt.state       ][INFO    ][2224] Running state [ctl02] at time 05:53:24.354012
2018-03-30 05:53:24,354 [salt.state       ][INFO    ][2224] Executing state host.present for ctl02
2018-03-30 05:53:24,358 [salt.state       ][INFO    ][2224] {'host': 'ctl02'}
2018-03-30 05:53:24,359 [salt.state       ][INFO    ][2224] Completed state [ctl02] at time 05:53:24.358992 duration_in_ms=4.981
2018-03-30 05:53:24,359 [salt.state       ][INFO    ][2224] Running state [ctl02.mcp-pike-ovs-dpdk-ha.local] at time 05:53:24.359611
2018-03-30 05:53:24,360 [salt.state       ][INFO    ][2224] Executing state host.present for ctl02.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:53:24,364 [salt.state       ][INFO    ][2224] {'host': 'ctl02.mcp-pike-ovs-dpdk-ha.local'}
2018-03-30 05:53:24,365 [salt.state       ][INFO    ][2224] Completed state [ctl02.mcp-pike-ovs-dpdk-ha.local] at time 05:53:24.365097 duration_in_ms=5.486
2018-03-30 05:53:24,365 [salt.state       ][INFO    ][2224] Running state [ctl03] at time 05:53:24.365766
2018-03-30 05:53:24,366 [salt.state       ][INFO    ][2224] Executing state host.present for ctl03
2018-03-30 05:53:24,370 [salt.state       ][INFO    ][2224] {'host': 'ctl03'}
2018-03-30 05:53:24,371 [salt.state       ][INFO    ][2224] Completed state [ctl03] at time 05:53:24.371122 duration_in_ms=5.355
2018-03-30 05:53:24,371 [salt.state       ][INFO    ][2224] Running state [ctl03.mcp-pike-ovs-dpdk-ha.local] at time 05:53:24.371736
2018-03-30 05:53:24,372 [salt.state       ][INFO    ][2224] Executing state host.present for ctl03.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:53:24,388 [salt.state       ][INFO    ][2224] {'host': 'ctl03.mcp-pike-ovs-dpdk-ha.local'}
2018-03-30 05:53:24,389 [salt.state       ][INFO    ][2224] Completed state [ctl03.mcp-pike-ovs-dpdk-ha.local] at time 05:53:24.389384 duration_in_ms=17.648
2018-03-30 05:53:24,390 [salt.state       ][INFO    ][2224] Running state [ctl01] at time 05:53:24.390006
2018-03-30 05:53:24,390 [salt.state       ][INFO    ][2224] Executing state host.present for ctl01
2018-03-30 05:53:24,394 [salt.state       ][INFO    ][2224] {'host': 'ctl01'}
2018-03-30 05:53:24,395 [salt.state       ][INFO    ][2224] Completed state [ctl01] at time 05:53:24.395351 duration_in_ms=5.344
2018-03-30 05:53:24,396 [salt.state       ][INFO    ][2224] Running state [ctl01.mcp-pike-ovs-dpdk-ha.local] at time 05:53:24.396223
2018-03-30 05:53:24,397 [salt.state       ][INFO    ][2224] Executing state host.present for ctl01.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:53:24,400 [salt.state       ][INFO    ][2224] {'host': 'ctl01.mcp-pike-ovs-dpdk-ha.local'}
2018-03-30 05:53:24,401 [salt.state       ][INFO    ][2224] Completed state [ctl01.mcp-pike-ovs-dpdk-ha.local] at time 05:53:24.401201 duration_in_ms=4.979
2018-03-30 05:53:24,401 [salt.state       ][INFO    ][2224] Running state [ctl] at time 05:53:24.401850
2018-03-30 05:53:24,402 [salt.state       ][INFO    ][2224] Executing state host.present for ctl
2018-03-30 05:53:24,406 [salt.state       ][INFO    ][2224] {'host': 'ctl'}
2018-03-30 05:53:24,407 [salt.state       ][INFO    ][2224] Completed state [ctl] at time 05:53:24.406995 duration_in_ms=5.145
2018-03-30 05:53:24,407 [salt.state       ][INFO    ][2224] Running state [ctl.mcp-pike-ovs-dpdk-ha.local] at time 05:53:24.407695
2018-03-30 05:53:24,408 [salt.state       ][INFO    ][2224] Executing state host.present for ctl.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:53:24,412 [salt.state       ][INFO    ][2224] {'host': 'ctl.mcp-pike-ovs-dpdk-ha.local'}
2018-03-30 05:53:24,413 [salt.state       ][INFO    ][2224] Completed state [ctl.mcp-pike-ovs-dpdk-ha.local] at time 05:53:24.413041 duration_in_ms=5.346
2018-03-30 05:53:24,413 [salt.state       ][INFO    ][2224] Running state [/etc/network/interfaces.d/50-cloud-init.cfg] at time 05:53:24.413669
2018-03-30 05:53:24,414 [salt.state       ][INFO    ][2224] Executing state file.absent for /etc/network/interfaces.d/50-cloud-init.cfg
2018-03-30 05:53:24,415 [salt.state       ][INFO    ][2224] {'removed': '/etc/network/interfaces.d/50-cloud-init.cfg'}
2018-03-30 05:53:24,415 [salt.state       ][INFO    ][2224] Completed state [/etc/network/interfaces.d/50-cloud-init.cfg] at time 05:53:24.415574 duration_in_ms=1.905
2018-03-30 05:53:24,428 [salt.state       ][INFO    ][2224] Running state [ens2] at time 05:53:24.428610
2018-03-30 05:53:24,429 [salt.state       ][INFO    ][2224] Executing state network.managed for ens2
2018-03-30 05:53:24,648 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['ifdown', 'ens2'] in directory '/root'
2018-03-30 05:53:25,959 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['ifup', 'ens2'] in directory '/root'
2018-03-30 05:53:26,721 [salt.state       ][INFO    ][2224] {'interface': 'Added network interface.', 'status': 'Interface ens2 restart to validate'}
2018-03-30 05:53:26,722 [salt.state       ][INFO    ][2224] Completed state [ens2] at time 05:53:26.722483 duration_in_ms=2293.872
2018-03-30 05:53:26,723 [salt.state       ][INFO    ][2224] Running state [ens3] at time 05:53:26.722978
2018-03-30 05:53:26,723 [salt.state       ][INFO    ][2224] Executing state network.managed for ens3
2018-03-30 05:53:26,762 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['ifup', 'ens3'] in directory '/root'
2018-03-30 05:53:27,301 [salt.state       ][INFO    ][2224] {'interface': 'Added network interface.', 'status': 'Interface ens3 is up'}
2018-03-30 05:53:27,301 [salt.state       ][INFO    ][2224] Completed state [ens3] at time 05:53:27.301713 duration_in_ms=578.735
2018-03-30 05:53:27,301 [salt.state       ][INFO    ][2224] Running state [ens3] at time 05:53:27.301957
2018-03-30 05:53:27,302 [salt.state       ][INFO    ][2224] Executing state network.routes for ens3
2018-03-30 05:53:27,317 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['systemctl', 'status', 'networking.service', '-n', '0'] in directory '/root'
2018-03-30 05:53:27,333 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['systemd-run', '--scope', 'systemctl', 'stop', 'networking.service'] in directory '/root'
2018-03-30 05:53:30,957 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['systemctl', 'is-enabled', 'networking.service'] in directory '/root'
2018-03-30 05:53:30,988 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['systemd-run', '--scope', 'systemctl', 'start', 'networking.service'] in directory '/root'
2018-03-30 05:53:31,550 [salt.loaded.int.module.cmdmod][ERROR   ][2224] Command '['systemd-run', '--scope', 'systemctl', 'start', 'networking.service']' failed with return code: 1
2018-03-30 05:53:31,550 [salt.loaded.int.module.cmdmod][ERROR   ][2224] output: Running scope as unit run-r7d5aef7728db42bead6e52a6d9adaf3e.scope.
Job for networking.service failed because the control process exited with error code. See "systemctl status networking.service" and "journalctl -xe" for details.
2018-03-30 05:53:31,551 [salt.state       ][INFO    ][2224] {'network_routes': 'Added interface ens3 routes.'}
2018-03-30 05:53:31,551 [salt.state       ][INFO    ][2224] Completed state [ens3] at time 05:53:31.551549 duration_in_ms=4249.593
2018-03-30 05:53:31,552 [salt.state       ][INFO    ][2224] Running state [ens4] at time 05:53:31.552034
2018-03-30 05:53:31,552 [salt.state       ][INFO    ][2224] Executing state network.managed for ens4
2018-03-30 05:53:31,592 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['ifup', 'ens4'] in directory '/root'
2018-03-30 05:53:31,800 [salt.loaded.int.module.cmdmod][ERROR   ][2224] Command '['ifup', 'ens4']' failed with return code: 1
2018-03-30 05:53:31,801 [salt.loaded.int.module.cmdmod][ERROR   ][2224] output: SIOCADDRT: File exists
run-parts: /etc/network/if-up.d/route-ens3 exited with return code 7
Failed to bring up ens4.
2018-03-30 05:53:32,151 [salt.minion      ][INFO    ][2133] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330055332155037
2018-03-30 05:53:32,181 [salt.minion      ][INFO    ][4617] Starting a new job with PID 4617
2018-03-30 05:53:32,200 [salt.minion      ][INFO    ][4617] Returning information for job: 20180330055332155037
2018-03-30 05:53:32,345 [salt.state       ][INFO    ][2224] {'interface': 'Added network interface.', 'status': 'Interface ens4 is up'}
2018-03-30 05:53:32,345 [salt.state       ][INFO    ][2224] Completed state [ens4] at time 05:53:32.345909 duration_in_ms=793.875
2018-03-30 05:53:32,346 [salt.state       ][INFO    ][2224] Running state [/etc/profile.d/proxy.sh] at time 05:53:32.346388
2018-03-30 05:53:32,346 [salt.state       ][INFO    ][2224] Executing state file.absent for /etc/profile.d/proxy.sh
2018-03-30 05:53:32,347 [salt.state       ][INFO    ][2224] File /etc/profile.d/proxy.sh is not present
2018-03-30 05:53:32,347 [salt.state       ][INFO    ][2224] Completed state [/etc/profile.d/proxy.sh] at time 05:53:32.347803 duration_in_ms=1.414
2018-03-30 05:53:32,348 [salt.state       ][INFO    ][2224] Running state [/etc/apt/apt.conf.d/95proxies] at time 05:53:32.348178
2018-03-30 05:53:32,348 [salt.state       ][INFO    ][2224] Executing state file.absent for /etc/apt/apt.conf.d/95proxies
2018-03-30 05:53:32,349 [salt.state       ][INFO    ][2224] File /etc/apt/apt.conf.d/95proxies is not present
2018-03-30 05:53:32,349 [salt.state       ][INFO    ][2224] Completed state [/etc/apt/apt.conf.d/95proxies] at time 05:53:32.349332 duration_in_ms=1.153
2018-03-30 05:53:32,350 [salt.state       ][INFO    ][2224] Running state [ntp] at time 05:53:32.350673
2018-03-30 05:53:32,351 [salt.state       ][INFO    ][2224] Executing state pkg.installed for ntp
2018-03-30 05:53:32,529 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-03-30 05:53:32,556 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'ntp'] in directory '/root'
2018-03-30 05:53:40,134 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-03-30 05:53:40,174 [salt.state       ][INFO    ][2224] Made the following changes:
'ntp' changed from 'absent' to '1:4.2.8p4+dfsg-3ubuntu5.8'
'libopts25' changed from 'absent' to '1:5.18.7-3'

2018-03-30 05:53:40,186 [salt.state       ][INFO    ][2224] Loading fresh modules for state activity
2018-03-30 05:53:40,209 [salt.state       ][INFO    ][2224] Completed state [ntp] at time 05:53:40.209730 duration_in_ms=7859.053
2018-03-30 05:53:40,217 [salt.state       ][INFO    ][2224] Running state [/etc/ntp.conf] at time 05:53:40.217617
2018-03-30 05:53:40,218 [salt.state       ][INFO    ][2224] Executing state file.managed for /etc/ntp.conf
2018-03-30 05:53:40,254 [salt.fileclient  ][INFO    ][2224] Fetching file from saltenv 'base', ** done ** 'ntp/files/ntp.conf'
2018-03-30 05:53:40,302 [salt.state       ][INFO    ][2224] File changed:
--- 
+++ 
@@ -1,66 +1,24 @@
-# /etc/ntp.conf, configuration for ntpd; see ntp.conf(5) for help
 
-driftfile /var/lib/ntp/ntp.drift
 
-# Enable this if you want statistics to be logged.
-#statsdir /var/log/ntpstats/
+# ntpd will only synchronize your clock.
 
-statistics loopstats peerstats clockstats
-filegen loopstats file loopstats type day enable
-filegen peerstats file peerstats type day enable
-filegen clockstats file clockstats type day enable
+# For details, see:
+# - the ntp.conf man page
+# - http://support.ntp.org/bin/view/Support/GettingStarted
+# - https://wiki.archlinux.org/index.php/Network_Time_Protocol_daemon
 
-# Specify one or more NTP servers.
+# Associate to cloud NTP pool servers
+server 1.pool.ntp.org iburst
+server 0.pool.ntp.org
 
-# Use servers from the NTP Pool Project. Approved by Ubuntu Technical Board
-# on 2011-02-08 (LP: #104525). See http://www.pool.ntp.org/join.html for
-# more information.
-pool 0.ubuntu.pool.ntp.org iburst
-pool 1.ubuntu.pool.ntp.org iburst
-pool 2.ubuntu.pool.ntp.org iburst
-pool 3.ubuntu.pool.ntp.org iburst
-
-# Use Ubuntu's ntp server as a fallback.
-pool ntp.ubuntu.com
-
-# Access control configuration; see /usr/share/doc/ntp-doc/html/accopt.html for
-# details.  The web page <http://support.ntp.org/bin/view/Support/AccessRestrictions>
-# might also be helpful.
-#
-# Note that "restrict" applies to both servers and clients, so a configuration
-# that might be intended to block requests from certain clients could also end
-# up blocking replies from your own upstream servers.
-
-# By default, exchange time with everybody, but don't allow configuration.
-restrict -4 default kod notrap nomodify nopeer noquery limited
-restrict -6 default kod notrap nomodify nopeer noquery limited
-
-# Local users may interrogate the ntp server more closely.
+# Only allow read-only access from localhost
+restrict default noquery nopeer
 restrict 127.0.0.1
 restrict ::1
 
-# Needed for adding pool entries
-restrict source notrap nomodify noquery
-
-# Clients from this (example!) subnet have unlimited access, but only if
-# cryptographically authenticated.
-#restrict 192.168.123.0 mask 255.255.255.0 notrust
+# mode7 is required for collectd monitoring
 
 
-# If you want to provide time to your local subnet, change the next line.
-# (Again, the address is an example only.)
-#broadcast 192.168.123.255
-
-# If you want to listen to time broadcasts on your local subnet, de-comment the
-# next lines.  Please do this only if you trust everybody on the network!
-#disable auth
-#broadcastclient
-
-#Changes recquired to use pps synchonisation as explained in documentation:
-#http://www.ntp.org/ntpfaq/NTP-s-config-adv.htm#AEN3918
-
-#server 127.127.8.1 mode 135 prefer    # Meinberg GPS167 with PPS
-#fudge 127.127.8.1 time1 0.0042        # relative to PPS for my hardware
-
-#server 127.127.22.1                   # ATOM(PPS)
-#fudge 127.127.22.1 flag3 1            # enable PPS API
+# Location of drift file
+driftfile /var/lib/ntp/ntp.drift
+logfile /var/log/ntp.log

2018-03-30 05:53:40,310 [salt.state       ][INFO    ][2224] Completed state [/etc/ntp.conf] at time 05:53:40.310366 duration_in_ms=92.75
2018-03-30 05:53:40,466 [salt.state       ][INFO    ][2224] Running state [ntp] at time 05:53:40.466282
2018-03-30 05:53:40,466 [salt.state       ][INFO    ][2224] Executing state service.running for ntp
2018-03-30 05:53:40,467 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['systemctl', 'status', 'ntp.service', '-n', '0'] in directory '/root'
2018-03-30 05:53:40,486 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['systemctl', 'is-active', 'ntp.service'] in directory '/root'
2018-03-30 05:53:40,503 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['systemctl', 'is-enabled', 'ntp.service'] in directory '/root'
2018-03-30 05:53:40,521 [salt.state       ][INFO    ][2224] The service ntp is already running
2018-03-30 05:53:40,521 [salt.state       ][INFO    ][2224] Completed state [ntp] at time 05:53:40.521756 duration_in_ms=55.474
2018-03-30 05:53:40,522 [salt.state       ][INFO    ][2224] Running state [ntp] at time 05:53:40.521998
2018-03-30 05:53:40,522 [salt.state       ][INFO    ][2224] Executing state service.mod_watch for ntp
2018-03-30 05:53:40,523 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['systemctl', 'is-active', 'ntp.service'] in directory '/root'
2018-03-30 05:53:40,539 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['systemctl', 'is-enabled', 'ntp.service'] in directory '/root'
2018-03-30 05:53:40,558 [salt.loaded.int.module.cmdmod][INFO    ][2224] Executing command ['systemd-run', '--scope', 'systemctl', 'restart', 'ntp.service'] in directory '/root'
2018-03-30 05:53:40,636 [salt.state       ][INFO    ][2224] {'ntp': True}
2018-03-30 05:53:40,636 [salt.state       ][INFO    ][2224] Completed state [ntp] at time 05:53:40.636665 duration_in_ms=114.667
2018-03-30 05:53:40,639 [salt.minion      ][INFO    ][2224] Returning information for job: 20180330055241432131
2018-03-30 05:54:13,132 [salt.minion      ][INFO    ][2133] User sudo_ubuntu Executing command state.apply with jid 20180330055413131419
2018-03-30 05:54:13,161 [salt.minion      ][INFO    ][5524] Starting a new job with PID 5524
2018-03-30 05:54:16,163 [salt.state       ][INFO    ][5524] Loading fresh modules for state activity
2018-03-30 05:54:18,387 [salt.state       ][INFO    ][5524] Running state [/etc/environment] at time 05:54:18.387916
2018-03-30 05:54:18,388 [salt.state       ][INFO    ][5524] Executing state file.blockreplace for /etc/environment
2018-03-30 05:54:18,394 [salt.state       ][INFO    ][5524] File changed:
--- 
+++ 
@@ -1,3 +1,4 @@
 PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games"
 # SALT MANAGED VARIABLES - DO NOT EDIT - START
+# 
 # # SALT MANAGED VARIABLES - END

2018-03-30 05:54:18,394 [salt.state       ][INFO    ][5524] Completed state [/etc/environment] at time 05:54:18.394619 duration_in_ms=6.703
2018-03-30 05:54:18,394 [salt.state       ][INFO    ][5524] Running state [/etc/profile.d] at time 05:54:18.394813
2018-03-30 05:54:18,394 [salt.state       ][INFO    ][5524] Executing state file.directory for /etc/profile.d
2018-03-30 05:54:18,395 [salt.state       ][INFO    ][5524] Directory /etc/profile.d is in the correct state
2018-03-30 05:54:18,396 [salt.state       ][INFO    ][5524] Completed state [/etc/profile.d] at time 05:54:18.396091 duration_in_ms=1.277
2018-03-30 05:54:18,790 [salt.state       ][INFO    ][5524] Running state [/etc/apt/apt.conf.d/99prefer_ipv4-salt] at time 05:54:18.790884
2018-03-30 05:54:18,791 [salt.state       ][INFO    ][5524] Executing state file.managed for /etc/apt/apt.conf.d/99prefer_ipv4-salt
2018-03-30 05:54:18,812 [salt.state       ][INFO    ][5524] File /etc/apt/apt.conf.d/99prefer_ipv4-salt is in the correct state
2018-03-30 05:54:18,812 [salt.state       ][INFO    ][5524] Completed state [/etc/apt/apt.conf.d/99prefer_ipv4-salt] at time 05:54:18.812498 duration_in_ms=21.615
2018-03-30 05:54:18,813 [salt.state       ][INFO    ][5524] Running state [linux_repo_prereq_pkgs] at time 05:54:18.812996
2018-03-30 05:54:18,813 [salt.state       ][INFO    ][5524] Executing state pkg.installed for linux_repo_prereq_pkgs
2018-03-30 05:54:18,813 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-03-30 05:54:19,117 [salt.state       ][INFO    ][5524] All specified packages are already installed
2018-03-30 05:54:19,117 [salt.state       ][INFO    ][5524] Completed state [linux_repo_prereq_pkgs] at time 05:54:19.117352 duration_in_ms=304.356
2018-03-30 05:54:19,117 [salt.state       ][INFO    ][5524] Running state [/etc/apt/apt.conf.d/99proxies-salt-uca] at time 05:54:19.117640
2018-03-30 05:54:19,117 [salt.state       ][INFO    ][5524] Executing state file.absent for /etc/apt/apt.conf.d/99proxies-salt-uca
2018-03-30 05:54:19,118 [salt.state       ][INFO    ][5524] File /etc/apt/apt.conf.d/99proxies-salt-uca is not present
2018-03-30 05:54:19,118 [salt.state       ][INFO    ][5524] Completed state [/etc/apt/apt.conf.d/99proxies-salt-uca] at time 05:54:19.118399 duration_in_ms=0.759
2018-03-30 05:54:19,118 [salt.state       ][INFO    ][5524] Running state [/etc/apt/preferences.d/uca] at time 05:54:19.118557
2018-03-30 05:54:19,118 [salt.state       ][INFO    ][5524] Executing state file.absent for /etc/apt/preferences.d/uca
2018-03-30 05:54:19,118 [salt.state       ][INFO    ][5524] File /etc/apt/preferences.d/uca is not present
2018-03-30 05:54:19,119 [salt.state       ][INFO    ][5524] Completed state [/etc/apt/preferences.d/uca] at time 05:54:19.119029 duration_in_ms=0.472
2018-03-30 05:54:19,120 [salt.state       ][INFO    ][5524] Running state [apt-key adv --keyserver keyserver.ubuntu.com --recv EC4926EA] at time 05:54:19.120499
2018-03-30 05:54:19,120 [salt.state       ][INFO    ][5524] Executing state cmd.run for apt-key adv --keyserver keyserver.ubuntu.com --recv EC4926EA
2018-03-30 05:54:19,121 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command 'apt-key adv --keyserver keyserver.ubuntu.com --recv EC4926EA' in directory '/root'
2018-03-30 05:54:19,404 [salt.state       ][INFO    ][5524] {'pid': 5584, 'retcode': 0, 'stderr': 'gpg: requesting key EC4926EA from hkp server keyserver.ubuntu.com\ngpg: key EC4926EA: "Canonical Cloud Archive Signing Key <ftpmaster@canonical.com>" not changed\ngpg: Total number processed: 1\ngpg:              unchanged: 1', 'stdout': 'Executing: /tmp/tmp.SauVa76KLT/gpg.1.sh --keyserver\nkeyserver.ubuntu.com\n--recv\nEC4926EA'}
2018-03-30 05:54:19,405 [salt.state       ][INFO    ][5524] Completed state [apt-key adv --keyserver keyserver.ubuntu.com --recv EC4926EA] at time 05:54:19.405463 duration_in_ms=284.963
2018-03-30 05:54:19,412 [salt.state       ][INFO    ][5524] Running state [deb http://ubuntu-cloud.archive.canonical.com/ubuntu xenial-updates/pike main] at time 05:54:19.412136
2018-03-30 05:54:19,412 [salt.state       ][INFO    ][5524] Executing state pkgrepo.managed for deb http://ubuntu-cloud.archive.canonical.com/ubuntu xenial-updates/pike main
2018-03-30 05:54:19,494 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command ['apt-get', '-q', 'update'] in directory '/root'
2018-03-30 05:54:22,547 [salt.state       ][INFO    ][5524] Configured package repo 'deb http://ubuntu-cloud.archive.canonical.com/ubuntu xenial-updates/pike main'
2018-03-30 05:54:22,548 [salt.state       ][INFO    ][5524] Completed state [deb http://ubuntu-cloud.archive.canonical.com/ubuntu xenial-updates/pike main] at time 05:54:22.548210 duration_in_ms=3136.073
2018-03-30 05:54:22,549 [salt.state       ][INFO    ][5524] Running state [linux_extra_packages_latest] at time 05:54:22.549179
2018-03-30 05:54:22,550 [salt.state       ][INFO    ][5524] Executing state pkg.latest for linux_extra_packages_latest
2018-03-30 05:54:22,566 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command ['apt-cache', '-q', 'policy', 'libapache2-mod-wsgi'] in directory '/root'
2018-03-30 05:54:22,639 [salt.state       ][INFO    ][5524] Package libapache2-mod-wsgi is already up-to-date
2018-03-30 05:54:22,640 [salt.state       ][INFO    ][5524] Completed state [linux_extra_packages_latest] at time 05:54:22.640713 duration_in_ms=91.534
2018-03-30 05:54:22,642 [salt.state       ][INFO    ][5524] Running state [UTC] at time 05:54:22.642274
2018-03-30 05:54:22,642 [salt.state       ][INFO    ][5524] Executing state timezone.system for UTC
2018-03-30 05:54:22,644 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command ['timedatectl'] in directory '/root'
2018-03-30 05:54:22,693 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command ['timedatectl'] in directory '/root'
2018-03-30 05:54:22,715 [salt.state       ][INFO    ][5524] Timezone UTC already set, UTC already set to UTC
2018-03-30 05:54:22,716 [salt.state       ][INFO    ][5524] Completed state [UTC] at time 05:54:22.716729 duration_in_ms=74.454
2018-03-30 05:54:22,718 [salt.state       ][INFO    ][5524] Running state [nf_conntrack] at time 05:54:22.718294
2018-03-30 05:54:22,718 [salt.state       ][INFO    ][5524] Executing state kmod.present for nf_conntrack
2018-03-30 05:54:22,720 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command 'lsmod' in directory '/root'
2018-03-30 05:54:22,739 [salt.state       ][INFO    ][5524] Kernel module nf_conntrack is already present
2018-03-30 05:54:22,740 [salt.state       ][INFO    ][5524] Completed state [nf_conntrack] at time 05:54:22.740183 duration_in_ms=21.89
2018-03-30 05:54:22,741 [salt.state       ][INFO    ][5524] Running state [kernel.panic] at time 05:54:22.741529
2018-03-30 05:54:22,742 [salt.state       ][INFO    ][5524] Executing state sysctl.present for kernel.panic
2018-03-30 05:54:22,759 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:54:22,800 [salt.state       ][INFO    ][5524] Sysctl value kernel.panic = 60 is already set
2018-03-30 05:54:22,801 [salt.state       ][INFO    ][5524] Completed state [kernel.panic] at time 05:54:22.801247 duration_in_ms=59.717
2018-03-30 05:54:22,802 [salt.state       ][INFO    ][5524] Running state [net.ipv4.tcp_keepalive_probes] at time 05:54:22.801974
2018-03-30 05:54:22,802 [salt.state       ][INFO    ][5524] Executing state sysctl.present for net.ipv4.tcp_keepalive_probes
2018-03-30 05:54:22,803 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:54:22,846 [salt.state       ][INFO    ][5524] Sysctl value net.ipv4.tcp_keepalive_probes = 8 is already set
2018-03-30 05:54:22,847 [salt.state       ][INFO    ][5524] Completed state [net.ipv4.tcp_keepalive_probes] at time 05:54:22.847756 duration_in_ms=45.782
2018-03-30 05:54:22,848 [salt.state       ][INFO    ][5524] Running state [fs.file-max] at time 05:54:22.848503
2018-03-30 05:54:22,849 [salt.state       ][INFO    ][5524] Executing state sysctl.present for fs.file-max
2018-03-30 05:54:22,850 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:54:22,895 [salt.state       ][INFO    ][5524] Sysctl value fs.file-max = 124165 is already set
2018-03-30 05:54:22,896 [salt.state       ][INFO    ][5524] Completed state [fs.file-max] at time 05:54:22.896256 duration_in_ms=47.752
2018-03-30 05:54:22,897 [salt.state       ][INFO    ][5524] Running state [net.core.somaxconn] at time 05:54:22.897059
2018-03-30 05:54:22,897 [salt.state       ][INFO    ][5524] Executing state sysctl.present for net.core.somaxconn
2018-03-30 05:54:22,898 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:54:22,942 [salt.state       ][INFO    ][5524] Sysctl value net.core.somaxconn = 4096 is already set
2018-03-30 05:54:22,943 [salt.state       ][INFO    ][5524] Completed state [net.core.somaxconn] at time 05:54:22.943325 duration_in_ms=46.265
2018-03-30 05:54:22,944 [salt.state       ][INFO    ][5524] Running state [net.ipv4.tcp_max_syn_backlog] at time 05:54:22.944004
2018-03-30 05:54:22,944 [salt.state       ][INFO    ][5524] Executing state sysctl.present for net.ipv4.tcp_max_syn_backlog
2018-03-30 05:54:22,945 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:54:23,001 [salt.state       ][INFO    ][5524] Sysctl value net.ipv4.tcp_max_syn_backlog = 8192 is already set
2018-03-30 05:54:23,002 [salt.state       ][INFO    ][5524] Completed state [net.ipv4.tcp_max_syn_backlog] at time 05:54:23.002453 duration_in_ms=58.447
2018-03-30 05:54:23,003 [salt.state       ][INFO    ][5524] Running state [net.ipv4.tcp_tw_reuse] at time 05:54:23.003506
2018-03-30 05:54:23,004 [salt.state       ][INFO    ][5524] Executing state sysctl.present for net.ipv4.tcp_tw_reuse
2018-03-30 05:54:23,006 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:54:23,055 [salt.state       ][INFO    ][5524] Sysctl value net.ipv4.tcp_tw_reuse = 1 is already set
2018-03-30 05:54:23,057 [salt.state       ][INFO    ][5524] Completed state [net.ipv4.tcp_tw_reuse] at time 05:54:23.056929 duration_in_ms=53.423
2018-03-30 05:54:23,057 [salt.state       ][INFO    ][5524] Running state [net.ipv4.tcp_congestion_control] at time 05:54:23.057794
2018-03-30 05:54:23,058 [salt.state       ][INFO    ][5524] Executing state sysctl.present for net.ipv4.tcp_congestion_control
2018-03-30 05:54:23,059 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:54:23,102 [salt.state       ][INFO    ][5524] Sysctl value net.ipv4.tcp_congestion_control = yeah is already set
2018-03-30 05:54:23,103 [salt.state       ][INFO    ][5524] Completed state [net.ipv4.tcp_congestion_control] at time 05:54:23.103299 duration_in_ms=45.505
2018-03-30 05:54:23,104 [salt.state       ][INFO    ][5524] Running state [net.ipv4.tcp_retries2] at time 05:54:23.104070
2018-03-30 05:54:23,104 [salt.state       ][INFO    ][5524] Executing state sysctl.present for net.ipv4.tcp_retries2
2018-03-30 05:54:23,106 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:54:23,142 [salt.state       ][INFO    ][5524] Sysctl value net.ipv4.tcp_retries2 = 5 is already set
2018-03-30 05:54:23,142 [salt.state       ][INFO    ][5524] Completed state [net.ipv4.tcp_retries2] at time 05:54:23.142881 duration_in_ms=38.811
2018-03-30 05:54:23,143 [salt.state       ][INFO    ][5524] Running state [net.core.netdev_max_backlog] at time 05:54:23.143567
2018-03-30 05:54:23,144 [salt.state       ][INFO    ][5524] Executing state sysctl.present for net.core.netdev_max_backlog
2018-03-30 05:54:23,145 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:54:23,174 [salt.state       ][INFO    ][5524] Sysctl value net.core.netdev_max_backlog = 261144 is already set
2018-03-30 05:54:23,175 [salt.state       ][INFO    ][5524] Completed state [net.core.netdev_max_backlog] at time 05:54:23.175298 duration_in_ms=31.731
2018-03-30 05:54:23,175 [salt.state       ][INFO    ][5524] Running state [net.ipv4.tcp_slow_start_after_idle] at time 05:54:23.175881
2018-03-30 05:54:23,176 [salt.state       ][INFO    ][5524] Executing state sysctl.present for net.ipv4.tcp_slow_start_after_idle
2018-03-30 05:54:23,177 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:54:23,207 [salt.state       ][INFO    ][5524] Sysctl value net.ipv4.tcp_slow_start_after_idle = 0 is already set
2018-03-30 05:54:23,208 [salt.state       ][INFO    ][5524] Completed state [net.ipv4.tcp_slow_start_after_idle] at time 05:54:23.208406 duration_in_ms=32.525
2018-03-30 05:54:23,209 [salt.state       ][INFO    ][5524] Running state [vm.swappiness] at time 05:54:23.209033
2018-03-30 05:54:23,209 [salt.state       ][INFO    ][5524] Executing state sysctl.present for vm.swappiness
2018-03-30 05:54:23,210 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:54:23,228 [salt.minion      ][INFO    ][2133] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330055423225554
2018-03-30 05:54:23,262 [salt.minion      ][INFO    ][6005] Starting a new job with PID 6005
2018-03-30 05:54:23,279 [salt.state       ][INFO    ][5524] Sysctl value vm.swappiness = 10 is already set
2018-03-30 05:54:23,280 [salt.state       ][INFO    ][5524] Completed state [vm.swappiness] at time 05:54:23.280217 duration_in_ms=71.183
2018-03-30 05:54:23,281 [salt.state       ][INFO    ][5524] Running state [net.ipv4.tcp_keepalive_intvl] at time 05:54:23.281157
2018-03-30 05:54:23,282 [salt.state       ][INFO    ][5524] Executing state sysctl.present for net.ipv4.tcp_keepalive_intvl
2018-03-30 05:54:23,283 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:54:23,287 [salt.minion      ][INFO    ][6005] Returning information for job: 20180330055423225554
2018-03-30 05:54:23,337 [salt.state       ][INFO    ][5524] Sysctl value net.ipv4.tcp_keepalive_intvl = 3 is already set
2018-03-30 05:54:23,338 [salt.state       ][INFO    ][5524] Completed state [net.ipv4.tcp_keepalive_intvl] at time 05:54:23.338602 duration_in_ms=57.443
2018-03-30 05:54:23,339 [salt.state       ][INFO    ][5524] Running state [net.ipv4.neigh.default.gc_thresh1] at time 05:54:23.339553
2018-03-30 05:54:23,340 [salt.state       ][INFO    ][5524] Executing state sysctl.present for net.ipv4.neigh.default.gc_thresh1
2018-03-30 05:54:23,341 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:54:23,378 [salt.state       ][INFO    ][5524] Sysctl value net.ipv4.neigh.default.gc_thresh1 = 4096 is already set
2018-03-30 05:54:23,379 [salt.state       ][INFO    ][5524] Completed state [net.ipv4.neigh.default.gc_thresh1] at time 05:54:23.379239 duration_in_ms=39.686
2018-03-30 05:54:23,379 [salt.state       ][INFO    ][5524] Running state [net.ipv4.neigh.default.gc_thresh2] at time 05:54:23.379882
2018-03-30 05:54:23,380 [salt.state       ][INFO    ][5524] Executing state sysctl.present for net.ipv4.neigh.default.gc_thresh2
2018-03-30 05:54:23,381 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:54:23,413 [salt.state       ][INFO    ][5524] Sysctl value net.ipv4.neigh.default.gc_thresh2 = 8192 is already set
2018-03-30 05:54:23,414 [salt.state       ][INFO    ][5524] Completed state [net.ipv4.neigh.default.gc_thresh2] at time 05:54:23.414581 duration_in_ms=34.699
2018-03-30 05:54:23,415 [salt.state       ][INFO    ][5524] Running state [net.ipv4.neigh.default.gc_thresh3] at time 05:54:23.415170
2018-03-30 05:54:23,415 [salt.state       ][INFO    ][5524] Executing state sysctl.present for net.ipv4.neigh.default.gc_thresh3
2018-03-30 05:54:23,416 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:54:23,451 [salt.state       ][INFO    ][5524] Sysctl value net.ipv4.neigh.default.gc_thresh3 = 16384 is already set
2018-03-30 05:54:23,452 [salt.state       ][INFO    ][5524] Completed state [net.ipv4.neigh.default.gc_thresh3] at time 05:54:23.452034 duration_in_ms=36.863
2018-03-30 05:54:23,452 [salt.state       ][INFO    ][5524] Running state [net.ipv4.tcp_fin_timeout] at time 05:54:23.452585
2018-03-30 05:54:23,453 [salt.state       ][INFO    ][5524] Executing state sysctl.present for net.ipv4.tcp_fin_timeout
2018-03-30 05:54:23,454 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:54:23,483 [salt.state       ][INFO    ][5524] Sysctl value net.ipv4.tcp_fin_timeout = 30 is already set
2018-03-30 05:54:23,484 [salt.state       ][INFO    ][5524] Completed state [net.ipv4.tcp_fin_timeout] at time 05:54:23.484243 duration_in_ms=31.658
2018-03-30 05:54:23,484 [salt.state       ][INFO    ][5524] Running state [net.ipv4.tcp_keepalive_time] at time 05:54:23.484926
2018-03-30 05:54:23,485 [salt.state       ][INFO    ][5524] Executing state sysctl.present for net.ipv4.tcp_keepalive_time
2018-03-30 05:54:23,486 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:54:23,512 [salt.state       ][INFO    ][5524] Sysctl value net.ipv4.tcp_keepalive_time = 30 is already set
2018-03-30 05:54:23,513 [salt.state       ][INFO    ][5524] Completed state [net.ipv4.tcp_keepalive_time] at time 05:54:23.513583 duration_in_ms=28.658
2018-03-30 05:54:23,514 [salt.state       ][INFO    ][5524] Running state [net.nf_conntrack_max] at time 05:54:23.514088
2018-03-30 05:54:23,514 [salt.state       ][INFO    ][5524] Executing state sysctl.present for net.nf_conntrack_max
2018-03-30 05:54:23,515 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command 'sysctl -a' in directory '/root'
2018-03-30 05:54:23,546 [salt.state       ][INFO    ][5524] Sysctl value net.nf_conntrack_max = 1048576 is already set
2018-03-30 05:54:23,547 [salt.state       ][INFO    ][5524] Completed state [net.nf_conntrack_max] at time 05:54:23.547230 duration_in_ms=33.141
2018-03-30 05:54:23,547 [salt.state       ][INFO    ][5524] Running state [linux_sysfs_package] at time 05:54:23.547738
2018-03-30 05:54:23,548 [salt.state       ][INFO    ][5524] Executing state pkg.installed for linux_sysfs_package
2018-03-30 05:54:23,553 [salt.state       ][INFO    ][5524] All specified packages are already installed
2018-03-30 05:54:23,553 [salt.state       ][INFO    ][5524] Completed state [linux_sysfs_package] at time 05:54:23.553386 duration_in_ms=5.647
2018-03-30 05:54:23,554 [salt.state       ][INFO    ][5524] Running state [/etc/sysfs.d] at time 05:54:23.554614
2018-03-30 05:54:23,555 [salt.state       ][INFO    ][5524] Executing state file.directory for /etc/sysfs.d
2018-03-30 05:54:23,555 [salt.state       ][INFO    ][5524] Directory /etc/sysfs.d is in the correct state
2018-03-30 05:54:23,556 [salt.state       ][INFO    ][5524] Completed state [/etc/sysfs.d] at time 05:54:23.556063 duration_in_ms=1.449
2018-03-30 05:54:23,556 [salt.state       ][INFO    ][5524] Running state [ondemand] at time 05:54:23.556880
2018-03-30 05:54:23,557 [salt.state       ][INFO    ][5524] Executing state service.dead for ondemand
2018-03-30 05:54:23,557 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command ['systemctl', 'status', 'ondemand.service', '-n', '0'] in directory '/root'
2018-03-30 05:54:23,576 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command ['systemctl', 'is-active', 'ondemand.service'] in directory '/root'
2018-03-30 05:54:23,593 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command ['systemctl', 'is-enabled', 'ondemand.service'] in directory '/root'
2018-03-30 05:54:23,613 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command 'runlevel' in directory '/root'
2018-03-30 05:54:23,625 [salt.state       ][INFO    ][5524] The service ondemand is already dead
2018-03-30 05:54:23,626 [salt.state       ][INFO    ][5524] Completed state [ondemand] at time 05:54:23.626497 duration_in_ms=69.616
2018-03-30 05:54:23,628 [salt.state       ][INFO    ][5524] Running state [cs_CZ.UTF-8] at time 05:54:23.627964
2018-03-30 05:54:23,628 [salt.state       ][INFO    ][5524] Executing state locale.present for cs_CZ.UTF-8
2018-03-30 05:54:23,629 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command 'locale -a' in directory '/root'
2018-03-30 05:54:23,639 [salt.state       ][INFO    ][5524] Locale cs_CZ.UTF-8 is already present
2018-03-30 05:54:23,640 [salt.state       ][INFO    ][5524] Completed state [cs_CZ.UTF-8] at time 05:54:23.640425 duration_in_ms=12.459
2018-03-30 05:54:23,641 [salt.state       ][INFO    ][5524] Running state [en_US.UTF-8] at time 05:54:23.641216
2018-03-30 05:54:23,641 [salt.state       ][INFO    ][5524] Executing state locale.present for en_US.UTF-8
2018-03-30 05:54:23,642 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command 'locale -a' in directory '/root'
2018-03-30 05:54:23,654 [salt.state       ][INFO    ][5524] Locale en_US.UTF-8 is already present
2018-03-30 05:54:23,654 [salt.state       ][INFO    ][5524] Completed state [en_US.UTF-8] at time 05:54:23.654660 duration_in_ms=13.444
2018-03-30 05:54:23,656 [salt.state       ][INFO    ][5524] Running state [en_US.UTF-8] at time 05:54:23.656240
2018-03-30 05:54:23,656 [salt.state       ][INFO    ][5524] Executing state locale.system for en_US.UTF-8
2018-03-30 05:54:23,657 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command 'localectl' in directory '/root'
2018-03-30 05:54:23,695 [salt.state       ][INFO    ][5524] System locale en_US.UTF-8 already set
2018-03-30 05:54:23,696 [salt.state       ][INFO    ][5524] Completed state [en_US.UTF-8] at time 05:54:23.696126 duration_in_ms=39.886
2018-03-30 05:54:23,698 [salt.state       ][INFO    ][5524] Running state [root] at time 05:54:23.698012
2018-03-30 05:54:23,698 [salt.state       ][INFO    ][5524] Executing state user.present for root
2018-03-30 05:54:23,700 [salt.state       ][INFO    ][5524] User root is present and up to date
2018-03-30 05:54:23,700 [salt.state       ][INFO    ][5524] Completed state [root] at time 05:54:23.700815 duration_in_ms=2.803
2018-03-30 05:54:23,702 [salt.state       ][INFO    ][5524] Running state [/root] at time 05:54:23.701970
2018-03-30 05:54:23,702 [salt.state       ][INFO    ][5524] Executing state file.directory for /root
2018-03-30 05:54:23,703 [salt.state       ][INFO    ][5524] Directory /root is in the correct state
2018-03-30 05:54:23,703 [salt.state       ][INFO    ][5524] Completed state [/root] at time 05:54:23.703526 duration_in_ms=1.557
2018-03-30 05:54:23,703 [salt.state       ][INFO    ][5524] Running state [/etc/sudoers.d/90-salt-user-root] at time 05:54:23.703929
2018-03-30 05:54:23,704 [salt.state       ][INFO    ][5524] Executing state file.absent for /etc/sudoers.d/90-salt-user-root
2018-03-30 05:54:23,704 [salt.state       ][INFO    ][5524] File /etc/sudoers.d/90-salt-user-root is not present
2018-03-30 05:54:23,705 [salt.state       ][INFO    ][5524] Completed state [/etc/sudoers.d/90-salt-user-root] at time 05:54:23.705335 duration_in_ms=1.406
2018-03-30 05:54:23,705 [salt.state       ][INFO    ][5524] Running state [ubuntu] at time 05:54:23.705766
2018-03-30 05:54:23,706 [salt.state       ][INFO    ][5524] Executing state user.present for ubuntu
2018-03-30 05:54:23,707 [salt.state       ][INFO    ][5524] User ubuntu is present and up to date
2018-03-30 05:54:23,707 [salt.state       ][INFO    ][5524] Completed state [ubuntu] at time 05:54:23.707525 duration_in_ms=1.76
2018-03-30 05:54:23,708 [salt.state       ][INFO    ][5524] Running state [/home/ubuntu] at time 05:54:23.708458
2018-03-30 05:54:23,708 [salt.state       ][INFO    ][5524] Executing state file.directory for /home/ubuntu
2018-03-30 05:54:23,709 [salt.state       ][INFO    ][5524] Directory /home/ubuntu is in the correct state
2018-03-30 05:54:23,710 [salt.state       ][INFO    ][5524] Completed state [/home/ubuntu] at time 05:54:23.710099 duration_in_ms=1.641
2018-03-30 05:54:23,710 [salt.state       ][INFO    ][5524] Running state [/etc/sudoers.d/90-salt-user-ubuntu] at time 05:54:23.710850
2018-03-30 05:54:23,711 [salt.state       ][INFO    ][5524] Executing state file.managed for /etc/sudoers.d/90-salt-user-ubuntu
2018-03-30 05:54:23,744 [salt.state       ][INFO    ][5524] File /etc/sudoers.d/90-salt-user-ubuntu is in the correct state
2018-03-30 05:54:23,745 [salt.state       ][INFO    ][5524] Completed state [/etc/sudoers.d/90-salt-user-ubuntu] at time 05:54:23.745252 duration_in_ms=34.401
2018-03-30 05:54:23,745 [salt.state       ][INFO    ][5524] Running state [/etc/security/limits.d/90-salt-default.conf] at time 05:54:23.745728
2018-03-30 05:54:23,746 [salt.state       ][INFO    ][5524] Executing state file.managed for /etc/security/limits.d/90-salt-default.conf
2018-03-30 05:54:23,822 [salt.state       ][INFO    ][5524] File /etc/security/limits.d/90-salt-default.conf is in the correct state
2018-03-30 05:54:23,823 [salt.state       ][INFO    ][5524] Completed state [/etc/security/limits.d/90-salt-default.conf] at time 05:54:23.823050 duration_in_ms=77.322
2018-03-30 05:54:23,823 [salt.state       ][INFO    ][5524] Running state [apt-daily.timer] at time 05:54:23.823436
2018-03-30 05:54:23,823 [salt.state       ][INFO    ][5524] Executing state service.dead for apt-daily.timer
2018-03-30 05:54:23,824 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command ['systemctl', 'status', 'apt-daily.timer', '-n', '0'] in directory '/root'
2018-03-30 05:54:23,841 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command ['systemctl', 'is-active', 'apt-daily.timer'] in directory '/root'
2018-03-30 05:54:23,856 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command ['systemctl', 'is-enabled', 'apt-daily.timer'] in directory '/root'
2018-03-30 05:54:23,875 [salt.state       ][INFO    ][5524] The service apt-daily.timer is already dead
2018-03-30 05:54:23,875 [salt.state       ][INFO    ][5524] Completed state [apt-daily.timer] at time 05:54:23.875740 duration_in_ms=52.303
2018-03-30 05:54:23,876 [salt.state       ][INFO    ][5524] Running state [/etc/systemd/system.conf.d/90-salt.conf] at time 05:54:23.876248
2018-03-30 05:54:23,876 [salt.state       ][INFO    ][5524] Executing state file.managed for /etc/systemd/system.conf.d/90-salt.conf
2018-03-30 05:54:23,951 [salt.state       ][INFO    ][5524] File /etc/systemd/system.conf.d/90-salt.conf is in the correct state
2018-03-30 05:54:23,951 [salt.state       ][INFO    ][5524] Completed state [/etc/systemd/system.conf.d/90-salt.conf] at time 05:54:23.951653 duration_in_ms=75.405
2018-03-30 05:54:23,953 [salt.state       ][INFO    ][5524] Running state [service.systemctl_reload] at time 05:54:23.953292
2018-03-30 05:54:23,953 [salt.state       ][INFO    ][5524] Executing state module.wait for service.systemctl_reload
2018-03-30 05:54:23,954 [salt.state       ][INFO    ][5524] No changes made for service.systemctl_reload
2018-03-30 05:54:23,954 [salt.state       ][INFO    ][5524] Completed state [service.systemctl_reload] at time 05:54:23.954436 duration_in_ms=1.144
2018-03-30 05:54:23,954 [salt.state       ][INFO    ][5524] Running state [/etc/hostname] at time 05:54:23.954831
2018-03-30 05:54:23,955 [salt.state       ][INFO    ][5524] Executing state file.managed for /etc/hostname
2018-03-30 05:54:23,970 [salt.state       ][INFO    ][5524] File /etc/hostname is in the correct state
2018-03-30 05:54:23,971 [salt.state       ][INFO    ][5524] Completed state [/etc/hostname] at time 05:54:23.971252 duration_in_ms=16.421
2018-03-30 05:54:23,972 [salt.state       ][INFO    ][5524] Running state [hostname prx01] at time 05:54:23.972080
2018-03-30 05:54:23,972 [salt.state       ][INFO    ][5524] Executing state cmd.wait for hostname prx01
2018-03-30 05:54:23,972 [salt.state       ][INFO    ][5524] No changes made for hostname prx01
2018-03-30 05:54:23,973 [salt.state       ][INFO    ][5524] Completed state [hostname prx01] at time 05:54:23.973175 duration_in_ms=1.095
2018-03-30 05:54:23,973 [salt.state       ][INFO    ][5524] Running state [mdb02] at time 05:54:23.973750
2018-03-30 05:54:23,974 [salt.state       ][INFO    ][5524] Executing state host.present for mdb02
2018-03-30 05:54:23,974 [salt.state       ][INFO    ][5524] Host mdb02 (10.167.4.33) already present
2018-03-30 05:54:23,975 [salt.state       ][INFO    ][5524] Completed state [mdb02] at time 05:54:23.974976 duration_in_ms=1.227
2018-03-30 05:54:23,975 [salt.state       ][INFO    ][5524] Running state [mdb02.mcp-pike-ovs-dpdk-ha.local] at time 05:54:23.975320
2018-03-30 05:54:23,975 [salt.state       ][INFO    ][5524] Executing state host.present for mdb02.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:54:23,976 [salt.state       ][INFO    ][5524] Host mdb02.mcp-pike-ovs-dpdk-ha.local (10.167.4.33) already present
2018-03-30 05:54:23,976 [salt.state       ][INFO    ][5524] Completed state [mdb02.mcp-pike-ovs-dpdk-ha.local] at time 05:54:23.976476 duration_in_ms=1.157
2018-03-30 05:54:23,976 [salt.state       ][INFO    ][5524] Running state [mdb03] at time 05:54:23.976834
2018-03-30 05:54:23,977 [salt.state       ][INFO    ][5524] Executing state host.present for mdb03
2018-03-30 05:54:23,977 [salt.state       ][INFO    ][5524] Host mdb03 (10.167.4.34) already present
2018-03-30 05:54:23,978 [salt.state       ][INFO    ][5524] Completed state [mdb03] at time 05:54:23.977984 duration_in_ms=1.149
2018-03-30 05:54:23,978 [salt.state       ][INFO    ][5524] Running state [mdb03.mcp-pike-ovs-dpdk-ha.local] at time 05:54:23.978326
2018-03-30 05:54:23,978 [salt.state       ][INFO    ][5524] Executing state host.present for mdb03.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:54:23,979 [salt.state       ][INFO    ][5524] Host mdb03.mcp-pike-ovs-dpdk-ha.local (10.167.4.34) already present
2018-03-30 05:54:23,979 [salt.state       ][INFO    ][5524] Completed state [mdb03.mcp-pike-ovs-dpdk-ha.local] at time 05:54:23.979693 duration_in_ms=1.367
2018-03-30 05:54:23,980 [salt.state       ][INFO    ][5524] Running state [mdb01] at time 05:54:23.980056
2018-03-30 05:54:23,980 [salt.state       ][INFO    ][5524] Executing state host.present for mdb01
2018-03-30 05:54:23,980 [salt.state       ][INFO    ][5524] Host mdb01 (10.167.4.32) already present
2018-03-30 05:54:23,981 [salt.state       ][INFO    ][5524] Completed state [mdb01] at time 05:54:23.981257 duration_in_ms=1.201
2018-03-30 05:54:23,981 [salt.state       ][INFO    ][5524] Running state [mdb01.mcp-pike-ovs-dpdk-ha.local] at time 05:54:23.981600
2018-03-30 05:54:23,981 [salt.state       ][INFO    ][5524] Executing state host.present for mdb01.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:54:23,982 [salt.state       ][INFO    ][5524] Host mdb01.mcp-pike-ovs-dpdk-ha.local (10.167.4.32) already present
2018-03-30 05:54:23,982 [salt.state       ][INFO    ][5524] Completed state [mdb01.mcp-pike-ovs-dpdk-ha.local] at time 05:54:23.982740 duration_in_ms=1.14
2018-03-30 05:54:23,983 [salt.state       ][INFO    ][5524] Running state [mdb] at time 05:54:23.983079
2018-03-30 05:54:23,983 [salt.state       ][INFO    ][5524] Executing state host.present for mdb
2018-03-30 05:54:23,983 [salt.state       ][INFO    ][5524] Host mdb (10.167.4.31) already present
2018-03-30 05:54:23,984 [salt.state       ][INFO    ][5524] Completed state [mdb] at time 05:54:23.984234 duration_in_ms=1.156
2018-03-30 05:54:23,984 [salt.state       ][INFO    ][5524] Running state [mdb.mcp-pike-ovs-dpdk-ha.local] at time 05:54:23.984590
2018-03-30 05:54:23,984 [salt.state       ][INFO    ][5524] Executing state host.present for mdb.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:54:23,985 [salt.state       ][INFO    ][5524] Host mdb.mcp-pike-ovs-dpdk-ha.local (10.167.4.31) already present
2018-03-30 05:54:23,986 [salt.state       ][INFO    ][5524] Completed state [mdb.mcp-pike-ovs-dpdk-ha.local] at time 05:54:23.986026 duration_in_ms=1.436
2018-03-30 05:54:23,986 [salt.state       ][INFO    ][5524] Running state [cfg01] at time 05:54:23.986528
2018-03-30 05:54:23,987 [salt.state       ][INFO    ][5524] Executing state host.present for cfg01
2018-03-30 05:54:23,987 [salt.state       ][INFO    ][5524] Host cfg01 (10.167.4.11) already present
2018-03-30 05:54:23,988 [salt.state       ][INFO    ][5524] Completed state [cfg01] at time 05:54:23.988218 duration_in_ms=1.69
2018-03-30 05:54:23,988 [salt.state       ][INFO    ][5524] Running state [cfg01.mcp-pike-ovs-dpdk-ha.local] at time 05:54:23.988715
2018-03-30 05:54:23,989 [salt.state       ][INFO    ][5524] Executing state host.present for cfg01.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:54:23,990 [salt.state       ][INFO    ][5524] Host cfg01.mcp-pike-ovs-dpdk-ha.local (10.167.4.11) already present
2018-03-30 05:54:23,990 [salt.state       ][INFO    ][5524] Completed state [cfg01.mcp-pike-ovs-dpdk-ha.local] at time 05:54:23.990423 duration_in_ms=1.709
2018-03-30 05:54:23,990 [salt.state       ][INFO    ][5524] Running state [prx01] at time 05:54:23.990925
2018-03-30 05:54:23,991 [salt.state       ][INFO    ][5524] Executing state host.present for prx01
2018-03-30 05:54:23,991 [salt.state       ][INFO    ][5524] Host prx01 (10.167.4.14) already present
2018-03-30 05:54:23,992 [salt.state       ][INFO    ][5524] Completed state [prx01] at time 05:54:23.992149 duration_in_ms=1.224
2018-03-30 05:54:23,992 [salt.state       ][INFO    ][5524] Running state [prx01.mcp-pike-ovs-dpdk-ha.local] at time 05:54:23.992494
2018-03-30 05:54:23,992 [salt.state       ][INFO    ][5524] Executing state host.present for prx01.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:54:23,993 [salt.state       ][INFO    ][5524] Host prx01.mcp-pike-ovs-dpdk-ha.local (10.167.4.14) already present
2018-03-30 05:54:23,993 [salt.state       ][INFO    ][5524] Completed state [prx01.mcp-pike-ovs-dpdk-ha.local] at time 05:54:23.993701 duration_in_ms=1.208
2018-03-30 05:54:23,994 [salt.state       ][INFO    ][5524] Running state [file.replace] at time 05:54:23.994576
2018-03-30 05:54:23,995 [salt.state       ][INFO    ][5524] Executing state module.run for file.replace
2018-03-30 05:54:24,016 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command ['git', '--version'] in directory '/root'
2018-03-30 05:54:24,133 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command 'grep -q "prx01 prx01.mcp-pike-ovs-dpdk-ha.local" /etc/hosts' in directory '/root'
2018-03-30 05:54:24,148 [salt.state       ][INFO    ][5524] {'ret': '--- \n+++ \n@@ -11,7 +11,7 @@\n 10.167.4.32\t\tmdb01 mdb01.mcp-pike-ovs-dpdk-ha.local\n 10.167.4.31\t\tmdb mdb.mcp-pike-ovs-dpdk-ha.local\n 10.167.4.11\t\tcfg01 cfg01.mcp-pike-ovs-dpdk-ha.local\n-10.167.4.14\t\tprx01 prx01.mcp-pike-ovs-dpdk-ha.local\n+10.167.4.14\t\tprx01.mcp-pike-ovs-dpdk-ha.local prx01\n 10.167.4.20\t\tkvm01 kvm01.mcp-pike-ovs-dpdk-ha.local\n 10.167.4.22\t\tkvm03 kvm03.mcp-pike-ovs-dpdk-ha.local\n 10.167.4.21\t\tkvm02 kvm02.mcp-pike-ovs-dpdk-ha.local\n'}
2018-03-30 05:54:24,148 [salt.state       ][INFO    ][5524] Completed state [file.replace] at time 05:54:24.148735 duration_in_ms=154.158
2018-03-30 05:54:24,149 [salt.state       ][INFO    ][5524] Running state [kvm01] at time 05:54:24.149267
2018-03-30 05:54:24,149 [salt.state       ][INFO    ][5524] Executing state host.present for kvm01
2018-03-30 05:54:24,150 [salt.state       ][INFO    ][5524] Host kvm01 (10.167.4.20) already present
2018-03-30 05:54:24,150 [salt.state       ][INFO    ][5524] Completed state [kvm01] at time 05:54:24.150716 duration_in_ms=1.449
2018-03-30 05:54:24,151 [salt.state       ][INFO    ][5524] Running state [kvm01.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.151103
2018-03-30 05:54:24,151 [salt.state       ][INFO    ][5524] Executing state host.present for kvm01.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:54:24,151 [salt.state       ][INFO    ][5524] Host kvm01.mcp-pike-ovs-dpdk-ha.local (10.167.4.20) already present
2018-03-30 05:54:24,152 [salt.state       ][INFO    ][5524] Completed state [kvm01.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.152292 duration_in_ms=1.188
2018-03-30 05:54:24,152 [salt.state       ][INFO    ][5524] Running state [kvm03] at time 05:54:24.152652
2018-03-30 05:54:24,153 [salt.state       ][INFO    ][5524] Executing state host.present for kvm03
2018-03-30 05:54:24,153 [salt.state       ][INFO    ][5524] Host kvm03 (10.167.4.22) already present
2018-03-30 05:54:24,153 [salt.state       ][INFO    ][5524] Completed state [kvm03] at time 05:54:24.153910 duration_in_ms=1.258
2018-03-30 05:54:24,154 [salt.state       ][INFO    ][5524] Running state [kvm03.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.154257
2018-03-30 05:54:24,154 [salt.state       ][INFO    ][5524] Executing state host.present for kvm03.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:54:24,155 [salt.state       ][INFO    ][5524] Host kvm03.mcp-pike-ovs-dpdk-ha.local (10.167.4.22) already present
2018-03-30 05:54:24,155 [salt.state       ][INFO    ][5524] Completed state [kvm03.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.155411 duration_in_ms=1.155
2018-03-30 05:54:24,155 [salt.state       ][INFO    ][5524] Running state [kvm02] at time 05:54:24.155760
2018-03-30 05:54:24,156 [salt.state       ][INFO    ][5524] Executing state host.present for kvm02
2018-03-30 05:54:24,156 [salt.state       ][INFO    ][5524] Host kvm02 (10.167.4.21) already present
2018-03-30 05:54:24,156 [salt.state       ][INFO    ][5524] Completed state [kvm02] at time 05:54:24.156939 duration_in_ms=1.18
2018-03-30 05:54:24,157 [salt.state       ][INFO    ][5524] Running state [kvm02.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.157292
2018-03-30 05:54:24,157 [salt.state       ][INFO    ][5524] Executing state host.present for kvm02.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:54:24,158 [salt.state       ][INFO    ][5524] Host kvm02.mcp-pike-ovs-dpdk-ha.local (10.167.4.21) already present
2018-03-30 05:54:24,158 [salt.state       ][INFO    ][5524] Completed state [kvm02.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.158458 duration_in_ms=1.166
2018-03-30 05:54:24,158 [salt.state       ][INFO    ][5524] Running state [dbs] at time 05:54:24.158808
2018-03-30 05:54:24,159 [salt.state       ][INFO    ][5524] Executing state host.present for dbs
2018-03-30 05:54:24,159 [salt.state       ][INFO    ][5524] Host dbs (10.167.4.23) already present
2018-03-30 05:54:24,160 [salt.state       ][INFO    ][5524] Completed state [dbs] at time 05:54:24.159975 duration_in_ms=1.168
2018-03-30 05:54:24,160 [salt.state       ][INFO    ][5524] Running state [dbs.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.160322
2018-03-30 05:54:24,160 [salt.state       ][INFO    ][5524] Executing state host.present for dbs.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:54:24,161 [salt.state       ][INFO    ][5524] Host dbs.mcp-pike-ovs-dpdk-ha.local (10.167.4.23) already present
2018-03-30 05:54:24,161 [salt.state       ][INFO    ][5524] Completed state [dbs.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.161338 duration_in_ms=1.015
2018-03-30 05:54:24,161 [salt.state       ][INFO    ][5524] Running state [prx] at time 05:54:24.161584
2018-03-30 05:54:24,161 [salt.state       ][INFO    ][5524] Executing state host.present for prx
2018-03-30 05:54:24,162 [salt.state       ][INFO    ][5524] Host prx (10.167.4.13) already present
2018-03-30 05:54:24,162 [salt.state       ][INFO    ][5524] Completed state [prx] at time 05:54:24.162357 duration_in_ms=0.772
2018-03-30 05:54:24,162 [salt.state       ][INFO    ][5524] Running state [prx.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.162579
2018-03-30 05:54:24,162 [salt.state       ][INFO    ][5524] Executing state host.present for prx.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:54:24,163 [salt.state       ][INFO    ][5524] Host prx.mcp-pike-ovs-dpdk-ha.local (10.167.4.13) already present
2018-03-30 05:54:24,163 [salt.state       ][INFO    ][5524] Completed state [prx.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.163339 duration_in_ms=0.759
2018-03-30 05:54:24,163 [salt.state       ][INFO    ][5524] Running state [prx02] at time 05:54:24.163558
2018-03-30 05:54:24,163 [salt.state       ][INFO    ][5524] Executing state host.present for prx02
2018-03-30 05:54:24,164 [salt.state       ][INFO    ][5524] Host prx02 (10.167.4.15) already present
2018-03-30 05:54:24,164 [salt.state       ][INFO    ][5524] Completed state [prx02] at time 05:54:24.164323 duration_in_ms=0.765
2018-03-30 05:54:24,164 [salt.state       ][INFO    ][5524] Running state [prx02.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.164546
2018-03-30 05:54:24,164 [salt.state       ][INFO    ][5524] Executing state host.present for prx02.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:54:24,165 [salt.state       ][INFO    ][5524] Host prx02.mcp-pike-ovs-dpdk-ha.local (10.167.4.15) already present
2018-03-30 05:54:24,165 [salt.state       ][INFO    ][5524] Completed state [prx02.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.165375 duration_in_ms=0.829
2018-03-30 05:54:24,165 [salt.state       ][INFO    ][5524] Running state [msg02] at time 05:54:24.165596
2018-03-30 05:54:24,165 [salt.state       ][INFO    ][5524] Executing state host.present for msg02
2018-03-30 05:54:24,166 [salt.state       ][INFO    ][5524] Host msg02 (10.167.4.29) already present
2018-03-30 05:54:24,166 [salt.state       ][INFO    ][5524] Completed state [msg02] at time 05:54:24.166350 duration_in_ms=0.754
2018-03-30 05:54:24,166 [salt.state       ][INFO    ][5524] Running state [msg02.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.166570
2018-03-30 05:54:24,166 [salt.state       ][INFO    ][5524] Executing state host.present for msg02.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:54:24,167 [salt.state       ][INFO    ][5524] Host msg02.mcp-pike-ovs-dpdk-ha.local (10.167.4.29) already present
2018-03-30 05:54:24,167 [salt.state       ][INFO    ][5524] Completed state [msg02.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.167338 duration_in_ms=0.767
2018-03-30 05:54:24,167 [salt.state       ][INFO    ][5524] Running state [msg03] at time 05:54:24.167559
2018-03-30 05:54:24,167 [salt.state       ][INFO    ][5524] Executing state host.present for msg03
2018-03-30 05:54:24,168 [salt.state       ][INFO    ][5524] Host msg03 (10.167.4.30) already present
2018-03-30 05:54:24,168 [salt.state       ][INFO    ][5524] Completed state [msg03] at time 05:54:24.168307 duration_in_ms=0.749
2018-03-30 05:54:24,168 [salt.state       ][INFO    ][5524] Running state [msg03.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.168536
2018-03-30 05:54:24,168 [salt.state       ][INFO    ][5524] Executing state host.present for msg03.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:54:24,169 [salt.state       ][INFO    ][5524] Host msg03.mcp-pike-ovs-dpdk-ha.local (10.167.4.30) already present
2018-03-30 05:54:24,169 [salt.state       ][INFO    ][5524] Completed state [msg03.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.169452 duration_in_ms=0.917
2018-03-30 05:54:24,169 [salt.state       ][INFO    ][5524] Running state [msg01] at time 05:54:24.169672
2018-03-30 05:54:24,169 [salt.state       ][INFO    ][5524] Executing state host.present for msg01
2018-03-30 05:54:24,170 [salt.state       ][INFO    ][5524] Host msg01 (10.167.4.28) already present
2018-03-30 05:54:24,170 [salt.state       ][INFO    ][5524] Completed state [msg01] at time 05:54:24.170426 duration_in_ms=0.754
2018-03-30 05:54:24,170 [salt.state       ][INFO    ][5524] Running state [msg01.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.170641
2018-03-30 05:54:24,170 [salt.state       ][INFO    ][5524] Executing state host.present for msg01.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:54:24,171 [salt.state       ][INFO    ][5524] Host msg01.mcp-pike-ovs-dpdk-ha.local (10.167.4.28) already present
2018-03-30 05:54:24,171 [salt.state       ][INFO    ][5524] Completed state [msg01.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.171396 duration_in_ms=0.754
2018-03-30 05:54:24,171 [salt.state       ][INFO    ][5524] Running state [msg] at time 05:54:24.171611
2018-03-30 05:54:24,171 [salt.state       ][INFO    ][5524] Executing state host.present for msg
2018-03-30 05:54:24,172 [salt.state       ][INFO    ][5524] Host msg (10.167.4.27) already present
2018-03-30 05:54:24,172 [salt.state       ][INFO    ][5524] Completed state [msg] at time 05:54:24.172355 duration_in_ms=0.745
2018-03-30 05:54:24,172 [salt.state       ][INFO    ][5524] Running state [msg.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.172572
2018-03-30 05:54:24,172 [salt.state       ][INFO    ][5524] Executing state host.present for msg.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:54:24,173 [salt.state       ][INFO    ][5524] Host msg.mcp-pike-ovs-dpdk-ha.local (10.167.4.27) already present
2018-03-30 05:54:24,173 [salt.state       ][INFO    ][5524] Completed state [msg.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.173353 duration_in_ms=0.782
2018-03-30 05:54:24,173 [salt.state       ][INFO    ][5524] Running state [cfg01] at time 05:54:24.173566
2018-03-30 05:54:24,173 [salt.state       ][INFO    ][5524] Executing state host.present for cfg01
2018-03-30 05:54:24,174 [salt.state       ][INFO    ][5524] Host cfg01 (10.167.4.11) already present
2018-03-30 05:54:24,174 [salt.state       ][INFO    ][5524] Completed state [cfg01] at time 05:54:24.174312 duration_in_ms=0.747
2018-03-30 05:54:24,174 [salt.state       ][INFO    ][5524] Running state [cfg01.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.174524
2018-03-30 05:54:24,174 [salt.state       ][INFO    ][5524] Executing state host.present for cfg01.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:54:24,175 [salt.state       ][INFO    ][5524] Host cfg01.mcp-pike-ovs-dpdk-ha.local (10.167.4.11) already present
2018-03-30 05:54:24,175 [salt.state       ][INFO    ][5524] Completed state [cfg01.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.175273 duration_in_ms=0.749
2018-03-30 05:54:24,175 [salt.state       ][INFO    ][5524] Running state [cmp002] at time 05:54:24.175493
2018-03-30 05:54:24,175 [salt.state       ][INFO    ][5524] Executing state host.present for cmp002
2018-03-30 05:54:24,176 [salt.state       ][INFO    ][5524] Host cmp002 (10.167.4.53) already present
2018-03-30 05:54:24,176 [salt.state       ][INFO    ][5524] Completed state [cmp002] at time 05:54:24.176249 duration_in_ms=0.756
2018-03-30 05:54:24,176 [salt.state       ][INFO    ][5524] Running state [cmp002.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.176461
2018-03-30 05:54:24,176 [salt.state       ][INFO    ][5524] Executing state host.present for cmp002.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:54:24,177 [salt.state       ][INFO    ][5524] Host cmp002.mcp-pike-ovs-dpdk-ha.local (10.167.4.53) already present
2018-03-30 05:54:24,177 [salt.state       ][INFO    ][5524] Completed state [cmp002.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.177247 duration_in_ms=0.787
2018-03-30 05:54:24,177 [salt.state       ][INFO    ][5524] Running state [cmp001] at time 05:54:24.177463
2018-03-30 05:54:24,177 [salt.state       ][INFO    ][5524] Executing state host.present for cmp001
2018-03-30 05:54:24,178 [salt.state       ][INFO    ][5524] Host cmp001 (10.167.4.52) already present
2018-03-30 05:54:24,178 [salt.state       ][INFO    ][5524] Completed state [cmp001] at time 05:54:24.178223 duration_in_ms=0.759
2018-03-30 05:54:24,178 [salt.state       ][INFO    ][5524] Running state [cmp001.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.178442
2018-03-30 05:54:24,178 [salt.state       ][INFO    ][5524] Executing state host.present for cmp001.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:54:24,179 [salt.state       ][INFO    ][5524] Host cmp001.mcp-pike-ovs-dpdk-ha.local (10.167.4.52) already present
2018-03-30 05:54:24,179 [salt.state       ][INFO    ][5524] Completed state [cmp001.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.179199 duration_in_ms=0.757
2018-03-30 05:54:24,179 [salt.state       ][INFO    ][5524] Running state [dbs01] at time 05:54:24.179424
2018-03-30 05:54:24,179 [salt.state       ][INFO    ][5524] Executing state host.present for dbs01
2018-03-30 05:54:24,180 [salt.state       ][INFO    ][5524] Host dbs01 (10.167.4.24) already present
2018-03-30 05:54:24,180 [salt.state       ][INFO    ][5524] Completed state [dbs01] at time 05:54:24.180177 duration_in_ms=0.753
2018-03-30 05:54:24,180 [salt.state       ][INFO    ][5524] Running state [dbs01.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.180391
2018-03-30 05:54:24,180 [salt.state       ][INFO    ][5524] Executing state host.present for dbs01.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:54:24,180 [salt.state       ][INFO    ][5524] Host dbs01.mcp-pike-ovs-dpdk-ha.local (10.167.4.24) already present
2018-03-30 05:54:24,181 [salt.state       ][INFO    ][5524] Completed state [dbs01.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.181167 duration_in_ms=0.777
2018-03-30 05:54:24,181 [salt.state       ][INFO    ][5524] Running state [dbs02] at time 05:54:24.181388
2018-03-30 05:54:24,181 [salt.state       ][INFO    ][5524] Executing state host.present for dbs02
2018-03-30 05:54:24,181 [salt.state       ][INFO    ][5524] Host dbs02 (10.167.4.25) already present
2018-03-30 05:54:24,182 [salt.state       ][INFO    ][5524] Completed state [dbs02] at time 05:54:24.182146 duration_in_ms=0.758
2018-03-30 05:54:24,182 [salt.state       ][INFO    ][5524] Running state [dbs02.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.182368
2018-03-30 05:54:24,182 [salt.state       ][INFO    ][5524] Executing state host.present for dbs02.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:54:24,182 [salt.state       ][INFO    ][5524] Host dbs02.mcp-pike-ovs-dpdk-ha.local (10.167.4.25) already present
2018-03-30 05:54:24,183 [salt.state       ][INFO    ][5524] Completed state [dbs02.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.183128 duration_in_ms=0.76
2018-03-30 05:54:24,183 [salt.state       ][INFO    ][5524] Running state [dbs03] at time 05:54:24.183351
2018-03-30 05:54:24,183 [salt.state       ][INFO    ][5524] Executing state host.present for dbs03
2018-03-30 05:54:24,183 [salt.state       ][INFO    ][5524] Host dbs03 (10.167.4.26) already present
2018-03-30 05:54:24,184 [salt.state       ][INFO    ][5524] Completed state [dbs03] at time 05:54:24.184108 duration_in_ms=0.758
2018-03-30 05:54:24,184 [salt.state       ][INFO    ][5524] Running state [dbs03.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.184328
2018-03-30 05:54:24,184 [salt.state       ][INFO    ][5524] Executing state host.present for dbs03.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:54:24,184 [salt.state       ][INFO    ][5524] Host dbs03.mcp-pike-ovs-dpdk-ha.local (10.167.4.26) already present
2018-03-30 05:54:24,185 [salt.state       ][INFO    ][5524] Completed state [dbs03.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.185125 duration_in_ms=0.797
2018-03-30 05:54:24,185 [salt.state       ][INFO    ][5524] Running state [mas01] at time 05:54:24.185342
2018-03-30 05:54:24,185 [salt.state       ][INFO    ][5524] Executing state host.present for mas01
2018-03-30 05:54:24,185 [salt.state       ][INFO    ][5524] Host mas01 (10.167.4.12) already present
2018-03-30 05:54:24,186 [salt.state       ][INFO    ][5524] Completed state [mas01] at time 05:54:24.186103 duration_in_ms=0.761
2018-03-30 05:54:24,186 [salt.state       ][INFO    ][5524] Running state [mas01.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.186320
2018-03-30 05:54:24,186 [salt.state       ][INFO    ][5524] Executing state host.present for mas01.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:54:24,186 [salt.state       ][INFO    ][5524] Host mas01.mcp-pike-ovs-dpdk-ha.local (10.167.4.12) already present
2018-03-30 05:54:24,187 [salt.state       ][INFO    ][5524] Completed state [mas01.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.187096 duration_in_ms=0.776
2018-03-30 05:54:24,187 [salt.state       ][INFO    ][5524] Running state [ctl02] at time 05:54:24.187316
2018-03-30 05:54:24,187 [salt.state       ][INFO    ][5524] Executing state host.present for ctl02
2018-03-30 05:54:24,187 [salt.state       ][INFO    ][5524] Host ctl02 (10.167.4.37) already present
2018-03-30 05:54:24,188 [salt.state       ][INFO    ][5524] Completed state [ctl02] at time 05:54:24.188074 duration_in_ms=0.759
2018-03-30 05:54:24,188 [salt.state       ][INFO    ][5524] Running state [ctl02.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.188286
2018-03-30 05:54:24,188 [salt.state       ][INFO    ][5524] Executing state host.present for ctl02.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:54:24,188 [salt.state       ][INFO    ][5524] Host ctl02.mcp-pike-ovs-dpdk-ha.local (10.167.4.37) already present
2018-03-30 05:54:24,189 [salt.state       ][INFO    ][5524] Completed state [ctl02.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.189165 duration_in_ms=0.879
2018-03-30 05:54:24,189 [salt.state       ][INFO    ][5524] Running state [ctl03] at time 05:54:24.189382
2018-03-30 05:54:24,189 [salt.state       ][INFO    ][5524] Executing state host.present for ctl03
2018-03-30 05:54:24,189 [salt.state       ][INFO    ][5524] Host ctl03 (10.167.4.38) already present
2018-03-30 05:54:24,190 [salt.state       ][INFO    ][5524] Completed state [ctl03] at time 05:54:24.190149 duration_in_ms=0.767
2018-03-30 05:54:24,190 [salt.state       ][INFO    ][5524] Running state [ctl03.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.190370
2018-03-30 05:54:24,190 [salt.state       ][INFO    ][5524] Executing state host.present for ctl03.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:54:24,190 [salt.state       ][INFO    ][5524] Host ctl03.mcp-pike-ovs-dpdk-ha.local (10.167.4.38) already present
2018-03-30 05:54:24,191 [salt.state       ][INFO    ][5524] Completed state [ctl03.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.191122 duration_in_ms=0.753
2018-03-30 05:54:24,191 [salt.state       ][INFO    ][5524] Running state [ctl01] at time 05:54:24.191328
2018-03-30 05:54:24,191 [salt.state       ][INFO    ][5524] Executing state host.present for ctl01
2018-03-30 05:54:24,191 [salt.state       ][INFO    ][5524] Host ctl01 (10.167.4.36) already present
2018-03-30 05:54:24,192 [salt.state       ][INFO    ][5524] Completed state [ctl01] at time 05:54:24.192073 duration_in_ms=0.745
2018-03-30 05:54:24,192 [salt.state       ][INFO    ][5524] Running state [ctl01.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.192287
2018-03-30 05:54:24,192 [salt.state       ][INFO    ][5524] Executing state host.present for ctl01.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:54:24,192 [salt.state       ][INFO    ][5524] Host ctl01.mcp-pike-ovs-dpdk-ha.local (10.167.4.36) already present
2018-03-30 05:54:24,193 [salt.state       ][INFO    ][5524] Completed state [ctl01.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.193052 duration_in_ms=0.766
2018-03-30 05:54:24,193 [salt.state       ][INFO    ][5524] Running state [ctl] at time 05:54:24.193279
2018-03-30 05:54:24,193 [salt.state       ][INFO    ][5524] Executing state host.present for ctl
2018-03-30 05:54:24,193 [salt.state       ][INFO    ][5524] Host ctl (10.167.4.35) already present
2018-03-30 05:54:24,194 [salt.state       ][INFO    ][5524] Completed state [ctl] at time 05:54:24.194025 duration_in_ms=0.746
2018-03-30 05:54:24,194 [salt.state       ][INFO    ][5524] Running state [ctl.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.194246
2018-03-30 05:54:24,194 [salt.state       ][INFO    ][5524] Executing state host.present for ctl.mcp-pike-ovs-dpdk-ha.local
2018-03-30 05:54:24,194 [salt.state       ][INFO    ][5524] Host ctl.mcp-pike-ovs-dpdk-ha.local (10.167.4.35) already present
2018-03-30 05:54:24,195 [salt.state       ][INFO    ][5524] Completed state [ctl.mcp-pike-ovs-dpdk-ha.local] at time 05:54:24.195001 duration_in_ms=0.755
2018-03-30 05:54:24,195 [salt.state       ][INFO    ][5524] Running state [/etc/network/interfaces.d/50-cloud-init.cfg] at time 05:54:24.195223
2018-03-30 05:54:24,195 [salt.state       ][INFO    ][5524] Executing state file.absent for /etc/network/interfaces.d/50-cloud-init.cfg
2018-03-30 05:54:24,195 [salt.state       ][INFO    ][5524] File /etc/network/interfaces.d/50-cloud-init.cfg is not present
2018-03-30 05:54:24,195 [salt.state       ][INFO    ][5524] Completed state [/etc/network/interfaces.d/50-cloud-init.cfg] at time 05:54:24.195889 duration_in_ms=0.667
2018-03-30 05:54:24,196 [salt.state       ][INFO    ][5524] Running state [ens2] at time 05:54:24.196106
2018-03-30 05:54:24,196 [salt.state       ][INFO    ][5524] Executing state network.managed for ens2
2018-03-30 05:54:24,758 [salt.state       ][INFO    ][5524] Interface ens2 is up to date.
2018-03-30 05:54:24,758 [salt.state       ][INFO    ][5524] Completed state [ens2] at time 05:54:24.758732 duration_in_ms=562.626
2018-03-30 05:54:24,759 [salt.state       ][INFO    ][5524] Running state [ens3] at time 05:54:24.759034
2018-03-30 05:54:24,759 [salt.state       ][INFO    ][5524] Executing state network.managed for ens3
2018-03-30 05:54:25,258 [salt.state       ][INFO    ][5524] Interface ens3 is up to date.
2018-03-30 05:54:25,259 [salt.state       ][INFO    ][5524] Completed state [ens3] at time 05:54:25.259381 duration_in_ms=500.348
2018-03-30 05:54:25,259 [salt.state       ][INFO    ][5524] Running state [ens3] at time 05:54:25.259892
2018-03-30 05:54:25,260 [salt.state       ][INFO    ][5524] Executing state network.routes for ens3
2018-03-30 05:54:25,269 [salt.state       ][INFO    ][5524] Interface ens3 routes are up to date.
2018-03-30 05:54:25,269 [salt.state       ][INFO    ][5524] Completed state [ens3] at time 05:54:25.269524 duration_in_ms=9.631
2018-03-30 05:54:25,269 [salt.state       ][INFO    ][5524] Running state [ens4] at time 05:54:25.269922
2018-03-30 05:54:25,270 [salt.state       ][INFO    ][5524] Executing state network.managed for ens4
2018-03-30 05:54:25,827 [salt.state       ][INFO    ][5524] Interface ens4 is up to date.
2018-03-30 05:54:25,828 [salt.state       ][INFO    ][5524] Completed state [ens4] at time 05:54:25.828093 duration_in_ms=558.171
2018-03-30 05:54:25,828 [salt.state       ][INFO    ][5524] Running state [/etc/profile.d/proxy.sh] at time 05:54:25.828471
2018-03-30 05:54:25,828 [salt.state       ][INFO    ][5524] Executing state file.absent for /etc/profile.d/proxy.sh
2018-03-30 05:54:25,829 [salt.state       ][INFO    ][5524] File /etc/profile.d/proxy.sh is not present
2018-03-30 05:54:25,829 [salt.state       ][INFO    ][5524] Completed state [/etc/profile.d/proxy.sh] at time 05:54:25.829564 duration_in_ms=1.093
2018-03-30 05:54:25,829 [salt.state       ][INFO    ][5524] Running state [/etc/apt/apt.conf.d/95proxies] at time 05:54:25.829830
2018-03-30 05:54:25,830 [salt.state       ][INFO    ][5524] Executing state file.absent for /etc/apt/apt.conf.d/95proxies
2018-03-30 05:54:25,830 [salt.state       ][INFO    ][5524] File /etc/apt/apt.conf.d/95proxies is not present
2018-03-30 05:54:25,830 [salt.state       ][INFO    ][5524] Completed state [/etc/apt/apt.conf.d/95proxies] at time 05:54:25.830596 duration_in_ms=0.766
2018-03-30 05:54:25,830 [salt.state       ][INFO    ][5524] Running state [ntp] at time 05:54:25.830851
2018-03-30 05:54:25,831 [salt.state       ][INFO    ][5524] Executing state pkg.installed for ntp
2018-03-30 05:54:25,837 [salt.state       ][INFO    ][5524] All specified packages are already installed
2018-03-30 05:54:25,837 [salt.state       ][INFO    ][5524] Completed state [ntp] at time 05:54:25.837372 duration_in_ms=6.521
2018-03-30 05:54:25,839 [salt.state       ][INFO    ][5524] Running state [/etc/ntp.conf] at time 05:54:25.839020
2018-03-30 05:54:25,839 [salt.state       ][INFO    ][5524] Executing state file.managed for /etc/ntp.conf
2018-03-30 05:54:25,891 [salt.state       ][INFO    ][5524] File /etc/ntp.conf is in the correct state
2018-03-30 05:54:25,895 [salt.state       ][INFO    ][5524] Completed state [/etc/ntp.conf] at time 05:54:25.891724 duration_in_ms=52.703
2018-03-30 05:54:25,896 [salt.state       ][INFO    ][5524] Running state [ntp] at time 05:54:25.896418
2018-03-30 05:54:25,896 [salt.state       ][INFO    ][5524] Executing state service.running for ntp
2018-03-30 05:54:25,897 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command ['systemctl', 'status', 'ntp.service', '-n', '0'] in directory '/root'
2018-03-30 05:54:25,914 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command ['systemctl', 'is-active', 'ntp.service'] in directory '/root'
2018-03-30 05:54:25,929 [salt.loaded.int.module.cmdmod][INFO    ][5524] Executing command ['systemctl', 'is-enabled', 'ntp.service'] in directory '/root'
2018-03-30 05:54:25,952 [salt.state       ][INFO    ][5524] The service ntp is already running
2018-03-30 05:54:25,952 [salt.state       ][INFO    ][5524] Completed state [ntp] at time 05:54:25.952761 duration_in_ms=56.339
2018-03-30 05:54:25,956 [salt.minion      ][INFO    ][5524] Returning information for job: 20180330055413131419
2018-03-30 05:54:59,176 [salt.minion      ][INFO    ][2133] User sudo_ubuntu Executing command ssh.set_auth_key with jid 20180330055459163349
2018-03-30 05:54:59,202 [salt.minion      ][INFO    ][6256] Starting a new job with PID 6256
2018-03-30 05:54:59,221 [salt.loader.192.168.11.2.int.module.ssh][WARNING ][6256] Public Key hashing currently defaults to "md5". This will change to "sha256" in the Nitrogen release.
2018-03-30 05:54:59,224 [salt.minion      ][INFO    ][6256] Returning information for job: 20180330055459163349
2018-03-30 05:54:59,917 [salt.minion      ][INFO    ][2133] User sudo_ubuntu Executing command file.write with jid 20180330055459918253
2018-03-30 05:54:59,944 [salt.minion      ][INFO    ][6261] Starting a new job with PID 6261
2018-03-30 05:54:59,965 [salt.minion      ][INFO    ][6261] Returning information for job: 20180330055459918253
2018-03-30 05:55:01,152 [salt.minion      ][INFO    ][2133] User sudo_ubuntu Executing command state.apply with jid 20180330055501156208
2018-03-30 05:55:01,179 [salt.minion      ][INFO    ][6266] Starting a new job with PID 6266
2018-03-30 05:55:03,613 [salt.state       ][INFO    ][6266] Loading fresh modules for state activity
2018-03-30 05:55:03,666 [salt.fileclient  ][INFO    ][6266] Fetching file from saltenv 'base', ** done ** 'opnfv/route_wrapper.sls'
2018-03-30 05:55:03,684 [salt.state       ][INFO    ][6266] Running state [/usr/local/sbin/route] at time 05:55:03.684266
2018-03-30 05:55:03,685 [salt.state       ][INFO    ][6266] Executing state file.managed for /usr/local/sbin/route
2018-03-30 05:55:03,705 [salt.state       ][INFO    ][6266] File changed:
New file
2018-03-30 05:55:03,706 [salt.state       ][INFO    ][6266] Completed state [/usr/local/sbin/route] at time 05:55:03.705936 duration_in_ms=21.67
2018-03-30 05:55:03,708 [salt.minion      ][INFO    ][6266] Returning information for job: 20180330055501156208
2018-03-30 05:55:04,348 [salt.minion      ][INFO    ][2133] User sudo_ubuntu Executing command system.reboot with jid 20180330055504340161
2018-03-30 05:55:04,387 [salt.minion      ][INFO    ][6275] Starting a new job with PID 6275
2018-03-30 05:55:04,397 [salt.loader.192.168.11.2.int.module.cmdmod][INFO    ][6275] Executing command ['shutdown', '-r', 'now'] in directory '/root'
2018-03-30 05:55:04,529 [salt.utils.parsers][WARNING ][2133] Minion received a SIGTERM. Exiting.
2018-03-30 05:55:04,530 [salt.cli.daemons ][INFO    ][2133] Shutting down the Salt Minion
2018-03-30 05:55:17,873 [salt.cli.daemons ][INFO    ][1338] Setting up the Salt Minion "prx01.mcp-pike-ovs-dpdk-ha.local"
2018-03-30 05:55:18,530 [salt.cli.daemons ][INFO    ][1338] Starting up the Salt Minion
2018-03-30 05:55:18,532 [salt.utils.event ][INFO    ][1338] Starting pull socket on /var/run/salt/minion/minion_event_bd4454dbe8_pull.ipc
2018-03-30 05:55:19,474 [salt.minion      ][INFO    ][1338] Creating minion process manager
2018-03-30 05:55:20,634 [salt.loader.192.168.11.2.int.module.cmdmod][INFO    ][1338] Executing command ['date', '+%z'] in directory '/root'
2018-03-30 05:55:20,685 [salt.utils.schedule][INFO    ][1338] Updating job settings for scheduled job: __mine_interval
2018-03-30 05:55:20,691 [salt.minion      ][INFO    ][1338] Added mine.update to scheduler
2018-03-30 05:55:20,705 [salt.minion      ][INFO    ][1338] Minion is starting as user 'root'
2018-03-30 05:55:20,721 [salt.minion      ][INFO    ][1338] Minion is ready to receive requests!
2018-03-30 05:55:21,723 [salt.utils.schedule][INFO    ][1338] Running scheduled job: __mine_interval
2018-03-30 05:55:25,250 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command test.ping with jid 20180330055525251439
2018-03-30 05:55:25,274 [salt.minion      ][INFO    ][1443] Starting a new job with PID 1443
2018-03-30 05:55:25,407 [salt.minion      ][INFO    ][1443] Returning information for job: 20180330055525251439
2018-03-30 05:55:25,992 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command pkg.upgrade with jid 20180330055525987199
2018-03-30 05:55:26,025 [salt.minion      ][INFO    ][1448] Starting a new job with PID 1448
2018-03-30 05:55:26,194 [salt.loader.192.168.11.2.int.module.cmdmod][INFO    ][1448] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-03-30 05:55:26,864 [salt.loader.192.168.11.2.int.module.cmdmod][INFO    ][1448] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'upgrade'] in directory '/root'
2018-03-30 05:55:36,080 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330055536080439
2018-03-30 05:55:36,107 [salt.minion      ][INFO    ][1482] Starting a new job with PID 1482
2018-03-30 05:55:38,917 [salt.minion      ][INFO    ][1482] Returning information for job: 20180330055536080439
2018-03-30 05:55:46,128 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330055546128633
2018-03-30 05:55:46,162 [salt.minion      ][INFO    ][1502] Starting a new job with PID 1502
2018-03-30 05:55:46,184 [salt.minion      ][INFO    ][1502] Returning information for job: 20180330055546128633
2018-03-30 05:55:56,295 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330055556296382
2018-03-30 05:55:56,326 [salt.minion      ][INFO    ][1522] Starting a new job with PID 1522
2018-03-30 05:55:56,399 [salt.minion      ][INFO    ][1522] Returning information for job: 20180330055556296382
2018-03-30 05:56:06,407 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330055606406817
2018-03-30 05:56:06,441 [salt.minion      ][INFO    ][1540] Starting a new job with PID 1540
2018-03-30 05:56:06,462 [salt.minion      ][INFO    ][1540] Returning information for job: 20180330055606406817
2018-03-30 05:56:16,602 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330055616602606
2018-03-30 05:56:16,632 [salt.minion      ][INFO    ][1554] Starting a new job with PID 1554
2018-03-30 05:56:16,662 [salt.minion      ][INFO    ][1554] Returning information for job: 20180330055616602606
2018-03-30 05:56:26,760 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330055626760716
2018-03-30 05:56:26,789 [salt.minion      ][INFO    ][1647] Starting a new job with PID 1647
2018-03-30 05:56:26,847 [salt.minion      ][INFO    ][1647] Returning information for job: 20180330055626760716
2018-03-30 05:56:36,904 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330055636905090
2018-03-30 05:56:36,932 [salt.minion      ][INFO    ][2177] Starting a new job with PID 2177
2018-03-30 05:56:37,009 [salt.minion      ][INFO    ][2177] Returning information for job: 20180330055636905090
2018-03-30 05:56:41,614 [salt.loader.192.168.11.2.int.module.cmdmod][INFO    ][1448] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-03-30 05:56:41,667 [salt.minion      ][INFO    ][1448] Returning information for job: 20180330055525987199
2018-03-30 05:56:55,016 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command test.ping with jid 20180330055655019013
2018-03-30 05:56:55,053 [salt.minion      ][INFO    ][2267] Starting a new job with PID 2267
2018-03-30 05:56:55,125 [salt.minion      ][INFO    ][2267] Returning information for job: 20180330055655019013
2018-03-30 06:01:14,072 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command state.sls with jid 20180330060114069864
2018-03-30 06:01:14,098 [salt.minion      ][INFO    ][2282] Starting a new job with PID 2282
2018-03-30 06:01:14,555 [salt.state       ][INFO    ][2282] Loading fresh modules for state activity
2018-03-30 06:01:14,656 [salt.fileclient  ][INFO    ][2282] Fetching file from saltenv 'base', ** done ** 'keepalived/init.sls'
2018-03-30 06:01:14,688 [salt.fileclient  ][INFO    ][2282] Fetching file from saltenv 'base', ** done ** 'keepalived/cluster.sls'
2018-03-30 06:01:16,147 [salt.state       ][INFO    ][2282] Running state [keepalived] at time 06:01:16.147812
2018-03-30 06:01:16,148 [salt.state       ][INFO    ][2282] Executing state pkg.installed for keepalived
2018-03-30 06:01:16,149 [salt.loaded.int.module.cmdmod][INFO    ][2282] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-03-30 06:01:16,504 [salt.loaded.int.module.cmdmod][INFO    ][2282] Executing command ['apt-cache', '-q', 'policy', 'keepalived'] in directory '/root'
2018-03-30 06:01:16,623 [salt.loaded.int.module.cmdmod][INFO    ][2282] Executing command ['apt-get', '-q', 'update'] in directory '/root'
2018-03-30 06:01:19,007 [salt.loaded.int.module.cmdmod][INFO    ][2282] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-03-30 06:01:19,039 [salt.loaded.int.module.cmdmod][INFO    ][2282] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'keepalived'] in directory '/root'
2018-03-30 06:01:24,118 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330060124114781
2018-03-30 06:01:24,145 [salt.minion      ][INFO    ][3228] Starting a new job with PID 3228
2018-03-30 06:01:24,168 [salt.minion      ][INFO    ][3228] Returning information for job: 20180330060124114781
2018-03-30 06:01:27,842 [salt.loaded.int.module.cmdmod][INFO    ][2282] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-03-30 06:01:27,881 [salt.state       ][INFO    ][2282] Made the following changes:
'libsnmp30' changed from 'absent' to '5.7.3+dfsg-1ubuntu4.1'
'libnl-3-200' changed from 'absent' to '3.2.27-1ubuntu0.16.04.1'
'libsensors4' changed from 'absent' to '1:3.4.0-2'
'libsnmp-base' changed from 'absent' to '5.7.3+dfsg-1ubuntu4.1'
'keepalived' changed from 'absent' to '1:1.2.19-1ubuntu0.2'
'ipvsadm' changed from 'absent' to '1:1.28-3'
'libnl-genl-3-200' changed from 'absent' to '3.2.27-1ubuntu0.16.04.1'

2018-03-30 06:01:27,915 [salt.state       ][INFO    ][2282] Loading fresh modules for state activity
2018-03-30 06:01:27,934 [salt.state       ][INFO    ][2282] Completed state [keepalived] at time 06:01:27.934641 duration_in_ms=11786.831
2018-03-30 06:01:27,938 [salt.state       ][INFO    ][2282] Running state [lsof] at time 06:01:27.938156
2018-03-30 06:01:27,938 [salt.state       ][INFO    ][2282] Executing state pkg.installed for lsof
2018-03-30 06:01:28,263 [salt.state       ][INFO    ][2282] All specified packages are already installed
2018-03-30 06:01:28,264 [salt.state       ][INFO    ][2282] Completed state [lsof] at time 06:01:28.264412 duration_in_ms=326.256
2018-03-30 06:01:28,272 [salt.state       ][INFO    ][2282] Running state [/etc/keepalived/keepalived.conf] at time 06:01:28.272490
2018-03-30 06:01:28,272 [salt.state       ][INFO    ][2282] Executing state file.managed for /etc/keepalived/keepalived.conf
2018-03-30 06:01:28,300 [salt.fileclient  ][INFO    ][2282] Fetching file from saltenv 'base', ** done ** 'keepalived/files/keepalived.conf'
2018-03-30 06:01:28,338 [salt.state       ][INFO    ][2282] File changed:
New file
2018-03-30 06:01:28,338 [salt.state       ][INFO    ][2282] Completed state [/etc/keepalived/keepalived.conf] at time 06:01:28.338567 duration_in_ms=66.077
2018-03-30 06:01:28,338 [salt.state       ][INFO    ][2282] Running state [/usr/local/bin/vrrp_script_check_pidof.sh] at time 06:01:28.338777
2018-03-30 06:01:28,338 [salt.state       ][INFO    ][2282] Executing state file.managed for /usr/local/bin/vrrp_script_check_pidof.sh
2018-03-30 06:01:29,006 [salt.fileclient  ][INFO    ][2282] Fetching file from saltenv 'base', ** done ** 'keepalived/files/vrrp_script_check_pidof.sh'
2018-03-30 06:01:29,010 [salt.state       ][INFO    ][2282] File changed:
New file
2018-03-30 06:01:29,010 [salt.state       ][INFO    ][2282] Completed state [/usr/local/bin/vrrp_script_check_pidof.sh] at time 06:01:29.010444 duration_in_ms=671.665
2018-03-30 06:01:29,017 [salt.state       ][INFO    ][2282] Running state [keepalived] at time 06:01:29.017616
2018-03-30 06:01:29,018 [salt.state       ][INFO    ][2282] Executing state service.running for keepalived
2018-03-30 06:01:29,019 [salt.loaded.int.module.cmdmod][INFO    ][2282] Executing command ['systemctl', 'status', 'keepalived.service', '-n', '0'] in directory '/root'
2018-03-30 06:01:29,049 [salt.loaded.int.module.cmdmod][INFO    ][2282] Executing command ['systemctl', 'is-active', 'keepalived.service'] in directory '/root'
2018-03-30 06:01:29,073 [salt.loaded.int.module.cmdmod][INFO    ][2282] Executing command ['systemctl', 'is-enabled', 'keepalived.service'] in directory '/root'
2018-03-30 06:01:29,096 [salt.loaded.int.module.cmdmod][INFO    ][2282] Executing command ['systemctl', 'is-enabled', 'keepalived.service'] in directory '/root'
2018-03-30 06:01:29,119 [salt.loaded.int.module.cmdmod][INFO    ][2282] Executing command ['systemd-run', '--scope', 'systemctl', 'start', 'keepalived.service'] in directory '/root'
2018-03-30 06:01:29,231 [salt.loaded.int.module.cmdmod][INFO    ][2282] Executing command ['systemctl', 'is-active', 'keepalived.service'] in directory '/root'
2018-03-30 06:01:29,271 [salt.loaded.int.module.cmdmod][INFO    ][2282] Executing command ['systemctl', 'is-enabled', 'keepalived.service'] in directory '/root'
2018-03-30 06:01:29,300 [salt.loaded.int.module.cmdmod][INFO    ][2282] Executing command ['systemctl', 'is-enabled', 'keepalived.service'] in directory '/root'
2018-03-30 06:01:29,322 [salt.state       ][INFO    ][2282] {'keepalived': True}
2018-03-30 06:01:29,324 [salt.state       ][INFO    ][2282] Completed state [keepalived] at time 06:01:29.323869 duration_in_ms=306.252
2018-03-30 06:01:29,327 [salt.minion      ][INFO    ][2282] Returning information for job: 20180330060114069864
2018-03-30 06:02:01,936 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command pillar.get with jid 20180330060201932346
2018-03-30 06:02:01,972 [salt.minion      ][INFO    ][3532] Starting a new job with PID 3532
2018-03-30 06:02:01,981 [salt.minion      ][INFO    ][3532] Returning information for job: 20180330060201932346
2018-03-30 06:14:16,504 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command state.sls with jid 20180330061416498344
2018-03-30 06:14:16,529 [salt.minion      ][INFO    ][3863] Starting a new job with PID 3863
2018-03-30 06:14:18,171 [salt.state       ][INFO    ][3863] Loading fresh modules for state activity
2018-03-30 06:14:18,232 [salt.fileclient  ][INFO    ][3863] Fetching file from saltenv 'base', ** done ** 'memcached/init.sls'
2018-03-30 06:14:18,263 [salt.fileclient  ][INFO    ][3863] Fetching file from saltenv 'base', ** done ** 'memcached/server.sls'
2018-03-30 06:14:18,294 [salt.fileclient  ][INFO    ][3863] Fetching file from saltenv 'base', ** done ** 'memcached/map.jinja'
2018-03-30 06:14:18,799 [salt.state       ][INFO    ][3863] Running state [memcached] at time 06:14:18.799502
2018-03-30 06:14:18,799 [salt.state       ][INFO    ][3863] Executing state pkg.installed for memcached
2018-03-30 06:14:18,800 [salt.loaded.int.module.cmdmod][INFO    ][3863] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-03-30 06:14:19,138 [salt.loaded.int.module.cmdmod][INFO    ][3863] Executing command ['apt-cache', '-q', 'policy', 'memcached'] in directory '/root'
2018-03-30 06:14:19,211 [salt.loaded.int.module.cmdmod][INFO    ][3863] Executing command ['apt-get', '-q', 'update'] in directory '/root'
2018-03-30 06:14:21,039 [salt.loaded.int.module.cmdmod][INFO    ][3863] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-03-30 06:14:21,077 [salt.loaded.int.module.cmdmod][INFO    ][3863] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'memcached'] in directory '/root'
2018-03-30 06:14:26,602 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330061426590562
2018-03-30 06:14:26,638 [salt.minion      ][INFO    ][4587] Starting a new job with PID 4587
2018-03-30 06:14:26,660 [salt.minion      ][INFO    ][4587] Returning information for job: 20180330061426590562
2018-03-30 06:14:26,707 [salt.loaded.int.module.cmdmod][INFO    ][3863] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-03-30 06:14:26,741 [salt.state       ][INFO    ][3863] Made the following changes:
'memcached' changed from 'absent' to '1.4.25-2ubuntu1.4'

2018-03-30 06:14:26,754 [salt.state       ][INFO    ][3863] Loading fresh modules for state activity
2018-03-30 06:14:26,775 [salt.state       ][INFO    ][3863] Completed state [memcached] at time 06:14:26.774984 duration_in_ms=7975.483
2018-03-30 06:14:26,779 [salt.state       ][INFO    ][3863] Running state [python-memcache] at time 06:14:26.779887
2018-03-30 06:14:26,780 [salt.state       ][INFO    ][3863] Executing state pkg.installed for python-memcache
2018-03-30 06:14:27,157 [salt.loaded.int.module.cmdmod][INFO    ][3863] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-03-30 06:14:27,183 [salt.loaded.int.module.cmdmod][INFO    ][3863] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'python-memcache'] in directory '/root'
2018-03-30 06:14:29,761 [salt.loaded.int.module.cmdmod][INFO    ][3863] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-03-30 06:14:29,799 [salt.state       ][INFO    ][3863] Made the following changes:
'python-memcache' changed from 'absent' to '1.57-1'

2018-03-30 06:14:29,812 [salt.state       ][INFO    ][3863] Loading fresh modules for state activity
2018-03-30 06:14:29,838 [salt.state       ][INFO    ][3863] Completed state [python-memcache] at time 06:14:29.838258 duration_in_ms=3058.37
2018-03-30 06:14:29,844 [salt.state       ][INFO    ][3863] Running state [/etc/memcached.conf] at time 06:14:29.844320
2018-03-30 06:14:29,844 [salt.state       ][INFO    ][3863] Executing state file.managed for /etc/memcached.conf
2018-03-30 06:14:29,907 [salt.fileclient  ][INFO    ][3863] Fetching file from saltenv 'base', ** done ** 'memcached/files/memcached.conf'
2018-03-30 06:14:29,941 [salt.state       ][INFO    ][3863] File changed:
--- 
+++ 
@@ -1,11 +1,10 @@
+
 # memcached default config file
 # 2003 - Jay Bonci <jaybonci@debian.org>
-# This configuration file is read by the start-memcached script provided as
-# part of the Debian GNU/Linux distribution.
+# This configuration file is read by the start-memcached script provided as part of the Debian GNU/Linux distribution. 
 
 # Run memcached as a daemon. This command is implied, and is not needed for the
-# daemon to run. See the README.Debian that comes with this package for more
-# information.
+# daemon to run. See the README.Debian that comes with this package for more information.
 -d
 
 # Log memcached's output to /var/log/memcached
@@ -18,13 +17,13 @@
 # -vv
 
 # Start with a cap of 64 megs of memory. It's reasonable, and the daemon default
-# Note that the daemon will grow to this size, but does not start out holding this much
-# memory
+# Note that the daemon will grow to this size, but does not start out holding this much memory
 -m 64
 
 # Default connection port is 11211
 -p 11211
 
+-U 11211
 # Run the daemon as root. The start-memcached will default to running as root if no
 # -u command is present in this config file
 -u memcache
@@ -32,10 +31,12 @@
 # Specify which IP address to listen on. The default is to listen on all IP addresses
 # This parameter is one of the only security measures that memcached has, so make sure
 # it's listening on a firewalled interface.
--l 127.0.0.1
+-l 0.0.0.0
 
 # Limit the number of simultaneous incoming connections. The daemon default is 1024
 # -c 1024
+# Mirantis
+-c 8192
 
 # Lock down all paged memory. Consult with the README and homepage before you do this
 # -k
@@ -45,3 +46,9 @@
 
 # Maximize core file limit
 # -r
+
+# Number of threads to use to process incoming requests.
+-t 1
+
+# Set size of each slab page. Default value for this parameter is 1m, minimum is 1k, max is 128m.
+-I 1m

2018-03-30 06:14:29,948 [salt.state       ][INFO    ][3863] Completed state [/etc/memcached.conf] at time 06:14:29.948663 duration_in_ms=104.344
2018-03-30 06:14:30,140 [salt.state       ][INFO    ][3863] Running state [memcached] at time 06:14:30.140226
2018-03-30 06:14:30,140 [salt.state       ][INFO    ][3863] Executing state service.running for memcached
2018-03-30 06:14:30,140 [salt.loaded.int.module.cmdmod][INFO    ][3863] Executing command ['systemctl', 'status', 'memcached.service', '-n', '0'] in directory '/root'
2018-03-30 06:14:30,156 [salt.loaded.int.module.cmdmod][INFO    ][3863] Executing command ['systemctl', 'is-active', 'memcached.service'] in directory '/root'
2018-03-30 06:14:30,167 [salt.loaded.int.module.cmdmod][INFO    ][3863] Executing command ['systemctl', 'is-enabled', 'memcached.service'] in directory '/root'
2018-03-30 06:14:30,178 [salt.state       ][INFO    ][3863] The service memcached is already running
2018-03-30 06:14:30,179 [salt.state       ][INFO    ][3863] Completed state [memcached] at time 06:14:30.179262 duration_in_ms=39.036
2018-03-30 06:14:30,179 [salt.state       ][INFO    ][3863] Running state [memcached] at time 06:14:30.179482
2018-03-30 06:14:30,179 [salt.state       ][INFO    ][3863] Executing state service.mod_watch for memcached
2018-03-30 06:14:30,180 [salt.loaded.int.module.cmdmod][INFO    ][3863] Executing command ['systemctl', 'is-active', 'memcached.service'] in directory '/root'
2018-03-30 06:14:30,191 [salt.loaded.int.module.cmdmod][INFO    ][3863] Executing command ['systemctl', 'is-enabled', 'memcached.service'] in directory '/root'
2018-03-30 06:14:30,203 [salt.loaded.int.module.cmdmod][INFO    ][3863] Executing command ['systemd-run', '--scope', 'systemctl', 'restart', 'memcached.service'] in directory '/root'
2018-03-30 06:14:30,235 [salt.state       ][INFO    ][3863] {'memcached': True}
2018-03-30 06:14:30,235 [salt.state       ][INFO    ][3863] Completed state [memcached] at time 06:14:30.235413 duration_in_ms=55.931
2018-03-30 06:14:30,237 [salt.minion      ][INFO    ][3863] Returning information for job: 20180330061416498344
2018-03-30 06:55:21,724 [salt.utils.schedule][INFO    ][1338] Running scheduled job: __mine_interval
2018-03-30 07:05:00,558 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command state.sls with jid 20180330070500553159
2018-03-30 07:05:00,588 [salt.minion      ][INFO    ][6189] Starting a new job with PID 6189
2018-03-30 07:05:03,026 [salt.state       ][INFO    ][6189] Loading fresh modules for state activity
2018-03-30 07:05:03,089 [salt.fileclient  ][INFO    ][6189] Fetching file from saltenv 'base', ** done ** 'horizon/init.sls'
2018-03-30 07:05:03,128 [salt.fileclient  ][INFO    ][6189] Fetching file from saltenv 'base', ** done ** 'horizon/server/init.sls'
2018-03-30 07:05:03,152 [salt.fileclient  ][INFO    ][6189] Fetching file from saltenv 'base', ** done ** 'horizon/server/service.sls'
2018-03-30 07:05:03,264 [salt.fileclient  ][INFO    ][6189] Fetching file from saltenv 'base', ** done ** 'horizon/server/plugin.sls'
2018-03-30 07:05:03,718 [salt.state       ][INFO    ][6189] Running state [apache2] at time 07:05:03.718527
2018-03-30 07:05:03,718 [salt.state       ][INFO    ][6189] Executing state pkg.installed for apache2
2018-03-30 07:05:03,719 [salt.loaded.int.module.cmdmod][INFO    ][6189] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-03-30 07:05:03,996 [salt.loaded.int.module.cmdmod][INFO    ][6189] Executing command ['apt-cache', '-q', 'policy', 'apache2'] in directory '/root'
2018-03-30 07:05:04,074 [salt.loaded.int.module.cmdmod][INFO    ][6189] Executing command ['apt-get', '-q', 'update'] in directory '/root'
2018-03-30 07:05:07,313 [salt.loaded.int.module.cmdmod][INFO    ][6189] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-03-30 07:05:07,345 [salt.loaded.int.module.cmdmod][INFO    ][6189] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'apache2'] in directory '/root'
2018-03-30 07:05:10,674 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330070510644450
2018-03-30 07:05:10,697 [salt.minion      ][INFO    ][7043] Starting a new job with PID 7043
2018-03-30 07:05:10,718 [salt.minion      ][INFO    ][7043] Returning information for job: 20180330070510644450
2018-03-30 07:05:20,693 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330070520684677
2018-03-30 07:05:20,721 [salt.minion      ][INFO    ][7060] Starting a new job with PID 7060
2018-03-30 07:05:20,788 [salt.minion      ][INFO    ][7060] Returning information for job: 20180330070520684677
2018-03-30 07:05:29,474 [salt.loaded.int.module.cmdmod][INFO    ][6189] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-03-30 07:05:29,522 [salt.state       ][INFO    ][6189] Made the following changes:
'apache2-data' changed from 'absent' to '2.4.18-2ubuntu3.5'
'httpd-cgi' changed from 'absent' to '1'
'apache2-utils' changed from 'absent' to '2.4.18-2ubuntu3.5'
'httpd' changed from 'absent' to '1'
'ssl-cert' changed from 'absent' to '1.0.37'
'apache2' changed from 'absent' to '2.4.18-2ubuntu3.5'

2018-03-30 07:05:29,542 [salt.state       ][INFO    ][6189] Loading fresh modules for state activity
2018-03-30 07:05:29,571 [salt.state       ][INFO    ][6189] Completed state [apache2] at time 07:05:29.571453 duration_in_ms=25852.926
2018-03-30 07:05:29,575 [salt.state       ][INFO    ][6189] Running state [libapache2-mod-wsgi] at time 07:05:29.575273
2018-03-30 07:05:29,575 [salt.state       ][INFO    ][6189] Executing state pkg.installed for libapache2-mod-wsgi
2018-03-30 07:05:29,898 [salt.state       ][INFO    ][6189] All specified packages are already installed
2018-03-30 07:05:29,899 [salt.state       ][INFO    ][6189] Completed state [libapache2-mod-wsgi] at time 07:05:29.899228 duration_in_ms=323.955
2018-03-30 07:05:29,899 [salt.state       ][INFO    ][6189] Running state [openstack-dashboard] at time 07:05:29.899663
2018-03-30 07:05:29,900 [salt.state       ][INFO    ][6189] Executing state pkg.installed for openstack-dashboard
2018-03-30 07:05:29,913 [salt.loaded.int.module.cmdmod][INFO    ][6189] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-03-30 07:05:29,940 [salt.loaded.int.module.cmdmod][INFO    ][6189] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'openstack-dashboard'] in directory '/root'
2018-03-30 07:05:30,760 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330070530741024
2018-03-30 07:05:30,790 [salt.minion      ][INFO    ][7569] Starting a new job with PID 7569
2018-03-30 07:05:30,814 [salt.minion      ][INFO    ][7569] Returning information for job: 20180330070530741024
2018-03-30 07:05:40,986 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330070540980608
2018-03-30 07:05:41,013 [salt.minion      ][INFO    ][7681] Starting a new job with PID 7681
2018-03-30 07:05:41,060 [salt.minion      ][INFO    ][7681] Returning information for job: 20180330070540980608
2018-03-30 07:05:51,032 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330070551022077
2018-03-30 07:05:51,061 [salt.minion      ][INFO    ][7862] Starting a new job with PID 7862
2018-03-30 07:05:51,082 [salt.minion      ][INFO    ][7862] Returning information for job: 20180330070551022077
2018-03-30 07:06:01,258 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330070601250136
2018-03-30 07:06:01,287 [salt.minion      ][INFO    ][8124] Starting a new job with PID 8124
2018-03-30 07:06:01,310 [salt.minion      ][INFO    ][8124] Returning information for job: 20180330070601250136
2018-03-30 07:06:11,297 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330070611275130
2018-03-30 07:06:11,325 [salt.minion      ][INFO    ][8214] Starting a new job with PID 8214
2018-03-30 07:06:11,346 [salt.minion      ][INFO    ][8214] Returning information for job: 20180330070611275130
2018-03-30 07:06:21,360 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330070621352532
2018-03-30 07:06:21,388 [salt.minion      ][INFO    ][8482] Starting a new job with PID 8482
2018-03-30 07:06:21,411 [salt.minion      ][INFO    ][8482] Returning information for job: 20180330070621352532
2018-03-30 07:06:31,383 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330070631374913
2018-03-30 07:06:31,414 [salt.minion      ][INFO    ][8717] Starting a new job with PID 8717
2018-03-30 07:06:31,447 [salt.minion      ][INFO    ][8717] Returning information for job: 20180330070631374913
2018-03-30 07:06:41,431 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330070641421220
2018-03-30 07:06:41,473 [salt.minion      ][INFO    ][8950] Starting a new job with PID 8950
2018-03-30 07:06:41,493 [salt.minion      ][INFO    ][8950] Returning information for job: 20180330070641421220
2018-03-30 07:06:51,468 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330070651461836
2018-03-30 07:06:51,500 [salt.minion      ][INFO    ][9550] Starting a new job with PID 9550
2018-03-30 07:06:51,538 [salt.minion      ][INFO    ][9550] Returning information for job: 20180330070651461836
2018-03-30 07:07:01,505 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330070701498141
2018-03-30 07:07:01,537 [salt.minion      ][INFO    ][9561] Starting a new job with PID 9561
2018-03-30 07:07:01,569 [salt.minion      ][INFO    ][9561] Returning information for job: 20180330070701498141
2018-03-30 07:07:11,552 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330070711543165
2018-03-30 07:07:11,587 [salt.minion      ][INFO    ][9880] Starting a new job with PID 9880
2018-03-30 07:07:11,618 [salt.minion      ][INFO    ][9880] Returning information for job: 20180330070711543165
2018-03-30 07:07:21,606 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330070721584690
2018-03-30 07:07:21,631 [salt.minion      ][INFO    ][10149] Starting a new job with PID 10149
2018-03-30 07:07:21,656 [salt.minion      ][INFO    ][10149] Returning information for job: 20180330070721584690
2018-03-30 07:07:31,633 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330070731625807
2018-03-30 07:07:31,664 [salt.minion      ][INFO    ][10435] Starting a new job with PID 10435
2018-03-30 07:07:31,687 [salt.minion      ][INFO    ][10435] Returning information for job: 20180330070731625807
2018-03-30 07:07:41,665 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330070741656792
2018-03-30 07:07:41,707 [salt.minion      ][INFO    ][10581] Starting a new job with PID 10581
2018-03-30 07:07:41,733 [salt.minion      ][INFO    ][10581] Returning information for job: 20180330070741656792
2018-03-30 07:07:51,708 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330070751700395
2018-03-30 07:07:51,745 [salt.minion      ][INFO    ][10590] Starting a new job with PID 10590
2018-03-30 07:07:51,772 [salt.minion      ][INFO    ][10590] Returning information for job: 20180330070751700395
2018-03-30 07:08:01,759 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330070801751835
2018-03-30 07:08:01,796 [salt.minion      ][INFO    ][10599] Starting a new job with PID 10599
2018-03-30 07:08:01,822 [salt.minion      ][INFO    ][10599] Returning information for job: 20180330070801751835
2018-03-30 07:08:09,454 [salt.loaded.int.module.cmdmod][INFO    ][6189] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-03-30 07:08:09,498 [salt.state       ][INFO    ][6189] Made the following changes:
'python-routes' changed from 'absent' to '2.4.1-1~cloud0'
'python-retrying' changed from 'absent' to '1.3.3-1'
'python-kombu' changed from 'absent' to '4.0.2+really4.0.2+dfsg-2ubuntu1~cloud0'
'python-oslo.concurrency' changed from 'absent' to '3.21.0-0ubuntu2~cloud0'
'python-sqlparse' changed from 'absent' to '0.1.18-1'
'python-pint' changed from 'absent' to '0.6-1ubuntu1'
'python-monotonic' changed from 'absent' to '0.6-2'
'python2.7-pymongo' changed from 'absent' to '1'
'python2.7-bson' changed from 'absent' to '1'
'libtiff5' changed from 'absent' to '4.0.6-1ubuntu0.4'
'python-secretstorage' changed from 'absent' to '2.1.3-1'
'python-glanceclient' changed from 'absent' to '1:2.8.0-0ubuntu1~cloud0'
'python-formencode' changed from 'absent' to '1.3.0-0ubuntu5'
'python-functools32' changed from 'absent' to '3.2.3.2-2'
'python-cachetools' changed from 'absent' to '1.1.6-1~cloud0'
'python-semantic-version' changed from 'absent' to '2.3.1-1'
'python-blinker' changed from 'absent' to '1.3.dfsg2-1build1'
'python-roman' changed from 'absent' to '2.0.0-2'
'python-pastescript' changed from 'absent' to '1.7.5-3build1'
'python-bs4' changed from 'absent' to '4.4.1-1'
'python2.7-pymongo-ext' changed from 'absent' to '1'
'python-tenacity' changed from 'absent' to '3.3.0-0ubuntu1~cloud0'
'python-unittest2' changed from 'absent' to '1.1.0-6.1'
'python-setuptools' changed from 'absent' to '36.2.7-2~cloud0'
'python2.7-django-appconf' changed from 'absent' to '1'
'docutils-doc' changed from 'absent' to '0.12+dfsg-1'
'python-dbus' changed from 'absent' to '1.2.0-3'
'python-gridfs' changed from 'absent' to '3.2-1build1'
'python-fixtures' changed from 'absent' to '3.0.0-2~cloud0'
'python-testtools' changed from 'absent' to '1.8.1-0ubuntu1'
'python-anyjson' changed from 'absent' to '0.3.3-1build1'
'python-jsonschema' changed from 'absent' to '2.5.1-4'
'python-prettytable' changed from 'absent' to '0.7.2-3'
'python-compressor' changed from 'absent' to '2.0-1ubuntu1'
'python-netaddr' changed from 'absent' to '0.7.18-1'
'python-dnspython' changed from 'absent' to '1.15.0-1~cloud0'
'python-babel' changed from 'absent' to '2.4.0+dfsg.1-2ubuntu1~cloud0'
'python-requests' changed from '2.9.1-3' to '2.18.1-1~cloud0'
'python-certifi' changed from 'absent' to '2015.11.20.1-2'
'python-pil' changed from 'absent' to '3.1.2-0ubuntu1.1'
'docutils-common' changed from 'absent' to '0.12+dfsg-1'
'python2.7-lxml' changed from 'absent' to '1'
'python-pika' changed from 'absent' to '0.10.0-1'
'python-osc-lib' changed from 'absent' to '1.7.0-0ubuntu1~cloud0'
'python-keystoneclient' changed from 'absent' to '1:3.13.0-0ubuntu1~cloud0'
'python2.7-simplejson' changed from 'absent' to '1'
'python-extras' changed from 'absent' to '0.0.3-3'
'python2.7-django-openstack-auth' changed from 'absent' to '1'
'python-funcsigs' changed from 'absent' to '1.0.2-3~cloud0'
'python-bson-ext' changed from 'absent' to '3.2-1build1'
'python-scgi' changed from 'absent' to '1.13-1.1build1'
'python2.7-pil' changed from 'absent' to '1'
'python-repoze.lru' changed from 'absent' to '0.6-6'
'python-posix-ipc' changed from 'absent' to '0.9.8-2build2'
'formencode-i18n' changed from 'absent' to '1.3.0-0ubuntu5'
'python2.7-testtools' changed from 'absent' to '1'
'docutils' changed from 'absent' to '1'
'python-django-pyscss' changed from 'absent' to '2.0.2-4'
'ieee-data' changed from 'absent' to '20150531.1'
'python2.7-dbus' changed from 'absent' to '1'
'python-oslo.middleware' changed from 'absent' to '3.30.0-0ubuntu1.1~cloud0'
'python-pygments' changed from 'absent' to '2.2.0+dfsg-1~cloud0'
'python-pillow' changed from 'absent' to '1'
'python2.7-cinderclient' changed from 'absent' to '1'
'libpaperg' changed from 'absent' to '1'
'python2.7-netifaces' changed from 'absent' to '1'
'liblcms2-2' changed from 'absent' to '2.6-3ubuntu2'
'python-oslo.context' changed from 'absent' to '1:2.17.0-0ubuntu1~cloud0'
'python-neutronclient' changed from 'absent' to '1:6.5.0-0ubuntu1.1~cloud0'
'python-pymongo-ext' changed from 'absent' to '3.2-1build1'
'python-urllib3' changed from '1.13.1-2ubuntu0.16.04.1' to '1.21.1-1~cloud0'
'python2.7-pyinotify' changed from 'absent' to '1'
'python-webob' changed from 'absent' to '1:1.7.2-0ubuntu1~cloud0'
'python-pyparsing' changed from 'absent' to '2.1.10+dfsg1-1~cloud0'
'python-babel-localedata' changed from 'absent' to '2.4.0+dfsg.1-2ubuntu1~cloud0'
'python-positional' changed from 'absent' to '1.1.1-3~cloud0'
'python-appconf' changed from 'absent' to '1'
'python-cmd2' changed from 'absent' to '0.6.8-1'
'python-distribute' changed from 'absent' to '1'
'python-oslo-log' changed from 'absent' to '1'
'python-rjsmin' changed from 'absent' to '1.0.12+dfsg1-2ubuntu1'
'python-django-openstack-auth' changed from 'absent' to '3.5.0-0ubuntu1~cloud0'
'python-pathlib' changed from 'absent' to '1.0.1-2'
'python-iso8601' changed from 'absent' to '0.1.11-1'
'python-jsonpatch' changed from 'absent' to '1.19-3'
'python-cinderclient' changed from 'absent' to '1:3.1.0-0ubuntu1~cloud0'
'libwebpmux1' changed from 'absent' to '0.4.4-1'
'python-heatclient' changed from 'absent' to '1.11.0-0ubuntu1~cloud0'
'python-oslo.policy' changed from 'absent' to '1.25.1-0ubuntu1~cloud0'
'python-stevedore' changed from 'absent' to '1:1.25.0-0ubuntu1~cloud0'
'python-paste' changed from 'absent' to '1.7.5.1-6ubuntu3'
'python-openstack-auth' changed from 'absent' to '3.5.0-0ubuntu1~cloud0'
'python-lxml' changed from 'absent' to '3.5.0-1build1'
'python-oslo.config' changed from 'absent' to '1:4.11.0-0ubuntu1~cloud0'
'python-futurist' changed from 'absent' to '0.13.0-2'
'libpaper1' changed from 'absent' to '1.1.24+nmu4ubuntu1'
'python-fasteners' changed from 'absent' to '0.12.0-2ubuntu1'
'python2.7-gi' changed from 'absent' to '1'
'python-linecache2' changed from 'absent' to '1.0.0-2'
'python-mimeparse' changed from 'absent' to '0.1.4-1build1'
'python-pastedeploy-tpl' changed from 'absent' to '1.5.2-1'
'python-oauthlib' changed from 'absent' to '1.0.3-1'
'python2.7-django-compressor' changed from 'absent' to '1'
'python-gi' changed from 'absent' to '3.20.0-0ubuntu1'
'python-contextlib2' changed from 'absent' to '0.5.1-1'
'python2.7-pathlib' changed from 'absent' to '1'
'python-oslo.serialization' changed from 'absent' to '2.20.0-0ubuntu1~cloud0'
'python-oslo.utils' changed from 'absent' to '3.28.0-0ubuntu1~cloud0'
'python-pika-pool' changed from 'absent' to '0.1.3-1ubuntu1'
'python-django' changed from 'absent' to '1.8.7-1ubuntu5.6'
'python-warlock' changed from 'absent' to '1.1.0-1'
'python-debtcollector' changed from 'absent' to '1.3.0-2'
'python2.7-gridfs' changed from 'absent' to '1'
'python-bson' changed from 'absent' to '3.2-1build1'
'python-simplejson' changed from 'absent' to '3.8.1-1ubuntu2'
'python-wrapt' changed from 'absent' to '1.8.0-5build2'
'python-docutils' changed from 'absent' to '0.12+dfsg-1'
'python-openid' changed from 'absent' to '2.2.5-6'
'python-pastedeploy' changed from 'absent' to '1.5.2-1'
'python2.7-cmd2' changed from 'absent' to '1'
'python-tz' changed from 'absent' to '2014.10~dfsg1-0ubuntu2'
'libpaper-utils' changed from 'absent' to '1.1.24+nmu4ubuntu1'
'python-cliff' changed from 'absent' to '2.8.0-0ubuntu1.1~cloud0'
'python-oslo.i18n' changed from 'absent' to '3.17.0-0ubuntu1~cloud0'
'python-appdirs' changed from 'absent' to '1.4.0-2'
'libjpeg8' changed from 'absent' to '8c-2ubuntu8'
'python-statsd' changed from 'absent' to '3.2.1-2~cloud0'
'libxslt1.1' changed from 'absent' to '1.1.28-2.1ubuntu0.1'
'python-keyring' changed from 'absent' to '7.3-1ubuntu1'
'python-django-appconf' changed from 'absent' to '1.0.1-4'
'python-oslo-utils' changed from 'absent' to '1'
'python-novaclient' changed from 'absent' to '2:9.1.0-0ubuntu1~cloud0'
'python-unicodecsv' changed from 'absent' to '0.14.1-1'
'python-mock' changed from 'absent' to '2.0.0-3~cloud0'
'python-rfc3986' changed from 'absent' to '0.3.1-2~cloud0'
'python-eventlet' changed from 'absent' to '0.18.4-1ubuntu1'
'python-django-horizon' changed from 'absent' to '3:12.0.2-0ubuntu1~cloud0'
'python2.7-pyparsing' changed from 'absent' to '1'
'python-oslo.log' changed from 'absent' to '3.30.0-0ubuntu1~cloud0'
'python-pyscss' changed from 'absent' to '1.3.4-5'
'python-pyinotify' changed from 'absent' to '0.9.6-0fakesync1'
'libjpeg-turbo8' changed from 'absent' to '1.4.2-0ubuntu3'
'python-amqp' changed from 'absent' to '2.1.4-1~cloud0'
'python-pbr' changed from 'absent' to '2.0.0-0ubuntu1~cloud0'
'libwebp5' changed from 'absent' to '0.4.4-1'
'python-vine' changed from 'absent' to '1.1.3+dfsg-2~cloud0'
'python-django-compressor' changed from 'absent' to '2.0-1ubuntu1'
'python-netifaces' changed from 'absent' to '0.10.4-0.1build2'
'python-osprofiler' changed from 'absent' to '1.11.0-0ubuntu1~cloud0'
'python-os-client-config' changed from 'absent' to '1.28.0-0ubuntu1~cloud0'
'python-oslo.messaging' changed from 'absent' to '5.30.0-0ubuntu2~cloud0'
'python-django-common' changed from 'absent' to '1.8.7-1ubuntu5.6'
'python-tempita' changed from 'absent' to '0.5.2-1build1'
'openstack-dashboard' changed from 'absent' to '3:12.0.2-0ubuntu1~cloud0'
'python-json-pointer' changed from 'absent' to '1.9-3'
'python-html5lib' changed from 'absent' to '0.999-4'
'python-swiftclient' changed from 'absent' to '1:3.4.0-0ubuntu1~cloud0'
'python-jwt' changed from 'absent' to '1.3.0-1ubuntu0.1'
'python2.7-iso8601' changed from 'absent' to '1'
'python-greenlet' changed from 'absent' to '0.4.9-2fakesync1'
'python-oslo.service' changed from 'absent' to '1.25.0-0ubuntu1~cloud0'
'python-rcssmin' changed from 'absent' to '1.0.6-1ubuntu1'
'python-ceilometerclient' changed from 'absent' to '2.9.0-0ubuntu1~cloud0'
'python-csscompressor' changed from 'absent' to '0.9.4-2'
'python-traceback2' changed from 'absent' to '1.4.0-3'
'python-keystoneauth1' changed from 'absent' to '3.1.0-0ubuntu2~cloud0'
'python-pymongo' changed from 'absent' to '3.2-1build1'
'python-requestsexceptions' changed from 'absent' to '1.1.2-0ubuntu1'
'python-oslo-context' changed from 'absent' to '1'
'python2.7-bson-ext' changed from 'absent' to '1'
'libjbig0' changed from 'absent' to '2.1-3.1'

2018-03-30 07:08:09,530 [salt.state       ][INFO    ][6189] Loading fresh modules for state activity
2018-03-30 07:08:09,553 [salt.state       ][INFO    ][6189] Completed state [openstack-dashboard] at time 07:08:09.553552 duration_in_ms=159653.888
2018-03-30 07:08:09,557 [salt.state       ][INFO    ][6189] Running state [python-lesscpy] at time 07:08:09.557051
2018-03-30 07:08:09,557 [salt.state       ][INFO    ][6189] Executing state pkg.installed for python-lesscpy
2018-03-30 07:08:10,559 [salt.loaded.int.module.cmdmod][INFO    ][6189] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-03-30 07:08:10,595 [salt.loaded.int.module.cmdmod][INFO    ][6189] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'python-lesscpy'] in directory '/root'
2018-03-30 07:08:11,803 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330070811799318
2018-03-30 07:08:11,826 [salt.minion      ][INFO    ][10749] Starting a new job with PID 10749
2018-03-30 07:08:11,847 [salt.minion      ][INFO    ][10749] Returning information for job: 20180330070811799318
2018-03-30 07:08:14,848 [salt.loaded.int.module.cmdmod][INFO    ][6189] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-03-30 07:08:14,898 [salt.state       ][INFO    ][6189] Made the following changes:
'python-ply' changed from 'absent' to '3.7-1'
'python-lesscpy' changed from 'absent' to '0.10-1'
'python-ply-yacc-3.5' changed from 'absent' to '1'
'python2.7-ply' changed from 'absent' to '1'
'python-ply-lex-3.5' changed from 'absent' to '1'

2018-03-30 07:08:14,915 [salt.state       ][INFO    ][6189] Loading fresh modules for state activity
2018-03-30 07:08:15,039 [salt.state       ][INFO    ][6189] Completed state [python-lesscpy] at time 07:08:15.039808 duration_in_ms=5482.754
2018-03-30 07:08:15,048 [salt.state       ][INFO    ][6189] Running state [python-memcache] at time 07:08:15.048291
2018-03-30 07:08:15,049 [salt.state       ][INFO    ][6189] Executing state pkg.installed for python-memcache
2018-03-30 07:08:15,448 [salt.state       ][INFO    ][6189] All specified packages are already installed
2018-03-30 07:08:15,448 [salt.state       ][INFO    ][6189] Completed state [python-memcache] at time 07:08:15.448518 duration_in_ms=400.228
2018-03-30 07:08:15,448 [salt.state       ][INFO    ][6189] Running state [gettext-base] at time 07:08:15.448791
2018-03-30 07:08:15,448 [salt.state       ][INFO    ][6189] Executing state pkg.installed for gettext-base
2018-03-30 07:08:15,453 [salt.state       ][INFO    ][6189] All specified packages are already installed
2018-03-30 07:08:15,453 [salt.state       ][INFO    ][6189] Completed state [gettext-base] at time 07:08:15.453291 duration_in_ms=4.499
2018-03-30 07:08:15,453 [salt.state       ][INFO    ][6189] Running state [openstack-dashboard-apache] at time 07:08:15.453846
2018-03-30 07:08:15,454 [salt.state       ][INFO    ][6189] Executing state pkg.purged for openstack-dashboard-apache
2018-03-30 07:08:15,460 [salt.state       ][INFO    ][6189] All specified packages are already absent
2018-03-30 07:08:15,461 [salt.state       ][INFO    ][6189] Completed state [openstack-dashboard-apache] at time 07:08:15.461105 duration_in_ms=7.258
2018-03-30 07:08:15,462 [salt.state       ][INFO    ][6189] Running state [/etc/openstack-dashboard/local_settings.py] at time 07:08:15.462699
2018-03-30 07:08:15,462 [salt.state       ][INFO    ][6189] Executing state file.managed for /etc/openstack-dashboard/local_settings.py
2018-03-30 07:08:15,488 [salt.fileclient  ][INFO    ][6189] Fetching file from saltenv 'base', ** done ** 'horizon/files/local_settings/pike_settings.py'
2018-03-30 07:08:15,537 [salt.fileclient  ][INFO    ][6189] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_local_settings.py'
2018-03-30 07:08:15,600 [salt.fileclient  ][INFO    ][6189] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_horizon_settings.py'
2018-03-30 07:08:15,630 [salt.fileclient  ][INFO    ][6189] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_keystone_settings.py'
2018-03-30 07:08:15,680 [salt.fileclient  ][INFO    ][6189] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_nova_settings.py'
2018-03-30 07:08:15,706 [salt.fileclient  ][INFO    ][6189] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_glance_settings.py'
2018-03-30 07:08:15,731 [salt.fileclient  ][INFO    ][6189] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_neutron_settings.py'
2018-03-30 07:08:15,756 [salt.fileclient  ][INFO    ][6189] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_heat_settings.py'
2018-03-30 07:08:15,780 [salt.fileclient  ][INFO    ][6189] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_websso_settings.py'
2018-03-30 07:08:15,805 [salt.fileclient  ][INFO    ][6189] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_ssl_settings.py'
2018-03-30 07:08:15,819 [salt.state       ][INFO    ][6189] File changed:
--- 
+++ 
@@ -1,169 +1,61 @@
-# -*- coding: utf-8 -*-
-
 import os
-
 from django.utils.translation import ugettext_lazy as _
-
-from horizon.utils import secret_key
-
-from openstack_dashboard.settings import HORIZON_CONFIG
+from openstack_dashboard import exceptions
+
+HORIZON_CONFIG = {
+    'user_home': 'openstack_dashboard.views.get_user_home',
+    'ajax_queue_limit': 10,
+    'auto_fade_alerts': {
+        'delay': 3000,
+        'fade_duration': 1500,
+        'types': ['alert-success', 'alert-info']
+    },
+    'help_url': "http://docs.openstack.org",
+    'exceptions': {'recoverable': exceptions.RECOVERABLE,
+                   'not_found': exceptions.NOT_FOUND,
+                   'unauthorized': exceptions.UNAUTHORIZED},
+    'modal_backdrop': 'static',
+    'angular_modules': [],
+    'js_files': [],
+    'js_spec_files': [],
+    'disable_password_reveal': True,
+    'password_autocomplete': 'off'
+}
+
+INSTALLED_APPS = (
+    'openstack_dashboard',
+    'django.contrib.contenttypes',
+    'django.contrib.auth',
+    'django.contrib.sessions',
+    'django.contrib.messages',
+    'django.contrib.staticfiles',
+    'django.contrib.humanize',
+    'compressor',
+    'horizon',
+    'openstack_auth',
+)
+
+
 
 DEBUG = False
 
-# This setting controls whether or not compression is enabled. Disabling
-# compression makes Horizon considerably slower, but makes it much easier
-# to debug JS and CSS changes
-#COMPRESS_ENABLED = not DEBUG
-
-# This setting controls whether compression happens on the fly, or offline
-# with `python manage.py compress`
-# See https://django-compressor.readthedocs.io/en/latest/usage/#offline-compression
-# for more information
-#COMPRESS_OFFLINE = not DEBUG
-
-# WEBROOT is the location relative to Webserver root
-# should end with a slash.
-WEBROOT = '/'
-#LOGIN_URL = WEBROOT + 'auth/login/'
-#LOGOUT_URL = WEBROOT + 'auth/logout/'
-#
-# LOGIN_REDIRECT_URL can be used as an alternative for
-# HORIZON_CONFIG.user_home, if user_home is not set.
-# Do not set it to '/home/', as this will cause circular redirect loop
-#LOGIN_REDIRECT_URL = WEBROOT
-
-# If horizon is running in production (DEBUG is False), set this
-# with the list of host/domain names that the application can serve.
-# For more information see:
-# https://docs.djangoproject.com/en/dev/ref/settings/#allowed-hosts
-#ALLOWED_HOSTS = ['horizon.example.com', ]
-
-# Set SSL proxy settings:
-# Pass this header from the proxy after terminating the SSL,
-# and don't forget to strip it from the client's request.
-# For more information see:
-# https://docs.djangoproject.com/en/dev/ref/settings/#secure-proxy-ssl-header
-#SECURE_PROXY_SSL_HEADER = ('HTTP_X_FORWARDED_PROTO', 'https')
-
-# If Horizon is being served through SSL, then uncomment the following two
-# settings to better secure the cookies from security exploits
-#CSRF_COOKIE_SECURE = True
-#SESSION_COOKIE_SECURE = True
-
-# The absolute path to the directory where message files are collected.
-# The message file must have a .json file extension. When the user logins to
-# horizon, the message files collected are processed and displayed to the user.
-#MESSAGES_PATH=None
-
-# Overrides for OpenStack API versions. Use this setting to force the
-# OpenStack dashboard to use a specific API version for a given service API.
-# Versions specified here should be integers or floats, not strings.
-# NOTE: The version should be formatted as it appears in the URL for the
-# service API. For example, The identity service APIs have inconsistent
-# use of the decimal point, so valid options would be 2.0 or 3.
-# Minimum compute version to get the instance locked status is 2.9.
-#OPENSTACK_API_VERSIONS = {
-#    "data-processing": 1.1,
-#    "identity": 3,
-#    "image": 2,
-#    "volume": 2,
-#    "compute": 2,
-#}
-
-# Set this to True if running on a multi-domain model. When this is enabled, it
-# will require the user to enter the Domain name in addition to the username
-# for login.
-#OPENSTACK_KEYSTONE_MULTIDOMAIN_SUPPORT = False
-
-# Set this to True if you want available domains displayed as a dropdown menu
-# on the login screen. It is strongly advised NOT to enable this for public
-# clouds, as advertising enabled domains to unauthenticated customers
-# irresponsibly exposes private information. This should only be used for
-# private clouds where the dashboard sits behind a corporate firewall.
-#OPENSTACK_KEYSTONE_DOMAIN_DROPDOWN = False
-
-# If OPENSTACK_KEYSTONE_DOMAIN_DROPDOWN is enabled, this option can be used to
-# set the available domains to choose from. This is a list of pairs whose first
-# value is the domain name and the second is the display name.
-#OPENSTACK_KEYSTONE_DOMAIN_CHOICES = (
-#  ('Default', 'Default'),
-#)
-
-# Overrides the default domain used when running on single-domain model
-# with Keystone V3. All entities will be created in the default domain.
-# NOTE: This value must be the name of the default domain, NOT the ID.
-# Also, you will most likely have a value in the keystone policy file like this
-#    "cloud_admin": "rule:admin_required and domain_id:<your domain id>"
-# This value must be the name of the domain whose ID is specified there.
-#OPENSTACK_KEYSTONE_DEFAULT_DOMAIN = 'Default'
-
-# Set this to True to enable panels that provide the ability for users to
-# manage Identity Providers (IdPs) and establish a set of rules to map
-# federation protocol attributes to Identity API attributes.
-# This extension requires v3.0+ of the Identity API.
-#OPENSTACK_KEYSTONE_FEDERATION_MANAGEMENT = False
-
-# Set Console type:
-# valid options are "AUTO"(default), "VNC", "SPICE", "RDP", "SERIAL" or None
-# Set to None explicitly if you want to deactivate the console.
-#CONSOLE_TYPE = "AUTO"
-
-# If provided, a "Report Bug" link will be displayed in the site header
-# which links to the value of this setting (ideally a URL containing
-# information on how to report issues).
-#HORIZON_CONFIG["bug_url"] = "http://bug-report.example.com"
-
-# Show backdrop element outside the modal, do not close the modal
-# after clicking on backdrop.
-#HORIZON_CONFIG["modal_backdrop"] = "static"
-
-# Specify a regular expression to validate user passwords.
-#HORIZON_CONFIG["password_validator"] = {
-#    "regex": '.*',
-#    "help_text": _("Your password does not meet the requirements."),
-#}
-
-# Disable simplified floating IP address management for deployments with
-# multiple floating IP pools or complex network requirements.
-#HORIZON_CONFIG["simple_ip_management"] = False
-
-# Turn off browser autocompletion for forms including the login form and
-# the database creation workflow if so desired.
-#HORIZON_CONFIG["password_autocomplete"] = "off"
-
-# Setting this to True will disable the reveal button for password fields,
-# including on the login form.
-#HORIZON_CONFIG["disable_password_reveal"] = False
+TEMPLATE_DEBUG = DEBUG
+
+ALLOWED_HOSTS = ['*']
+
+AUTHENTICATION_URLS = ['openstack_auth.urls']
 
 LOCAL_PATH = os.path.dirname(os.path.abspath(__file__))
 
-# Set custom secret key:
-# You can either set it to a specific value or you can let horizon generate a
-# default secret key that is unique on this machine, e.i. regardless of the
-# amount of Python WSGI workers (if used behind Apache+mod_wsgi): However,
-# there may be situations where you would want to set this explicitly, e.g.
-# when multiple dashboard instances are distributed on different machines
-# (usually behind a load-balancer). Either you have to make sure that a session
-# gets all requests routed to the same dashboard instance or you set the same
-# SECRET_KEY for all of them.
-SECRET_KEY = secret_key.generate_or_read_from_file('/var/lib/openstack-dashboard/secret_key')
-
-# We recommend you use memcached for development; otherwise after every reload
-# of the django development server, you will have to login again. To use
-# memcached set CACHES to something like
+SECRET_KEY = 'opaesee8Que2yahJoh9fo0eefo1Aeyo6ahyei8zeiboh3aeth5loth7ieNa5xi5e'
 
 CACHES = {
     'default': {
+
         'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
-        'LOCATION': '127.0.0.1:11211',
-    },
-}
-
-#CACHES = {
-#    'default': {
-#        'BACKEND': 'django.core.cache.backends.locmem.LocMemCache',
-#    }
-#}
+        'LOCATION': "172.30.10.102:11211"
+    }
+}
 
 # Send email to the console by default
 EMAIL_BACKEND = 'django.core.mail.backends.console.EmailBackend'
@@ -171,75 +63,247 @@
 #EMAIL_BACKEND = 'django.core.mail.backends.dummy.EmailBackend'
 
 # Configure these for your outgoing email host
-#EMAIL_HOST = 'smtp.my-company.com'
-#EMAIL_PORT = 25
-#EMAIL_HOST_USER = 'djangomail'
-#EMAIL_HOST_PASSWORD = 'top-secret!'
+# EMAIL_HOST = 'smtp.my-company.com'
+# EMAIL_PORT = 25
+# EMAIL_HOST_USER = 'djangomail'
+# EMAIL_HOST_PASSWORD = 'top-secret!'
+
+# The number of objects (Swift containers/objects or images) to display
+# on a single page before providing a paging element (a "more" link)
+# to paginate results.
+API_RESULT_LIMIT = 1000
+API_RESULT_PAGE_SIZE = 20
+
+# The timezone of the server. This should correspond with the timezone
+# of your entire OpenStack installation, and hopefully be in UTC.
+TIME_ZONE = "UTC"
+
+COMPRESS_OFFLINE = True
+
+# Trove user and database extension support. By default support for
+# creating users and databases on database instances is turned on.
+# To disable these extensions set the permission here to something
+# unusable such as ["!"].
+# TROVE_ADD_USER_PERMS = []
+# TROVE_ADD_DATABASE_PERMS = []
+
+SITE_BRANDING = 'OpenStack Dashboard'
+SESSION_COOKIE_HTTPONLY = True
+BOOT_ONLY_FROM_VOLUME = True
+
+REST_API_REQUIRED_SETTINGS = ['OPENSTACK_HYPERVISOR_FEATURES',
+                              'LAUNCH_INSTANCE_DEFAULTS',
+                              'OPENSTACK_IMAGE_FORMATS']
+
+
+# Specify a regular expression to validate user passwords.
+# HORIZON_CONFIG["password_validator"] = {
+#     "regex": '.*',
+#     "help_text": _("Your password does not meet the requirements.")
+# }
+
+# Turn off browser autocompletion for the login form if so desired.
+# HORIZON_CONFIG["password_autocomplete"] = "off"
+
+# The Horizon Policy Enforcement engine uses these values to load per service
+# policy rule files. The content of these files should match the files the
+# OpenStack services are using to determine role based access control in the
+# target installation.
+
+SESSION_TIMEOUT = 43200
+SESSION_ENGINE = "django.contrib.sessions.backends.cache"
+DROPDOWN_MAX_ITEMS = 30
+
+# Path to directory containing policy.json files
+POLICY_FILES_PATH = "/usr/share/openstack-dashboard/openstack_dashboard/conf"
+# Map of local copy of service policy files
+POLICY_FILES = {
+    "compute": "nova_policy.json",
+    "network": "neutron_policy.json",
+    "image": "glance_policy.json",
+    "telemetry": "ceilometer_policy.json",
+    "volume": "cinder_policy.json",
+    "orchestration": "heat_policy.json",
+    "identity": "keystone_policy.json",
+}
+
+LOGGING = {
+    'version': 1,
+    # When set to True this will disable all logging except
+    # for loggers specified in this configuration dictionary. Note that
+    # if nothing is specified here and disable_existing_loggers is True,
+    # django.db.backends will still log unless it is disabled explicitly.
+
+    'disable_existing_loggers': False,
+    'handlers': {
+        'null': {
+            'level': 'DEBUG',
+            'class': 'logging.NullHandler',
+        },
+        'console': {
+            # Set the level to "DEBUG" for verbose output logging.
+            'level': 'INFO',
+            'class': 'logging.StreamHandler',
+        },
+        'file': {
+            'level': 'DEBUG',
+            'class': 'logging.FileHandler',
+            'filename': '/var/log/horizon/horizon.log',
+        },
+    },
+    'loggers': {
+        # Logging from django.db.backends is VERY verbose, send to null
+        # by default.
+        'django.db.backends': {
+            'handlers': ['null'],
+            'propagate': False,
+        },
+        # DEBUG level for django.template produces some false-positive traces
+        # starting with Pike, so set it to INFO by default. Caused by bug PROD-17558.
+        'django.template': {
+            'handlers': ['file'],
+            'level': 'INFO',
+            'propagate': True,
+        },
+        'requests': {
+            'handlers': ['null'],
+            'propagate': False,
+        },
+        'horizon': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'openstack_dashboard': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'novaclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'cinderclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'keystoneclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'glanceclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'neutronclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'heatclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'ceilometerclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'troveclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'mistralclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'swiftclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'openstack_auth': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'scss.expression': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'nose.plugins.manager': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'django': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'iso8601': {
+            'handlers': ['null'],
+            'propagate': False,
+        },
+    }
+}
+
+
+# Overrides for OpenStack API versions. Use this setting to force the
+# OpenStack dashboard to use a specific API version for a given service API.
+# NOTE: The version should be formatted as it appears in the URL for the
+# service API. For example, The identity service APIs have inconsistent
+# use of the decimal point, so valid options would be "2.0" or "3".
+OPENSTACK_API_VERSIONS = {
+    "identity": 3
+}
+# Set this to True if running on a multi-domain model. When this is enabled,
+# users must enter the Domain name in addition to the username for login.
+# OPENSTACK_KEYSTONE_MULTIDOMAIN_SUPPORT = False
+
+# Overrides the default domain used when running on single-domain model
+# with Keystone V3. All entities will be created in the default domain.
+# OPENSTACK_KEYSTONE_DEFAULT_DOMAIN = 'Default'
 
 # For multiple regions uncomment this configuration, and add (endpoint, title).
-#AVAILABLE_REGIONS = [
-#    ('http://cluster1.example.com:5000/v2.0', 'cluster1'),
-#    ('http://cluster2.example.com:5000/v2.0', 'cluster2'),
-#]
-
-OPENSTACK_HOST = "127.0.0.1"
-OPENSTACK_KEYSTONE_URL = "http://%s:5000/v2.0" % OPENSTACK_HOST
-OPENSTACK_KEYSTONE_DEFAULT_ROLE = "_member_"
-
-# For setting the default service region on a per-endpoint basis. Note that the
-# default value for this setting is {}, and below is just an example of how it
-# should be specified.
-#DEFAULT_SERVICE_REGIONS = {
-#    OPENSTACK_KEYSTONE_URL: 'RegionOne'
-#}
-
-# Enables keystone web single-sign-on if set to True.
-#WEBSSO_ENABLED = False
-
-# Determines which authentication choice to show as default.
-#WEBSSO_INITIAL_CHOICE = "credentials"
-
-# The list of authentication mechanisms which include keystone
-# federation protocols and identity provider/federation protocol
-# mapping keys (WEBSSO_IDP_MAPPING). Current supported protocol
-# IDs are 'saml2' and 'oidc'  which represent SAML 2.0, OpenID
-# Connect respectively.
-# Do not remove the mandatory credentials mechanism.
-# Note: The last two tuples are sample mapping keys to a identity provider
-# and federation protocol combination (WEBSSO_IDP_MAPPING).
-#WEBSSO_CHOICES = (
-#    ("credentials", _("Keystone Credentials")),
-#    ("oidc", _("OpenID Connect")),
-#    ("saml2", _("Security Assertion Markup Language")),
-#    ("acme_oidc", "ACME - OpenID Connect"),
-#    ("acme_saml2", "ACME - SAML2"),
-#)
-
-# A dictionary of specific identity provider and federation protocol
-# combinations. From the selected authentication mechanism, the value
-# will be looked up as keys in the dictionary. If a match is found,
-# it will redirect the user to a identity provider and federation protocol
-# specific WebSSO endpoint in keystone, otherwise it will use the value
-# as the protocol_id when redirecting to the WebSSO by protocol endpoint.
-# NOTE: The value is expected to be a tuple formatted as: (<idp_id>, <protocol_id>).
-#WEBSSO_IDP_MAPPING = {
-#    "acme_oidc": ("acme", "oidc"),
-#    "acme_saml2": ("acme", "saml2"),
-#}
-
-# The Keystone Provider drop down uses Keystone to Keystone federation
-# to switch between Keystone service providers.
-# Set display name for Identity Provider (dropdown display name)
-#KEYSTONE_PROVIDER_IDP_NAME = "Local Keystone"
-# This id is used for only for comparison with the service provider IDs. This ID
-# should not match any service provider IDs.
-#KEYSTONE_PROVIDER_IDP_ID = "localkeystone"
+# AVAILABLE_REGIONS = [
+#     ('http://cluster1.example.com:5000/v2.0', 'cluster1'),
+#     ('http://cluster2.example.com:5000/v2.0', 'cluster2'),
+# ]
+
+
+OPENSTACK_HOST = "10.167.4.35"
+OPENSTACK_KEYSTONE_URL = "http://%s:5000/v3" % OPENSTACK_HOST
+
+OPENSTACK_KEYSTONE_MULTIDOMAIN_SUPPORT = False
+OPENSTACK_KEYSTONE_DEFAULT_DOMAIN = "default"
+
+OPENSTACK_KEYSTONE_DEFAULT_ROLE = "Member"
 
 # Disable SSL certificate checks (useful for self-signed certificates):
-#OPENSTACK_SSL_NO_VERIFY = True
 
 # The CA certificate to use to verify SSL connections
-#OPENSTACK_SSL_CACERT = '/path/to/cacert.pem'
+# OPENSTACK_SSL_CACERT = '/path/to/cacert.pem'
+
+# OPENSTACK_ENDPOINT_TYPE specifies the endpoint type to use for the endpoints
+# in the Keystone service catalog. Use this setting when Horizon is running
+# external to the OpenStack environment. The default is 'publicURL'.
+OPENSTACK_ENDPOINT_TYPE = "internalURL"
+
+# SECONDARY_ENDPOINT_TYPE specifies the fallback endpoint type to use in the
+# case that OPENSTACK_ENDPOINT_TYPE is not present in the endpoints
+# in the Keystone service catalog. Use this setting when Horizon is running
+# external to the OpenStack environment. The default is None.  This
+# value should differ from OPENSTACK_ENDPOINT_TYPE if used.
+#SECONDARY_ENDPOINT_TYPE = "publicURL"
 
 # The OPENSTACK_KEYSTONE_BACKEND settings can be used to identify the
 # capabilities of the auth backend for Keystone.
@@ -253,43 +317,13 @@
     'can_edit_group': True,
     'can_edit_project': True,
     'can_edit_domain': True,
-    'can_edit_role': True,
-}
-
-# Setting this to True, will add a new "Retrieve Password" action on instance,
-# allowing Admin session password retrieval/decryption.
-#OPENSTACK_ENABLE_PASSWORD_RETRIEVE = False
-
-# This setting allows deployers to control whether a token is deleted on log
-# out. This can be helpful when there are often long running processes being
-# run in the Horizon environment.
-#TOKEN_DELETION_DISABLED = False
-
-# The Launch Instance user experience has been significantly enhanced.
-# You can choose whether to enable the new launch instance experience,
-# the legacy experience, or both. The legacy experience will be removed
-# in a future release, but is available as a temporary backup setting to ensure
-# compatibility with existing deployments. Further development will not be
-# done on the legacy experience. Please report any problems with the new
-# experience via the Launchpad tracking system.
-#
-# Toggle LAUNCH_INSTANCE_LEGACY_ENABLED and LAUNCH_INSTANCE_NG_ENABLED to
-# determine the experience to enable.  Set them both to true to enable
-# both.
-#LAUNCH_INSTANCE_LEGACY_ENABLED = True
-#LAUNCH_INSTANCE_NG_ENABLED = False
-
-# A dictionary of settings which can be used to provide the default values for
-# properties found in the Launch Instance modal.
-#LAUNCH_INSTANCE_DEFAULTS = {
-#    'config_drive': False,
-#    'enable_scheduler_hints': True,
-#    'disable_image': False,
-#    'disable_instance_snapshot': False,
-#    'disable_volume': False,
-#    'disable_volume_snapshot': False,
-#    'create_volume': True,
-#}
+    'can_edit_role': True
+}
+
+
+# Set Console type:
+# valid options would be "AUTO", "VNC" or "SPICE"
+# CONSOLE_TYPE = "AUTO"
 
 # The Xen Hypervisor has the ability to set the mount point for volumes
 # attached to instances (other Hypervisors currently do not). Setting
@@ -298,97 +332,52 @@
 OPENSTACK_HYPERVISOR_FEATURES = {
     'can_set_mount_point': False,
     'can_set_password': False,
-    'requires_keypair': False,
-    'enable_quotas': True
-}
-
-# The OPENSTACK_CINDER_FEATURES settings can be used to enable optional
-# services provided by cinder that is not exposed by its extension API.
-OPENSTACK_CINDER_FEATURES = {
-    'enable_backup': False,
-}
-
-# The OPENSTACK_NEUTRON_NETWORK settings can be used to enable optional
-# services provided by neutron. Options currently available are load
-# balancer service, security groups, quotas, VPN service.
-OPENSTACK_NEUTRON_NETWORK = {
-    'enable_router': True,
-    'enable_quotas': True,
-    'enable_ipv6': True,
-    'enable_distributed_router': False,
-    'enable_ha_router': False,
-    'enable_fip_topology_check': True,
-
-    # Default dns servers you would like to use when a subnet is
-    # created.  This is only a default, users can still choose a different
-    # list of dns servers when creating a new subnet.
-    # The entries below are examples only, and are not appropriate for
-    # real deployments
-    # 'default_dns_nameservers': ["8.8.8.8", "8.8.4.4", "208.67.222.222"],
-
-    # Set which provider network types are supported. Only the network types
-    # in this list will be available to choose from when creating a network.
-    # Network types include local, flat, vlan, gre, vxlan and geneve.
-    # 'supported_provider_types': ['*'],
-
-    # You can configure available segmentation ID range per network type
-    # in your deployment.
-    # 'segmentation_id_range': {
-    #     'vlan': [1024, 2048],
-    #     'vxlan': [4094, 65536],
-    # },
-
-    # You can define additional provider network types here.
-    # 'extra_provider_types': {
-    #     'awesome_type': {
-    #         'display_name': 'Awesome New Type',
-    #         'require_physical_network': False,
-    #         'require_segmentation_id': True,
-    #     }
-    # },
-
-    # Set which VNIC types are supported for port binding. Only the VNIC
-    # types in this list will be available to choose from when creating a
-    # port.
-    # VNIC types include 'normal', 'direct', 'direct-physical', 'macvtap',
-    # 'baremetal' and 'virtio-forwarder'
-    # Set to empty list or None to disable VNIC type selection.
-    'supported_vnic_types': ['*'],
-
-    # Set list of available physical networks to be selected in the physical
-    # network field on the admin create network modal. If it's set to an empty
-    # list, the field will be a regular input field.
-    # e.g. ['default', 'test']
-    'physical_networks': [],
-
-}
-
-# The OPENSTACK_HEAT_STACK settings can be used to disable password
-# field required while launching the stack.
-OPENSTACK_HEAT_STACK = {
-    'enable_user_pass': True,
-}
+}
+
+# When set, enables the instance action "Retrieve password"
+# allowing password retrieval
+OPENSTACK_ENABLE_PASSWORD_RETRIEVE = False
+
+# When launching an instance, the menu of available flavors is
+# sorted by RAM usage, ascending.  Provide a callback method here
+# (and/or a flag for reverse sort) for the sorted() method if you'd
+# like a different behaviour.  For more info, see
+# http://docs.python.org/2/library/functions.html#sorted
+# CREATE_INSTANCE_FLAVOR_SORT = {
+#     'key': my_awesome_callback_method,
+#     'reverse': False,
+# }
+
+FLAVOR_EXTRA_KEYS = {
+    'flavor_keys': [
+        ('quota:read_bytes_sec', _('Quota: Read bytes')),
+        ('quota:write_bytes_sec', _('Quota: Write bytes')),
+        ('quota:cpu_quota', _('Quota: CPU')),
+        ('quota:cpu_period', _('Quota: CPU period')),
+        ('quota:inbound_average', _('Quota: Inbound average')),
+        ('quota:outbound_average', _('Quota: Outbound average')),
+    ]
+}
+
 
 # The OPENSTACK_IMAGE_BACKEND settings can be used to customize features
 # in the OpenStack Dashboard related to the Image service, such as the list
 # of supported image formats.
-#OPENSTACK_IMAGE_BACKEND = {
-#    'image_formats': [
-#        ('', _('Select format')),
-#        ('aki', _('AKI - Amazon Kernel Image')),
-#        ('ami', _('AMI - Amazon Machine Image')),
-#        ('ari', _('ARI - Amazon Ramdisk Image')),
-#        ('docker', _('Docker')),
-#        ('iso', _('ISO - Optical Disk Image')),
-#        ('ova', _('OVA - Open Virtual Appliance')),
-#        ('qcow2', _('QCOW2 - QEMU Emulator')),
-#        ('raw', _('Raw')),
-#        ('vdi', _('VDI - Virtual Disk Image')),
-#        ('vhd', _('VHD - Virtual Hard Disk')),
-#        ('vhdx', _('VHDX - Large Virtual Hard Disk')),
-#        ('vmdk', _('VMDK - Virtual Machine Disk')),
-#    ],
-#}
+OPENSTACK_IMAGE_BACKEND = {
+    'image_formats': [
+        ('', ''),
+        ('aki', _('AKI - Amazon Kernel Image')),
+        ('ami', _('AMI - Amazon Machine Image')),
+        ('ari', _('ARI - Amazon Ramdisk Image')),
+        ('iso', _('ISO - Optical Disk Image')),
+        ('qcow2', _('QCOW2 - QEMU Emulator')),
+        ('raw', _('Raw')),
+        ('vdi', _('VDI')),
+        ('vhd', _('VHD')),
+        ('vmdk', _('VMDK')),
+        ('docker', _('Docker Container'))
+    ]
+}
 
 # The IMAGE_CUSTOM_PROPERTY_TITLES settings is used to customize the titles for
 # image custom property attributes that appear on image detail pages.
@@ -398,273 +387,53 @@
     "ramdisk_id": _("Ramdisk ID"),
     "image_state": _("Euca2ools state"),
     "project_id": _("Project ID"),
-    "image_type": _("Image Type"),
-}
-
-# The IMAGE_RESERVED_CUSTOM_PROPERTIES setting is used to specify which image
-# custom properties should not be displayed in the Image Custom Properties
-# table.
-IMAGE_RESERVED_CUSTOM_PROPERTIES = []
-
-# Set to 'legacy' or 'direct' to allow users to upload images to glance via
-# Horizon server. When enabled, a file form field will appear on the create
-# image form. If set to 'off', there will be no file form field on the create
-# image form. See documentation for deployment considerations.
-#HORIZON_IMAGES_UPLOAD_MODE = 'legacy'
-
-# Allow a location to be set when creating or updating Glance images.
-# If using Glance V2, this value should be False unless the Glance
-# configuration and policies allow setting locations.
-#IMAGES_ALLOW_LOCATION = False
-
-# A dictionary of default settings for create image modal.
-#CREATE_IMAGE_DEFAULTS = {
-#    'image_visibility': "public",
-#}
-
-# OPENSTACK_ENDPOINT_TYPE specifies the endpoint type to use for the endpoints
-# in the Keystone service catalog. Use this setting when Horizon is running
-# external to the OpenStack environment. The default is 'publicURL'.
-#OPENSTACK_ENDPOINT_TYPE = "publicURL"
-
-# SECONDARY_ENDPOINT_TYPE specifies the fallback endpoint type to use in the
-# case that OPENSTACK_ENDPOINT_TYPE is not present in the endpoints
-# in the Keystone service catalog. Use this setting when Horizon is running
-# external to the OpenStack environment. The default is None. This
-# value should differ from OPENSTACK_ENDPOINT_TYPE if used.
-#SECONDARY_ENDPOINT_TYPE = None
-
-# The number of objects (Swift containers/objects or images) to display
-# on a single page before providing a paging element (a "more" link)
-# to paginate results.
-API_RESULT_LIMIT = 1000
-API_RESULT_PAGE_SIZE = 20
-
-# The size of chunk in bytes for downloading objects from Swift
-SWIFT_FILE_TRANSFER_CHUNK_SIZE = 512 * 1024
-
-# The default number of lines displayed for instance console log.
-INSTANCE_LOG_LENGTH = 35
-
-# Specify a maximum number of items to display in a dropdown.
-DROPDOWN_MAX_ITEMS = 30
-
-# The timezone of the server. This should correspond with the timezone
-# of your entire OpenStack installation, and hopefully be in UTC.
-TIME_ZONE = "UTC"
-
-# When launching an instance, the menu of available flavors is
-# sorted by RAM usage, ascending. If you would like a different sort order,
-# you can provide another flavor attribute as sorting key. Alternatively, you
-# can provide a custom callback method to use for sorting. You can also provide
-# a flag for reverse sort. For more info, see
-# http://docs.python.org/2/library/functions.html#sorted
-#CREATE_INSTANCE_FLAVOR_SORT = {
-#    'key': 'name',
-#     # or
-#    'key': my_awesome_callback_method,
-#    'reverse': False,
-#}
-
-# Set this to True to display an 'Admin Password' field on the Change Password
-# form to verify that it is indeed the admin logged-in who wants to change
-# the password.
-#ENFORCE_PASSWORD_CHECK = False
-
-# Modules that provide /auth routes that can be used to handle different types
-# of user authentication. Add auth plugins that require extra route handling to
-# this list.
-#AUTHENTICATION_URLS = [
-#    'openstack_auth.urls',
-#]
-
-# The Horizon Policy Enforcement engine uses these values to load per service
-# policy rule files. The content of these files should match the files the
-# OpenStack services are using to determine role based access control in the
-# target installation.
-
-# Path to directory containing policy.json files
-#POLICY_FILES_PATH = os.path.join(ROOT_PATH, "conf")
-
-# Map of local copy of service policy files.
-# Please insure that your identity policy file matches the one being used on
-# your keystone servers. There is an alternate policy file that may be used
-# in the Keystone v3 multi-domain case, policy.v3cloudsample.json.
-# This file is not included in the Horizon repository by default but can be
-# found at
-# http://git.openstack.org/cgit/openstack/keystone/tree/etc/ \
-# policy.v3cloudsample.json
-# Having matching policy files on the Horizon and Keystone servers is essential
-# for normal operation. This holds true for all services and their policy files.
-#POLICY_FILES = {
-#    'identity': 'keystone_policy.json',
-#    'compute': 'nova_policy.json',
-#    'volume': 'cinder_policy.json',
-#    'image': 'glance_policy.json',
-#    'orchestration': 'heat_policy.json',
-#    'network': 'neutron_policy.json',
-#}
-
-# TODO: (david-lyle) remove when plugins support adding settings.
-# Note: Only used when trove-dashboard plugin is configured to be used by
-# Horizon.
-# Trove user and database extension support. By default support for
-# creating users and databases on database instances is turned on.
-# To disable these extensions set the permission here to something
-# unusable such as ["!"].
-#TROVE_ADD_USER_PERMS = []
-#TROVE_ADD_DATABASE_PERMS = []
-
-# Change this patch to the appropriate list of tuples containing
-# a key, label and static directory containing two files:
-# _variables.scss and _styles.scss
-#AVAILABLE_THEMES = [
-#    ('default', 'Default', 'themes/default'),
-#    ('material', 'Material', 'themes/material'),
-#]
-
-LOGGING = {
-    'version': 1,
-    # When set to True this will disable all logging except
-    # for loggers specified in this configuration dictionary. Note that
-    # if nothing is specified here and disable_existing_loggers is True,
-    # django.db.backends will still log unless it is disabled explicitly.
-    'disable_existing_loggers': False,
-    # If apache2 mod_wsgi is used to deploy OpenStack dashboard
-    # timestamp is output by mod_wsgi. If WSGI framework you use does not
-    # output timestamp for logging, add %(asctime)s in the following
-    # format definitions.
-    'formatters': {
-        'console': {
-            'format': '%(levelname)s %(name)s %(message)s'
-        },
-        'operation': {
-            # The format of "%(message)s" is defined by
-            # OPERATION_LOG_OPTIONS['format']
-            'format': '%(message)s'
-        },
-    },
-    'handlers': {
-        'null': {
-            'level': 'DEBUG',
-            'class': 'logging.NullHandler',
-        },
-        'console': {
-            # Set the level to "DEBUG" for verbose output logging.
-            'level': 'INFO',
-            'class': 'logging.StreamHandler',
-            'formatter': 'console',
-        },
-        'operation': {
-            'level': 'INFO',
-            'class': 'logging.StreamHandler',
-            'formatter': 'operation',
-        },
-    },
-    'loggers': {
-        # Logging from django.db.backends is VERY verbose, send to null
-        # by default.
-        'django.db.backends': {
-            'handlers': ['null'],
-            'propagate': False,
-        },
-        'requests': {
-            'handlers': ['null'],
-            'propagate': False,
-        },
-        'horizon': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'horizon.operation_log': {
-            'handlers': ['operation'],
-            'level': 'INFO',
-            'propagate': False,
-        },
-        'openstack_dashboard': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'novaclient': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'cinderclient': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'keystoneclient': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'glanceclient': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'neutronclient': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'heatclient': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'swiftclient': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'openstack_auth': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'nose.plugins.manager': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'django': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'iso8601': {
-            'handlers': ['null'],
-            'propagate': False,
-        },
-        'scss': {
-            'handlers': ['null'],
-            'propagate': False,
-        },
-    },
+    "image_type": _("Image Type")
+}
+
+HORIZON_IMAGES_UPLOAD_MODE = "legacy"
+IMAGES_ALLOW_LOCATION = True
+
+
+# Disable simplified floating IP address management for deployments with
+# multiple floating IP pools or complex network requirements.
+# HORIZON_CONFIG["simple_ip_management"] = False
+
+# The OPENSTACK_NEUTRON_NETWORK settings can be used to enable optional
+# services provided by neutron. Options currently available are load
+# balancer service, security groups, quotas, VPN service.
+
+OPENSTACK_NEUTRON_NETWORK = {
+    'enable_lb': True,
+    'enable_firewall': False,
+    'enable_quotas': True,
+    'enable_security_group': True,
+    'enable_vpn': False,
+    # The profile_support option is used to detect if an external router can
+    # be configured via the dashboard. When using specific plugins the
+    # profile_support can be turned on if needed.
+    'profile_support': None,
+    'enable_fip_topology_check': True,
+
+    #'profile_support': 'cisco',
 }
 
 # 'direction' should not be specified for all_tcp/udp/icmp.
 # It is specified in the form.
 SECURITY_GROUP_RULES = {
     'all_tcp': {
-        'name': _('All TCP'),
+        'name': 'ALL TCP',
         'ip_protocol': 'tcp',
         'from_port': '1',
         'to_port': '65535',
     },
     'all_udp': {
-        'name': _('All UDP'),
+        'name': 'ALL UDP',
         'ip_protocol': 'udp',
         'from_port': '1',
         'to_port': '65535',
     },
     'all_icmp': {
-        'name': _('All ICMP'),
+        'name': 'ALL ICMP',
         'ip_protocol': 'icmp',
         'from_port': '-1',
         'to_port': '-1',
@@ -755,160 +524,12 @@
     },
 }
 
-# Deprecation Notice:
-#
-# The setting FLAVOR_EXTRA_KEYS has been deprecated.
-# Please load extra spec metadata into the Glance Metadata Definition Catalog.
-#
-# The sample quota definitions can be found in:
-# <glance_source>/etc/metadefs/compute-quota.json
-#
-# The metadata definition catalog supports CLI and API:
-#  $glance --os-image-api-version 2 help md-namespace-import
-#  $glance-manage db_load_metadefs <directory_with_definition_files>
-#
-# See Metadata Definitions on: http://docs.openstack.org/developer/glance/
-
-# TODO: (david-lyle) remove when plugins support settings natively
-# Note: This is only used when the Sahara plugin is configured and enabled
-# for use in Horizon.
-# Indicate to the Sahara data processing service whether or not
-# automatic floating IP allocation is in effect.  If it is not
-# in effect, the user will be prompted to choose a floating IP
-# pool for use in their cluster.  False by default.  You would want
-# to set this to True if you were running Nova Networking with
-# auto_assign_floating_ip = True.
-#SAHARA_AUTO_IP_ALLOCATION_ENABLED = False
-
-# The hash algorithm to use for authentication tokens. This must
-# match the hash algorithm that the identity server and the
-# auth_token middleware are using. Allowed values are the
-# algorithms supported by Python's hashlib library.
-#OPENSTACK_TOKEN_HASH_ALGORITHM = 'md5'
-
-# AngularJS requires some settings to be made available to
-# the client side. Some settings are required by in-tree / built-in horizon
-# features. These settings must be added to REST_API_REQUIRED_SETTINGS in the
-# form of ['SETTING_1','SETTING_2'], etc.
-#
-# You may remove settings from this list for security purposes, but do so at
-# the risk of breaking a built-in horizon feature. These settings are required
-# for horizon to function properly. Only remove them if you know what you
-# are doing. These settings may in the future be moved to be defined within
-# the enabled panel configuration.
-# You should not add settings to this list for out of tree extensions.
-# See: https://wiki.openstack.org/wiki/Horizon/RESTAPI
-REST_API_REQUIRED_SETTINGS = ['OPENSTACK_HYPERVISOR_FEATURES',
-                              'LAUNCH_INSTANCE_DEFAULTS',
-                              'OPENSTACK_IMAGE_FORMATS',
-                              'OPENSTACK_KEYSTONE_DEFAULT_DOMAIN',
-                              'CREATE_IMAGE_DEFAULTS']
-
-# Additional settings can be made available to the client side for
-# extensibility by specifying them in REST_API_ADDITIONAL_SETTINGS
-# !! Please use extreme caution as the settings are transferred via HTTP/S
-# and are not encrypted on the browser. This is an experimental API and
-# may be deprecated in the future without notice.
-#REST_API_ADDITIONAL_SETTINGS = []
-
-###############################################################################
-# Ubuntu Settings
-###############################################################################
-
- # The default theme if no cookie is present
-DEFAULT_THEME = 'ubuntu'
-
-# Default Ubuntu apache configuration uses /horizon as the application root.
-WEBROOT='/horizon/'
-
-# By default, validation of the HTTP Host header is disabled.  Production
-# installations should have this set accordingly.  For more information
-# see https://docs.djangoproject.com/en/dev/ref/settings/.
-ALLOWED_HOSTS = '*'
-
-# Compress all assets offline as part of packaging installation
-COMPRESS_OFFLINE = True
-
-# DISALLOW_IFRAME_EMBED can be used to prevent Horizon from being embedded
-# within an iframe. Legacy browsers are still vulnerable to a Cross-Frame
-# Scripting (XFS) vulnerability, so this option allows extra security hardening
-# where iframes are not used in deployment. Default setting is True.
-# For more information see:
-# http://tinyurl.com/anticlickjack
-#DISALLOW_IFRAME_EMBED = True
-
-# Help URL can be made available for the client. To provide a help URL, edit the
-# following attribute to the URL of your choice.
-#HORIZON_CONFIG["help_url"] = "http://openstack.mycompany.org"
-
-# Settings for OperationLogMiddleware
-# OPERATION_LOG_ENABLED is flag to use the function to log an operation on
-# Horizon.
-# mask_targets is arrangement for appointing a target to mask.
-# method_targets is arrangement of HTTP method to output log.
-# format is the log contents.
-#OPERATION_LOG_ENABLED = False
-#OPERATION_LOG_OPTIONS = {
-#    'mask_fields': ['password'],
-#    'target_methods': ['POST'],
-#    'ignored_urls': ['/js/', '/static/', '^/api/'],
-#    'format': ("[%(client_ip)s] [%(domain_name)s]"
-#        " [%(domain_id)s] [%(project_name)s]"
-#        " [%(project_id)s] [%(user_name)s] [%(user_id)s] [%(request_scheme)s]"
-#        " [%(referer_url)s] [%(request_url)s] [%(message)s] [%(method)s]"
-#        " [%(http_status)s] [%(param)s]"),
-#}
-
-# The default date range in the Overview panel meters - either <today> minus N
-# days (if the value is integer N), or from the beginning of the current month
-# until today (if set to None). This setting should be used to limit the amount
-# of data fetched by default when rendering the Overview panel.
-#OVERVIEW_DAYS_RANGE = 1
-
-# To allow operators to require users provide a search criteria first
-# before loading any data into the views, set the following dict
-# attributes to True in each one of the panels you want to enable this feature.
-# Follow the convention <dashboard>.<view>
-#FILTER_DATA_FIRST = {
-#    'admin.instances': False,
-#    'admin.images': False,
-#    'admin.networks': False,
-#    'admin.routers': False,
-#    'admin.volumes': False,
-#    'identity.users': False,
-#    'identity.projects': False,
-#    'identity.groups': False,
-#    'identity.roles': False
-#}
-
-# Dict used to restrict user private subnet cidr range.
-# An empty list means that user input will not be restricted
-# for a corresponding IP version. By default, there is
-# no restriction for IPv4 or IPv6. To restrict
-# user private subnet cidr range set ALLOWED_PRIVATE_SUBNET_CIDR
-# to something like
-#ALLOWED_PRIVATE_SUBNET_CIDR = {
-#    'ipv4': ['10.0.0.0/8', '192.168.0.0/16'],
-#    'ipv6': ['fc00::/7']
-#}
-ALLOWED_PRIVATE_SUBNET_CIDR = {'ipv4': [], 'ipv6': []}
-
-# Projects and users can have extra attributes as defined by keystone v3.
-# Horizon has the ability to display these extra attributes via this setting.
-# If you'd like to display extra data in the project or user tables, set the
-# corresponding dict key to the attribute name, followed by the display name.
-# For more information, see horizon's customization (http://docs.openstack.org/developer/horizon/topics/customizing.html#horizon-customization-module-overrides)
-#PROJECT_TABLE_EXTRA_INFO = {
-#   'phone_num': _('Phone Number'),
-#}
-#USER_TABLE_EXTRA_INFO = {
-#   'phone_num': _('Phone Number'),
-#}
-
-# Password will have an expiration date when using keystone v3 and enabling the
-# feature.
-# This setting allows you to set the number of days that the user will be alerted
-# prior to the password expiration.
-# Once the password expires keystone will deny the access and users must
-# contact an admin to change their password.
-#PASSWORD_EXPIRES_WARNING_THRESHOLD_DAYS = 0
+
+
+
+
+
+USE_SSL = True
+CSRF_COOKIE_SECURE = True
+SESSION_COOKIE_HTTPONLY = True

2018-03-30 07:08:15,865 [salt.state       ][INFO    ][6189] Loading fresh modules for state activity
2018-03-30 07:08:15,914 [salt.state       ][INFO    ][6189] Completed state [/etc/openstack-dashboard/local_settings.py] at time 07:08:15.914741 duration_in_ms=451.836
2018-03-30 07:08:15,924 [salt.state       ][INFO    ][6189] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/nova_policy.json] at time 07:08:15.924026
2018-03-30 07:08:15,924 [salt.state       ][INFO    ][6189] Executing state file.managed for /usr/share/openstack-dashboard/openstack_dashboard/conf/nova_policy.json
2018-03-30 07:08:15,962 [salt.fileclient  ][INFO    ][6189] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/pike/nova_policy.json'
2018-03-30 07:08:15,968 [salt.state       ][INFO    ][6189] File changed:
--- 
+++ 
@@ -2,175 +2,436 @@
     "context_is_admin":  "role:admin",
     "admin_or_owner":  "is_admin:True or project_id:%(project_id)s",
     "default": "rule:admin_or_owner",
+
+    "cells_scheduler_filter:TargetCellFilter": "is_admin:True",
+
+    "compute:create": "rule:admin_or_owner",
+    "compute:create:attach_network": "rule:admin_or_owner",
+    "compute:create:attach_volume": "rule:admin_or_owner",
+    "compute:create:forced_host": "is_admin:True",
+
+    "compute:get": "rule:admin_or_owner",
+    "compute:get_all": "rule:admin_or_owner",
+    "compute:get_all_tenants": "is_admin:True",
+
+    "compute:update": "rule:admin_or_owner",
+
+    "compute:get_instance_metadata": "rule:admin_or_owner",
+    "compute:get_all_instance_metadata": "rule:admin_or_owner",
+    "compute:get_all_instance_system_metadata": "rule:admin_or_owner",
+    "compute:update_instance_metadata": "rule:admin_or_owner",
+    "compute:delete_instance_metadata": "rule:admin_or_owner",
+
+    "compute:get_diagnostics": "rule:admin_or_owner",
+    "compute:get_instance_diagnostics": "rule:admin_or_owner",
+
+    "compute:start": "rule:admin_or_owner",
+    "compute:stop": "rule:admin_or_owner",
+
+    "compute:lock": "rule:admin_or_owner",
+    "compute:unlock": "rule:admin_or_owner",
+    "compute:unlock_override": "rule:admin_api",
+
+    "compute:get_vnc_console": "rule:admin_or_owner",
+    "compute:get_spice_console": "rule:admin_or_owner",
+    "compute:get_rdp_console": "rule:admin_or_owner",
+    "compute:get_serial_console": "rule:admin_or_owner",
+    "compute:get_mks_console": "rule:admin_or_owner",
+    "compute:get_console_output": "rule:admin_or_owner",
+
+    "compute:reset_network": "rule:admin_or_owner",
+    "compute:inject_network_info": "rule:admin_or_owner",
+    "compute:add_fixed_ip": "rule:admin_or_owner",
+    "compute:remove_fixed_ip": "rule:admin_or_owner",
+
+    "compute:attach_volume": "rule:admin_or_owner",
+    "compute:detach_volume": "rule:admin_or_owner",
+    "compute:swap_volume": "rule:admin_api",
+
+    "compute:attach_interface": "rule:admin_or_owner",
+    "compute:detach_interface": "rule:admin_or_owner",
+
+    "compute:set_admin_password": "rule:admin_or_owner",
+
+    "compute:rescue": "rule:admin_or_owner",
+    "compute:unrescue": "rule:admin_or_owner",
+
+    "compute:suspend": "rule:admin_or_owner",
+    "compute:resume": "rule:admin_or_owner",
+
+    "compute:pause": "rule:admin_or_owner",
+    "compute:unpause": "rule:admin_or_owner",
+
+    "compute:shelve": "rule:admin_or_owner",
+    "compute:shelve_offload": "rule:admin_or_owner",
+    "compute:unshelve": "rule:admin_or_owner",
+
+    "compute:snapshot": "rule:admin_or_owner",
+    "compute:snapshot_volume_backed": "rule:admin_or_owner",
+    "compute:backup": "rule:admin_or_owner",
+
+    "compute:resize": "rule:admin_or_owner",
+    "compute:confirm_resize": "rule:admin_or_owner",
+    "compute:revert_resize": "rule:admin_or_owner",
+
+    "compute:rebuild": "rule:admin_or_owner",
+    "compute:reboot": "rule:admin_or_owner",
+    "compute:delete": "rule:admin_or_owner",
+    "compute:soft_delete": "rule:admin_or_owner",
+    "compute:force_delete": "rule:admin_or_owner",
+
+    "compute:security_groups:add_to_instance": "rule:admin_or_owner",
+    "compute:security_groups:remove_from_instance": "rule:admin_or_owner",
+
+    "compute:restore": "rule:admin_or_owner",
+
+    "compute:volume_snapshot_create": "rule:admin_or_owner",
+    "compute:volume_snapshot_delete": "rule:admin_or_owner",
+
     "admin_api": "is_admin:True",
-
+    "compute_extension:accounts": "rule:admin_api",
+    "compute_extension:admin_actions": "rule:admin_api",
+    "compute_extension:admin_actions:pause": "rule:admin_or_owner",
+    "compute_extension:admin_actions:unpause": "rule:admin_or_owner",
+    "compute_extension:admin_actions:suspend": "rule:admin_or_owner",
+    "compute_extension:admin_actions:resume": "rule:admin_or_owner",
+    "compute_extension:admin_actions:lock": "rule:admin_or_owner",
+    "compute_extension:admin_actions:unlock": "rule:admin_or_owner",
+    "compute_extension:admin_actions:resetNetwork": "rule:admin_api",
+    "compute_extension:admin_actions:injectNetworkInfo": "rule:admin_api",
+    "compute_extension:admin_actions:createBackup": "rule:admin_or_owner",
+    "compute_extension:admin_actions:migrateLive": "rule:admin_api",
+    "compute_extension:admin_actions:resetState": "rule:admin_api",
+    "compute_extension:admin_actions:migrate": "rule:admin_api",
+    "compute_extension:aggregates": "rule:admin_api",
+    "compute_extension:agents": "rule:admin_api",
+    "compute_extension:attach_interfaces": "rule:admin_or_owner",
+    "compute_extension:baremetal_nodes": "rule:admin_api",
+    "compute_extension:cells": "rule:admin_api",
+    "compute_extension:cells:create": "rule:admin_api",
+    "compute_extension:cells:delete": "rule:admin_api",
+    "compute_extension:cells:update": "rule:admin_api",
+    "compute_extension:cells:sync_instances": "rule:admin_api",
+    "compute_extension:certificates": "rule:admin_or_owner",
+    "compute_extension:cloudpipe": "rule:admin_api",
+    "compute_extension:cloudpipe_update": "rule:admin_api",
+    "compute_extension:config_drive": "rule:admin_or_owner",
+    "compute_extension:console_output": "rule:admin_or_owner",
+    "compute_extension:consoles": "rule:admin_or_owner",
+    "compute_extension:createserverext": "rule:admin_or_owner",
+    "compute_extension:deferred_delete": "rule:admin_or_owner",
+    "compute_extension:disk_config": "rule:admin_or_owner",
+    "compute_extension:evacuate": "rule:admin_api",
+    "compute_extension:extended_server_attributes": "rule:admin_api",
+    "compute_extension:extended_status": "rule:admin_or_owner",
+    "compute_extension:extended_availability_zone": "rule:admin_or_owner",
+    "compute_extension:extended_ips": "rule:admin_or_owner",
+    "compute_extension:extended_ips_mac": "rule:admin_or_owner",
+    "compute_extension:extended_vif_net": "rule:admin_or_owner",
+    "compute_extension:extended_volumes": "rule:admin_or_owner",
+    "compute_extension:fixed_ips": "rule:admin_api",
+    "compute_extension:flavor_access": "rule:admin_or_owner",
+    "compute_extension:flavor_access:addTenantAccess": "rule:admin_api",
+    "compute_extension:flavor_access:removeTenantAccess": "rule:admin_api",
+    "compute_extension:flavor_disabled": "rule:admin_or_owner",
+    "compute_extension:flavor_rxtx": "rule:admin_or_owner",
+    "compute_extension:flavor_swap": "rule:admin_or_owner",
+    "compute_extension:flavorextradata": "rule:admin_or_owner",
+    "compute_extension:flavorextraspecs:index": "rule:admin_or_owner",
+    "compute_extension:flavorextraspecs:show": "rule:admin_or_owner",
+    "compute_extension:flavorextraspecs:create": "rule:admin_api",
+    "compute_extension:flavorextraspecs:update": "rule:admin_api",
+    "compute_extension:flavorextraspecs:delete": "rule:admin_api",
+    "compute_extension:flavormanage": "rule:admin_api",
+    "compute_extension:floating_ip_dns": "rule:admin_or_owner",
+    "compute_extension:floating_ip_pools": "rule:admin_or_owner",
+    "compute_extension:floating_ips": "rule:admin_or_owner",
+    "compute_extension:floating_ips_bulk": "rule:admin_api",
+    "compute_extension:fping": "rule:admin_or_owner",
+    "compute_extension:fping:all_tenants": "rule:admin_api",
+    "compute_extension:hide_server_addresses": "is_admin:False",
+    "compute_extension:hosts": "rule:admin_api",
+    "compute_extension:hypervisors": "rule:admin_api",
+    "compute_extension:image_size": "rule:admin_or_owner",
+    "compute_extension:instance_actions": "rule:admin_or_owner",
+    "compute_extension:instance_actions:events": "rule:admin_api",
+    "compute_extension:instance_usage_audit_log": "rule:admin_api",
+    "compute_extension:keypairs": "rule:admin_or_owner",
+    "compute_extension:keypairs:index": "rule:admin_or_owner",
+    "compute_extension:keypairs:show": "rule:admin_or_owner",
+    "compute_extension:keypairs:create": "rule:admin_or_owner",
+    "compute_extension:keypairs:delete": "rule:admin_or_owner",
+    "compute_extension:multinic": "rule:admin_or_owner",
+    "compute_extension:networks": "rule:admin_api",
+    "compute_extension:networks:view": "rule:admin_or_owner",
+    "compute_extension:networks_associate": "rule:admin_api",
+    "compute_extension:os-tenant-networks": "rule:admin_or_owner",
+    "compute_extension:quotas:show": "rule:admin_or_owner",
+    "compute_extension:quotas:update": "rule:admin_api",
+    "compute_extension:quotas:delete": "rule:admin_api",
+    "compute_extension:quota_classes": "rule:admin_or_owner",
+    "compute_extension:rescue": "rule:admin_or_owner",
+    "compute_extension:security_group_default_rules": "rule:admin_api",
+    "compute_extension:security_groups": "rule:admin_or_owner",
+    "compute_extension:server_diagnostics": "rule:admin_api",
+    "compute_extension:server_groups": "rule:admin_or_owner",
+    "compute_extension:server_password": "rule:admin_or_owner",
+    "compute_extension:server_usage": "rule:admin_or_owner",
+    "compute_extension:services": "rule:admin_api",
+    "compute_extension:shelve": "rule:admin_or_owner",
+    "compute_extension:shelveOffload": "rule:admin_api",
+    "compute_extension:simple_tenant_usage:show": "rule:admin_or_owner",
+    "compute_extension:simple_tenant_usage:list": "rule:admin_api",
+    "compute_extension:unshelve": "rule:admin_or_owner",
+    "compute_extension:users": "rule:admin_api",
+    "compute_extension:virtual_interfaces": "rule:admin_or_owner",
+    "compute_extension:virtual_storage_arrays": "rule:admin_or_owner",
+    "compute_extension:volumes": "rule:admin_or_owner",
+    "compute_extension:volume_attachments:index": "rule:admin_or_owner",
+    "compute_extension:volume_attachments:show": "rule:admin_or_owner",
+    "compute_extension:volume_attachments:create": "rule:admin_or_owner",
+    "compute_extension:volume_attachments:update": "rule:admin_api",
+    "compute_extension:volume_attachments:delete": "rule:admin_or_owner",
+    "compute_extension:volumetypes": "rule:admin_or_owner",
+    "compute_extension:availability_zone:list": "rule:admin_or_owner",
+    "compute_extension:availability_zone:detail": "rule:admin_api",
+    "compute_extension:used_limits_for_admin": "rule:admin_api",
+    "compute_extension:migrations:index": "rule:admin_api",
+    "compute_extension:os-assisted-volume-snapshots:create": "rule:admin_api",
+    "compute_extension:os-assisted-volume-snapshots:delete": "rule:admin_api",
+    "compute_extension:console_auth_tokens": "rule:admin_api",
+    "compute_extension:os-server-external-events:create": "rule:admin_api",
+
+    "network:get_all": "rule:admin_or_owner",
+    "network:get": "rule:admin_or_owner",
+    "network:create": "rule:admin_or_owner",
+    "network:delete": "rule:admin_or_owner",
+    "network:associate": "rule:admin_or_owner",
+    "network:disassociate": "rule:admin_or_owner",
+    "network:get_vifs_by_instance": "rule:admin_or_owner",
+    "network:allocate_for_instance": "rule:admin_or_owner",
+    "network:deallocate_for_instance": "rule:admin_or_owner",
+    "network:validate_networks": "rule:admin_or_owner",
+    "network:get_instance_uuids_by_ip_filter": "rule:admin_or_owner",
+    "network:get_instance_id_by_floating_address": "rule:admin_or_owner",
+    "network:setup_networks_on_host": "rule:admin_or_owner",
+    "network:get_backdoor_port": "rule:admin_or_owner",
+
+    "network:get_floating_ip": "rule:admin_or_owner",
+    "network:get_floating_ip_pools": "rule:admin_or_owner",
+    "network:get_floating_ip_by_address": "rule:admin_or_owner",
+    "network:get_floating_ips_by_project": "rule:admin_or_owner",
+    "network:get_floating_ips_by_fixed_address": "rule:admin_or_owner",
+    "network:allocate_floating_ip": "rule:admin_or_owner",
+    "network:associate_floating_ip": "rule:admin_or_owner",
+    "network:disassociate_floating_ip": "rule:admin_or_owner",
+    "network:release_floating_ip": "rule:admin_or_owner",
+    "network:migrate_instance_start": "rule:admin_or_owner",
+    "network:migrate_instance_finish": "rule:admin_or_owner",
+
+    "network:get_fixed_ip": "rule:admin_or_owner",
+    "network:get_fixed_ip_by_address": "rule:admin_or_owner",
+    "network:add_fixed_ip_to_instance": "rule:admin_or_owner",
+    "network:remove_fixed_ip_from_instance": "rule:admin_or_owner",
+    "network:add_network_to_project": "rule:admin_or_owner",
+    "network:get_instance_nw_info": "rule:admin_or_owner",
+
+    "network:get_dns_domains": "rule:admin_or_owner",
+    "network:add_dns_entry": "rule:admin_or_owner",
+    "network:modify_dns_entry": "rule:admin_or_owner",
+    "network:delete_dns_entry": "rule:admin_or_owner",
+    "network:get_dns_entries_by_address": "rule:admin_or_owner",
+    "network:get_dns_entries_by_name": "rule:admin_or_owner",
+    "network:create_private_dns_domain": "rule:admin_or_owner",
+    "network:create_public_dns_domain": "rule:admin_or_owner",
+    "network:delete_dns_domain": "rule:admin_or_owner",
+    "network:attach_external_network": "rule:admin_api",
+    "network:get_vif_by_mac_address": "rule:admin_or_owner",
+
+    "os_compute_api:servers:detail:get_all_tenants": "is_admin:True",
+    "os_compute_api:servers:index:get_all_tenants": "is_admin:True",
+    "os_compute_api:servers:confirm_resize": "rule:admin_or_owner",
+    "os_compute_api:servers:create": "rule:admin_or_owner",
+    "os_compute_api:servers:create:attach_network": "rule:admin_or_owner",
+    "os_compute_api:servers:create:attach_volume": "rule:admin_or_owner",
+    "os_compute_api:servers:create:forced_host": "rule:admin_api",
+    "os_compute_api:servers:delete": "rule:admin_or_owner",
+    "os_compute_api:servers:update": "rule:admin_or_owner",
+    "os_compute_api:servers:detail": "rule:admin_or_owner",
+    "os_compute_api:servers:index": "rule:admin_or_owner",
+    "os_compute_api:servers:reboot": "rule:admin_or_owner",
+    "os_compute_api:servers:rebuild": "rule:admin_or_owner",
+    "os_compute_api:servers:resize": "rule:admin_or_owner",
+    "os_compute_api:servers:revert_resize": "rule:admin_or_owner",
+    "os_compute_api:servers:show": "rule:admin_or_owner",
+    "os_compute_api:servers:show:host_status": "rule:admin_api",
+    "os_compute_api:servers:create_image": "rule:admin_or_owner",
+    "os_compute_api:servers:create_image:allow_volume_backed": "rule:admin_or_owner",
+    "os_compute_api:servers:start": "rule:admin_or_owner",
+    "os_compute_api:servers:stop": "rule:admin_or_owner",
+    "os_compute_api:servers:trigger_crash_dump": "rule:admin_or_owner",
+    "os_compute_api:servers:migrations:force_complete": "rule:admin_api",
+    "os_compute_api:servers:migrations:delete": "rule:admin_api",
+    "os_compute_api:servers:discoverable": "@",
+    "os_compute_api:servers:migrations:index": "rule:admin_api",
+    "os_compute_api:servers:migrations:show": "rule:admin_api",
+    "os_compute_api:os-access-ips:discoverable": "@",
+    "os_compute_api:os-access-ips": "rule:admin_or_owner",
+    "os_compute_api:os-admin-actions": "rule:admin_api",
     "os_compute_api:os-admin-actions:discoverable": "@",
+    "os_compute_api:os-admin-actions:reset_network": "rule:admin_api",
+    "os_compute_api:os-admin-actions:inject_network_info": "rule:admin_api",
     "os_compute_api:os-admin-actions:reset_state": "rule:admin_api",
-    "os_compute_api:os-admin-actions:inject_network_info": "rule:admin_api",
-    "os_compute_api:os-admin-actions": "rule:admin_api",
-    "os_compute_api:os-admin-actions:reset_network": "rule:admin_api",
+    "os_compute_api:os-admin-password": "rule:admin_or_owner",
     "os_compute_api:os-admin-password:discoverable": "@",
-    "os_compute_api:os-admin-password": "rule:admin_or_owner",
+    "os_compute_api:os-aggregates:discoverable": "@",
+    "os_compute_api:os-aggregates:index": "rule:admin_api",
+    "os_compute_api:os-aggregates:create": "rule:admin_api",
+    "os_compute_api:os-aggregates:show": "rule:admin_api",
+    "os_compute_api:os-aggregates:update": "rule:admin_api",
+    "os_compute_api:os-aggregates:delete": "rule:admin_api",
+    "os_compute_api:os-aggregates:add_host": "rule:admin_api",
+    "os_compute_api:os-aggregates:remove_host": "rule:admin_api",
+    "os_compute_api:os-aggregates:set_metadata": "rule:admin_api",
     "os_compute_api:os-agents": "rule:admin_api",
     "os_compute_api:os-agents:discoverable": "@",
-    "os_compute_api:os-aggregates:set_metadata": "rule:admin_api",
-    "os_compute_api:os-aggregates:add_host": "rule:admin_api",
-    "os_compute_api:os-aggregates:discoverable": "@",
-    "os_compute_api:os-aggregates:create": "rule:admin_api",
-    "os_compute_api:os-aggregates:remove_host": "rule:admin_api",
-    "os_compute_api:os-aggregates:update": "rule:admin_api",
-    "os_compute_api:os-aggregates:index": "rule:admin_api",
-    "os_compute_api:os-aggregates:delete": "rule:admin_api",
-    "os_compute_api:os-aggregates:show": "rule:admin_api",
-    "os_compute_api:os-assisted-volume-snapshots:create": "rule:admin_api",
-    "os_compute_api:os-assisted-volume-snapshots:delete": "rule:admin_api",
-    "os_compute_api:os-assisted-volume-snapshots:discoverable": "@",
     "os_compute_api:os-attach-interfaces": "rule:admin_or_owner",
     "os_compute_api:os-attach-interfaces:discoverable": "@",
-    "os_compute_api:os-availability-zone:list": "rule:admin_or_owner",
-    "os_compute_api:os-availability-zone:discoverable": "@",
-    "os_compute_api:os-availability-zone:detail": "rule:admin_api",
+    "os_compute_api:os-baremetal-nodes": "rule:admin_api",
     "os_compute_api:os-baremetal-nodes:discoverable": "@",
-    "os_compute_api:os-baremetal-nodes": "rule:admin_api",
-    "network:attach_external_network": "is_admin:True",
-    "os_compute_api:os-block-device-mapping:discoverable": "@",
     "os_compute_api:os-block-device-mapping-v1:discoverable": "@",
+    "os_compute_api:os-cells": "rule:admin_api",
+    "os_compute_api:os-cells:create": "rule:admin_api",
+    "os_compute_api:os-cells:delete": "rule:admin_api",
+    "os_compute_api:os-cells:update": "rule:admin_api",
+    "os_compute_api:os-cells:sync_instances": "rule:admin_api",
     "os_compute_api:os-cells:discoverable": "@",
-    "os_compute_api:os-cells:update": "rule:admin_api",
-    "os_compute_api:os-cells:create": "rule:admin_api",
-    "os_compute_api:os-cells": "rule:admin_api",
-    "os_compute_api:os-cells:sync_instances": "rule:admin_api",
-    "os_compute_api:os-cells:delete": "rule:admin_api",
-    "cells_scheduler_filter:DifferentCellFilter": "is_admin:True",
-    "cells_scheduler_filter:TargetCellFilter": "is_admin:True",
-    "os_compute_api:os-certificates:discoverable": "@",
     "os_compute_api:os-certificates:create": "rule:admin_or_owner",
     "os_compute_api:os-certificates:show": "rule:admin_or_owner",
+    "os_compute_api:os-certificates:discoverable": "@",
     "os_compute_api:os-cloudpipe": "rule:admin_api",
     "os_compute_api:os-cloudpipe:discoverable": "@",
+    "os_compute_api:os-config-drive": "rule:admin_or_owner",
     "os_compute_api:os-config-drive:discoverable": "@",
-    "os_compute_api:os-config-drive": "rule:admin_or_owner",
-    "os_compute_api:os-console-auth-tokens:discoverable": "@",
-    "os_compute_api:os-console-auth-tokens": "rule:admin_api",
+    "os_compute_api:os-consoles:discoverable": "@",
+    "os_compute_api:os-consoles:create": "rule:admin_or_owner",
+    "os_compute_api:os-consoles:delete": "rule:admin_or_owner",
+    "os_compute_api:os-consoles:index": "rule:admin_or_owner",
+    "os_compute_api:os-consoles:show": "rule:admin_or_owner",
     "os_compute_api:os-console-output:discoverable": "@",
     "os_compute_api:os-console-output": "rule:admin_or_owner",
-    "os_compute_api:os-consoles:create": "rule:admin_or_owner",
-    "os_compute_api:os-consoles:show": "rule:admin_or_owner",
-    "os_compute_api:os-consoles:delete": "rule:admin_or_owner",
-    "os_compute_api:os-consoles:discoverable": "@",
-    "os_compute_api:os-consoles:index": "rule:admin_or_owner",
+    "os_compute_api:os-remote-consoles": "rule:admin_or_owner",
+    "os_compute_api:os-remote-consoles:discoverable": "@",
     "os_compute_api:os-create-backup:discoverable": "@",
     "os_compute_api:os-create-backup": "rule:admin_or_owner",
+    "os_compute_api:os-deferred-delete": "rule:admin_or_owner",
     "os_compute_api:os-deferred-delete:discoverable": "@",
-    "os_compute_api:os-deferred-delete": "rule:admin_or_owner",
+    "os_compute_api:os-disk-config": "rule:admin_or_owner",
+    "os_compute_api:os-disk-config:discoverable": "@",
+    "os_compute_api:os-evacuate": "rule:admin_api",
     "os_compute_api:os-evacuate:discoverable": "@",
-    "os_compute_api:os-evacuate": "rule:admin_api",
+    "os_compute_api:os-extended-server-attributes": "rule:admin_api",
+    "os_compute_api:os-extended-server-attributes:discoverable": "@",
+    "os_compute_api:os-extended-status": "rule:admin_or_owner",
+    "os_compute_api:os-extended-status:discoverable": "@",
     "os_compute_api:os-extended-availability-zone": "rule:admin_or_owner",
     "os_compute_api:os-extended-availability-zone:discoverable": "@",
-    "os_compute_api:os-extended-server-attributes": "rule:admin_api",
-    "os_compute_api:os-extended-server-attributes:discoverable": "@",
-    "os_compute_api:os-extended-status:discoverable": "@",
-    "os_compute_api:os-extended-status": "rule:admin_or_owner",
+    "os_compute_api:extensions": "rule:admin_or_owner",
+    "os_compute_api:extensions:discoverable": "@",
+    "os_compute_api:extension_info:discoverable": "@",
     "os_compute_api:os-extended-volumes": "rule:admin_or_owner",
     "os_compute_api:os-extended-volumes:discoverable": "@",
-    "os_compute_api:extension_info:discoverable": "@",
-    "os_compute_api:extensions": "rule:admin_or_owner",
-    "os_compute_api:extensions:discoverable": "@",
+    "os_compute_api:os-fixed-ips": "rule:admin_api",
     "os_compute_api:os-fixed-ips:discoverable": "@",
-    "os_compute_api:os-fixed-ips": "rule:admin_api",
-    "os_compute_api:os-flavor-access:add_tenant_access": "rule:admin_api",
+    "os_compute_api:os-flavor-access": "rule:admin_or_owner",
     "os_compute_api:os-flavor-access:discoverable": "@",
     "os_compute_api:os-flavor-access:remove_tenant_access": "rule:admin_api",
-    "os_compute_api:os-flavor-access": "rule:admin_or_owner",
+    "os_compute_api:os-flavor-access:add_tenant_access": "rule:admin_api",
+    "os_compute_api:os-flavor-rxtx": "rule:admin_or_owner",
+    "os_compute_api:os-flavor-rxtx:discoverable": "@",
+    "os_compute_api:flavors": "rule:admin_or_owner",
+    "os_compute_api:flavors:discoverable": "@",
+    "os_compute_api:os-flavor-extra-specs:discoverable": "@",
+    "os_compute_api:os-flavor-extra-specs:index": "rule:admin_or_owner",
     "os_compute_api:os-flavor-extra-specs:show": "rule:admin_or_owner",
     "os_compute_api:os-flavor-extra-specs:create": "rule:admin_api",
-    "os_compute_api:os-flavor-extra-specs:discoverable": "@",
     "os_compute_api:os-flavor-extra-specs:update": "rule:admin_api",
     "os_compute_api:os-flavor-extra-specs:delete": "rule:admin_api",
-    "os_compute_api:os-flavor-extra-specs:index": "rule:admin_or_owner",
+    "os_compute_api:os-flavor-manage:discoverable": "@",
     "os_compute_api:os-flavor-manage": "rule:admin_api",
-    "os_compute_api:os-flavor-manage:discoverable": "@",
-    "os_compute_api:os-flavor-rxtx": "rule:admin_or_owner",
-    "os_compute_api:os-flavor-rxtx:discoverable": "@",
-    "os_compute_api:flavors:discoverable": "@",
-    "os_compute_api:flavors": "rule:admin_or_owner",
     "os_compute_api:os-floating-ip-dns": "rule:admin_or_owner",
+    "os_compute_api:os-floating-ip-dns:discoverable": "@",
     "os_compute_api:os-floating-ip-dns:domain:update": "rule:admin_api",
-    "os_compute_api:os-floating-ip-dns:discoverable": "@",
     "os_compute_api:os-floating-ip-dns:domain:delete": "rule:admin_api",
+    "os_compute_api:os-floating-ip-pools": "rule:admin_or_owner",
     "os_compute_api:os-floating-ip-pools:discoverable": "@",
-    "os_compute_api:os-floating-ip-pools": "rule:admin_or_owner",
     "os_compute_api:os-floating-ips": "rule:admin_or_owner",
     "os_compute_api:os-floating-ips:discoverable": "@",
+    "os_compute_api:os-floating-ips-bulk": "rule:admin_api",
     "os_compute_api:os-floating-ips-bulk:discoverable": "@",
-    "os_compute_api:os-floating-ips-bulk": "rule:admin_api",
+    "os_compute_api:os-fping": "rule:admin_or_owner",
+    "os_compute_api:os-fping:discoverable": "@",
     "os_compute_api:os-fping:all_tenants": "rule:admin_api",
-    "os_compute_api:os-fping:discoverable": "@",
-    "os_compute_api:os-fping": "rule:admin_or_owner",
+    "os_compute_api:os-hide-server-addresses": "is_admin:False",
     "os_compute_api:os-hide-server-addresses:discoverable": "@",
-    "os_compute_api:os-hide-server-addresses": "is_admin:False",
+    "os_compute_api:os-hosts": "rule:admin_api",
     "os_compute_api:os-hosts:discoverable": "@",
-    "os_compute_api:os-hosts": "rule:admin_api",
+    "os_compute_api:os-hypervisors": "rule:admin_api",
     "os_compute_api:os-hypervisors:discoverable": "@",
-    "os_compute_api:os-hypervisors": "rule:admin_api",
-    "os_compute_api:image-metadata:discoverable": "@",
+    "os_compute_api:images:discoverable": "@",
+    "os_compute_api:image-size": "rule:admin_or_owner",
     "os_compute_api:image-size:discoverable": "@",
-    "os_compute_api:image-size": "rule:admin_or_owner",
-    "os_compute_api:images:discoverable": "@",
-    "os_compute_api:os-instance-actions:events": "rule:admin_api",
     "os_compute_api:os-instance-actions": "rule:admin_or_owner",
     "os_compute_api:os-instance-actions:discoverable": "@",
+    "os_compute_api:os-instance-actions:events": "rule:admin_api",
     "os_compute_api:os-instance-usage-audit-log": "rule:admin_api",
     "os_compute_api:os-instance-usage-audit-log:discoverable": "@",
     "os_compute_api:ips:discoverable": "@",
+    "os_compute_api:ips:index": "rule:admin_or_owner",
     "os_compute_api:ips:show": "rule:admin_or_owner",
-    "os_compute_api:ips:index": "rule:admin_or_owner",
     "os_compute_api:os-keypairs:discoverable": "@",
+    "os_compute_api:os-keypairs": "rule:admin_or_owner",
     "os_compute_api:os-keypairs:index": "rule:admin_api or user_id:%(user_id)s",
+    "os_compute_api:os-keypairs:show": "rule:admin_api or user_id:%(user_id)s",
     "os_compute_api:os-keypairs:create": "rule:admin_api or user_id:%(user_id)s",
     "os_compute_api:os-keypairs:delete": "rule:admin_api or user_id:%(user_id)s",
-    "os_compute_api:os-keypairs:show": "rule:admin_api or user_id:%(user_id)s",
-    "os_compute_api:os-keypairs": "rule:admin_or_owner",
     "os_compute_api:limits:discoverable": "@",
     "os_compute_api:limits": "rule:admin_or_owner",
     "os_compute_api:os-lock-server:discoverable": "@",
     "os_compute_api:os-lock-server:lock": "rule:admin_or_owner",
+    "os_compute_api:os-lock-server:unlock": "rule:admin_or_owner",
     "os_compute_api:os-lock-server:unlock:unlock_override": "rule:admin_api",
-    "os_compute_api:os-lock-server:unlock": "rule:admin_or_owner",
+    "os_compute_api:os-migrate-server:discoverable": "@",
     "os_compute_api:os-migrate-server:migrate": "rule:admin_api",
-    "os_compute_api:os-migrate-server:discoverable": "@",
     "os_compute_api:os-migrate-server:migrate_live": "rule:admin_api",
-    "os_compute_api:os-migrations:index": "rule:admin_api",
-    "os_compute_api:os-migrations:discoverable": "@",
     "os_compute_api:os-multinic": "rule:admin_or_owner",
     "os_compute_api:os-multinic:discoverable": "@",
-    "os_compute_api:os-multiple-create:discoverable": "@",
-    "os_compute_api:os-networks:discoverable": "@",
     "os_compute_api:os-networks": "rule:admin_api",
     "os_compute_api:os-networks:view": "rule:admin_or_owner",
+    "os_compute_api:os-networks:discoverable": "@",
     "os_compute_api:os-networks-associate": "rule:admin_api",
     "os_compute_api:os-networks-associate:discoverable": "@",
-    "os_compute_api:os-pause-server:unpause": "rule:admin_or_owner",
     "os_compute_api:os-pause-server:discoverable": "@",
     "os_compute_api:os-pause-server:pause": "rule:admin_or_owner",
+    "os_compute_api:os-pause-server:unpause": "rule:admin_or_owner",
+    "os_compute_api:os-pci:pci_servers": "rule:admin_or_owner",
+    "os_compute_api:os-pci:discoverable": "@",
     "os_compute_api:os-pci:index": "rule:admin_api",
     "os_compute_api:os-pci:detail": "rule:admin_api",
-    "os_compute_api:os-pci:pci_servers": "rule:admin_or_owner",
     "os_compute_api:os-pci:show": "rule:admin_api",
-    "os_compute_api:os-pci:discoverable": "@",
+    "os_compute_api:os-personality:discoverable": "@",
+    "os_compute_api:os-preserve-ephemeral-rebuild:discoverable": "@",
+    "os_compute_api:os-quota-sets:discoverable": "@",
+    "os_compute_api:os-quota-sets:show": "rule:admin_or_owner",
+    "os_compute_api:os-quota-sets:defaults": "@",
+    "os_compute_api:os-quota-sets:update": "rule:admin_api",
+    "os_compute_api:os-quota-sets:delete": "rule:admin_api",
+    "os_compute_api:os-quota-sets:detail": "rule:admin_api",
+    "os_compute_api:os-quota-class-sets:update": "rule:admin_api",
     "os_compute_api:os-quota-class-sets:show": "is_admin:True or quota_class:%(quota_class)s",
     "os_compute_api:os-quota-class-sets:discoverable": "@",
-    "os_compute_api:os-quota-class-sets:update": "rule:admin_api",
-    "os_compute_api:os-quota-sets:update": "rule:admin_api",
-    "os_compute_api:os-quota-sets:defaults": "@",
-    "os_compute_api:os-quota-sets:show": "rule:admin_or_owner",
-    "os_compute_api:os-quota-sets:delete": "rule:admin_api",
-    "os_compute_api:os-quota-sets:discoverable": "@",
-    "os_compute_api:os-quota-sets:detail": "rule:admin_api",
-    "os_compute_api:os-remote-consoles": "rule:admin_or_owner",
-    "os_compute_api:os-remote-consoles:discoverable": "@",
+    "os_compute_api:os-rescue": "rule:admin_or_owner",
     "os_compute_api:os-rescue:discoverable": "@",
-    "os_compute_api:os-rescue": "rule:admin_or_owner",
     "os_compute_api:os-scheduler-hints:discoverable": "@",
     "os_compute_api:os-security-group-default-rules:discoverable": "@",
     "os_compute_api:os-security-group-default-rules": "rule:admin_api",
@@ -178,82 +439,62 @@
     "os_compute_api:os-security-groups:discoverable": "@",
     "os_compute_api:os-server-diagnostics": "rule:admin_api",
     "os_compute_api:os-server-diagnostics:discoverable": "@",
-    "os_compute_api:os-server-external-events:create": "rule:admin_api",
-    "os_compute_api:os-server-external-events:discoverable": "@",
+    "os_compute_api:os-server-password": "rule:admin_or_owner",
+    "os_compute_api:os-server-password:discoverable": "@",
+    "os_compute_api:os-server-usage": "rule:admin_or_owner",
+    "os_compute_api:os-server-usage:discoverable": "@",
+    "os_compute_api:os-server-groups": "rule:admin_or_owner",
     "os_compute_api:os-server-groups:discoverable": "@",
-    "os_compute_api:os-server-groups": "rule:admin_or_owner",
+    "os_compute_api:os-server-tags:index": "@",
+    "os_compute_api:os-server-tags:show": "@",
+    "os_compute_api:os-server-tags:update": "@",
+    "os_compute_api:os-server-tags:update_all": "@",
+    "os_compute_api:os-server-tags:delete": "@",
+    "os_compute_api:os-server-tags:delete_all": "@",
+    "os_compute_api:os-services": "rule:admin_api",
+    "os_compute_api:os-services:discoverable": "@",
+    "os_compute_api:server-metadata:discoverable": "@",
     "os_compute_api:server-metadata:index": "rule:admin_or_owner",
     "os_compute_api:server-metadata:show": "rule:admin_or_owner",
+    "os_compute_api:server-metadata:delete": "rule:admin_or_owner",
     "os_compute_api:server-metadata:create": "rule:admin_or_owner",
-    "os_compute_api:server-metadata:discoverable": "@",
+    "os_compute_api:server-metadata:update": "rule:admin_or_owner",
     "os_compute_api:server-metadata:update_all": "rule:admin_or_owner",
-    "os_compute_api:server-metadata:delete": "rule:admin_or_owner",
-    "os_compute_api:server-metadata:update": "rule:admin_or_owner",
-    "os_compute_api:os-server-password": "rule:admin_or_owner",
-    "os_compute_api:os-server-password:discoverable": "@",
-    "os_compute_api:os-server-tags:delete_all": "@",
-    "os_compute_api:os-server-tags:index": "@",
-    "os_compute_api:os-server-tags:update_all": "@",
-    "os_compute_api:os-server-tags:delete": "@",
-    "os_compute_api:os-server-tags:update": "@",
-    "os_compute_api:os-server-tags:show": "@",
-    "os_compute_api:os-server-tags:discoverable": "@",
-    "os_compute_api:os-server-usage": "rule:admin_or_owner",
-    "os_compute_api:os-server-usage:discoverable": "@",
-    "os_compute_api:servers:index": "rule:admin_or_owner",
-    "os_compute_api:servers:detail": "rule:admin_or_owner",
-    "os_compute_api:servers:detail:get_all_tenants": "rule:admin_api",
-    "os_compute_api:servers:index:get_all_tenants": "rule:admin_api",
-    "os_compute_api:servers:show": "rule:admin_or_owner",
-    "os_compute_api:servers:show:host_status": "rule:admin_api",
-    "os_compute_api:servers:create": "rule:admin_or_owner",
-    "os_compute_api:servers:create:forced_host": "rule:admin_or_owner",
-    "os_compute_api:servers:create:attach_volume": "rule:admin_or_owner",
-    "os_compute_api:servers:create:attach_network": "rule:admin_or_owner",
-    "os_compute_api:servers:delete": "rule:admin_or_owner",
-    "os_compute_api:servers:update": "rule:admin_or_owner",
-    "os_compute_api:servers:confirm_resize": "rule:admin_or_owner",
-    "os_compute_api:servers:revert_resize": "rule:admin_or_owner",
-    "os_compute_api:servers:reboot": "rule:admin_or_owner",
-    "os_compute_api:servers:resize": "rule:admin_or_owner",
-    "os_compute_api:servers:rebuild": "rule:admin_or_owner",
-    "os_compute_api:servers:create_image": "rule:admin_or_owner",
-    "os_compute_api:servers:create_image:allow_volume_backed": "rule:admin_or_owner",
-    "os_compute_api:servers:start": "rule:admin_or_owner",
-    "os_compute_api:servers:stop": "rule:admin_or_owner",
-    "os_compute_api:servers:trigger_crash_dump": "rule:admin_or_owner",
-    "os_compute_api:servers:discoverable": "@",
-    "os_compute_api:servers:migrations:show": "rule:admin_api",
-    "os_compute_api:servers:migrations:force_complete": "rule:admin_api",
-    "os_compute_api:servers:migrations:delete": "rule:admin_api",
-    "os_compute_api:servers:migrations:index": "rule:admin_api",
-    "os_compute_api:server-migrations:discoverable": "@",
-    "os_compute_api:os-services": "rule:admin_api",
-    "os_compute_api:os-services:discoverable": "@",
     "os_compute_api:os-shelve:shelve": "rule:admin_or_owner",
-    "os_compute_api:os-shelve:unshelve": "rule:admin_or_owner",
+    "os_compute_api:os-shelve:shelve:discoverable": "@",
     "os_compute_api:os-shelve:shelve_offload": "rule:admin_api",
-    "os_compute_api:os-shelve:discoverable": "@",
+    "os_compute_api:os-simple-tenant-usage:discoverable": "@",
     "os_compute_api:os-simple-tenant-usage:show": "rule:admin_or_owner",
     "os_compute_api:os-simple-tenant-usage:list": "rule:admin_api",
-    "os_compute_api:os-simple-tenant-usage:discoverable": "@",
+    "os_compute_api:os-suspend-server:discoverable": "@",
+    "os_compute_api:os-suspend-server:suspend": "rule:admin_or_owner",
     "os_compute_api:os-suspend-server:resume": "rule:admin_or_owner",
-    "os_compute_api:os-suspend-server:suspend": "rule:admin_or_owner",
-    "os_compute_api:os-suspend-server:discoverable": "@",
     "os_compute_api:os-tenant-networks": "rule:admin_or_owner",
     "os_compute_api:os-tenant-networks:discoverable": "@",
+    "os_compute_api:os-shelve:unshelve": "rule:admin_or_owner",
+    "os_compute_api:os-user-data:discoverable": "@",
+    "os_compute_api:os-virtual-interfaces": "rule:admin_or_owner",
+    "os_compute_api:os-virtual-interfaces:discoverable": "@",
+    "os_compute_api:os-volumes": "rule:admin_or_owner",
+    "os_compute_api:os-volumes:discoverable": "@",
+    "os_compute_api:os-volumes-attachments:index": "rule:admin_or_owner",
+    "os_compute_api:os-volumes-attachments:show": "rule:admin_or_owner",
+    "os_compute_api:os-volumes-attachments:create": "rule:admin_or_owner",
+    "os_compute_api:os-volumes-attachments:update": "rule:admin_api",
+    "os_compute_api:os-volumes-attachments:delete": "rule:admin_or_owner",
+    "os_compute_api:os-volumes-attachments:discoverable": "@",
+    "os_compute_api:os-availability-zone:list": "rule:admin_or_owner",
+    "os_compute_api:os-availability-zone:discoverable": "@",
+    "os_compute_api:os-availability-zone:detail": "rule:admin_api",
+    "os_compute_api:os-used-limits": "rule:admin_api",
     "os_compute_api:os-used-limits:discoverable": "@",
-    "os_compute_api:os-used-limits": "rule:admin_api",
-    "os_compute_api:os-user-data:discoverable": "@",
-    "os_compute_api:versions:discoverable": "@",
-    "os_compute_api:os-virtual-interfaces:discoverable": "@",
-    "os_compute_api:os-virtual-interfaces": "rule:admin_or_owner",
-    "os_compute_api:os-volumes:discoverable": "@",
-    "os_compute_api:os-volumes": "rule:admin_or_owner",
-    "os_compute_api:os-volumes-attachments:index": "rule:admin_or_owner",
-    "os_compute_api:os-volumes-attachments:create": "rule:admin_or_owner",
-    "os_compute_api:os-volumes-attachments:show": "rule:admin_or_owner",
-    "os_compute_api:os-volumes-attachments:discoverable": "@",
-    "os_compute_api:os-volumes-attachments:update": "rule:admin_api",
-    "os_compute_api:os-volumes-attachments:delete": "rule:admin_or_owner"
+    "os_compute_api:os-migrations:index": "rule:admin_api",
+    "os_compute_api:os-migrations:discoverable": "@",
+    "os_compute_api:os-assisted-volume-snapshots:create": "rule:admin_api",
+    "os_compute_api:os-assisted-volume-snapshots:delete": "rule:admin_api",
+    "os_compute_api:os-assisted-volume-snapshots:discoverable": "@",
+    "os_compute_api:os-console-auth-tokens": "rule:admin_api",
+    "os_compute_api:os-console-auth-tokens:discoverable": "@",
+    "os_compute_api:os-server-external-events:create": "rule:admin_api",
+    "os_compute_api:os-server-external-events:discoverable": "@"
 }

2018-03-30 07:08:15,970 [salt.state       ][INFO    ][6189] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/nova_policy.json] at time 07:08:15.970226 duration_in_ms=46.2
2018-03-30 07:08:15,971 [salt.state       ][INFO    ][6189] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/neutron_policy.json] at time 07:08:15.971315
2018-03-30 07:08:15,971 [salt.state       ][INFO    ][6189] Executing state file.managed for /usr/share/openstack-dashboard/openstack_dashboard/conf/neutron_policy.json
2018-03-30 07:08:15,995 [salt.fileclient  ][INFO    ][6189] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/pike/neutron_policy.json'
2018-03-30 07:08:15,998 [salt.state       ][INFO    ][6189] File changed:
--- 
+++ 
@@ -8,6 +8,8 @@
     "admin_only": "rule:context_is_admin",
     "regular_user": "",
     "shared": "field:networks:shared=True",
+    "shared_firewalls": "field:firewalls:shared=True",
+    "shared_firewall_policies": "field:firewall_policies:shared=True",
     "shared_subnetpools": "field:subnetpools:shared=True",
     "shared_address_scopes": "field:address_scopes:shared=True",
     "external": "field:networks:router:external=True",
@@ -111,8 +113,27 @@
     "create_router:external_gateway_info:external_fixed_ips": "rule:admin_only",
     "update_router:external_gateway_info:external_fixed_ips": "rule:admin_only",
 
+    "create_firewall": "",
+    "get_firewall": "rule:admin_or_owner",
+    "create_firewall:shared": "rule:admin_only",
+    "get_firewall:shared": "rule:admin_only",
+    "update_firewall": "rule:admin_or_owner",
+    "update_firewall:shared": "rule:admin_only",
+    "delete_firewall": "rule:admin_or_owner",
+
+    "create_firewall_policy": "",
+    "get_firewall_policy": "rule:admin_or_owner or rule:shared_firewall_policies",
+    "create_firewall_policy:shared": "rule:admin_or_owner",
+    "update_firewall_policy": "rule:admin_or_owner",
+    "delete_firewall_policy": "rule:admin_or_owner",
+
     "insert_rule": "rule:admin_or_owner",
     "remove_rule": "rule:admin_or_owner",
+
+    "create_firewall_rule": "",
+    "get_firewall_rule": "rule:admin_or_owner or rule:shared_firewalls",
+    "update_firewall_rule": "rule:admin_or_owner",
+    "delete_firewall_rule": "rule:admin_or_owner",
 
     "create_qos_queue": "rule:admin_only",
     "get_qos_queue": "rule:admin_only",

2018-03-30 07:08:15,998 [salt.state       ][INFO    ][6189] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/neutron_policy.json] at time 07:08:15.998581 duration_in_ms=27.266
2018-03-30 07:08:15,999 [salt.state       ][INFO    ][6189] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/glance_policy.json] at time 07:08:15.999051
2018-03-30 07:08:15,999 [salt.state       ][INFO    ][6189] Executing state file.managed for /usr/share/openstack-dashboard/openstack_dashboard/conf/glance_policy.json
2018-03-30 07:08:16,018 [salt.fileclient  ][INFO    ][6189] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/pike/glance_policy.json'
2018-03-30 07:08:16,019 [salt.state       ][INFO    ][6189] File /usr/share/openstack-dashboard/openstack_dashboard/conf/glance_policy.json is in the correct state
2018-03-30 07:08:16,019 [salt.state       ][INFO    ][6189] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/glance_policy.json] at time 07:08:16.019307 duration_in_ms=20.256
2018-03-30 07:08:16,019 [salt.state       ][INFO    ][6189] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/ceilometer_policy.json] at time 07:08:16.019780
2018-03-30 07:08:16,020 [salt.state       ][INFO    ][6189] Executing state file.managed for /usr/share/openstack-dashboard/openstack_dashboard/conf/ceilometer_policy.json
2018-03-30 07:08:16,038 [salt.fileclient  ][INFO    ][6189] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/pike/ceilometer_policy.json'
2018-03-30 07:08:16,040 [salt.state       ][INFO    ][6189] File changed:
New file
2018-03-30 07:08:16,040 [salt.state       ][INFO    ][6189] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/ceilometer_policy.json] at time 07:08:16.040570 duration_in_ms=20.79
2018-03-30 07:08:16,041 [salt.state       ][INFO    ][6189] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/cinder_policy.json] at time 07:08:16.040981
2018-03-30 07:08:16,041 [salt.state       ][INFO    ][6189] Executing state file.managed for /usr/share/openstack-dashboard/openstack_dashboard/conf/cinder_policy.json
2018-03-30 07:08:16,063 [salt.fileclient  ][INFO    ][6189] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/pike/cinder_policy.json'
2018-03-30 07:08:16,065 [salt.state       ][INFO    ][6189] File changed:
--- 
+++ 
@@ -95,16 +95,16 @@
     "snapshot_extension:snapshot_manage": "rule:admin_api",
     "snapshot_extension:snapshot_unmanage": "rule:admin_api",
 
-    "consistencygroup:create" : "",
-    "consistencygroup:delete": "",
-    "consistencygroup:update": "",
-    "consistencygroup:get": "",
-    "consistencygroup:get_all": "",
+    "consistencygroup:create" : "group:nobody",
+    "consistencygroup:delete": "group:nobody",
+    "consistencygroup:update": "group:nobody",
+    "consistencygroup:get": "group:nobody",
+    "consistencygroup:get_all": "group:nobody",
 
-    "consistencygroup:create_cgsnapshot" : "",
-    "consistencygroup:delete_cgsnapshot": "",
-    "consistencygroup:get_cgsnapshot": "",
-    "consistencygroup:get_all_cgsnapshots": "",
+    "consistencygroup:create_cgsnapshot" : "group:nobody",
+    "consistencygroup:delete_cgsnapshot": "group:nobody",
+    "consistencygroup:get_cgsnapshot": "group:nobody",
+    "consistencygroup:get_all_cgsnapshots": "group:nobody",
 
     "scheduler_extension:scheduler_stats:get_pools" : "rule:admin_api",
     "message:delete": "rule:admin_or_owner",

2018-03-30 07:08:16,065 [salt.state       ][INFO    ][6189] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/cinder_policy.json] at time 07:08:16.065412 duration_in_ms=24.431
2018-03-30 07:08:16,065 [salt.state       ][INFO    ][6189] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/heat_policy.json] at time 07:08:16.065841
2018-03-30 07:08:16,066 [salt.state       ][INFO    ][6189] Executing state file.managed for /usr/share/openstack-dashboard/openstack_dashboard/conf/heat_policy.json
2018-03-30 07:08:16,088 [salt.fileclient  ][INFO    ][6189] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/pike/heat_policy.json'
2018-03-30 07:08:16,089 [salt.state       ][INFO    ][6189] File /usr/share/openstack-dashboard/openstack_dashboard/conf/heat_policy.json is in the correct state
2018-03-30 07:08:16,089 [salt.state       ][INFO    ][6189] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/heat_policy.json] at time 07:08:16.089375 duration_in_ms=23.534
2018-03-30 07:08:16,090 [salt.state       ][INFO    ][6189] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/keystone_policy.json] at time 07:08:16.089939
2018-03-30 07:08:16,090 [salt.state       ][INFO    ][6189] Executing state file.managed for /usr/share/openstack-dashboard/openstack_dashboard/conf/keystone_policy.json
2018-03-30 07:08:16,108 [salt.fileclient  ][INFO    ][6189] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/pike/keystone_policy.json'
2018-03-30 07:08:16,110 [salt.state       ][INFO    ][6189] File changed:
--- 
+++ 
@@ -28,7 +28,7 @@
     "identity:update_endpoint": "rule:admin_required",
     "identity:delete_endpoint": "rule:admin_required",
 
-    "identity:get_domain": "rule:admin_required or token.project.domain.id:%(target.domain.id)s",
+    "identity:get_domain": "rule:admin_required",
     "identity:list_domains": "rule:admin_required",
     "identity:create_domain": "rule:admin_required",
     "identity:update_domain": "rule:admin_required",
@@ -41,7 +41,7 @@
     "identity:update_project": "rule:admin_required",
     "identity:delete_project": "rule:admin_required",
 
-    "identity:get_user": "rule:admin_or_owner",
+    "identity:get_user": "rule:admin_required",
     "identity:list_users": "rule:admin_required",
     "identity:create_user": "rule:admin_required",
     "identity:update_user": "rule:admin_required",
@@ -173,10 +173,10 @@
     "identity:get_auth_projects": "",
     "identity:get_auth_domains": "",
 
-    "identity:list_projects_for_user": "",
-    "identity:list_domains_for_user": "",
+    "identity:list_projects_for_groups": "",
+    "identity:list_domains_for_groups": "",
 
-    "identity:list_revoke_events": "rule:service_or_admin",
+    "identity:list_revoke_events": "",
 
     "identity:create_policy_association_for_endpoint": "rule:admin_required",
     "identity:check_policy_association_for_endpoint": "rule:admin_required",
@@ -192,7 +192,6 @@
 
     "identity:create_domain_config": "rule:admin_required",
     "identity:get_domain_config": "rule:admin_required",
-    "identity:get_security_compliance_domain_config": "",
     "identity:update_domain_config": "rule:admin_required",
     "identity:delete_domain_config": "rule:admin_required",
     "identity:get_domain_config_default": "rule:admin_required"

2018-03-30 07:08:16,110 [salt.state       ][INFO    ][6189] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/keystone_policy.json] at time 07:08:16.110654 duration_in_ms=20.715
2018-03-30 07:08:16,111 [salt.state       ][INFO    ][6189] Running state [/etc/apache2/ports.conf] at time 07:08:16.110975
2018-03-30 07:08:16,111 [salt.state       ][INFO    ][6189] Executing state file.managed for /etc/apache2/ports.conf
2018-03-30 07:08:16,128 [salt.fileclient  ][INFO    ][6189] Fetching file from saltenv 'base', ** done ** 'horizon/files/ports.conf'
2018-03-30 07:08:16,160 [salt.state       ][INFO    ][6189] File changed:
--- 
+++ 
@@ -1,15 +1,16 @@
+
 # If you just change the port or add more ports here, you will likely also
 # have to change the VirtualHost statement in
 # /etc/apache2/sites-enabled/000-default.conf
 
-Listen 80
+Listen 0.0.0.0:8078
 
 <IfModule ssl_module>
-	Listen 443
+        Listen 0.0.0.0:443
 </IfModule>
 
 <IfModule mod_gnutls.c>
-	Listen 443
+        Listen 0.0.0.0:443
 </IfModule>
 
 # vim: syntax=apache ts=4 sw=4 sts=4 sr noet

2018-03-30 07:08:16,161 [salt.state       ][INFO    ][6189] Completed state [/etc/apache2/ports.conf] at time 07:08:16.161859 duration_in_ms=50.884
2018-03-30 07:08:16,162 [salt.state       ][INFO    ][6189] Running state [/etc/apache2/conf-available/openstack-dashboard.conf] at time 07:08:16.162277
2018-03-30 07:08:16,162 [salt.state       ][INFO    ][6189] Executing state file.managed for /etc/apache2/conf-available/openstack-dashboard.conf
2018-03-30 07:08:16,182 [salt.fileclient  ][INFO    ][6189] Fetching file from saltenv 'base', ** done ** 'horizon/files/openstack-dashboard.conf.Debian'
2018-03-30 07:08:16,221 [salt.state       ][INFO    ][6189] File changed:
--- 
+++ 
@@ -1,14 +1,36 @@
-WSGIScriptAlias /horizon /usr/share/openstack-dashboard/openstack_dashboard/wsgi/django.wsgi process-group=horizon
-WSGIDaemonProcess horizon user=horizon group=horizon processes=3 threads=10 display-name=%{GROUP}
-WSGIProcessGroup horizon
 
-Alias /static /var/lib/openstack-dashboard/static/
-Alias /horizon/static /var/lib/openstack-dashboard/static/
 
-<Directory /usr/share/openstack-dashboard/openstack_dashboard/wsgi>
-  Require all granted
-</Directory>
+<VirtualHost 0.0.0.0:8078>
+  ServerName openstack-dashboard
 
-<Directory /var/lib/openstack-dashboard/static>
-  Require all granted
-</Directory>
+  WSGIScriptAlias / /usr/share/openstack-dashboard/openstack_dashboard/wsgi/django.wsgi
+  WSGIDaemonProcess horizon user=horizon group=horizon processes=3 threads=10
+  WSGIProcessGroup horizon
+
+  Alias /static /usr/share/openstack-dashboard/static
+
+  <Directory /usr/share/openstack-dashboard/openstack_dashboard/wsgi>
+    Order allow,deny
+    Allow from all
+  </Directory>
+
+  <Directory /usr/share/openstack-dashboard/static>
+    <IfModule mod_expires.c>
+      ExpiresActive On
+      ExpiresDefault "access 6 month"
+    </IfModule>
+    <IfModule mod_deflate.c>
+      SetOutputFilter DEFLATE
+    </IfModule>
+
+    Require all granted
+  </Directory>
+  ServerSignature Off
+  LogFormat "%h %t %m \"%U%q\" %H %>s %O %D \"%{Referer}i\" \"%{User-Agent}i\"" horizon
+  ErrorLog "/var/log/apache2/openstack_dashboard_error.log"
+  CustomLog "/var/log/apache2/openstack_dashboard_access.log" horizon
+  SetEnvIf X-Forwarded-Proto https HTTPS=1
+
+</VirtualHost>
+
+

2018-03-30 07:08:16,222 [salt.state       ][INFO    ][6189] Completed state [/etc/apache2/conf-available/openstack-dashboard.conf] at time 07:08:16.222071 duration_in_ms=59.794
2018-03-30 07:08:16,227 [salt.state       ][INFO    ][6189] Running state [wsgi] at time 07:08:16.227156
2018-03-30 07:08:16,227 [salt.state       ][INFO    ][6189] Executing state apache_module.enabled for wsgi
2018-03-30 07:08:16,229 [salt.loaded.int.module.cmdmod][INFO    ][6189] Executing command ['a2enmod', 'wsgi'] in directory '/root'
2018-03-30 07:08:16,300 [salt.state       ][INFO    ][6189] {'new': 'wsgi', 'old': None}
2018-03-30 07:08:16,300 [salt.state       ][INFO    ][6189] Completed state [wsgi] at time 07:08:16.300778 duration_in_ms=73.622
2018-03-30 07:08:16,308 [salt.state       ][INFO    ][6189] Running state [openstack-dashboard] at time 07:08:16.307970
2018-03-30 07:08:16,308 [salt.state       ][INFO    ][6189] Executing state apache_conf.enabled for openstack-dashboard
2018-03-30 07:08:16,308 [salt.state       ][INFO    ][6189] openstack-dashboard already enabled.
2018-03-30 07:08:16,309 [salt.state       ][INFO    ][6189] Completed state [openstack-dashboard] at time 07:08:16.308972 duration_in_ms=1.003
2018-03-30 07:08:16,493 [salt.state       ][INFO    ][6189] Running state [/var/log/horizon] at time 07:08:16.493107
2018-03-30 07:08:16,493 [salt.state       ][INFO    ][6189] Executing state file.directory for /var/log/horizon
2018-03-30 07:08:16,494 [salt.state       ][INFO    ][6189] {'/var/log/horizon': 'New Dir'}
2018-03-30 07:08:16,495 [salt.state       ][INFO    ][6189] Completed state [/var/log/horizon] at time 07:08:16.495167 duration_in_ms=2.06
2018-03-30 07:08:16,495 [salt.state       ][INFO    ][6189] Running state [/var/log/horizon/horizon.log] at time 07:08:16.495519
2018-03-30 07:08:16,495 [salt.state       ][INFO    ][6189] Executing state file.managed for /var/log/horizon/horizon.log
2018-03-30 07:08:16,496 [salt.loaded.int.states.file][WARNING ][6189] State for file: /var/log/horizon/horizon.log - Neither 'source' nor 'contents' nor 'contents_pillar' nor 'contents_grains' was defined, yet 'replace' was set to 'True'. As there is no source to replace the file with, 'replace' has been set to 'False' to avoid reading the file unnecessarily.
2018-03-30 07:08:16,497 [salt.state       ][INFO    ][6189] {'new': 'file /var/log/horizon/horizon.log created', 'group': 'adm', 'mode': '0640', 'user': 'horizon'}
2018-03-30 07:08:16,497 [salt.state       ][INFO    ][6189] Completed state [/var/log/horizon/horizon.log] at time 07:08:16.497302 duration_in_ms=1.783
2018-03-30 07:08:16,497 [salt.state       ][INFO    ][6189] Running state [apache2] at time 07:08:16.497943
2018-03-30 07:08:16,498 [salt.state       ][INFO    ][6189] Executing state service.running for apache2
2018-03-30 07:08:16,498 [salt.loaded.int.module.cmdmod][INFO    ][6189] Executing command ['systemctl', 'status', 'apache2.service', '-n', '0'] in directory '/root'
2018-03-30 07:08:16,525 [salt.loaded.int.module.cmdmod][INFO    ][6189] Executing command ['systemctl', 'is-active', 'apache2.service'] in directory '/root'
2018-03-30 07:08:16,548 [salt.loaded.int.module.cmdmod][INFO    ][6189] Executing command ['systemctl', 'is-enabled', 'apache2.service'] in directory '/root'
2018-03-30 07:08:16,574 [salt.state       ][INFO    ][6189] The service apache2 is already running
2018-03-30 07:08:16,574 [salt.state       ][INFO    ][6189] Completed state [apache2] at time 07:08:16.574642 duration_in_ms=76.699
2018-03-30 07:08:16,575 [salt.state       ][INFO    ][6189] Running state [apache2] at time 07:08:16.575007
2018-03-30 07:08:16,575 [salt.state       ][INFO    ][6189] Executing state service.mod_watch for apache2
2018-03-30 07:08:16,576 [salt.loaded.int.module.cmdmod][INFO    ][6189] Executing command ['systemctl', 'is-active', 'apache2.service'] in directory '/root'
2018-03-30 07:08:16,599 [salt.loaded.int.module.cmdmod][INFO    ][6189] Executing command ['systemctl', 'is-enabled', 'apache2.service'] in directory '/root'
2018-03-30 07:08:16,624 [salt.loaded.int.module.cmdmod][INFO    ][6189] Executing command ['systemd-run', '--scope', 'systemctl', 'restart', 'apache2.service'] in directory '/root'
2018-03-30 07:08:19,015 [salt.state       ][INFO    ][6189] {'apache2': True}
2018-03-30 07:08:19,016 [salt.state       ][INFO    ][6189] Completed state [apache2] at time 07:08:19.016588 duration_in_ms=2441.579
2018-03-30 07:08:19,021 [salt.minion      ][INFO    ][6189] Returning information for job: 20180330070500553159
2018-03-30 07:08:19,982 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command state.sls with jid 20180330070819975056
2018-03-30 07:08:20,009 [salt.minion      ][INFO    ][11233] Starting a new job with PID 11233
2018-03-30 07:08:21,671 [salt.state       ][INFO    ][11233] Loading fresh modules for state activity
2018-03-30 07:08:21,729 [salt.fileclient  ][INFO    ][11233] Fetching file from saltenv 'base', ** done ** 'nginx/init.sls'
2018-03-30 07:08:21,761 [salt.fileclient  ][INFO    ][11233] Fetching file from saltenv 'base', ** done ** 'nginx/server.sls'
2018-03-30 07:08:21,833 [salt.fileclient  ][INFO    ][11233] Fetching file from saltenv 'base', ** done ** 'nginx/server/users.sls'
2018-03-30 07:08:21,878 [salt.fileclient  ][INFO    ][11233] Fetching file from saltenv 'base', ** done ** 'nginx/server/sites.sls'
2018-03-30 07:08:21,934 [salt.loaded.int.module.cmdmod][INFO    ][11233] Executing command 'cat /etc/ssl/certs/172.30.10.101-with-chain.crt' in directory '/root'
2018-03-30 07:08:21,954 [salt.loaded.int.module.cmdmod][ERROR   ][11233] Command 'cat /etc/ssl/certs/172.30.10.101-with-chain.crt' failed with return code: 1
2018-03-30 07:08:21,955 [salt.loaded.int.module.cmdmod][ERROR   ][11233] output: cat: /etc/ssl/certs/172.30.10.101-with-chain.crt: No such file or directory
2018-03-30 07:08:21,955 [salt.loaded.int.module.cmdmod][INFO    ][11233] Executing command 'cat /etc/ssl/certs/172.30.10.101.crt /etc/ssl/certs/ca-salt_master_ca.crt' in directory '/root'
2018-03-30 07:08:22,054 [salt.state       ][INFO    ][11233] Running state [cat /etc/ssl/certs/172.30.10.101.crt /etc/ssl/certs/ca-salt_master_ca.crt > /etc/ssl/certs/172.30.10.101-with-chain.crt] at time 07:08:22.054923
2018-03-30 07:08:22,055 [salt.state       ][INFO    ][11233] Executing state cmd.run for cat /etc/ssl/certs/172.30.10.101.crt /etc/ssl/certs/ca-salt_master_ca.crt > /etc/ssl/certs/172.30.10.101-with-chain.crt
2018-03-30 07:08:22,056 [salt.loaded.int.module.cmdmod][INFO    ][11233] Executing command '/bin/true' in directory '/root'
2018-03-30 07:08:22,076 [salt.loaded.int.module.cmdmod][INFO    ][11233] Executing command 'cat /etc/ssl/certs/172.30.10.101.crt /etc/ssl/certs/ca-salt_master_ca.crt > /etc/ssl/certs/172.30.10.101-with-chain.crt' in directory '/root'
2018-03-30 07:08:22,096 [salt.state       ][INFO    ][11233] {'pid': 11250, 'retcode': 0, 'stderr': '', 'stdout': ''}
2018-03-30 07:08:22,097 [salt.state       ][INFO    ][11233] Completed state [cat /etc/ssl/certs/172.30.10.101.crt /etc/ssl/certs/ca-salt_master_ca.crt > /etc/ssl/certs/172.30.10.101-with-chain.crt] at time 07:08:22.097134 duration_in_ms=42.211
2018-03-30 07:08:23,192 [salt.state       ][INFO    ][11233] Running state [nginx] at time 07:08:23.192650
2018-03-30 07:08:23,193 [salt.state       ][INFO    ][11233] Executing state pkg.installed for nginx
2018-03-30 07:08:23,193 [salt.loaded.int.module.cmdmod][INFO    ][11233] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-03-30 07:08:23,547 [salt.loaded.int.module.cmdmod][INFO    ][11233] Executing command ['apt-cache', '-q', 'policy', 'nginx'] in directory '/root'
2018-03-30 07:08:23,633 [salt.loaded.int.module.cmdmod][INFO    ][11233] Executing command ['apt-get', '-q', 'update'] in directory '/root'
2018-03-30 07:08:25,527 [salt.loaded.int.module.cmdmod][INFO    ][11233] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-03-30 07:08:25,563 [salt.loaded.int.module.cmdmod][INFO    ][11233] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'nginx'] in directory '/root'
2018-03-30 07:08:30,082 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330070830074164
2018-03-30 07:08:30,110 [salt.minion      ][INFO    ][11641] Starting a new job with PID 11641
2018-03-30 07:08:30,134 [salt.minion      ][INFO    ][11641] Returning information for job: 20180330070830074164
2018-03-30 07:08:37,796 [salt.loaded.int.module.cmdmod][INFO    ][11233] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-03-30 07:08:37,829 [salt.state       ][INFO    ][11233] Made the following changes:
'libgd3' changed from 'absent' to '2.1.1-4ubuntu0.16.04.8'
'nginx-core' changed from 'absent' to '1.10.3-0ubuntu0.16.04.2'
'libxpm4' changed from 'absent' to '1:3.5.11-1ubuntu0.16.04.1'
'nginx' changed from 'absent' to '1.10.3-0ubuntu0.16.04.2'
'nginx-common' changed from 'absent' to '1.10.3-0ubuntu0.16.04.2'
'libfontconfig' changed from 'absent' to '1'
'fonts-dejavu-core' changed from 'absent' to '2.35-1'
'fontconfig-config' changed from 'absent' to '2.11.94-0ubuntu1.1'
'libvpx3' changed from 'absent' to '1.5.0-2ubuntu1'
'libfontconfig1' changed from 'absent' to '2.11.94-0ubuntu1.1'

2018-03-30 07:08:37,848 [salt.state       ][INFO    ][11233] Loading fresh modules for state activity
2018-03-30 07:08:37,961 [salt.state       ][INFO    ][11233] Completed state [nginx] at time 07:08:37.961229 duration_in_ms=14768.579
2018-03-30 07:08:37,964 [salt.state       ][INFO    ][11233] Running state [apache2-utils] at time 07:08:37.964204
2018-03-30 07:08:37,964 [salt.state       ][INFO    ][11233] Executing state pkg.installed for apache2-utils
2018-03-30 07:08:38,322 [salt.state       ][INFO    ][11233] All specified packages are already installed
2018-03-30 07:08:38,323 [salt.state       ][INFO    ][11233] Completed state [apache2-utils] at time 07:08:38.322955 duration_in_ms=358.749
2018-03-30 07:08:38,324 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_keystone_private.conf] at time 07:08:38.324463
2018-03-30 07:08:38,324 [salt.state       ][INFO    ][11233] Executing state file.managed for /etc/nginx/sites-available/nginx_proxy_openstack_api_keystone_private.conf
2018-03-30 07:08:38,355 [salt.fileclient  ][INFO    ][11233] Fetching file from saltenv 'base', ** done ** 'nginx/files/proxy.conf'
2018-03-30 07:08:38,423 [salt.fileclient  ][INFO    ][11233] Fetching file from saltenv 'base', ** done ** 'nginx/files/_name.conf'
2018-03-30 07:08:38,446 [salt.fileclient  ][INFO    ][11233] Fetching file from saltenv 'base', ** done ** 'nginx/files/_ssl.conf'
2018-03-30 07:08:38,479 [salt.fileclient  ][INFO    ][11233] Fetching file from saltenv 'base', ** done ** 'nginx/files/_ssl_secure.conf'
2018-03-30 07:08:38,503 [salt.fileclient  ][INFO    ][11233] Fetching file from saltenv 'base', ** done ** 'nginx/files/_auth.conf'
2018-03-30 07:08:38,522 [salt.fileclient  ][INFO    ][11233] Fetching file from saltenv 'base', ** done ** 'nginx/files/_access_policy.conf'
2018-03-30 07:08:38,530 [salt.state       ][INFO    ][11233] File changed:
New file
2018-03-30 07:08:38,530 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_keystone_private.conf] at time 07:08:38.530314 duration_in_ms=205.85
2018-03-30 07:08:38,530 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone_private.conf] at time 07:08:38.530665
2018-03-30 07:08:38,531 [salt.state       ][INFO    ][11233] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone_private.conf
2018-03-30 07:08:38,532 [salt.state       ][INFO    ][11233] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone_private.conf'}
2018-03-30 07:08:38,533 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone_private.conf] at time 07:08:38.533153 duration_in_ms=2.488
2018-03-30 07:08:38,533 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_nova.conf] at time 07:08:38.533826
2018-03-30 07:08:38,534 [salt.state       ][INFO    ][11233] Executing state file.managed for /etc/nginx/sites-available/nginx_proxy_openstack_api_nova.conf
2018-03-30 07:08:38,665 [salt.state       ][INFO    ][11233] File changed:
New file
2018-03-30 07:08:38,665 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_nova.conf] at time 07:08:38.665628 duration_in_ms=131.802
2018-03-30 07:08:38,666 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_nova.conf] at time 07:08:38.665967
2018-03-30 07:08:38,666 [salt.state       ][INFO    ][11233] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_proxy_openstack_api_nova.conf
2018-03-30 07:08:38,667 [salt.state       ][INFO    ][11233] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_nova.conf'}
2018-03-30 07:08:38,668 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_nova.conf] at time 07:08:38.668129 duration_in_ms=2.162
2018-03-30 07:08:38,668 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_aodh.conf] at time 07:08:38.668655
2018-03-30 07:08:38,668 [salt.state       ][INFO    ][11233] Executing state file.managed for /etc/nginx/sites-available/nginx_proxy_openstack_api_aodh.conf
2018-03-30 07:08:38,790 [salt.state       ][INFO    ][11233] File changed:
New file
2018-03-30 07:08:38,790 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_aodh.conf] at time 07:08:38.790931 duration_in_ms=122.276
2018-03-30 07:08:38,791 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_aodh.conf] at time 07:08:38.791185
2018-03-30 07:08:38,791 [salt.state       ][INFO    ][11233] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_proxy_openstack_api_aodh.conf
2018-03-30 07:08:38,792 [salt.state       ][INFO    ][11233] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_aodh.conf'}
2018-03-30 07:08:38,793 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_aodh.conf] at time 07:08:38.793024 duration_in_ms=1.839
2018-03-30 07:08:38,793 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_neutron.conf] at time 07:08:38.793482
2018-03-30 07:08:38,793 [salt.state       ][INFO    ][11233] Executing state file.managed for /etc/nginx/sites-available/nginx_proxy_openstack_api_neutron.conf
2018-03-30 07:08:38,973 [salt.state       ][INFO    ][11233] File changed:
New file
2018-03-30 07:08:38,973 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_neutron.conf] at time 07:08:38.973335 duration_in_ms=179.853
2018-03-30 07:08:38,973 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_neutron.conf] at time 07:08:38.973554
2018-03-30 07:08:38,973 [salt.state       ][INFO    ][11233] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_proxy_openstack_api_neutron.conf
2018-03-30 07:08:38,974 [salt.state       ][INFO    ][11233] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_neutron.conf'}
2018-03-30 07:08:38,975 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_neutron.conf] at time 07:08:38.975080 duration_in_ms=1.525
2018-03-30 07:08:38,975 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_web.conf] at time 07:08:38.975450
2018-03-30 07:08:38,975 [salt.state       ][INFO    ][11233] Executing state file.managed for /etc/nginx/sites-available/nginx_proxy_openstack_web.conf
2018-03-30 07:08:39,071 [salt.state       ][INFO    ][11233] File changed:
New file
2018-03-30 07:08:39,071 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_web.conf] at time 07:08:39.071363 duration_in_ms=95.913
2018-03-30 07:08:39,071 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_web.conf] at time 07:08:39.071555
2018-03-30 07:08:39,071 [salt.state       ][INFO    ][11233] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_proxy_openstack_web.conf
2018-03-30 07:08:39,072 [salt.state       ][INFO    ][11233] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_web.conf'}
2018-03-30 07:08:39,072 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_web.conf] at time 07:08:39.072918 duration_in_ms=1.362
2018-03-30 07:08:39,073 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_glance.conf] at time 07:08:39.073254
2018-03-30 07:08:39,073 [salt.state       ][INFO    ][11233] Executing state file.managed for /etc/nginx/sites-available/nginx_proxy_openstack_api_glance.conf
2018-03-30 07:08:39,172 [salt.state       ][INFO    ][11233] File changed:
New file
2018-03-30 07:08:39,173 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_glance.conf] at time 07:08:39.173090 duration_in_ms=99.836
2018-03-30 07:08:39,173 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_glance.conf] at time 07:08:39.173351
2018-03-30 07:08:39,173 [salt.state       ][INFO    ][11233] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_proxy_openstack_api_glance.conf
2018-03-30 07:08:39,174 [salt.state       ][INFO    ][11233] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_glance.conf'}
2018-03-30 07:08:39,175 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_glance.conf] at time 07:08:39.175099 duration_in_ms=1.748
2018-03-30 07:08:39,175 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-available/nginx_proxy_novnc.conf] at time 07:08:39.175535
2018-03-30 07:08:39,175 [salt.state       ][INFO    ][11233] Executing state file.managed for /etc/nginx/sites-available/nginx_proxy_novnc.conf
2018-03-30 07:08:39,274 [salt.state       ][INFO    ][11233] File changed:
New file
2018-03-30 07:08:39,274 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-available/nginx_proxy_novnc.conf] at time 07:08:39.274701 duration_in_ms=99.166
2018-03-30 07:08:39,275 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-enabled/nginx_proxy_novnc.conf] at time 07:08:39.274969
2018-03-30 07:08:39,275 [salt.state       ][INFO    ][11233] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_proxy_novnc.conf
2018-03-30 07:08:39,276 [salt.state       ][INFO    ][11233] {'new': '/etc/nginx/sites-enabled/nginx_proxy_novnc.conf'}
2018-03-30 07:08:39,276 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-enabled/nginx_proxy_novnc.conf] at time 07:08:39.276372 duration_in_ms=1.402
2018-03-30 07:08:39,276 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_keystone.conf] at time 07:08:39.276713
2018-03-30 07:08:39,276 [salt.state       ][INFO    ][11233] Executing state file.managed for /etc/nginx/sites-available/nginx_proxy_openstack_api_keystone.conf
2018-03-30 07:08:39,372 [salt.state       ][INFO    ][11233] File changed:
New file
2018-03-30 07:08:39,372 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_keystone.conf] at time 07:08:39.372663 duration_in_ms=95.949
2018-03-30 07:08:39,372 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone.conf] at time 07:08:39.372853
2018-03-30 07:08:39,373 [salt.state       ][INFO    ][11233] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone.conf
2018-03-30 07:08:39,374 [salt.state       ][INFO    ][11233] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone.conf'}
2018-03-30 07:08:39,374 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone.conf] at time 07:08:39.374285 duration_in_ms=1.431
2018-03-30 07:08:39,374 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_cinder.conf] at time 07:08:39.374616
2018-03-30 07:08:39,374 [salt.state       ][INFO    ][11233] Executing state file.managed for /etc/nginx/sites-available/nginx_proxy_openstack_api_cinder.conf
2018-03-30 07:08:39,521 [salt.state       ][INFO    ][11233] File changed:
New file
2018-03-30 07:08:39,521 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_cinder.conf] at time 07:08:39.521938 duration_in_ms=147.321
2018-03-30 07:08:39,522 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_cinder.conf] at time 07:08:39.522603
2018-03-30 07:08:39,523 [salt.state       ][INFO    ][11233] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_proxy_openstack_api_cinder.conf
2018-03-30 07:08:39,525 [salt.state       ][INFO    ][11233] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_cinder.conf'}
2018-03-30 07:08:39,525 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_cinder.conf] at time 07:08:39.525874 duration_in_ms=3.271
2018-03-30 07:08:39,526 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat_cfn.conf] at time 07:08:39.526796
2018-03-30 07:08:39,527 [salt.state       ][INFO    ][11233] Executing state file.managed for /etc/nginx/sites-available/nginx_proxy_openstack_api_heat_cfn.conf
2018-03-30 07:08:39,675 [salt.state       ][INFO    ][11233] File changed:
New file
2018-03-30 07:08:39,675 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat_cfn.conf] at time 07:08:39.675792 duration_in_ms=148.996
2018-03-30 07:08:39,676 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cfn.conf] at time 07:08:39.676379
2018-03-30 07:08:39,676 [salt.state       ][INFO    ][11233] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cfn.conf
2018-03-30 07:08:39,678 [salt.state       ][INFO    ][11233] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cfn.conf'}
2018-03-30 07:08:39,679 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cfn.conf] at time 07:08:39.679372 duration_in_ms=2.993
2018-03-30 07:08:39,680 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_nova_ec2.conf] at time 07:08:39.680171
2018-03-30 07:08:39,680 [salt.state       ][INFO    ][11233] Executing state file.managed for /etc/nginx/sites-available/nginx_proxy_openstack_api_nova_ec2.conf
2018-03-30 07:08:39,790 [salt.state       ][INFO    ][11233] File changed:
New file
2018-03-30 07:08:39,791 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_nova_ec2.conf] at time 07:08:39.790961 duration_in_ms=110.79
2018-03-30 07:08:39,791 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_nova_ec2.conf] at time 07:08:39.791472
2018-03-30 07:08:39,791 [salt.state       ][INFO    ][11233] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_proxy_openstack_api_nova_ec2.conf
2018-03-30 07:08:39,793 [salt.state       ][INFO    ][11233] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_nova_ec2.conf'}
2018-03-30 07:08:39,794 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_nova_ec2.conf] at time 07:08:39.793977 duration_in_ms=2.505
2018-03-30 07:08:39,794 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-available/nginx_redirect_openstack_web_redirect.conf] at time 07:08:39.794647
2018-03-30 07:08:39,795 [salt.state       ][INFO    ][11233] Executing state file.managed for /etc/nginx/sites-available/nginx_redirect_openstack_web_redirect.conf
2018-03-30 07:08:39,816 [salt.fileclient  ][INFO    ][11233] Fetching file from saltenv 'base', ** done ** 'nginx/files/redirect.conf'
2018-03-30 07:08:39,823 [salt.state       ][INFO    ][11233] File changed:
New file
2018-03-30 07:08:39,823 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-available/nginx_redirect_openstack_web_redirect.conf] at time 07:08:39.823950 duration_in_ms=29.302
2018-03-30 07:08:39,824 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-enabled/nginx_redirect_openstack_web_redirect.conf] at time 07:08:39.824420
2018-03-30 07:08:39,824 [salt.state       ][INFO    ][11233] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_redirect_openstack_web_redirect.conf
2018-03-30 07:08:39,826 [salt.state       ][INFO    ][11233] {'new': '/etc/nginx/sites-enabled/nginx_redirect_openstack_web_redirect.conf'}
2018-03-30 07:08:39,826 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-enabled/nginx_redirect_openstack_web_redirect.conf] at time 07:08:39.826929 duration_in_ms=2.508
2018-03-30 07:08:39,827 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-available/nginx_static_reclass_doc.conf] at time 07:08:39.827581
2018-03-30 07:08:39,828 [salt.state       ][INFO    ][11233] Executing state file.managed for /etc/nginx/sites-available/nginx_static_reclass_doc.conf
2018-03-30 07:08:39,847 [salt.fileclient  ][INFO    ][11233] Fetching file from saltenv 'base', ** done ** 'nginx/files/static.conf'
2018-03-30 07:08:39,889 [salt.fileclient  ][INFO    ][11233] Fetching file from saltenv 'base', ** done ** 'nginx/files/_log.conf'
2018-03-30 07:08:39,962 [salt.state       ][INFO    ][11233] File changed:
New file
2018-03-30 07:08:39,963 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-available/nginx_static_reclass_doc.conf] at time 07:08:39.963193 duration_in_ms=135.611
2018-03-30 07:08:39,963 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-enabled/nginx_static_reclass_doc.conf] at time 07:08:39.963723
2018-03-30 07:08:39,964 [salt.state       ][INFO    ][11233] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_static_reclass_doc.conf
2018-03-30 07:08:39,966 [salt.state       ][INFO    ][11233] {'new': '/etc/nginx/sites-enabled/nginx_static_reclass_doc.conf'}
2018-03-30 07:08:39,967 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-enabled/nginx_static_reclass_doc.conf] at time 07:08:39.967157 duration_in_ms=3.433
2018-03-30 07:08:39,968 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat.conf] at time 07:08:39.968018
2018-03-30 07:08:39,968 [salt.state       ][INFO    ][11233] Executing state file.managed for /etc/nginx/sites-available/nginx_proxy_openstack_api_heat.conf
2018-03-30 07:08:40,108 [salt.state       ][INFO    ][11233] File changed:
New file
2018-03-30 07:08:40,109 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat.conf] at time 07:08:40.109117 duration_in_ms=141.098
2018-03-30 07:08:40,109 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat.conf] at time 07:08:40.109534
2018-03-30 07:08:40,109 [salt.state       ][INFO    ][11233] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat.conf
2018-03-30 07:08:40,111 [salt.state       ][INFO    ][11233] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat.conf'}
2018-03-30 07:08:40,112 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat.conf] at time 07:08:40.112294 duration_in_ms=2.759
2018-03-30 07:08:40,113 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-available/nginx_stats_stats.conf] at time 07:08:40.112957
2018-03-30 07:08:40,113 [salt.state       ][INFO    ][11233] Executing state file.managed for /etc/nginx/sites-available/nginx_stats_stats.conf
2018-03-30 07:08:40,134 [salt.fileclient  ][INFO    ][11233] Fetching file from saltenv 'base', ** done ** 'nginx/files/stats.conf'
2018-03-30 07:08:40,139 [salt.state       ][INFO    ][11233] File changed:
New file
2018-03-30 07:08:40,139 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-available/nginx_stats_stats.conf] at time 07:08:40.139527 duration_in_ms=26.569
2018-03-30 07:08:40,139 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-enabled/nginx_stats_stats.conf] at time 07:08:40.139924
2018-03-30 07:08:40,140 [salt.state       ][INFO    ][11233] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_stats_stats.conf
2018-03-30 07:08:40,142 [salt.state       ][INFO    ][11233] {'new': '/etc/nginx/sites-enabled/nginx_stats_stats.conf'}
2018-03-30 07:08:40,142 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-enabled/nginx_stats_stats.conf] at time 07:08:40.142572 duration_in_ms=2.648
2018-03-30 07:08:40,143 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_ceilometer.conf] at time 07:08:40.143197
2018-03-30 07:08:40,143 [salt.state       ][INFO    ][11233] Executing state file.managed for /etc/nginx/sites-available/nginx_proxy_openstack_api_ceilometer.conf
2018-03-30 07:08:40,264 [salt.state       ][INFO    ][11233] File changed:
New file
2018-03-30 07:08:40,264 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_ceilometer.conf] at time 07:08:40.264745 duration_in_ms=121.547
2018-03-30 07:08:40,265 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_ceilometer.conf] at time 07:08:40.265198
2018-03-30 07:08:40,265 [salt.state       ][INFO    ][11233] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_proxy_openstack_api_ceilometer.conf
2018-03-30 07:08:40,267 [salt.state       ][INFO    ][11233] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_ceilometer.conf'}
2018-03-30 07:08:40,267 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_ceilometer.conf] at time 07:08:40.267591 duration_in_ms=2.393
2018-03-30 07:08:40,268 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat_cloudwatch.conf] at time 07:08:40.268170
2018-03-30 07:08:40,268 [salt.state       ][INFO    ][11233] Executing state file.managed for /etc/nginx/sites-available/nginx_proxy_openstack_api_heat_cloudwatch.conf
2018-03-30 07:08:40,303 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330070840295897
2018-03-30 07:08:40,334 [salt.minion      ][INFO    ][12046] Starting a new job with PID 12046
2018-03-30 07:08:40,356 [salt.minion      ][INFO    ][12046] Returning information for job: 20180330070840295897
2018-03-30 07:08:40,380 [salt.state       ][INFO    ][11233] File changed:
New file
2018-03-30 07:08:40,381 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat_cloudwatch.conf] at time 07:08:40.381184 duration_in_ms=113.014
2018-03-30 07:08:40,381 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cloudwatch.conf] at time 07:08:40.381512
2018-03-30 07:08:40,381 [salt.state       ][INFO    ][11233] Executing state file.symlink for /etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cloudwatch.conf
2018-03-30 07:08:40,383 [salt.state       ][INFO    ][11233] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cloudwatch.conf'}
2018-03-30 07:08:40,383 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cloudwatch.conf] at time 07:08:40.383587 duration_in_ms=2.075
2018-03-30 07:08:40,384 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-enabled/default] at time 07:08:40.384094
2018-03-30 07:08:40,384 [salt.state       ][INFO    ][11233] Executing state file.absent for /etc/nginx/sites-enabled/default
2018-03-30 07:08:40,384 [salt.state       ][INFO    ][11233] {'removed': '/etc/nginx/sites-enabled/default'}
2018-03-30 07:08:40,385 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-enabled/default] at time 07:08:40.385065 duration_in_ms=0.971
2018-03-30 07:08:40,385 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/sites-available/default] at time 07:08:40.385533
2018-03-30 07:08:40,385 [salt.state       ][INFO    ][11233] Executing state file.absent for /etc/nginx/sites-available/default
2018-03-30 07:08:40,386 [salt.state       ][INFO    ][11233] {'removed': '/etc/nginx/sites-available/default'}
2018-03-30 07:08:40,386 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/sites-available/default] at time 07:08:40.386545 duration_in_ms=1.01
2018-03-30 07:08:40,387 [salt.state       ][INFO    ][11233] Running state [/etc/nginx/nginx.conf] at time 07:08:40.387013
2018-03-30 07:08:40,387 [salt.state       ][INFO    ][11233] Executing state file.managed for /etc/nginx/nginx.conf
2018-03-30 07:08:40,407 [salt.fileclient  ][INFO    ][11233] Fetching file from saltenv 'base', ** done ** 'nginx/files/nginx.conf'
2018-03-30 07:08:40,433 [salt.state       ][INFO    ][11233] File changed:
--- 
+++ 
@@ -1,85 +1,100 @@
 user www-data;
 worker_processes auto;
+worker_rlimit_nofile 20000;
 pid /run/nginx.pid;
 
+
 events {
-	worker_connections 768;
-	# multi_accept on;
+        worker_connections 1024;
+        # multi_accept on;
 }
 
 http {
 
-	##
-	# Basic Settings
-	##
+        ##
+        # Basic Settings
+        ##
 
-	sendfile on;
-	tcp_nopush on;
-	tcp_nodelay on;
-	keepalive_timeout 65;
-	types_hash_max_size 2048;
-	# server_tokens off;
+        sendfile on;
+        tcp_nopush on;
+        tcp_nodelay on;
+        keepalive_timeout 65;
+        types_hash_max_size 2048;
+        server_tokens off;
 
-	# server_names_hash_bucket_size 64;
-	# server_name_in_redirect off;
+        server_names_hash_bucket_size 128;
+        # server_name_in_redirect off;
 
-	include /etc/nginx/mime.types;
-	default_type application/octet-stream;
+        include /etc/nginx/mime.types;
+        default_type application/octet-stream;
 
-	##
-	# SSL Settings
-	##
+        ##
+        # Logging Settings
+        ##
 
-	ssl_protocols TLSv1 TLSv1.1 TLSv1.2; # Dropping SSLv3, ref: POODLE
-	ssl_prefer_server_ciphers on;
+        access_log /var/log/nginx/access.log;
+        error_log /var/log/nginx/error.log;
 
-	##
-	# Logging Settings
-	##
+        ##
+        # Gzip Settings
+        ##
 
-	access_log /var/log/nginx/access.log;
-	error_log /var/log/nginx/error.log;
+        gzip on;
+        gzip_disable "msie6";
 
-	##
-	# Gzip Settings
-	##
+        # gzip_vary on;
+        # gzip_proxied any;
+        # gzip_comp_level 6;
+        # gzip_buffers 16 8k;
+        # gzip_http_version 1.1;
+        # gzip_types text/plain text/css application/json application/x-javascript text/xml application/xml application/xml+rss text/javascript;
 
-	gzip on;
-	gzip_disable "msie6";
+        ##
+        # nginx-naxsi config
+        ##
+        # Uncomment it if you installed nginx-naxsi
+        ##
 
-	# gzip_vary on;
-	# gzip_proxied any;
-	# gzip_comp_level 6;
-	# gzip_buffers 16 8k;
-	# gzip_http_version 1.1;
-	# gzip_types text/plain text/css application/json application/javascript text/xml application/xml application/xml+rss text/javascript;
+        #include /etc/nginx/naxsi_core.rules;
 
-	##
-	# Virtual Host Configs
-	##
+        ##
+        # nginx-passenger config
+        ##
+        # Uncomment it if you installed nginx-passenger
+        ##
 
-	include /etc/nginx/conf.d/*.conf;
-	include /etc/nginx/sites-enabled/*;
+        #passenger_root /usr;
+        #passenger_ruby /usr/bin/ruby;
+
+
+
+        ##
+        # Virtual Host Configs
+        ##
+
+        include /etc/nginx/conf.d/*.conf;
+        include /etc/nginx/sites-enabled/*.conf;
 }
 
 
+
 #mail {
-#	# See sample authentication script at:
-#	# http://wiki.nginx.org/ImapAuthenticateWithApachePhpScript
-# 
-#	# auth_http localhost/auth.php;
-#	# pop3_capabilities "TOP" "USER";
-#	# imap_capabilities "IMAP4rev1" "UIDPLUS";
-# 
-#	server {
-#		listen     localhost:110;
-#		protocol   pop3;
-#		proxy      on;
-#	}
-# 
-#	server {
-#		listen     localhost:143;
-#		protocol   imap;
-#		proxy      on;
-#	}
+#       # See sample authentication script at:
+#       # http://wiki.nginx.org/ImapAuthenticateWithApachePhpScript
+#
+#       # auth_http localhost/auth.php;
+#       # pop3_capabilities "TOP" "USER";
+#       # imap_capabilities "IMAP4rev1" "UIDPLUS";
+#
+#       server {
+#               listen     localhost:110;
+#               protocol   pop3;
+#               proxy      on;
+#       }
+#
+#       server {
+#               listen     localhost:143;
+#               protocol   imap;
+#               proxy      on;
+#       }
 #}

2018-03-30 07:08:40,433 [salt.state       ][INFO    ][11233] Completed state [/etc/nginx/nginx.conf] at time 07:08:40.433518 duration_in_ms=46.505
2018-03-30 07:08:40,434 [salt.state       ][INFO    ][11233] Running state [/etc/ssl/private] at time 07:08:40.434146
2018-03-30 07:08:40,434 [salt.state       ][INFO    ][11233] Executing state file.directory for /etc/ssl/private
2018-03-30 07:08:40,435 [salt.state       ][INFO    ][11233] Directory /etc/ssl/private is in the correct state
2018-03-30 07:08:40,435 [salt.state       ][INFO    ][11233] Completed state [/etc/ssl/private] at time 07:08:40.435702 duration_in_ms=1.556
2018-03-30 07:08:40,450 [salt.state       ][INFO    ][11233] Running state [openssl dhparam -out /etc/ssl/dhparams.pem 2048] at time 07:08:40.450794
2018-03-30 07:08:40,451 [salt.state       ][INFO    ][11233] Executing state cmd.run for openssl dhparam -out /etc/ssl/dhparams.pem 2048
2018-03-30 07:08:40,451 [salt.loaded.int.module.cmdmod][INFO    ][11233] Executing command 'openssl dhparam -out /etc/ssl/dhparams.pem 2048' in directory '/root'
2018-03-30 07:08:50,324 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330070850316179
2018-03-30 07:08:50,363 [salt.minion      ][INFO    ][12059] Starting a new job with PID 12059
2018-03-30 07:08:50,387 [salt.minion      ][INFO    ][12059] Returning information for job: 20180330070850316179
2018-03-30 07:09:00,351 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330070900342409
2018-03-30 07:09:00,391 [salt.minion      ][INFO    ][12068] Starting a new job with PID 12068
2018-03-30 07:09:00,416 [salt.minion      ][INFO    ][12068] Returning information for job: 20180330070900342409
2018-03-30 07:09:10,376 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330070910368079
2018-03-30 07:09:10,415 [salt.minion      ][INFO    ][12077] Starting a new job with PID 12077
2018-03-30 07:09:10,440 [salt.minion      ][INFO    ][12077] Returning information for job: 20180330070910368079
2018-03-30 07:09:20,403 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330070920393517
2018-03-30 07:09:20,443 [salt.minion      ][INFO    ][12086] Starting a new job with PID 12086
2018-03-30 07:09:20,469 [salt.minion      ][INFO    ][12086] Returning information for job: 20180330070920393517
2018-03-30 07:09:30,449 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330070930439777
2018-03-30 07:09:30,493 [salt.minion      ][INFO    ][12095] Starting a new job with PID 12095
2018-03-30 07:09:30,518 [salt.minion      ][INFO    ][12095] Returning information for job: 20180330070930439777
2018-03-30 07:09:40,524 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330070940515726
2018-03-30 07:09:40,563 [salt.minion      ][INFO    ][12104] Starting a new job with PID 12104
2018-03-30 07:09:40,588 [salt.minion      ][INFO    ][12104] Returning information for job: 20180330070940515726
2018-03-30 07:09:47,372 [salt.state       ][INFO    ][11233] {'pid': 12051, 'retcode': 0, 'stderr': "Generating DH parameters, 2048 bit long safe prime, generator 2\nThis is going to take a long time\n[... openssl progress dots elided ...]++*++*\nunable to write 'random state'", 'stdout': ''}
2018-03-30 07:09:47,373 [salt.state       ][INFO    ][11233] Completed state [openssl dhparam -out /etc/ssl/dhparams.pem 2048] at time 07:09:47.372939 duration_in_ms=66922.145
2018-03-30 07:09:47,375 [salt.state       ][INFO    ][11233] Running state [nginx] at time 07:09:47.375566
2018-03-30 07:09:47,375 [salt.state       ][INFO    ][11233] Executing state service.running for nginx
2018-03-30 07:09:47,376 [salt.loaded.int.module.cmdmod][INFO    ][11233] Executing command ['systemctl', 'status', 'nginx.service', '-n', '0'] in directory '/root'
2018-03-30 07:09:47,399 [salt.loaded.int.module.cmdmod][INFO    ][11233] Executing command ['systemctl', 'is-active', 'nginx.service'] in directory '/root'
2018-03-30 07:09:47,418 [salt.loaded.int.module.cmdmod][INFO    ][11233] Executing command ['systemctl', 'is-enabled', 'nginx.service'] in directory '/root'
2018-03-30 07:09:47,436 [salt.state       ][INFO    ][11233] The service nginx is already running
2018-03-30 07:09:47,438 [salt.state       ][INFO    ][11233] Completed state [nginx] at time 07:09:47.438292 duration_in_ms=62.725
2018-03-30 07:09:47,439 [salt.state       ][INFO    ][11233] Running state [nginx] at time 07:09:47.439824
2018-03-30 07:09:47,441 [salt.state       ][INFO    ][11233] Executing state service.mod_watch for nginx
2018-03-30 07:09:47,442 [salt.loaded.int.module.cmdmod][INFO    ][11233] Executing command ['systemctl', 'is-active', 'nginx.service'] in directory '/root'
2018-03-30 07:09:47,458 [salt.loaded.int.module.cmdmod][INFO    ][11233] Executing command ['systemctl', 'is-enabled', 'nginx.service'] in directory '/root'
2018-03-30 07:09:47,471 [salt.loaded.int.module.cmdmod][INFO    ][11233] Executing command ['systemd-run', '--scope', 'systemctl', 'restart', 'nginx.service'] in directory '/root'
2018-03-30 07:09:47,597 [salt.state       ][INFO    ][11233] {'nginx': True}
2018-03-30 07:09:47,598 [salt.state       ][INFO    ][11233] Completed state [nginx] at time 07:09:47.598314 duration_in_ms=158.492
2018-03-30 07:09:47,603 [salt.minion      ][INFO    ][11233] Returning information for job: 20180330070819975056
2018-03-30 07:10:10,074 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command pkg.version with jid 20180330071010066910
2018-03-30 07:10:10,102 [salt.minion      ][INFO    ][12144] Starting a new job with PID 12144
2018-03-30 07:10:10,135 [salt.loader.192.168.11.2.int.module.cmdmod][INFO    ][12144] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
2018-03-30 07:10:10,465 [salt.minion      ][INFO    ][12144] Returning information for job: 20180330071010066910
2018-03-30 07:10:11,092 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command file.symlink with jid 20180330071011089395
2018-03-30 07:10:11,119 [salt.minion      ][INFO    ][12152] Starting a new job with PID 12152
2018-03-30 07:10:11,136 [salt.minion      ][INFO    ][12152] Returning information for job: 20180330071011089395
2018-03-30 07:10:11,782 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command cmd.run with jid 20180330071011774729
2018-03-30 07:10:11,818 [salt.minion      ][INFO    ][12157] Starting a new job with PID 12157
2018-03-30 07:10:11,828 [salt.loader.192.168.11.2.int.module.cmdmod][INFO    ][12157] Executing command '/usr/share/openstack-dashboard/manage.py collectstatic --noinput' in directory '/root'
2018-03-30 07:10:13,117 [salt.minion      ][INFO    ][12157] Returning information for job: 20180330071011774729
2018-03-30 07:10:13,770 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command cmd.run with jid 20180330071013760539
2018-03-30 07:10:13,801 [salt.minion      ][INFO    ][12169] Starting a new job with PID 12169
2018-03-30 07:10:13,811 [salt.loader.192.168.11.2.int.module.cmdmod][INFO    ][12169] Executing command '/usr/share/openstack-dashboard/manage.py compress --force' in directory '/root'
2018-03-30 07:10:23,861 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330071023853905
2018-03-30 07:10:23,886 [salt.minion      ][INFO    ][12183] Starting a new job with PID 12183
2018-03-30 07:10:23,897 [salt.minion      ][INFO    ][12183] Returning information for job: 20180330071023853905
2018-03-30 07:10:34,086 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command saltutil.find_job with jid 20180330071034077285
2018-03-30 07:10:34,133 [salt.minion      ][INFO    ][12192] Starting a new job with PID 12192
2018-03-30 07:10:34,157 [salt.minion      ][INFO    ][12192] Returning information for job: 20180330071034077285
2018-03-30 07:10:42,831 [salt.minion      ][INFO    ][12169] Returning information for job: 20180330071013760539
2018-03-30 07:10:43,480 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command file.append with jid 20180330071043473122
2018-03-30 07:10:43,509 [salt.minion      ][INFO    ][12203] Starting a new job with PID 12203
2018-03-30 07:10:43,530 [salt.minion      ][INFO    ][12203] Returning information for job: 20180330071043473122
2018-03-30 07:10:44,173 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command service.reload with jid 20180330071044164638
2018-03-30 07:10:44,202 [salt.minion      ][INFO    ][12208] Starting a new job with PID 12208
2018-03-30 07:10:45,136 [salt.loader.192.168.11.2.int.module.cmdmod][INFO    ][12208] Executing command ['systemctl', 'status', 'apache2.service', '-n', '0'] in directory '/root'
2018-03-30 07:10:45,160 [salt.loader.192.168.11.2.int.module.cmdmod][INFO    ][12208] Executing command ['systemctl', 'is-enabled', 'apache2.service'] in directory '/root'
2018-03-30 07:10:45,198 [salt.loader.192.168.11.2.int.module.cmdmod][INFO    ][12208] Executing command ['systemd-run', '--scope', 'systemctl', 'reload', 'apache2.service'] in directory '/root'
2018-03-30 07:10:45,375 [salt.minion      ][INFO    ][12208] Returning information for job: 20180330071044164638
2018-03-30 07:10:46,245 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command pillar.get with jid 20180330071046236655
2018-03-30 07:10:46,275 [salt.minion      ][INFO    ][12342] Starting a new job with PID 12342
2018-03-30 07:10:46,284 [salt.minion      ][INFO    ][12342] Returning information for job: 20180330071046236655
2018-03-30 07:10:46,951 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command cp.push with jid 20180330071046939286
2018-03-30 07:10:46,980 [salt.minion      ][INFO    ][12347] Starting a new job with PID 12347
2018-03-30 07:10:47,008 [salt.minion      ][INFO    ][12347] Returning information for job: 20180330071046939286
2018-03-30 07:11:33,075 [salt.minion      ][INFO    ][1338] User sudo_ubuntu Executing command cp.push_dir with jid 20180330071133067681
2018-03-30 07:11:33,115 [salt.minion      ][INFO    ][12374] Starting a new job with PID 12374
