2018-09-01 22:08:28,665 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:29,085 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:29,088 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:29,091 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:29,093 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:31,759 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:31,765 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:31,770 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:32,494 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:32,496 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:32,500 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:32,503 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:33,375 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,079 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,082 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,084 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,087 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,090 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,093 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,096 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,099 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,102 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,105 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,108 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,111 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,113 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,116 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,119 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,122 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,124 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,127 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,130 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,132 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,135 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,138 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,140 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,143 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,146 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,148 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,151 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,154 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,156 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,159 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,161 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,164 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,166 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,169 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,172 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,174 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,177 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,179 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,182 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,184 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,187 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,189 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,192 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:08:34,194 [salt.utils.decorators:82  ][ERROR   ][1871] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2018-09-01 22:09:26,059 [salt.utils.decorators:613 ][WARNING ][1871] The function "module.run" is using its deprecated version and will expire in version "Sodium".
2018-09-01 22:09:30,363 [salt.loaded.int.states.file:2150][WARNING ][1871] State for file: /etc/ssl/certs/ca-salt_master_ca.crt - Neither 'source' nor 'contents' nor 'contents_pillar' nor 'contents_grains' was defined, yet 'replace' was set to 'True'. As there is no source to replace the file with, 'replace' has been set to 'False' to avoid reading the file unnecessarily.
2018-09-01 22:09:33,375 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3210] Executing command ['systemctl', 'status', 'salt-minion.service', '-n', '0'] in directory '/root'
2018-09-01 22:09:33,404 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3210] Executing command ['systemd-run', '--scope', 'systemctl', 'restart', 'salt-minion.service'] in directory '/root'
2018-09-01 22:09:33,420 [salt.utils.parsers:1051][WARNING ][1536] Minion received a SIGTERM. Exiting.
2018-09-01 22:09:34,242 [salt.cli.daemons :293 ][INFO    ][3264] Setting up the Salt Minion "prx02.mcp-ovs-ha.local"
2018-09-01 22:09:34,320 [salt.cli.daemons :82  ][INFO    ][3264] Starting up the Salt Minion
2018-09-01 22:09:34,320 [salt.utils.event :1017][INFO    ][3264] Starting pull socket on /var/run/salt/minion/minion_event_0f78866554_pull.ipc
2018-09-01 22:09:34,911 [salt.minion      :976 ][INFO    ][3264] Creating minion process manager
2018-09-01 22:09:35,971 [salt.loader.192.168.11.2.int.module.cmdmod:395 ][INFO    ][3264] Executing command ['date', '+%z'] in directory '/root'
2018-09-01 22:09:35,996 [salt.utils.schedule:568 ][INFO    ][3264] Updating job settings for scheduled job: __mine_interval
2018-09-01 22:09:36,046 [salt.minion      :1107][INFO    ][3264] Added mine.update to scheduler
2018-09-01 22:09:36,062 [salt.minion      :1965][INFO    ][3264] Minion is starting as user 'root'
2018-09-01 22:09:36,075 [salt.minion      :2324][INFO    ][3264] Minion is ready to receive requests!
2018-09-01 22:10:24,282 [salt.minion      :1307][INFO    ][3264] User sudo_ubuntu Executing command state.apply with jid 20180901221024268410
2018-09-01 22:10:24,308 [salt.minion      :1431][INFO    ][3353] Starting a new job with PID 3353
2018-09-01 22:10:29,278 [salt.state       :905 ][INFO    ][3353] Loading fresh modules for state activity
2018-09-01 22:10:29,886 [salt.minion      :1307][INFO    ][3264] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221029878664
2018-09-01 22:10:29,908 [salt.minion      :1431][INFO    ][3360] Starting a new job with PID 3360
2018-09-01 22:10:29,930 [salt.minion      :1708][INFO    ][3360] Returning information for job: 20180901221029878664
2018-09-01 22:10:30,400 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'linux/init.sls'
2018-09-01 22:10:30,771 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'linux/system/init.sls'
2018-09-01 22:10:30,864 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'linux/system/env.sls'
2018-09-01 22:10:30,935 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'linux/system/profile.sls'
2018-09-01 22:10:31,006 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'linux/system/repo.sls'
2018-09-01 22:10:31,163 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'linux/system/package.sls'
2018-09-01 22:10:31,242 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'linux/system/timezone.sls'
2018-09-01 22:10:31,310 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'linux/system/kernel.sls'
2018-09-01 22:10:31,398 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'linux/system/cpu.sls'
2018-09-01 22:10:31,467 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'linux/system/sysfs.sls'
2018-09-01 22:10:31,540 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'linux/system/locale.sls'
2018-09-01 22:10:31,609 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'linux/system/user.sls'
2018-09-01 22:10:31,698 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'linux/system/group.sls'
2018-09-01 22:10:32,458 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'linux/system/limit.sls'
2018-09-01 22:10:32,703 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'linux/system/systemd.sls'
2018-09-01 22:10:32,774 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'linux/system/apt.sls'
2018-09-01 22:10:32,848 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'linux/system/banner.sls'
2018-09-01 22:10:32,921 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'linux/network/init.sls'
2018-09-01 22:10:32,991 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'linux/network/hostname.sls'
2018-09-01 22:10:33,060 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'linux/network/host.sls'
2018-09-01 22:10:33,170 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'linux/network/interface.sls'
2018-09-01 22:10:33,312 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'linux/network/proxy.sls'
2018-09-01 22:10:33,387 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'linux/storage/init.sls'
2018-09-01 22:10:33,484 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'ntp/init.sls'
2018-09-01 22:10:33,518 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'ntp/client.sls'
2018-09-01 22:10:33,557 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'ntp/server.sls'
2018-09-01 22:10:33,590 [salt.state       :1770][INFO    ][3353] Running state [/etc/environment] at time 22:10:33.590336
2018-09-01 22:10:33,590 [salt.state       :1803][INFO    ][3353] Executing state file.blockreplace for [/etc/environment]
2018-09-01 22:10:33,598 [salt.state       :290 ][INFO    ][3353] File changed:
--- 
+++ 
@@ -1 +1,4 @@
 PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games"
+# SALT MANAGED VARIABLES - DO NOT EDIT - START
+# 
+# SALT MANAGED VARIABLES - END

2018-09-01 22:10:33,598 [salt.state       :1941][INFO    ][3353] Completed state [/etc/environment] at time 22:10:33.598243 duration_in_ms=7.908
2018-09-01 22:10:33,598 [salt.state       :1770][INFO    ][3353] Running state [/etc/profile.d] at time 22:10:33.598450
2018-09-01 22:10:33,598 [salt.state       :1803][INFO    ][3353] Executing state file.directory for [/etc/profile.d]
2018-09-01 22:10:33,605 [salt.state       :290 ][INFO    ][3353] Directory /etc/profile.d is in the correct state
Directory /etc/profile.d updated
2018-09-01 22:10:33,605 [salt.state       :1941][INFO    ][3353] Completed state [/etc/profile.d] at time 22:10:33.605335 duration_in_ms=6.885
2018-09-01 22:10:34,123 [salt.state       :1770][INFO    ][3353] Running state [/etc/apt/apt.conf.d/99prefer_ipv4-salt] at time 22:10:34.123069
2018-09-01 22:10:34,123 [salt.state       :1803][INFO    ][3353] Executing state file.managed for [/etc/apt/apt.conf.d/99prefer_ipv4-salt]
2018-09-01 22:10:34,495 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'linux/files/apt.conf'
2018-09-01 22:10:34,508 [salt.state       :290 ][INFO    ][3353] File changed:
New file
2018-09-01 22:10:34,508 [salt.state       :1941][INFO    ][3353] Completed state [/etc/apt/apt.conf.d/99prefer_ipv4-salt] at time 22:10:34.508487 duration_in_ms=385.42
2018-09-01 22:10:34,508 [salt.state       :1770][INFO    ][3353] Running state [/etc/apt/apt.conf.d/99allow_downgrades-salt] at time 22:10:34.508739
2018-09-01 22:10:34,509 [salt.state       :1803][INFO    ][3353] Executing state file.managed for [/etc/apt/apt.conf.d/99allow_downgrades-salt]
2018-09-01 22:10:34,527 [salt.state       :290 ][INFO    ][3353] File changed:
New file
2018-09-01 22:10:34,527 [salt.state       :1941][INFO    ][3353] Completed state [/etc/apt/apt.conf.d/99allow_downgrades-salt] at time 22:10:34.527760 duration_in_ms=19.02
2018-09-01 22:10:34,528 [salt.state       :1770][INFO    ][3353] Running state [linux_repo_prereq_pkgs] at time 22:10:34.528802
2018-09-01 22:10:34,529 [salt.state       :1803][INFO    ][3353] Executing state pkg.installed for [linux_repo_prereq_pkgs]
2018-09-01 22:10:34,529 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 22:10:34,822 [salt.state       :290 ][INFO    ][3353] All specified packages are already installed
2018-09-01 22:10:34,823 [salt.state       :1941][INFO    ][3353] Completed state [linux_repo_prereq_pkgs] at time 22:10:34.823048 duration_in_ms=294.245
2018-09-01 22:10:34,823 [salt.state       :1770][INFO    ][3353] Running state [/etc/apt/apt.conf.d/99proxies-salt] at time 22:10:34.823332
2018-09-01 22:10:34,823 [salt.state       :1803][INFO    ][3353] Executing state file.managed for [/etc/apt/apt.conf.d/99proxies-salt]
2018-09-01 22:10:34,838 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'linux/files/apt.conf.d_proxies'
2018-09-01 22:10:34,850 [salt.state       :290 ][INFO    ][3353] File changed:
New file
2018-09-01 22:10:34,851 [salt.state       :1941][INFO    ][3353] Completed state [/etc/apt/apt.conf.d/99proxies-salt] at time 22:10:34.850994 duration_in_ms=27.662
2018-09-01 22:10:34,851 [salt.state       :1770][INFO    ][3353] Running state [/etc/apt/apt.conf.d/99proxies-salt-mirantis_openstack] at time 22:10:34.851219
2018-09-01 22:10:34,851 [salt.state       :1803][INFO    ][3353] Executing state file.absent for [/etc/apt/apt.conf.d/99proxies-salt-mirantis_openstack]
2018-09-01 22:10:34,851 [salt.state       :290 ][INFO    ][3353] File /etc/apt/apt.conf.d/99proxies-salt-mirantis_openstack is not present
2018-09-01 22:10:34,851 [salt.state       :1941][INFO    ][3353] Completed state [/etc/apt/apt.conf.d/99proxies-salt-mirantis_openstack] at time 22:10:34.851916 duration_in_ms=0.696
2018-09-01 22:10:34,852 [salt.state       :1770][INFO    ][3353] Running state [/etc/apt/preferences.d/mirantis_openstack] at time 22:10:34.852113
2018-09-01 22:10:34,852 [salt.state       :1803][INFO    ][3353] Executing state file.managed for [/etc/apt/preferences.d/mirantis_openstack]
2018-09-01 22:10:34,866 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'linux/files/preferences_repo'
2018-09-01 22:10:34,941 [salt.state       :290 ][INFO    ][3353] File changed:
New file
2018-09-01 22:10:34,941 [salt.state       :1941][INFO    ][3353] Completed state [/etc/apt/preferences.d/mirantis_openstack] at time 22:10:34.941902 duration_in_ms=89.789
2018-09-01 22:10:34,944 [salt.state       :1770][INFO    ][3353] Running state [deb http://mirror.mirantis.com/nightly/openstack-queens/xenial xenial main] at time 22:10:34.944778
2018-09-01 22:10:34,945 [salt.state       :1803][INFO    ][3353] Executing state pkgrepo.managed for [deb http://mirror.mirantis.com/nightly/openstack-queens/xenial xenial main]
2018-09-01 22:10:35,771 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['apt-key', 'add', '/var/cache/salt/minion/extrn_files/base/mirror.mirantis.com/nightly/openstack-queens/xenial/archive-queens.key'] in directory '/root'
2018-09-01 22:10:36,292 [salt.state       :290 ][INFO    ][3353] {'repo': 'deb http://mirror.mirantis.com/nightly/openstack-queens/xenial xenial main'}
2018-09-01 22:10:36,293 [salt.state       :1941][INFO    ][3353] Completed state [deb http://mirror.mirantis.com/nightly/openstack-queens/xenial xenial main] at time 22:10:36.293194 duration_in_ms=1348.415
2018-09-01 22:10:36,293 [salt.state       :1770][INFO    ][3353] Running state [/etc/apt/apt.conf.d/99proxies-salt-uca] at time 22:10:36.293517
2018-09-01 22:10:36,293 [salt.state       :1803][INFO    ][3353] Executing state file.absent for [/etc/apt/apt.conf.d/99proxies-salt-uca]
2018-09-01 22:10:36,294 [salt.state       :290 ][INFO    ][3353] File /etc/apt/apt.conf.d/99proxies-salt-uca is not present
2018-09-01 22:10:36,294 [salt.state       :1941][INFO    ][3353] Completed state [/etc/apt/apt.conf.d/99proxies-salt-uca] at time 22:10:36.294490 duration_in_ms=0.973
2018-09-01 22:10:36,294 [salt.state       :1770][INFO    ][3353] Running state [/etc/apt/preferences.d/uca] at time 22:10:36.294696
2018-09-01 22:10:36,294 [salt.state       :1803][INFO    ][3353] Executing state file.managed for [/etc/apt/preferences.d/uca]
2018-09-01 22:10:36,572 [salt.state       :290 ][INFO    ][3353] File changed:
New file
2018-09-01 22:10:36,572 [salt.state       :1941][INFO    ][3353] Completed state [/etc/apt/preferences.d/uca] at time 22:10:36.572447 duration_in_ms=277.751
2018-09-01 22:10:36,576 [salt.state       :1770][INFO    ][3353] Running state [apt-key adv --keyserver keyserver.ubuntu.com --recv EC4926EA] at time 22:10:36.576095
2018-09-01 22:10:36,576 [salt.state       :1803][INFO    ][3353] Executing state cmd.run for [apt-key adv --keyserver keyserver.ubuntu.com --recv EC4926EA]
2018-09-01 22:10:36,576 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command 'test -e /etc/apt/sources.list.d/uca.list' in directory '/root'
2018-09-01 22:10:36,594 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command 'apt-key adv --keyserver keyserver.ubuntu.com --recv EC4926EA' in directory '/root'
2018-09-01 22:10:36,924 [salt.state       :290 ][INFO    ][3353] {'pid': 3562, 'retcode': 0, 'stderr': 'gpg: requesting key EC4926EA from hkp server keyserver.ubuntu.com\ngpg: key EC4926EA: public key "Canonical Cloud Archive Signing Key <ftpmaster@canonical.com>" imported\ngpg: Total number processed: 1\ngpg:               imported: 1  (RSA: 1)', 'stdout': 'Executing: /tmp/tmp.hMqljZM7z7/gpg.1.sh --keyserver\nkeyserver.ubuntu.com\n--recv\nEC4926EA'}
2018-09-01 22:10:36,924 [salt.state       :1941][INFO    ][3353] Completed state [apt-key adv --keyserver keyserver.ubuntu.com --recv EC4926EA] at time 22:10:36.924654 duration_in_ms=348.558
2018-09-01 22:10:36,927 [salt.state       :1770][INFO    ][3353] Running state [deb http://ubuntu-cloud.archive.canonical.com/ubuntu xenial-updates/queens main] at time 22:10:36.927045
2018-09-01 22:10:36,927 [salt.state       :1803][INFO    ][3353] Executing state pkgrepo.managed for [deb http://ubuntu-cloud.archive.canonical.com/ubuntu xenial-updates/queens main]
2018-09-01 22:10:37,002 [salt.state       :290 ][INFO    ][3353] {'repo': 'deb http://ubuntu-cloud.archive.canonical.com/ubuntu xenial-updates/queens main'}
2018-09-01 22:10:37,002 [salt.state       :1941][INFO    ][3353] Completed state [deb http://ubuntu-cloud.archive.canonical.com/ubuntu xenial-updates/queens main] at time 22:10:37.002874 duration_in_ms=75.83
2018-09-01 22:10:37,003 [salt.state       :1770][INFO    ][3353] Running state [pkg.refresh_db] at time 22:10:37.003740
2018-09-01 22:10:37,003 [salt.state       :1803][INFO    ][3353] Executing state module.run for [pkg.refresh_db]
2018-09-01 22:10:37,004 [salt.utils.decorators:613 ][WARNING ][3353] The function "module.run" is using its deprecated version and will expire in version "Sodium".
2018-09-01 22:10:37,004 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['apt-get', '-q', 'update'] in directory '/root'
2018-09-01 22:10:39,931 [salt.minion      :1307][INFO    ][3264] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221039915966
2018-09-01 22:10:40,185 [salt.minion      :1431][INFO    ][4035] Starting a new job with PID 4035
2018-09-01 22:10:40,199 [salt.minion      :1708][INFO    ][4035] Returning information for job: 20180901221039915966
2018-09-01 22:10:41,193 [salt.state       :290 ][INFO    ][3353] {'ret': {'http://security.ubuntu.com/ubuntu xenial-security InRelease': True, 'http://archive.ubuntu.com/ubuntu xenial-backports InRelease': None, 'http://mirror.mirantis.com/nightly/openstack-queens/xenial xenial/main amd64 Packages': True, 'http://archive.ubuntu.com/ubuntu xenial-updates InRelease': None, 'http://ubuntu-cloud.archive.canonical.com/ubuntu xenial-updates/queens/main amd64 Packages': True, 'http://ubuntu-cloud.archive.canonical.com/ubuntu xenial-updates/queens Release.gpg': True, 'http://ubuntu-cloud.archive.canonical.com/ubuntu xenial-updates/queens InRelease': False, 'http://repo.saltstack.com/apt/ubuntu/16.04/amd64/2017.7 xenial InRelease': None, 'http://ubuntu-cloud.archive.canonical.com/ubuntu xenial-updates/queens Release': True, 'http://archive.ubuntu.com/ubuntu xenial InRelease': None, 'http://mirror.mirantis.com/nightly/openstack-queens/xenial xenial InRelease': True}}
2018-09-01 22:10:41,195 [salt.state       :1941][INFO    ][3353] Completed state [pkg.refresh_db] at time 22:10:41.195302 duration_in_ms=4191.561
2018-09-01 22:10:41,195 [salt.state       :1770][INFO    ][3353] Running state [linux_extra_packages_latest] at time 22:10:41.195671
2018-09-01 22:10:41,195 [salt.state       :1803][INFO    ][3353] Executing state pkg.latest for [linux_extra_packages_latest]
2018-09-01 22:10:41,206 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['apt-cache', '-q', 'policy', 'libapache2-mod-wsgi'] in directory '/root'
2018-09-01 22:10:41,274 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['apt-cache', '-q', 'policy', 'python-tornado'] in directory '/root'
2018-09-01 22:10:41,371 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-09-01 22:10:41,392 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'libapache2-mod-wsgi', 'python-tornado'] in directory '/root'
2018-09-01 22:10:49,990 [salt.minion      :1307][INFO    ][3264] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221049974773
2018-09-01 22:10:50,010 [salt.minion      :1431][INFO    ][4138] Starting a new job with PID 4138
2018-09-01 22:10:50,028 [salt.minion      :1708][INFO    ][4138] Returning information for job: 20180901221049974773
2018-09-01 22:11:00,047 [salt.minion      :1307][INFO    ][3264] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221100027387
2018-09-01 22:11:00,071 [salt.minion      :1431][INFO    ][4422] Starting a new job with PID 4422
2018-09-01 22:11:00,104 [salt.minion      :1708][INFO    ][4422] Returning information for job: 20180901221100027387
2018-09-01 22:11:08,125 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 22:11:08,164 [salt.state       :290 ][INFO    ][3353] Made the following changes:
'python-tornado' changed from '4.2.1-2~ds+1' to '4.5.3-1.0~u16.04+mcp1'
'libaprutil1-ldap' changed from 'absent' to '1.5.4-1build1'
'libapr1' changed from 'absent' to '1.5.2-3'
'libpython2.7' changed from 'absent' to '2.7.12-1ubuntu0~16.04.3'
'libapache2-mod-wsgi' changed from 'absent' to '4.4.15-0.1.1~u16.04+mcp2'
'apache2-api-20120211' changed from 'absent' to '1'
'libaprutil1' changed from 'absent' to '1.5.4-1build1'
'httpd-wsgi' changed from 'absent' to '1'
'python-singledispatch' changed from 'absent' to '3.4.0.3-2'
'liblua5.1-0' changed from 'absent' to '5.1.5-8ubuntu1'
'libaprutil1-dbd-sqlite3' changed from 'absent' to '1.5.4-1build1'
'python-backports-abc' changed from 'absent' to '0.5-2.0~u16.04+mcp1'
'apache2-bin' changed from 'absent' to '2.4.18-2ubuntu3.9'

2018-09-01 22:11:08,183 [salt.state       :905 ][INFO    ][3353] Loading fresh modules for state activity
2018-09-01 22:11:08,206 [salt.state       :1941][INFO    ][3353] Completed state [linux_extra_packages_latest] at time 22:11:08.206913 duration_in_ms=27011.242
2018-09-01 22:11:08,209 [salt.state       :1770][INFO    ][3353] Running state [UTC] at time 22:11:08.209543
2018-09-01 22:11:08,209 [salt.state       :1803][INFO    ][3353] Executing state timezone.system for [UTC]
2018-09-01 22:11:08,211 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['timedatectl'] in directory '/root'
2018-09-01 22:11:08,604 [salt.state       :290 ][INFO    ][3353] Timezone UTC already set, UTC already set to UTC
2018-09-01 22:11:08,604 [salt.state       :1941][INFO    ][3353] Completed state [UTC] at time 22:11:08.604889 duration_in_ms=395.345
2018-09-01 22:11:08,608 [salt.state       :1770][INFO    ][3353] Running state [nf_conntrack] at time 22:11:08.608282
2018-09-01 22:11:08,608 [salt.state       :1803][INFO    ][3353] Executing state kmod.present for [nf_conntrack]
2018-09-01 22:11:08,609 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command 'lsmod' in directory '/root'
2018-09-01 22:11:09,700 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command 'lsmod' in directory '/root'
2018-09-01 22:11:09,722 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command 'modprobe nf_conntrack' in directory '/root'
2018-09-01 22:11:09,751 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command 'lsmod' in directory '/root'
2018-09-01 22:11:09,838 [salt.state       :290 ][INFO    ][3353] {'nf_conntrack': 'loaded'}
2018-09-01 22:11:09,839 [salt.state       :1941][INFO    ][3353] Completed state [nf_conntrack] at time 22:11:09.839076 duration_in_ms=1230.794
2018-09-01 22:11:09,841 [salt.state       :1770][INFO    ][3353] Running state [net.ipv4.tcp_keepalive_probes] at time 22:11:09.841311
2018-09-01 22:11:09,841 [salt.state       :1803][INFO    ][3353] Executing state sysctl.present for [net.ipv4.tcp_keepalive_probes]
2018-09-01 22:11:09,842 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command 'sysctl -w net.ipv4.tcp_keepalive_probes="8"' in directory '/root'
2018-09-01 22:11:09,868 [salt.state       :290 ][INFO    ][3353] {'net.ipv4.tcp_keepalive_probes': 8}
2018-09-01 22:11:09,869 [salt.state       :1941][INFO    ][3353] Completed state [net.ipv4.tcp_keepalive_probes] at time 22:11:09.869289 duration_in_ms=27.976
2018-09-01 22:11:09,870 [salt.state       :1770][INFO    ][3353] Running state [fs.file-max] at time 22:11:09.870120
2018-09-01 22:11:09,870 [salt.state       :1803][INFO    ][3353] Executing state sysctl.present for [fs.file-max]
2018-09-01 22:11:09,872 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command 'sysctl -w fs.file-max="124165"' in directory '/root'
2018-09-01 22:11:09,887 [salt.state       :290 ][INFO    ][3353] {'fs.file-max': 124165}
2018-09-01 22:11:09,887 [salt.state       :1941][INFO    ][3353] Completed state [fs.file-max] at time 22:11:09.887748 duration_in_ms=17.628
2018-09-01 22:11:09,888 [salt.state       :1770][INFO    ][3353] Running state [net.core.somaxconn] at time 22:11:09.888337
2018-09-01 22:11:09,888 [salt.state       :1803][INFO    ][3353] Executing state sysctl.present for [net.core.somaxconn]
2018-09-01 22:11:09,982 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command 'sysctl -w net.core.somaxconn="4096"' in directory '/root'
2018-09-01 22:11:10,006 [salt.state       :290 ][INFO    ][3353] {'net.core.somaxconn': 4096}
2018-09-01 22:11:10,008 [salt.state       :1941][INFO    ][3353] Completed state [net.core.somaxconn] at time 22:11:10.007849 duration_in_ms=119.509
2018-09-01 22:11:10,008 [salt.state       :1770][INFO    ][3353] Running state [net.ipv4.tcp_max_syn_backlog] at time 22:11:10.008893
2018-09-01 22:11:10,009 [salt.state       :1803][INFO    ][3353] Executing state sysctl.present for [net.ipv4.tcp_max_syn_backlog]
2018-09-01 22:11:10,011 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command 'sysctl -w net.ipv4.tcp_max_syn_backlog="8192"' in directory '/root'
2018-09-01 22:11:10,026 [salt.state       :290 ][INFO    ][3353] {'net.ipv4.tcp_max_syn_backlog': 8192}
2018-09-01 22:11:10,027 [salt.state       :1941][INFO    ][3353] Completed state [net.ipv4.tcp_max_syn_backlog] at time 22:11:10.027745 duration_in_ms=18.851
2018-09-01 22:11:10,028 [salt.state       :1770][INFO    ][3353] Running state [net.ipv4.tcp_tw_reuse] at time 22:11:10.028454
2018-09-01 22:11:10,029 [salt.state       :1803][INFO    ][3353] Executing state sysctl.present for [net.ipv4.tcp_tw_reuse]
2018-09-01 22:11:10,030 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command 'sysctl -w net.ipv4.tcp_tw_reuse="1"' in directory '/root'
2018-09-01 22:11:10,047 [salt.state       :290 ][INFO    ][3353] {'net.ipv4.tcp_tw_reuse': 1}
2018-09-01 22:11:10,048 [salt.state       :1941][INFO    ][3353] Completed state [net.ipv4.tcp_tw_reuse] at time 22:11:10.048611 duration_in_ms=20.157
2018-09-01 22:11:10,049 [salt.state       :1770][INFO    ][3353] Running state [net.ipv4.tcp_congestion_control] at time 22:11:10.049491
2018-09-01 22:11:10,050 [salt.state       :1803][INFO    ][3353] Executing state sysctl.present for [net.ipv4.tcp_congestion_control]
2018-09-01 22:11:10,052 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command 'sysctl -w net.ipv4.tcp_congestion_control="yeah"' in directory '/root'
2018-09-01 22:11:10,154 [salt.minion      :1307][INFO    ][3264] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221110138402
2018-09-01 22:11:10,174 [salt.minion      :1431][INFO    ][4561] Starting a new job with PID 4561
2018-09-01 22:11:10,193 [salt.minion      :1708][INFO    ][4561] Returning information for job: 20180901221110138402
2018-09-01 22:11:10,223 [salt.state       :290 ][INFO    ][3353] {'net.ipv4.tcp_congestion_control': 'yeah'}
2018-09-01 22:11:10,223 [salt.state       :1941][INFO    ][3353] Completed state [net.ipv4.tcp_congestion_control] at time 22:11:10.223539 duration_in_ms=174.048
2018-09-01 22:11:10,224 [salt.state       :1770][INFO    ][3353] Running state [net.nf_conntrack_max] at time 22:11:10.224107
2018-09-01 22:11:10,224 [salt.state       :1803][INFO    ][3353] Executing state sysctl.present for [net.nf_conntrack_max]
2018-09-01 22:11:10,226 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command 'sysctl -w net.nf_conntrack_max="1048576"' in directory '/root'
2018-09-01 22:11:10,238 [salt.state       :290 ][INFO    ][3353] {'net.nf_conntrack_max': 1048576}
2018-09-01 22:11:10,238 [salt.state       :1941][INFO    ][3353] Completed state [net.nf_conntrack_max] at time 22:11:10.238562 duration_in_ms=14.455
2018-09-01 22:11:10,238 [salt.state       :1770][INFO    ][3353] Running state [net.ipv4.tcp_retries2] at time 22:11:10.238856
2018-09-01 22:11:10,239 [salt.state       :1803][INFO    ][3353] Executing state sysctl.present for [net.ipv4.tcp_retries2]
2018-09-01 22:11:10,239 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command 'sysctl -w net.ipv4.tcp_retries2="5"' in directory '/root'
2018-09-01 22:11:10,250 [salt.state       :290 ][INFO    ][3353] {'net.ipv4.tcp_retries2': 5}
2018-09-01 22:11:10,250 [salt.state       :1941][INFO    ][3353] Completed state [net.ipv4.tcp_retries2] at time 22:11:10.250861 duration_in_ms=12.005
2018-09-01 22:11:10,251 [salt.state       :1770][INFO    ][3353] Running state [net.ipv4.tcp_fin_timeout] at time 22:11:10.251274
2018-09-01 22:11:10,251 [salt.state       :1803][INFO    ][3353] Executing state sysctl.present for [net.ipv4.tcp_fin_timeout]
2018-09-01 22:11:10,252 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command 'sysctl -w net.ipv4.tcp_fin_timeout="30"' in directory '/root'
2018-09-01 22:11:10,263 [salt.state       :290 ][INFO    ][3353] {'net.ipv4.tcp_fin_timeout': 30}
2018-09-01 22:11:10,264 [salt.state       :1941][INFO    ][3353] Completed state [net.ipv4.tcp_fin_timeout] at time 22:11:10.264353 duration_in_ms=13.079
2018-09-01 22:11:10,264 [salt.state       :1770][INFO    ][3353] Running state [net.ipv4.tcp_slow_start_after_idle] at time 22:11:10.264839
2018-09-01 22:11:10,265 [salt.state       :1803][INFO    ][3353] Executing state sysctl.present for [net.ipv4.tcp_slow_start_after_idle]
2018-09-01 22:11:10,304 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command 'sysctl -w net.ipv4.tcp_slow_start_after_idle="0"' in directory '/root'
2018-09-01 22:11:10,323 [salt.state       :290 ][INFO    ][3353] {'net.ipv4.tcp_slow_start_after_idle': 0}
2018-09-01 22:11:10,324 [salt.state       :1941][INFO    ][3353] Completed state [net.ipv4.tcp_slow_start_after_idle] at time 22:11:10.324382 duration_in_ms=59.542
2018-09-01 22:11:10,325 [salt.state       :1770][INFO    ][3353] Running state [vm.swappiness] at time 22:11:10.325213
2018-09-01 22:11:10,325 [salt.state       :1803][INFO    ][3353] Executing state sysctl.present for [vm.swappiness]
2018-09-01 22:11:10,327 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command 'sysctl -w vm.swappiness="10"' in directory '/root'
2018-09-01 22:11:10,341 [salt.state       :290 ][INFO    ][3353] {'vm.swappiness': 10}
2018-09-01 22:11:10,341 [salt.state       :1941][INFO    ][3353] Completed state [vm.swappiness] at time 22:11:10.341767 duration_in_ms=16.553
2018-09-01 22:11:10,342 [salt.state       :1770][INFO    ][3353] Running state [net.core.netdev_max_backlog] at time 22:11:10.342160
2018-09-01 22:11:10,342 [salt.state       :1803][INFO    ][3353] Executing state sysctl.present for [net.core.netdev_max_backlog]
2018-09-01 22:11:10,370 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command 'sysctl -w net.core.netdev_max_backlog="261144"' in directory '/root'
2018-09-01 22:11:10,384 [salt.state       :290 ][INFO    ][3353] {'net.core.netdev_max_backlog': 261144}
2018-09-01 22:11:10,385 [salt.state       :1941][INFO    ][3353] Completed state [net.core.netdev_max_backlog] at time 22:11:10.385224 duration_in_ms=43.063
2018-09-01 22:11:10,385 [salt.state       :1770][INFO    ][3353] Running state [net.ipv4.neigh.default.gc_thresh1] at time 22:11:10.385826
2018-09-01 22:11:10,386 [salt.state       :1803][INFO    ][3353] Executing state sysctl.present for [net.ipv4.neigh.default.gc_thresh1]
2018-09-01 22:11:10,436 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command 'sysctl -w net.ipv4.neigh.default.gc_thresh1="4096"' in directory '/root'
2018-09-01 22:11:10,452 [salt.state       :290 ][INFO    ][3353] {'net.ipv4.neigh.default.gc_thresh1': 4096}
2018-09-01 22:11:10,453 [salt.state       :1941][INFO    ][3353] Completed state [net.ipv4.neigh.default.gc_thresh1] at time 22:11:10.453531 duration_in_ms=67.704
2018-09-01 22:11:10,454 [salt.state       :1770][INFO    ][3353] Running state [net.ipv4.neigh.default.gc_thresh2] at time 22:11:10.454344
2018-09-01 22:11:10,455 [salt.state       :1803][INFO    ][3353] Executing state sysctl.present for [net.ipv4.neigh.default.gc_thresh2]
2018-09-01 22:11:10,457 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command 'sysctl -w net.ipv4.neigh.default.gc_thresh2="8192"' in directory '/root'
2018-09-01 22:11:10,469 [salt.state       :290 ][INFO    ][3353] {'net.ipv4.neigh.default.gc_thresh2': 8192}
2018-09-01 22:11:10,469 [salt.state       :1941][INFO    ][3353] Completed state [net.ipv4.neigh.default.gc_thresh2] at time 22:11:10.469904 duration_in_ms=15.561
2018-09-01 22:11:10,470 [salt.state       :1770][INFO    ][3353] Running state [net.ipv4.neigh.default.gc_thresh3] at time 22:11:10.470211
2018-09-01 22:11:10,470 [salt.state       :1803][INFO    ][3353] Executing state sysctl.present for [net.ipv4.neigh.default.gc_thresh3]
2018-09-01 22:11:10,471 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command 'sysctl -w net.ipv4.neigh.default.gc_thresh3="16384"' in directory '/root'
2018-09-01 22:11:10,483 [salt.state       :290 ][INFO    ][3353] {'net.ipv4.neigh.default.gc_thresh3': 16384}
2018-09-01 22:11:10,484 [salt.state       :1941][INFO    ][3353] Completed state [net.ipv4.neigh.default.gc_thresh3] at time 22:11:10.484554 duration_in_ms=14.342
2018-09-01 22:11:10,485 [salt.state       :1770][INFO    ][3353] Running state [net.ipv4.tcp_keepalive_intvl] at time 22:11:10.485164
2018-09-01 22:11:10,485 [salt.state       :1803][INFO    ][3353] Executing state sysctl.present for [net.ipv4.tcp_keepalive_intvl]
2018-09-01 22:11:10,487 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command 'sysctl -w net.ipv4.tcp_keepalive_intvl="3"' in directory '/root'
2018-09-01 22:11:10,500 [salt.state       :290 ][INFO    ][3353] {'net.ipv4.tcp_keepalive_intvl': 3}
2018-09-01 22:11:10,501 [salt.state       :1941][INFO    ][3353] Completed state [net.ipv4.tcp_keepalive_intvl] at time 22:11:10.501196 duration_in_ms=16.031
2018-09-01 22:11:10,502 [salt.state       :1770][INFO    ][3353] Running state [net.ipv4.tcp_keepalive_time] at time 22:11:10.502035
2018-09-01 22:11:10,502 [salt.state       :1803][INFO    ][3353] Executing state sysctl.present for [net.ipv4.tcp_keepalive_time]
2018-09-01 22:11:10,504 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command 'sysctl -w net.ipv4.tcp_keepalive_time="30"' in directory '/root'
2018-09-01 22:11:10,519 [salt.state       :290 ][INFO    ][3353] {'net.ipv4.tcp_keepalive_time': 30}
2018-09-01 22:11:10,519 [salt.state       :1941][INFO    ][3353] Completed state [net.ipv4.tcp_keepalive_time] at time 22:11:10.519368 duration_in_ms=17.334
2018-09-01 22:11:10,519 [salt.state       :1770][INFO    ][3353] Running state [kernel.panic] at time 22:11:10.519714
2018-09-01 22:11:10,520 [salt.state       :1803][INFO    ][3353] Executing state sysctl.present for [kernel.panic]
2018-09-01 22:11:10,604 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command 'sysctl -w kernel.panic="60"' in directory '/root'
2018-09-01 22:11:10,628 [salt.state       :290 ][INFO    ][3353] {'kernel.panic': 60}
2018-09-01 22:11:10,629 [salt.state       :1941][INFO    ][3353] Completed state [kernel.panic] at time 22:11:10.629466 duration_in_ms=109.75
2018-09-01 22:11:10,640 [salt.state       :1770][INFO    ][3353] Running state [linux_sysfs_package] at time 22:11:10.639930
2018-09-01 22:11:10,640 [salt.state       :1803][INFO    ][3353] Executing state pkg.installed for [linux_sysfs_package]
2018-09-01 22:11:11,117 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['apt-cache', '-q', 'policy', 'sysfsutils'] in directory '/root'
2018-09-01 22:11:11,207 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['apt-get', '-q', 'update'] in directory '/root'
2018-09-01 22:11:12,886 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-09-01 22:11:12,911 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'sysfsutils'] in directory '/root'
2018-09-01 22:11:20,433 [salt.minion      :1307][INFO    ][3264] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221120415125
2018-09-01 22:11:20,459 [salt.minion      :1431][INFO    ][5342] Starting a new job with PID 5342
2018-09-01 22:11:20,475 [salt.minion      :1708][INFO    ][5342] Returning information for job: 20180901221120415125
2018-09-01 22:11:20,687 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 22:11:20,726 [salt.state       :290 ][INFO    ][3353] Made the following changes:
'libsysfs2' changed from 'absent' to '2.1.0+repack-4'
'sysfsutils' changed from 'absent' to '2.1.0+repack-4'

2018-09-01 22:11:20,739 [salt.state       :905 ][INFO    ][3353] Loading fresh modules for state activity
2018-09-01 22:11:20,761 [salt.state       :1941][INFO    ][3353] Completed state [linux_sysfs_package] at time 22:11:20.761319 duration_in_ms=10121.389
2018-09-01 22:11:20,764 [salt.state       :1770][INFO    ][3353] Running state [/etc/sysfs.d] at time 22:11:20.764544
2018-09-01 22:11:20,764 [salt.state       :1803][INFO    ][3353] Executing state file.directory for [/etc/sysfs.d]
2018-09-01 22:11:20,767 [salt.state       :290 ][INFO    ][3353] Directory /etc/sysfs.d is in the correct state
Directory /etc/sysfs.d updated
2018-09-01 22:11:20,767 [salt.state       :1941][INFO    ][3353] Completed state [/etc/sysfs.d] at time 22:11:20.767601 duration_in_ms=3.056
2018-09-01 22:11:21,096 [salt.state       :1770][INFO    ][3353] Running state [ondemand] at time 22:11:21.096465
2018-09-01 22:11:21,096 [salt.state       :1803][INFO    ][3353] Executing state service.dead for [ondemand]
2018-09-01 22:11:21,097 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['systemctl', 'status', 'ondemand.service', '-n', '0'] in directory '/root'
2018-09-01 22:11:21,111 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['systemctl', 'is-active', 'ondemand.service'] in directory '/root'
2018-09-01 22:11:21,122 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['systemctl', 'is-enabled', 'ondemand.service'] in directory '/root'
2018-09-01 22:11:21,135 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['systemd-run', '--scope', 'systemctl', 'stop', 'ondemand.service'] in directory '/root'
2018-09-01 22:11:21,191 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['systemctl', 'is-active', 'ondemand.service'] in directory '/root'
2018-09-01 22:11:21,209 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['systemctl', 'is-enabled', 'ondemand.service'] in directory '/root'
2018-09-01 22:11:21,237 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['systemctl', 'is-enabled', 'ondemand.service'] in directory '/root'
2018-09-01 22:11:21,259 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['systemd-run', '--scope', '/usr/sbin/update-rc.d', '-f', 'ondemand', 'remove'] in directory '/root'
2018-09-01 22:11:21,466 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['systemctl', 'is-enabled', 'ondemand.service'] in directory '/root'
2018-09-01 22:11:21,489 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command 'runlevel' in directory '/root'
2018-09-01 22:11:21,500 [salt.state       :290 ][INFO    ][3353] {'ondemand': True}
2018-09-01 22:11:21,501 [salt.state       :1941][INFO    ][3353] Completed state [ondemand] at time 22:11:21.500981 duration_in_ms=404.516
2018-09-01 22:11:21,502 [salt.state       :1770][INFO    ][3353] Running state [cs_CZ.UTF-8] at time 22:11:21.502146
2018-09-01 22:11:21,502 [salt.state       :1803][INFO    ][3353] Executing state locale.present for [cs_CZ.UTF-8]
2018-09-01 22:11:21,503 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command 'locale -a' in directory '/root'
2018-09-01 22:11:21,528 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['locale-gen', 'cs_CZ.utf8'] in directory '/root'
2018-09-01 22:11:22,462 [salt.state       :290 ][INFO    ][3353] {'locale': 'cs_CZ.UTF-8'}
2018-09-01 22:11:22,462 [salt.state       :1941][INFO    ][3353] Completed state [cs_CZ.UTF-8] at time 22:11:22.462912 duration_in_ms=960.765
2018-09-01 22:11:22,463 [salt.state       :1770][INFO    ][3353] Running state [en_US.UTF-8] at time 22:11:22.463412
2018-09-01 22:11:22,463 [salt.state       :1803][INFO    ][3353] Executing state locale.present for [en_US.UTF-8]
2018-09-01 22:11:22,464 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command 'locale -a' in directory '/root'
2018-09-01 22:11:22,484 [salt.state       :290 ][INFO    ][3353] Locale en_US.UTF-8 is already present
2018-09-01 22:11:22,485 [salt.state       :1941][INFO    ][3353] Completed state [en_US.UTF-8] at time 22:11:22.485055 duration_in_ms=21.643
2018-09-01 22:11:22,487 [salt.state       :1770][INFO    ][3353] Running state [en_US.UTF-8] at time 22:11:22.487684
2018-09-01 22:11:22,488 [salt.state       :1803][INFO    ][3353] Executing state locale.system for [en_US.UTF-8]
2018-09-01 22:11:22,489 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command 'localectl' in directory '/root'
2018-09-01 22:11:23,048 [salt.state       :290 ][INFO    ][3353] System locale en_US.UTF-8 already set
2018-09-01 22:11:23,049 [salt.state       :1941][INFO    ][3353] Completed state [en_US.UTF-8] at time 22:11:23.049843 duration_in_ms=562.159
2018-09-01 22:11:23,051 [salt.state       :1770][INFO    ][3353] Running state [root] at time 22:11:23.051901
2018-09-01 22:11:23,052 [salt.state       :1803][INFO    ][3353] Executing state group.present for [root]
2018-09-01 22:11:23,053 [salt.state       :290 ][INFO    ][3353] Group root is present and up to date
2018-09-01 22:11:23,054 [salt.state       :1941][INFO    ][3353] Completed state [root] at time 22:11:23.054560 duration_in_ms=2.659
2018-09-01 22:11:23,057 [salt.state       :1770][INFO    ][3353] Running state [root] at time 22:11:23.057501
2018-09-01 22:11:23,058 [salt.state       :1803][INFO    ][3353] Executing state user.present for [root]
2018-09-01 22:11:23,062 [salt.state       :290 ][INFO    ][3353] User root is present and up to date
2018-09-01 22:11:23,062 [salt.state       :1941][INFO    ][3353] Completed state [root] at time 22:11:23.062460 duration_in_ms=4.959
2018-09-01 22:11:23,063 [salt.state       :1770][INFO    ][3353] Running state [/root] at time 22:11:23.063499
2018-09-01 22:11:23,064 [salt.state       :1803][INFO    ][3353] Executing state file.directory for [/root]
2018-09-01 22:11:23,065 [salt.state       :290 ][INFO    ][3353] Directory /root is in the correct state
Directory /root updated
2018-09-01 22:11:23,065 [salt.state       :1941][INFO    ][3353] Completed state [/root] at time 22:11:23.065530 duration_in_ms=2.031
2018-09-01 22:11:23,065 [salt.state       :1770][INFO    ][3353] Running state [/etc/sudoers.d/90-salt-user-root] at time 22:11:23.065934
2018-09-01 22:11:23,066 [salt.state       :1803][INFO    ][3353] Executing state file.absent for [/etc/sudoers.d/90-salt-user-root]
2018-09-01 22:11:23,066 [salt.state       :290 ][INFO    ][3353] File /etc/sudoers.d/90-salt-user-root is not present
2018-09-01 22:11:23,067 [salt.state       :1941][INFO    ][3353] Completed state [/etc/sudoers.d/90-salt-user-root] at time 22:11:23.067170 duration_in_ms=1.236
2018-09-01 22:11:23,067 [salt.state       :1770][INFO    ][3353] Running state [ubuntu] at time 22:11:23.067587
2018-09-01 22:11:23,068 [salt.state       :1803][INFO    ][3353] Executing state group.present for [ubuntu]
2018-09-01 22:11:23,068 [salt.state       :290 ][INFO    ][3353] Group ubuntu is present and up to date
2018-09-01 22:11:23,068 [salt.state       :1941][INFO    ][3353] Completed state [ubuntu] at time 22:11:23.068861 duration_in_ms=1.273
2018-09-01 22:11:23,069 [salt.state       :1770][INFO    ][3353] Running state [ubuntu] at time 22:11:23.069753
2018-09-01 22:11:23,070 [salt.state       :1803][INFO    ][3353] Executing state user.present for [ubuntu]
2018-09-01 22:11:23,073 [salt.state       :290 ][INFO    ][3353] {'passwd': 'XXX-REDACTED-XXX'}
2018-09-01 22:11:23,073 [salt.state       :1941][INFO    ][3353] Completed state [ubuntu] at time 22:11:23.073866 duration_in_ms=4.113
2018-09-01 22:11:23,074 [salt.state       :1770][INFO    ][3353] Running state [/home/ubuntu] at time 22:11:23.074804
2018-09-01 22:11:23,075 [salt.state       :1803][INFO    ][3353] Executing state file.directory for [/home/ubuntu]
2018-09-01 22:11:23,076 [salt.state       :290 ][INFO    ][3353] {'mode': '0700'}
2018-09-01 22:11:23,076 [salt.state       :1941][INFO    ][3353] Completed state [/home/ubuntu] at time 22:11:23.076645 duration_in_ms=1.842
2018-09-01 22:11:23,078 [salt.state       :1770][INFO    ][3353] Running state [/etc/sudoers.d/90-salt-user-ubuntu] at time 22:11:23.078093
2018-09-01 22:11:23,078 [salt.state       :1803][INFO    ][3353] Executing state file.managed for [/etc/sudoers.d/90-salt-user-ubuntu]
2018-09-01 22:11:23,101 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'linux/files/sudoer'
2018-09-01 22:11:23,114 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command '/usr/sbin/visudo -c -f /tmp/__salt.tmp.NNPu04' in directory '/root'
2018-09-01 22:11:23,243 [salt.state       :290 ][INFO    ][3353] File changed:
New file
2018-09-01 22:11:23,244 [salt.state       :1941][INFO    ][3353] Completed state [/etc/sudoers.d/90-salt-user-ubuntu] at time 22:11:23.244848 duration_in_ms=166.754
2018-09-01 22:11:23,245 [salt.state       :1770][INFO    ][3353] Running state [/etc/security/limits.d/90-salt-default.conf] at time 22:11:23.245611
2018-09-01 22:11:23,246 [salt.state       :1803][INFO    ][3353] Executing state file.managed for [/etc/security/limits.d/90-salt-default.conf]
2018-09-01 22:11:23,272 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'linux/files/limits.conf'
2018-09-01 22:11:23,338 [salt.state       :290 ][INFO    ][3353] File changed:
New file
2018-09-01 22:11:23,338 [salt.state       :1941][INFO    ][3353] Completed state [/etc/security/limits.d/90-salt-default.conf] at time 22:11:23.338301 duration_in_ms=92.691
2018-09-01 22:11:23,338 [salt.state       :1770][INFO    ][3353] Running state [/etc/systemd/system.conf.d/90-salt.conf] at time 22:11:23.338524
2018-09-01 22:11:23,338 [salt.state       :1803][INFO    ][3353] Executing state file.managed for [/etc/systemd/system.conf.d/90-salt.conf]
2018-09-01 22:11:23,354 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'linux/files/systemd.conf'
2018-09-01 22:11:23,415 [salt.state       :290 ][INFO    ][3353] File changed:
New file
2018-09-01 22:11:23,416 [salt.state       :1941][INFO    ][3353] Completed state [/etc/systemd/system.conf.d/90-salt.conf] at time 22:11:23.416152 duration_in_ms=77.629
2018-09-01 22:11:23,417 [salt.state       :1770][INFO    ][3353] Running state [service.systemctl_reload] at time 22:11:23.417901
2018-09-01 22:11:23,418 [salt.state       :1803][INFO    ][3353] Executing state module.wait for [service.systemctl_reload]
2018-09-01 22:11:23,418 [salt.state       :290 ][INFO    ][3353] No changes made for service.systemctl_reload
2018-09-01 22:11:23,418 [salt.state       :1941][INFO    ][3353] Completed state [service.systemctl_reload] at time 22:11:23.418494 duration_in_ms=0.593
2018-09-01 22:11:23,418 [salt.state       :1770][INFO    ][3353] Running state [service.systemctl_reload] at time 22:11:23.418658
2018-09-01 22:11:23,418 [salt.state       :1803][INFO    ][3353] Executing state module.mod_watch for [service.systemctl_reload]
2018-09-01 22:11:23,419 [salt.utils.decorators:613 ][WARNING ][3353] The function "module.run" is using its deprecated version and will expire in version "Sodium".
2018-09-01 22:11:23,419 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['systemctl', '--system', 'daemon-reload'] in directory '/root'
2018-09-01 22:11:23,535 [salt.state       :290 ][INFO    ][3353] {'ret': True}
2018-09-01 22:11:23,536 [salt.state       :1941][INFO    ][3353] Completed state [service.systemctl_reload] at time 22:11:23.536402 duration_in_ms=117.744
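
NOTE: the two service.systemctl_reload runs above are the watch pattern: module.wait does nothing on its own, but when a watched file changes its mod_watch fires and calls the execution function named by the state ID, here service.systemctl_reload, i.e. 'systemctl --system daemon-reload'. (The "Sodium" warning refers to the legacy module.run invocation style that mod_watch still uses.) A sketch, assuming the watch target is the systemd drop-in managed just before:

/etc/systemd/system.conf.d/90-salt.conf:
  file.managed:
    - source: salt://linux/files/systemd.conf
    - makedirs: True

service.systemctl_reload:
  module.wait:
    - watch:
      - file: /etc/systemd/system.conf.d/90-salt.conf
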
2018-09-01 22:11:23,536 [salt.state       :1770][INFO    ][3353] Running state [/etc/issue] at time 22:11:23.536818
2018-09-01 22:11:23,537 [salt.state       :1803][INFO    ][3353] Executing state file.managed for [/etc/issue]
2018-09-01 22:11:23,556 [salt.state       :290 ][INFO    ][3353] File changed:
--- 
+++ 
@@ -1,2 +1,9 @@
-Ubuntu 16.04.5 LTS \n \l
-
+=================================== WARNING ====================================
+You have accessed a computer managed by OPNFV.
+You are required to have authorisation from OPNFV
+before you proceed and you are strictly limited to use set out within that
+authorisation. Unauthorised access to or misuse of this system is prohibited
+and constitutes an offence under the Computer Misuse Act 1990.
+If you disclose any information obtained through this system without authority
+OPNFV may take legal action against you.
+================================================================================

2018-09-01 22:11:23,557 [salt.state       :1941][INFO    ][3353] Completed state [/etc/issue] at time 22:11:23.557174 duration_in_ms=20.355
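
NOTE: only the resulting diff of /etc/issue is visible; how the banner text is fed to file.managed is not. One plausible sketch, with the pillar key purely hypothetical:

/etc/issue:
  file.managed:
    # hypothetical pillar key holding the login banner text
    - contents_pillar: linux:system:banner
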
2018-09-01 22:11:23,557 [salt.state       :1770][INFO    ][3353] Running state [/etc/hostname] at time 22:11:23.557587
2018-09-01 22:11:23,558 [salt.state       :1803][INFO    ][3353] Executing state file.managed for [/etc/hostname]
2018-09-01 22:11:23,577 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'linux/files/hostname'
2018-09-01 22:11:23,588 [salt.state       :290 ][INFO    ][3353] File changed:
--- 
+++ 
@@ -1 +1 @@
-ubuntu
+prx02

2018-09-01 22:11:23,588 [salt.state       :1941][INFO    ][3353] Completed state [/etc/hostname] at time 22:11:23.588264 duration_in_ms=30.676
2018-09-01 22:11:23,591 [salt.state       :1770][INFO    ][3353] Running state [hostname prx02] at time 22:11:23.591027
2018-09-01 22:11:23,591 [salt.state       :1803][INFO    ][3353] Executing state cmd.run for [hostname prx02]
2018-09-01 22:11:23,592 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command 'test "$(hostname)" = "prx02"' in directory '/root'
2018-09-01 22:11:23,606 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command 'hostname prx02' in directory '/root'
2018-09-01 22:11:23,621 [salt.state       :290 ][INFO    ][3353] {'pid': 5570, 'retcode': 0, 'stderr': '', 'stdout': ''}
2018-09-01 22:11:23,622 [salt.state       :1941][INFO    ][3353] Completed state [hostname prx02] at time 22:11:23.622583 duration_in_ms=31.555
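
NOTE: the hostname change is a two-step pattern: file.managed rewrites /etc/hostname, and a guarded cmd.run applies it to the running system. The 'test "$(hostname)" = "prx02"' probe above is the unless guard; 'hostname prx02' only ran because the test failed. A sketch (template use is an assumption):

/etc/hostname:
  file.managed:
    - source: salt://linux/files/hostname
    - template: jinja    # assumption

hostname prx02:
  cmd.run:
    - unless: test "$(hostname)" = "prx02"
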
2018-09-01 22:11:23,624 [salt.state       :1770][INFO    ][3353] Running state [mdb02] at time 22:11:23.624266
2018-09-01 22:11:23,624 [salt.state       :1803][INFO    ][3353] Executing state host.present for [mdb02]
2018-09-01 22:11:23,626 [salt.state       :290 ][INFO    ][3353] {'host': 'mdb02'}
2018-09-01 22:11:23,626 [salt.state       :1941][INFO    ][3353] Completed state [mdb02] at time 22:11:23.626427 duration_in_ms=2.161
2018-09-01 22:11:23,626 [salt.state       :1770][INFO    ][3353] Running state [mdb02.mcp-ovs-ha.local] at time 22:11:23.626836
2018-09-01 22:11:23,627 [salt.state       :1803][INFO    ][3353] Executing state host.present for [mdb02.mcp-ovs-ha.local]
2018-09-01 22:11:23,771 [salt.state       :290 ][INFO    ][3353] {'host': 'mdb02.mcp-ovs-ha.local'}
2018-09-01 22:11:23,772 [salt.state       :1941][INFO    ][3353] Completed state [mdb02.mcp-ovs-ha.local] at time 22:11:23.772382 duration_in_ms=145.546
2018-09-01 22:11:23,773 [salt.state       :1770][INFO    ][3353] Running state [mdb03] at time 22:11:23.773069
2018-09-01 22:11:23,773 [salt.state       :1803][INFO    ][3353] Executing state host.present for [mdb03]
2018-09-01 22:11:23,776 [salt.state       :290 ][INFO    ][3353] {'host': 'mdb03'}
2018-09-01 22:11:23,777 [salt.state       :1941][INFO    ][3353] Completed state [mdb03] at time 22:11:23.777349 duration_in_ms=4.28
2018-09-01 22:11:23,778 [salt.state       :1770][INFO    ][3353] Running state [mdb03.mcp-ovs-ha.local] at time 22:11:23.778006
2018-09-01 22:11:23,778 [salt.state       :1803][INFO    ][3353] Executing state host.present for [mdb03.mcp-ovs-ha.local]
2018-09-01 22:11:23,782 [salt.state       :290 ][INFO    ][3353] {'host': 'mdb03.mcp-ovs-ha.local'}
2018-09-01 22:11:23,783 [salt.state       :1941][INFO    ][3353] Completed state [mdb03.mcp-ovs-ha.local] at time 22:11:23.783347 duration_in_ms=5.341
2018-09-01 22:11:23,784 [salt.state       :1770][INFO    ][3353] Running state [mdb01] at time 22:11:23.783992
2018-09-01 22:11:23,784 [salt.state       :1803][INFO    ][3353] Executing state host.present for [mdb01]
2018-09-01 22:11:23,788 [salt.state       :290 ][INFO    ][3353] {'host': 'mdb01'}
2018-09-01 22:11:23,789 [salt.state       :1941][INFO    ][3353] Completed state [mdb01] at time 22:11:23.789392 duration_in_ms=5.399
2018-09-01 22:11:23,790 [salt.state       :1770][INFO    ][3353] Running state [mdb01.mcp-ovs-ha.local] at time 22:11:23.790044
2018-09-01 22:11:23,790 [salt.state       :1803][INFO    ][3353] Executing state host.present for [mdb01.mcp-ovs-ha.local]
2018-09-01 22:11:23,794 [salt.state       :290 ][INFO    ][3353] {'host': 'mdb01.mcp-ovs-ha.local'}
2018-09-01 22:11:23,795 [salt.state       :1941][INFO    ][3353] Completed state [mdb01.mcp-ovs-ha.local] at time 22:11:23.795366 duration_in_ms=5.322
2018-09-01 22:11:23,796 [salt.state       :1770][INFO    ][3353] Running state [mdb] at time 22:11:23.796055
2018-09-01 22:11:23,796 [salt.state       :1803][INFO    ][3353] Executing state host.present for [mdb]
2018-09-01 22:11:23,800 [salt.state       :290 ][INFO    ][3353] {'host': 'mdb'}
2018-09-01 22:11:23,801 [salt.state       :1941][INFO    ][3353] Completed state [mdb] at time 22:11:23.801409 duration_in_ms=5.355
2018-09-01 22:11:23,802 [salt.state       :1770][INFO    ][3353] Running state [mdb.mcp-ovs-ha.local] at time 22:11:23.802120
2018-09-01 22:11:23,802 [salt.state       :1803][INFO    ][3353] Executing state host.present for [mdb.mcp-ovs-ha.local]
2018-09-01 22:11:23,808 [salt.state       :290 ][INFO    ][3353] {'host': 'mdb.mcp-ovs-ha.local'}
2018-09-01 22:11:23,808 [salt.state       :1941][INFO    ][3353] Completed state [mdb.mcp-ovs-ha.local] at time 22:11:23.808427 duration_in_ms=6.307
2018-09-01 22:11:23,809 [salt.state       :1770][INFO    ][3353] Running state [cfg01] at time 22:11:23.809133
2018-09-01 22:11:23,809 [salt.state       :1803][INFO    ][3353] Executing state host.present for [cfg01]
2018-09-01 22:11:23,812 [salt.state       :290 ][INFO    ][3353] {'host': 'cfg01'}
2018-09-01 22:11:23,812 [salt.state       :1941][INFO    ][3353] Completed state [cfg01] at time 22:11:23.812905 duration_in_ms=3.773
2018-09-01 22:11:23,813 [salt.state       :1770][INFO    ][3353] Running state [cfg01.mcp-ovs-ha.local] at time 22:11:23.813259
2018-09-01 22:11:23,813 [salt.state       :1803][INFO    ][3353] Executing state host.present for [cfg01.mcp-ovs-ha.local]
2018-09-01 22:11:23,818 [salt.state       :290 ][INFO    ][3353] {'host': 'cfg01.mcp-ovs-ha.local'}
2018-09-01 22:11:23,819 [salt.state       :1941][INFO    ][3353] Completed state [cfg01.mcp-ovs-ha.local] at time 22:11:23.818975 duration_in_ms=5.716
2018-09-01 22:11:23,819 [salt.state       :1770][INFO    ][3353] Running state [prx01] at time 22:11:23.819292
2018-09-01 22:11:23,819 [salt.state       :1803][INFO    ][3353] Executing state host.present for [prx01]
2018-09-01 22:11:23,824 [salt.state       :290 ][INFO    ][3353] {'host': 'prx01'}
2018-09-01 22:11:23,825 [salt.state       :1941][INFO    ][3353] Completed state [prx01] at time 22:11:23.825079 duration_in_ms=5.787
2018-09-01 22:11:23,825 [salt.state       :1770][INFO    ][3353] Running state [prx01.mcp-ovs-ha.local] at time 22:11:23.825415
2018-09-01 22:11:23,825 [salt.state       :1803][INFO    ][3353] Executing state host.present for [prx01.mcp-ovs-ha.local]
2018-09-01 22:11:23,830 [salt.state       :290 ][INFO    ][3353] {'host': 'prx01.mcp-ovs-ha.local'}
2018-09-01 22:11:23,831 [salt.state       :1941][INFO    ][3353] Completed state [prx01.mcp-ovs-ha.local] at time 22:11:23.831148 duration_in_ms=5.733
2018-09-01 22:11:23,831 [salt.state       :1770][INFO    ][3353] Running state [kvm01] at time 22:11:23.831568
2018-09-01 22:11:23,831 [salt.state       :1803][INFO    ][3353] Executing state host.present for [kvm01]
2018-09-01 22:11:23,855 [salt.state       :290 ][INFO    ][3353] {'host': 'kvm01'}
2018-09-01 22:11:23,855 [salt.state       :1941][INFO    ][3353] Completed state [kvm01] at time 22:11:23.855432 duration_in_ms=23.863
2018-09-01 22:11:23,856 [salt.state       :1770][INFO    ][3353] Running state [kvm01.mcp-ovs-ha.local] at time 22:11:23.856080
2018-09-01 22:11:23,856 [salt.state       :1803][INFO    ][3353] Executing state host.present for [kvm01.mcp-ovs-ha.local]
2018-09-01 22:11:23,860 [salt.state       :290 ][INFO    ][3353] {'host': 'kvm01.mcp-ovs-ha.local'}
2018-09-01 22:11:23,861 [salt.state       :1941][INFO    ][3353] Completed state [kvm01.mcp-ovs-ha.local] at time 22:11:23.861376 duration_in_ms=5.296
2018-09-01 22:11:23,862 [salt.state       :1770][INFO    ][3353] Running state [kvm03] at time 22:11:23.862013
2018-09-01 22:11:23,862 [salt.state       :1803][INFO    ][3353] Executing state host.present for [kvm03]
2018-09-01 22:11:23,866 [salt.state       :290 ][INFO    ][3353] {'host': 'kvm03'}
2018-09-01 22:11:23,867 [salt.state       :1941][INFO    ][3353] Completed state [kvm03] at time 22:11:23.867394 duration_in_ms=5.381
2018-09-01 22:11:23,868 [salt.state       :1770][INFO    ][3353] Running state [kvm03.mcp-ovs-ha.local] at time 22:11:23.868027
2018-09-01 22:11:23,868 [salt.state       :1803][INFO    ][3353] Executing state host.present for [kvm03.mcp-ovs-ha.local]
2018-09-01 22:11:23,872 [salt.state       :290 ][INFO    ][3353] {'host': 'kvm03.mcp-ovs-ha.local'}
2018-09-01 22:11:23,873 [salt.state       :1941][INFO    ][3353] Completed state [kvm03.mcp-ovs-ha.local] at time 22:11:23.873247 duration_in_ms=5.221
2018-09-01 22:11:23,873 [salt.state       :1770][INFO    ][3353] Running state [kvm02] at time 22:11:23.873730
2018-09-01 22:11:23,874 [salt.state       :1803][INFO    ][3353] Executing state host.present for [kvm02]
2018-09-01 22:11:23,879 [salt.state       :290 ][INFO    ][3353] {'host': 'kvm02'}
2018-09-01 22:11:23,879 [salt.state       :1941][INFO    ][3353] Completed state [kvm02] at time 22:11:23.879411 duration_in_ms=5.68
2018-09-01 22:11:23,880 [salt.state       :1770][INFO    ][3353] Running state [kvm02.mcp-ovs-ha.local] at time 22:11:23.880088
2018-09-01 22:11:23,880 [salt.state       :1803][INFO    ][3353] Executing state host.present for [kvm02.mcp-ovs-ha.local]
2018-09-01 22:11:23,884 [salt.state       :290 ][INFO    ][3353] {'host': 'kvm02.mcp-ovs-ha.local'}
2018-09-01 22:11:23,885 [salt.state       :1941][INFO    ][3353] Completed state [kvm02.mcp-ovs-ha.local] at time 22:11:23.885441 duration_in_ms=5.353
2018-09-01 22:11:23,886 [salt.state       :1770][INFO    ][3353] Running state [dbs] at time 22:11:23.886090
2018-09-01 22:11:23,886 [salt.state       :1803][INFO    ][3353] Executing state host.present for [dbs]
2018-09-01 22:11:23,890 [salt.state       :290 ][INFO    ][3353] {'host': 'dbs'}
2018-09-01 22:11:23,890 [salt.state       :1941][INFO    ][3353] Completed state [dbs] at time 22:11:23.890833 duration_in_ms=4.743
2018-09-01 22:11:23,891 [salt.state       :1770][INFO    ][3353] Running state [dbs.mcp-ovs-ha.local] at time 22:11:23.891514
2018-09-01 22:11:23,892 [salt.state       :1803][INFO    ][3353] Executing state host.present for [dbs.mcp-ovs-ha.local]
2018-09-01 22:11:23,896 [salt.state       :290 ][INFO    ][3353] {'host': 'dbs.mcp-ovs-ha.local'}
2018-09-01 22:11:23,896 [salt.state       :1941][INFO    ][3353] Completed state [dbs.mcp-ovs-ha.local] at time 22:11:23.896877 duration_in_ms=5.363
2018-09-01 22:11:23,897 [salt.state       :1770][INFO    ][3353] Running state [prx] at time 22:11:23.897626
2018-09-01 22:11:23,898 [salt.state       :1803][INFO    ][3353] Executing state host.present for [prx]
2018-09-01 22:11:23,902 [salt.state       :290 ][INFO    ][3353] {'host': 'prx'}
2018-09-01 22:11:23,902 [salt.state       :1941][INFO    ][3353] Completed state [prx] at time 22:11:23.902845 duration_in_ms=5.219
2018-09-01 22:11:23,903 [salt.state       :1770][INFO    ][3353] Running state [prx.mcp-ovs-ha.local] at time 22:11:23.903528
2018-09-01 22:11:23,904 [salt.state       :1803][INFO    ][3353] Executing state host.present for [prx.mcp-ovs-ha.local]
2018-09-01 22:11:23,908 [salt.state       :290 ][INFO    ][3353] {'host': 'prx.mcp-ovs-ha.local'}
2018-09-01 22:11:23,908 [salt.state       :1941][INFO    ][3353] Completed state [prx.mcp-ovs-ha.local] at time 22:11:23.908783 duration_in_ms=5.255
2018-09-01 22:11:23,909 [salt.state       :1770][INFO    ][3353] Running state [prx02] at time 22:11:23.909444
2018-09-01 22:11:23,909 [salt.state       :1803][INFO    ][3353] Executing state host.present for [prx02]
2018-09-01 22:11:23,914 [salt.state       :290 ][INFO    ][3353] {'host': 'prx02'}
2018-09-01 22:11:23,914 [salt.state       :1941][INFO    ][3353] Completed state [prx02] at time 22:11:23.914859 duration_in_ms=5.415
2018-09-01 22:11:23,915 [salt.state       :1770][INFO    ][3353] Running state [prx02.mcp-ovs-ha.local] at time 22:11:23.915568
2018-09-01 22:11:23,916 [salt.state       :1803][INFO    ][3353] Executing state host.present for [prx02.mcp-ovs-ha.local]
2018-09-01 22:11:24,165 [salt.state       :290 ][INFO    ][3353] {'host': 'prx02.mcp-ovs-ha.local'}
2018-09-01 22:11:24,166 [salt.state       :1941][INFO    ][3353] Completed state [prx02.mcp-ovs-ha.local] at time 22:11:24.166379 duration_in_ms=250.811
2018-09-01 22:11:24,167 [salt.state       :1770][INFO    ][3353] Running state [msg02] at time 22:11:24.167058
2018-09-01 22:11:24,167 [salt.state       :1803][INFO    ][3353] Executing state host.present for [msg02]
2018-09-01 22:11:24,172 [salt.state       :290 ][INFO    ][3353] {'host': 'msg02'}
2018-09-01 22:11:24,172 [salt.state       :1941][INFO    ][3353] Completed state [msg02] at time 22:11:24.172391 duration_in_ms=5.333
2018-09-01 22:11:24,173 [salt.state       :1770][INFO    ][3353] Running state [msg02.mcp-ovs-ha.local] at time 22:11:24.173049
2018-09-01 22:11:24,173 [salt.state       :1803][INFO    ][3353] Executing state host.present for [msg02.mcp-ovs-ha.local]
2018-09-01 22:11:24,178 [salt.state       :290 ][INFO    ][3353] {'host': 'msg02.mcp-ovs-ha.local'}
2018-09-01 22:11:24,178 [salt.state       :1941][INFO    ][3353] Completed state [msg02.mcp-ovs-ha.local] at time 22:11:24.178393 duration_in_ms=5.345
2018-09-01 22:11:24,179 [salt.state       :1770][INFO    ][3353] Running state [msg03] at time 22:11:24.179049
2018-09-01 22:11:24,179 [salt.state       :1803][INFO    ][3353] Executing state host.present for [msg03]
2018-09-01 22:11:24,184 [salt.state       :290 ][INFO    ][3353] {'host': 'msg03'}
2018-09-01 22:11:24,184 [salt.state       :1941][INFO    ][3353] Completed state [msg03] at time 22:11:24.184429 duration_in_ms=5.38
2018-09-01 22:11:24,185 [salt.state       :1770][INFO    ][3353] Running state [msg03.mcp-ovs-ha.local] at time 22:11:24.185089
2018-09-01 22:11:24,185 [salt.state       :1803][INFO    ][3353] Executing state host.present for [msg03.mcp-ovs-ha.local]
2018-09-01 22:11:24,190 [salt.state       :290 ][INFO    ][3353] {'host': 'msg03.mcp-ovs-ha.local'}
2018-09-01 22:11:24,190 [salt.state       :1941][INFO    ][3353] Completed state [msg03.mcp-ovs-ha.local] at time 22:11:24.190449 duration_in_ms=5.36
2018-09-01 22:11:24,191 [salt.state       :1770][INFO    ][3353] Running state [msg01] at time 22:11:24.191136
2018-09-01 22:11:24,191 [salt.state       :1803][INFO    ][3353] Executing state host.present for [msg01]
2018-09-01 22:11:24,195 [salt.state       :290 ][INFO    ][3353] {'host': 'msg01'}
2018-09-01 22:11:24,196 [salt.state       :1941][INFO    ][3353] Completed state [msg01] at time 22:11:24.196386 duration_in_ms=5.25
2018-09-01 22:11:24,197 [salt.state       :1770][INFO    ][3353] Running state [msg01.mcp-ovs-ha.local] at time 22:11:24.197054
2018-09-01 22:11:24,197 [salt.state       :1803][INFO    ][3353] Executing state host.present for [msg01.mcp-ovs-ha.local]
2018-09-01 22:11:24,202 [salt.state       :290 ][INFO    ][3353] {'host': 'msg01.mcp-ovs-ha.local'}
2018-09-01 22:11:24,202 [salt.state       :1941][INFO    ][3353] Completed state [msg01.mcp-ovs-ha.local] at time 22:11:24.202437 duration_in_ms=5.383
2018-09-01 22:11:24,203 [salt.state       :1770][INFO    ][3353] Running state [msg] at time 22:11:24.203108
2018-09-01 22:11:24,203 [salt.state       :1803][INFO    ][3353] Executing state host.present for [msg]
2018-09-01 22:11:24,208 [salt.state       :290 ][INFO    ][3353] {'host': 'msg'}
2018-09-01 22:11:24,208 [salt.state       :1941][INFO    ][3353] Completed state [msg] at time 22:11:24.208462 duration_in_ms=5.354
2018-09-01 22:11:24,209 [salt.state       :1770][INFO    ][3353] Running state [msg.mcp-ovs-ha.local] at time 22:11:24.209208
2018-09-01 22:11:24,209 [salt.state       :1803][INFO    ][3353] Executing state host.present for [msg.mcp-ovs-ha.local]
2018-09-01 22:11:24,214 [salt.state       :290 ][INFO    ][3353] {'host': 'msg.mcp-ovs-ha.local'}
2018-09-01 22:11:24,214 [salt.state       :1941][INFO    ][3353] Completed state [msg.mcp-ovs-ha.local] at time 22:11:24.214505 duration_in_ms=5.297
2018-09-01 22:11:24,215 [salt.state       :1770][INFO    ][3353] Running state [cfg01] at time 22:11:24.215243
2018-09-01 22:11:24,215 [salt.state       :1803][INFO    ][3353] Executing state host.present for [cfg01]
2018-09-01 22:11:24,216 [salt.state       :290 ][INFO    ][3353] Host cfg01 (10.167.4.11) already present
2018-09-01 22:11:24,217 [salt.state       :1941][INFO    ][3353] Completed state [cfg01] at time 22:11:24.217219 duration_in_ms=1.977
2018-09-01 22:11:24,218 [salt.state       :1770][INFO    ][3353] Running state [cfg01.mcp-ovs-ha.local] at time 22:11:24.217933
2018-09-01 22:11:24,218 [salt.state       :1803][INFO    ][3353] Executing state host.present for [cfg01.mcp-ovs-ha.local]
2018-09-01 22:11:24,219 [salt.state       :290 ][INFO    ][3353] Host cfg01.mcp-ovs-ha.local (10.167.4.11) already present
2018-09-01 22:11:24,219 [salt.state       :1941][INFO    ][3353] Completed state [cfg01.mcp-ovs-ha.local] at time 22:11:24.219871 duration_in_ms=1.938
2018-09-01 22:11:24,220 [salt.state       :1770][INFO    ][3353] Running state [cmp002] at time 22:11:24.220559
2018-09-01 22:11:24,221 [salt.state       :1803][INFO    ][3353] Executing state host.present for [cmp002]
2018-09-01 22:11:24,223 [salt.state       :290 ][INFO    ][3353] {'host': 'cmp002'}
2018-09-01 22:11:24,224 [salt.state       :1941][INFO    ][3353] Completed state [cmp002] at time 22:11:24.223975 duration_in_ms=3.416
2018-09-01 22:11:24,224 [salt.state       :1770][INFO    ][3353] Running state [cmp002.mcp-ovs-ha.local] at time 22:11:24.224679
2018-09-01 22:11:24,225 [salt.state       :1803][INFO    ][3353] Executing state host.present for [cmp002.mcp-ovs-ha.local]
2018-09-01 22:11:24,227 [salt.state       :290 ][INFO    ][3353] {'host': 'cmp002.mcp-ovs-ha.local'}
2018-09-01 22:11:24,227 [salt.state       :1941][INFO    ][3353] Completed state [cmp002.mcp-ovs-ha.local] at time 22:11:24.227729 duration_in_ms=3.051
2018-09-01 22:11:24,228 [salt.state       :1770][INFO    ][3353] Running state [cmp001] at time 22:11:24.228056
2018-09-01 22:11:24,228 [salt.state       :1803][INFO    ][3353] Executing state host.present for [cmp001]
2018-09-01 22:11:24,364 [salt.state       :290 ][INFO    ][3353] {'host': 'cmp001'}
2018-09-01 22:11:24,364 [salt.state       :1941][INFO    ][3353] Completed state [cmp001] at time 22:11:24.364404 duration_in_ms=136.347
2018-09-01 22:11:24,364 [salt.state       :1770][INFO    ][3353] Running state [cmp001.mcp-ovs-ha.local] at time 22:11:24.364929
2018-09-01 22:11:24,365 [salt.state       :1803][INFO    ][3353] Executing state host.present for [cmp001.mcp-ovs-ha.local]
2018-09-01 22:11:24,369 [salt.state       :290 ][INFO    ][3353] {'host': 'cmp001.mcp-ovs-ha.local'}
2018-09-01 22:11:24,370 [salt.state       :1941][INFO    ][3353] Completed state [cmp001.mcp-ovs-ha.local] at time 22:11:24.370299 duration_in_ms=5.369
2018-09-01 22:11:24,370 [salt.state       :1770][INFO    ][3353] Running state [dbs01] at time 22:11:24.370879
2018-09-01 22:11:24,371 [salt.state       :1803][INFO    ][3353] Executing state host.present for [dbs01]
2018-09-01 22:11:24,376 [salt.state       :290 ][INFO    ][3353] {'host': 'dbs01'}
2018-09-01 22:11:24,376 [salt.state       :1941][INFO    ][3353] Completed state [dbs01] at time 22:11:24.376403 duration_in_ms=5.524
2018-09-01 22:11:24,377 [salt.state       :1770][INFO    ][3353] Running state [dbs01.mcp-ovs-ha.local] at time 22:11:24.376998
2018-09-01 22:11:24,377 [salt.state       :1803][INFO    ][3353] Executing state host.present for [dbs01.mcp-ovs-ha.local]
2018-09-01 22:11:24,381 [salt.state       :290 ][INFO    ][3353] {'host': 'dbs01.mcp-ovs-ha.local'}
2018-09-01 22:11:24,382 [salt.state       :1941][INFO    ][3353] Completed state [dbs01.mcp-ovs-ha.local] at time 22:11:24.382121 duration_in_ms=5.123
2018-09-01 22:11:24,382 [salt.state       :1770][INFO    ][3353] Running state [dbs02] at time 22:11:24.382538
2018-09-01 22:11:24,382 [salt.state       :1803][INFO    ][3353] Executing state host.present for [dbs02]
2018-09-01 22:11:24,388 [salt.state       :290 ][INFO    ][3353] {'host': 'dbs02'}
2018-09-01 22:11:24,388 [salt.state       :1941][INFO    ][3353] Completed state [dbs02] at time 22:11:24.388474 duration_in_ms=5.935
2018-09-01 22:11:24,389 [salt.state       :1770][INFO    ][3353] Running state [dbs02.mcp-ovs-ha.local] at time 22:11:24.389090
2018-09-01 22:11:24,389 [salt.state       :1803][INFO    ][3353] Executing state host.present for [dbs02.mcp-ovs-ha.local]
2018-09-01 22:11:24,394 [salt.state       :290 ][INFO    ][3353] {'host': 'dbs02.mcp-ovs-ha.local'}
2018-09-01 22:11:24,394 [salt.state       :1941][INFO    ][3353] Completed state [dbs02.mcp-ovs-ha.local] at time 22:11:24.394637 duration_in_ms=5.547
2018-09-01 22:11:24,395 [salt.state       :1770][INFO    ][3353] Running state [dbs03] at time 22:11:24.395439
2018-09-01 22:11:24,396 [salt.state       :1803][INFO    ][3353] Executing state host.present for [dbs03]
2018-09-01 22:11:24,400 [salt.state       :290 ][INFO    ][3353] {'host': 'dbs03'}
2018-09-01 22:11:24,400 [salt.state       :1941][INFO    ][3353] Completed state [dbs03] at time 22:11:24.400584 duration_in_ms=5.145
2018-09-01 22:11:24,401 [salt.state       :1770][INFO    ][3353] Running state [dbs03.mcp-ovs-ha.local] at time 22:11:24.401366
2018-09-01 22:11:24,401 [salt.state       :1803][INFO    ][3353] Executing state host.present for [dbs03.mcp-ovs-ha.local]
2018-09-01 22:11:24,406 [salt.state       :290 ][INFO    ][3353] {'host': 'dbs03.mcp-ovs-ha.local'}
2018-09-01 22:11:24,406 [salt.state       :1941][INFO    ][3353] Completed state [dbs03.mcp-ovs-ha.local] at time 22:11:24.406411 duration_in_ms=5.045
2018-09-01 22:11:24,407 [salt.state       :1770][INFO    ][3353] Running state [mas01] at time 22:11:24.407105
2018-09-01 22:11:24,407 [salt.state       :1803][INFO    ][3353] Executing state host.present for [mas01]
2018-09-01 22:11:24,411 [salt.state       :290 ][INFO    ][3353] {'host': 'mas01'}
2018-09-01 22:11:24,412 [salt.state       :1941][INFO    ][3353] Completed state [mas01] at time 22:11:24.412127 duration_in_ms=5.022
2018-09-01 22:11:24,412 [salt.state       :1770][INFO    ][3353] Running state [mas01.mcp-ovs-ha.local] at time 22:11:24.412620
2018-09-01 22:11:24,412 [salt.state       :1803][INFO    ][3353] Executing state host.present for [mas01.mcp-ovs-ha.local]
2018-09-01 22:11:24,417 [salt.state       :290 ][INFO    ][3353] {'host': 'mas01.mcp-ovs-ha.local'}
2018-09-01 22:11:24,418 [salt.state       :1941][INFO    ][3353] Completed state [mas01.mcp-ovs-ha.local] at time 22:11:24.418139 duration_in_ms=5.519
2018-09-01 22:11:24,418 [salt.state       :1770][INFO    ][3353] Running state [ctl02] at time 22:11:24.418641
2018-09-01 22:11:24,418 [salt.state       :1803][INFO    ][3353] Executing state host.present for [ctl02]
2018-09-01 22:11:24,423 [salt.state       :290 ][INFO    ][3353] {'host': 'ctl02'}
2018-09-01 22:11:24,424 [salt.state       :1941][INFO    ][3353] Completed state [ctl02] at time 22:11:24.424105 duration_in_ms=5.464
2018-09-01 22:11:24,424 [salt.state       :1770][INFO    ][3353] Running state [ctl02.mcp-ovs-ha.local] at time 22:11:24.424567
2018-09-01 22:11:24,424 [salt.state       :1803][INFO    ][3353] Executing state host.present for [ctl02.mcp-ovs-ha.local]
2018-09-01 22:11:24,478 [salt.state       :290 ][INFO    ][3353] {'host': 'ctl02.mcp-ovs-ha.local'}
2018-09-01 22:11:24,478 [salt.state       :1941][INFO    ][3353] Completed state [ctl02.mcp-ovs-ha.local] at time 22:11:24.478571 duration_in_ms=54.003
2018-09-01 22:11:24,479 [salt.state       :1770][INFO    ][3353] Running state [ctl03] at time 22:11:24.479085
2018-09-01 22:11:24,479 [salt.state       :1803][INFO    ][3353] Executing state host.present for [ctl03]
2018-09-01 22:11:24,483 [salt.state       :290 ][INFO    ][3353] {'host': 'ctl03'}
2018-09-01 22:11:24,484 [salt.state       :1941][INFO    ][3353] Completed state [ctl03] at time 22:11:24.484052 duration_in_ms=4.967
2018-09-01 22:11:24,484 [salt.state       :1770][INFO    ][3353] Running state [ctl03.mcp-ovs-ha.local] at time 22:11:24.484496
2018-09-01 22:11:24,484 [salt.state       :1803][INFO    ][3353] Executing state host.present for [ctl03.mcp-ovs-ha.local]
2018-09-01 22:11:24,490 [salt.state       :290 ][INFO    ][3353] {'host': 'ctl03.mcp-ovs-ha.local'}
2018-09-01 22:11:24,490 [salt.state       :1941][INFO    ][3353] Completed state [ctl03.mcp-ovs-ha.local] at time 22:11:24.490458 duration_in_ms=5.961
2018-09-01 22:11:24,491 [salt.state       :1770][INFO    ][3353] Running state [ctl01] at time 22:11:24.491193
2018-09-01 22:11:24,491 [salt.state       :1803][INFO    ][3353] Executing state host.present for [ctl01]
2018-09-01 22:11:24,496 [salt.state       :290 ][INFO    ][3353] {'host': 'ctl01'}
2018-09-01 22:11:24,496 [salt.state       :1941][INFO    ][3353] Completed state [ctl01] at time 22:11:24.496319 duration_in_ms=5.126
2018-09-01 22:11:24,496 [salt.state       :1770][INFO    ][3353] Running state [ctl01.mcp-ovs-ha.local] at time 22:11:24.496884
2018-09-01 22:11:24,497 [salt.state       :1803][INFO    ][3353] Executing state host.present for [ctl01.mcp-ovs-ha.local]
2018-09-01 22:11:24,501 [salt.state       :290 ][INFO    ][3353] {'host': 'ctl01.mcp-ovs-ha.local'}
2018-09-01 22:11:24,502 [salt.state       :1941][INFO    ][3353] Completed state [ctl01.mcp-ovs-ha.local] at time 22:11:24.502036 duration_in_ms=5.153
2018-09-01 22:11:24,502 [salt.state       :1770][INFO    ][3353] Running state [ctl] at time 22:11:24.502364
2018-09-01 22:11:24,502 [salt.state       :1803][INFO    ][3353] Executing state host.present for [ctl]
2018-09-01 22:11:24,504 [salt.state       :290 ][INFO    ][3353] {'host': 'ctl'}
2018-09-01 22:11:24,504 [salt.state       :1941][INFO    ][3353] Completed state [ctl] at time 22:11:24.504490 duration_in_ms=2.125
2018-09-01 22:11:24,504 [salt.state       :1770][INFO    ][3353] Running state [ctl.mcp-ovs-ha.local] at time 22:11:24.504943
2018-09-01 22:11:24,505 [salt.state       :1803][INFO    ][3353] Executing state host.present for [ctl.mcp-ovs-ha.local]
2018-09-01 22:11:24,510 [salt.state       :290 ][INFO    ][3353] {'host': 'ctl.mcp-ovs-ha.local'}
2018-09-01 22:11:24,511 [salt.state       :1941][INFO    ][3353] Completed state [ctl.mcp-ovs-ha.local] at time 22:11:24.511021 duration_in_ms=6.078
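
NOTE: every short name and FQDN above is its own host.present state, pinning one /etc/hosts entry per name. The only address visible in this log is cfg01's (10.167.4.11, reported "already present" at 22:11:24,216); a sketch for that pair:

cfg01:
  host.present:
    - ip: 10.167.4.11

cfg01.mcp-ovs-ha.local:
  host.present:
    - ip: 10.167.4.11
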
2018-09-01 22:11:24,511 [salt.state       :1770][INFO    ][3353] Running state [/etc/network/interfaces.d/50-cloud-init.cfg] at time 22:11:24.511547
2018-09-01 22:11:24,512 [salt.state       :1803][INFO    ][3353] Executing state file.absent for [/etc/network/interfaces.d/50-cloud-init.cfg]
2018-09-01 22:11:24,512 [salt.state       :290 ][INFO    ][3353] {'removed': '/etc/network/interfaces.d/50-cloud-init.cfg'}
2018-09-01 22:11:24,512 [salt.state       :1941][INFO    ][3353] Completed state [/etc/network/interfaces.d/50-cloud-init.cfg] at time 22:11:24.512753 duration_in_ms=1.206
2018-09-01 22:11:24,514 [salt.state       :1770][INFO    ][3353] Running state [ens3] at time 22:11:24.514690
2018-09-01 22:11:24,515 [salt.state       :1803][INFO    ][3353] Executing state network.managed for [ens3]
2018-09-01 22:11:24,631 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['ifup', 'ens3'] in directory '/root'
2018-09-01 22:11:24,647 [salt.loaded.int.module.cmdmod:722 ][ERROR   ][3353] Command '['ifup', 'ens3']' failed with return code: 1
2018-09-01 22:11:24,648 [salt.loaded.int.module.cmdmod:724 ][ERROR   ][3353] stdout: RTNETLINK answers: File exists
Failed to bring up ens3.
2018-09-01 22:11:24,648 [salt.loaded.int.module.cmdmod:728 ][ERROR   ][3353] retcode: 1
2018-09-01 22:11:25,188 [salt.state       :290 ][INFO    ][3353] {'interface': 'Added network interface.', 'status': 'Interface ens3 is up'}
2018-09-01 22:11:25,280 [salt.state       :1941][INFO    ][3353] Completed state [ens3] at time 22:11:25.280860 duration_in_ms=766.169
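
NOTE: network.managed writes the interface stanza and brings the interface up via ifup; here ifup exits 1 with "RTNETLINK answers: File exists" because the address is already configured on the live interface, yet the state still verifies ens3 as up and succeeds. A sketch, with the cloud-init cleanup taken from the file.absent run above and the ens3 addressing purely hypothetical (it is not visible in this log):

/etc/network/interfaces.d/50-cloud-init.cfg:
  file.absent

ens3:
  network.managed:
    - enabled: True
    - type: eth
    - proto: dhcp    # hypothetical; the real proto/address is not shown above
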
2018-09-01 22:11:25,281 [salt.state       :1770][INFO    ][3353] Running state [linux_system_network] at time 22:11:25.281509
2018-09-01 22:11:25,281 [salt.state       :1803][INFO    ][3353] Executing state network.system for [linux_system_network]
2018-09-01 22:11:25,282 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['systemctl', 'status', 'networking.service', '-n', '0'] in directory '/root'
2018-09-01 22:11:25,298 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['systemctl', 'is-active', 'networking.service'] in directory '/root'
2018-09-01 22:11:25,599 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['systemctl', 'status', 'NetworkManager.service', '-n', '0'] in directory '/root'
2018-09-01 22:11:25,616 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['systemd-run', '--scope', 'systemctl', 'enable', 'networking.service'] in directory '/root'
2018-09-01 22:11:26,176 [salt.loaded.int.module.debian_ip:1970][WARNING ][3353] The network state sls is requiring a reboot of the system to properly apply network configuration.
2018-09-01 22:11:26,177 [salt.state       :290 ][INFO    ][3353] {'network_settings': u'--- \n+++ \n@@ -1,2 +1,4 @@\n NETWORKING=yes\n\n HOSTNAME=prx02\n\n+DOMAIN=mcp-ovs-ha.local\n\n+SEARCH=maas\n'}
2018-09-01 22:11:26,178 [salt.state       :1941][INFO    ][3353] Completed state [linux_system_network] at time 22:11:26.178552 duration_in_ms=897.043
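
NOTE: network.system manages the global network settings file; the diff above shows DOMAIN=mcp-ovs-ha.local and SEARCH=maas being added, and the debian_ip warning flags that a reboot is needed for the domain change to fully apply. A sketch matching that diff:

linux_system_network:
  network.system:
    - enabled: True
    - hostname: prx02
    - domainname: mcp-ovs-ha.local
    - searchdomain: maas
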
2018-09-01 22:11:26,179 [salt.state       :1770][INFO    ][3353] Running state [ens2] at time 22:11:26.179521
2018-09-01 22:11:26,180 [salt.state       :1803][INFO    ][3353] Executing state network.managed for [ens2]
2018-09-01 22:11:26,203 [salt.state       :290 ][INFO    ][3353] {'interface': 'Added network interface.'}
2018-09-01 22:11:26,204 [salt.state       :1941][INFO    ][3353] Completed state [ens2] at time 22:11:26.204061 duration_in_ms=24.539
2018-09-01 22:11:26,204 [salt.state       :1770][INFO    ][3353] Running state [ens4] at time 22:11:26.204562
2018-09-01 22:11:26,205 [salt.state       :1803][INFO    ][3353] Executing state network.managed for [ens4]
2018-09-01 22:11:26,254 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['ifup', 'ens4'] in directory '/root'
2018-09-01 22:11:27,076 [salt.state       :290 ][INFO    ][3353] {'interface': 'Added network interface.', 'status': 'Interface ens4 is up'}
2018-09-01 22:11:27,076 [salt.state       :1941][INFO    ][3353] Completed state [ens4] at time 22:11:27.076770 duration_in_ms=872.207
2018-09-01 22:11:27,077 [salt.state       :1770][INFO    ][3353] Running state [/etc/profile.d/proxy.sh] at time 22:11:27.077254
2018-09-01 22:11:27,077 [salt.state       :1803][INFO    ][3353] Executing state file.absent for [/etc/profile.d/proxy.sh]
2018-09-01 22:11:27,078 [salt.state       :290 ][INFO    ][3353] File /etc/profile.d/proxy.sh is not present
2018-09-01 22:11:27,078 [salt.state       :1941][INFO    ][3353] Completed state [/etc/profile.d/proxy.sh] at time 22:11:27.078607 duration_in_ms=1.353
2018-09-01 22:11:27,079 [salt.state       :1770][INFO    ][3353] Running state [/etc/apt/apt.conf.d/95proxies] at time 22:11:27.078987
2018-09-01 22:11:27,079 [salt.state       :1803][INFO    ][3353] Executing state file.absent for [/etc/apt/apt.conf.d/95proxies]
2018-09-01 22:11:27,079 [salt.state       :290 ][INFO    ][3353] File /etc/apt/apt.conf.d/95proxies is not present
2018-09-01 22:11:27,080 [salt.state       :1941][INFO    ][3353] Completed state [/etc/apt/apt.conf.d/95proxies] at time 22:11:27.080156 duration_in_ms=1.169
2018-09-01 22:11:27,081 [salt.state       :1770][INFO    ][3353] Running state [ntp] at time 22:11:27.081467
2018-09-01 22:11:27,081 [salt.state       :1803][INFO    ][3353] Executing state pkg.installed for [ntp]
2018-09-01 22:11:27,238 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-09-01 22:11:27,262 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'ntp'] in directory '/root'
2018-09-01 22:11:30,562 [salt.minion      :1307][INFO    ][3264] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221130546731
2018-09-01 22:11:30,584 [salt.minion      :1431][INFO    ][6476] Starting a new job with PID 6476
2018-09-01 22:11:30,600 [salt.minion      :1708][INFO    ][6476] Returning information for job: 20180901221130546731
2018-09-01 22:11:33,503 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 22:11:33,540 [salt.state       :290 ][INFO    ][3353] Made the following changes:
'ntp' changed from 'absent' to '1:4.2.8p4+dfsg-3ubuntu5.9'
'libopts25' changed from 'absent' to '1:5.18.7-3'

2018-09-01 22:11:33,554 [salt.state       :905 ][INFO    ][3353] Loading fresh modules for state activity
2018-09-01 22:11:33,585 [salt.state       :1941][INFO    ][3353] Completed state [ntp] at time 22:11:33.585836 duration_in_ms=6504.366
2018-09-01 22:11:33,595 [salt.state       :1770][INFO    ][3353] Running state [/etc/ntp.conf] at time 22:11:33.595239
2018-09-01 22:11:33,596 [salt.state       :1803][INFO    ][3353] Executing state file.managed for [/etc/ntp.conf]
2018-09-01 22:11:33,621 [salt.fileclient  :1215][INFO    ][3353] Fetching file from saltenv 'base', ** done ** 'ntp/files/ntp.conf'
2018-09-01 22:11:33,668 [salt.state       :290 ][INFO    ][3353] File changed:
--- 
+++ 
@@ -1,66 +1,25 @@
-# /etc/ntp.conf, configuration for ntpd; see ntp.conf(5) for help
 
-driftfile /var/lib/ntp/ntp.drift
 
-# Enable this if you want statistics to be logged.
-#statsdir /var/log/ntpstats/
+# ntpd will only synchronize your clock.
 
-statistics loopstats peerstats clockstats
-filegen loopstats file loopstats type day enable
-filegen peerstats file peerstats type day enable
-filegen clockstats file clockstats type day enable
+# For details, see:
+# - the ntp.conf man page
+# - http://support.ntp.org/bin/view/Support/GettingStarted
+# - https://wiki.archlinux.org/index.php/Network_Time_Protocol_daemon
 
-# Specify one or more NTP servers.
+# Associate to cloud NTP pool servers
+server 1.pool.ntp.org iburst
+server 0.pool.ntp.org
 
-# Use servers from the NTP Pool Project. Approved by Ubuntu Technical Board
-# on 2011-02-08 (LP: #104525). See http://www.pool.ntp.org/join.html for
-# more information.
-pool 0.ubuntu.pool.ntp.org iburst
-pool 1.ubuntu.pool.ntp.org iburst
-pool 2.ubuntu.pool.ntp.org iburst
-pool 3.ubuntu.pool.ntp.org iburst
+# Exchange time with everybody, but don't allow configuration.
+restrict -4 default kod nomodify notrap nopeer noquery
+restrict -6 default kod nomodify notrap nopeer noquery
 
-# Use Ubuntu's ntp server as a fallback.
-pool ntp.ubuntu.com
-
-# Access control configuration; see /usr/share/doc/ntp-doc/html/accopt.html for
-# details.  The web page <http://support.ntp.org/bin/view/Support/AccessRestrictions>
-# might also be helpful.
-#
-# Note that "restrict" applies to both servers and clients, so a configuration
-# that might be intended to block requests from certain clients could also end
-# up blocking replies from your own upstream servers.
-
-# By default, exchange time with everybody, but don't allow configuration.
-restrict -4 default kod notrap nomodify nopeer noquery limited
-restrict -6 default kod notrap nomodify nopeer noquery limited
-
-# Local users may interrogate the ntp server more closely.
+# Only allow read-only access from localhost
 restrict 127.0.0.1
 restrict ::1
 
-# Needed for adding pool entries
-restrict source notrap nomodify noquery
+# mode7 is required for collectd monitoring
 
-# Clients from this (example!) subnet have unlimited access, but only if
-# cryptographically authenticated.
-#restrict 192.168.123.0 mask 255.255.255.0 notrust
-
-
-# If you want to provide time to your local subnet, change the next line.
-# (Again, the address is an example only.)
-#broadcast 192.168.123.255
-
-# If you want to listen to time broadcasts on your local subnet, de-comment the
-# next lines.  Please do this only if you trust everybody on the network!
-#disable auth
-#broadcastclient
-
-#Changes recquired to use pps synchonisation as explained in documentation:
-#http://www.ntp.org/ntpfaq/NTP-s-config-adv.htm#AEN3918
-
-#server 127.127.8.1 mode 135 prefer    # Meinberg GPS167 with PPS
-#fudge 127.127.8.1 time1 0.0042        # relative to PPS for my hardware
-
-#server 127.127.22.1                   # ATOM(PPS)
-#fudge 127.127.22.1 flag3 1            # enable PPS API
+# Location of drift file
+driftfile /var/lib/ntp/ntp.drift

2018-09-01 22:11:33,676 [salt.state       :1941][INFO    ][3353] Completed state [/etc/ntp.conf] at time 22:11:33.676477 duration_in_ms=81.239
2018-09-01 22:11:33,972 [salt.state       :1770][INFO    ][3353] Running state [ntp] at time 22:11:33.971971
2018-09-01 22:11:33,972 [salt.state       :1803][INFO    ][3353] Executing state service.running for [ntp]
2018-09-01 22:11:33,973 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['systemctl', 'status', 'ntp.service', '-n', '0'] in directory '/root'
2018-09-01 22:11:33,986 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['systemctl', 'is-active', 'ntp.service'] in directory '/root'
2018-09-01 22:11:34,000 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['systemctl', 'is-enabled', 'ntp.service'] in directory '/root'
2018-09-01 22:11:34,021 [salt.state       :290 ][INFO    ][3353] The service ntp is already running
2018-09-01 22:11:34,022 [salt.state       :1941][INFO    ][3353] Completed state [ntp] at time 22:11:34.022728 duration_in_ms=50.757
2018-09-01 22:11:34,023 [salt.state       :1770][INFO    ][3353] Running state [ntp] at time 22:11:34.023330
2018-09-01 22:11:34,024 [salt.state       :1803][INFO    ][3353] Executing state service.mod_watch for [ntp]
2018-09-01 22:11:34,025 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['systemctl', 'is-active', 'ntp.service'] in directory '/root'
2018-09-01 22:11:34,040 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3353] Executing command ['systemd-run', '--scope', 'systemctl', 'restart', 'ntp.service'] in directory '/root'
2018-09-01 22:11:34,100 [salt.state       :290 ][INFO    ][3353] {'ntp': True}
2018-09-01 22:11:34,101 [salt.state       :1941][INFO    ][3353] Completed state [ntp] at time 22:11:34.101104 duration_in_ms=77.774
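
NOTE: the ntp sequence (pkg.installed at 22:11:27-33, the /etc/ntp.conf rewrite, service.running, then service.mod_watch restarting ntp.service) is the classic pkg/file/service triad: the config diff triggers the watch, which restarts the service. A minimal sketch (template use is an assumption):

ntp:
  pkg.installed: []
  service.running:
    - enable: True
    - watch:
      - file: /etc/ntp.conf

/etc/ntp.conf:
  file.managed:
    - source: salt://ntp/files/ntp.conf
    - template: jinja    # assumption
    - require:
      - pkg: ntp
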
2018-09-01 22:11:34,105 [salt.minion      :1708][INFO    ][3353] Returning information for job: 20180901221024268410
2018-09-01 22:11:41,108 [salt.minion      :1307][INFO    ][3264] User sudo_ubuntu Executing command ssh.set_auth_key with jid 20180901221141092238
2018-09-01 22:11:41,132 [salt.minion      :1431][INFO    ][6827] Starting a new job with PID 6827
2018-09-01 22:11:41,150 [salt.minion      :1708][INFO    ][6827] Returning information for job: 20180901221141092238
2018-09-01 22:11:41,873 [salt.minion      :1307][INFO    ][3264] User sudo_ubuntu Executing command system.reboot with jid 20180901221141855449
2018-09-01 22:11:41,896 [salt.minion      :1431][INFO    ][6832] Starting a new job with PID 6832
2018-09-01 22:11:41,903 [salt.loader.192.168.11.2.int.module.cmdmod:395 ][INFO    ][6832] Executing command ['shutdown', '-r', 'now'] in directory '/root'
2018-09-01 22:11:42,106 [salt.utils.parsers:1051][WARNING ][3264] Minion received a SIGTERM. Exiting.
2018-09-01 22:11:42,107 [salt.cli.daemons :82  ][INFO    ][3264] Shutting down the Salt Minion
2018-09-01 22:11:43,721 [salt.minion      :1708][INFO    ][6832] Returning information for job: 20180901221141855449
2018-09-01 22:11:57,107 [salt.cli.daemons :293 ][INFO    ][1683] Setting up the Salt Minion "prx02.mcp-ovs-ha.local"
2018-09-01 22:11:57,767 [salt.cli.daemons :82  ][INFO    ][1683] Starting up the Salt Minion
2018-09-01 22:11:57,768 [salt.utils.event :1017][INFO    ][1683] Starting pull socket on /var/run/salt/minion/minion_event_0f78866554_pull.ipc
2018-09-01 22:11:58,998 [salt.minion      :976 ][INFO    ][1683] Creating minion process manager
2018-09-01 22:12:00,117 [salt.loader.192.168.11.2.int.module.cmdmod:395 ][INFO    ][1683] Executing command ['date', '+%z'] in directory '/root'
2018-09-01 22:12:00,193 [salt.utils.schedule:568 ][INFO    ][1683] Updating job settings for scheduled job: __mine_interval
2018-09-01 22:12:00,196 [salt.minion      :1107][INFO    ][1683] Added mine.update to scheduler
2018-09-01 22:12:00,212 [salt.minion      :1965][INFO    ][1683] Minion is starting as user 'root'
2018-09-01 22:12:00,229 [salt.minion      :2324][INFO    ][1683] Minion is ready to receive requests!
2018-09-01 22:12:03,333 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221202627128
2018-09-01 22:12:03,358 [salt.minion      :1431][INFO    ][1812] Starting a new job with PID 1812
2018-09-01 22:12:03,415 [salt.minion      :1708][INFO    ][1812] Returning information for job: 20180901221202627128
2018-09-01 22:12:23,386 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command test.ping with jid 20180901221223373922
2018-09-01 22:12:23,405 [salt.minion      :1431][INFO    ][1817] Starting a new job with PID 1817
2018-09-01 22:12:23,496 [salt.minion      :1708][INFO    ][1817] Returning information for job: 20180901221223373922
2018-09-01 22:12:24,142 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command pkg.upgrade with jid 20180901221224128382
2018-09-01 22:12:24,165 [salt.minion      :1431][INFO    ][1822] Starting a new job with PID 1822
2018-09-01 22:12:24,366 [salt.loader.192.168.11.2.int.module.cmdmod:395 ][INFO    ][1822] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 22:12:25,236 [salt.loader.192.168.11.2.int.module.cmdmod:395 ][INFO    ][1822] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'dist-upgrade'] in directory '/root'
2018-09-01 22:12:29,250 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221229236054
2018-09-01 22:12:29,275 [salt.minion      :1431][INFO    ][1862] Starting a new job with PID 1862
2018-09-01 22:12:29,290 [salt.minion      :1708][INFO    ][1862] Returning information for job: 20180901221229236054
2018-09-01 22:12:39,422 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221239411245
2018-09-01 22:12:39,442 [salt.minion      :1431][INFO    ][1893] Starting a new job with PID 1893
2018-09-01 22:12:39,464 [salt.minion      :1708][INFO    ][1893] Returning information for job: 20180901221239411245
2018-09-01 22:12:49,549 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221249539023
2018-09-01 22:12:49,574 [salt.minion      :1431][INFO    ][1916] Starting a new job with PID 1916
2018-09-01 22:12:49,592 [salt.minion      :1708][INFO    ][1916] Returning information for job: 20180901221249539023
2018-09-01 22:12:59,757 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221259747836
2018-09-01 22:12:59,776 [salt.minion      :1431][INFO    ][1946] Starting a new job with PID 1946
2018-09-01 22:12:59,793 [salt.minion      :1708][INFO    ][1946] Returning information for job: 20180901221259747836
2018-09-01 22:13:09,874 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221309862012
2018-09-01 22:13:09,897 [salt.minion      :1431][INFO    ][1963] Starting a new job with PID 1963
2018-09-01 22:13:10,108 [salt.minion      :1708][INFO    ][1963] Returning information for job: 20180901221309862012
2018-09-01 22:13:20,058 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221320047537
2018-09-01 22:13:20,082 [salt.minion      :1431][INFO    ][1983] Starting a new job with PID 1983
2018-09-01 22:13:20,107 [salt.minion      :1708][INFO    ][1983] Returning information for job: 20180901221320047537
2018-09-01 22:13:30,231 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221330220870
2018-09-01 22:13:30,248 [salt.minion      :1431][INFO    ][2009] Starting a new job with PID 2009
2018-09-01 22:13:30,262 [salt.minion      :1708][INFO    ][2009] Returning information for job: 20180901221330220870
2018-09-01 22:13:40,363 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221340351888
2018-09-01 22:13:40,386 [salt.minion      :1431][INFO    ][2029] Starting a new job with PID 2029
2018-09-01 22:13:40,406 [salt.minion      :1708][INFO    ][2029] Returning information for job: 20180901221340351888
2018-09-01 22:13:50,434 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221350425072
2018-09-01 22:13:50,454 [salt.minion      :1431][INFO    ][2062] Starting a new job with PID 2062
2018-09-01 22:13:50,470 [salt.minion      :1708][INFO    ][2062] Returning information for job: 20180901221350425072
2018-09-01 22:14:00,576 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221400565809
2018-09-01 22:14:00,599 [salt.minion      :1431][INFO    ][2123] Starting a new job with PID 2123
2018-09-01 22:14:00,645 [salt.minion      :1708][INFO    ][2123] Returning information for job: 20180901221400565809
2018-09-01 22:14:10,661 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221410651093
2018-09-01 22:14:10,679 [salt.minion      :1431][INFO    ][2238] Starting a new job with PID 2238
2018-09-01 22:14:10,693 [salt.minion      :1708][INFO    ][2238] Returning information for job: 20180901221410651093
2018-09-01 22:14:20,763 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221420753114
2018-09-01 22:14:20,780 [salt.minion      :1431][INFO    ][2376] Starting a new job with PID 2376
2018-09-01 22:14:20,794 [salt.minion      :1708][INFO    ][2376] Returning information for job: 20180901221420753114
2018-09-01 22:14:30,792 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221430782986
2018-09-01 22:14:30,814 [salt.minion      :1431][INFO    ][2917] Starting a new job with PID 2917
2018-09-01 22:14:30,832 [salt.minion      :1708][INFO    ][2917] Returning information for job: 20180901221430782986
2018-09-01 22:14:40,503 [salt.loader.192.168.11.2.int.module.cmdmod:395 ][INFO    ][1822] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 22:14:40,542 [salt.minion      :1708][INFO    ][1822] Returning information for job: 20180901221224128382
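
NOTE: the pkg.upgrade job (jid 20180901221224128382) ran for roughly two and a half minutes; the saltutil.find_job calls every ~10 seconds above are the master polling that long-running job. On Debian/Ubuntu minions pkg.upgrade wraps apt-get, and the DPkg::Options --force-confold/--force-confdef flags are Salt's defaults for keeping existing config files during upgrades. The dist-upgrade seen here would typically come from something like 'salt "prx02*" pkg.upgrade dist_upgrade=True' on the master, though the exact invocation is not visible in this log.
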
2018-09-01 22:14:41,399 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command test.ping with jid 20180901221441389573
2018-09-01 22:14:41,422 [salt.minion      :1431][INFO    ][3195] Starting a new job with PID 3195
2018-09-01 22:14:41,437 [salt.minion      :1708][INFO    ][3195] Returning information for job: 20180901221441389573
2018-09-01 22:16:26,467 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command state.sls with jid 20180901221626460345
2018-09-01 22:16:26,485 [salt.minion      :1431][INFO    ][3200] Starting a new job with PID 3200
2018-09-01 22:16:27,138 [salt.state       :905 ][INFO    ][3200] Loading fresh modules for state activity
2018-09-01 22:16:27,207 [salt.fileclient  :1215][INFO    ][3200] Fetching file from saltenv 'base', ** done ** 'keepalived/init.sls'
2018-09-01 22:16:27,239 [salt.fileclient  :1215][INFO    ][3200] Fetching file from saltenv 'base', ** done ** 'keepalived/cluster.sls'
2018-09-01 22:16:29,833 [salt.state       :1770][INFO    ][3200] Running state [keepalived] at time 22:16:29.833724
2018-09-01 22:16:29,834 [salt.state       :1803][INFO    ][3200] Executing state pkg.installed for [keepalived]
2018-09-01 22:16:29,835 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3200] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 22:16:30,202 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3200] Executing command ['apt-cache', '-q', 'policy', 'keepalived'] in directory '/root'
2018-09-01 22:16:30,343 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3200] Executing command ['apt-get', '-q', 'update'] in directory '/root'
2018-09-01 22:16:31,575 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221631567325
2018-09-01 22:16:31,609 [salt.minion      :1431][INFO    ][3552] Starting a new job with PID 3552
2018-09-01 22:16:31,624 [salt.minion      :1708][INFO    ][3552] Returning information for job: 20180901221631567325
2018-09-01 22:16:32,425 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3200] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-09-01 22:16:32,455 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3200] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'keepalived'] in directory '/root'
2018-09-01 22:16:40,848 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3200] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 22:16:40,887 [salt.state       :290 ][INFO    ][3200] Made the following changes:
'libsnmp30' changed from 'absent' to '5.7.3+dfsg-1ubuntu4.1'
'libsensors4' changed from 'absent' to '1:3.4.0-2'
'libsnmp-base' changed from 'absent' to '5.7.3+dfsg-1ubuntu4.1'
'keepalived' changed from 'absent' to '1:1.3.9-1ubuntu0.18.04.1~cloud0'
'ipvsadm' changed from 'absent' to '1:1.28-3'
'libnl-route-3-200' changed from 'absent' to '3.2.27-1ubuntu0.16.04.1'

2018-09-01 22:16:40,929 [salt.state       :905 ][INFO    ][3200] Loading fresh modules for state activity
2018-09-01 22:16:40,963 [salt.state       :1941][INFO    ][3200] Completed state [keepalived] at time 22:16:40.963409 duration_in_ms=11129.685
2018-09-01 22:16:40,967 [salt.state       :1770][INFO    ][3200] Running state [lsof] at time 22:16:40.967687
2018-09-01 22:16:40,967 [salt.state       :1803][INFO    ][3200] Executing state pkg.installed for [lsof]
2018-09-01 22:16:41,435 [salt.state       :290 ][INFO    ][3200] All specified packages are already installed
2018-09-01 22:16:41,436 [salt.state       :1941][INFO    ][3200] Completed state [lsof] at time 22:16:41.436239 duration_in_ms=468.551
2018-09-01 22:16:41,478 [salt.state       :1770][INFO    ][3200] Running state [/etc/keepalived/keepalived.conf] at time 22:16:41.478653
2018-09-01 22:16:41,479 [salt.state       :1803][INFO    ][3200] Executing state file.managed for [/etc/keepalived/keepalived.conf]
2018-09-01 22:16:41,506 [salt.fileclient  :1215][INFO    ][3200] Fetching file from saltenv 'base', ** done ** 'keepalived/files/keepalived.conf'
2018-09-01 22:16:41,543 [salt.state       :290 ][INFO    ][3200] File changed:
New file
2018-09-01 22:16:41,543 [salt.state       :1941][INFO    ][3200] Completed state [/etc/keepalived/keepalived.conf] at time 22:16:41.543306 duration_in_ms=64.653
2018-09-01 22:16:41,543 [salt.state       :1770][INFO    ][3200] Running state [/usr/local/bin/vrrp_script_check_pidof.sh] at time 22:16:41.543500
2018-09-01 22:16:41,543 [salt.state       :1803][INFO    ][3200] Executing state file.managed for [/usr/local/bin/vrrp_script_check_pidof.sh]
2018-09-01 22:16:41,731 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221641724096
2018-09-01 22:16:41,752 [salt.minion      :1431][INFO    ][4505] Starting a new job with PID 4505
2018-09-01 22:16:41,765 [salt.minion      :1708][INFO    ][4505] Returning information for job: 20180901221641724096
2018-09-01 22:16:42,689 [salt.fileclient  :1215][INFO    ][3200] Fetching file from saltenv 'base', ** done ** 'keepalived/files/vrrp_script_check_pidof.sh'
2018-09-01 22:16:42,697 [salt.state       :290 ][INFO    ][3200] File changed:
New file
2018-09-01 22:16:42,697 [salt.state       :1941][INFO    ][3200] Completed state [/usr/local/bin/vrrp_script_check_pidof.sh] at time 22:16:42.697521 duration_in_ms=1154.02
2018-09-01 22:16:42,712 [salt.state       :1770][INFO    ][3200] Running state [keepalived] at time 22:16:42.712402
2018-09-01 22:16:42,712 [salt.state       :1803][INFO    ][3200] Executing state service.running for [keepalived]
2018-09-01 22:16:42,713 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3200] Executing command ['systemctl', 'status', 'keepalived.service', '-n', '0'] in directory '/root'
2018-09-01 22:16:42,736 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3200] Executing command ['systemctl', 'is-active', 'keepalived.service'] in directory '/root'
2018-09-01 22:16:42,748 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3200] Executing command ['systemctl', 'is-enabled', 'keepalived.service'] in directory '/root'
2018-09-01 22:16:42,767 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3200] Executing command ['systemd-run', '--scope', 'systemctl', 'start', 'keepalived.service'] in directory '/root'
2018-09-01 22:16:42,881 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3200] Executing command ['systemctl', 'is-active', 'keepalived.service'] in directory '/root'
2018-09-01 22:16:42,904 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3200] Executing command ['systemctl', 'is-enabled', 'keepalived.service'] in directory '/root'
2018-09-01 22:16:42,924 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3200] Executing command ['systemctl', 'is-enabled', 'keepalived.service'] in directory '/root'
2018-09-01 22:16:42,941 [salt.state       :290 ][INFO    ][3200] {'keepalived': True}
2018-09-01 22:16:42,941 [salt.state       :1941][INFO    ][3200] Completed state [keepalived] at time 22:16:42.941681 duration_in_ms=229.279
2018-09-01 22:16:42,943 [salt.minion      :1708][INFO    ][3200] Returning information for job: 20180901221626460345
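Note: the job that just returned (jid 20180901221626460345) laid down the keepalived configuration and the VRRP check script from the 'keepalived' formula and then started the service. A minimal SLS sketch consistent with these lines; the state IDs and salt:// sources are read directly off the log, while the mode, enable flag, and require wiring are assumptions:

    /etc/keepalived/keepalived.conf:
      file.managed:
        - source: salt://keepalived/files/keepalived.conf

    /usr/local/bin/vrrp_script_check_pidof.sh:
      file.managed:
        - source: salt://keepalived/files/vrrp_script_check_pidof.sh
        - mode: 755                     # assumption: a VRRP check script must be executable

    keepalived:
      service.running:
        - enable: True                  # assumption: suggested by the 'systemctl is-enabled' probes above
        - require:
          - file: /etc/keepalived/keepalived.conf
          - file: /usr/local/bin/vrrp_script_check_pidof.sh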
2018-09-01 22:18:48,626 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command pillar.get with jid 20180901221848613405
2018-09-01 22:18:48,650 [salt.minion      :1431][INFO    ][4621] Starting a new job with PID 4621
2018-09-01 22:18:48,657 [salt.minion      :1708][INFO    ][4621] Returning information for job: 20180901221848613405
2018-09-01 22:31:13,403 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command state.sls with jid 20180901223113386300
2018-09-01 22:31:13,425 [salt.minion      :1431][INFO    ][4947] Starting a new job with PID 4947
2018-09-01 22:31:18,511 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901223118491130
2018-09-01 22:31:18,532 [salt.minion      :1431][INFO    ][4954] Starting a new job with PID 4954
2018-09-01 22:31:18,546 [salt.minion      :1708][INFO    ][4954] Returning information for job: 20180901223118491130
2018-09-01 22:31:19,058 [salt.state       :905 ][INFO    ][4947] Loading fresh modules for state activity
2018-09-01 22:31:20,110 [salt.fileclient  :1215][INFO    ][4947] Fetching file from saltenv 'base', ** done ** 'memcached/init.sls'
2018-09-01 22:31:20,140 [salt.fileclient  :1215][INFO    ][4947] Fetching file from saltenv 'base', ** done ** 'memcached/server.sls'
2018-09-01 22:31:20,163 [salt.fileclient  :1215][INFO    ][4947] Fetching file from saltenv 'base', ** done ** 'memcached/map.jinja'
2018-09-01 22:31:20,700 [salt.state       :1770][INFO    ][4947] Running state [memcached] at time 22:31:20.700597
2018-09-01 22:31:20,701 [salt.state       :1803][INFO    ][4947] Executing state pkg.installed for [memcached]
2018-09-01 22:31:20,701 [salt.loaded.int.module.cmdmod:395 ][INFO    ][4947] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 22:31:21,021 [salt.loaded.int.module.cmdmod:395 ][INFO    ][4947] Executing command ['apt-cache', '-q', 'policy', 'memcached'] in directory '/root'
2018-09-01 22:31:21,133 [salt.loaded.int.module.cmdmod:395 ][INFO    ][4947] Executing command ['apt-get', '-q', 'update'] in directory '/root'
2018-09-01 22:31:22,862 [salt.loaded.int.module.cmdmod:395 ][INFO    ][4947] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-09-01 22:31:22,889 [salt.loaded.int.module.cmdmod:395 ][INFO    ][4947] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'memcached'] in directory '/root'
2018-09-01 22:31:28,146 [salt.loaded.int.module.cmdmod:395 ][INFO    ][4947] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 22:31:28,182 [salt.state       :290 ][INFO    ][4947] Made the following changes:
'memcached' changed from 'absent' to '1.4.25-2ubuntu1.4'

2018-09-01 22:31:28,197 [salt.state       :905 ][INFO    ][4947] Loading fresh modules for state activity
2018-09-01 22:31:28,227 [salt.state       :1941][INFO    ][4947] Completed state [memcached] at time 22:31:28.227364 duration_in_ms=7526.766
2018-09-01 22:31:28,231 [salt.state       :1770][INFO    ][4947] Running state [python-memcache] at time 22:31:28.231281
2018-09-01 22:31:28,231 [salt.state       :1803][INFO    ][4947] Executing state pkg.installed for [python-memcache]
2018-09-01 22:31:28,657 [salt.loaded.int.module.cmdmod:395 ][INFO    ][4947] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-09-01 22:31:28,674 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901223128656973
2018-09-01 22:31:28,687 [salt.minion      :1431][INFO    ][5771] Starting a new job with PID 5771
2018-09-01 22:31:28,689 [salt.loaded.int.module.cmdmod:395 ][INFO    ][4947] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'python-memcache'] in directory '/root'
2018-09-01 22:31:28,698 [salt.minion      :1708][INFO    ][5771] Returning information for job: 20180901223128656973
2018-09-01 22:31:30,770 [salt.loaded.int.module.cmdmod:395 ][INFO    ][4947] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 22:31:30,808 [salt.state       :290 ][INFO    ][4947] Made the following changes:
'python-memcache' changed from 'absent' to '1.57+fixed-1~u16.04+mcp1'

2018-09-01 22:31:30,821 [salt.state       :905 ][INFO    ][4947] Loading fresh modules for state activity
2018-09-01 22:31:30,843 [salt.state       :1941][INFO    ][4947] Completed state [python-memcache] at time 22:31:30.843868 duration_in_ms=2612.587
2018-09-01 22:31:30,847 [salt.state       :1770][INFO    ][4947] Running state [/etc/memcached.conf] at time 22:31:30.847231
2018-09-01 22:31:30,847 [salt.state       :1803][INFO    ][4947] Executing state file.managed for [/etc/memcached.conf]
2018-09-01 22:31:30,878 [salt.fileclient  :1215][INFO    ][4947] Fetching file from saltenv 'base', ** done ** 'memcached/files/memcached.conf'
2018-09-01 22:31:30,903 [salt.state       :290 ][INFO    ][4947] File changed:
--- 
+++ 
@@ -1,11 +1,10 @@
+
 # memcached default config file
 # 2003 - Jay Bonci <jaybonci@debian.org>
-# This configuration file is read by the start-memcached script provided as
-# part of the Debian GNU/Linux distribution.
+# This configuration file is read by the start-memcached script provided as part of the Debian GNU/Linux distribution. 
 
 # Run memcached as a daemon. This command is implied, and is not needed for the
-# daemon to run. See the README.Debian that comes with this package for more
-# information.
+# daemon to run. See the README.Debian that comes with this package for more information.
 -d
 
 # Log memcached's output to /var/log/memcached
@@ -18,13 +17,13 @@
 # -vv
 
 # Start with a cap of 64 megs of memory. It's reasonable, and the daemon default
-# Note that the daemon will grow to this size, but does not start out holding this much
-# memory
+# Note that the daemon will grow to this size, but does not start out holding this much memory
 -m 64
 
 # Default connection port is 11211
 -p 11211
 
+-U 11211
 # Run the daemon as root. The start-memcached will default to running as root if no
 # -u command is present in this config file
 -u memcache
@@ -32,10 +31,12 @@
 # Specify which IP address to listen on. The default is to listen on all IP addresses
 # This parameter is one of the only security measures that memcached has, so make sure
 # it's listening on a firewalled interface.
--l 127.0.0.1
+-l 0.0.0.0
 
 # Limit the number of simultaneous incoming connections. The daemon default is 1024
 # -c 1024
+# Mirantis
+-c 8192
 
 # Lock down all paged memory. Consult with the README and homepage before you do this
 # -k
@@ -45,3 +46,9 @@
 
 # Maximize core file limit
 # -r
+
+# Number of threads to use to process incoming requests.
+-t 1
+
+# Set size of each slab page. Default value for this parameter is 1m, minimum is 1k, max is 128m.
+-I 1m

2018-09-01 22:31:30,909 [salt.state       :1941][INFO    ][4947] Completed state [/etc/memcached.conf] at time 22:31:30.909332 duration_in_ms=62.099
2018-09-01 22:31:31,211 [salt.state       :1770][INFO    ][4947] Running state [memcached] at time 22:31:31.211491
2018-09-01 22:31:31,212 [salt.state       :1803][INFO    ][4947] Executing state service.running for [memcached]
2018-09-01 22:31:31,212 [salt.loaded.int.module.cmdmod:395 ][INFO    ][4947] Executing command ['systemctl', 'status', 'memcached.service', '-n', '0'] in directory '/root'
2018-09-01 22:31:31,225 [salt.loaded.int.module.cmdmod:395 ][INFO    ][4947] Executing command ['systemctl', 'is-active', 'memcached.service'] in directory '/root'
2018-09-01 22:31:31,235 [salt.loaded.int.module.cmdmod:395 ][INFO    ][4947] Executing command ['systemctl', 'is-enabled', 'memcached.service'] in directory '/root'
2018-09-01 22:31:31,245 [salt.state       :290 ][INFO    ][4947] The service memcached is already running
2018-09-01 22:31:31,246 [salt.state       :1941][INFO    ][4947] Completed state [memcached] at time 22:31:31.246042 duration_in_ms=34.551
2018-09-01 22:31:31,246 [salt.state       :1770][INFO    ][4947] Running state [memcached] at time 22:31:31.246403
2018-09-01 22:31:31,246 [salt.state       :1803][INFO    ][4947] Executing state service.mod_watch for [memcached]
2018-09-01 22:31:31,247 [salt.loaded.int.module.cmdmod:395 ][INFO    ][4947] Executing command ['systemctl', 'is-active', 'memcached.service'] in directory '/root'
2018-09-01 22:31:31,257 [salt.loaded.int.module.cmdmod:395 ][INFO    ][4947] Executing command ['systemd-run', '--scope', 'systemctl', 'restart', 'memcached.service'] in directory '/root'
2018-09-01 22:31:31,279 [salt.state       :290 ][INFO    ][4947] {'memcached': True}
2018-09-01 22:31:31,282 [salt.state       :1941][INFO    ][4947] Completed state [memcached] at time 22:31:31.282353 duration_in_ms=35.949
2018-09-01 22:31:31,283 [salt.minion      :1708][INFO    ][4947] Returning information for job: 20180901223113386300
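Note: this memcached run is worth unpacking. The managed /etc/memcached.conf moved the listener from 127.0.0.1 to 0.0.0.0, raised the connection cap from the 1024 default to 8192, and added -U 11211, -t 1, and -I 1m; the service.mod_watch at 22:31:31 then restarted memcached precisely because the service state watches that file. A sketch of the implied states, with IDs and sources taken from the log (enable is an assumption):

    memcached:
      pkg.installed: []
      service.running:
        - enable: True                  # assumption
        - watch:
          - file: /etc/memcached.conf   # grounded: mod_watch restarted the service after the conf diff

    python-memcache:
      pkg.installed: []

    /etc/memcached.conf:
      file.managed:
        - source: salt://memcached/files/memcached.conf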
2018-09-01 23:12:01,231 [salt.utils.schedule:1375][INFO    ][1683] Running scheduled job: __mine_interval
2018-09-01 23:26:03,782 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command state.sls with jid 20180901232603765218
2018-09-01 23:26:03,803 [salt.minion      :1431][INFO    ][7274] Starting a new job with PID 7274
2018-09-01 23:26:08,446 [salt.state       :905 ][INFO    ][7274] Loading fresh modules for state activity
2018-09-01 23:26:08,485 [salt.fileclient  :1215][INFO    ][7274] Fetching file from saltenv 'base', ** done ** 'apache/init.sls'
2018-09-01 23:26:08,517 [salt.fileclient  :1215][INFO    ][7274] Fetching file from saltenv 'base', ** done ** 'apache/server/init.sls'
2018-09-01 23:26:08,543 [salt.fileclient  :1215][INFO    ][7274] Fetching file from saltenv 'base', ** done ** 'apache/server/service/init.sls'
2018-09-01 23:26:08,618 [salt.fileclient  :1215][INFO    ][7274] Fetching file from saltenv 'base', ** done ** 'apache/server/service/modules.sls'
2018-09-01 23:26:08,674 [salt.fileclient  :1215][INFO    ][7274] Fetching file from saltenv 'base', ** done ** 'apache/server/service/mpm.sls'
2018-09-01 23:26:08,730 [salt.fileclient  :1215][INFO    ][7274] Fetching file from saltenv 'base', ** done ** 'apache/server/site.sls'
2018-09-01 23:26:08,824 [salt.fileclient  :1215][INFO    ][7274] Fetching file from saltenv 'base', ** done ** 'apache/server/users.sls'
2018-09-01 23:26:08,891 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232608872791
2018-09-01 23:26:08,898 [salt.fileclient  :1215][INFO    ][7274] Fetching file from saltenv 'base', ** done ** 'apache/server/robots.sls'
2018-09-01 23:26:08,911 [salt.minion      :1431][INFO    ][7297] Starting a new job with PID 7297
2018-09-01 23:26:08,927 [salt.minion      :1708][INFO    ][7297] Returning information for job: 20180901232608872791
2018-09-01 23:26:08,951 [salt.fileclient  :1215][INFO    ][7274] Fetching file from saltenv 'base', ** done ** 'horizon/init.sls'
2018-09-01 23:26:08,977 [salt.fileclient  :1215][INFO    ][7274] Fetching file from saltenv 'base', ** done ** 'horizon/server/init.sls'
2018-09-01 23:26:08,999 [salt.fileclient  :1215][INFO    ][7274] Fetching file from saltenv 'base', ** done ** 'horizon/server/service.sls'
2018-09-01 23:26:09,079 [salt.fileclient  :1215][INFO    ][7274] Fetching file from saltenv 'base', ** done ** 'horizon/server/plugin.sls'
2018-09-01 23:26:09,685 [salt.state       :1770][INFO    ][7274] Running state [apache2] at time 23:26:09.685917
2018-09-01 23:26:09,686 [salt.state       :1803][INFO    ][7274] Executing state pkg.installed for [apache2]
2018-09-01 23:26:09,687 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7274] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 23:26:10,007 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7274] Executing command ['apt-cache', '-q', 'policy', 'apache2'] in directory '/root'
2018-09-01 23:26:10,094 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7274] Executing command ['apt-get', '-q', 'update'] in directory '/root'
2018-09-01 23:26:13,493 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7274] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-09-01 23:26:13,519 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7274] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'apache2'] in directory '/root'
2018-09-01 23:26:19,084 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232619066644
2018-09-01 23:26:19,101 [salt.minion      :1431][INFO    ][8231] Starting a new job with PID 8231
2018-09-01 23:26:19,116 [salt.minion      :1708][INFO    ][8231] Returning information for job: 20180901232619066644
2018-09-01 23:26:26,348 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7274] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 23:26:26,384 [salt.state       :290 ][INFO    ][7274] Made the following changes:
'apache2-data' changed from 'absent' to '2.4.18-2ubuntu3.9'
'httpd-cgi' changed from 'absent' to '1'
'apache2-utils' changed from 'absent' to '2.4.18-2ubuntu3.9'
'httpd' changed from 'absent' to '1'
'ssl-cert' changed from 'absent' to '1.0.37'
'apache2' changed from 'absent' to '2.4.18-2ubuntu3.9'

2018-09-01 23:26:26,399 [salt.state       :905 ][INFO    ][7274] Loading fresh modules for state activity
2018-09-01 23:26:26,423 [salt.state       :1941][INFO    ][7274] Completed state [apache2] at time 23:26:26.423843 duration_in_ms=16737.926
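Note: as with memcached above, pkg.installed for apache2 runs 'apt-get -q update' before installing, which is the behaviour a refresh flag produces; the 16.7 s state duration is dominated by the apt transaction itself. A one-state sketch, with refresh flagged as an inference rather than something the log states outright:

    apache2:
      pkg.installed:
        - refresh: True                 # assumption: inferred from the 'apt-get -q update' before the install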
2018-09-01 23:26:26,428 [salt.state       :1770][INFO    ][7274] Running state [openssl] at time 23:26:26.428132
2018-09-01 23:26:26,428 [salt.state       :1803][INFO    ][7274] Executing state pkg.installed for [openssl]
2018-09-01 23:26:26,875 [salt.state       :290 ][INFO    ][7274] All specified packages are already installed
2018-09-01 23:26:26,875 [salt.state       :1941][INFO    ][7274] Completed state [openssl] at time 23:26:26.875832 duration_in_ms=447.7
2018-09-01 23:26:26,877 [salt.state       :1770][INFO    ][7274] Running state [a2enmod ssl] at time 23:26:26.876995
2018-09-01 23:26:26,877 [salt.state       :1803][INFO    ][7274] Executing state cmd.run for [a2enmod ssl]
2018-09-01 23:26:26,878 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7274] Executing command 'a2enmod ssl' in directory '/root'
2018-09-01 23:26:26,932 [salt.state       :290 ][INFO    ][7274] {'pid': 8751, 'retcode': 0, 'stderr': '', 'stdout': 'Considering dependency setenvif for ssl:\nModule setenvif already enabled\nConsidering dependency mime for ssl:\nModule mime already enabled\nConsidering dependency socache_shmcb for ssl:\nEnabling module socache_shmcb.\nEnabling module ssl.\nSee /usr/share/doc/apache2/README.Debian.gz on how to configure SSL and create self-signed certificates.\nTo activate the new configuration, you need to run:\n  service apache2 restart'}
2018-09-01 23:26:26,932 [salt.state       :1941][INFO    ][7274] Completed state [a2enmod ssl] at time 23:26:26.932662 duration_in_ms=55.667
2018-09-01 23:26:26,933 [salt.state       :1770][INFO    ][7274] Running state [a2enmod rewrite] at time 23:26:26.933410
2018-09-01 23:26:26,933 [salt.state       :1803][INFO    ][7274] Executing state cmd.run for [a2enmod rewrite]
2018-09-01 23:26:26,934 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7274] Executing command 'a2enmod rewrite' in directory '/root'
2018-09-01 23:26:26,977 [salt.state       :290 ][INFO    ][7274] {'pid': 8764, 'retcode': 0, 'stderr': '', 'stdout': 'Enabling module rewrite.\nTo activate the new configuration, you need to run:\n  service apache2 restart'}
2018-09-01 23:26:26,977 [salt.state       :1941][INFO    ][7274] Completed state [a2enmod rewrite] at time 23:26:26.977563 duration_in_ms=44.152
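Note: the two module enablements are plain cmd.run states; both succeeded (retcode 0), and both print the usual reminder that a restart is needed, which the apache2 service reload later in this run (23:26:27) takes care of. A sketch; the unless guards are assumptions added for idempotence and are not visible in the log:

    a2enmod ssl:
      cmd.run:
        - unless: test -e /etc/apache2/mods-enabled/ssl.load      # assumption: idempotence guard

    a2enmod rewrite:
      cmd.run:
        - unless: test -e /etc/apache2/mods-enabled/rewrite.load  # assumption: idempotence guard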
2018-09-01 23:26:26,981 [salt.state       :1770][INFO    ][7274] Running state [/etc/apache2/mods-available/mpm_prefork.conf] at time 23:26:26.981853
2018-09-01 23:26:26,982 [salt.state       :1803][INFO    ][7274] Executing state file.managed for [/etc/apache2/mods-available/mpm_prefork.conf]
2018-09-01 23:26:27,007 [salt.fileclient  :1215][INFO    ][7274] Fetching file from saltenv 'base', ** done ** 'apache/files/mpm/mpm_prefork.conf'
2018-09-01 23:26:27,046 [salt.state       :290 ][INFO    ][7274] File changed:
--- 
+++ 
@@ -6,11 +6,12 @@
 # MaxConnectionsPerChild: maximum number of requests a server process serves
 
 <IfModule mpm_prefork_module>
-	StartServers			 5
-	MinSpareServers		  5
-	MaxSpareServers		 10
-	MaxRequestWorkers	  150
-	MaxConnectionsPerChild   0
+    StartServers            5
+    MinSpareServers         5
+    MaxSpareServers         10
+    MaxRequestWorkers       150
+    MaxConnectionsPerChild  0
+    ServerLimit             150
 </IfModule>
 
-# vim: syntax=apache ts=4 sw=4 sts=4 sr noet
+# vim: syntax=apache ts=4 sw=4 sts=4 sr et

2018-09-01 23:26:27,047 [salt.state       :1941][INFO    ][7274] Completed state [/etc/apache2/mods-available/mpm_prefork.conf] at time 23:26:27.047612 duration_in_ms=65.757
2018-09-01 23:26:27,048 [salt.state       :1770][INFO    ][7274] Running state [/etc/apache2/mods-enabled/mpm_worker.load] at time 23:26:27.048090
2018-09-01 23:26:27,048 [salt.state       :1803][INFO    ][7274] Executing state file.absent for [/etc/apache2/mods-enabled/mpm_worker.load]
2018-09-01 23:26:27,049 [salt.state       :290 ][INFO    ][7274] File /etc/apache2/mods-enabled/mpm_worker.load is not present
2018-09-01 23:26:27,049 [salt.state       :1941][INFO    ][7274] Completed state [/etc/apache2/mods-enabled/mpm_worker.load] at time 23:26:27.049558 duration_in_ms=1.468
2018-09-01 23:26:27,050 [salt.state       :1770][INFO    ][7274] Running state [/etc/apache2/mods-enabled/mpm_event.load] at time 23:26:27.049989
2018-09-01 23:26:27,050 [salt.state       :1803][INFO    ][7274] Executing state file.absent for [/etc/apache2/mods-enabled/mpm_event.load]
2018-09-01 23:26:27,051 [salt.state       :290 ][INFO    ][7274] {'removed': '/etc/apache2/mods-enabled/mpm_event.load'}
2018-09-01 23:26:27,051 [salt.state       :1941][INFO    ][7274] Completed state [/etc/apache2/mods-enabled/mpm_event.load] at time 23:26:27.051472 duration_in_ms=1.483
2018-09-01 23:26:27,052 [salt.state       :1770][INFO    ][7274] Running state [a2enmod mpm_prefork] at time 23:26:27.052903
2018-09-01 23:26:27,053 [salt.state       :1803][INFO    ][7274] Executing state cmd.run for [a2enmod mpm_prefork]
2018-09-01 23:26:27,054 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7274] Executing command 'a2enmod mpm_prefork' in directory '/root'
2018-09-01 23:26:27,098 [salt.state       :290 ][INFO    ][7274] {'pid': 8777, 'retcode': 0, 'stderr': '', 'stdout': 'Considering conflict mpm_event for mpm_prefork:\nConsidering conflict mpm_worker for mpm_prefork:\nEnabling module mpm_prefork.\nTo activate the new configuration, you need to run:\n  service apache2 restart'}
2018-09-01 23:26:27,099 [salt.state       :1941][INFO    ][7274] Completed state [a2enmod mpm_prefork] at time 23:26:27.099226 duration_in_ms=46.324
2018-09-01 23:26:27,099 [salt.state       :1770][INFO    ][7274] Running state [/etc/apache2/mods-enabled/mpm_worker.conf] at time 23:26:27.099651
2018-09-01 23:26:27,099 [salt.state       :1803][INFO    ][7274] Executing state file.absent for [/etc/apache2/mods-enabled/mpm_worker.conf]
2018-09-01 23:26:27,100 [salt.state       :290 ][INFO    ][7274] File /etc/apache2/mods-enabled/mpm_worker.conf is not present
2018-09-01 23:26:27,100 [salt.state       :1941][INFO    ][7274] Completed state [/etc/apache2/mods-enabled/mpm_worker.conf] at time 23:26:27.100569 duration_in_ms=0.918
2018-09-01 23:26:27,100 [salt.state       :1770][INFO    ][7274] Running state [/etc/apache2/mods-enabled/mpm_event.conf] at time 23:26:27.100770
2018-09-01 23:26:27,100 [salt.state       :1803][INFO    ][7274] Executing state file.absent for [/etc/apache2/mods-enabled/mpm_event.conf]
2018-09-01 23:26:27,101 [salt.state       :290 ][INFO    ][7274] {'removed': '/etc/apache2/mods-enabled/mpm_event.conf'}
2018-09-01 23:26:27,101 [salt.state       :1941][INFO    ][7274] Completed state [/etc/apache2/mods-enabled/mpm_event.conf] at time 23:26:27.101476 duration_in_ms=0.706
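Note: this block switches Apache's MPM to prefork. The managed mpm_prefork.conf keeps the stock worker numbers, adds ServerLimit 150, and converts tabs to spaces; the event and worker .load/.conf symlinks are removed (worker was already absent, event actually existed) and 'a2enmod mpm_prefork' is run. The implied states, with IDs from the log and the unless guard an assumption:

    /etc/apache2/mods-available/mpm_prefork.conf:
      file.managed:
        - source: salt://apache/files/mpm/mpm_prefork.conf

    /etc/apache2/mods-enabled/mpm_worker.load:
      file.absent: []

    /etc/apache2/mods-enabled/mpm_event.load:
      file.absent: []

    a2enmod mpm_prefork:
      cmd.run:
        - unless: test -e /etc/apache2/mods-enabled/mpm_prefork.load  # assumption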
2018-09-01 23:26:27,114 [salt.state       :1770][INFO    ][7274] Running state [apache_server_service_task] at time 23:26:27.114528
2018-09-01 23:26:27,114 [salt.state       :1803][INFO    ][7274] Executing state test.show_notification for [apache_server_service_task]
2018-09-01 23:26:27,115 [salt.state       :290 ][INFO    ][7274] Running apache.server.service
2018-09-01 23:26:27,115 [salt.state       :1941][INFO    ][7274] Completed state [apache_server_service_task] at time 23:26:27.115491 duration_in_ms=0.962
2018-09-01 23:26:27,116 [salt.state       :1770][INFO    ][7274] Running state [/etc/apache2/ports.conf] at time 23:26:27.116403
2018-09-01 23:26:27,116 [salt.state       :1803][INFO    ][7274] Executing state file.managed for [/etc/apache2/ports.conf]
2018-09-01 23:26:27,135 [salt.fileclient  :1215][INFO    ][7274] Fetching file from saltenv 'base', ** done ** 'apache/files/ports.conf'
2018-09-01 23:26:27,174 [salt.state       :290 ][INFO    ][7274] File changed:
--- 
+++ 
@@ -2,14 +2,4 @@
 # have to change the VirtualHost statement in
 # /etc/apache2/sites-enabled/000-default.conf
 
-Listen 80
-
-<IfModule ssl_module>
-	Listen 443
-</IfModule>
-
-<IfModule mod_gnutls.c>
-	Listen 443
-</IfModule>
-
 # vim: syntax=apache ts=4 sw=4 sts=4 sr noet

2018-09-01 23:26:27,174 [salt.state       :1941][INFO    ][7274] Completed state [/etc/apache2/ports.conf] at time 23:26:27.174830 duration_in_ms=58.426
2018-09-01 23:26:27,175 [salt.state       :1770][INFO    ][7274] Running state [/etc/apache2/conf-available/security.conf] at time 23:26:27.175185
2018-09-01 23:26:27,175 [salt.state       :1803][INFO    ][7274] Executing state file.managed for [/etc/apache2/conf-available/security.conf]
2018-09-01 23:26:27,190 [salt.fileclient  :1215][INFO    ][7274] Fetching file from saltenv 'base', ** done ** 'apache/files/security.conf'
2018-09-01 23:26:27,286 [salt.state       :290 ][INFO    ][7274] File changed:
--- 
+++ 
@@ -1,73 +1,14 @@
-#
-# Disable access to the entire file system except for the directories that
-# are explicitly allowed later.
-#
-# This currently breaks the configurations that come with some web application
-# Debian packages.
-#
-#<Directory />
-#   AllowOverride None
-#   Require all denied
-#</Directory>
+ServerSignature Off
+TraceEnable Off
+ServerTokens Prod
+<DirectoryMatch "/\.svn">
+    Require all denied
+</DirectoryMatch>
 
+<DirectoryMatch "/\.git">
+    Require all denied
+</DirectoryMatch>
 
-# Changing the following options will not really affect the security of the
-# server, but might make attacks slightly more difficult in some cases.
-
-#
-# ServerTokens
-# This directive configures what you return as the Server HTTP response
-# Header. The default is 'Full' which sends information about the OS-Type
-# and compiled in modules.
-# Set to one of:  Full | OS | Minimal | Minor | Major | Prod
-# where Full conveys the most information, and Prod the least.
-#ServerTokens Minimal
-ServerTokens OS
-#ServerTokens Full
-
-#
-# Optionally add a line containing the server version and virtual host
-# name to server-generated pages (internal error documents, FTP directory
-# listings, mod_status and mod_info output etc., but not CGI generated
-# documents or custom error documents).
-# Set to "EMail" to also include a mailto: link to the ServerAdmin.
-# Set to one of:  On | Off | EMail
-#ServerSignature Off
-ServerSignature On
-
-#
-# Allow TRACE method
-#
-# Set to "extended" to also reflect the request body (only for testing and
-# diagnostic purposes).
-#
-# Set to one of:  On | Off | extended
-TraceEnable Off
-#TraceEnable On
-
-#
-# Forbid access to version control directories
-#
-# If you use version control systems in your document root, you should
-# probably deny access to their directories. For example, for subversion:
-#
-#<DirectoryMatch "/\.svn">
-#   Require all denied
-#</DirectoryMatch>
-
-#
-# Setting this header will prevent MSIE from interpreting files as something
-# else than declared by the content type in the HTTP headers.
-# Requires mod_headers to be enabled.
-#
-#Header set X-Content-Type-Options: "nosniff"
-
-#
-# Setting this header will prevent other sites from embedding pages from this
-# site as frames. This defends against clickjacking attacks.
-# Requires mod_headers to be enabled.
-#
-#Header set X-Frame-Options: "sameorigin"
-
-
-# vim: syntax=apache ts=4 sw=4 sts=4 sr noet
+<DirectoryMatch "/\.hg">
+    Require all denied
+</DirectoryMatch>

2018-09-01 23:26:27,286 [salt.state       :1941][INFO    ][7274] Completed state [/etc/apache2/conf-available/security.conf] at time 23:26:27.286290 duration_in_ms=111.105
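Note: the two managed conf files harden and simplify the stock Apache setup. ports.conf loses all global Listen directives (80 and both ssl/gnutls 443 blocks), which presumably moves listener declarations into the individual vhost files; security.conf is reduced to ServerSignature Off, TraceEnable Off, ServerTokens Prod, plus deny rules for .svn/.git/.hg directories. The implied states are plain file.managed, sources as fetched above:

    /etc/apache2/ports.conf:
      file.managed:
        - source: salt://apache/files/ports.conf

    /etc/apache2/conf-available/security.conf:
      file.managed:
        - source: salt://apache/files/security.conf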
2018-09-01 23:26:27,290 [salt.state       :1770][INFO    ][7274] Running state [/etc/apache2/sites-enabled/000-default.conf] at time 23:26:27.290650
2018-09-01 23:26:27,290 [salt.state       :1803][INFO    ][7274] Executing state file.absent for [/etc/apache2/sites-enabled/000-default.conf]
2018-09-01 23:26:27,291 [salt.state       :290 ][INFO    ][7274] {'removed': '/etc/apache2/sites-enabled/000-default.conf'}
2018-09-01 23:26:27,291 [salt.state       :1941][INFO    ][7274] Completed state [/etc/apache2/sites-enabled/000-default.conf] at time 23:26:27.291263 duration_in_ms=0.613
2018-09-01 23:26:27,292 [salt.state       :1770][INFO    ][7274] Running state [apache2] at time 23:26:27.292528
2018-09-01 23:26:27,292 [salt.state       :1803][INFO    ][7274] Executing state service.running for [apache2]
2018-09-01 23:26:27,293 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7274] Executing command ['systemctl', 'status', 'apache2.service', '-n', '0'] in directory '/root'
2018-09-01 23:26:27,317 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7274] Executing command ['systemctl', 'is-active', 'apache2.service'] in directory '/root'
2018-09-01 23:26:27,330 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7274] Executing command ['systemctl', 'is-enabled', 'apache2.service'] in directory '/root'
2018-09-01 23:26:27,346 [salt.state       :290 ][INFO    ][7274] The service apache2 is already running
2018-09-01 23:26:27,347 [salt.state       :1941][INFO    ][7274] Completed state [apache2] at time 23:26:27.347708 duration_in_ms=55.18
2018-09-01 23:26:27,348 [salt.state       :1770][INFO    ][7274] Running state [apache2] at time 23:26:27.348341
2018-09-01 23:26:27,349 [salt.state       :1803][INFO    ][7274] Executing state service.mod_watch for [apache2]
2018-09-01 23:26:27,350 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7274] Executing command ['systemctl', 'is-active', 'apache2.service'] in directory '/root'
2018-09-01 23:26:27,365 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7274] Executing command ['systemd-run', '--scope', 'systemctl', 'reload', 'apache2.service'] in directory '/root'
2018-09-01 23:26:27,534 [salt.state       :290 ][INFO    ][7274] {'apache2': True}
2018-09-01 23:26:27,534 [salt.state       :1941][INFO    ][7274] Completed state [apache2] at time 23:26:27.534713 duration_in_ms=186.371
2018-09-01 23:26:27,536 [salt.state       :1770][INFO    ][7274] Running state [/etc/apache2/conf-enabled/security.conf] at time 23:26:27.536644
2018-09-01 23:26:27,537 [salt.state       :1803][INFO    ][7274] Executing state file.symlink for [/etc/apache2/conf-enabled/security.conf]
2018-09-01 23:26:27,539 [salt.state       :290 ][INFO    ][7274] {'new': '/etc/apache2/conf-enabled/security.conf'}
2018-09-01 23:26:27,540 [salt.state       :1941][INFO    ][7274] Completed state [/etc/apache2/conf-enabled/security.conf] at time 23:26:27.540323 duration_in_ms=3.679
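Note: the apache2 service state reloads rather than restarts on change: mod_watch issued 'systemd-run --scope systemctl reload apache2.service', which is what service.running produces when its reload argument is set. Enabling security.conf is a separate file.symlink state. A sketch; the exact watch list is an assumption:

    apache2:
      service.running:
        - enable: True                  # assumption
        - reload: True                  # grounded: mod_watch ran 'systemctl reload', not 'restart'
        - watch:
          - file: /etc/apache2/ports.conf                    # assumption: the managed conf files
          - file: /etc/apache2/conf-available/security.conf  #   plausibly drive the watch

    /etc/apache2/conf-enabled/security.conf:
      file.symlink:
        - target: /etc/apache2/conf-available/security.conf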
2018-09-01 23:26:27,540 [salt.state       :1770][INFO    ][7274] Running state [openstack-dashboard] at time 23:26:27.540912
2018-09-01 23:26:27,541 [salt.state       :1803][INFO    ][7274] Executing state pkg.installed for [openstack-dashboard]
2018-09-01 23:26:27,563 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7274] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-09-01 23:26:27,587 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7274] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'openstack-dashboard'] in directory '/root'
2018-09-01 23:26:29,277 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232629259730
2018-09-01 23:26:29,297 [salt.minion      :1431][INFO    ][8887] Starting a new job with PID 8887
2018-09-01 23:26:29,312 [salt.minion      :1708][INFO    ][8887] Returning information for job: 20180901232629259730
2018-09-01 23:26:39,477 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232639456740
2018-09-01 23:26:39,499 [salt.minion      :1431][INFO    ][8896] Starting a new job with PID 8896
2018-09-01 23:26:39,524 [salt.minion      :1708][INFO    ][8896] Returning information for job: 20180901232639456740
2018-09-01 23:26:49,696 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232649677329
2018-09-01 23:26:49,711 [salt.minion      :1431][INFO    ][9017] Starting a new job with PID 9017
2018-09-01 23:26:49,728 [salt.minion      :1708][INFO    ][9017] Returning information for job: 20180901232649677329
2018-09-01 23:26:59,902 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232659884234
2018-09-01 23:27:00,213 [salt.minion      :1431][INFO    ][9273] Starting a new job with PID 9273
2018-09-01 23:27:00,229 [salt.minion      :1708][INFO    ][9273] Returning information for job: 20180901232659884234
2018-09-01 23:27:09,973 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232709954170
2018-09-01 23:27:09,994 [salt.minion      :1431][INFO    ][9471] Starting a new job with PID 9471
2018-09-01 23:27:10,010 [salt.minion      :1708][INFO    ][9471] Returning information for job: 20180901232709954170
2018-09-01 23:27:20,154 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232720134742
2018-09-01 23:27:20,175 [salt.minion      :1431][INFO    ][9719] Starting a new job with PID 9719
2018-09-01 23:27:20,192 [salt.minion      :1708][INFO    ][9719] Returning information for job: 20180901232720134742
2018-09-01 23:27:30,363 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232730344309
2018-09-01 23:27:30,383 [salt.minion      :1431][INFO    ][9971] Starting a new job with PID 9971
2018-09-01 23:27:30,405 [salt.minion      :1708][INFO    ][9971] Returning information for job: 20180901232730344309
2018-09-01 23:27:40,575 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232740553565
2018-09-01 23:27:40,588 [salt.minion      :1431][INFO    ][10229] Starting a new job with PID 10229
2018-09-01 23:27:40,605 [salt.minion      :1708][INFO    ][10229] Returning information for job: 20180901232740553565
2018-09-01 23:27:50,774 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232750758196
2018-09-01 23:27:50,789 [salt.minion      :1431][INFO    ][10520] Starting a new job with PID 10520
2018-09-01 23:27:50,807 [salt.minion      :1708][INFO    ][10520] Returning information for job: 20180901232750758196
2018-09-01 23:28:00,978 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232800959711
2018-09-01 23:28:00,996 [salt.minion      :1431][INFO    ][11279] Starting a new job with PID 11279
2018-09-01 23:28:01,015 [salt.minion      :1708][INFO    ][11279] Returning information for job: 20180901232800959711
2018-09-01 23:28:11,200 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232811181712
2018-09-01 23:28:11,224 [salt.minion      :1431][INFO    ][11290] Starting a new job with PID 11290
2018-09-01 23:28:11,246 [salt.minion      :1708][INFO    ][11290] Returning information for job: 20180901232811181712
2018-09-01 23:28:21,420 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232821401517
2018-09-01 23:28:21,437 [salt.minion      :1431][INFO    ][11429] Starting a new job with PID 11429
2018-09-01 23:28:21,456 [salt.minion      :1708][INFO    ][11429] Returning information for job: 20180901232821401517
2018-09-01 23:28:31,637 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232831619612
2018-09-01 23:28:31,658 [salt.minion      :1431][INFO    ][11854] Starting a new job with PID 11854
2018-09-01 23:28:31,674 [salt.minion      :1708][INFO    ][11854] Returning information for job: 20180901232831619612
2018-09-01 23:28:41,852 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232841833815
2018-09-01 23:28:41,872 [salt.minion      :1431][INFO    ][12205] Starting a new job with PID 12205
2018-09-01 23:28:41,888 [salt.minion      :1708][INFO    ][12205] Returning information for job: 20180901232841833815
2018-09-01 23:28:52,068 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232852050086
2018-09-01 23:28:52,086 [salt.minion      :1431][INFO    ][12472] Starting a new job with PID 12472
2018-09-01 23:28:52,099 [salt.minion      :1708][INFO    ][12472] Returning information for job: 20180901232852050086
2018-09-01 23:29:02,294 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232902277717
2018-09-01 23:29:02,312 [salt.minion      :1431][INFO    ][12537] Starting a new job with PID 12537
2018-09-01 23:29:02,328 [salt.minion      :1708][INFO    ][12537] Returning information for job: 20180901232902277717
2018-09-01 23:29:12,517 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232912501134
2018-09-01 23:29:12,532 [salt.minion      :1431][INFO    ][12546] Starting a new job with PID 12546
2018-09-01 23:29:12,545 [salt.minion      :1708][INFO    ][12546] Returning information for job: 20180901232912501134
2018-09-01 23:29:22,577 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232922562529
2018-09-01 23:29:22,594 [salt.minion      :1431][INFO    ][12590] Starting a new job with PID 12590
2018-09-01 23:29:22,607 [salt.minion      :1708][INFO    ][12590] Returning information for job: 20180901232922562529
2018-09-01 23:29:23,150 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7274] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 23:29:23,205 [salt.state       :290 ][INFO    ][7274] Made the following changes:
'python-routes' changed from 'absent' to '2.4.1-1~u16.04+mcp2'
'python-retrying' changed from 'absent' to '1.3.3-1'
'libjs-angular-file-upload' changed from 'absent' to '12.0.4+dfsg1-2.1~u16.04+mcp2'
'python-os-service-types' changed from 'absent' to '1.1.0-1.0~u16.04+mcp1'
'python-kombu' changed from 'absent' to '4.1.0-1~u16.04+mcp1'
'python-oslo.concurrency' changed from 'absent' to '3.25.0-1.0~u16.04+mcp2'
'python-xstatic-angular-fileupload' changed from 'absent' to '12.0.4.0+dfsg1-1.1~u16.04+mcp2'
'python-sqlparse' changed from 'absent' to '0.2.2-1~u16.04+mcp1'
'python-pint' changed from 'absent' to '0.6-1ubuntu1'
'python-monotonic' changed from 'absent' to '0.6-2'
'python2.7-pymongo' changed from 'absent' to '1'
'python-openstacksdk' changed from 'absent' to '0.11.3+repack-1.0~u16.04+mcp2'
'python-deprecation' changed from 'absent' to '1.0.1-1~u16.04+mcp2'
'python2.7-bson' changed from 'absent' to '1'
'libtiff5' changed from 'absent' to '4.0.6-1ubuntu0.4'
'python-secretstorage' changed from 'absent' to '2.1.3-1'
'libjs-jsencrypt' changed from 'absent' to '2.3.0+dfsg2-1~u16.04+mcp2'
'python-glanceclient' changed from 'absent' to '1:2.10.0-1.0~u16.04+mcp3'
'python-formencode' changed from 'absent' to '1.3.0-0ubuntu5'
'twitter-bootstrap' changed from 'absent' to '1'
'libjs-term.js' changed from 'absent' to '0.0.7-1~u16.04+mcp2'
'python-cachetools' changed from 'absent' to '2.0.0-2.0~u16.04+mcp1'
'python-xstatic-jasmine' changed from 'absent' to '2.4.1.1+fixed1-1~u16.04+mcp1'
'python-semantic-version' changed from 'absent' to '2.3.1-1'
'python-blinker' changed from 'absent' to '1.3.dfsg2-1build1'
'python-django-common' changed from 'absent' to '1:1.11.7-1~u16.04+mcp2'
'python-roman' changed from 'absent' to '2.0.0-2'
'python-prettytable' changed from 'absent' to '0.7.2-3'
'python-bs4' changed from 'absent' to '4.6.0-1~u16.04+mcp1'
'python2.7-pymongo-ext' changed from 'absent' to '1'
'python-tenacity' changed from 'absent' to '4.8.0-1.0~u16.04+mcp1'
'python-unittest2' changed from 'absent' to '1.1.0-6.1'
'python-setuptools' changed from 'absent' to '39.0.1-2~cloud0'
'python2.7-django-appconf' changed from 'absent' to '1'
'python-stevedore' changed from 'absent' to '1:1.25.0-1~u16.04+mcp2'
'docutils-doc' changed from 'absent' to '0.12+dfsg-1'
'python-dbus' changed from 'absent' to '1.2.0-3'
'python-gridfs' changed from 'absent' to '3.2-1build1'
'python-fixtures' changed from 'absent' to '3.0.0-1.1~u16.04+mcp2'
'python-xstatic-jquery.tablesorter' changed from 'absent' to '2.14.5.1-2.0~u16.04+mcp1'
'libjs-twitter-bootstrap' changed from 'absent' to '2.0.2+dfsg-9'
'python-testtools' changed from 'absent' to '2.3.0-1.0~u16.04+mcp1'
'libjs-jquery-cookie' changed from 'absent' to '10-2ubuntu2'
'python-anyjson' changed from 'absent' to '0.3.3-1build1'
'libjs-angularjs-smart-table' changed from 'absent' to '1.4.13-1~u16.04+mcp2'
'python-xstatic-hogan' changed from 'absent' to '2.0.0.2-1'
'python-dogpile.cache' changed from 'absent' to '0.6.2-1.1~u16.04+mcp2'
'python-compressor' changed from 'absent' to '1'
'python-dnspython' changed from 'absent' to '1.14.0-3.1~u16.04+mcp2'
'libjs-spin.js' changed from 'absent' to '1.2.8+dfsg2-1'
'fonts-roboto-fontface' changed from 'absent' to '0.5.0-2~u16.04+mcp2'
'python-pil' changed from 'absent' to '3.1.2-0ubuntu1.1'
'docutils-common' changed from 'absent' to '0.12+dfsg-1'
'python2.7-lxml' changed from 'absent' to '1'
'python-pika' changed from 'absent' to '0.10.0-1'
'libpaper-utils' changed from 'absent' to '1.1.24+nmu4ubuntu1'
'python-fasteners' changed from 'absent' to '0.12.0-2ubuntu1'
'python-babel' changed from 'absent' to '2.3.4+dfsg.1-2.1~u16.04+mcp2'
'python-osc-lib' changed from 'absent' to '1.9.0-1.0~u16.04+mcp1'
'liblcms2-2' changed from 'absent' to '2.6-3ubuntu2'
'python2.7-simplejson' changed from 'absent' to '1'
'python-extras' changed from 'absent' to '1.0.0-2.0~u16.04+mcp1'
'python-xstatic-bootstrap-scss' changed from 'absent' to '3.3.7.1-2~u16.04+mcp3'
'python-xstatic-term.js' changed from 'absent' to '0.0.7.0-2~u16.04+mcp2'
'python-bson-ext' changed from 'absent' to '3.2-1build1'
'python-scgi' changed from 'absent' to '1.13-1.1build1'
'python2.7-pil' changed from 'absent' to '1'
'python-repoze.lru' changed from 'absent' to '0.6-6'
'python-posix-ipc' changed from 'absent' to '0.9.8-2build2'
'formencode-i18n' changed from 'absent' to '1.3.0-0ubuntu5'
'python-xstatic-angular-bootstrap' changed from 'absent' to '2.2.0.0-1.1~u16.04+mcp2'
'python2.7-testtools' changed from 'absent' to '1'
'docutils' changed from 'absent' to '1'
'python-django-pyscss' changed from 'absent' to '2.0.2-4'
'python-xstatic-bootstrap-datepicker' changed from 'absent' to '1.3.1.1-1~u16.04+mcp1'
'python2.7-dbus' changed from 'absent' to '1'
'python-oslo.middleware' changed from 'absent' to '3.34.0-1.0~u16.04+mcp2'
'fonts-materialdesignicons-webfont' changed from 'absent' to '1.4.57-1.1~u16.04+mcp2'
'python-xstatic-angular' changed from 'absent' to '1.5.8.0-1.1~u16.04+mcp2'
'python-pillow' changed from 'absent' to '1'
'python2.7-cinderclient' changed from 'absent' to '1'
'libpaperg' changed from 'absent' to '1'
'python2.7-netifaces' changed from 'absent' to '1'
'python-xstatic-mdi' changed from 'absent' to '1.4.57.0-1.1~u16.04+mcp2'
'python-xstatic-jquery' changed from 'absent' to '1.10.2.1-2~u16.04+mcp2'
'python-oslo.context' changed from 'absent' to '1:2.20.0-1.0~u16.04+mcp1'
'python-neutronclient' changed from 'absent' to '1:6.7.0-1.0~u16.04+mcp12'
'python-pymongo-ext' changed from 'absent' to '3.2-1build1'
'python-xstatic-angular-schema-form' changed from 'absent' to '0.8.13.0-1.1~u16.04+mcp2'
'python2.7-pyinotify' changed from 'absent' to '1'
'libjs-jquery-tablesorter' changed from 'absent' to '10-2ubuntu2'
'python-pyparsing' changed from 'absent' to '2.1.10+dfsg1-1.1~u16.04+mcp2'
'python-babel-localedata' changed from 'absent' to '2.3.4+dfsg.1-2.1~u16.04+mcp2'
'python-positional' changed from 'absent' to '1.1.1-3.1~u16.04+mcp2'
'python-appconf' changed from 'absent' to '1'
'python-cmd2' changed from 'absent' to '0.6.8-1'
'libjs-magic-search' changed from 'absent' to '0.2.5-1'
'python-distribute' changed from 'absent' to '1'
'python-xstatic-tv4' changed from 'absent' to '1.2.7.0-1.1~u16.04+mcp2'
'python-oslo-log' changed from 'absent' to '1'
'python-keystoneclient' changed from 'absent' to '1:3.15.0-1.0~u16.04+mcp2'
'python-xstatic-font-awesome' changed from 'absent' to '4.7.0.0-3~u16.04+mcp2'
'python-rjsmin' changed from 'absent' to '1.0.12+dfsg1-2ubuntu1'
'python-pygments' changed from 'absent' to '2.2.0+dfsg-1~u16.04+mcp2'
'python-pathlib' changed from 'absent' to '1.0.1-2'
'python-iso8601' changed from 'absent' to '0.1.11-1'
'python-xstatic-jsencrypt' changed from 'absent' to '2.3.1.1-2~u16.04+mcp2'
'python-jsonpatch' changed from 'absent' to '1.21-1~u16.04+mcp1'
'python-xstatic-d3' changed from 'absent' to '3.5.17.0-2~u16.04+mcp2'
'libwebpmux1' changed from 'absent' to '0.4.4-1'
'python-xstatic-roboto-fontface' changed from 'absent' to '0.5.0.0-2~u16.04+mcp2'
'python-oslo.policy' changed from 'absent' to '1.33.2-1.0~u16.04+mcp3'
'python-xstatic' changed from 'absent' to '1.0.0-4'
'python-paste' changed from 'absent' to '2.0.3+dfsg-4.1~u16.04+mcp1'
'python-xstatic-jquery-ui' changed from 'absent' to '1.12.0.1+debian+dfsg3-2~u16.04+mcp2'
'python-lxml' changed from 'absent' to '3.5.0-1build1'
'python-oslo.config' changed from 'absent' to '1:5.2.0-1.0~u16.04+mcp5'
'python-futurist' changed from 'absent' to '1.6.0-1.0~u16.04+mcp1'
'libpaper1' changed from 'absent' to '1.1.24+nmu4ubuntu1'
'python-webob' changed from 'absent' to '1:1.7.2-1~u16.04+mcp2'
'python2.7-gi' changed from 'absent' to '1'
'python-linecache2' changed from 'absent' to '1.0.0-2'
'python-xstatic-objectpath' changed from 'absent' to '1.2.1.0-2.1~u16.04+mcp2'
'python-pastedeploy-tpl' changed from 'absent' to '1.5.2-1'
'python-oauthlib' changed from 'absent' to '1.0.3-1'
'python-mimeparse' changed from 'absent' to '0.1.4-1build1'
'python-gi' changed from 'absent' to '3.20.0-0ubuntu1'
'python-xstatic-spin' changed from 'absent' to '1.2.8.0+dfsg1-1'
'python2.7-django-compressor' changed from 'absent' to '1'
'python-xstatic-angular-lrdragndrop' changed from 'absent' to '1.0.2.2-1'
'python-contextlib2' changed from 'absent' to '0.5.1-1'
'python-xstatic-bootswatch' changed from 'absent' to '3.3.7.0-2~u16.04+mcp2'
'python-xstatic-jquery-migrate' changed from 'absent' to '1.2.1.1+dfsg1-1'
'libjs-jquery.quicksearch' changed from 'absent' to '2.0.4-1'
'python-novaclient' changed from 'absent' to '2:9.1.1-1~u16.04+mcp6'
'python-oslo.utils' changed from 'absent' to '3.35.0-1.0~u16.04+mcp8'
'python-pika-pool' changed from 'absent' to '0.1.3-1ubuntu1'
'python-django' changed from 'absent' to '1:1.11.7-1~u16.04+mcp2'
'libjs-twitter-bootstrap-datepicker' changed from 'absent' to '1.3.1+dfsg1-1'
'python-debtcollector' changed from 'absent' to '1.3.0-2'
'python2.7-iso8601' changed from 'absent' to '1'
'python-bson' changed from 'absent' to '3.2-1build1'
'python-simplejson' changed from 'absent' to '3.8.1-1ubuntu2'
'fonts-font-awesome' changed from 'absent' to '4.7.0~dfsg-3~u16.04+mcp2'
'python-docutils' changed from 'absent' to '0.12+dfsg-1'
'python-openid' changed from 'absent' to '2.2.5-6'
'python-pastedeploy' changed from 'absent' to '1.5.2-1'
'python2.7-cmd2' changed from 'absent' to '1'
'libjs-jquery-ui' changed from 'absent' to '1.12.1+dfsg-5~u16.04+mcp2'
'python-tz' changed from 'absent' to '2014.10~dfsg1-0ubuntu2'
'python-pastescript' changed from 'absent' to '1.7.5-3build1'
'python-cliff' changed from 'absent' to '2.8.0-1~u16.04+mcp2'
'python-oslo.i18n' changed from 'absent' to '3.19.0-1.0~u16.04+mcp6'
'python-munch' changed from 'absent' to '2.2.0-1.0~u16.04+mcp1'
'python-xstatic-magic-search' changed from 'absent' to '0.2.5.1-1'
'python-appdirs' changed from 'absent' to '1.4.0-2'
'python2.7-pathlib' changed from 'absent' to '1'
'python-statsd' changed from 'absent' to '3.2.1-2~u16.04+mcp2'
'libjs-d3' changed from 'absent' to '3.5.17-2~u16.04+mcp2'
'python-keyring' changed from 'absent' to '8.5.1-1.1~u16.04+mcp2'
'python-django-appconf' changed from 'absent' to '1.0.1-4'
'python-xstatic-jquery.quicksearch' changed from 'absent' to '2.0.4.1-1'
'python-xstatic-smart-table' changed from 'absent' to '1.4.13.2-2~u16.04+mcp1'
'python-oslo-utils' changed from 'absent' to '1'
'python-oslo.serialization' changed from 'absent' to '2.24.0-1.0~u16.04+mcp1'
'python-django-babel' changed from 'absent' to '0.5.1-1.1~u16.04+mcp2'
'python-unicodecsv' changed from 'absent' to '0.14.1-1'
'python-wrapt' changed from 'absent' to '1.8.0-5build2'
'python-rfc3986' changed from 'absent' to '0.3.1-2.1~u16.04+mcp2'
'python-eventlet' changed from 'absent' to '0.20.0-4~u16.04+mcp2'
'python-django-horizon' changed from 'absent' to '3:13.0.1-4~u16.04+mcp46'
'python2.7-pyparsing' changed from 'absent' to '1'
'python-oslo.log' changed from 'absent' to '3.36.0-1.0~u16.04+mcp6'
'python-pyscss' changed from 'absent' to '1.3.4-5'
'python-pyinotify' changed from 'absent' to '0.9.6-1.1~u16.04+mcp2'
'libjpeg-turbo8' changed from 'absent' to '1.4.2-0ubuntu3.1'
'libjs-angularjs' changed from 'absent' to '1.5.10-1.1~u16.04+mcp2'
'libjpeg8' changed from 'absent' to '8c-2ubuntu8'
'python-amqp' changed from 'absent' to '2.2.1-1~exp1~u16.04+mcp1'
'libjs-bootswatch' changed from 'absent' to '3.3.7+dfsg2-1~u16.04+mcp2'
'libwebp5' changed from 'absent' to '0.4.4-1'
'python-vine' changed from 'absent' to '1.1.3+dfsg-2~u16.04+mcp3'
'python-django-compressor' changed from 'absent' to '2.1-1~u16.04+mcp2'
'python-netifaces' changed from 'absent' to '0.10.4-0.1build2'
'python-decorator' changed from 'absent' to '4.0.6-1'
'python-osprofiler' changed from 'absent' to '1.15.2-1.0~u16.04+mcp3'
'python-os-client-config' changed from 'absent' to '1.29.0-1.0~u16.04+mcp2'
'python-oslo.messaging' changed from 'absent' to '5.35.1-1.0~u16.04+mcp16'
'python-warlock' changed from 'absent' to '1.2.0-2.0~u16.04+mcp1'
'python-tempita' changed from 'absent' to '0.5.2-1build1'
'python-keyrings.alt' changed from 'absent' to '1.1.1-1'
'openstack-dashboard' changed from 'absent' to '3:13.0.1-4~u16.04+mcp46'
'python-json-pointer' changed from 'absent' to '1.9-3'
'libjs-lrdragndrop' changed from 'absent' to '1.0.2-2'
'python-html5lib' changed from 'absent' to '0.999-4'
'python-swiftclient' changed from 'absent' to '1:3.4.0-1~u16.04+mcp2'
'python-jwt' changed from 'absent' to '1.3.0-1ubuntu0.1'
'python2.7-gridfs' changed from 'absent' to '1'
'python-greenlet' changed from 'absent' to '0.4.12-2.0~u16.04+mcp1'
'python-oslo.service' changed from 'absent' to '1.29.0-1.0~u16.04+mcp1'
'python-rcssmin' changed from 'absent' to '1.0.6-1ubuntu1'
'python-ceilometerclient' changed from 'absent' to '2.9.0-2~u16.04+mcp1'
'python-csscompressor' changed from 'absent' to '0.9.4-2'
'python-traceback2' changed from 'absent' to '1.4.0-3'
'python-jmespath' changed from 'absent' to '0.9.0-2'
'python-keystoneauth1' changed from 'absent' to '3.4.0-1.0~u16.04+mcp7'
'libjs-angular-gettext' changed from 'absent' to '2.3.8-2~u16.04+mcp2'
'python-pymongo' changed from 'absent' to '3.2-1build1'
'libjs-jquery-metadata' changed from 'absent' to '10-2ubuntu2'
'libjs-rickshaw' changed from 'absent' to '1.5.1.dfsg-1'
'python-xstatic-rickshaw' changed from 'absent' to '1.5.0.2-2'
'python-cinderclient' changed from 'absent' to '1:3.5.0-1.0~u16.04+mcp1'
'python-requestsexceptions' changed from 'absent' to '1.3.0-3~u16.04+mcp2'
'python-oslo-context' changed from 'absent' to '1'
'python2.7-bson-ext' changed from 'absent' to '1'
'python-xstatic-angular-gettext' changed from 'absent' to '2.3.8.0-2~u16.04+mcp2'
'libjbig0' changed from 'absent' to '2.1-3.1'

2018-09-01 23:29:23,224 [salt.state       :905 ][INFO    ][7274] Loading fresh modules for state activity
2018-09-01 23:29:23,251 [salt.state       :1941][INFO    ][7274] Completed state [openstack-dashboard] at time 23:29:23.251527 duration_in_ms=175710.615
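Note: the three-minute stretch of saltutil.find_job entries (23:26:29 through 23:29:22) is the master polling the still-running job roughly every 10 seconds while apt works; it is routine job tracking, not an error. Likewise, the ~200 'changed from absent' lines above are dependencies resolved by apt for a single target, so the state behind all of it is just:

    openstack-dashboard:
      pkg.installed: []                 # one target; the long change list is apt dependency resolution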
2018-09-01 23:29:23,256 [salt.state       :1770][INFO    ][7274] Running state [python-lesscpy] at time 23:29:23.256502
2018-09-01 23:29:23,256 [salt.state       :1803][INFO    ][7274] Executing state pkg.installed for [python-lesscpy]
2018-09-01 23:29:24,561 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7274] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-09-01 23:29:24,589 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7274] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'python-lesscpy'] in directory '/root'
2018-09-01 23:29:26,891 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7274] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 23:29:26,947 [salt.state       :290 ][INFO    ][7274] Made the following changes:
'python-lesscpy' changed from 'absent' to '0.10-1'

2018-09-01 23:29:26,972 [salt.state       :905 ][INFO    ][7274] Loading fresh modules for state activity
2018-09-01 23:29:27,101 [salt.state       :1941][INFO    ][7274] Completed state [python-lesscpy] at time 23:29:27.101826 duration_in_ms=3845.324
2018-09-01 23:29:27,106 [salt.state       :1770][INFO    ][7274] Running state [python-memcache] at time 23:29:27.106121
2018-09-01 23:29:27,106 [salt.state       :1803][INFO    ][7274] Executing state pkg.installed for [python-memcache]
2018-09-01 23:29:27,585 [salt.state       :290 ][INFO    ][7274] All specified packages are already installed
2018-09-01 23:29:27,585 [salt.state       :1941][INFO    ][7274] Completed state [python-memcache] at time 23:29:27.585330 duration_in_ms=479.209
2018-09-01 23:29:27,585 [salt.state       :1770][INFO    ][7274] Running state [gettext-base] at time 23:29:27.585583
2018-09-01 23:29:27,585 [salt.state       :1803][INFO    ][7274] Executing state pkg.installed for [gettext-base]
2018-09-01 23:29:27,591 [salt.state       :290 ][INFO    ][7274] All specified packages are already installed
2018-09-01 23:29:27,591 [salt.state       :1941][INFO    ][7274] Completed state [gettext-base] at time 23:29:27.591297 duration_in_ms=5.713
2018-09-01 23:29:27,591 [salt.state       :1770][INFO    ][7274] Running state [openstack-dashboard-apache] at time 23:29:27.591964
2018-09-01 23:29:27,592 [salt.state       :1803][INFO    ][7274] Executing state pkg.purged for [openstack-dashboard-apache]
2018-09-01 23:29:27,601 [salt.state       :290 ][INFO    ][7274] All specified packages are already absent
2018-09-01 23:29:27,601 [salt.state       :1941][INFO    ][7274] Completed state [openstack-dashboard-apache] at time 23:29:27.601203 duration_in_ms=9.239
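Note: the next state manages /etc/openstack-dashboard/local_settings.py. The fetch list that follows (a queens_settings.py base plus _local/_horizon/_keystone/_nova/... fragments) suggests the source is a Jinja template that pulls in per-service settings blocks. A hedged sketch, with the template engine and include mechanics as assumptions:

    /etc/openstack-dashboard/local_settings.py:
      file.managed:
        - source: salt://horizon/files/local_settings/queens_settings.py
        - template: jinja               # assumption: the fragment fetches imply jinja includes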
2018-09-01 23:29:27,602 [salt.state       :1770][INFO    ][7274] Running state [/etc/openstack-dashboard/local_settings.py] at time 23:29:27.602710
2018-09-01 23:29:27,602 [salt.state       :1803][INFO    ][7274] Executing state file.managed for [/etc/openstack-dashboard/local_settings.py]
2018-09-01 23:29:27,620 [salt.fileclient  :1215][INFO    ][7274] Fetching file from saltenv 'base', ** done ** 'horizon/files/local_settings/queens_settings.py'
2018-09-01 23:29:27,663 [salt.fileclient  :1215][INFO    ][7274] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_local_settings.py'
2018-09-01 23:29:27,709 [salt.fileclient  :1215][INFO    ][7274] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_horizon_settings.py'
2018-09-01 23:29:27,735 [salt.fileclient  :1215][INFO    ][7274] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_keystone_settings.py'
2018-09-01 23:29:27,762 [salt.fileclient  :1215][INFO    ][7274] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_nova_settings.py'
2018-09-01 23:29:27,777 [salt.fileclient  :1215][INFO    ][7274] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_glance_settings.py'
2018-09-01 23:29:27,795 [salt.fileclient  :1215][INFO    ][7274] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_neutron_settings.py'
2018-09-01 23:29:27,811 [salt.fileclient  :1215][INFO    ][7274] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_heat_settings.py'
2018-09-01 23:29:27,830 [salt.fileclient  :1215][INFO    ][7274] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_websso_settings.py'
2018-09-01 23:29:27,855 [salt.fileclient  :1215][INFO    ][7274] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_ssl_settings.py'
2018-09-01 23:29:27,864 [salt.state       :290 ][INFO    ][7274] File changed:
--- 
+++ 
@@ -1,173 +1,83 @@
-# -*- coding: utf-8 -*-
-
 import os
 
+from django.utils.translation import pgettext_lazy
 from django.utils.translation import ugettext_lazy as _
-
-from horizon.utils import secret_key
-
-from openstack_dashboard.settings import HORIZON_CONFIG
-
-DEBUG = True
-
-# This setting controls whether or not compression is enabled. Disabling
-# compression makes Horizon considerably slower, but makes it much easier
-# to debug JS and CSS changes
-#COMPRESS_ENABLED = not DEBUG
-
-# This setting controls whether compression happens on the fly, or offline
-# with `python manage.py compress`
-# See https://django-compressor.readthedocs.io/en/latest/usage/#offline-compression
-# for more information
-#COMPRESS_OFFLINE = not DEBUG
-
-# WEBROOT is the location relative to Webserver root
-# should end with a slash.
-WEBROOT = '/'
-#LOGIN_URL = WEBROOT + 'auth/login/'
-#LOGOUT_URL = WEBROOT + 'auth/logout/'
-#
-# LOGIN_REDIRECT_URL can be used as an alternative for
-# HORIZON_CONFIG.user_home, if user_home is not set.
-# Do not set it to '/home/', as this will cause circular redirect loop
-#LOGIN_REDIRECT_URL = WEBROOT
-
-# If horizon is running in production (DEBUG is False), set this
-# with the list of host/domain names that the application can serve.
-# For more information see:
-# https://docs.djangoproject.com/en/dev/ref/settings/#allowed-hosts
-ALLOWED_HOSTS = [ 'prx02', 'localhost', ]
-
-# Set SSL proxy settings:
-# Pass this header from the proxy after terminating the SSL,
-# and don't forget to strip it from the client's request.
-# For more information see:
-# https://docs.djangoproject.com/en/dev/ref/settings/#secure-proxy-ssl-header
-#SECURE_PROXY_SSL_HEADER = ('HTTP_X_FORWARDED_PROTO', 'https')
-
-# If Horizon is being served through SSL, then uncomment the following two
-# settings to better secure the cookies from security exploits
-#CSRF_COOKIE_SECURE = True
-#SESSION_COOKIE_SECURE = True
-
-# The absolute path to the directory where message files are collected.
-# The message file must have a .json file extension. When the user logins to
-# horizon, the message files collected are processed and displayed to the user.
-#MESSAGES_PATH=None
-
-# Overrides for OpenStack API versions. Use this setting to force the
-# OpenStack dashboard to use a specific API version for a given service API.
-# Versions specified here should be integers or floats, not strings.
-# NOTE: The version should be formatted as it appears in the URL for the
-# service API. For example, The identity service APIs have inconsistent
-# use of the decimal point, so valid options would be 2.0 or 3.
-# Minimum compute version to get the instance locked status is 2.9.
-#OPENSTACK_API_VERSIONS = {
-#    "data-processing": 1.1,
-#    "identity": 3,
-#    "image": 2,
-#    "volume": 2,
-#    "compute": 2,
-#}
-
-# Set this to True if running on a multi-domain model. When this is enabled, it
-# will require the user to enter the Domain name in addition to the username
-# for login.
-#OPENSTACK_KEYSTONE_MULTIDOMAIN_SUPPORT = False
-
-# Set this to True if you want available domains displayed as a dropdown menu
-# on the login screen. It is strongly advised NOT to enable this for public
-# clouds, as advertising enabled domains to unauthenticated customers
-# irresponsibly exposes private information. This should only be used for
-# private clouds where the dashboard sits behind a corporate firewall.
-#OPENSTACK_KEYSTONE_DOMAIN_DROPDOWN = False
-
-# If OPENSTACK_KEYSTONE_DOMAIN_DROPDOWN is enabled, this option can be used to
-# set the available domains to choose from. This is a list of pairs whose first
-# value is the domain name and the second is the display name.
-#OPENSTACK_KEYSTONE_DOMAIN_CHOICES = (
-#  ('Default', 'Default'),
-#)
-
-# Overrides the default domain used when running on single-domain model
-# with Keystone V3. All entities will be created in the default domain.
-# NOTE: This value must be the name of the default domain, NOT the ID.
-# Also, you will most likely have a value in the keystone policy file like this
-#    "cloud_admin": "rule:admin_required and domain_id:<your domain id>"
-# This value must be the name of the domain whose ID is specified there.
-#OPENSTACK_KEYSTONE_DEFAULT_DOMAIN = 'Default'
-
-# Set this to True to enable panels that provide the ability for users to
-# manage Identity Providers (IdPs) and establish a set of rules to map
-# federation protocol attributes to Identity API attributes.
-# This extension requires v3.0+ of the Identity API.
-#OPENSTACK_KEYSTONE_FEDERATION_MANAGEMENT = False
-
-# Set Console type:
-# valid options are "AUTO"(default), "VNC", "SPICE", "RDP", "SERIAL", "MKS"
-# or None. Set to None explicitly if you want to deactivate the console.
-#CONSOLE_TYPE = "AUTO"
-
-# Toggle showing the openrc file for Keystone V2.
-# If set to false the link will be removed from the user dropdown menu
-# and the API Access page
-#SHOW_KEYSTONE_V2_RC = True
-
-# If provided, a "Report Bug" link will be displayed in the site header
-# which links to the value of this setting (ideally a URL containing
-# information on how to report issues).
-#HORIZON_CONFIG["bug_url"] = "http://bug-report.example.com"
-
-# Show backdrop element outside the modal, do not close the modal
-# after clicking on backdrop.
-#HORIZON_CONFIG["modal_backdrop"] = "static"
-
-# Specify a regular expression to validate user passwords.
-#HORIZON_CONFIG["password_validator"] = {
-#    "regex": '.*',
-#    "help_text": _("Your password does not meet the requirements."),
-#}
-
-# Disable simplified floating IP address management for deployments with
-# multiple floating IP pools or complex network requirements.
-#HORIZON_CONFIG["simple_ip_management"] = False
-
-# Turn off browser autocompletion for forms including the login form and
-# the database creation workflow if so desired.
-#HORIZON_CONFIG["password_autocomplete"] = "off"
-
-# Setting this to True will disable the reveal button for password fields,
-# including on the login form.
-#HORIZON_CONFIG["disable_password_reveal"] = False
+from openstack_dashboard import exceptions
+
+HORIZON_CONFIG = {
+    'user_home': 'openstack_dashboard.views.get_user_home',
+    'ajax_queue_limit': 10,
+    'auto_fade_alerts': {
+        'delay': 3000,
+        'fade_duration': 1500,
+        'types': ['alert-success', 'alert-info']
+    },
+    'help_url': "http://docs.openstack.org",
+    'exceptions': {'recoverable': exceptions.RECOVERABLE,
+                   'not_found': exceptions.NOT_FOUND,
+                   'unauthorized': exceptions.UNAUTHORIZED},
+    'modal_backdrop': 'static',
+    'angular_modules': [],
+    'js_files': [],
+    'js_spec_files': [],
+    'disable_password_reveal': True,
+    'password_autocomplete': 'off'
+}
+# Each theme is a tuple of ('key', 'label', 'path').
+AVAILABLE_THEMES = [
+    (
+        "default",
+        pgettext_lazy("Default style theme", "Default"),
+        "themes/default"
+    ),
+    (
+        "material",
+        pgettext_lazy("Google's Material Design style theme", "Material"),
+        "themes/material"
+    ),
+]
+
+# The default theme if no cookie is present
+DEFAULT_THEME = 'default'
+
+# Theme Static Directory
+THEME_COLLECTION_DIR = 'themes'
+
+# Theme Cookie Name
+THEME_COOKIE_NAME = 'theme'
+
+INSTALLED_APPS = (
+    'openstack_dashboard',
+    'django.contrib.contenttypes',
+    'django.contrib.auth',
+    'django.contrib.sessions',
+    'django.contrib.messages',
+    'django.contrib.staticfiles',
+    'django.contrib.humanize',
+    'compressor',
+    'horizon',
+    'openstack_auth',
+)
+
+
+DEBUG = False
+
+TEMPLATE_DEBUG = DEBUG
+
+ALLOWED_HOSTS = ['*']
+
+AUTHENTICATION_URLS = ['openstack_auth.urls']
 
 LOCAL_PATH = os.path.dirname(os.path.abspath(__file__))
 
-# Set custom secret key:
-# You can either set it to a specific value or you can let horizon generate a
-# default secret key that is unique on this machine, i.e. regardless of the
-# amount of Python WSGI workers (if used behind Apache+mod_wsgi): However,
-# there may be situations where you would want to set this explicitly, e.g.
-# when multiple dashboard instances are distributed on different machines
-# (usually behind a load-balancer). Either you have to make sure that a session
-# gets all requests routed to the same dashboard instance or you set the same
-# SECRET_KEY for all of them.
-SECRET_KEY = secret_key.generate_or_read_from_file(
-    os.path.join("/","var","lib","openstack-dashboard","secret-key", '.secret_key_store'))
-
-# We recommend you use memcached for development; otherwise after every reload
-# of the django development server, you will have to login again. To use
-# memcached set CACHES to something like
-#CACHES = {
-#    'default': {
-#        'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
-#        'LOCATION': '127.0.0.1:11211',
-#    },
-#}
+SECRET_KEY = 'opaesee8Que2yahJoh9fo0eefo1Aeyo6ahyei8zeiboh3aeth5loth7ieNa5xi5e'
 
 CACHES = {
     'default': {
-        'BACKEND': 'django.core.cache.backends.locmem.LocMemCache',
-    },
+        'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
+        'LOCATION': "172.30.10.103:11211"
+    }
 }
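+# NOTE: with SESSION_ENGINE = "django.contrib.sessions.backends.cache" (set
+# further below), user sessions are stored in the memcached instance
+# configured above, so restarting memcached logs all users out.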
 
 # Send email to the console by default
@@ -176,76 +86,249 @@
 #EMAIL_BACKEND = 'django.core.mail.backends.dummy.EmailBackend'
 
 # Configure these for your outgoing email host
-#EMAIL_HOST = 'smtp.my-company.com'
-#EMAIL_PORT = 25
-#EMAIL_HOST_USER = 'djangomail'
-#EMAIL_HOST_PASSWORD = 'top-secret!'
+# EMAIL_HOST = 'smtp.my-company.com'
+# EMAIL_PORT = 25
+# EMAIL_HOST_USER = 'djangomail'
+# EMAIL_HOST_PASSWORD = 'top-secret!'
+
+# The number of objects (Swift containers/objects or images) to display
+# on a single page before providing a paging element (a "more" link)
+# to paginate results.
+API_RESULT_LIMIT = 1000
+API_RESULT_PAGE_SIZE = 20
+
+# The timezone of the server. This should correspond with the timezone
+# of your entire OpenStack installation, and hopefully be in UTC.
+TIME_ZONE = "UTC"
+
+COMPRESS_OFFLINE = True
+
+# Trove user and database extension support. By default support for
+# creating users and databases on database instances is turned on.
+# To disable these extensions set the permission here to something
+# unusable such as ["!"].
+# TROVE_ADD_USER_PERMS = []
+# TROVE_ADD_DATABASE_PERMS = []
+
+SITE_BRANDING = 'OpenStack Dashboard'
+SESSION_COOKIE_HTTPONLY = True
+BOOT_ONLY_FROM_VOLUME = True
+
+REST_API_REQUIRED_SETTINGS = ['OPENSTACK_HYPERVISOR_FEATURES',
+                              'LAUNCH_INSTANCE_DEFAULTS',
+                              'OPENSTACK_IMAGE_FORMATS']
+
+
+# Specify a regular expression to validate user passwords.
+# HORIZON_CONFIG["password_validator"] = {
+#     "regex": '.*',
+#     "help_text": _("Your password does not meet the requirements.")
+# }
+
+# Turn off browser autocompletion for the login form if so desired.
+# HORIZON_CONFIG["password_autocomplete"] = "off"
+
+# The Horizon Policy Enforcement engine uses these values to load per service
+# policy rule files. The content of these files should match the files the
+# OpenStack services are using to determine role based access control in the
+# target installation.
+
+SESSION_TIMEOUT = 43200
+SESSION_ENGINE = "django.contrib.sessions.backends.cache"
+DROPDOWN_MAX_ITEMS = 30
+# A dictionary of settings which can be used to provide the default values for
+# properties found in the Launch Instance modal.
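+# For example (values mirroring the upstream commented sample):
+# LAUNCH_INSTANCE_DEFAULTS = {
+#     'config_drive': False,
+#     'create_volume': True,
+# }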
+
+# Path to directory containing policy.json files
+POLICY_FILES_PATH = "/usr/share/openstack-dashboard/openstack_dashboard/conf"
+# Map of local copy of service policy files
+POLICY_FILES = {
+    "compute": "nova_policy.json",
+    "network": "neutron_policy.json",
+    "image": "glance_policy.json",
+    "telemetry": "ceilometer_policy.json",
+    "volume": "cinder_policy.json",
+    "orchestration": "heat_policy.json",
+    "identity": "keystone_policy.json",
+}
+
+LOGGING = {
+    'version': 1,
+    # When set to True this will disable all logging except
+    # for loggers specified in this configuration dictionary. Note that
+    # if nothing is specified here and disable_existing_loggers is True,
+    # django.db.backends will still log unless it is disabled explicitly.
+    'disable_existing_loggers': False,
+    'handlers': {
+        'null': {
+            'level': 'DEBUG',
+            'class': 'logging.NullHandler',
+        },
+        'console': {
+            # Set the level to "DEBUG" for verbose output logging.
+            'level': 'INFO',
+            'class': 'logging.StreamHandler',
+        },
+        'file': {
+            'level': 'DEBUG',
+            'class': 'logging.FileHandler',
+            'filename': '/var/log/horizon/horizon.log',
+        },
+    },
+    'loggers': {
+        # Logging from django.db.backends is VERY verbose, send to null
+        # by default.
+        'django.db.backends': {
+            'handlers': ['null'],
+            'propagate': False,
+        },
+        # Starting with Pike, DEBUG-level logging for django.template emits
+        # false-positive traces, so keep it at INFO by default (bug PROD-17558).
+        'django.template': {
+            'handlers': ['file'],
+            'level': 'INFO',
+            'propagate': True,
+        },
+        'requests': {
+            'handlers': ['null'],
+            'propagate': False,
+        },
+        'horizon': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'openstack_dashboard': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'novaclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'cinderclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'keystoneclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'glanceclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'neutronclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'heatclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'ceilometerclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'troveclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'mistralclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'swiftclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'openstack_auth': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'scss.expression': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'nose.plugins.manager': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'django': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'iso8601': {
+            'handlers': ['null'],
+            'propagate': False,
+        },
+    }
+}
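+# In short: horizon, openstack_dashboard and the python-*client libraries log
+# at DEBUG to /var/log/horizon/horizon.log via the 'file' handler, while
+# 'requests', 'django.db.backends' and 'iso8601' are silenced via 'null'.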
+
+
+# Overrides for OpenStack API versions. Use this setting to force the
+# OpenStack dashboard to use a specific API version for a given service API.
+# NOTE: The version should be formatted as it appears in the URL for the
+# service API. For example, the identity service APIs have inconsistent
+# use of the decimal point, so valid options would be "2.0" or "3".
+OPENSTACK_API_VERSIONS = {
+    "identity": 3
+}
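+# Other service APIs can be pinned the same way, e.g. "image": 2 or
+# "volume": 2 (see the commented sample removed above).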
+# Set this to True if running on a multi-domain model. When this is enabled,
+# it will require the user to enter the Domain name in addition to the
+# username for login.
+# OPENSTACK_KEYSTONE_MULTIDOMAIN_SUPPORT = False
+
+# Overrides the default domain used when running on single-domain model
+# with Keystone V3. All entities will be created in the default domain.
+# OPENSTACK_KEYSTONE_DEFAULT_DOMAIN = 'Default'
 
 # For multiple regions uncomment this configuration, and add (endpoint, title).
-#AVAILABLE_REGIONS = [
-#    ('http://cluster1.example.com:5000/v3', 'cluster1'),
-#    ('http://cluster2.example.com:5000/v3', 'cluster2'),
-#]
-
-OPENSTACK_HOST = "127.0.0.1"
+# AVAILABLE_REGIONS = [
+#     ('http://cluster1.example.com:5000/v2.0', 'cluster1'),
+#     ('http://cluster2.example.com:5000/v2.0', 'cluster2'),
+# ]
+
+
+OPENSTACK_HOST = "10.167.4.35"
 OPENSTACK_KEYSTONE_URL = "http://%s:5000/v3" % OPENSTACK_HOST
-OPENSTACK_KEYSTONE_DEFAULT_ROLE = "_member_"
-
-# For setting the default service region on a per-endpoint basis. Note that the
-# default value for this setting is {}, and below is just an example of how it
-# should be specified.
-#DEFAULT_SERVICE_REGIONS = {
-#    OPENSTACK_KEYSTONE_URL: 'RegionOne'
-#}
-
-# Enables keystone web single-sign-on if set to True.
-#WEBSSO_ENABLED = False
-
-# Authentication mechanism to be selected as default.
-# The value must be a key from WEBSSO_CHOICES.
-#WEBSSO_INITIAL_CHOICE = "credentials"
-
-# The list of authentication mechanisms which include keystone
-# federation protocols and identity provider/federation protocol
-# mapping keys (WEBSSO_IDP_MAPPING). Current supported protocol
-# IDs are 'saml2' and 'oidc'  which represent SAML 2.0, OpenID
-# Connect respectively.
-# Do not remove the mandatory credentials mechanism.
-# Note: The last two tuples are sample mapping keys to a identity provider
-# and federation protocol combination (WEBSSO_IDP_MAPPING).
-#WEBSSO_CHOICES = (
-#    ("credentials", _("Keystone Credentials")),
-#    ("oidc", _("OpenID Connect")),
-#    ("saml2", _("Security Assertion Markup Language")),
-#    ("acme_oidc", "ACME - OpenID Connect"),
-#    ("acme_saml2", "ACME - SAML2"),
-#)
-
-# A dictionary of specific identity provider and federation protocol
-# combinations. From the selected authentication mechanism, the value
-# will be looked up as keys in the dictionary. If a match is found,
-# it will redirect the user to a identity provider and federation protocol
-# specific WebSSO endpoint in keystone, otherwise it will use the value
-# as the protocol_id when redirecting to the WebSSO by protocol endpoint.
-# NOTE: The value is expected to be a tuple formatted as: (<idp_id>, <protocol_id>).
-#WEBSSO_IDP_MAPPING = {
-#    "acme_oidc": ("acme", "oidc"),
-#    "acme_saml2": ("acme", "saml2"),
-#}
-
-# The Keystone Provider drop down uses Keystone to Keystone federation
-# to switch between Keystone service providers.
-# Set display name for Identity Provider (dropdown display name)
-#KEYSTONE_PROVIDER_IDP_NAME = "Local Keystone"
-# This id is used for only for comparison with the service provider IDs. This ID
-# should not match any service provider IDs.
-#KEYSTONE_PROVIDER_IDP_ID = "localkeystone"
+
+OPENSTACK_KEYSTONE_MULTIDOMAIN_SUPPORT = False
+OPENSTACK_KEYSTONE_DEFAULT_DOMAIN = "default"
+
+OPENSTACK_KEYSTONE_DEFAULT_ROLE = "Member"
 
 # Disable SSL certificate checks (useful for self-signed certificates):
-#OPENSTACK_SSL_NO_VERIFY = True
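+# OPENSTACK_SSL_NO_VERIFY = True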
 
 # The CA certificate to use to verify SSL connections
-#OPENSTACK_SSL_CACERT = '/path/to/cacert.pem'
+# OPENSTACK_SSL_CACERT = '/path/to/cacert.pem'
+
+# OPENSTACK_ENDPOINT_TYPE specifies the endpoint type to use for the endpoints
+# in the Keystone service catalog. Use this setting when Horizon is running
+# external to the OpenStack environment. The default is 'publicURL'.
+OPENSTACK_ENDPOINT_TYPE = "internalURL"
+
+# SECONDARY_ENDPOINT_TYPE specifies the fallback endpoint type to use in the
+# case that OPENSTACK_ENDPOINT_TYPE is not present in the endpoints
+# in the Keystone service catalog. Use this setting when Horizon is running
+# external to the OpenStack environment. The default is None.  This
+# value should differ from OPENSTACK_ENDPOINT_TYPE if used.
+#SECONDARY_ENDPOINT_TYPE = "publicURL"
 
 # The OPENSTACK_KEYSTONE_BACKEND settings can be used to identify the
 # capabilities of the auth backend for Keystone.
@@ -259,43 +342,13 @@
     'can_edit_group': True,
     'can_edit_project': True,
     'can_edit_domain': True,
-    'can_edit_role': True,
-}
-
-# Setting this to True, will add a new "Retrieve Password" action on instance,
-# allowing Admin session password retrieval/decryption.
-#OPENSTACK_ENABLE_PASSWORD_RETRIEVE = False
-
-# This setting allows deployers to control whether a token is deleted on log
-# out. This can be helpful when there are often long running processes being
-# run in the Horizon environment.
-#TOKEN_DELETION_DISABLED = False
-
-# The Launch Instance user experience has been significantly enhanced.
-# You can choose whether to enable the new launch instance experience,
-# the legacy experience, or both. The legacy experience will be removed
-# in a future release, but is available as a temporary backup setting to ensure
-# compatibility with existing deployments. Further development will not be
-# done on the legacy experience. Please report any problems with the new
-# experience via the Launchpad tracking system.
-#
-# Toggle LAUNCH_INSTANCE_LEGACY_ENABLED and LAUNCH_INSTANCE_NG_ENABLED to
-# determine the experience to enable.  Set them both to true to enable
-# both.
-#LAUNCH_INSTANCE_LEGACY_ENABLED = True
-#LAUNCH_INSTANCE_NG_ENABLED = False
-
-# A dictionary of settings which can be used to provide the default values for
-# properties found in the Launch Instance modal.
-#LAUNCH_INSTANCE_DEFAULTS = {
-#    'config_drive': False,
-#    'enable_scheduler_hints': True,
-#    'disable_image': False,
-#    'disable_instance_snapshot': False,
-#    'disable_volume': False,
-#    'disable_volume_snapshot': False,
-#    'create_volume': True,
-#}
+    'can_edit_role': True
+}
+
+
+# Set Console type:
+# valid options would be "AUTO", "VNC" or "SPICE"
+# CONSOLE_TYPE = "AUTO"
 
 # The Xen Hypervisor has the ability to set the mount point for volumes
 # attached to instances (other Hypervisors currently do not). Setting
@@ -304,102 +357,52 @@
 OPENSTACK_HYPERVISOR_FEATURES = {
     'can_set_mount_point': False,
     'can_set_password': False,
-    'requires_keypair': False,
-    'enable_quotas': True
-}
-
-# This settings controls whether IP addresses of servers are retrieved from
-# neutron in the project instance table. Setting this to ``False`` may mitigate
-# a performance issue in the project instance table in large deployments.
-#OPENSTACK_INSTANCE_RETRIEVE_IP_ADDRESSES = True
-
-# The OPENSTACK_CINDER_FEATURES settings can be used to enable optional
-# services provided by cinder that is not exposed by its extension API.
-OPENSTACK_CINDER_FEATURES = {
-    'enable_backup': False,
-}
-
-# The OPENSTACK_NEUTRON_NETWORK settings can be used to enable optional
-# services provided by neutron. Options currently available are load
-# balancer service, security groups, quotas, VPN service.
-OPENSTACK_NEUTRON_NETWORK = {
-    'enable_router': True,
-    'enable_quotas': True,
-    'enable_ipv6': True,
-    'enable_distributed_router': False,
-    'enable_ha_router': False,
-    'enable_fip_topology_check': True,
-
-    # Default dns servers you would like to use when a subnet is
-    # created.  This is only a default, users can still choose a different
-    # list of dns servers when creating a new subnet.
-    # The entries below are examples only, and are not appropriate for
-    # real deployments
-    # 'default_dns_nameservers': ["8.8.8.8", "8.8.4.4", "208.67.222.222"],
-
-    # Set which provider network types are supported. Only the network types
-    # in this list will be available to choose from when creating a network.
-    # Network types include local, flat, vlan, gre, vxlan and geneve.
-    # 'supported_provider_types': ['*'],
-
-    # You can configure available segmentation ID range per network type
-    # in your deployment.
-    # 'segmentation_id_range': {
-    #     'vlan': [1024, 2048],
-    #     'vxlan': [4094, 65536],
-    # },
-
-    # You can define additional provider network types here.
-    # 'extra_provider_types': {
-    #     'awesome_type': {
-    #         'display_name': 'Awesome New Type',
-    #         'require_physical_network': False,
-    #         'require_segmentation_id': True,
-    #     }
-    # },
-
-    # Set which VNIC types are supported for port binding. Only the VNIC
-    # types in this list will be available to choose from when creating a
-    # port.
-    # VNIC types include 'normal', 'direct', 'direct-physical', 'macvtap',
-    # 'baremetal' and 'virtio-forwarder'
-    # Set to empty list or None to disable VNIC type selection.
-    'supported_vnic_types': ['*'],
-
-    # Set list of available physical networks to be selected in the physical
-    # network field on the admin create network modal. If it's set to an empty
-    # list, the field will be a regular input field.
-    # e.g. ['default', 'test']
-    'physical_networks': [],
-
-}
-
-# The OPENSTACK_HEAT_STACK settings can be used to disable password
-# field required while launching the stack.
-OPENSTACK_HEAT_STACK = {
-    'enable_user_pass': True,
-}
+}
+
+# When set, enables the instance action "Retrieve password"
+# allowing password retrieval
+OPENSTACK_ENABLE_PASSWORD_RETRIEVE = False
+
+# When launching an instance, the menu of available flavors is
+# sorted by RAM usage, ascending.  Provide a callback method here
+# (and/or a flag for reverse sort) for the sorted() method if you'd
+# like a different behaviour.  For more info, see
+# http://docs.python.org/2/library/functions.html#sorted
+# CREATE_INSTANCE_FLAVOR_SORT = {
+#     'key': my_awesome_callback_method,
+#     'reverse': False,
+# }
+
+FLAVOR_EXTRA_KEYS = {
+    'flavor_keys': [
+        ('quota:read_bytes_sec', _('Quota: Read bytes')),
+        ('quota:write_bytes_sec', _('Quota: Write bytes')),
+        ('quota:cpu_quota', _('Quota: CPU')),
+        ('quota:cpu_period', _('Quota: CPU period')),
+        ('quota:inbound_average', _('Quota: Inbound average')),
+        ('quota:outbound_average', _('Quota: Outbound average')),
+    ]
+}
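+# NOTE: FLAVOR_EXTRA_KEYS is deprecated upstream; extra-spec metadata should
+# instead be loaded into the Glance Metadata Definition Catalog (see the
+# deprecation notice removed further below).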
+
 
 # The OPENSTACK_IMAGE_BACKEND settings can be used to customize features
 # in the OpenStack Dashboard related to the Image service, such as the list
 # of supported image formats.
-#OPENSTACK_IMAGE_BACKEND = {
-#    'image_formats': [
-#        ('', _('Select format')),
-#        ('aki', _('AKI - Amazon Kernel Image')),
-#        ('ami', _('AMI - Amazon Machine Image')),
-#        ('ari', _('ARI - Amazon Ramdisk Image')),
-#        ('docker', _('Docker')),
-#        ('iso', _('ISO - Optical Disk Image')),
-#        ('ova', _('OVA - Open Virtual Appliance')),
-#        ('qcow2', _('QCOW2 - QEMU Emulator')),
-#        ('raw', _('Raw')),
-#        ('vdi', _('VDI - Virtual Disk Image')),
-#        ('vhd', _('VHD - Virtual Hard Disk')),
-#        ('vhdx', _('VHDX - Large Virtual Hard Disk')),
-#        ('vmdk', _('VMDK - Virtual Machine Disk')),
-#    ],
-#}
+OPENSTACK_IMAGE_BACKEND = {
+    'image_formats': [
+        ('', ''),
+        ('aki', _('AKI - Amazon Kernel Image')),
+        ('ami', _('AMI - Amazon Machine Image')),
+        ('ari', _('ARI - Amazon Ramdisk Image')),
+        ('iso', _('ISO - Optical Disk Image')),
+        ('qcow2', _('QCOW2 - QEMU Emulator')),
+        ('raw', _('Raw')),
+        ('vdi', _('VDI')),
+        ('vhd', _('VHD')),
+        ('vmdk', _('VMDK')),
+        ('docker', _('Docker Container'))
+    ]
+}
 
 # The IMAGE_CUSTOM_PROPERTY_TITLES settings is used to customize the titles for
 # image custom property attributes that appear on image detail pages.
@@ -409,285 +412,53 @@
     "ramdisk_id": _("Ramdisk ID"),
     "image_state": _("Euca2ools state"),
     "project_id": _("Project ID"),
-    "image_type": _("Image Type"),
-}
-
-# The IMAGE_RESERVED_CUSTOM_PROPERTIES setting is used to specify which image
-# custom properties should not be displayed in the Image Custom Properties
-# table.
-IMAGE_RESERVED_CUSTOM_PROPERTIES = []
-
-# Set to 'legacy' or 'direct' to allow users to upload images to glance via
-# Horizon server. When enabled, a file form field will appear on the create
-# image form. If set to 'off', there will be no file form field on the create
-# image form. See documentation for deployment considerations.
-#HORIZON_IMAGES_UPLOAD_MODE = 'legacy'
-
-# Allow a location to be set when creating or updating Glance images.
-# If using Glance V2, this value should be False unless the Glance
-# configuration and policies allow setting locations.
-#IMAGES_ALLOW_LOCATION = False
-
-# A dictionary of default settings for create image modal.
-#CREATE_IMAGE_DEFAULTS = {
-#    'image_visibility': "public",
-#}
-
-# OPENSTACK_ENDPOINT_TYPE specifies the endpoint type to use for the endpoints
-# in the Keystone service catalog. Use this setting when Horizon is running
-# external to the OpenStack environment. The default is 'publicURL'.
-#OPENSTACK_ENDPOINT_TYPE = "publicURL"
-
-# SECONDARY_ENDPOINT_TYPE specifies the fallback endpoint type to use in the
-# case that OPENSTACK_ENDPOINT_TYPE is not present in the endpoints
-# in the Keystone service catalog. Use this setting when Horizon is running
-# external to the OpenStack environment. The default is None. This
-# value should differ from OPENSTACK_ENDPOINT_TYPE if used.
-#SECONDARY_ENDPOINT_TYPE = None
-
-# The number of objects (Swift containers/objects or images) to display
-# on a single page before providing a paging element (a "more" link)
-# to paginate results.
-API_RESULT_LIMIT = 1000
-API_RESULT_PAGE_SIZE = 20
-
-# The size of chunk in bytes for downloading objects from Swift
-SWIFT_FILE_TRANSFER_CHUNK_SIZE = 512 * 1024
-
-# The default number of lines displayed for instance console log.
-INSTANCE_LOG_LENGTH = 35
-
-# Specify a maximum number of items to display in a dropdown.
-DROPDOWN_MAX_ITEMS = 30
-
-# The timezone of the server. This should correspond with the timezone
-# of your entire OpenStack installation, and hopefully be in UTC.
-TIME_ZONE = "UTC"
-
-# When launching an instance, the menu of available flavors is
-# sorted by RAM usage, ascending. If you would like a different sort order,
-# you can provide another flavor attribute as sorting key. Alternatively, you
-# can provide a custom callback method to use for sorting. You can also provide
-# a flag for reverse sort. For more info, see
-# http://docs.python.org/2/library/functions.html#sorted
-#CREATE_INSTANCE_FLAVOR_SORT = {
-#    'key': 'name',
-#     # or
-#    'key': my_awesome_callback_method,
-#    'reverse': False,
-#}
-
-# Set this to True to display an 'Admin Password' field on the Change Password
-# form to verify that it is indeed the admin logged-in who wants to change
-# the password.
-#ENFORCE_PASSWORD_CHECK = False
-
-# Modules that provide /auth routes that can be used to handle different types
-# of user authentication. Add auth plugins that require extra route handling to
-# this list.
-#AUTHENTICATION_URLS = [
-#    'openstack_auth.urls',
-#]
-
-# The Horizon Policy Enforcement engine uses these values to load per service
-# policy rule files. The content of these files should match the files the
-# OpenStack services are using to determine role based access control in the
-# target installation.
-
-# Path to directory containing policy.json files
-#POLICY_FILES_PATH = os.path.join(ROOT_PATH, "conf")
-
-# Map of local copy of service policy files.
-# Please insure that your identity policy file matches the one being used on
-# your keystone servers. There is an alternate policy file that may be used
-# in the Keystone v3 multi-domain case, policy.v3cloudsample.json.
-# This file is not included in the Horizon repository by default but can be
-# found at
-# http://git.openstack.org/cgit/openstack/keystone/tree/etc/ \
-# policy.v3cloudsample.json
-# Having matching policy files on the Horizon and Keystone servers is essential
-# for normal operation. This holds true for all services and their policy files.
-#POLICY_FILES = {
-#    'identity': 'keystone_policy.json',
-#    'compute': 'nova_policy.json',
-#    'volume': 'cinder_policy.json',
-#    'image': 'glance_policy.json',
-#    'network': 'neutron_policy.json',
-#}
-
-# TODO: (david-lyle) remove when plugins support adding settings.
-# Note: Only used when trove-dashboard plugin is configured to be used by
-# Horizon.
-# Trove user and database extension support. By default support for
-# creating users and databases on database instances is turned on.
-# To disable these extensions set the permission here to something
-# unusable such as ["!"].
-#TROVE_ADD_USER_PERMS = []
-#TROVE_ADD_DATABASE_PERMS = []
-
-# Change this patch to the appropriate list of tuples containing
-# a key, label and static directory containing two files:
-# _variables.scss and _styles.scss
-#AVAILABLE_THEMES = [
-#    ('default', 'Default', 'themes/default'),
-#    ('material', 'Material', 'themes/material'),
-#]
-
-LOGGING = {
-    'version': 1,
-    # When set to True this will disable all logging except
-    # for loggers specified in this configuration dictionary. Note that
-    # if nothing is specified here and disable_existing_loggers is True,
-    # django.db.backends will still log unless it is disabled explicitly.
-    'disable_existing_loggers': False,
-    # If apache2 mod_wsgi is used to deploy OpenStack dashboard
-    # timestamp is output by mod_wsgi. If WSGI framework you use does not
-    # output timestamp for logging, add %(asctime)s in the following
-    # format definitions.
-    'formatters': {
-        'console': {
-            'format': '%(levelname)s %(name)s %(message)s'
-        },
-        'operation': {
-            # The format of "%(message)s" is defined by
-            # OPERATION_LOG_OPTIONS['format']
-            'format': '%(message)s'
-        },
-    },
-    'handlers': {
-        'null': {
-            'level': 'DEBUG',
-            'class': 'logging.NullHandler',
-        },
-        'console': {
-            # Set the level to "DEBUG" for verbose output logging.
-            'level': 'INFO',
-            'class': 'logging.StreamHandler',
-            'formatter': 'console',
-        },
-        'operation': {
-            'level': 'INFO',
-            'class': 'logging.StreamHandler',
-            'formatter': 'operation',
-        },
-    },
-    'loggers': {
-        'horizon': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'horizon.operation_log': {
-            'handlers': ['operation'],
-            'level': 'INFO',
-            'propagate': False,
-        },
-        'openstack_dashboard': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'novaclient': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'cinderclient': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'keystoneauth': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'keystoneclient': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'glanceclient': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'neutronclient': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'swiftclient': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'oslo_policy': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'openstack_auth': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'nose.plugins.manager': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'django': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        # Logging from django.db.backends is VERY verbose, send to null
-        # by default.
-        'django.db.backends': {
-            'handlers': ['null'],
-            'propagate': False,
-        },
-        'requests': {
-            'handlers': ['null'],
-            'propagate': False,
-        },
-        'urllib3': {
-            'handlers': ['null'],
-            'propagate': False,
-        },
-        'chardet.charsetprober': {
-            'handlers': ['null'],
-            'propagate': False,
-        },
-        'iso8601': {
-            'handlers': ['null'],
-            'propagate': False,
-        },
-        'scss': {
-            'handlers': ['null'],
-            'propagate': False,
-        },
-    },
+    "image_type": _("Image Type")
+}
+
+HORIZON_IMAGES_UPLOAD_MODE = "legacy"
+IMAGES_ALLOW_LOCATION = True
+
+
+# Disable simplified floating IP address management for deployments with
+# multiple floating IP pools or complex network requirements.
+# HORIZON_CONFIG["simple_ip_management"] = False
+
+# The OPENSTACK_NEUTRON_NETWORK settings can be used to enable optional
+# services provided by neutron. Options currently available are load
+# balancer service, security groups, quotas, VPN service.
+
+OPENSTACK_NEUTRON_NETWORK = {
+    'enable_lb': True,
+    'enable_firewall': False,
+    'enable_quotas': True,
+    'enable_security_group': True,
+    'enable_vpn': False,
+    # The profile_support option is used to detect if an external router can be
+    # configured via the dashboard. When using specific plugins the
+    # profile_support can be turned on if needed.
+    'profile_support': None,
+    'enable_fip_topology_check': True,
+
+    #'profile_support': 'cisco',
 }
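+# Additional optional keys for this dict (e.g. 'default_dns_nameservers',
+# 'supported_provider_types') appear in the commented samples removed above.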
 
 # 'direction' should not be specified for all_tcp/udp/icmp.
 # It is specified in the form.
 SECURITY_GROUP_RULES = {
     'all_tcp': {
-        'name': _('All TCP'),
+        'name': 'ALL TCP',
         'ip_protocol': 'tcp',
         'from_port': '1',
         'to_port': '65535',
     },
     'all_udp': {
-        'name': _('All UDP'),
+        'name': 'ALL UDP',
         'ip_protocol': 'udp',
         'from_port': '1',
         'to_port': '65535',
     },
     'all_icmp': {
-        'name': _('All ICMP'),
+        'name': 'ALL ICMP',
         'ip_protocol': 'icmp',
         'from_port': '-1',
         'to_port': '-1',
@@ -778,144 +549,12 @@
     },
 }
 
-# Deprecation Notice:
-#
-# The setting FLAVOR_EXTRA_KEYS has been deprecated.
-# Please load extra spec metadata into the Glance Metadata Definition Catalog.
-#
-# The sample quota definitions can be found in:
-# <glance_source>/etc/metadefs/compute-quota.json
-#
-# The metadata definition catalog supports CLI and API:
-#  $glance --os-image-api-version 2 help md-namespace-import
-#  $glance-manage db_load_metadefs <directory_with_definition_files>
-#
-# See Metadata Definitions on: http://docs.openstack.org/developer/glance/
-
-# TODO: (david-lyle) remove when plugins support settings natively
-# Note: This is only used when the Sahara plugin is configured and enabled
-# for use in Horizon.
-# Indicate to the Sahara data processing service whether or not
-# automatic floating IP allocation is in effect.  If it is not
-# in effect, the user will be prompted to choose a floating IP
-# pool for use in their cluster.  False by default.  You would want
-# to set this to True if you were running Nova Networking with
-# auto_assign_floating_ip = True.
-#SAHARA_AUTO_IP_ALLOCATION_ENABLED = False
-
-# The hash algorithm to use for authentication tokens. This must
-# match the hash algorithm that the identity server and the
-# auth_token middleware are using. Allowed values are the
-# algorithms supported by Python's hashlib library.
-#OPENSTACK_TOKEN_HASH_ALGORITHM = 'md5'
-
-# AngularJS requires some settings to be made available to
-# the client side. Some settings are required by in-tree / built-in horizon
-# features. These settings must be added to REST_API_REQUIRED_SETTINGS in the
-# form of ['SETTING_1','SETTING_2'], etc.
-#
-# You may remove settings from this list for security purposes, but do so at
-# the risk of breaking a built-in horizon feature. These settings are required
-# for horizon to function properly. Only remove them if you know what you
-# are doing. These settings may in the future be moved to be defined within
-# the enabled panel configuration.
-# You should not add settings to this list for out of tree extensions.
-# See: https://wiki.openstack.org/wiki/Horizon/RESTAPI
-REST_API_REQUIRED_SETTINGS = ['OPENSTACK_HYPERVISOR_FEATURES',
-                              'LAUNCH_INSTANCE_DEFAULTS',
-                              'OPENSTACK_IMAGE_FORMATS',
-                              'OPENSTACK_KEYSTONE_DEFAULT_DOMAIN',
-                              'CREATE_IMAGE_DEFAULTS',
-                              'ENFORCE_PASSWORD_CHECK']
-
-# Additional settings can be made available to the client side for
-# extensibility by specifying them in REST_API_ADDITIONAL_SETTINGS
-# !! Please use extreme caution as the settings are transferred via HTTP/S
-# and are not encrypted on the browser. This is an experimental API and
-# may be deprecated in the future without notice.
-#REST_API_ADDITIONAL_SETTINGS = []
-
-# DISALLOW_IFRAME_EMBED can be used to prevent Horizon from being embedded
-# within an iframe. Legacy browsers are still vulnerable to a Cross-Frame
-# Scripting (XFS) vulnerability, so this option allows extra security hardening
-# where iframes are not used in deployment. Default setting is True.
-# For more information see:
-# http://tinyurl.com/anticlickjack
-#DISALLOW_IFRAME_EMBED = True
-
-# Help URL can be made available for the client. To provide a help URL, edit the
-# following attribute to the URL of your choice.
-#HORIZON_CONFIG["help_url"] = "http://openstack.mycompany.org"
-
-# Settings for OperationLogMiddleware
-# OPERATION_LOG_ENABLED is flag to use the function to log an operation on
-# Horizon.
-# mask_targets is arrangement for appointing a target to mask.
-# method_targets is arrangement of HTTP method to output log.
-# format is the log contents.
-#OPERATION_LOG_ENABLED = False
-#OPERATION_LOG_OPTIONS = {
-#    'mask_fields': ['password'],
-#    'target_methods': ['POST'],
-#    'ignored_urls': ['/js/', '/static/', '^/api/'],
-#    'format': ("[%(client_ip)s] [%(domain_name)s]"
-#        " [%(domain_id)s] [%(project_name)s]"
-#        " [%(project_id)s] [%(user_name)s] [%(user_id)s] [%(request_scheme)s]"
-#        " [%(referer_url)s] [%(request_url)s] [%(message)s] [%(method)s]"
-#        " [%(http_status)s] [%(param)s]"),
-#}
-
-# The default date range in the Overview panel meters - either <today> minus N
-# days (if the value is integer N), or from the beginning of the current month
-# until today (if set to None). This setting should be used to limit the amount
-# of data fetched by default when rendering the Overview panel.
-#OVERVIEW_DAYS_RANGE = 1
-
-# To allow operators to require users provide a search criteria first
-# before loading any data into the views, set the following dict
-# attributes to True in each one of the panels you want to enable this feature.
-# Follow the convention <dashboard>.<view>
-#FILTER_DATA_FIRST = {
-#    'admin.instances': False,
-#    'admin.images': False,
-#    'admin.networks': False,
-#    'admin.routers': False,
-#    'admin.volumes': False,
-#    'identity.users': False,
-#    'identity.projects': False,
-#    'identity.groups': False,
-#    'identity.roles': False
-#}
-
-# Dict used to restrict user private subnet cidr range.
-# An empty list means that user input will not be restricted
-# for a corresponding IP version. By default, there is
-# no restriction for IPv4 or IPv6. To restrict
-# user private subnet cidr range set ALLOWED_PRIVATE_SUBNET_CIDR
-# to something like
-#ALLOWED_PRIVATE_SUBNET_CIDR = {
-#    'ipv4': ['10.0.0.0/8', '192.168.0.0/16'],
-#    'ipv6': ['fc00::/7']
-#}
-ALLOWED_PRIVATE_SUBNET_CIDR = {'ipv4': [], 'ipv6': []}
-
-# Projects and users can have extra attributes as defined by keystone v3.
-# Horizon has the ability to display these extra attributes via this setting.
-# If you'd like to display extra data in the project or user tables, set the
-# corresponding dict key to the attribute name, followed by the display name.
-# For more information, see horizon's customization (http://docs.openstack.org/developer/horizon/topics/customizing.html#horizon-customization-module-overrides)
-#PROJECT_TABLE_EXTRA_INFO = {
-#   'phone_num': _('Phone Number'),
-#}
-#USER_TABLE_EXTRA_INFO = {
-#   'phone_num': _('Phone Number'),
-#}
-
-# Password will have an expiration date when using keystone v3 and enabling the
-# feature.
-# This setting allows you to set the number of days that the user will be alerted
-# prior to the password expiration.
-# Once the password expires keystone will deny the access and users must
-# contact an admin to change their password.
-#PASSWORD_EXPIRES_WARNING_THRESHOLD_DAYS = 0
-COMPRESS_OFFLINE=True
+
+USE_SSL = True
+CSRF_COOKIE_SECURE = True
+SESSION_COOKIE_HTTPONLY = True

2018-09-01 23:29:27,888 [salt.state       :905 ][INFO    ][7274] Loading fresh modules for state activity
2018-09-01 23:29:27,910 [salt.state       :1941][INFO    ][7274] Completed state [/etc/openstack-dashboard/local_settings.py] at time 23:29:27.909995 duration_in_ms=307.283
2018-09-01 23:29:27,912 [salt.state       :1770][INFO    ][7274] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/nova_policy.json] at time 23:29:27.912547
2018-09-01 23:29:27,912 [salt.state       :1803][INFO    ][7274] Executing state file.managed for [/usr/share/openstack-dashboard/openstack_dashboard/conf/nova_policy.json]
2018-09-01 23:29:27,932 [salt.fileclient  :1215][INFO    ][7274] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/queens/nova_policy.json'
2018-09-01 23:29:27,934 [salt.state       :290 ][INFO    ][7274] File changed:
--- 
+++ 
@@ -1,174 +1,500 @@
 {
-    "context_is_admin": "role:admin",
-    "admin_or_owner": "is_admin:True or project_id:%(project_id)s",
+    "context_is_admin":  "role:admin",
+    "admin_or_owner":  "is_admin:True or project_id:%(project_id)s",
+    "default": "rule:admin_or_owner",
+
+    "cells_scheduler_filter:TargetCellFilter": "is_admin:True",
+
+    "compute:create": "rule:admin_or_owner",
+    "compute:create:attach_network": "rule:admin_or_owner",
+    "compute:create:attach_volume": "rule:admin_or_owner",
+    "compute:create:forced_host": "is_admin:True",
+
+    "compute:get": "rule:admin_or_owner",
+    "compute:get_all": "rule:admin_or_owner",
+    "compute:get_all_tenants": "is_admin:True",
+
+    "compute:update": "rule:admin_or_owner",
+
+    "compute:get_instance_metadata": "rule:admin_or_owner",
+    "compute:get_all_instance_metadata": "rule:admin_or_owner",
+    "compute:get_all_instance_system_metadata": "rule:admin_or_owner",
+    "compute:update_instance_metadata": "rule:admin_or_owner",
+    "compute:delete_instance_metadata": "rule:admin_or_owner",
+
+    "compute:get_diagnostics": "rule:admin_or_owner",
+    "compute:get_instance_diagnostics": "rule:admin_or_owner",
+
+    "compute:start": "rule:admin_or_owner",
+    "compute:stop": "rule:admin_or_owner",
+
+    "compute:lock": "rule:admin_or_owner",
+    "compute:unlock": "rule:admin_or_owner",
+    "compute:unlock_override": "rule:admin_api",
+
+    "compute:get_vnc_console": "rule:admin_or_owner",
+    "compute:get_spice_console": "rule:admin_or_owner",
+    "compute:get_rdp_console": "rule:admin_or_owner",
+    "compute:get_serial_console": "rule:admin_or_owner",
+    "compute:get_mks_console": "rule:admin_or_owner",
+    "compute:get_console_output": "rule:admin_or_owner",
+
+    "compute:reset_network": "rule:admin_or_owner",
+    "compute:inject_network_info": "rule:admin_or_owner",
+    "compute:add_fixed_ip": "rule:admin_or_owner",
+    "compute:remove_fixed_ip": "rule:admin_or_owner",
+
+    "compute:attach_volume": "rule:admin_or_owner",
+    "compute:detach_volume": "rule:admin_or_owner",
+    "compute:swap_volume": "rule:admin_api",
+
+    "compute:attach_interface": "rule:admin_or_owner",
+    "compute:detach_interface": "rule:admin_or_owner",
+
+    "compute:set_admin_password": "rule:admin_or_owner",
+
+    "compute:rescue": "rule:admin_or_owner",
+    "compute:unrescue": "rule:admin_or_owner",
+
+    "compute:suspend": "rule:admin_or_owner",
+    "compute:resume": "rule:admin_or_owner",
+
+    "compute:pause": "rule:admin_or_owner",
+    "compute:unpause": "rule:admin_or_owner",
+
+    "compute:shelve": "rule:admin_or_owner",
+    "compute:shelve_offload": "rule:admin_or_owner",
+    "compute:unshelve": "rule:admin_or_owner",
+
+    "compute:snapshot": "rule:admin_or_owner",
+    "compute:snapshot_volume_backed": "rule:admin_or_owner",
+    "compute:backup": "rule:admin_or_owner",
+
+    "compute:resize": "rule:admin_or_owner",
+    "compute:confirm_resize": "rule:admin_or_owner",
+    "compute:revert_resize": "rule:admin_or_owner",
+
+    "compute:rebuild": "rule:admin_or_owner",
+    "compute:reboot": "rule:admin_or_owner",
+    "compute:delete": "rule:admin_or_owner",
+    "compute:soft_delete": "rule:admin_or_owner",
+    "compute:force_delete": "rule:admin_or_owner",
+
+    "compute:security_groups:add_to_instance": "rule:admin_or_owner",
+    "compute:security_groups:remove_from_instance": "rule:admin_or_owner",
+
+    "compute:restore": "rule:admin_or_owner",
+
+    "compute:volume_snapshot_create": "rule:admin_or_owner",
+    "compute:volume_snapshot_delete": "rule:admin_or_owner",
+
     "admin_api": "is_admin:True",
-    "os_compute_api:os-admin-actions:reset_state": "rule:admin_api",
-    "os_compute_api:os-admin-actions:inject_network_info": "rule:admin_api",
-    "os_compute_api:os-admin-actions:reset_network": "rule:admin_api",
-    "os_compute_api:os-admin-password": "rule:admin_or_owner",
-    "os_compute_api:os-agents": "rule:admin_api",
-    "os_compute_api:os-aggregates:set_metadata": "rule:admin_api",
-    "os_compute_api:os-aggregates:add_host": "rule:admin_api",
-    "os_compute_api:os-aggregates:create": "rule:admin_api",
-    "os_compute_api:os-aggregates:remove_host": "rule:admin_api",
-    "os_compute_api:os-aggregates:update": "rule:admin_api",
-    "os_compute_api:os-aggregates:index": "rule:admin_api",
-    "os_compute_api:os-aggregates:delete": "rule:admin_api",
-    "os_compute_api:os-aggregates:show": "rule:admin_api",
-    "os_compute_api:os-assisted-volume-snapshots:create": "rule:admin_api",
-    "os_compute_api:os-assisted-volume-snapshots:delete": "rule:admin_api",
-    "os_compute_api:os-attach-interfaces": "rule:admin_or_owner",
-    "os_compute_api:os-attach-interfaces:create": "rule:admin_or_owner",
-    "os_compute_api:os-attach-interfaces:delete": "rule:admin_or_owner",
-    "os_compute_api:os-availability-zone:list": "rule:admin_or_owner",
-    "os_compute_api:os-availability-zone:detail": "rule:admin_api",
-    "os_compute_api:os-baremetal-nodes": "rule:admin_api",
-    "os_compute_api:os-cells:update": "rule:admin_api",
-    "os_compute_api:os-cells:create": "rule:admin_api",
-    "os_compute_api:os-cells": "rule:admin_api",
-    "os_compute_api:os-cells:sync_instances": "rule:admin_api",
-    "os_compute_api:os-cells:delete": "rule:admin_api",
-    "cells_scheduler_filter:DifferentCellFilter": "is_admin:True",
-    "cells_scheduler_filter:TargetCellFilter": "is_admin:True",
-    "os_compute_api:os-config-drive": "rule:admin_or_owner",
-    "os_compute_api:os-console-auth-tokens": "rule:admin_api",
-    "os_compute_api:os-console-output": "rule:admin_or_owner",
-    "os_compute_api:os-consoles:create": "rule:admin_or_owner",
-    "os_compute_api:os-consoles:show": "rule:admin_or_owner",
-    "os_compute_api:os-consoles:delete": "rule:admin_or_owner",
-    "os_compute_api:os-consoles:index": "rule:admin_or_owner",
-    "os_compute_api:os-create-backup": "rule:admin_or_owner",
-    "os_compute_api:os-deferred-delete": "rule:admin_or_owner",
-    "os_compute_api:os-evacuate": "rule:admin_api",
-    "os_compute_api:os-extended-availability-zone": "rule:admin_or_owner",
-    "os_compute_api:os-extended-server-attributes": "rule:admin_api",
-    "os_compute_api:os-extended-status": "rule:admin_or_owner",
-    "os_compute_api:os-extended-volumes": "rule:admin_or_owner",
-    "os_compute_api:extensions": "rule:admin_or_owner",
-    "os_compute_api:os-fixed-ips": "rule:admin_api",
-    "os_compute_api:os-flavor-access:add_tenant_access": "rule:admin_api",
-    "os_compute_api:os-flavor-access:remove_tenant_access": "rule:admin_api",
-    "os_compute_api:os-flavor-access": "rule:admin_or_owner",
-    "os_compute_api:os-flavor-extra-specs:show": "rule:admin_or_owner",
-    "os_compute_api:os-flavor-extra-specs:create": "rule:admin_api",
-    "os_compute_api:os-flavor-extra-specs:update": "rule:admin_api",
-    "os_compute_api:os-flavor-extra-specs:delete": "rule:admin_api",
-    "os_compute_api:os-flavor-extra-specs:index": "rule:admin_or_owner",
-    "os_compute_api:os-flavor-manage": "rule:admin_api",
-    "os_compute_api:os-flavor-manage:create": "rule:os_compute_api:os-flavor-manage",
-    "os_compute_api:os-flavor-manage:update": "rule:admin_api",
-    "os_compute_api:os-flavor-manage:delete": "rule:os_compute_api:os-flavor-manage",
-    "os_compute_api:os-flavor-rxtx": "rule:admin_or_owner",
-    "os_compute_api:flavors": "rule:admin_or_owner",
-    "os_compute_api:os-floating-ip-dns": "rule:admin_or_owner",
-    "os_compute_api:os-floating-ip-dns:domain:update": "rule:admin_api",
-    "os_compute_api:os-floating-ip-dns:domain:delete": "rule:admin_api",
-    "os_compute_api:os-floating-ip-pools": "rule:admin_or_owner",
-    "os_compute_api:os-floating-ips": "rule:admin_or_owner",
-    "os_compute_api:os-floating-ips-bulk": "rule:admin_api",
-    "os_compute_api:os-fping:all_tenants": "rule:admin_api",
-    "os_compute_api:os-fping": "rule:admin_or_owner",
-    "os_compute_api:os-hide-server-addresses": "is_admin:False",
-    "os_compute_api:os-hosts": "rule:admin_api",
-    "os_compute_api:os-hypervisors": "rule:admin_api",
-    "os_compute_api:image-size": "rule:admin_or_owner",
-    "os_compute_api:os-instance-actions:events": "rule:admin_api",
-    "os_compute_api:os-instance-actions": "rule:admin_or_owner",
-    "os_compute_api:os-instance-usage-audit-log": "rule:admin_api",
-    "os_compute_api:ips:show": "rule:admin_or_owner",
-    "os_compute_api:ips:index": "rule:admin_or_owner",
-    "os_compute_api:os-keypairs:index": "rule:admin_api or user_id:%(user_id)s",
-    "os_compute_api:os-keypairs:create": "rule:admin_api or user_id:%(user_id)s",
-    "os_compute_api:os-keypairs:delete": "rule:admin_api or user_id:%(user_id)s",
-    "os_compute_api:os-keypairs:show": "rule:admin_api or user_id:%(user_id)s",
-    "os_compute_api:os-keypairs": "rule:admin_or_owner",
-    "os_compute_api:limits": "rule:admin_or_owner",
-    "os_compute_api:os-lock-server:lock": "rule:admin_or_owner",
-    "os_compute_api:os-lock-server:unlock": "rule:admin_or_owner",
-    "os_compute_api:os-lock-server:unlock:unlock_override": "rule:admin_api",
-    "os_compute_api:os-migrate-server:migrate": "rule:admin_api",
-    "os_compute_api:os-migrate-server:migrate_live": "rule:admin_api",
-    "os_compute_api:os-migrations:index": "rule:admin_api",
-    "os_compute_api:os-multinic": "rule:admin_or_owner",
-    "os_compute_api:os-networks": "rule:admin_api",
-    "os_compute_api:os-networks:view": "rule:admin_or_owner",
-    "os_compute_api:os-networks-associate": "rule:admin_api",
-    "os_compute_api:os-pause-server:pause": "rule:admin_or_owner",
-    "os_compute_api:os-pause-server:unpause": "rule:admin_or_owner",
-    "os_compute_api:os-quota-class-sets:show": "is_admin:True or quota_class:%(quota_class)s",
-    "os_compute_api:os-quota-class-sets:update": "rule:admin_api",
-    "os_compute_api:os-quota-sets:update": "rule:admin_api",
-    "os_compute_api:os-quota-sets:defaults": "@",
-    "os_compute_api:os-quota-sets:show": "rule:admin_or_owner",
-    "os_compute_api:os-quota-sets:delete": "rule:admin_api",
-    "os_compute_api:os-quota-sets:detail": "rule:admin_or_owner",
-    "os_compute_api:os-remote-consoles": "rule:admin_or_owner",
-    "os_compute_api:os-rescue": "rule:admin_or_owner",
-    "os_compute_api:os-security-group-default-rules": "rule:admin_api",
-    "os_compute_api:os-security-groups": "rule:admin_or_owner",
-    "os_compute_api:os-server-diagnostics": "rule:admin_api",
-    "os_compute_api:os-server-external-events:create": "rule:admin_api",
-    "os_compute_api:os-server-groups": "rule:admin_or_owner",
-    "os_compute_api:os-server-groups:create": "rule:os_compute_api:os-server-groups",
-    "os_compute_api:os-server-groups:delete": "rule:os_compute_api:os-server-groups",
-    "os_compute_api:os-server-groups:index": "rule:os_compute_api:os-server-groups",
-    "os_compute_api:os-server-groups:show": "rule:os_compute_api:os-server-groups",
-    "os_compute_api:server-metadata:index": "rule:admin_or_owner",
-    "os_compute_api:server-metadata:show": "rule:admin_or_owner",
-    "os_compute_api:server-metadata:create": "rule:admin_or_owner",
-    "os_compute_api:server-metadata:update_all": "rule:admin_or_owner",
-    "os_compute_api:server-metadata:update": "rule:admin_or_owner",
-    "os_compute_api:server-metadata:delete": "rule:admin_or_owner",
-    "os_compute_api:os-server-password": "rule:admin_or_owner",
-    "os_compute_api:os-server-tags:delete_all": "rule:admin_or_owner",
-    "os_compute_api:os-server-tags:index": "rule:admin_or_owner",
-    "os_compute_api:os-server-tags:update_all": "rule:admin_or_owner",
-    "os_compute_api:os-server-tags:delete": "rule:admin_or_owner",
-    "os_compute_api:os-server-tags:update": "rule:admin_or_owner",
-    "os_compute_api:os-server-tags:show": "rule:admin_or_owner",
-    "os_compute_api:os-server-usage": "rule:admin_or_owner",
+    "compute_extension:accounts": "rule:admin_api",
+    "compute_extension:admin_actions": "rule:admin_api",
+    "compute_extension:admin_actions:pause": "rule:admin_or_owner",
+    "compute_extension:admin_actions:unpause": "rule:admin_or_owner",
+    "compute_extension:admin_actions:suspend": "rule:admin_or_owner",
+    "compute_extension:admin_actions:resume": "rule:admin_or_owner",
+    "compute_extension:admin_actions:lock": "rule:admin_or_owner",
+    "compute_extension:admin_actions:unlock": "rule:admin_or_owner",
+    "compute_extension:admin_actions:resetNetwork": "rule:admin_api",
+    "compute_extension:admin_actions:injectNetworkInfo": "rule:admin_api",
+    "compute_extension:admin_actions:createBackup": "rule:admin_or_owner",
+    "compute_extension:admin_actions:migrateLive": "rule:admin_api",
+    "compute_extension:admin_actions:resetState": "rule:admin_api",
+    "compute_extension:admin_actions:migrate": "rule:admin_api",
+    "compute_extension:aggregates": "rule:admin_api",
+    "compute_extension:agents": "rule:admin_api",
+    "compute_extension:attach_interfaces": "rule:admin_or_owner",
+    "compute_extension:baremetal_nodes": "rule:admin_api",
+    "compute_extension:cells": "rule:admin_api",
+    "compute_extension:cells:create": "rule:admin_api",
+    "compute_extension:cells:delete": "rule:admin_api",
+    "compute_extension:cells:update": "rule:admin_api",
+    "compute_extension:cells:sync_instances": "rule:admin_api",
+    "compute_extension:certificates": "rule:admin_or_owner",
+    "compute_extension:cloudpipe": "rule:admin_api",
+    "compute_extension:cloudpipe_update": "rule:admin_api",
+    "compute_extension:config_drive": "rule:admin_or_owner",
+    "compute_extension:console_output": "rule:admin_or_owner",
+    "compute_extension:consoles": "rule:admin_or_owner",
+    "compute_extension:createserverext": "rule:admin_or_owner",
+    "compute_extension:deferred_delete": "rule:admin_or_owner",
+    "compute_extension:disk_config": "rule:admin_or_owner",
+    "compute_extension:evacuate": "rule:admin_api",
+    "compute_extension:extended_server_attributes": "rule:admin_api",
+    "compute_extension:extended_status": "rule:admin_or_owner",
+    "compute_extension:extended_availability_zone": "rule:admin_or_owner",
+    "compute_extension:extended_ips": "rule:admin_or_owner",
+    "compute_extension:extended_ips_mac": "rule:admin_or_owner",
+    "compute_extension:extended_vif_net": "rule:admin_or_owner",
+    "compute_extension:extended_volumes": "rule:admin_or_owner",
+    "compute_extension:fixed_ips": "rule:admin_api",
+    "compute_extension:flavor_access": "rule:admin_or_owner",
+    "compute_extension:flavor_access:addTenantAccess": "rule:admin_api",
+    "compute_extension:flavor_access:removeTenantAccess": "rule:admin_api",
+    "compute_extension:flavor_disabled": "rule:admin_or_owner",
+    "compute_extension:flavor_rxtx": "rule:admin_or_owner",
+    "compute_extension:flavor_swap": "rule:admin_or_owner",
+    "compute_extension:flavorextradata": "rule:admin_or_owner",
+    "compute_extension:flavorextraspecs:index": "rule:admin_or_owner",
+    "compute_extension:flavorextraspecs:show": "rule:admin_or_owner",
+    "compute_extension:flavorextraspecs:create": "rule:admin_api",
+    "compute_extension:flavorextraspecs:update": "rule:admin_api",
+    "compute_extension:flavorextraspecs:delete": "rule:admin_api",
+    "compute_extension:flavormanage": "rule:admin_api",
+    "compute_extension:floating_ip_dns": "rule:admin_or_owner",
+    "compute_extension:floating_ip_pools": "rule:admin_or_owner",
+    "compute_extension:floating_ips": "rule:admin_or_owner",
+    "compute_extension:floating_ips_bulk": "rule:admin_api",
+    "compute_extension:fping": "rule:admin_or_owner",
+    "compute_extension:fping:all_tenants": "rule:admin_api",
+    "compute_extension:hide_server_addresses": "is_admin:False",
+    "compute_extension:hosts": "rule:admin_api",
+    "compute_extension:hypervisors": "rule:admin_api",
+    "compute_extension:image_size": "rule:admin_or_owner",
+    "compute_extension:instance_actions": "rule:admin_or_owner",
+    "compute_extension:instance_actions:events": "rule:admin_api",
+    "compute_extension:instance_usage_audit_log": "rule:admin_api",
+    "compute_extension:keypairs": "rule:admin_or_owner",
+    "compute_extension:keypairs:index": "rule:admin_or_owner",
+    "compute_extension:keypairs:show": "rule:admin_or_owner",
+    "compute_extension:keypairs:create": "rule:admin_or_owner",
+    "compute_extension:keypairs:delete": "rule:admin_or_owner",
+    "compute_extension:multinic": "rule:admin_or_owner",
+    "compute_extension:networks": "rule:admin_api",
+    "compute_extension:networks:view": "rule:admin_or_owner",
+    "compute_extension:networks_associate": "rule:admin_api",
+    "compute_extension:os-tenant-networks": "rule:admin_or_owner",
+    "compute_extension:quotas:show": "rule:admin_or_owner",
+    "compute_extension:quotas:update": "rule:admin_api",
+    "compute_extension:quotas:delete": "rule:admin_api",
+    "compute_extension:quota_classes": "rule:admin_or_owner",
+    "compute_extension:rescue": "rule:admin_or_owner",
+    "compute_extension:security_group_default_rules": "rule:admin_api",
+    "compute_extension:security_groups": "rule:admin_or_owner",
+    "compute_extension:server_diagnostics": "rule:admin_api",
+    "compute_extension:server_groups": "rule:admin_or_owner",
+    "compute_extension:server_password": "rule:admin_or_owner",
+    "compute_extension:server_usage": "rule:admin_or_owner",
+    "compute_extension:services": "rule:admin_api",
+    "compute_extension:shelve": "rule:admin_or_owner",
+    "compute_extension:shelveOffload": "rule:admin_api",
+    "compute_extension:simple_tenant_usage:show": "rule:admin_or_owner",
+    "compute_extension:simple_tenant_usage:list": "rule:admin_api",
+    "compute_extension:unshelve": "rule:admin_or_owner",
+    "compute_extension:users": "rule:admin_api",
+    "compute_extension:virtual_interfaces": "rule:admin_or_owner",
+    "compute_extension:virtual_storage_arrays": "rule:admin_or_owner",
+    "compute_extension:volumes": "rule:admin_or_owner",
+    "compute_extension:volume_attachments:index": "rule:admin_or_owner",
+    "compute_extension:volume_attachments:show": "rule:admin_or_owner",
+    "compute_extension:volume_attachments:create": "rule:admin_or_owner",
+    "compute_extension:volume_attachments:update": "rule:admin_api",
+    "compute_extension:volume_attachments:delete": "rule:admin_or_owner",
+    "compute_extension:volumetypes": "rule:admin_or_owner",
+    "compute_extension:availability_zone:list": "rule:admin_or_owner",
+    "compute_extension:availability_zone:detail": "rule:admin_api",
+    "compute_extension:used_limits_for_admin": "rule:admin_api",
+    "compute_extension:migrations:index": "rule:admin_api",
+    "compute_extension:os-assisted-volume-snapshots:create": "rule:admin_api",
+    "compute_extension:os-assisted-volume-snapshots:delete": "rule:admin_api",
+    "compute_extension:console_auth_tokens": "rule:admin_api",
+    "compute_extension:os-server-external-events:create": "rule:admin_api",
+
+    "network:get_all": "rule:admin_or_owner",
+    "network:get": "rule:admin_or_owner",
+    "network:create": "rule:admin_or_owner",
+    "network:delete": "rule:admin_or_owner",
+    "network:associate": "rule:admin_or_owner",
+    "network:disassociate": "rule:admin_or_owner",
+    "network:get_vifs_by_instance": "rule:admin_or_owner",
+    "network:allocate_for_instance": "rule:admin_or_owner",
+    "network:deallocate_for_instance": "rule:admin_or_owner",
+    "network:validate_networks": "rule:admin_or_owner",
+    "network:get_instance_uuids_by_ip_filter": "rule:admin_or_owner",
+    "network:get_instance_id_by_floating_address": "rule:admin_or_owner",
+    "network:setup_networks_on_host": "rule:admin_or_owner",
+    "network:get_backdoor_port": "rule:admin_or_owner",
+
+    "network:get_floating_ip": "rule:admin_or_owner",
+    "network:get_floating_ip_pools": "rule:admin_or_owner",
+    "network:get_floating_ip_by_address": "rule:admin_or_owner",
+    "network:get_floating_ips_by_project": "rule:admin_or_owner",
+    "network:get_floating_ips_by_fixed_address": "rule:admin_or_owner",
+    "network:allocate_floating_ip": "rule:admin_or_owner",
+    "network:associate_floating_ip": "rule:admin_or_owner",
+    "network:disassociate_floating_ip": "rule:admin_or_owner",
+    "network:release_floating_ip": "rule:admin_or_owner",
+    "network:migrate_instance_start": "rule:admin_or_owner",
+    "network:migrate_instance_finish": "rule:admin_or_owner",
+
+    "network:get_fixed_ip": "rule:admin_or_owner",
+    "network:get_fixed_ip_by_address": "rule:admin_or_owner",
+    "network:add_fixed_ip_to_instance": "rule:admin_or_owner",
+    "network:remove_fixed_ip_from_instance": "rule:admin_or_owner",
+    "network:add_network_to_project": "rule:admin_or_owner",
+    "network:get_instance_nw_info": "rule:admin_or_owner",
+
+    "network:get_dns_domains": "rule:admin_or_owner",
+    "network:add_dns_entry": "rule:admin_or_owner",
+    "network:modify_dns_entry": "rule:admin_or_owner",
+    "network:delete_dns_entry": "rule:admin_or_owner",
+    "network:get_dns_entries_by_address": "rule:admin_or_owner",
+    "network:get_dns_entries_by_name": "rule:admin_or_owner",
+    "network:create_private_dns_domain": "rule:admin_or_owner",
+    "network:create_public_dns_domain": "rule:admin_or_owner",
+    "network:delete_dns_domain": "rule:admin_or_owner",
+    "network:attach_external_network": "rule:admin_api",
+    "network:get_vif_by_mac_address": "rule:admin_or_owner",
+
+    "os_compute_api:servers:detail:get_all_tenants": "is_admin:True",
+    "os_compute_api:servers:index:get_all_tenants": "is_admin:True",
+    "os_compute_api:servers:confirm_resize": "rule:admin_or_owner",
+    "os_compute_api:servers:create": "rule:admin_or_owner",
+    "os_compute_api:servers:create:attach_network": "rule:admin_or_owner",
+    "os_compute_api:servers:create:attach_volume": "rule:admin_or_owner",
+    "os_compute_api:servers:create:forced_host": "rule:admin_api",
+    "os_compute_api:servers:delete": "rule:admin_or_owner",
+    "os_compute_api:servers:update": "rule:admin_or_owner",
+    "os_compute_api:servers:detail": "rule:admin_or_owner",
     "os_compute_api:servers:index": "rule:admin_or_owner",
-    "os_compute_api:servers:detail": "rule:admin_or_owner",
-    "os_compute_api:servers:index:get_all_tenants": "rule:admin_api",
-    "os_compute_api:servers:detail:get_all_tenants": "rule:admin_api",
+    "os_compute_api:servers:reboot": "rule:admin_or_owner",
+    "os_compute_api:servers:rebuild": "rule:admin_or_owner",
+    "os_compute_api:servers:resize": "rule:admin_or_owner",
+    "os_compute_api:servers:revert_resize": "rule:admin_or_owner",
     "os_compute_api:servers:show": "rule:admin_or_owner",
     "os_compute_api:servers:show:host_status": "rule:admin_api",
-    "os_compute_api:servers:create": "rule:admin_or_owner",
-    "os_compute_api:servers:create:forced_host": "rule:admin_api",
-    "os_compute_api:servers:create:attach_volume": "rule:admin_or_owner",
-    "os_compute_api:servers:create:attach_network": "rule:admin_or_owner",
-    "network:attach_external_network": "is_admin:True",
-    "os_compute_api:servers:delete": "rule:admin_or_owner",
-    "os_compute_api:servers:update": "rule:admin_or_owner",
-    "os_compute_api:servers:confirm_resize": "rule:admin_or_owner",
-    "os_compute_api:servers:revert_resize": "rule:admin_or_owner",
-    "os_compute_api:servers:reboot": "rule:admin_or_owner",
-    "os_compute_api:servers:resize": "rule:admin_or_owner",
-    "os_compute_api:servers:rebuild": "rule:admin_or_owner",
     "os_compute_api:servers:create_image": "rule:admin_or_owner",
     "os_compute_api:servers:create_image:allow_volume_backed": "rule:admin_or_owner",
     "os_compute_api:servers:start": "rule:admin_or_owner",
     "os_compute_api:servers:stop": "rule:admin_or_owner",
     "os_compute_api:servers:trigger_crash_dump": "rule:admin_or_owner",
-    "os_compute_api:servers:migrations:show": "rule:admin_api",
     "os_compute_api:servers:migrations:force_complete": "rule:admin_api",
     "os_compute_api:servers:migrations:delete": "rule:admin_api",
+    "os_compute_api:servers:discoverable": "@",
     "os_compute_api:servers:migrations:index": "rule:admin_api",
+    "os_compute_api:servers:migrations:show": "rule:admin_api",
+    "os_compute_api:os-access-ips:discoverable": "@",
+    "os_compute_api:os-access-ips": "rule:admin_or_owner",
+    "os_compute_api:os-admin-actions": "rule:admin_api",
+    "os_compute_api:os-admin-actions:discoverable": "@",
+    "os_compute_api:os-admin-actions:reset_network": "rule:admin_api",
+    "os_compute_api:os-admin-actions:inject_network_info": "rule:admin_api",
+    "os_compute_api:os-admin-actions:reset_state": "rule:admin_api",
+    "os_compute_api:os-admin-password": "rule:admin_or_owner",
+    "os_compute_api:os-admin-password:discoverable": "@",
+    "os_compute_api:os-aggregates:discoverable": "@",
+    "os_compute_api:os-aggregates:index": "rule:admin_api",
+    "os_compute_api:os-aggregates:create": "rule:admin_api",
+    "os_compute_api:os-aggregates:show": "rule:admin_api",
+    "os_compute_api:os-aggregates:update": "rule:admin_api",
+    "os_compute_api:os-aggregates:delete": "rule:admin_api",
+    "os_compute_api:os-aggregates:add_host": "rule:admin_api",
+    "os_compute_api:os-aggregates:remove_host": "rule:admin_api",
+    "os_compute_api:os-aggregates:set_metadata": "rule:admin_api",
+    "os_compute_api:os-agents": "rule:admin_api",
+    "os_compute_api:os-agents:discoverable": "@",
+    "os_compute_api:os-attach-interfaces": "rule:admin_or_owner",
+    "os_compute_api:os-attach-interfaces:discoverable": "@",
+    "os_compute_api:os-baremetal-nodes": "rule:admin_api",
+    "os_compute_api:os-baremetal-nodes:discoverable": "@",
+    "os_compute_api:os-block-device-mapping-v1:discoverable": "@",
+    "os_compute_api:os-cells": "rule:admin_api",
+    "os_compute_api:os-cells:create": "rule:admin_api",
+    "os_compute_api:os-cells:delete": "rule:admin_api",
+    "os_compute_api:os-cells:update": "rule:admin_api",
+    "os_compute_api:os-cells:sync_instances": "rule:admin_api",
+    "os_compute_api:os-cells:discoverable": "@",
+    "os_compute_api:os-certificates:create": "rule:admin_or_owner",
+    "os_compute_api:os-certificates:show": "rule:admin_or_owner",
+    "os_compute_api:os-certificates:discoverable": "@",
+    "os_compute_api:os-cloudpipe": "rule:admin_api",
+    "os_compute_api:os-cloudpipe:discoverable": "@",
+    "os_compute_api:os-config-drive": "rule:admin_or_owner",
+    "os_compute_api:os-config-drive:discoverable": "@",
+    "os_compute_api:os-consoles:discoverable": "@",
+    "os_compute_api:os-consoles:create": "rule:admin_or_owner",
+    "os_compute_api:os-consoles:delete": "rule:admin_or_owner",
+    "os_compute_api:os-consoles:index": "rule:admin_or_owner",
+    "os_compute_api:os-consoles:show": "rule:admin_or_owner",
+    "os_compute_api:os-console-output:discoverable": "@",
+    "os_compute_api:os-console-output": "rule:admin_or_owner",
+    "os_compute_api:os-remote-consoles": "rule:admin_or_owner",
+    "os_compute_api:os-remote-consoles:discoverable": "@",
+    "os_compute_api:os-create-backup:discoverable": "@",
+    "os_compute_api:os-create-backup": "rule:admin_or_owner",
+    "os_compute_api:os-deferred-delete": "rule:admin_or_owner",
+    "os_compute_api:os-deferred-delete:discoverable": "@",
+    "os_compute_api:os-disk-config": "rule:admin_or_owner",
+    "os_compute_api:os-disk-config:discoverable": "@",
+    "os_compute_api:os-evacuate": "rule:admin_api",
+    "os_compute_api:os-evacuate:discoverable": "@",
+    "os_compute_api:os-extended-server-attributes": "rule:admin_api",
+    "os_compute_api:os-extended-server-attributes:discoverable": "@",
+    "os_compute_api:os-extended-status": "rule:admin_or_owner",
+    "os_compute_api:os-extended-status:discoverable": "@",
+    "os_compute_api:os-extended-availability-zone": "rule:admin_or_owner",
+    "os_compute_api:os-extended-availability-zone:discoverable": "@",
+    "os_compute_api:extensions": "rule:admin_or_owner",
+    "os_compute_api:extensions:discoverable": "@",
+    "os_compute_api:extension_info:discoverable": "@",
+    "os_compute_api:os-extended-volumes": "rule:admin_or_owner",
+    "os_compute_api:os-extended-volumes:discoverable": "@",
+    "os_compute_api:os-fixed-ips": "rule:admin_api",
+    "os_compute_api:os-fixed-ips:discoverable": "@",
+    "os_compute_api:os-flavor-access": "rule:admin_or_owner",
+    "os_compute_api:os-flavor-access:discoverable": "@",
+    "os_compute_api:os-flavor-access:remove_tenant_access": "rule:admin_api",
+    "os_compute_api:os-flavor-access:add_tenant_access": "rule:admin_api",
+    "os_compute_api:os-flavor-rxtx": "rule:admin_or_owner",
+    "os_compute_api:os-flavor-rxtx:discoverable": "@",
+    "os_compute_api:flavors": "rule:admin_or_owner",
+    "os_compute_api:flavors:discoverable": "@",
+    "os_compute_api:os-flavor-extra-specs:discoverable": "@",
+    "os_compute_api:os-flavor-extra-specs:index": "rule:admin_or_owner",
+    "os_compute_api:os-flavor-extra-specs:show": "rule:admin_or_owner",
+    "os_compute_api:os-flavor-extra-specs:create": "rule:admin_api",
+    "os_compute_api:os-flavor-extra-specs:update": "rule:admin_api",
+    "os_compute_api:os-flavor-extra-specs:delete": "rule:admin_api",
+    "os_compute_api:os-flavor-manage:discoverable": "@",
+    "os_compute_api:os-flavor-manage": "rule:admin_api",
+    "os_compute_api:os-floating-ip-dns": "rule:admin_or_owner",
+    "os_compute_api:os-floating-ip-dns:discoverable": "@",
+    "os_compute_api:os-floating-ip-dns:domain:update": "rule:admin_api",
+    "os_compute_api:os-floating-ip-dns:domain:delete": "rule:admin_api",
+    "os_compute_api:os-floating-ip-pools": "rule:admin_or_owner",
+    "os_compute_api:os-floating-ip-pools:discoverable": "@",
+    "os_compute_api:os-floating-ips": "rule:admin_or_owner",
+    "os_compute_api:os-floating-ips:discoverable": "@",
+    "os_compute_api:os-floating-ips-bulk": "rule:admin_api",
+    "os_compute_api:os-floating-ips-bulk:discoverable": "@",
+    "os_compute_api:os-fping": "rule:admin_or_owner",
+    "os_compute_api:os-fping:discoverable": "@",
+    "os_compute_api:os-fping:all_tenants": "rule:admin_api",
+    "os_compute_api:os-hide-server-addresses": "is_admin:False",
+    "os_compute_api:os-hide-server-addresses:discoverable": "@",
+    "os_compute_api:os-hosts": "rule:admin_api",
+    "os_compute_api:os-hosts:discoverable": "@",
+    "os_compute_api:os-hypervisors": "rule:admin_api",
+    "os_compute_api:os-hypervisors:discoverable": "@",
+    "os_compute_api:images:discoverable": "@",
+    "os_compute_api:image-size": "rule:admin_or_owner",
+    "os_compute_api:image-size:discoverable": "@",
+    "os_compute_api:os-instance-actions": "rule:admin_or_owner",
+    "os_compute_api:os-instance-actions:discoverable": "@",
+    "os_compute_api:os-instance-actions:events": "rule:admin_api",
+    "os_compute_api:os-instance-usage-audit-log": "rule:admin_api",
+    "os_compute_api:os-instance-usage-audit-log:discoverable": "@",
+    "os_compute_api:ips:discoverable": "@",
+    "os_compute_api:ips:index": "rule:admin_or_owner",
+    "os_compute_api:ips:show": "rule:admin_or_owner",
+    "os_compute_api:os-keypairs:discoverable": "@",
+    "os_compute_api:os-keypairs": "rule:admin_or_owner",
+    "os_compute_api:os-keypairs:index": "rule:admin_api or user_id:%(user_id)s",
+    "os_compute_api:os-keypairs:show": "rule:admin_api or user_id:%(user_id)s",
+    "os_compute_api:os-keypairs:create": "rule:admin_api or user_id:%(user_id)s",
+    "os_compute_api:os-keypairs:delete": "rule:admin_api or user_id:%(user_id)s",
+    "os_compute_api:limits:discoverable": "@",
+    "os_compute_api:limits": "rule:admin_or_owner",
+    "os_compute_api:os-lock-server:discoverable": "@",
+    "os_compute_api:os-lock-server:lock": "rule:admin_or_owner",
+    "os_compute_api:os-lock-server:unlock": "rule:admin_or_owner",
+    "os_compute_api:os-lock-server:unlock:unlock_override": "rule:admin_api",
+    "os_compute_api:os-migrate-server:discoverable": "@",
+    "os_compute_api:os-migrate-server:migrate": "rule:admin_api",
+    "os_compute_api:os-migrate-server:migrate_live": "rule:admin_api",
+    "os_compute_api:os-multinic": "rule:admin_or_owner",
+    "os_compute_api:os-multinic:discoverable": "@",
+    "os_compute_api:os-networks": "rule:admin_api",
+    "os_compute_api:os-networks:view": "rule:admin_or_owner",
+    "os_compute_api:os-networks:discoverable": "@",
+    "os_compute_api:os-networks-associate": "rule:admin_api",
+    "os_compute_api:os-networks-associate:discoverable": "@",
+    "os_compute_api:os-pause-server:discoverable": "@",
+    "os_compute_api:os-pause-server:pause": "rule:admin_or_owner",
+    "os_compute_api:os-pause-server:unpause": "rule:admin_or_owner",
+    "os_compute_api:os-pci:pci_servers": "rule:admin_or_owner",
+    "os_compute_api:os-pci:discoverable": "@",
+    "os_compute_api:os-pci:index": "rule:admin_api",
+    "os_compute_api:os-pci:detail": "rule:admin_api",
+    "os_compute_api:os-pci:show": "rule:admin_api",
+    "os_compute_api:os-personality:discoverable": "@",
+    "os_compute_api:os-preserve-ephemeral-rebuild:discoverable": "@",
+    "os_compute_api:os-quota-sets:discoverable": "@",
+    "os_compute_api:os-quota-sets:show": "rule:admin_or_owner",
+    "os_compute_api:os-quota-sets:defaults": "@",
+    "os_compute_api:os-quota-sets:update": "rule:admin_api",
+    "os_compute_api:os-quota-sets:delete": "rule:admin_api",
+    "os_compute_api:os-quota-sets:detail": "rule:admin_api",
+    "os_compute_api:os-quota-class-sets:update": "rule:admin_api",
+    "os_compute_api:os-quota-class-sets:show": "is_admin:True or quota_class:%(quota_class)s",
+    "os_compute_api:os-quota-class-sets:discoverable": "@",
+    "os_compute_api:os-rescue": "rule:admin_or_owner",
+    "os_compute_api:os-rescue:discoverable": "@",
+    "os_compute_api:os-scheduler-hints:discoverable": "@",
+    "os_compute_api:os-security-group-default-rules:discoverable": "@",
+    "os_compute_api:os-security-group-default-rules": "rule:admin_api",
+    "os_compute_api:os-security-groups": "rule:admin_or_owner",
+    "os_compute_api:os-security-groups:discoverable": "@",
+    "os_compute_api:os-server-diagnostics": "rule:admin_api",
+    "os_compute_api:os-server-diagnostics:discoverable": "@",
+    "os_compute_api:os-server-password": "rule:admin_or_owner",
+    "os_compute_api:os-server-password:discoverable": "@",
+    "os_compute_api:os-server-usage": "rule:admin_or_owner",
+    "os_compute_api:os-server-usage:discoverable": "@",
+    "os_compute_api:os-server-groups": "rule:admin_or_owner",
+    "os_compute_api:os-server-groups:discoverable": "@",
+    "os_compute_api:os-server-tags:index": "@",
+    "os_compute_api:os-server-tags:show": "@",
+    "os_compute_api:os-server-tags:update": "@",
+    "os_compute_api:os-server-tags:update_all": "@",
+    "os_compute_api:os-server-tags:delete": "@",
+    "os_compute_api:os-server-tags:delete_all": "@",
     "os_compute_api:os-services": "rule:admin_api",
+    "os_compute_api:os-services:discoverable": "@",
+    "os_compute_api:server-metadata:discoverable": "@",
+    "os_compute_api:server-metadata:index": "rule:admin_or_owner",
+    "os_compute_api:server-metadata:show": "rule:admin_or_owner",
+    "os_compute_api:server-metadata:delete": "rule:admin_or_owner",
+    "os_compute_api:server-metadata:create": "rule:admin_or_owner",
+    "os_compute_api:server-metadata:update": "rule:admin_or_owner",
+    "os_compute_api:server-metadata:update_all": "rule:admin_or_owner",
     "os_compute_api:os-shelve:shelve": "rule:admin_or_owner",
-    "os_compute_api:os-shelve:unshelve": "rule:admin_or_owner",
+    "os_compute_api:os-shelve:shelve:discoverable": "@",
     "os_compute_api:os-shelve:shelve_offload": "rule:admin_api",
+    "os_compute_api:os-simple-tenant-usage:discoverable": "@",
     "os_compute_api:os-simple-tenant-usage:show": "rule:admin_or_owner",
     "os_compute_api:os-simple-tenant-usage:list": "rule:admin_api",
+    "os_compute_api:os-suspend-server:discoverable": "@",
+    "os_compute_api:os-suspend-server:suspend": "rule:admin_or_owner",
     "os_compute_api:os-suspend-server:resume": "rule:admin_or_owner",
-    "os_compute_api:os-suspend-server:suspend": "rule:admin_or_owner",
     "os_compute_api:os-tenant-networks": "rule:admin_or_owner",
+    "os_compute_api:os-tenant-networks:discoverable": "@",
+    "os_compute_api:os-shelve:unshelve": "rule:admin_or_owner",
+    "os_compute_api:os-user-data:discoverable": "@",
+    "os_compute_api:os-virtual-interfaces": "rule:admin_or_owner",
+    "os_compute_api:os-virtual-interfaces:discoverable": "@",
+    "os_compute_api:os-volumes": "rule:admin_or_owner",
+    "os_compute_api:os-volumes:discoverable": "@",
+    "os_compute_api:os-volumes-attachments:index": "rule:admin_or_owner",
+    "os_compute_api:os-volumes-attachments:show": "rule:admin_or_owner",
+    "os_compute_api:os-volumes-attachments:create": "rule:admin_or_owner",
+    "os_compute_api:os-volumes-attachments:update": "rule:admin_api",
+    "os_compute_api:os-volumes-attachments:delete": "rule:admin_or_owner",
+    "os_compute_api:os-volumes-attachments:discoverable": "@",
+    "os_compute_api:os-availability-zone:list": "rule:admin_or_owner",
+    "os_compute_api:os-availability-zone:discoverable": "@",
+    "os_compute_api:os-availability-zone:detail": "rule:admin_api",
     "os_compute_api:os-used-limits": "rule:admin_api",
-    "os_compute_api:os-virtual-interfaces": "rule:admin_or_owner",
-    "os_compute_api:os-volumes": "rule:admin_or_owner",
-    "os_compute_api:os-volumes-attachments:index": "rule:admin_or_owner",
-    "os_compute_api:os-volumes-attachments:create": "rule:admin_or_owner",
-    "os_compute_api:os-volumes-attachments:show": "rule:admin_or_owner",
-    "os_compute_api:os-volumes-attachments:update": "rule:admin_api",
-    "os_compute_api:os-volumes-attachments:delete": "rule:admin_or_owner"
+    "os_compute_api:os-used-limits:discoverable": "@",
+    "os_compute_api:os-migrations:index": "rule:admin_api",
+    "os_compute_api:os-migrations:discoverable": "@",
+    "os_compute_api:os-assisted-volume-snapshots:create": "rule:admin_api",
+    "os_compute_api:os-assisted-volume-snapshots:delete": "rule:admin_api",
+    "os_compute_api:os-assisted-volume-snapshots:discoverable": "@",
+    "os_compute_api:os-console-auth-tokens": "rule:admin_api",
+    "os_compute_api:os-console-auth-tokens:discoverable": "@",
+    "os_compute_api:os-server-external-events:create": "rule:admin_api",
+    "os_compute_api:os-server-external-events:discoverable": "@"
 }

2018-09-01 23:29:27,935 [salt.state       :1941][INFO    ][7274] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/nova_policy.json] at time 23:29:27.935147 duration_in_ms=22.599
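
The merged nova rules above are evaluated with oslo.policy, the library Nova and Horizon use for these JSON files. A minimal sketch of checking one of them, assuming a local copy of the rendered nova_policy.json (the file name, user, and credential dicts below are hypothetical):

    # Evaluate one rule from the file above:
    #   "os_compute_api:os-keypairs:create": "rule:admin_api or user_id:%(user_id)s"
    from oslo_config import cfg
    from oslo_policy import policy

    enforcer = policy.Enforcer(cfg.ConfigOpts(), use_conf=False)
    with open('nova_policy.json') as f:          # hypothetical local copy
        enforcer.set_rules(policy.Rules.load(f.read()))

    creds = {'user_id': 'alice', 'project_id': 'demo', 'roles': ['member']}
    target = {'user_id': 'alice'}                # %(user_id)s is substituted from the target

    # True: creds['user_id'] matches the target's user_id, so the second
    # branch of the "or" passes even without the admin_api rule.
    print(enforcer.enforce('os_compute_api:os-keypairs:create', target, creds))
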
2018-09-01 23:29:27,935 [salt.state       :1770][INFO    ][7274] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/neutron_policy.json] at time 23:29:27.935498
2018-09-01 23:29:27,935 [salt.state       :1803][INFO    ][7274] Executing state file.managed for [/usr/share/openstack-dashboard/openstack_dashboard/conf/neutron_policy.json]
2018-09-01 23:29:27,950 [salt.fileclient  :1215][INFO    ][7274] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/queens/neutron_policy.json'
2018-09-01 23:29:27,952 [salt.state       :290 ][INFO    ][7274] File changed:
--- 
+++ 
@@ -7,8 +7,9 @@
     "admin_owner_or_network_owner": "rule:owner or rule:admin_or_network_owner",
     "admin_only": "rule:context_is_admin",
     "regular_user": "",
-    "admin_or_data_plane_int": "rule:context_is_admin or role:data_plane_integrator",
     "shared": "field:networks:shared=True",
+    "shared_firewalls": "field:firewalls:shared=True",
+    "shared_firewall_policies": "field:firewall_policies:shared=True",
     "shared_subnetpools": "field:subnetpools:shared=True",
     "shared_address_scopes": "field:address_scopes:shared=True",
     "external": "field:networks:router:external=True",
@@ -16,11 +17,9 @@
 
     "create_subnet": "rule:admin_or_network_owner",
     "create_subnet:segment_id": "rule:admin_only",
-    "create_subnet:service_types": "rule:admin_only",
     "get_subnet": "rule:admin_or_owner or rule:shared",
     "get_subnet:segment_id": "rule:admin_only",
     "update_subnet": "rule:admin_or_network_owner",
-    "update_subnet:service_types": "rule:admin_only",
     "delete_subnet": "rule:admin_or_network_owner",
 
     "create_subnetpool": "",
@@ -94,7 +93,6 @@
     "update_port:binding:profile": "rule:admin_only",
     "update_port:mac_learning_enabled": "rule:context_is_advsvc or rule:admin_or_network_owner",
     "update_port:allowed_address_pairs": "rule:admin_or_network_owner",
-    "update_port:data_plane_status": "rule:admin_or_data_plane_int",
     "delete_port": "rule:context_is_advsvc or rule:admin_owner_or_network_owner",
 
     "get_router:ha": "rule:admin_only",
@@ -104,9 +102,6 @@
     "create_router:ha": "rule:admin_only",
     "get_router": "rule:admin_or_owner",
     "get_router:distributed": "rule:admin_only",
-    "update_router": "rule:admin_or_owner",
-    "update_router:external_gateway_info": "rule:admin_or_owner",
-    "update_router:external_gateway_info:network_id": "rule:admin_or_owner",
     "update_router:external_gateway_info:enable_snat": "rule:admin_only",
     "update_router:distributed": "rule:admin_only",
     "update_router:ha": "rule:admin_only",
@@ -117,6 +112,28 @@
 
     "create_router:external_gateway_info:external_fixed_ips": "rule:admin_only",
     "update_router:external_gateway_info:external_fixed_ips": "rule:admin_only",
+
+    "create_firewall": "",
+    "get_firewall": "rule:admin_or_owner",
+    "create_firewall:shared": "rule:admin_only",
+    "get_firewall:shared": "rule:admin_only",
+    "update_firewall": "rule:admin_or_owner",
+    "update_firewall:shared": "rule:admin_only",
+    "delete_firewall": "rule:admin_or_owner",
+
+    "create_firewall_policy": "",
+    "get_firewall_policy": "rule:admin_or_owner or rule:shared_firewall_policies",
+    "create_firewall_policy:shared": "rule:admin_or_owner",
+    "update_firewall_policy": "rule:admin_or_owner",
+    "delete_firewall_policy": "rule:admin_or_owner",
+
+    "insert_rule": "rule:admin_or_owner",
+    "remove_rule": "rule:admin_or_owner",
+
+    "create_firewall_rule": "",
+    "get_firewall_rule": "rule:admin_or_owner or rule:shared_firewalls",
+    "update_firewall_rule": "rule:admin_or_owner",
+    "delete_firewall_rule": "rule:admin_or_owner",
 
     "create_qos_queue": "rule:admin_only",
     "get_qos_queue": "rule:admin_only",
@@ -189,10 +206,6 @@
     "delete_policy_dscp_marking_rule": "rule:admin_only",
     "update_policy_dscp_marking_rule": "rule:admin_only",
     "get_rule_type": "rule:regular_user",
-    "get_policy_minimum_bandwidth_rule": "rule:regular_user",
-    "create_policy_minimum_bandwidth_rule": "rule:admin_only",
-    "delete_policy_minimum_bandwidth_rule": "rule:admin_only",
-    "update_policy_minimum_bandwidth_rule": "rule:admin_only",
 
     "restrict_wildcard": "(not field:rbac_policy:target_tenant=*) or rule:admin_only",
     "create_rbac_policy": "",
@@ -205,29 +218,5 @@
     "create_flavor_service_profile": "rule:admin_only",
     "delete_flavor_service_profile": "rule:admin_only",
     "get_flavor_service_profile": "rule:regular_user",
-    "get_auto_allocated_topology": "rule:admin_or_owner",
-
-    "create_trunk": "rule:regular_user",
-    "get_trunk": "rule:admin_or_owner",
-    "delete_trunk": "rule:admin_or_owner",
-    "get_subports": "",
-    "add_subports": "rule:admin_or_owner",
-    "remove_subports": "rule:admin_or_owner",
-
-    "get_security_groups": "rule:admin_or_owner",
-    "get_security_group": "rule:admin_or_owner",
-    "create_security_group": "rule:admin_or_owner",
-    "update_security_group": "rule:admin_or_owner",
-    "delete_security_group": "rule:admin_or_owner",
-    "get_security_group_rules": "rule:admin_or_owner",
-    "get_security_group_rule": "rule:admin_or_owner",
-    "create_security_group_rule": "rule:admin_or_owner",
-    "delete_security_group_rule": "rule:admin_or_owner",
-
-    "get_loggable_resources": "rule:admin_only",
-    "create_log": "rule:admin_only",
-    "update_log": "rule:admin_only",
-    "delete_log": "rule:admin_only",
-    "get_logs": "rule:admin_only",
-    "get_log": "rule:admin_only"
+    "get_auto_allocated_topology": "rule:admin_or_owner"
 }

2018-09-01 23:29:27,952 [salt.state       :1941][INFO    ][7274] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/neutron_policy.json] at time 23:29:27.952260 duration_in_ms=16.762
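
Per the diff, the managed Queens copy adds the FWaaS v1 rules (wired to the new shared_firewalls / shared_firewall_policies aliases) and drops the trunk, security-group, logging, and minimum-bandwidth entries. Note the aliases use neutron's custom "field:" check; under plain oslo.policy that kind falls back to the generic check and effectively never matches, so it only behaves as intended inside neutron itself. A quick sanity check that the rendered file references the aliases (file name is a hypothetical local copy):

    # List the rules that reference the new firewall sharing aliases.
    import json

    with open('neutron_policy.json') as f:
        rules = json.load(f)

    for name, rule in sorted(rules.items()):
        if 'shared_firewall' in rule:
            print(name, '->', rule)
    # Expected, per the diff above:
    #   get_firewall_policy -> rule:admin_or_owner or rule:shared_firewall_policies
    #   get_firewall_rule -> rule:admin_or_owner or rule:shared_firewalls
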
2018-09-01 23:29:27,952 [salt.state       :1770][INFO    ][7274] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/glance_policy.json] at time 23:29:27.952577
2018-09-01 23:29:27,952 [salt.state       :1803][INFO    ][7274] Executing state file.managed for [/usr/share/openstack-dashboard/openstack_dashboard/conf/glance_policy.json]
2018-09-01 23:29:27,967 [salt.fileclient  :1215][INFO    ][7274] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/queens/glance_policy.json'
2018-09-01 23:29:27,968 [salt.state       :290 ][INFO    ][7274] File changed:
--- 
+++ 
@@ -8,7 +8,6 @@
     "get_images": "",
     "modify_image": "",
     "publicize_image": "role:admin",
-    "communitize_image": "",
     "copy_from": "",
 
     "download_image": "",
@@ -26,11 +25,10 @@
 
     "manage_image_cache": "role:admin",
 
-    "get_task": "",
-    "get_tasks": "",
-    "add_task": "",
-    "modify_task": "",
-    "tasks_api_access": "role:admin",
+    "get_task": "role:admin",
+    "get_tasks": "role:admin",
+    "add_task": "role:admin",
+    "modify_task": "role:admin",
 
     "deactivate": "",
     "reactivate": "",

2018-09-01 23:29:27,968 [salt.state       :1941][INFO    ][7274] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/glance_policy.json] at time 23:29:27.968836 duration_in_ms=16.259
2018-09-01 23:29:27,969 [salt.state       :1770][INFO    ][7274] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/ceilometer_policy.json] at time 23:29:27.969153
2018-09-01 23:29:27,969 [salt.state       :1803][INFO    ][7274] Executing state file.managed for [/usr/share/openstack-dashboard/openstack_dashboard/conf/ceilometer_policy.json]
2018-09-01 23:29:27,983 [salt.fileclient  :1215][INFO    ][7274] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/queens/ceilometer_policy.json'
2018-09-01 23:29:27,984 [salt.state       :290 ][INFO    ][7274] File changed:
New file
2018-09-01 23:29:27,984 [salt.state       :1941][INFO    ][7274] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/ceilometer_policy.json] at time 23:29:27.984350 duration_in_ms=15.197
2018-09-01 23:29:27,984 [salt.state       :1770][INFO    ][7274] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/cinder_policy.json] at time 23:29:27.984674
2018-09-01 23:29:27,984 [salt.state       :1803][INFO    ][7274] Executing state file.managed for [/usr/share/openstack-dashboard/openstack_dashboard/conf/cinder_policy.json]
2018-09-01 23:29:27,999 [salt.fileclient  :1215][INFO    ][7274] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/queens/cinder_policy.json'
2018-09-01 23:29:28,000 [salt.state       :290 ][INFO    ][7274] File changed:
--- 
+++ 
@@ -1,136 +1,113 @@
 {
     "context_is_admin": "role:admin",
-    "admin_or_owner": "is_admin:True or (role:admin and is_admin_project:True) or  project_id:%(project_id)s",
-    "admin_api": "is_admin:True or (role:admin and is_admin_project:True)",
-    "volume:attachment_create": "",
-    "volume:attachment_update": "rule:admin_or_owner",
-    "volume:attachment_delete": "rule:admin_or_owner",
-    "message:get_all": "rule:admin_or_owner",
-    "message:get": "rule:admin_or_owner",
-    "message:delete": "rule:admin_or_owner",
-    "clusters:get_all": "rule:admin_api",
-    "clusters:get": "rule:admin_api",
-    "clusters:update": "rule:admin_api",
-    "workers:cleanup": "rule:admin_api",
+    "admin_or_owner":  "is_admin:True or project_id:%(project_id)s",
+    "default": "rule:admin_or_owner",
+
+    "admin_api": "is_admin:True",
+
+    "volume:create": "",
+    "volume:delete": "rule:admin_or_owner",
+    "volume:get": "rule:admin_or_owner",
+    "volume:get_all": "rule:admin_or_owner",
+    "volume:get_volume_metadata": "rule:admin_or_owner",
+    "volume:delete_volume_metadata": "rule:admin_or_owner",
+    "volume:update_volume_metadata": "rule:admin_or_owner",
+    "volume:get_volume_admin_metadata": "rule:admin_api",
+    "volume:update_volume_admin_metadata": "rule:admin_api",
+    "volume:get_snapshot": "rule:admin_or_owner",
+    "volume:get_all_snapshots": "rule:admin_or_owner",
+    "volume:create_snapshot": "rule:admin_or_owner",
+    "volume:delete_snapshot": "rule:admin_or_owner",
+    "volume:update_snapshot": "rule:admin_or_owner",
     "volume:get_snapshot_metadata": "rule:admin_or_owner",
+    "volume:delete_snapshot_metadata": "rule:admin_or_owner",
     "volume:update_snapshot_metadata": "rule:admin_or_owner",
-    "volume:delete_snapshot_metadata": "rule:admin_or_owner",
-    "volume:get_all_snapshots": "rule:admin_or_owner",
-    "volume_extension:extended_snapshot_attributes": "rule:admin_or_owner",
-    "volume:create_snapshot": "rule:admin_or_owner",
-    "volume:get_snapshot": "rule:admin_or_owner",
-    "volume:update_snapshot": "rule:admin_or_owner",
-    "volume:delete_snapshot": "rule:admin_or_owner",
-    "volume_extension:snapshot_admin_actions:reset_status": "rule:admin_api",
-    "snapshot_extension:snapshot_actions:update_snapshot_status": "",
-    "volume_extension:snapshot_admin_actions:force_delete": "rule:admin_api",
-    "snapshot_extension:list_manageable": "rule:admin_api",
-    "snapshot_extension:snapshot_manage": "rule:admin_api",
-    "snapshot_extension:snapshot_unmanage": "rule:admin_api",
-    "backup:get_all": "rule:admin_or_owner",
-    "backup:backup_project_attribute": "rule:admin_api",
-    "backup:create": "",
-    "backup:get": "rule:admin_or_owner",
-    "backup:update": "rule:admin_or_owner",
-    "backup:delete": "rule:admin_or_owner",
-    "backup:restore": "rule:admin_or_owner",
-    "backup:backup-import": "rule:admin_api",
-    "backup:export-import": "rule:admin_api",
-    "volume_extension:backup_admin_actions:reset_status": "rule:admin_api",
-    "volume_extension:backup_admin_actions:force_delete": "rule:admin_api",
-    "group:get_all": "rule:admin_or_owner",
-    "group:create": "",
-    "group:get": "rule:admin_or_owner",
-    "group:update": "rule:admin_or_owner",
-    "group:group_types_manage": "rule:admin_api",
-    "group:access_group_types_specs": "rule:admin_api",
-    "group:group_types_specs": "rule:admin_api",
-    "group:get_all_group_snapshots": "rule:admin_or_owner",
-    "group:create_group_snapshot": "",
-    "group:get_group_snapshot": "rule:admin_or_owner",
-    "group:delete_group_snapshot": "rule:admin_or_owner",
-    "group:update_group_snapshot": "rule:admin_or_owner",
-    "group:reset_group_snapshot_status": "rule:admin_or_owner",
-    "group:delete": "rule:admin_or_owner",
-    "group:reset_status": "rule:admin_api",
-    "group:enable_replication": "rule:admin_or_owner",
-    "group:disable_replication": "rule:admin_or_owner",
-    "group:failover_replication": "rule:admin_or_owner",
-    "group:list_replication_targets": "rule:admin_or_owner",
-    "volume_extension:qos_specs_manage:get_all": "rule:admin_api",
-    "volume_extension:qos_specs_manage:get": "rule:admin_api",
-    "volume_extension:qos_specs_manage:create": "rule:admin_api",
-    "volume_extension:qos_specs_manage:update": "rule:admin_api",
-    "volume_extension:qos_specs_manage:delete": "rule:admin_api",
-    "volume_extension:quota_classes": "rule:admin_api",
-    "volume_extension:quotas:show": "rule:admin_or_owner",
-    "volume_extension:quotas:update": "rule:admin_api",
-    "volume_extension:quotas:delete": "rule:admin_api",
-    "volume_extension:quota_classes:validate_setup_for_nested_quota_use": "rule:admin_api",
-    "volume_extension:capabilities": "rule:admin_api",
-    "volume_extension:services:index": "rule:admin_api",
-    "volume_extension:services:update": "rule:admin_api",
-    "volume:freeze_host": "rule:admin_api",
-    "volume:thaw_host": "rule:admin_api",
-    "volume:failover_host": "rule:admin_api",
-    "scheduler_extension:scheduler_stats:get_pools": "rule:admin_api",
-    "volume_extension:hosts": "rule:admin_api",
-    "limits_extension:used_limits": "rule:admin_or_owner",
-    "volume_extension:list_manageable": "rule:admin_api",
-    "volume_extension:volume_manage": "rule:admin_api",
-    "volume_extension:volume_unmanage": "rule:admin_api",
+    "volume:extend": "rule:admin_or_owner",
+    "volume:update_readonly_flag": "rule:admin_or_owner",
+    "volume:retype": "rule:admin_or_owner",
+    "volume:update": "rule:admin_or_owner",
+
     "volume_extension:types_manage": "rule:admin_api",
-    "volume_extension:volume_type_encryption": "rule:admin_api",
+    "volume_extension:types_extra_specs": "rule:admin_api",
+    "volume_extension:access_types_qos_specs_id": "rule:admin_api",
     "volume_extension:access_types_extra_specs": "rule:admin_api",
-    "volume_extension:access_types_qos_specs_id": "rule:admin_api",
     "volume_extension:volume_type_access": "rule:admin_or_owner",
     "volume_extension:volume_type_access:addProjectAccess": "rule:admin_api",
     "volume_extension:volume_type_access:removeProjectAccess": "rule:admin_api",
-    "volume:extend": "rule:admin_or_owner",
-    "volume:extend_attached_volume": "rule:admin_or_owner",
-    "volume:revert_to_snapshot": "rule:admin_or_owner",
+    "volume_extension:volume_type_encryption": "rule:admin_api",
+    "volume_extension:volume_encryption_metadata": "rule:admin_or_owner",
+    "volume_extension:extended_snapshot_attributes": "rule:admin_or_owner",
+    "volume_extension:volume_image_metadata": "rule:admin_or_owner",
+
+    "volume_extension:quotas:show": "",
+    "volume_extension:quotas:update": "rule:admin_api",
+    "volume_extension:quotas:delete": "rule:admin_api",
+    "volume_extension:quota_classes": "rule:admin_api",
+    "volume_extension:quota_classes:validate_setup_for_nested_quota_use": "rule:admin_api",
+
     "volume_extension:volume_admin_actions:reset_status": "rule:admin_api",
-    "volume:retype": "rule:admin_or_owner",
-    "volume:update_readonly_flag": "rule:admin_or_owner",
+    "volume_extension:snapshot_admin_actions:reset_status": "rule:admin_api",
+    "volume_extension:backup_admin_actions:reset_status": "rule:admin_api",
     "volume_extension:volume_admin_actions:force_delete": "rule:admin_api",
+    "volume_extension:volume_admin_actions:force_detach": "rule:admin_api",
+    "volume_extension:snapshot_admin_actions:force_delete": "rule:admin_api",
+    "volume_extension:backup_admin_actions:force_delete": "rule:admin_api",
+    "volume_extension:volume_admin_actions:migrate_volume": "rule:admin_api",
+    "volume_extension:volume_admin_actions:migrate_volume_completion": "rule:admin_api",
+
     "volume_extension:volume_actions:upload_public": "rule:admin_api",
     "volume_extension:volume_actions:upload_image": "rule:admin_or_owner",
-    "volume_extension:volume_admin_actions:force_detach": "rule:admin_api",
-    "volume_extension:volume_admin_actions:migrate_volume": "rule:admin_api",
-    "volume_extension:volume_admin_actions:migrate_volume_completion": "rule:admin_api",
-    "volume_extension:volume_actions:initialize_connection": "rule:admin_or_owner",
-    "volume_extension:volume_actions:terminate_connection": "rule:admin_or_owner",
-    "volume_extension:volume_actions:roll_detaching": "rule:admin_or_owner",
-    "volume_extension:volume_actions:reserve": "rule:admin_or_owner",
-    "volume_extension:volume_actions:unreserve": "rule:admin_or_owner",
-    "volume_extension:volume_actions:begin_detaching": "rule:admin_or_owner",
-    "volume_extension:volume_actions:attach": "rule:admin_or_owner",
-    "volume_extension:volume_actions:detach": "rule:admin_or_owner",
-    "volume:get_all_transfers": "rule:admin_or_owner",
-    "volume:create_transfer": "rule:admin_or_owner",
-    "volume:get_transfer": "rule:admin_or_owner",
-    "volume:accept_transfer": "",
-    "volume:delete_transfer": "rule:admin_or_owner",
-    "volume:get_volume_metadata": "rule:admin_or_owner",
-    "volume:create_volume_metadata": "rule:admin_or_owner",
-    "volume:update_volume_metadata": "rule:admin_or_owner",
-    "volume:delete_volume_metadata": "rule:admin_or_owner",
-    "volume_extension:volume_image_metadata": "rule:admin_or_owner",
-    "volume:update_volume_admin_metadata": "rule:admin_api",
-    "volume_extension:types_extra_specs:index": "rule:admin_api",
-    "volume_extension:types_extra_specs:create": "rule:admin_api",
-    "volume_extension:types_extra_specs:show": "rule:admin_api",
-    "volume_extension:types_extra_specs:update": "rule:admin_api",
-    "volume_extension:types_extra_specs:delete": "rule:admin_api",
-    "volume:create": "",
-    "volume:create_from_image": "",
-    "volume:get": "rule:admin_or_owner",
-    "volume:get_all": "rule:admin_or_owner",
-    "volume:update": "rule:admin_or_owner",
-    "volume:delete": "rule:admin_or_owner",
-    "volume:force_delete": "rule:admin_api",
+
     "volume_extension:volume_host_attribute": "rule:admin_api",
     "volume_extension:volume_tenant_attribute": "rule:admin_or_owner",
     "volume_extension:volume_mig_status_attribute": "rule:admin_api",
-    "volume_extension:volume_encryption_metadata": "rule:admin_or_owner"
+    "volume_extension:hosts": "rule:admin_api",
+    "volume_extension:services:index": "rule:admin_api",
+    "volume_extension:services:update" : "rule:admin_api",
+
+    "volume_extension:volume_manage": "rule:admin_api",
+    "volume_extension:volume_unmanage": "rule:admin_api",
+
+    "volume_extension:capabilities": "rule:admin_api",
+
+    "volume:create_transfer": "rule:admin_or_owner",
+    "volume:accept_transfer": "",
+    "volume:delete_transfer": "rule:admin_or_owner",
+    "volume:get_transfer": "rule:admin_or_owner",
+    "volume:get_all_transfers": "rule:admin_or_owner",
+
+    "volume_extension:replication:promote": "rule:admin_api",
+    "volume_extension:replication:reenable": "rule:admin_api",
+
+    "volume:failover_host": "rule:admin_api",
+    "volume:freeze_host": "rule:admin_api",
+    "volume:thaw_host": "rule:admin_api",
+
+    "backup:create" : "",
+    "backup:delete": "rule:admin_or_owner",
+    "backup:get": "rule:admin_or_owner",
+    "backup:get_all": "rule:admin_or_owner",
+    "backup:restore": "rule:admin_or_owner",
+    "backup:backup-import": "rule:admin_api",
+    "backup:backup-export": "rule:admin_api",
+
+    "snapshot_extension:snapshot_actions:update_snapshot_status": "",
+    "snapshot_extension:snapshot_manage": "rule:admin_api",
+    "snapshot_extension:snapshot_unmanage": "rule:admin_api",
+
+    "consistencygroup:create" : "group:nobody",
+    "consistencygroup:delete": "group:nobody",
+    "consistencygroup:update": "group:nobody",
+    "consistencygroup:get": "group:nobody",
+    "consistencygroup:get_all": "group:nobody",
+
+    "consistencygroup:create_cgsnapshot" : "group:nobody",
+    "consistencygroup:delete_cgsnapshot": "group:nobody",
+    "consistencygroup:get_cgsnapshot": "group:nobody",
+    "consistencygroup:get_all_cgsnapshots": "group:nobody",
+
+    "scheduler_extension:scheduler_stats:get_pools" : "rule:admin_api",
+    "message:delete": "rule:admin_or_owner",
+    "message:get": "rule:admin_or_owner",
+    "message:get_all": "rule:admin_or_owner"
 }

2018-09-01 23:29:28,001 [salt.state       :1941][INFO    ][7274] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/cinder_policy.json] at time 23:29:28.001309 duration_in_ms=16.635
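
In the cinder file, the consistencygroup rules are all set to "group:nobody", a conventional way of disabling an API: oslo.policy's generic check compares the credentials' (nonexistent) group attribute to the literal string "nobody", so the rule can never pass. A minimal sketch of that behaviour, with the rule set inlined for brevity:

    # "group:nobody" never matches because no credential dict carries a
    # 'group' key, so enforcement of the consistency-group APIs always fails.
    from oslo_config import cfg
    from oslo_policy import policy

    enforcer = policy.Enforcer(cfg.ConfigOpts(), use_conf=False)
    enforcer.set_rules(policy.Rules.load(
        '{"consistencygroup:create": "group:nobody"}'))

    print(enforcer.enforce('consistencygroup:create',
                           {}, {'roles': ['admin'], 'is_admin': True}))  # False
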
2018-09-01 23:29:28,001 [salt.state       :1770][INFO    ][7274] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/heat_policy.json] at time 23:29:28.001622
2018-09-01 23:29:28,001 [salt.state       :1803][INFO    ][7274] Executing state file.managed for [/usr/share/openstack-dashboard/openstack_dashboard/conf/heat_policy.json]
2018-09-01 23:29:28,016 [salt.fileclient  :1215][INFO    ][7274] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/queens/heat_policy.json'
2018-09-01 23:29:28,017 [salt.state       :290 ][INFO    ][7274] File changed:
New file
2018-09-01 23:29:28,017 [salt.state       :1941][INFO    ][7274] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/heat_policy.json] at time 23:29:28.017601 duration_in_ms=15.978
2018-09-01 23:29:28,017 [salt.state       :1770][INFO    ][7274] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/keystone_policy.json] at time 23:29:28.017922
2018-09-01 23:29:28,018 [salt.state       :1803][INFO    ][7274] Executing state file.managed for [/usr/share/openstack-dashboard/openstack_dashboard/conf/keystone_policy.json]
2018-09-01 23:29:28,039 [salt.fileclient  :1215][INFO    ][7274] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/queens/keystone_policy.json'
2018-09-01 23:29:28,041 [salt.state       :290 ][INFO    ][7274] File changed:
--- 
+++ 
@@ -2,50 +2,137 @@
     "admin_required": "role:admin or is_admin:1",
     "service_role": "role:service",
     "service_or_admin": "rule:admin_required or rule:service_role",
-    "owner": "user_id:%(user_id)s",
+    "owner" : "user_id:%(user_id)s",
     "admin_or_owner": "rule:admin_required or rule:owner",
     "token_subject": "user_id:%(target.token.user_id)s",
     "admin_or_token_subject": "rule:admin_required or rule:token_subject",
     "service_admin_or_token_subject": "rule:service_or_admin or rule:token_subject",
-    "identity:authorize_request_token": "rule:admin_required",
-    "identity:get_access_token": "rule:admin_required",
-    "identity:get_access_token_role": "rule:admin_required",
-    "identity:list_access_tokens": "rule:admin_required",
-    "identity:list_access_token_roles": "rule:admin_required",
-    "identity:delete_access_token": "rule:admin_required",
-    "identity:get_auth_catalog": "",
-    "identity:get_auth_projects": "",
-    "identity:get_auth_domains": "",
-    "identity:get_consumer": "rule:admin_required",
-    "identity:list_consumers": "rule:admin_required",
-    "identity:create_consumer": "rule:admin_required",
-    "identity:update_consumer": "rule:admin_required",
-    "identity:delete_consumer": "rule:admin_required",
+
+    "default": "rule:admin_required",
+
+    "identity:get_region": "",
+    "identity:list_regions": "",
+    "identity:create_region": "rule:admin_required",
+    "identity:update_region": "rule:admin_required",
+    "identity:delete_region": "rule:admin_required",
+
+    "identity:get_service": "rule:admin_required",
+    "identity:list_services": "rule:admin_required",
+    "identity:create_service": "rule:admin_required",
+    "identity:update_service": "rule:admin_required",
+    "identity:delete_service": "rule:admin_required",
+
+    "identity:get_endpoint": "rule:admin_required",
+    "identity:list_endpoints": "rule:admin_required",
+    "identity:create_endpoint": "rule:admin_required",
+    "identity:update_endpoint": "rule:admin_required",
+    "identity:delete_endpoint": "rule:admin_required",
+
+    "identity:get_domain": "rule:admin_required",
+    "identity:list_domains": "rule:admin_required",
+    "identity:create_domain": "rule:admin_required",
+    "identity:update_domain": "rule:admin_required",
+    "identity:delete_domain": "rule:admin_required",
+
+    "identity:get_project": "rule:admin_required or project_id:%(target.project.id)s",
+    "identity:list_projects": "rule:admin_required",
+    "identity:list_user_projects": "rule:admin_or_owner",
+    "identity:create_project": "rule:admin_required",
+    "identity:update_project": "rule:admin_required",
+    "identity:delete_project": "rule:admin_required",
+
+    "identity:get_user": "rule:admin_required",
+    "identity:list_users": "rule:admin_required",
+    "identity:create_user": "rule:admin_required",
+    "identity:update_user": "rule:admin_required",
+    "identity:delete_user": "rule:admin_required",
+    "identity:change_password": "rule:admin_or_owner",
+
+    "identity:get_group": "rule:admin_required",
+    "identity:list_groups": "rule:admin_required",
+    "identity:list_groups_for_user": "rule:admin_or_owner",
+    "identity:create_group": "rule:admin_required",
+    "identity:update_group": "rule:admin_required",
+    "identity:delete_group": "rule:admin_required",
+    "identity:list_users_in_group": "rule:admin_required",
+    "identity:remove_user_from_group": "rule:admin_required",
+    "identity:check_user_in_group": "rule:admin_required",
+    "identity:add_user_to_group": "rule:admin_required",
+
     "identity:get_credential": "rule:admin_required",
     "identity:list_credentials": "rule:admin_required",
     "identity:create_credential": "rule:admin_required",
     "identity:update_credential": "rule:admin_required",
     "identity:delete_credential": "rule:admin_required",
-    "identity:get_domain": "rule:admin_required or token.project.domain.id:%(target.domain.id)s",
-    "identity:list_domains": "rule:admin_required",
-    "identity:create_domain": "rule:admin_required",
-    "identity:update_domain": "rule:admin_required",
-    "identity:delete_domain": "rule:admin_required",
-    "identity:create_domain_config": "rule:admin_required",
-    "identity:get_domain_config": "rule:admin_required",
-    "identity:get_security_compliance_domain_config": "",
-    "identity:update_domain_config": "rule:admin_required",
-    "identity:delete_domain_config": "rule:admin_required",
-    "identity:get_domain_config_default": "rule:admin_required",
+
     "identity:ec2_get_credential": "rule:admin_required or (rule:owner and user_id:%(target.credential.user_id)s)",
     "identity:ec2_list_credentials": "rule:admin_or_owner",
     "identity:ec2_create_credential": "rule:admin_or_owner",
     "identity:ec2_delete_credential": "rule:admin_required or (rule:owner and user_id:%(target.credential.user_id)s)",
-    "identity:get_endpoint": "rule:admin_required",
-    "identity:list_endpoints": "rule:admin_required",
-    "identity:create_endpoint": "rule:admin_required",
-    "identity:update_endpoint": "rule:admin_required",
-    "identity:delete_endpoint": "rule:admin_required",
+
+    "identity:get_role": "rule:admin_required",
+    "identity:list_roles": "rule:admin_required",
+    "identity:create_role": "rule:admin_required",
+    "identity:update_role": "rule:admin_required",
+    "identity:delete_role": "rule:admin_required",
+    "identity:get_domain_role": "rule:admin_required",
+    "identity:list_domain_roles": "rule:admin_required",
+    "identity:create_domain_role": "rule:admin_required",
+    "identity:update_domain_role": "rule:admin_required",
+    "identity:delete_domain_role": "rule:admin_required",
+
+    "identity:get_implied_role": "rule:admin_required ",
+    "identity:list_implied_roles": "rule:admin_required",
+    "identity:create_implied_role": "rule:admin_required",
+    "identity:delete_implied_role": "rule:admin_required",
+    "identity:list_role_inference_rules": "rule:admin_required",
+    "identity:check_implied_role": "rule:admin_required",
+
+    "identity:check_grant": "rule:admin_required",
+    "identity:list_grants": "rule:admin_required",
+    "identity:create_grant": "rule:admin_required",
+    "identity:revoke_grant": "rule:admin_required",
+
+    "identity:list_role_assignments": "rule:admin_required",
+    "identity:list_role_assignments_for_tree": "rule:admin_required",
+
+    "identity:get_policy": "rule:admin_required",
+    "identity:list_policies": "rule:admin_required",
+    "identity:create_policy": "rule:admin_required",
+    "identity:update_policy": "rule:admin_required",
+    "identity:delete_policy": "rule:admin_required",
+
+    "identity:check_token": "rule:admin_or_token_subject",
+    "identity:validate_token": "rule:service_admin_or_token_subject",
+    "identity:validate_token_head": "rule:service_or_admin",
+    "identity:revocation_list": "rule:service_or_admin",
+    "identity:revoke_token": "rule:admin_or_token_subject",
+
+    "identity:create_trust": "user_id:%(trust.trustor_user_id)s",
+    "identity:list_trusts": "",
+    "identity:list_roles_for_trust": "",
+    "identity:get_role_for_trust": "",
+    "identity:delete_trust": "",
+
+    "identity:create_consumer": "rule:admin_required",
+    "identity:get_consumer": "rule:admin_required",
+    "identity:list_consumers": "rule:admin_required",
+    "identity:delete_consumer": "rule:admin_required",
+    "identity:update_consumer": "rule:admin_required",
+
+    "identity:authorize_request_token": "rule:admin_required",
+    "identity:list_access_token_roles": "rule:admin_required",
+    "identity:get_access_token_role": "rule:admin_required",
+    "identity:list_access_tokens": "rule:admin_required",
+    "identity:get_access_token": "rule:admin_required",
+    "identity:delete_access_token": "rule:admin_required",
+
+    "identity:list_projects_for_endpoint": "rule:admin_required",
+    "identity:add_endpoint_to_project": "rule:admin_required",
+    "identity:check_endpoint_in_project": "rule:admin_required",
+    "identity:list_endpoints_for_project": "rule:admin_required",
+    "identity:remove_endpoint_from_project": "rule:admin_required",
+
     "identity:create_endpoint_group": "rule:admin_required",
     "identity:list_endpoint_groups": "rule:admin_required",
     "identity:get_endpoint_group": "rule:admin_required",
@@ -57,41 +144,40 @@
     "identity:list_endpoint_groups_for_project": "rule:admin_required",
     "identity:add_endpoint_group_to_project": "rule:admin_required",
     "identity:remove_endpoint_group_from_project": "rule:admin_required",
-    "identity:check_grant": "rule:admin_required",
-    "identity:list_grants": "rule:admin_required",
-    "identity:create_grant": "rule:admin_required",
-    "identity:revoke_grant": "rule:admin_required",
-    "identity:get_group": "rule:admin_required",
-    "identity:list_groups": "rule:admin_required",
-    "identity:list_groups_for_user": "rule:admin_or_owner",
-    "identity:create_group": "rule:admin_required",
-    "identity:update_group": "rule:admin_required",
-    "identity:delete_group": "rule:admin_required",
-    "identity:list_users_in_group": "rule:admin_required",
-    "identity:remove_user_from_group": "rule:admin_required",
-    "identity:check_user_in_group": "rule:admin_required",
-    "identity:add_user_to_group": "rule:admin_required",
+
     "identity:create_identity_provider": "rule:admin_required",
     "identity:list_identity_providers": "rule:admin_required",
-    "identity:get_identity_provider": "rule:admin_required",
+    "identity:get_identity_providers": "rule:admin_required",
     "identity:update_identity_provider": "rule:admin_required",
     "identity:delete_identity_provider": "rule:admin_required",
-    "identity:get_implied_role": "rule:admin_required",
-    "identity:list_implied_roles": "rule:admin_required",
-    "identity:create_implied_role": "rule:admin_required",
-    "identity:delete_implied_role": "rule:admin_required",
-    "identity:list_role_inference_rules": "rule:admin_required",
-    "identity:check_implied_role": "rule:admin_required",
+
+    "identity:create_protocol": "rule:admin_required",
+    "identity:update_protocol": "rule:admin_required",
+    "identity:get_protocol": "rule:admin_required",
+    "identity:list_protocols": "rule:admin_required",
+    "identity:delete_protocol": "rule:admin_required",
+
     "identity:create_mapping": "rule:admin_required",
     "identity:get_mapping": "rule:admin_required",
     "identity:list_mappings": "rule:admin_required",
     "identity:delete_mapping": "rule:admin_required",
     "identity:update_mapping": "rule:admin_required",
-    "identity:get_policy": "rule:admin_required",
-    "identity:list_policies": "rule:admin_required",
-    "identity:create_policy": "rule:admin_required",
-    "identity:update_policy": "rule:admin_required",
-    "identity:delete_policy": "rule:admin_required",
+
+    "identity:create_service_provider": "rule:admin_required",
+    "identity:list_service_providers": "rule:admin_required",
+    "identity:get_service_provider": "rule:admin_required",
+    "identity:update_service_provider": "rule:admin_required",
+    "identity:delete_service_provider": "rule:admin_required",
+
+    "identity:get_auth_catalog": "",
+    "identity:get_auth_projects": "",
+    "identity:get_auth_domains": "",
+
+    "identity:list_projects_for_groups": "",
+    "identity:list_domains_for_groups": "",
+
+    "identity:list_revoke_events": "",
+
     "identity:create_policy_association_for_endpoint": "rule:admin_required",
     "identity:check_policy_association_for_endpoint": "rule:admin_required",
     "identity:delete_policy_association_for_endpoint": "rule:admin_required",
@@ -103,72 +189,10 @@
     "identity:delete_policy_association_for_region_and_service": "rule:admin_required",
     "identity:get_policy_for_endpoint": "rule:admin_required",
     "identity:list_endpoints_for_policy": "rule:admin_required",
-    "identity:get_project": "rule:admin_required or project_id:%(target.project.id)s",
-    "identity:list_projects": "rule:admin_required",
-    "identity:list_user_projects": "rule:admin_or_owner",
-    "identity:create_project": "rule:admin_required",
-    "identity:update_project": "rule:admin_required",
-    "identity:delete_project": "rule:admin_required",
-    "identity:list_project_tags": "rule:admin_required or project_id:%(target.project.id)s",
-    "identity:get_project_tag": "rule:admin_required or project_id:%(target.project.id)s",
-    "identity:update_project_tags": "rule:admin_required",
-    "identity:create_project_tag": "rule:admin_required",
-    "identity:delete_project_tags": "rule:admin_required",
-    "identity:delete_project_tag": "rule:admin_required",
-    "identity:list_projects_for_endpoint": "rule:admin_required",
-    "identity:add_endpoint_to_project": "rule:admin_required",
-    "identity:check_endpoint_in_project": "rule:admin_required",
-    "identity:list_endpoints_for_project": "rule:admin_required",
-    "identity:remove_endpoint_from_project": "rule:admin_required",
-    "identity:create_protocol": "rule:admin_required",
-    "identity:update_protocol": "rule:admin_required",
-    "identity:get_protocol": "rule:admin_required",
-    "identity:list_protocols": "rule:admin_required",
-    "identity:delete_protocol": "rule:admin_required",
-    "identity:get_region": "",
-    "identity:list_regions": "",
-    "identity:create_region": "rule:admin_required",
-    "identity:update_region": "rule:admin_required",
-    "identity:delete_region": "rule:admin_required",
-    "identity:list_revoke_events": "rule:service_or_admin",
-    "identity:get_role": "rule:admin_required",
-    "identity:list_roles": "rule:admin_required",
-    "identity:create_role": "rule:admin_required",
-    "identity:update_role": "rule:admin_required",
-    "identity:delete_role": "rule:admin_required",
-    "identity:get_domain_role": "rule:admin_required",
-    "identity:list_domain_roles": "rule:admin_required",
-    "identity:create_domain_role": "rule:admin_required",
-    "identity:update_domain_role": "rule:admin_required",
-    "identity:delete_domain_role": "rule:admin_required",
-    "identity:list_role_assignments": "rule:admin_required",
-    "identity:list_role_assignments_for_tree": "rule:admin_required",
-    "identity:get_service": "rule:admin_required",
-    "identity:list_services": "rule:admin_required",
-    "identity:create_service": "rule:admin_required",
-    "identity:update_service": "rule:admin_required",
-    "identity:delete_service": "rule:admin_required",
-    "identity:create_service_provider": "rule:admin_required",
-    "identity:list_service_providers": "rule:admin_required",
-    "identity:get_service_provider": "rule:admin_required",
-    "identity:update_service_provider": "rule:admin_required",
-    "identity:delete_service_provider": "rule:admin_required",
-    "identity:revocation_list": "rule:service_or_admin",
-    "identity:check_token": "rule:admin_or_token_subject",
-    "identity:validate_token": "rule:service_admin_or_token_subject",
-    "identity:validate_token_head": "rule:service_or_admin",
-    "identity:revoke_token": "rule:admin_or_token_subject",
-    "identity:create_trust": "user_id:%(trust.trustor_user_id)s",
-    "identity:list_trusts": "",
-    "identity:list_roles_for_trust": "",
-    "identity:get_role_for_trust": "",
-    "identity:delete_trust": "",
-    "identity:get_trust": "",
-    "identity:get_user": "rule:admin_or_owner",
-    "identity:list_users": "rule:admin_required",
-    "identity:list_projects_for_user": "",
-    "identity:list_domains_for_user": "",
-    "identity:create_user": "rule:admin_required",
-    "identity:update_user": "rule:admin_required",
-    "identity:delete_user": "rule:admin_required"
+
+    "identity:create_domain_config": "rule:admin_required",
+    "identity:get_domain_config": "rule:admin_required",
+    "identity:update_domain_config": "rule:admin_required",
+    "identity:delete_domain_config": "rule:admin_required",
+    "identity:get_domain_config_default": "rule:admin_required"
 }

2018-09-01 23:29:28,041 [salt.state       :1941][INFO    ][7274] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/keystone_policy.json] at time 23:29:28.041766 duration_in_ms=23.845
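
The JSON diff just closed is Salt's change report for Horizon's keystone_policy.json: the rewrite regroups the policy targets into blocks and leaves almost every identity call on rule:admin_required, with owner and token-subject carve-outs for credentials, tokens, and trusts. A minimal sketch of a state that would manage the file this way (the state ID matches the log; source path and ownership are assumptions):

    /usr/share/openstack-dashboard/openstack_dashboard/conf/keystone_policy.json:
      file.managed:
        - source: salt://horizon/files/keystone_policy.json  # hypothetical source path
        - user: root    # assumption; ownership is not shown in the log
        - group: root
        - mode: 644
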
2018-09-01 23:29:28,042 [salt.state       :1770][INFO    ][7274] Running state [/etc/apache2/conf-available/openstack-dashboard.conf] at time 23:29:28.042131
2018-09-01 23:29:28,042 [salt.state       :1803][INFO    ][7274] Executing state file.managed for [/etc/apache2/conf-available/openstack-dashboard.conf]
2018-09-01 23:29:28,058 [salt.fileclient  :1215][INFO    ][7274] Fetching file from saltenv 'base', ** done ** 'horizon/files/openstack-dashboard.conf.Debian'
2018-09-01 23:29:28,101 [salt.state       :290 ][INFO    ][7274] File changed:
New file
2018-09-01 23:29:28,101 [salt.state       :1941][INFO    ][7274] Completed state [/etc/apache2/conf-available/openstack-dashboard.conf] at time 23:29:28.101904 duration_in_ms=59.773
2018-09-01 23:29:28,106 [salt.state       :1770][INFO    ][7274] Running state [wsgi] at time 23:29:28.106623
2018-09-01 23:29:28,106 [salt.state       :1803][INFO    ][7274] Executing state apache_module.enabled for [wsgi]
2018-09-01 23:29:28,108 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7274] Executing command ['a2enmod', 'wsgi'] in directory '/root'
2018-09-01 23:29:28,191 [salt.state       :290 ][INFO    ][7274] {'new': 'wsgi', 'old': None}
2018-09-01 23:29:28,192 [salt.state       :1941][INFO    ][7274] Completed state [wsgi] at time 23:29:28.192278 duration_in_ms=85.655
2018-09-01 23:29:28,204 [salt.state       :1770][INFO    ][7274] Running state [openstack-dashboard] at time 23:29:28.204172
2018-09-01 23:29:28,204 [salt.state       :1803][INFO    ][7274] Executing state apache_conf.enabled for [openstack-dashboard]
2018-09-01 23:29:28,205 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7274] Executing command ['a2enconf', 'openstack-dashboard'] in directory '/root'
2018-09-01 23:29:28,253 [salt.state       :290 ][INFO    ][7274] {'new': 'openstack-dashboard', 'old': None}
2018-09-01 23:29:28,253 [salt.state       :1941][INFO    ][7274] Completed state [openstack-dashboard] at time 23:29:28.253696 duration_in_ms=49.524
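
The two states above wrap Debian's a2enmod and a2enconf helpers via Salt's apache_module and apache_conf state modules. A sketch of the likely declarations (the require link is an assumption):

    wsgi:
      apache_module.enabled: []    # shells out to `a2enmod wsgi` when the module is disabled

    openstack-dashboard:
      apache_conf.enabled:         # shells out to `a2enconf openstack-dashboard`
        - require:
          - file: /etc/apache2/conf-available/openstack-dashboard.conf
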
2018-09-01 23:29:28,687 [salt.state       :1770][INFO    ][7274] Running state [/var/log/horizon] at time 23:29:28.687457
2018-09-01 23:29:28,688 [salt.state       :1803][INFO    ][7274] Executing state file.directory for [/var/log/horizon]
2018-09-01 23:29:28,689 [salt.state       :290 ][INFO    ][7274] {'/var/log/horizon': 'New Dir'}
2018-09-01 23:29:28,690 [salt.state       :1941][INFO    ][7274] Completed state [/var/log/horizon] at time 23:29:28.690016 duration_in_ms=2.559
2018-09-01 23:29:28,690 [salt.state       :1770][INFO    ][7274] Running state [/var/log/horizon/horizon.log] at time 23:29:28.690480
2018-09-01 23:29:28,690 [salt.state       :1803][INFO    ][7274] Executing state file.managed for [/var/log/horizon/horizon.log]
2018-09-01 23:29:28,691 [salt.loaded.int.states.file:2150][WARNING ][7274] State for file: /var/log/horizon/horizon.log - Neither 'source' nor 'contents' nor 'contents_pillar' nor 'contents_grains' was defined, yet 'replace' was set to 'True'. As there is no source to replace the file with, 'replace' has been set to 'False' to avoid reading the file unnecessarily.
2018-09-01 23:29:28,692 [salt.state       :290 ][INFO    ][7274] {'new': 'file /var/log/horizon/horizon.log created', 'group': 'adm', 'mode': '0640', 'user': 'horizon'}
2018-09-01 23:29:28,692 [salt.state       :1941][INFO    ][7274] Completed state [/var/log/horizon/horizon.log] at time 23:29:28.692552 duration_in_ms=2.072
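
The WARNING above means the SLS declared file.managed for horizon.log with no source or contents while replace was left at its default of True, so Salt downgraded replace to False on the fly. Declaring it explicitly silences the warning; a sketch using the ownership and mode from the logged result:

    /var/log/horizon/horizon.log:
      file.managed:
        - user: horizon
        - group: adm
        - mode: 640
        - replace: False    # touch the file into existence, never rewrite its contents
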
2018-09-01 23:29:28,693 [salt.state       :1770][INFO    ][7274] Running state [apache2] at time 23:29:28.693330
2018-09-01 23:29:28,693 [salt.state       :1803][INFO    ][7274] Executing state service.running for [apache2]
2018-09-01 23:29:28,694 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7274] Executing command ['systemctl', 'is-active', 'apache2.service'] in directory '/root'
2018-09-01 23:29:28,708 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7274] Executing command ['systemctl', 'is-enabled', 'apache2.service'] in directory '/root'
2018-09-01 23:29:28,725 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7274] Executing command ['systemd-run', '--scope', 'systemctl', 'start', 'apache2.service'] in directory '/root'
2018-09-01 23:29:30,086 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7274] Executing command ['systemctl', 'is-active', 'apache2.service'] in directory '/root'
2018-09-01 23:29:30,103 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7274] Executing command ['systemctl', 'is-enabled', 'apache2.service'] in directory '/root'
2018-09-01 23:29:30,122 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7274] Executing command ['systemctl', 'is-enabled', 'apache2.service'] in directory '/root'
2018-09-01 23:29:30,143 [salt.state       :290 ][INFO    ][7274] {'apache2': True}
2018-09-01 23:29:30,144 [salt.state       :1941][INFO    ][7274] Completed state [apache2] at time 23:29:30.143999 duration_in_ms=1450.669
2018-09-01 23:29:30,147 [salt.minion      :1708][INFO    ][7274] Returning information for job: 20180901232603765218
2018-09-01 23:29:30,819 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command state.sls with jid 20180901232930804446
2018-09-01 23:29:30,838 [salt.minion      :1431][INFO    ][12934] Starting a new job with PID 12934
2018-09-01 23:29:34,706 [salt.state       :905 ][INFO    ][12934] Loading fresh modules for state activity
2018-09-01 23:29:35,058 [salt.fileclient  :1215][INFO    ][12934] Fetching file from saltenv 'base', ** done ** 'nginx/init.sls'
2018-09-01 23:29:35,086 [salt.fileclient  :1215][INFO    ][12934] Fetching file from saltenv 'base', ** done ** 'nginx/server.sls'
2018-09-01 23:29:35,521 [salt.fileclient  :1215][INFO    ][12934] Fetching file from saltenv 'base', ** done ** 'nginx/server/users.sls'
2018-09-01 23:29:35,557 [salt.fileclient  :1215][INFO    ][12934] Fetching file from saltenv 'base', ** done ** 'nginx/server/sites.sls'
2018-09-01 23:29:35,615 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12934] Executing command 'cat /etc/ssl/certs/172.30.10.101-with-chain.crt' in directory '/root'
2018-09-01 23:29:35,646 [salt.loaded.int.module.cmdmod:722 ][ERROR   ][12934] Command 'cat /etc/ssl/certs/172.30.10.101-with-chain.crt' failed with return code: 1
2018-09-01 23:29:35,647 [salt.loaded.int.module.cmdmod:724 ][ERROR   ][12934] stdout: cat: /etc/ssl/certs/172.30.10.101-with-chain.crt: No such file or directory
2018-09-01 23:29:35,648 [salt.loaded.int.module.cmdmod:728 ][ERROR   ][12934] retcode: 1
2018-09-01 23:29:35,649 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12934] Executing command 'cat /etc/ssl/certs/172.30.10.101.crt /etc/ssl/certs/ca-salt_master_ca.crt' in directory '/root'
2018-09-01 23:29:35,731 [salt.state       :1770][INFO    ][12934] Running state [cat /etc/ssl/certs/172.30.10.101.crt /etc/ssl/certs/ca-salt_master_ca.crt > /etc/ssl/certs/172.30.10.101-with-chain.crt] at time 23:29:35.731688
2018-09-01 23:29:35,732 [salt.state       :1803][INFO    ][12934] Executing state cmd.run for [cat /etc/ssl/certs/172.30.10.101.crt /etc/ssl/certs/ca-salt_master_ca.crt > /etc/ssl/certs/172.30.10.101-with-chain.crt]
2018-09-01 23:29:35,732 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12934] Executing command '/bin/true' in directory '/root'
2018-09-01 23:29:35,761 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12934] Executing command 'cat /etc/ssl/certs/172.30.10.101.crt /etc/ssl/certs/ca-salt_master_ca.crt > /etc/ssl/certs/172.30.10.101-with-chain.crt' in directory '/root'
2018-09-01 23:29:35,777 [salt.state       :290 ][INFO    ][12934] {'pid': 12957, 'retcode': 0, 'stderr': '', 'stdout': ''}
2018-09-01 23:29:35,778 [salt.state       :1941][INFO    ][12934] Completed state [cat /etc/ssl/certs/172.30.10.101.crt /etc/ssl/certs/ca-salt_master_ca.crt > /etc/ssl/certs/172.30.10.101-with-chain.crt] at time 23:29:35.778616 duration_in_ms=46.929
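
The failed `cat` of 172.30.10.101-with-chain.crt at 23:29:35,646 reads like a render-time probe for an existing chain file; it was missing, so the cmd.run state rebuilt it by concatenating the host certificate with the salt master CA. The `/bin/true` executed just before the real command is consistent with an onlyif guard. A sketch under those assumptions:

    cat /etc/ssl/certs/172.30.10.101.crt /etc/ssl/certs/ca-salt_master_ca.crt > /etc/ssl/certs/172.30.10.101-with-chain.crt:
      cmd.run:
        - onlyif: /bin/true    # assumption, inferred from the /bin/true probe logged just before the command
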
2018-09-01 23:29:35,924 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232935906239
2018-09-01 23:29:35,945 [salt.minion      :1431][INFO    ][12961] Starting a new job with PID 12961
2018-09-01 23:29:35,957 [salt.minion      :1708][INFO    ][12961] Returning information for job: 20180901232935906239
2018-09-01 23:29:37,043 [salt.state       :1770][INFO    ][12934] Running state [nginx] at time 23:29:37.043039
2018-09-01 23:29:37,043 [salt.state       :1803][INFO    ][12934] Executing state pkg.installed for [nginx]
2018-09-01 23:29:37,044 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12934] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 23:29:37,431 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12934] Executing command ['apt-cache', '-q', 'policy', 'nginx'] in directory '/root'
2018-09-01 23:29:37,543 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12934] Executing command ['apt-get', '-q', 'update'] in directory '/root'
2018-09-01 23:29:39,265 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12934] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-09-01 23:29:39,295 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12934] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'nginx'] in directory '/root'
2018-09-01 23:29:46,118 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232946100957
2018-09-01 23:29:46,137 [salt.minion      :1431][INFO    ][13724] Starting a new job with PID 13724
2018-09-01 23:29:46,156 [salt.minion      :1708][INFO    ][13724] Returning information for job: 20180901232946100957
2018-09-01 23:29:51,573 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12934] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 23:29:51,617 [salt.state       :290 ][INFO    ][12934] Made the following changes:
'libgd3' changed from 'absent' to '2.1.1-4ubuntu0.16.04.10'
'nginx-core' changed from 'absent' to '1.10.3-0ubuntu0.16.04.2'
'libxpm4' changed from 'absent' to '1:3.5.11-1ubuntu0.16.04.1'
'nginx' changed from 'absent' to '1.10.3-0ubuntu0.16.04.2'
'nginx-common' changed from 'absent' to '1.10.3-0ubuntu0.16.04.2'
'libfontconfig' changed from 'absent' to '1'
'fonts-dejavu-core' changed from 'absent' to '2.35-1'
'fontconfig-config' changed from 'absent' to '2.11.94-0ubuntu1.1'
'libvpx3' changed from 'absent' to '1.5.0-2ubuntu1'
'libfontconfig1' changed from 'absent' to '2.11.94-0ubuntu1.1'

2018-09-01 23:29:51,639 [salt.state       :905 ][INFO    ][12934] Loading fresh modules for state activity
2018-09-01 23:29:51,763 [salt.state       :1941][INFO    ][12934] Completed state [nginx] at time 23:29:51.763870 duration_in_ms=14720.831
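
The nginx installation above is a plain pkg.installed; the `apt-get -q update` that precedes it suggests the state refreshes the package database first. Sketch:

    nginx:
      pkg.installed:
        - refresh: True    # assumption, matching the `apt-get -q update` in the log
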
2018-09-01 23:29:51,774 [salt.state       :1770][INFO    ][12934] Running state [apache2-utils] at time 23:29:51.774284
2018-09-01 23:29:51,775 [salt.state       :1803][INFO    ][12934] Executing state pkg.installed for [apache2-utils]
2018-09-01 23:29:52,239 [salt.state       :290 ][INFO    ][12934] All specified packages are already installed
2018-09-01 23:29:52,239 [salt.state       :1941][INFO    ][12934] Completed state [apache2-utils] at time 23:29:52.239930 duration_in_ms=465.647
2018-09-01 23:29:52,240 [salt.state       :1770][INFO    ][12934] Running state [openssl] at time 23:29:52.240344
2018-09-01 23:29:52,240 [salt.state       :1803][INFO    ][12934] Executing state pkg.installed for [openssl]
2018-09-01 23:29:52,245 [salt.state       :290 ][INFO    ][12934] All specified packages are already installed
2018-09-01 23:29:52,246 [salt.state       :1941][INFO    ][12934] Completed state [openssl] at time 23:29:52.246251 duration_in_ms=5.908
2018-09-01 23:29:52,248 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_keystone_private.conf] at time 23:29:52.248150
2018-09-01 23:29:52,248 [salt.state       :1803][INFO    ][12934] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_openstack_api_keystone_private.conf]
2018-09-01 23:29:52,271 [salt.fileclient  :1215][INFO    ][12934] Fetching file from saltenv 'base', ** done ** 'nginx/files/proxy.conf'
2018-09-01 23:29:52,319 [salt.fileclient  :1215][INFO    ][12934] Fetching file from saltenv 'base', ** done ** 'nginx/files/_limit.conf'
2018-09-01 23:29:52,346 [salt.fileclient  :1215][INFO    ][12934] Fetching file from saltenv 'base', ** done ** 'nginx/files/headers/_strict_transport_security.conf'
2018-09-01 23:29:52,365 [salt.fileclient  :1215][INFO    ][12934] Fetching file from saltenv 'base', ** done ** 'nginx/files/_name.conf'
2018-09-01 23:29:52,386 [salt.fileclient  :1215][INFO    ][12934] Fetching file from saltenv 'base', ** done ** 'nginx/files/_ssl.conf'
2018-09-01 23:29:52,436 [salt.fileclient  :1215][INFO    ][12934] Fetching file from saltenv 'base', ** done ** 'nginx/files/_ssl_secure.conf'
2018-09-01 23:29:52,454 [salt.fileclient  :1215][INFO    ][12934] Fetching file from saltenv 'base', ** done ** 'nginx/files/_auth.conf'
2018-09-01 23:29:52,471 [salt.fileclient  :1215][INFO    ][12934] Fetching file from saltenv 'base', ** done ** 'nginx/files/_access_policy.conf'
2018-09-01 23:29:52,479 [salt.state       :290 ][INFO    ][12934] File changed:
New file
2018-09-01 23:29:52,480 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_keystone_private.conf] at time 23:29:52.480161 duration_in_ms=232.01
2018-09-01 23:29:52,480 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone_private.conf] at time 23:29:52.480812
2018-09-01 23:29:52,481 [salt.state       :1803][INFO    ][12934] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone_private.conf]
2018-09-01 23:29:52,483 [salt.state       :290 ][INFO    ][12934] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone_private.conf'}
2018-09-01 23:29:52,483 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone_private.conf] at time 23:29:52.483785 duration_in_ms=2.974
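
Every vhost in this run follows the same two-state Debian pattern: render the config into sites-available, then symlink it into sites-enabled. Using the keystone_private site just completed as the example (the source and Jinja templating are assumptions based on the proxy.conf and _*.conf fetches above):

    /etc/nginx/sites-available/nginx_proxy_openstack_api_keystone_private.conf:
      file.managed:
        - source: salt://nginx/files/proxy.conf
        - template: jinja    # assumption; the _limit/_ssl/_auth partials fetched above look like Jinja includes

    /etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone_private.conf:
      file.symlink:
        - target: /etc/nginx/sites-available/nginx_proxy_openstack_api_keystone_private.conf
        - require:
          - file: /etc/nginx/sites-available/nginx_proxy_openstack_api_keystone_private.conf
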
2018-09-01 23:29:52,484 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat_cloudwatch.conf] at time 23:29:52.484147
2018-09-01 23:29:52,484 [salt.state       :1803][INFO    ][12934] Executing state file.absent for [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat_cloudwatch.conf]
2018-09-01 23:29:52,484 [salt.state       :290 ][INFO    ][12934] File /etc/nginx/sites-available/nginx_proxy_openstack_api_heat_cloudwatch.conf is not present
2018-09-01 23:29:52,485 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat_cloudwatch.conf] at time 23:29:52.485286 duration_in_ms=1.14
2018-09-01 23:29:52,485 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cloudwatch.conf] at time 23:29:52.485646
2018-09-01 23:29:52,486 [salt.state       :1803][INFO    ][12934] Executing state file.absent for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cloudwatch.conf]
2018-09-01 23:29:52,486 [salt.state       :290 ][INFO    ][12934] File /etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cloudwatch.conf is not present
2018-09-01 23:29:52,486 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cloudwatch.conf] at time 23:29:52.486751 duration_in_ms=1.105
2018-09-01 23:29:52,487 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_aodh.conf] at time 23:29:52.487310
2018-09-01 23:29:52,487 [salt.state       :1803][INFO    ][12934] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_openstack_api_aodh.conf]
2018-09-01 23:29:52,639 [salt.state       :290 ][INFO    ][12934] File changed:
New file
2018-09-01 23:29:52,639 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_aodh.conf] at time 23:29:52.639795 duration_in_ms=152.484
2018-09-01 23:29:52,640 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_aodh.conf] at time 23:29:52.640243
2018-09-01 23:29:52,640 [salt.state       :1803][INFO    ][12934] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_aodh.conf]
2018-09-01 23:29:52,642 [salt.state       :290 ][INFO    ][12934] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_aodh.conf'}
2018-09-01 23:29:52,642 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_aodh.conf] at time 23:29:52.642598 duration_in_ms=2.355
2018-09-01 23:29:52,643 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_neutron.conf] at time 23:29:52.643308
2018-09-01 23:29:52,643 [salt.state       :1803][INFO    ][12934] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_openstack_api_neutron.conf]
2018-09-01 23:29:52,795 [salt.state       :290 ][INFO    ][12934] File changed:
New file
2018-09-01 23:29:52,795 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_neutron.conf] at time 23:29:52.795696 duration_in_ms=152.387
2018-09-01 23:29:52,796 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_neutron.conf] at time 23:29:52.796155
2018-09-01 23:29:52,796 [salt.state       :1803][INFO    ][12934] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_neutron.conf]
2018-09-01 23:29:52,798 [salt.state       :290 ][INFO    ][12934] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_neutron.conf'}
2018-09-01 23:29:52,798 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_neutron.conf] at time 23:29:52.798528 duration_in_ms=2.373
2018-09-01 23:29:52,799 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_web.conf] at time 23:29:52.799242
2018-09-01 23:29:52,799 [salt.state       :1803][INFO    ][12934] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_openstack_web.conf]
2018-09-01 23:29:52,949 [salt.state       :290 ][INFO    ][12934] File changed:
New file
2018-09-01 23:29:52,950 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_web.conf] at time 23:29:52.950535 duration_in_ms=151.291
2018-09-01 23:29:52,951 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_web.conf] at time 23:29:52.951011
2018-09-01 23:29:52,951 [salt.state       :1803][INFO    ][12934] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_openstack_web.conf]
2018-09-01 23:29:52,952 [salt.state       :290 ][INFO    ][12934] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_web.conf'}
2018-09-01 23:29:52,953 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_web.conf] at time 23:29:52.953241 duration_in_ms=2.23
2018-09-01 23:29:52,953 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_glance.conf] at time 23:29:52.953864
2018-09-01 23:29:52,954 [salt.state       :1803][INFO    ][12934] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_openstack_api_glance.conf]
2018-09-01 23:29:53,110 [salt.state       :290 ][INFO    ][12934] File changed:
New file
2018-09-01 23:29:53,111 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_glance.conf] at time 23:29:53.111527 duration_in_ms=157.661
2018-09-01 23:29:53,112 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_glance.conf] at time 23:29:53.112034
2018-09-01 23:29:53,112 [salt.state       :1803][INFO    ][12934] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_glance.conf]
2018-09-01 23:29:53,114 [salt.state       :290 ][INFO    ][12934] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_glance.conf'}
2018-09-01 23:29:53,114 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_glance.conf] at time 23:29:53.114628 duration_in_ms=2.594
2018-09-01 23:29:53,115 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-available/nginx_proxy_novnc.conf] at time 23:29:53.115426
2018-09-01 23:29:53,115 [salt.state       :1803][INFO    ][12934] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_novnc.conf]
2018-09-01 23:29:53,269 [salt.state       :290 ][INFO    ][12934] File changed:
New file
2018-09-01 23:29:53,270 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-available/nginx_proxy_novnc.conf] at time 23:29:53.270644 duration_in_ms=155.217
2018-09-01 23:29:53,271 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-enabled/nginx_proxy_novnc.conf] at time 23:29:53.271204
2018-09-01 23:29:53,271 [salt.state       :1803][INFO    ][12934] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_novnc.conf]
2018-09-01 23:29:53,273 [salt.state       :290 ][INFO    ][12934] {'new': '/etc/nginx/sites-enabled/nginx_proxy_novnc.conf'}
2018-09-01 23:29:53,273 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-enabled/nginx_proxy_novnc.conf] at time 23:29:53.273863 duration_in_ms=2.659
2018-09-01 23:29:53,274 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_keystone.conf] at time 23:29:53.274637
2018-09-01 23:29:53,275 [salt.state       :1803][INFO    ][12934] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_openstack_api_keystone.conf]
2018-09-01 23:29:53,428 [salt.state       :290 ][INFO    ][12934] File changed:
New file
2018-09-01 23:29:53,428 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_keystone.conf] at time 23:29:53.428751 duration_in_ms=154.113
2018-09-01 23:29:53,429 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone.conf] at time 23:29:53.429212
2018-09-01 23:29:53,429 [salt.state       :1803][INFO    ][12934] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone.conf]
2018-09-01 23:29:53,431 [salt.state       :290 ][INFO    ][12934] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone.conf'}
2018-09-01 23:29:53,431 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone.conf] at time 23:29:53.431636 duration_in_ms=2.424
2018-09-01 23:29:53,432 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_cinder.conf] at time 23:29:53.432284
2018-09-01 23:29:53,432 [salt.state       :1803][INFO    ][12934] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_openstack_api_cinder.conf]
2018-09-01 23:29:53,588 [salt.state       :290 ][INFO    ][12934] File changed:
New file
2018-09-01 23:29:53,589 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_cinder.conf] at time 23:29:53.588966 duration_in_ms=156.682
2018-09-01 23:29:53,589 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_cinder.conf] at time 23:29:53.589425
2018-09-01 23:29:53,589 [salt.state       :1803][INFO    ][12934] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_cinder.conf]
2018-09-01 23:29:53,591 [salt.state       :290 ][INFO    ][12934] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_cinder.conf'}
2018-09-01 23:29:53,591 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_cinder.conf] at time 23:29:53.591780 duration_in_ms=2.355
2018-09-01 23:29:53,592 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat_cfn.conf] at time 23:29:53.592421
2018-09-01 23:29:53,592 [salt.state       :1803][INFO    ][12934] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat_cfn.conf]
2018-09-01 23:29:53,749 [salt.state       :290 ][INFO    ][12934] File changed:
New file
2018-09-01 23:29:53,750 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat_cfn.conf] at time 23:29:53.749967 duration_in_ms=157.545
2018-09-01 23:29:53,750 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cfn.conf] at time 23:29:53.750407
2018-09-01 23:29:53,750 [salt.state       :1803][INFO    ][12934] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cfn.conf]
2018-09-01 23:29:53,752 [salt.state       :290 ][INFO    ][12934] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cfn.conf'}
2018-09-01 23:29:53,752 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cfn.conf] at time 23:29:53.752756 duration_in_ms=2.348
2018-09-01 23:29:53,753 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-available/nginx_redirect_openstack_web_redirect.conf] at time 23:29:53.753403
2018-09-01 23:29:53,753 [salt.state       :1803][INFO    ][12934] Executing state file.managed for [/etc/nginx/sites-available/nginx_redirect_openstack_web_redirect.conf]
2018-09-01 23:29:53,772 [salt.fileclient  :1215][INFO    ][12934] Fetching file from saltenv 'base', ** done ** 'nginx/files/redirect.conf'
2018-09-01 23:29:53,787 [salt.state       :290 ][INFO    ][12934] File changed:
New file
2018-09-01 23:29:53,787 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-available/nginx_redirect_openstack_web_redirect.conf] at time 23:29:53.787616 duration_in_ms=34.213
2018-09-01 23:29:53,787 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-enabled/nginx_redirect_openstack_web_redirect.conf] at time 23:29:53.787832
2018-09-01 23:29:53,788 [salt.state       :1803][INFO    ][12934] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_redirect_openstack_web_redirect.conf]
2018-09-01 23:29:53,789 [salt.state       :290 ][INFO    ][12934] {'new': '/etc/nginx/sites-enabled/nginx_redirect_openstack_web_redirect.conf'}
2018-09-01 23:29:53,789 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-enabled/nginx_redirect_openstack_web_redirect.conf] at time 23:29:53.789440 duration_in_ms=1.608
2018-09-01 23:29:53,789 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-available/nginx_static_reclass_doc.conf] at time 23:29:53.789843
2018-09-01 23:29:53,790 [salt.state       :1803][INFO    ][12934] Executing state file.managed for [/etc/nginx/sites-available/nginx_static_reclass_doc.conf]
2018-09-01 23:29:53,805 [salt.fileclient  :1215][INFO    ][12934] Fetching file from saltenv 'base', ** done ** 'nginx/files/static.conf'
2018-09-01 23:29:53,844 [salt.fileclient  :1215][INFO    ][12934] Fetching file from saltenv 'base', ** done ** 'nginx/files/_log.conf'
2018-09-01 23:29:53,937 [salt.state       :290 ][INFO    ][12934] File changed:
New file
2018-09-01 23:29:53,937 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-available/nginx_static_reclass_doc.conf] at time 23:29:53.937218 duration_in_ms=147.375
2018-09-01 23:29:53,937 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-enabled/nginx_static_reclass_doc.conf] at time 23:29:53.937486
2018-09-01 23:29:53,937 [salt.state       :1803][INFO    ][12934] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_static_reclass_doc.conf]
2018-09-01 23:29:53,939 [salt.state       :290 ][INFO    ][12934] {'new': '/etc/nginx/sites-enabled/nginx_static_reclass_doc.conf'}
2018-09-01 23:29:53,939 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-enabled/nginx_static_reclass_doc.conf] at time 23:29:53.939567 duration_in_ms=2.081
2018-09-01 23:29:53,940 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat.conf] at time 23:29:53.940115
2018-09-01 23:29:53,940 [salt.state       :1803][INFO    ][12934] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat.conf]
2018-09-01 23:29:54,098 [salt.state       :290 ][INFO    ][12934] File changed:
New file
2018-09-01 23:29:54,099 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat.conf] at time 23:29:54.099215 duration_in_ms=159.1
2018-09-01 23:29:54,099 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat.conf] at time 23:29:54.099476
2018-09-01 23:29:54,099 [salt.state       :1803][INFO    ][12934] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat.conf]
2018-09-01 23:29:54,101 [salt.state       :290 ][INFO    ][12934] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat.conf'}
2018-09-01 23:29:54,101 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat.conf] at time 23:29:54.101466 duration_in_ms=1.99
2018-09-01 23:29:54,102 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-available/nginx_stats_stats.conf] at time 23:29:54.101983
2018-09-01 23:29:54,102 [salt.state       :1803][INFO    ][12934] Executing state file.managed for [/etc/nginx/sites-available/nginx_stats_stats.conf]
2018-09-01 23:29:54,118 [salt.fileclient  :1215][INFO    ][12934] Fetching file from saltenv 'base', ** done ** 'nginx/files/stats.conf'
2018-09-01 23:29:54,127 [salt.state       :290 ][INFO    ][12934] File changed:
New file
2018-09-01 23:29:54,127 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-available/nginx_stats_stats.conf] at time 23:29:54.127524 duration_in_ms=25.541
2018-09-01 23:29:54,127 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-enabled/nginx_stats_stats.conf] at time 23:29:54.127732
2018-09-01 23:29:54,127 [salt.state       :1803][INFO    ][12934] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_stats_stats.conf]
2018-09-01 23:29:54,129 [salt.state       :290 ][INFO    ][12934] {'new': '/etc/nginx/sites-enabled/nginx_stats_stats.conf'}
2018-09-01 23:29:54,129 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-enabled/nginx_stats_stats.conf] at time 23:29:54.129268 duration_in_ms=1.535
2018-09-01 23:29:54,129 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_nova.conf] at time 23:29:54.129662
2018-09-01 23:29:54,129 [salt.state       :1803][INFO    ][12934] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_openstack_api_nova.conf]
2018-09-01 23:29:54,285 [salt.state       :290 ][INFO    ][12934] File changed:
New file
2018-09-01 23:29:54,285 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_nova.conf] at time 23:29:54.285707 duration_in_ms=156.044
2018-09-01 23:29:54,285 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_nova.conf] at time 23:29:54.285933
2018-09-01 23:29:54,286 [salt.state       :1803][INFO    ][12934] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_nova.conf]
2018-09-01 23:29:54,287 [salt.state       :290 ][INFO    ][12934] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_nova.conf'}
2018-09-01 23:29:54,287 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_nova.conf] at time 23:29:54.287689 duration_in_ms=1.756
2018-09-01 23:29:54,288 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_ceilometer.conf] at time 23:29:54.288151
2018-09-01 23:29:54,288 [salt.state       :1803][INFO    ][12934] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_openstack_api_ceilometer.conf]
2018-09-01 23:29:54,441 [salt.state       :290 ][INFO    ][12934] File changed:
New file
2018-09-01 23:29:54,441 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_ceilometer.conf] at time 23:29:54.441379 duration_in_ms=153.227
2018-09-01 23:29:54,441 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_ceilometer.conf] at time 23:29:54.441599
2018-09-01 23:29:54,441 [salt.state       :1803][INFO    ][12934] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_ceilometer.conf]
2018-09-01 23:29:54,443 [salt.state       :290 ][INFO    ][12934] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_ceilometer.conf'}
2018-09-01 23:29:54,443 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_ceilometer.conf] at time 23:29:54.443313 duration_in_ms=1.714
2018-09-01 23:29:54,443 [salt.state       :1770][INFO    ][12934] Running state [/usr/sbin/policy-rc.d] at time 23:29:54.443536
2018-09-01 23:29:54,443 [salt.state       :1803][INFO    ][12934] Executing state file.managed for [/usr/sbin/policy-rc.d]
2018-09-01 23:29:54,450 [salt.state       :290 ][INFO    ][12934] File changed:
New file
2018-09-01 23:29:54,450 [salt.state       :1941][INFO    ][12934] Completed state [/usr/sbin/policy-rc.d] at time 23:29:54.450858 duration_in_ms=7.322
2018-09-01 23:29:54,451 [salt.state       :1770][INFO    ][12934] Running state [/usr/sbin/policy-rc.d] at time 23:29:54.451312
2018-09-01 23:29:54,451 [salt.state       :1803][INFO    ][12934] Executing state file.absent for [/usr/sbin/policy-rc.d]
2018-09-01 23:29:54,451 [salt.state       :290 ][INFO    ][12934] {'removed': '/usr/sbin/policy-rc.d'}
2018-09-01 23:29:54,452 [salt.state       :1941][INFO    ][12934] Completed state [/usr/sbin/policy-rc.d] at time 23:29:54.452063 duration_in_ms=0.751
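
Creating /usr/sbin/policy-rc.d and deleting it one state later is the standard Debian guard against packages auto-starting their services mid-install: invoke-rc.d consults that script and treats exit code 101 as "deny". The log does not show the file's contents; the conventional stub looks like this:

    /usr/sbin/policy-rc.d:
      file.managed:
        - mode: 755
        - contents: |
            #!/bin/sh
            exit 101    # tell invoke-rc.d to deny all service starts/restarts

    policy-rc.d-cleanup:    # hypothetical ID; the log only shows file.absent on the same path
      file.absent:
        - name: /usr/sbin/policy-rc.d
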
2018-09-01 23:29:54,452 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/conf.d/default.conf] at time 23:29:54.452510
2018-09-01 23:29:54,452 [salt.state       :1803][INFO    ][12934] Executing state file.absent for [/etc/nginx/conf.d/default.conf]
2018-09-01 23:29:54,453 [salt.state       :290 ][INFO    ][12934] File /etc/nginx/conf.d/default.conf is not present
2018-09-01 23:29:54,453 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/conf.d/default.conf] at time 23:29:54.453205 duration_in_ms=0.695
2018-09-01 23:29:54,453 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-enabled/default] at time 23:29:54.453639
2018-09-01 23:29:54,453 [salt.state       :1803][INFO    ][12934] Executing state file.absent for [/etc/nginx/sites-enabled/default]
2018-09-01 23:29:54,454 [salt.state       :290 ][INFO    ][12934] {'removed': '/etc/nginx/sites-enabled/default'}
2018-09-01 23:29:54,454 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-enabled/default] at time 23:29:54.454363 duration_in_ms=0.723
2018-09-01 23:29:54,454 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/sites-available/default] at time 23:29:54.454842
2018-09-01 23:29:54,455 [salt.state       :1803][INFO    ][12934] Executing state file.absent for [/etc/nginx/sites-available/default]
2018-09-01 23:29:54,455 [salt.state       :290 ][INFO    ][12934] {'removed': '/etc/nginx/sites-available/default'}
2018-09-01 23:29:54,455 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/sites-available/default] at time 23:29:54.455673 duration_in_ms=0.831
2018-09-01 23:29:54,456 [salt.state       :1770][INFO    ][12934] Running state [/etc/nginx/nginx.conf] at time 23:29:54.456151
2018-09-01 23:29:54,456 [salt.state       :1803][INFO    ][12934] Executing state file.managed for [/etc/nginx/nginx.conf]
2018-09-01 23:29:54,474 [salt.fileclient  :1215][INFO    ][12934] Fetching file from saltenv 'base', ** done ** 'nginx/files/nginx.conf'
2018-09-01 23:29:54,499 [salt.state       :290 ][INFO    ][12934] File changed:
--- 
+++ 
@@ -1,85 +1,102 @@
 user www-data;
 worker_processes auto;
+worker_rlimit_nofile 20000;
 pid /run/nginx.pid;
 
+
 events {
-	worker_connections 768;
-	# multi_accept on;
+        worker_connections 1024;
+        # multi_accept on;
 }
 
 http {
 
-	##
-	# Basic Settings
-	##
+        ##
+        # Basic Settings
+        ##
 
-	sendfile on;
-	tcp_nopush on;
-	tcp_nodelay on;
-	keepalive_timeout 65;
-	types_hash_max_size 2048;
-	# server_tokens off;
+        sendfile on;
+        tcp_nopush on;
+        tcp_nodelay on;
+        keepalive_timeout 65;
+        types_hash_max_size 2048;
+        server_tokens off;
 
-	# server_names_hash_bucket_size 64;
-	# server_name_in_redirect off;
+        server_names_hash_bucket_size 128;
+        # server_name_in_redirect off;
 
-	include /etc/nginx/mime.types;
-	default_type application/octet-stream;
+        variables_hash_bucket_size 128;
 
-	##
-	# SSL Settings
-	##
+        include /etc/nginx/mime.types;
+        default_type application/octet-stream;
 
-	ssl_protocols TLSv1 TLSv1.1 TLSv1.2; # Dropping SSLv3, ref: POODLE
-	ssl_prefer_server_ciphers on;
+        ##
+        # Logging Settings
+        ##
 
-	##
-	# Logging Settings
-	##
+        access_log /var/log/nginx/access.log;
+        error_log /var/log/nginx/error.log;
 
-	access_log /var/log/nginx/access.log;
-	error_log /var/log/nginx/error.log;
+        ##
+        # Gzip Settings
+        ##
 
-	##
-	# Gzip Settings
-	##
+        gzip on;
+        gzip_disable "msie6";
 
-	gzip on;
-	gzip_disable "msie6";
+        # gzip_vary on;
+        # gzip_proxied any;
+        # gzip_comp_level 6;
+        # gzip_buffers 16 8k;
+        # gzip_http_version 1.1;
+        # gzip_types text/plain text/css application/json application/x-javascript text/xml application/xml application/xml+rss text/javascript;
 
-	# gzip_vary on;
-	# gzip_proxied any;
-	# gzip_comp_level 6;
-	# gzip_buffers 16 8k;
-	# gzip_http_version 1.1;
-	# gzip_types text/plain text/css application/json application/javascript text/xml application/xml application/xml+rss text/javascript;
+        ##
+        # nginx-naxsi config
+        ##
+        # Uncomment it if you installed nginx-naxsi
+        ##
 
-	##
-	# Virtual Host Configs
-	##
+        #include /etc/nginx/naxsi_core.rules;
 
-	include /etc/nginx/conf.d/*.conf;
-	include /etc/nginx/sites-enabled/*;
+        ##
+        # nginx-passenger config
+        ##
+        # Uncomment it if you installed nginx-passenger
+        ##
+
+        #passenger_root /usr;
+        #passenger_ruby /usr/bin/ruby;
+
+
+
+        ##
+        # Virtual Host Configs
+        ##
+
+        include /etc/nginx/conf.d/*.conf;
+        include /etc/nginx/sites-enabled/*.conf;
 }
 
 
+
 #mail {
-#	# See sample authentication script at:
-#	# http://wiki.nginx.org/ImapAuthenticateWithApachePhpScript
-# 
-#	# auth_http localhost/auth.php;
-#	# pop3_capabilities "TOP" "USER";
-#	# imap_capabilities "IMAP4rev1" "UIDPLUS";
-# 
-#	server {
-#		listen     localhost:110;
-#		protocol   pop3;
-#		proxy      on;
-#	}
-# 
-#	server {
-#		listen     localhost:143;
-#		protocol   imap;
-#		proxy      on;
-#	}
+#       # See sample authentication script at:
+#       # http://wiki.nginx.org/ImapAuthenticateWithApachePhpScript
+#
+#       # auth_http localhost/auth.php;
+#       # pop3_capabilities "TOP" "USER";
+#       # imap_capabilities "IMAP4rev1" "UIDPLUS";
+#
+#       server {
+#               listen     localhost:110;
+#               protocol   pop3;
+#               proxy      on;
+#       }
+#
+#       server {
+#               listen     localhost:143;
+#               protocol   imap;
+#               proxy      on;
+#       }
 #}

2018-09-01 23:29:54,499 [salt.state       :1941][INFO    ][12934] Completed state [/etc/nginx/nginx.conf] at time 23:29:54.499248 duration_in_ms=43.097
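
Beyond the tab-to-space reindent that inflates this diff, the substantive nginx.conf changes are: worker_rlimit_nofile 20000 added, worker_connections raised from 768 to 1024, server_tokens switched off, server_names_hash_bucket_size and variables_hash_bucket_size set to 128, the stock SSL-settings block dropped, and the vhost include narrowed from sites-enabled/* to sites-enabled/*.conf, which is why every symlink created above ends in .conf and why the stock `default` symlink had to be removed. A sketch of the managing state (templating and the service hookup are assumptions, consistent with the restart at 23:30:46 below):

    /etc/nginx/nginx.conf:
      file.managed:
        - source: salt://nginx/files/nginx.conf    # fetched from saltenv 'base' above
        - template: jinja                          # assumption
        - watch_in:
          - service: nginx    # would account for the mod_watch restart later in the run
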
2018-09-01 23:29:54,499 [salt.state       :1770][INFO    ][12934] Running state [/etc/ssl/private] at time 23:29:54.499664
2018-09-01 23:29:54,499 [salt.state       :1803][INFO    ][12934] Executing state file.directory for [/etc/ssl/private]
2018-09-01 23:29:54,500 [salt.state       :290 ][INFO    ][12934] Directory /etc/ssl/private is in the correct state
Directory /etc/ssl/private updated
2018-09-01 23:29:54,500 [salt.state       :1941][INFO    ][12934] Completed state [/etc/ssl/private] at time 23:29:54.500602 duration_in_ms=0.938
2018-09-01 23:29:54,509 [salt.state       :1770][INFO    ][12934] Running state [openssl dhparam -out /etc/ssl/dhparams.pem 2048] at time 23:29:54.508974
2018-09-01 23:29:54,509 [salt.state       :1803][INFO    ][12934] Executing state cmd.run for [openssl dhparam -out /etc/ssl/dhparams.pem 2048]
2018-09-01 23:29:54,509 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12934] Executing command 'openssl dhparam -out /etc/ssl/dhparams.pem 2048' in directory '/root'
2018-09-01 23:29:56,314 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232956296028
2018-09-01 23:29:56,372 [salt.minion      :1431][INFO    ][14010] Starting a new job with PID 14010
2018-09-01 23:29:56,384 [salt.minion      :1708][INFO    ][14010] Returning information for job: 20180901232956296028
2018-09-01 23:30:06,334 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901233006315402
2018-09-01 23:30:06,356 [salt.minion      :1431][INFO    ][14019] Starting a new job with PID 14019
2018-09-01 23:30:06,370 [salt.minion      :1708][INFO    ][14019] Returning information for job: 20180901233006315402
2018-09-01 23:30:16,533 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901233016514281
2018-09-01 23:30:16,555 [salt.minion      :1431][INFO    ][14028] Starting a new job with PID 14028
2018-09-01 23:30:16,569 [salt.minion      :1708][INFO    ][14028] Returning information for job: 20180901233016514281
2018-09-01 23:30:26,730 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901233026712245
2018-09-01 23:30:26,750 [salt.minion      :1431][INFO    ][14037] Starting a new job with PID 14037
2018-09-01 23:30:26,764 [salt.minion      :1708][INFO    ][14037] Returning information for job: 20180901233026712245
2018-09-01 23:30:36,851 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901233036833259
2018-09-01 23:30:36,874 [salt.minion      :1431][INFO    ][14046] Starting a new job with PID 14046
2018-09-01 23:30:36,890 [salt.minion      :1708][INFO    ][14046] Returning information for job: 20180901233036833259
2018-09-01 23:30:46,430 [salt.state       :290 ][INFO    ][12934] {'pid': 14006, 'retcode': 0, 'stderr': "Generating DH parameters, 2048 bit long safe prime, generator 2\nThis is going to take a long time\n[several thousand '.'/'+' progress characters trimmed]++*++*\nunable to write 'random state'", 'stdout': ''}
2018-09-01 23:30:46,431 [salt.state       :1941][INFO    ][12934] Completed state [openssl dhparam -out /etc/ssl/dhparams.pem 2048] at time 23:30:46.430785 duration_in_ms=51921.811
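
Generating the 2048-bit DH parameters took roughly 52 seconds, and the "unable to write 'random state'" tail on stderr is openssl failing to persist its ~/.rnd seed file, harmless here since retcode is 0. As logged, the state would regenerate the parameters on every highstate; a creates guard would make it idempotent (an addition, not something the log shows the original SLS to have):

    openssl dhparam -out /etc/ssl/dhparams.pem 2048:
      cmd.run:
        - creates: /etc/ssl/dhparams.pem    # skip the state once the file exists
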
2018-09-01 23:30:46,433 [salt.state       :1770][INFO    ][12934] Running state [nginx] at time 23:30:46.433907
2018-09-01 23:30:46,434 [salt.state       :1803][INFO    ][12934] Executing state service.running for [nginx]
2018-09-01 23:30:46,434 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12934] Executing command ['systemctl', 'status', 'nginx.service', '-n', '0'] in directory '/root'
2018-09-01 23:30:46,450 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12934] Executing command ['systemctl', 'is-active', 'nginx.service'] in directory '/root'
2018-09-01 23:30:46,466 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12934] Executing command ['systemctl', 'is-enabled', 'nginx.service'] in directory '/root'
2018-09-01 23:30:46,483 [salt.state       :290 ][INFO    ][12934] The service nginx is already running
2018-09-01 23:30:46,483 [salt.state       :1941][INFO    ][12934] Completed state [nginx] at time 23:30:46.483415 duration_in_ms=49.507
2018-09-01 23:30:46,483 [salt.state       :1770][INFO    ][12934] Running state [nginx] at time 23:30:46.483623
2018-09-01 23:30:46,483 [salt.state       :1803][INFO    ][12934] Executing state service.mod_watch for [nginx]
2018-09-01 23:30:46,484 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12934] Executing command ['systemctl', 'is-active', 'nginx.service'] in directory '/root'
2018-09-01 23:30:46,496 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12934] Executing command ['systemd-run', '--scope', 'systemctl', 'restart', 'nginx.service'] in directory '/root'
2018-09-01 23:30:46,617 [salt.state       :290 ][INFO    ][12934] {'nginx': True}
2018-09-01 23:30:46,618 [salt.state       :1941][INFO    ][12934] Completed state [nginx] at time 23:30:46.618021 duration_in_ms=134.397
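
The two nginx entries above are Salt's watch requisite at work: service.running first confirms the service is up, then service.mod_watch restarts it because watched files changed during this run. A sketch of the plumbing (enable and the watch list are assumptions consistent with the is-enabled checks and the config states above):

    nginx:
      service.running:
        - enable: True
        - watch:
          - file: /etc/nginx/nginx.conf
          - file: /etc/nginx/sites-enabled/*    # requisites accept shell-style globs
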
2018-09-01 23:30:46,622 [salt.minion      :1708][INFO    ][12934] Returning information for job: 20180901232930804446
2018-09-01 23:31:53,886 [salt.minion      :1307][INFO    ][1683] User sudo_ubuntu Executing command cp.push_dir with jid 20180901233153869249
2018-09-01 23:31:53,907 [salt.minion      :1431][INFO    ][14142] Starting a new job with PID 14142
