2018-09-01 22:08:28,611 [salt.utils.decorators:82  ][ERROR   ][1866] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
[... last message repeated 56 more times between 22:08:28,964 and 22:08:33,702 ...]
2018-09-01 22:09:21,246 [salt.utils.decorators:613 ][WARNING ][1866] The function "module.run" is using its deprecated version and will expire in version "Sodium".
2018-09-01 22:09:25,368 [salt.loaded.int.states.file:2150][WARNING ][1866] State for file: /etc/ssl/certs/ca-salt_master_ca.crt - Neither 'source' nor 'contents' nor 'contents_pillar' nor 'contents_grains' was defined, yet 'replace' was set to 'True'. As there is no source to replace the file with, 'replace' has been set to 'False' to avoid reading the file unnecessarily.
2018-09-01 22:09:28,932 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3201] Executing command ['systemctl', 'status', 'salt-minion.service', '-n', '0'] in directory '/root'
2018-09-01 22:09:28,963 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3201] Executing command ['systemd-run', '--scope', 'systemctl', 'restart', 'salt-minion.service'] in directory '/root'
2018-09-01 22:09:28,981 [salt.utils.parsers:1051][WARNING ][1485] Minion received a SIGTERM. Exiting.
2018-09-01 22:09:29,833 [salt.cli.daemons :293 ][INFO    ][3251] Setting up the Salt Minion "prx01.mcp-ovs-ha.local"
2018-09-01 22:09:29,924 [salt.cli.daemons :82  ][INFO    ][3251] Starting up the Salt Minion
2018-09-01 22:09:29,925 [salt.utils.event :1017][INFO    ][3251] Starting pull socket on /var/run/salt/minion/minion_event_ff902ec8d4_pull.ipc
2018-09-01 22:09:30,524 [salt.minion      :976 ][INFO    ][3251] Creating minion process manager
2018-09-01 22:09:31,543 [salt.loader.192.168.11.2.int.module.cmdmod:395 ][INFO    ][3251] Executing command ['date', '+%z'] in directory '/root'
2018-09-01 22:09:31,571 [salt.utils.schedule:568 ][INFO    ][3251] Updating job settings for scheduled job: __mine_interval
2018-09-01 22:09:31,671 [salt.minion      :1107][INFO    ][3251] Added mine.update to scheduler
2018-09-01 22:09:31,684 [salt.minion      :1965][INFO    ][3251] Minion is starting as user 'root'
2018-09-01 22:09:31,697 [salt.minion      :2324][INFO    ][3251] Minion is ready to receive requests!
2018-09-01 22:10:24,284 [salt.minion      :1307][INFO    ][3251] User sudo_ubuntu Executing command state.apply with jid 20180901221024268410
2018-09-01 22:10:24,308 [salt.minion      :1431][INFO    ][3345] Starting a new job with PID 3345
2018-09-01 22:10:29,888 [salt.minion      :1307][INFO    ][3251] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221029878664
2018-09-01 22:10:29,897 [salt.state       :905 ][INFO    ][3345] Loading fresh modules for state activity
2018-09-01 22:10:29,919 [salt.minion      :1431][INFO    ][3352] Starting a new job with PID 3352
2018-09-01 22:10:29,939 [salt.minion      :1708][INFO    ][3352] Returning information for job: 20180901221029878664
2018-09-01 22:10:30,751 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'linux/init.sls'
2018-09-01 22:10:30,790 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'linux/system/init.sls'
2018-09-01 22:10:30,882 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'linux/system/env.sls'
2018-09-01 22:10:30,954 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'linux/system/profile.sls'
2018-09-01 22:10:31,040 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'linux/system/repo.sls'
2018-09-01 22:10:31,207 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'linux/system/package.sls'
2018-09-01 22:10:31,350 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'linux/system/timezone.sls'
2018-09-01 22:10:31,449 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'linux/system/kernel.sls'
2018-09-01 22:10:31,553 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'linux/system/cpu.sls'
2018-09-01 22:10:31,638 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'linux/system/sysfs.sls'
2018-09-01 22:10:31,717 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'linux/system/locale.sls'
2018-09-01 22:10:32,644 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'linux/system/user.sls'
2018-09-01 22:10:32,788 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'linux/system/group.sls'
2018-09-01 22:10:32,864 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'linux/system/limit.sls'
2018-09-01 22:10:32,933 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'linux/system/systemd.sls'
2018-09-01 22:10:33,002 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'linux/system/apt.sls'
2018-09-01 22:10:33,073 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'linux/system/banner.sls'
2018-09-01 22:10:33,147 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'linux/network/init.sls'
2018-09-01 22:10:33,218 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'linux/network/hostname.sls'
2018-09-01 22:10:33,287 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'linux/network/host.sls'
2018-09-01 22:10:33,399 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'linux/network/interface.sls'
2018-09-01 22:10:33,534 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'linux/network/proxy.sls'
2018-09-01 22:10:33,603 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'linux/storage/init.sls'
2018-09-01 22:10:33,687 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'ntp/init.sls'
2018-09-01 22:10:34,414 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'ntp/client.sls'
2018-09-01 22:10:34,490 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'ntp/server.sls'
2018-09-01 22:10:34,535 [salt.state       :1770][INFO    ][3345] Running state [/etc/environment] at time 22:10:34.534942
2018-09-01 22:10:34,535 [salt.state       :1803][INFO    ][3345] Executing state file.blockreplace for [/etc/environment]
2018-09-01 22:10:34,542 [salt.state       :290 ][INFO    ][3345] File changed:
--- 
+++ 
@@ -1 +1,4 @@
 PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games"
+# SALT MANAGED VARIABLES - DO NOT EDIT - START
+# 
+# SALT MANAGED VARIABLES - END

2018-09-01 22:10:34,542 [salt.state       :1941][INFO    ][3345] Completed state [/etc/environment] at time 22:10:34.542360 duration_in_ms=7.419
2018-09-01 22:10:34,542 [salt.state       :1770][INFO    ][3345] Running state [/etc/profile.d] at time 22:10:34.542558
2018-09-01 22:10:34,542 [salt.state       :1803][INFO    ][3345] Executing state file.directory for [/etc/profile.d]
2018-09-01 22:10:34,582 [salt.state       :290 ][INFO    ][3345] Directory /etc/profile.d is in the correct state
Directory /etc/profile.d updated
2018-09-01 22:10:34,583 [salt.state       :1941][INFO    ][3345] Completed state [/etc/profile.d] at time 22:10:34.583175 duration_in_ms=40.617
2018-09-01 22:10:35,125 [salt.state       :1770][INFO    ][3345] Running state [/etc/apt/apt.conf.d/99prefer_ipv4-salt] at time 22:10:35.125539
2018-09-01 22:10:35,134 [salt.state       :1803][INFO    ][3345] Executing state file.managed for [/etc/apt/apt.conf.d/99prefer_ipv4-salt]
2018-09-01 22:10:35,155 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'linux/files/apt.conf'
2018-09-01 22:10:35,166 [salt.state       :290 ][INFO    ][3345] File changed:
New file
2018-09-01 22:10:35,166 [salt.state       :1941][INFO    ][3345] Completed state [/etc/apt/apt.conf.d/99prefer_ipv4-salt] at time 22:10:35.166763 duration_in_ms=41.224
2018-09-01 22:10:35,167 [salt.state       :1770][INFO    ][3345] Running state [/etc/apt/apt.conf.d/99allow_downgrades-salt] at time 22:10:35.167005
2018-09-01 22:10:35,167 [salt.state       :1803][INFO    ][3345] Executing state file.managed for [/etc/apt/apt.conf.d/99allow_downgrades-salt]
2018-09-01 22:10:35,187 [salt.state       :290 ][INFO    ][3345] File changed:
New file
2018-09-01 22:10:35,188 [salt.state       :1941][INFO    ][3345] Completed state [/etc/apt/apt.conf.d/99allow_downgrades-salt] at time 22:10:35.188045 duration_in_ms=21.04
2018-09-01 22:10:35,189 [salt.state       :1770][INFO    ][3345] Running state [linux_repo_prereq_pkgs] at time 22:10:35.189037
2018-09-01 22:10:35,189 [salt.state       :1803][INFO    ][3345] Executing state pkg.installed for [linux_repo_prereq_pkgs]
2018-09-01 22:10:35,189 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 22:10:35,451 [salt.state       :290 ][INFO    ][3345] All specified packages are already installed
2018-09-01 22:10:35,452 [salt.state       :1941][INFO    ][3345] Completed state [linux_repo_prereq_pkgs] at time 22:10:35.451972 duration_in_ms=262.934
2018-09-01 22:10:35,452 [salt.state       :1770][INFO    ][3345] Running state [/etc/apt/apt.conf.d/99proxies-salt] at time 22:10:35.452242
2018-09-01 22:10:35,452 [salt.state       :1803][INFO    ][3345] Executing state file.managed for [/etc/apt/apt.conf.d/99proxies-salt]
2018-09-01 22:10:35,469 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'linux/files/apt.conf.d_proxies'
2018-09-01 22:10:35,482 [salt.state       :290 ][INFO    ][3345] File changed:
New file
2018-09-01 22:10:35,482 [salt.state       :1941][INFO    ][3345] Completed state [/etc/apt/apt.conf.d/99proxies-salt] at time 22:10:35.482415 duration_in_ms=30.172
2018-09-01 22:10:35,482 [salt.state       :1770][INFO    ][3345] Running state [/etc/apt/apt.conf.d/99proxies-salt-mirantis_openstack] at time 22:10:35.482583
2018-09-01 22:10:35,482 [salt.state       :1803][INFO    ][3345] Executing state file.absent for [/etc/apt/apt.conf.d/99proxies-salt-mirantis_openstack]
2018-09-01 22:10:35,483 [salt.state       :290 ][INFO    ][3345] File /etc/apt/apt.conf.d/99proxies-salt-mirantis_openstack is not present
2018-09-01 22:10:35,483 [salt.state       :1941][INFO    ][3345] Completed state [/etc/apt/apt.conf.d/99proxies-salt-mirantis_openstack] at time 22:10:35.483136 duration_in_ms=0.553
2018-09-01 22:10:35,483 [salt.state       :1770][INFO    ][3345] Running state [/etc/apt/preferences.d/mirantis_openstack] at time 22:10:35.483296
2018-09-01 22:10:35,483 [salt.state       :1803][INFO    ][3345] Executing state file.managed for [/etc/apt/preferences.d/mirantis_openstack]
2018-09-01 22:10:35,498 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'linux/files/preferences_repo'
2018-09-01 22:10:35,556 [salt.state       :290 ][INFO    ][3345] File changed:
New file
2018-09-01 22:10:35,556 [salt.state       :1941][INFO    ][3345] Completed state [/etc/apt/preferences.d/mirantis_openstack] at time 22:10:35.556503 duration_in_ms=73.207
2018-09-01 22:10:35,558 [salt.state       :1770][INFO    ][3345] Running state [deb http://mirror.mirantis.com/nightly/openstack-queens/xenial xenial main] at time 22:10:35.558429
2018-09-01 22:10:35,558 [salt.state       :1803][INFO    ][3345] Executing state pkgrepo.managed for [deb http://mirror.mirantis.com/nightly/openstack-queens/xenial xenial main]
2018-09-01 22:10:36,349 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['apt-key', 'add', '/var/cache/salt/minion/extrn_files/base/mirror.mirantis.com/nightly/openstack-queens/xenial/archive-queens.key'] in directory '/root'
2018-09-01 22:10:36,653 [salt.state       :290 ][INFO    ][3345] {'repo': 'deb http://mirror.mirantis.com/nightly/openstack-queens/xenial xenial main'}
2018-09-01 22:10:36,653 [salt.state       :1941][INFO    ][3345] Completed state [deb http://mirror.mirantis.com/nightly/openstack-queens/xenial xenial main] at time 22:10:36.653760 duration_in_ms=1095.33
2018-09-01 22:10:36,654 [salt.state       :1770][INFO    ][3345] Running state [/etc/apt/apt.conf.d/99proxies-salt-uca] at time 22:10:36.654219
2018-09-01 22:10:36,654 [salt.state       :1803][INFO    ][3345] Executing state file.absent for [/etc/apt/apt.conf.d/99proxies-salt-uca]
2018-09-01 22:10:36,655 [salt.state       :290 ][INFO    ][3345] File /etc/apt/apt.conf.d/99proxies-salt-uca is not present
2018-09-01 22:10:36,655 [salt.state       :1941][INFO    ][3345] Completed state [/etc/apt/apt.conf.d/99proxies-salt-uca] at time 22:10:36.655336 duration_in_ms=1.117
2018-09-01 22:10:36,655 [salt.state       :1770][INFO    ][3345] Running state [/etc/apt/preferences.d/uca] at time 22:10:36.655640
2018-09-01 22:10:36,655 [salt.state       :1803][INFO    ][3345] Executing state file.managed for [/etc/apt/preferences.d/uca]
2018-09-01 22:10:36,730 [salt.state       :290 ][INFO    ][3345] File changed:
New file
2018-09-01 22:10:36,730 [salt.state       :1941][INFO    ][3345] Completed state [/etc/apt/preferences.d/uca] at time 22:10:36.730387 duration_in_ms=74.748
2018-09-01 22:10:36,734 [salt.state       :1770][INFO    ][3345] Running state [apt-key adv --keyserver keyserver.ubuntu.com --recv EC4926EA] at time 22:10:36.734095
2018-09-01 22:10:36,734 [salt.state       :1803][INFO    ][3345] Executing state cmd.run for [apt-key adv --keyserver keyserver.ubuntu.com --recv EC4926EA]
2018-09-01 22:10:36,734 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command 'test -e /etc/apt/sources.list.d/uca.list' in directory '/root'
2018-09-01 22:10:36,750 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command 'apt-key adv --keyserver keyserver.ubuntu.com --recv EC4926EA' in directory '/root'
2018-09-01 22:10:37,153 [salt.state       :290 ][INFO    ][3345] {'pid': 3554, 'retcode': 0, 'stderr': 'gpg: requesting key EC4926EA from hkp server keyserver.ubuntu.com\ngpg: key EC4926EA: public key "Canonical Cloud Archive Signing Key <ftpmaster@canonical.com>" imported\ngpg: Total number processed: 1\ngpg:               imported: 1  (RSA: 1)', 'stdout': 'Executing: /tmp/tmp.AX23NXWjNP/gpg.1.sh --keyserver\nkeyserver.ubuntu.com\n--recv\nEC4926EA'}
2018-09-01 22:10:37,154 [salt.state       :1941][INFO    ][3345] Completed state [apt-key adv --keyserver keyserver.ubuntu.com --recv EC4926EA] at time 22:10:37.154344 duration_in_ms=420.247
2018-09-01 22:10:37,157 [salt.state       :1770][INFO    ][3345] Running state [deb http://ubuntu-cloud.archive.canonical.com/ubuntu xenial-updates/queens main] at time 22:10:37.157469
2018-09-01 22:10:37,157 [salt.state       :1803][INFO    ][3345] Executing state pkgrepo.managed for [deb http://ubuntu-cloud.archive.canonical.com/ubuntu xenial-updates/queens main]
2018-09-01 22:10:37,247 [salt.state       :290 ][INFO    ][3345] {'repo': 'deb http://ubuntu-cloud.archive.canonical.com/ubuntu xenial-updates/queens main'}
2018-09-01 22:10:37,248 [salt.state       :1941][INFO    ][3345] Completed state [deb http://ubuntu-cloud.archive.canonical.com/ubuntu xenial-updates/queens main] at time 22:10:37.248220 duration_in_ms=90.752
2018-09-01 22:10:37,249 [salt.state       :1770][INFO    ][3345] Running state [pkg.refresh_db] at time 22:10:37.249111
2018-09-01 22:10:37,249 [salt.state       :1803][INFO    ][3345] Executing state module.run for [pkg.refresh_db]
2018-09-01 22:10:37,249 [salt.utils.decorators:613 ][WARNING ][3345] The function "module.run" is using its deprecated version and will expire in version "Sodium".
2018-09-01 22:10:37,250 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['apt-get', '-q', 'update'] in directory '/root'
2018-09-01 22:10:39,931 [salt.minion      :1307][INFO    ][3251] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221039915966
2018-09-01 22:10:39,950 [salt.minion      :1431][INFO    ][4028] Starting a new job with PID 4028
2018-09-01 22:10:39,971 [salt.minion      :1708][INFO    ][4028] Returning information for job: 20180901221039915966
2018-09-01 22:10:41,469 [salt.state       :290 ][INFO    ][3345] {'ret': {'http://security.ubuntu.com/ubuntu xenial-security InRelease': True, 'http://archive.ubuntu.com/ubuntu xenial-backports InRelease': None, 'http://mirror.mirantis.com/nightly/openstack-queens/xenial xenial/main amd64 Packages': True, 'http://archive.ubuntu.com/ubuntu xenial-updates InRelease': None, 'http://ubuntu-cloud.archive.canonical.com/ubuntu xenial-updates/queens/main amd64 Packages': True, 'http://ubuntu-cloud.archive.canonical.com/ubuntu xenial-updates/queens Release.gpg': True, 'http://ubuntu-cloud.archive.canonical.com/ubuntu xenial-updates/queens InRelease': False, 'http://repo.saltstack.com/apt/ubuntu/16.04/amd64/2017.7 xenial InRelease': None, 'http://ubuntu-cloud.archive.canonical.com/ubuntu xenial-updates/queens Release': True, 'http://archive.ubuntu.com/ubuntu xenial InRelease': None, 'http://mirror.mirantis.com/nightly/openstack-queens/xenial xenial InRelease': True}}
2018-09-01 22:10:41,470 [salt.state       :1941][INFO    ][3345] Completed state [pkg.refresh_db] at time 22:10:41.470769 duration_in_ms=4221.656
2018-09-01 22:10:41,471 [salt.state       :1770][INFO    ][3345] Running state [linux_extra_packages_latest] at time 22:10:41.471097
2018-09-01 22:10:41,471 [salt.state       :1803][INFO    ][3345] Executing state pkg.latest for [linux_extra_packages_latest]
2018-09-01 22:10:41,482 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['apt-cache', '-q', 'policy', 'libapache2-mod-wsgi'] in directory '/root'
2018-09-01 22:10:41,551 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['apt-cache', '-q', 'policy', 'python-tornado'] in directory '/root'
2018-09-01 22:10:41,644 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-09-01 22:10:41,665 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'libapache2-mod-wsgi', 'python-tornado'] in directory '/root'
2018-09-01 22:10:49,990 [salt.minion      :1307][INFO    ][3251] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221049974773
2018-09-01 22:10:50,008 [salt.minion      :1431][INFO    ][4135] Starting a new job with PID 4135
2018-09-01 22:10:50,030 [salt.minion      :1708][INFO    ][4135] Returning information for job: 20180901221049974773
2018-09-01 22:11:00,044 [salt.minion      :1307][INFO    ][3251] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221100027387
2018-09-01 22:11:00,071 [salt.minion      :1431][INFO    ][4416] Starting a new job with PID 4416
2018-09-01 22:11:00,097 [salt.minion      :1708][INFO    ][4416] Returning information for job: 20180901221100027387
2018-09-01 22:11:05,459 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 22:11:05,491 [salt.state       :290 ][INFO    ][3345] Made the following changes:
'python-tornado' changed from '4.2.1-2~ds+1' to '4.5.3-1.0~u16.04+mcp1'
'libaprutil1-ldap' changed from 'absent' to '1.5.4-1build1'
'libapr1' changed from 'absent' to '1.5.2-3'
'libpython2.7' changed from 'absent' to '2.7.12-1ubuntu0~16.04.3'
'libapache2-mod-wsgi' changed from 'absent' to '4.4.15-0.1.1~u16.04+mcp2'
'apache2-api-20120211' changed from 'absent' to '1'
'libaprutil1' changed from 'absent' to '1.5.4-1build1'
'httpd-wsgi' changed from 'absent' to '1'
'python-singledispatch' changed from 'absent' to '3.4.0.3-2'
'liblua5.1-0' changed from 'absent' to '5.1.5-8ubuntu1'
'libaprutil1-dbd-sqlite3' changed from 'absent' to '1.5.4-1build1'
'python-backports-abc' changed from 'absent' to '0.5-2.0~u16.04+mcp1'
'apache2-bin' changed from 'absent' to '2.4.18-2ubuntu3.9'

2018-09-01 22:11:05,507 [salt.state       :905 ][INFO    ][3345] Loading fresh modules for state activity
2018-09-01 22:11:05,531 [salt.state       :1941][INFO    ][3345] Completed state [linux_extra_packages_latest] at time 22:11:05.531661 duration_in_ms=24060.563
2018-09-01 22:11:05,537 [salt.state       :1770][INFO    ][3345] Running state [UTC] at time 22:11:05.537444
2018-09-01 22:11:05,537 [salt.state       :1803][INFO    ][3345] Executing state timezone.system for [UTC]
2018-09-01 22:11:05,543 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['timedatectl'] in directory '/root'
2018-09-01 22:11:05,773 [salt.state       :290 ][INFO    ][3345] Timezone UTC already set, UTC already set to UTC
2018-09-01 22:11:05,775 [salt.state       :1941][INFO    ][3345] Completed state [UTC] at time 22:11:05.775025 duration_in_ms=237.582
2018-09-01 22:11:05,779 [salt.state       :1770][INFO    ][3345] Running state [nf_conntrack] at time 22:11:05.779517
2018-09-01 22:11:05,780 [salt.state       :1803][INFO    ][3345] Executing state kmod.present for [nf_conntrack]
2018-09-01 22:11:05,781 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command 'lsmod' in directory '/root'
2018-09-01 22:11:06,780 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command 'lsmod' in directory '/root'
2018-09-01 22:11:06,796 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command 'modprobe nf_conntrack' in directory '/root'
2018-09-01 22:11:06,919 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command 'lsmod' in directory '/root'
2018-09-01 22:11:07,011 [salt.state       :290 ][INFO    ][3345] {'nf_conntrack': 'loaded'}
2018-09-01 22:11:07,012 [salt.state       :1941][INFO    ][3345] Completed state [nf_conntrack] at time 22:11:07.012458 duration_in_ms=1232.941
2018-09-01 22:11:07,015 [salt.state       :1770][INFO    ][3345] Running state [net.ipv4.tcp_keepalive_probes] at time 22:11:07.015027
2018-09-01 22:11:07,015 [salt.state       :1803][INFO    ][3345] Executing state sysctl.present for [net.ipv4.tcp_keepalive_probes]
2018-09-01 22:11:07,016 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command 'sysctl -w net.ipv4.tcp_keepalive_probes="8"' in directory '/root'
2018-09-01 22:11:07,246 [salt.state       :290 ][INFO    ][3345] {'net.ipv4.tcp_keepalive_probes': 8}
2018-09-01 22:11:07,248 [salt.state       :1941][INFO    ][3345] Completed state [net.ipv4.tcp_keepalive_probes] at time 22:11:07.248116 duration_in_ms=233.088
2018-09-01 22:11:07,249 [salt.state       :1770][INFO    ][3345] Running state [fs.file-max] at time 22:11:07.249667
2018-09-01 22:11:07,251 [salt.state       :1803][INFO    ][3345] Executing state sysctl.present for [fs.file-max]
2018-09-01 22:11:07,253 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command 'sysctl -w fs.file-max="124165"' in directory '/root'
2018-09-01 22:11:07,268 [salt.state       :290 ][INFO    ][3345] {'fs.file-max': 124165}
2018-09-01 22:11:07,270 [salt.state       :1941][INFO    ][3345] Completed state [fs.file-max] at time 22:11:07.269868 duration_in_ms=20.201
2018-09-01 22:11:07,271 [salt.state       :1770][INFO    ][3345] Running state [net.core.somaxconn] at time 22:11:07.271248
2018-09-01 22:11:07,272 [salt.state       :1803][INFO    ][3345] Executing state sysctl.present for [net.core.somaxconn]
2018-09-01 22:11:09,987 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command 'sysctl -w net.core.somaxconn="4096"' in directory '/root'
2018-09-01 22:11:10,036 [salt.state       :290 ][INFO    ][3345] {'net.core.somaxconn': 4096}
2018-09-01 22:11:10,037 [salt.state       :1941][INFO    ][3345] Completed state [net.core.somaxconn] at time 22:11:10.037028 duration_in_ms=2765.78
2018-09-01 22:11:10,037 [salt.state       :1770][INFO    ][3345] Running state [net.ipv4.tcp_max_syn_backlog] at time 22:11:10.037696
2018-09-01 22:11:10,038 [salt.state       :1803][INFO    ][3345] Executing state sysctl.present for [net.ipv4.tcp_max_syn_backlog]
2018-09-01 22:11:10,039 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command 'sysctl -w net.ipv4.tcp_max_syn_backlog="8192"' in directory '/root'
2018-09-01 22:11:10,050 [salt.state       :290 ][INFO    ][3345] {'net.ipv4.tcp_max_syn_backlog': 8192}
2018-09-01 22:11:10,051 [salt.state       :1941][INFO    ][3345] Completed state [net.ipv4.tcp_max_syn_backlog] at time 22:11:10.051293 duration_in_ms=13.597
2018-09-01 22:11:10,051 [salt.state       :1770][INFO    ][3345] Running state [net.ipv4.tcp_tw_reuse] at time 22:11:10.051899
2018-09-01 22:11:10,052 [salt.state       :1803][INFO    ][3345] Executing state sysctl.present for [net.ipv4.tcp_tw_reuse]
2018-09-01 22:11:10,154 [salt.minion      :1307][INFO    ][3251] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221110138402
2018-09-01 22:11:10,174 [salt.minion      :1431][INFO    ][4559] Starting a new job with PID 4559
2018-09-01 22:11:10,369 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command 'sysctl -w net.ipv4.tcp_tw_reuse="1"' in directory '/root'
2018-09-01 22:11:10,387 [salt.state       :290 ][INFO    ][3345] {'net.ipv4.tcp_tw_reuse': 1}
2018-09-01 22:11:10,388 [salt.state       :1941][INFO    ][3345] Completed state [net.ipv4.tcp_tw_reuse] at time 22:11:10.388085 duration_in_ms=336.184
2018-09-01 22:11:10,388 [salt.state       :1770][INFO    ][3345] Running state [net.ipv4.tcp_congestion_control] at time 22:11:10.388754
2018-09-01 22:11:10,389 [salt.state       :1803][INFO    ][3345] Executing state sysctl.present for [net.ipv4.tcp_congestion_control]
2018-09-01 22:11:10,390 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command 'sysctl -w net.ipv4.tcp_congestion_control="yeah"' in directory '/root'
2018-09-01 22:11:10,393 [salt.minion      :1708][INFO    ][4559] Returning information for job: 20180901221110138402
2018-09-01 22:11:10,678 [salt.state       :290 ][INFO    ][3345] {'net.ipv4.tcp_congestion_control': 'yeah'}
2018-09-01 22:11:10,679 [salt.state       :1941][INFO    ][3345] Completed state [net.ipv4.tcp_congestion_control] at time 22:11:10.679283 duration_in_ms=290.528
2018-09-01 22:11:10,680 [salt.state       :1770][INFO    ][3345] Running state [net.nf_conntrack_max] at time 22:11:10.680022
2018-09-01 22:11:10,680 [salt.state       :1803][INFO    ][3345] Executing state sysctl.present for [net.nf_conntrack_max]
2018-09-01 22:11:10,682 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command 'sysctl -w net.nf_conntrack_max="1048576"' in directory '/root'
2018-09-01 22:11:10,703 [salt.state       :290 ][INFO    ][3345] {'net.nf_conntrack_max': 1048576}
2018-09-01 22:11:10,704 [salt.state       :1941][INFO    ][3345] Completed state [net.nf_conntrack_max] at time 22:11:10.704446 duration_in_ms=24.422
2018-09-01 22:11:10,705 [salt.state       :1770][INFO    ][3345] Running state [net.ipv4.tcp_retries2] at time 22:11:10.705264
2018-09-01 22:11:10,705 [salt.state       :1803][INFO    ][3345] Executing state sysctl.present for [net.ipv4.tcp_retries2]
2018-09-01 22:11:10,707 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command 'sysctl -w net.ipv4.tcp_retries2="5"' in directory '/root'
2018-09-01 22:11:10,721 [salt.state       :290 ][INFO    ][3345] {'net.ipv4.tcp_retries2': 5}
2018-09-01 22:11:10,722 [salt.state       :1941][INFO    ][3345] Completed state [net.ipv4.tcp_retries2] at time 22:11:10.722506 duration_in_ms=17.241
2018-09-01 22:11:10,723 [salt.state       :1770][INFO    ][3345] Running state [net.ipv4.tcp_fin_timeout] at time 22:11:10.723076
2018-09-01 22:11:10,723 [salt.state       :1803][INFO    ][3345] Executing state sysctl.present for [net.ipv4.tcp_fin_timeout]
2018-09-01 22:11:10,737 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command 'sysctl -w net.ipv4.tcp_fin_timeout="30"' in directory '/root'
2018-09-01 22:11:10,754 [salt.state       :290 ][INFO    ][3345] {'net.ipv4.tcp_fin_timeout': 30}
2018-09-01 22:11:10,754 [salt.state       :1941][INFO    ][3345] Completed state [net.ipv4.tcp_fin_timeout] at time 22:11:10.754681 duration_in_ms=31.604
2018-09-01 22:11:10,755 [salt.state       :1770][INFO    ][3345] Running state [net.ipv4.tcp_slow_start_after_idle] at time 22:11:10.755387
2018-09-01 22:11:10,755 [salt.state       :1803][INFO    ][3345] Executing state sysctl.present for [net.ipv4.tcp_slow_start_after_idle]
2018-09-01 22:11:10,756 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command 'sysctl -w net.ipv4.tcp_slow_start_after_idle="0"' in directory '/root'
2018-09-01 22:11:10,767 [salt.state       :290 ][INFO    ][3345] {'net.ipv4.tcp_slow_start_after_idle': 0}
2018-09-01 22:11:10,768 [salt.state       :1941][INFO    ][3345] Completed state [net.ipv4.tcp_slow_start_after_idle] at time 22:11:10.768216 duration_in_ms=12.829
2018-09-01 22:11:10,768 [salt.state       :1770][INFO    ][3345] Running state [vm.swappiness] at time 22:11:10.768699
2018-09-01 22:11:10,769 [salt.state       :1803][INFO    ][3345] Executing state sysctl.present for [vm.swappiness]
2018-09-01 22:11:10,770 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command 'sysctl -w vm.swappiness="10"' in directory '/root'
2018-09-01 22:11:10,780 [salt.state       :290 ][INFO    ][3345] {'vm.swappiness': 10}
2018-09-01 22:11:10,781 [salt.state       :1941][INFO    ][3345] Completed state [vm.swappiness] at time 22:11:10.780999 duration_in_ms=12.3
2018-09-01 22:11:10,781 [salt.state       :1770][INFO    ][3345] Running state [net.core.netdev_max_backlog] at time 22:11:10.781487
2018-09-01 22:11:10,781 [salt.state       :1803][INFO    ][3345] Executing state sysctl.present for [net.core.netdev_max_backlog]
2018-09-01 22:11:10,875 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command 'sysctl -w net.core.netdev_max_backlog="261144"' in directory '/root'
2018-09-01 22:11:10,895 [salt.state       :290 ][INFO    ][3345] {'net.core.netdev_max_backlog': 261144}
2018-09-01 22:11:10,896 [salt.state       :1941][INFO    ][3345] Completed state [net.core.netdev_max_backlog] at time 22:11:10.895908 duration_in_ms=114.421
2018-09-01 22:11:10,896 [salt.state       :1770][INFO    ][3345] Running state [net.ipv4.neigh.default.gc_thresh1] at time 22:11:10.896551
2018-09-01 22:11:10,897 [salt.state       :1803][INFO    ][3345] Executing state sysctl.present for [net.ipv4.neigh.default.gc_thresh1]
2018-09-01 22:11:10,898 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command 'sysctl -w net.ipv4.neigh.default.gc_thresh1="4096"' in directory '/root'
2018-09-01 22:11:10,909 [salt.state       :290 ][INFO    ][3345] {'net.ipv4.neigh.default.gc_thresh1': 4096}
2018-09-01 22:11:10,910 [salt.state       :1941][INFO    ][3345] Completed state [net.ipv4.neigh.default.gc_thresh1] at time 22:11:10.910116 duration_in_ms=13.564
2018-09-01 22:11:10,910 [salt.state       :1770][INFO    ][3345] Running state [net.ipv4.neigh.default.gc_thresh2] at time 22:11:10.910717
2018-09-01 22:11:10,911 [salt.state       :1803][INFO    ][3345] Executing state sysctl.present for [net.ipv4.neigh.default.gc_thresh2]
2018-09-01 22:11:10,912 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command 'sysctl -w net.ipv4.neigh.default.gc_thresh2="8192"' in directory '/root'
2018-09-01 22:11:10,925 [salt.state       :290 ][INFO    ][3345] {'net.ipv4.neigh.default.gc_thresh2': 8192}
2018-09-01 22:11:10,926 [salt.state       :1941][INFO    ][3345] Completed state [net.ipv4.neigh.default.gc_thresh2] at time 22:11:10.926056 duration_in_ms=15.338
2018-09-01 22:11:10,926 [salt.state       :1770][INFO    ][3345] Running state [net.ipv4.neigh.default.gc_thresh3] at time 22:11:10.926651
2018-09-01 22:11:10,927 [salt.state       :1803][INFO    ][3345] Executing state sysctl.present for [net.ipv4.neigh.default.gc_thresh3]
2018-09-01 22:11:10,928 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command 'sysctl -w net.ipv4.neigh.default.gc_thresh3="16384"' in directory '/root'
2018-09-01 22:11:10,942 [salt.state       :290 ][INFO    ][3345] {'net.ipv4.neigh.default.gc_thresh3': 16384}
2018-09-01 22:11:10,943 [salt.state       :1941][INFO    ][3345] Completed state [net.ipv4.neigh.default.gc_thresh3] at time 22:11:10.943358 duration_in_ms=16.706
2018-09-01 22:11:10,943 [salt.state       :1770][INFO    ][3345] Running state [net.ipv4.tcp_keepalive_intvl] at time 22:11:10.943920
2018-09-01 22:11:10,944 [salt.state       :1803][INFO    ][3345] Executing state sysctl.present for [net.ipv4.tcp_keepalive_intvl]
2018-09-01 22:11:10,945 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command 'sysctl -w net.ipv4.tcp_keepalive_intvl="3"' in directory '/root'
2018-09-01 22:11:10,958 [salt.state       :290 ][INFO    ][3345] {'net.ipv4.tcp_keepalive_intvl': 3}
2018-09-01 22:11:10,958 [salt.state       :1941][INFO    ][3345] Completed state [net.ipv4.tcp_keepalive_intvl] at time 22:11:10.958736 duration_in_ms=14.817
2018-09-01 22:11:10,959 [salt.state       :1770][INFO    ][3345] Running state [net.ipv4.tcp_keepalive_time] at time 22:11:10.959245
2018-09-01 22:11:10,959 [salt.state       :1803][INFO    ][3345] Executing state sysctl.present for [net.ipv4.tcp_keepalive_time]
2018-09-01 22:11:11,055 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command 'sysctl -w net.ipv4.tcp_keepalive_time="30"' in directory '/root'
2018-09-01 22:11:11,075 [salt.state       :290 ][INFO    ][3345] {'net.ipv4.tcp_keepalive_time': 30}
2018-09-01 22:11:11,076 [salt.state       :1941][INFO    ][3345] Completed state [net.ipv4.tcp_keepalive_time] at time 22:11:11.076118 duration_in_ms=116.871
2018-09-01 22:11:11,076 [salt.state       :1770][INFO    ][3345] Running state [kernel.panic] at time 22:11:11.076652
2018-09-01 22:11:11,077 [salt.state       :1803][INFO    ][3345] Executing state sysctl.present for [kernel.panic]
2018-09-01 22:11:11,078 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command 'sysctl -w kernel.panic="60"' in directory '/root'
2018-09-01 22:11:11,090 [salt.state       :290 ][INFO    ][3345] {'kernel.panic': 60}
2018-09-01 22:11:11,090 [salt.state       :1941][INFO    ][3345] Completed state [kernel.panic] at time 22:11:11.090705 duration_in_ms=14.053
2018-09-01 22:11:11,095 [salt.state       :1770][INFO    ][3345] Running state [linux_sysfs_package] at time 22:11:11.095566
2018-09-01 22:11:11,095 [salt.state       :1803][INFO    ][3345] Executing state pkg.installed for [linux_sysfs_package]
2018-09-01 22:11:11,615 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['apt-cache', '-q', 'policy', 'sysfsutils'] in directory '/root'
2018-09-01 22:11:11,711 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['apt-get', '-q', 'update'] in directory '/root'
2018-09-01 22:11:13,387 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-09-01 22:11:13,409 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'sysfsutils'] in directory '/root'
2018-09-01 22:11:20,432 [salt.minion      :1307][INFO    ][3251] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221120415125
2018-09-01 22:11:20,455 [salt.minion      :1431][INFO    ][5332] Starting a new job with PID 5332
2018-09-01 22:11:20,469 [salt.minion      :1708][INFO    ][5332] Returning information for job: 20180901221120415125
2018-09-01 22:11:21,054 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 22:11:21,092 [salt.state       :290 ][INFO    ][3345] Made the following changes:
'libsysfs2' changed from 'absent' to '2.1.0+repack-4'
'sysfsutils' changed from 'absent' to '2.1.0+repack-4'

2018-09-01 22:11:21,106 [salt.state       :905 ][INFO    ][3345] Loading fresh modules for state activity
2018-09-01 22:11:21,138 [salt.state       :1941][INFO    ][3345] Completed state [linux_sysfs_package] at time 22:11:21.138445 duration_in_ms=10042.878
2018-09-01 22:11:21,142 [salt.state       :1770][INFO    ][3345] Running state [/etc/sysfs.d] at time 22:11:21.142236
2018-09-01 22:11:21,142 [salt.state       :1803][INFO    ][3345] Executing state file.directory for [/etc/sysfs.d]
2018-09-01 22:11:21,145 [salt.state       :290 ][INFO    ][3345] Directory /etc/sysfs.d is in the correct state
Directory /etc/sysfs.d updated
2018-09-01 22:11:21,146 [salt.state       :1941][INFO    ][3345] Completed state [/etc/sysfs.d] at time 22:11:21.146283 duration_in_ms=4.047
2018-09-01 22:11:21,489 [salt.state       :1770][INFO    ][3345] Running state [ondemand] at time 22:11:21.489901
2018-09-01 22:11:21,490 [salt.state       :1803][INFO    ][3345] Executing state service.dead for [ondemand]
2018-09-01 22:11:21,491 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['systemctl', 'status', 'ondemand.service', '-n', '0'] in directory '/root'
2018-09-01 22:11:21,507 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['systemctl', 'is-active', 'ondemand.service'] in directory '/root'
2018-09-01 22:11:21,519 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['systemctl', 'is-enabled', 'ondemand.service'] in directory '/root'
2018-09-01 22:11:21,535 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['systemd-run', '--scope', 'systemctl', 'stop', 'ondemand.service'] in directory '/root'
2018-09-01 22:11:21,585 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['systemctl', 'is-active', 'ondemand.service'] in directory '/root'
2018-09-01 22:11:21,601 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['systemctl', 'is-enabled', 'ondemand.service'] in directory '/root'
2018-09-01 22:11:21,623 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['systemctl', 'is-enabled', 'ondemand.service'] in directory '/root'
2018-09-01 22:11:21,642 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['systemd-run', '--scope', '/usr/sbin/update-rc.d', '-f', 'ondemand', 'remove'] in directory '/root'
2018-09-01 22:11:22,000 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['systemctl', 'is-enabled', 'ondemand.service'] in directory '/root'
2018-09-01 22:11:22,022 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command 'runlevel' in directory '/root'
2018-09-01 22:11:22,034 [salt.state       :290 ][INFO    ][3345] {'ondemand': True}
2018-09-01 22:11:22,034 [salt.state       :1941][INFO    ][3345] Completed state [ondemand] at time 22:11:22.034708 duration_in_ms=544.807
2018-09-01 22:11:22,035 [salt.state       :1770][INFO    ][3345] Running state [cs_CZ.UTF-8] at time 22:11:22.035950
2018-09-01 22:11:22,036 [salt.state       :1803][INFO    ][3345] Executing state locale.present for [cs_CZ.UTF-8]
2018-09-01 22:11:22,037 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command 'locale -a' in directory '/root'
2018-09-01 22:11:22,051 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['locale-gen', 'cs_CZ.utf8'] in directory '/root'
2018-09-01 22:11:22,921 [salt.state       :290 ][INFO    ][3345] {'locale': 'cs_CZ.UTF-8'}
2018-09-01 22:11:22,923 [salt.state       :1941][INFO    ][3345] Completed state [cs_CZ.UTF-8] at time 22:11:22.923156 duration_in_ms=887.204
2018-09-01 22:11:22,924 [salt.state       :1770][INFO    ][3345] Running state [en_US.UTF-8] at time 22:11:22.924393
2018-09-01 22:11:22,925 [salt.state       :1803][INFO    ][3345] Executing state locale.present for [en_US.UTF-8]
2018-09-01 22:11:22,927 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command 'locale -a' in directory '/root'
2018-09-01 22:11:22,941 [salt.state       :290 ][INFO    ][3345] Locale en_US.UTF-8 is already present
2018-09-01 22:11:22,942 [salt.state       :1941][INFO    ][3345] Completed state [en_US.UTF-8] at time 22:11:22.942433 duration_in_ms=18.041
2018-09-01 22:11:22,944 [salt.state       :1770][INFO    ][3345] Running state [en_US.UTF-8] at time 22:11:22.943981
2018-09-01 22:11:22,944 [salt.state       :1803][INFO    ][3345] Executing state locale.system for [en_US.UTF-8]
2018-09-01 22:11:22,945 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command 'localectl' in directory '/root'
2018-09-01 22:11:23,357 [salt.state       :290 ][INFO    ][3345] System locale en_US.UTF-8 already set
2018-09-01 22:11:23,358 [salt.state       :1941][INFO    ][3345] Completed state [en_US.UTF-8] at time 22:11:23.358341 duration_in_ms=414.359
2018-09-01 22:11:23,360 [salt.state       :1770][INFO    ][3345] Running state [root] at time 22:11:23.360310
2018-09-01 22:11:23,360 [salt.state       :1803][INFO    ][3345] Executing state group.present for [root]
2018-09-01 22:11:23,361 [salt.state       :290 ][INFO    ][3345] Group root is present and up to date
2018-09-01 22:11:23,362 [salt.state       :1941][INFO    ][3345] Completed state [root] at time 22:11:23.362232 duration_in_ms=1.923
2018-09-01 22:11:23,366 [salt.state       :1770][INFO    ][3345] Running state [root] at time 22:11:23.366776
2018-09-01 22:11:23,367 [salt.state       :1803][INFO    ][3345] Executing state user.present for [root]
2018-09-01 22:11:23,371 [salt.state       :290 ][INFO    ][3345] User root is present and up to date
2018-09-01 22:11:23,372 [salt.state       :1941][INFO    ][3345] Completed state [root] at time 22:11:23.372033 duration_in_ms=5.258
2018-09-01 22:11:23,372 [salt.state       :1770][INFO    ][3345] Running state [/root] at time 22:11:23.372859
2018-09-01 22:11:23,373 [salt.state       :1803][INFO    ][3345] Executing state file.directory for [/root]
2018-09-01 22:11:23,373 [salt.state       :290 ][INFO    ][3345] Directory /root is in the correct state
Directory /root updated
2018-09-01 22:11:23,374 [salt.state       :1941][INFO    ][3345] Completed state [/root] at time 22:11:23.374122 duration_in_ms=1.263
2018-09-01 22:11:23,374 [salt.state       :1770][INFO    ][3345] Running state [/etc/sudoers.d/90-salt-user-root] at time 22:11:23.374413
2018-09-01 22:11:23,374 [salt.state       :1803][INFO    ][3345] Executing state file.absent for [/etc/sudoers.d/90-salt-user-root]
2018-09-01 22:11:23,375 [salt.state       :290 ][INFO    ][3345] File /etc/sudoers.d/90-salt-user-root is not present
2018-09-01 22:11:23,375 [salt.state       :1941][INFO    ][3345] Completed state [/etc/sudoers.d/90-salt-user-root] at time 22:11:23.375180 duration_in_ms=0.767
2018-09-01 22:11:23,375 [salt.state       :1770][INFO    ][3345] Running state [ubuntu] at time 22:11:23.375418
2018-09-01 22:11:23,375 [salt.state       :1803][INFO    ][3345] Executing state group.present for [ubuntu]
2018-09-01 22:11:23,376 [salt.state       :290 ][INFO    ][3345] Group ubuntu is present and up to date
2018-09-01 22:11:23,376 [salt.state       :1941][INFO    ][3345] Completed state [ubuntu] at time 22:11:23.376221 duration_in_ms=0.803
2018-09-01 22:11:23,377 [salt.state       :1770][INFO    ][3345] Running state [ubuntu] at time 22:11:23.377027
2018-09-01 22:11:23,377 [salt.state       :1803][INFO    ][3345] Executing state user.present for [ubuntu]
2018-09-01 22:11:23,380 [salt.state       :290 ][INFO    ][3345] {'passwd': 'XXX-REDACTED-XXX'}
2018-09-01 22:11:23,380 [salt.state       :1941][INFO    ][3345] Completed state [ubuntu] at time 22:11:23.380621 duration_in_ms=3.593
2018-09-01 22:11:23,381 [salt.state       :1770][INFO    ][3345] Running state [/home/ubuntu] at time 22:11:23.381456
2018-09-01 22:11:23,381 [salt.state       :1803][INFO    ][3345] Executing state file.directory for [/home/ubuntu]
2018-09-01 22:11:23,382 [salt.state       :290 ][INFO    ][3345] {'mode': '0700'}
2018-09-01 22:11:23,382 [salt.state       :1941][INFO    ][3345] Completed state [/home/ubuntu] at time 22:11:23.382816 duration_in_ms=1.359
2018-09-01 22:11:23,383 [salt.state       :1770][INFO    ][3345] Running state [/etc/sudoers.d/90-salt-user-ubuntu] at time 22:11:23.383506
2018-09-01 22:11:23,383 [salt.state       :1803][INFO    ][3345] Executing state file.managed for [/etc/sudoers.d/90-salt-user-ubuntu]
2018-09-01 22:11:23,399 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'linux/files/sudoer'
2018-09-01 22:11:23,410 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command '/usr/sbin/visudo -c -f /tmp/__salt.tmp.JO8uju' in directory '/root'
2018-09-01 22:11:23,470 [salt.state       :290 ][INFO    ][3345] File changed:
New file
2018-09-01 22:11:23,471 [salt.state       :1941][INFO    ][3345] Completed state [/etc/sudoers.d/90-salt-user-ubuntu] at time 22:11:23.471385 duration_in_ms=87.876
2018-09-01 22:11:23,472 [salt.state       :1770][INFO    ][3345] Running state [/etc/security/limits.d/90-salt-default.conf] at time 22:11:23.472142
2018-09-01 22:11:23,472 [salt.state       :1803][INFO    ][3345] Executing state file.managed for [/etc/security/limits.d/90-salt-default.conf]
2018-09-01 22:11:23,492 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'linux/files/limits.conf'
2018-09-01 22:11:23,558 [salt.state       :290 ][INFO    ][3345] File changed:
New file
2018-09-01 22:11:23,558 [salt.state       :1941][INFO    ][3345] Completed state [/etc/security/limits.d/90-salt-default.conf] at time 22:11:23.558469 duration_in_ms=86.328
2018-09-01 22:11:23,558 [salt.state       :1770][INFO    ][3345] Running state [/etc/systemd/system.conf.d/90-salt.conf] at time 22:11:23.558694
2018-09-01 22:11:23,558 [salt.state       :1803][INFO    ][3345] Executing state file.managed for [/etc/systemd/system.conf.d/90-salt.conf]
2018-09-01 22:11:23,573 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'linux/files/systemd.conf'
2018-09-01 22:11:23,636 [salt.state       :290 ][INFO    ][3345] File changed:
New file
2018-09-01 22:11:23,636 [salt.state       :1941][INFO    ][3345] Completed state [/etc/systemd/system.conf.d/90-salt.conf] at time 22:11:23.636543 duration_in_ms=77.849
2018-09-01 22:11:23,638 [salt.state       :1770][INFO    ][3345] Running state [service.systemctl_reload] at time 22:11:23.638280
2018-09-01 22:11:23,638 [salt.state       :1803][INFO    ][3345] Executing state module.wait for [service.systemctl_reload]
2018-09-01 22:11:23,638 [salt.state       :290 ][INFO    ][3345] No changes made for service.systemctl_reload
2018-09-01 22:11:23,638 [salt.state       :1941][INFO    ][3345] Completed state [service.systemctl_reload] at time 22:11:23.638864 duration_in_ms=0.585
2018-09-01 22:11:23,639 [salt.state       :1770][INFO    ][3345] Running state [service.systemctl_reload] at time 22:11:23.639019
2018-09-01 22:11:23,639 [salt.state       :1803][INFO    ][3345] Executing state module.mod_watch for [service.systemctl_reload]
2018-09-01 22:11:23,639 [salt.utils.decorators:613 ][WARNING ][3345] The function "module.run" is using its deprecated version and will expire in version "Sodium".
2018-09-01 22:11:23,639 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['systemctl', '--system', 'daemon-reload'] in directory '/root'
2018-09-01 22:11:23,751 [salt.state       :290 ][INFO    ][3345] {'ret': True}
2018-09-01 22:11:23,752 [salt.state       :1941][INFO    ][3345] Completed state [service.systemctl_reload] at time 22:11:23.752047 duration_in_ms=113.026
2018-09-01 22:11:23,752 [salt.state       :1770][INFO    ][3345] Running state [/etc/issue] at time 22:11:23.752609
2018-09-01 22:11:23,753 [salt.state       :1803][INFO    ][3345] Executing state file.managed for [/etc/issue]
2018-09-01 22:11:23,793 [salt.state       :290 ][INFO    ][3345] File changed:
--- 
+++ 
@@ -1,2 +1,9 @@
-Ubuntu 16.04.5 LTS \n \l
-
+=================================== WARNING ====================================
+You have accessed a computer managed by OPNFV.
+You are required to have authorisation from OPNFV
+before you proceed and you are strictly limited to use set out within that
+authorisation. Unauthorised access to or misuse of this system is prohibited
+and constitutes an offence under the Computer Misuse Act 1990.
+If you disclose any information obtained through this system without authority
+OPNFV may take legal action against you.
+================================================================================

2018-09-01 22:11:23,793 [salt.state       :1941][INFO    ][3345] Completed state [/etc/issue] at time 22:11:23.793948 duration_in_ms=41.339
2018-09-01 22:11:23,794 [salt.state       :1770][INFO    ][3345] Running state [/etc/hostname] at time 22:11:23.794216
2018-09-01 22:11:23,794 [salt.state       :1803][INFO    ][3345] Executing state file.managed for [/etc/hostname]
2018-09-01 22:11:23,812 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'linux/files/hostname'
2018-09-01 22:11:23,822 [salt.state       :290 ][INFO    ][3345] File changed:
--- 
+++ 
@@ -1 +1 @@
-ubuntu
+prx01

2018-09-01 22:11:23,822 [salt.state       :1941][INFO    ][3345] Completed state [/etc/hostname] at time 22:11:23.822433 duration_in_ms=28.217
2018-09-01 22:11:23,824 [salt.state       :1770][INFO    ][3345] Running state [hostname prx01] at time 22:11:23.824235
2018-09-01 22:11:23,824 [salt.state       :1803][INFO    ][3345] Executing state cmd.run for [hostname prx01]
2018-09-01 22:11:23,825 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command 'test "$(hostname)" = "prx01"' in directory '/root'
2018-09-01 22:11:23,837 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command 'hostname prx01' in directory '/root'
2018-09-01 22:11:23,848 [salt.state       :290 ][INFO    ][3345] {'pid': 5537, 'retcode': 0, 'stderr': '', 'stdout': ''}
2018-09-01 22:11:23,848 [salt.state       :1941][INFO    ][3345] Completed state [hostname prx01] at time 22:11:23.848418 duration_in_ms=24.183
2018-09-01 22:11:23,849 [salt.state       :1770][INFO    ][3345] Running state [mdb02] at time 22:11:23.849485
2018-09-01 22:11:23,849 [salt.state       :1803][INFO    ][3345] Executing state host.present for [mdb02]
2018-09-01 22:11:23,850 [salt.state       :290 ][INFO    ][3345] {'host': 'mdb02'}
2018-09-01 22:11:23,850 [salt.state       :1941][INFO    ][3345] Completed state [mdb02] at time 22:11:23.850955 duration_in_ms=1.47
2018-09-01 22:11:23,851 [salt.state       :1770][INFO    ][3345] Running state [mdb02.mcp-ovs-ha.local] at time 22:11:23.851233
2018-09-01 22:11:23,851 [salt.state       :1803][INFO    ][3345] Executing state host.present for [mdb02.mcp-ovs-ha.local]
2018-09-01 22:11:23,949 [salt.state       :290 ][INFO    ][3345] {'host': 'mdb02.mcp-ovs-ha.local'}
2018-09-01 22:11:23,949 [salt.state       :1941][INFO    ][3345] Completed state [mdb02.mcp-ovs-ha.local] at time 22:11:23.949747 duration_in_ms=98.512
2018-09-01 22:11:23,950 [salt.state       :1770][INFO    ][3345] Running state [mdb03] at time 22:11:23.950469
2018-09-01 22:11:23,951 [salt.state       :1803][INFO    ][3345] Executing state host.present for [mdb03]
2018-09-01 22:11:23,961 [salt.state       :290 ][INFO    ][3345] {'host': 'mdb03'}
2018-09-01 22:11:23,961 [salt.state       :1941][INFO    ][3345] Completed state [mdb03] at time 22:11:23.961630 duration_in_ms=11.161
2018-09-01 22:11:23,962 [salt.state       :1770][INFO    ][3345] Running state [mdb03.mcp-ovs-ha.local] at time 22:11:23.962314
2018-09-01 22:11:23,962 [salt.state       :1803][INFO    ][3345] Executing state host.present for [mdb03.mcp-ovs-ha.local]
2018-09-01 22:11:23,967 [salt.state       :290 ][INFO    ][3345] {'host': 'mdb03.mcp-ovs-ha.local'}
2018-09-01 22:11:23,967 [salt.state       :1941][INFO    ][3345] Completed state [mdb03.mcp-ovs-ha.local] at time 22:11:23.967442 duration_in_ms=5.13
2018-09-01 22:11:23,967 [salt.state       :1770][INFO    ][3345] Running state [mdb01] at time 22:11:23.967886
2018-09-01 22:11:23,968 [salt.state       :1803][INFO    ][3345] Executing state host.present for [mdb01]
2018-09-01 22:11:23,973 [salt.state       :290 ][INFO    ][3345] {'host': 'mdb01'}
2018-09-01 22:11:23,973 [salt.state       :1941][INFO    ][3345] Completed state [mdb01] at time 22:11:23.973456 duration_in_ms=5.57
2018-09-01 22:11:23,974 [salt.state       :1770][INFO    ][3345] Running state [mdb01.mcp-ovs-ha.local] at time 22:11:23.974087
2018-09-01 22:11:23,974 [salt.state       :1803][INFO    ][3345] Executing state host.present for [mdb01.mcp-ovs-ha.local]
2018-09-01 22:11:23,979 [salt.state       :290 ][INFO    ][3345] {'host': 'mdb01.mcp-ovs-ha.local'}
2018-09-01 22:11:23,980 [salt.state       :1941][INFO    ][3345] Completed state [mdb01.mcp-ovs-ha.local] at time 22:11:23.980079 duration_in_ms=5.992
2018-09-01 22:11:23,980 [salt.state       :1770][INFO    ][3345] Running state [mdb] at time 22:11:23.980787
2018-09-01 22:11:23,981 [salt.state       :1803][INFO    ][3345] Executing state host.present for [mdb]
2018-09-01 22:11:23,985 [salt.state       :290 ][INFO    ][3345] {'host': 'mdb'}
2018-09-01 22:11:23,985 [salt.state       :1941][INFO    ][3345] Completed state [mdb] at time 22:11:23.985724 duration_in_ms=4.937
2018-09-01 22:11:23,986 [salt.state       :1770][INFO    ][3345] Running state [mdb.mcp-ovs-ha.local] at time 22:11:23.986441
2018-09-01 22:11:23,986 [salt.state       :1803][INFO    ][3345] Executing state host.present for [mdb.mcp-ovs-ha.local]
2018-09-01 22:11:23,993 [salt.state       :290 ][INFO    ][3345] {'host': 'mdb.mcp-ovs-ha.local'}
2018-09-01 22:11:23,994 [salt.state       :1941][INFO    ][3345] Completed state [mdb.mcp-ovs-ha.local] at time 22:11:23.994035 duration_in_ms=7.595
2018-09-01 22:11:23,995 [salt.state       :1770][INFO    ][3345] Running state [cfg01] at time 22:11:23.995268
2018-09-01 22:11:23,995 [salt.state       :1803][INFO    ][3345] Executing state host.present for [cfg01]
2018-09-01 22:11:23,999 [salt.state       :290 ][INFO    ][3345] {'host': 'cfg01'}
2018-09-01 22:11:24,000 [salt.state       :1941][INFO    ][3345] Completed state [cfg01] at time 22:11:24.000534 duration_in_ms=5.266
2018-09-01 22:11:24,001 [salt.state       :1770][INFO    ][3345] Running state [cfg01.mcp-ovs-ha.local] at time 22:11:24.001504
2018-09-01 22:11:24,001 [salt.state       :1803][INFO    ][3345] Executing state host.present for [cfg01.mcp-ovs-ha.local]
2018-09-01 22:11:24,005 [salt.state       :290 ][INFO    ][3345] {'host': 'cfg01.mcp-ovs-ha.local'}
2018-09-01 22:11:24,006 [salt.state       :1941][INFO    ][3345] Completed state [cfg01.mcp-ovs-ha.local] at time 22:11:24.006039 duration_in_ms=4.534
2018-09-01 22:11:24,006 [salt.state       :1770][INFO    ][3345] Running state [prx01] at time 22:11:24.006583
2018-09-01 22:11:24,007 [salt.state       :1803][INFO    ][3345] Executing state host.present for [prx01]
2018-09-01 22:11:24,011 [salt.state       :290 ][INFO    ][3345] {'host': 'prx01'}
2018-09-01 22:11:24,012 [salt.state       :1941][INFO    ][3345] Completed state [prx01] at time 22:11:24.012039 duration_in_ms=5.456
2018-09-01 22:11:24,012 [salt.state       :1770][INFO    ][3345] Running state [prx01.mcp-ovs-ha.local] at time 22:11:24.012555
2018-09-01 22:11:24,013 [salt.state       :1803][INFO    ][3345] Executing state host.present for [prx01.mcp-ovs-ha.local]
2018-09-01 22:11:24,017 [salt.state       :290 ][INFO    ][3345] {'host': 'prx01.mcp-ovs-ha.local'}
2018-09-01 22:11:24,018 [salt.state       :1941][INFO    ][3345] Completed state [prx01.mcp-ovs-ha.local] at time 22:11:24.018175 duration_in_ms=5.62
2018-09-01 22:11:24,018 [salt.state       :1770][INFO    ][3345] Running state [kvm01] at time 22:11:24.018792
2018-09-01 22:11:24,019 [salt.state       :1803][INFO    ][3345] Executing state host.present for [kvm01]
2018-09-01 22:11:24,023 [salt.state       :290 ][INFO    ][3345] {'host': 'kvm01'}
2018-09-01 22:11:24,024 [salt.state       :1941][INFO    ][3345] Completed state [kvm01] at time 22:11:24.023983 duration_in_ms=5.191
2018-09-01 22:11:24,024 [salt.state       :1770][INFO    ][3345] Running state [kvm01.mcp-ovs-ha.local] at time 22:11:24.024600
2018-09-01 22:11:24,025 [salt.state       :1803][INFO    ][3345] Executing state host.present for [kvm01.mcp-ovs-ha.local]
2018-09-01 22:11:24,221 [salt.state       :290 ][INFO    ][3345] {'host': 'kvm01.mcp-ovs-ha.local'}
2018-09-01 22:11:24,222 [salt.state       :1941][INFO    ][3345] Completed state [kvm01.mcp-ovs-ha.local] at time 22:11:24.222307 duration_in_ms=197.706
2018-09-01 22:11:24,223 [salt.state       :1770][INFO    ][3345] Running state [kvm03] at time 22:11:24.223489
2018-09-01 22:11:24,224 [salt.state       :1803][INFO    ][3345] Executing state host.present for [kvm03]
2018-09-01 22:11:24,227 [salt.state       :290 ][INFO    ][3345] {'host': 'kvm03'}
2018-09-01 22:11:24,228 [salt.state       :1941][INFO    ][3345] Completed state [kvm03] at time 22:11:24.228310 duration_in_ms=4.821
2018-09-01 22:11:24,229 [salt.state       :1770][INFO    ][3345] Running state [kvm03.mcp-ovs-ha.local] at time 22:11:24.229557
2018-09-01 22:11:24,230 [salt.state       :1803][INFO    ][3345] Executing state host.present for [kvm03.mcp-ovs-ha.local]
2018-09-01 22:11:24,233 [salt.state       :290 ][INFO    ][3345] {'host': 'kvm03.mcp-ovs-ha.local'}
2018-09-01 22:11:24,234 [salt.state       :1941][INFO    ][3345] Completed state [kvm03.mcp-ovs-ha.local] at time 22:11:24.234248 duration_in_ms=4.691
2018-09-01 22:11:24,235 [salt.state       :1770][INFO    ][3345] Running state [kvm02] at time 22:11:24.235339
2018-09-01 22:11:24,236 [salt.state       :1803][INFO    ][3345] Executing state host.present for [kvm02]
2018-09-01 22:11:24,239 [salt.state       :290 ][INFO    ][3345] {'host': 'kvm02'}
2018-09-01 22:11:24,240 [salt.state       :1941][INFO    ][3345] Completed state [kvm02] at time 22:11:24.240214 duration_in_ms=4.875
2018-09-01 22:11:24,241 [salt.state       :1770][INFO    ][3345] Running state [kvm02.mcp-ovs-ha.local] at time 22:11:24.241356
2018-09-01 22:11:24,242 [salt.state       :1803][INFO    ][3345] Executing state host.present for [kvm02.mcp-ovs-ha.local]
2018-09-01 22:11:24,245 [salt.state       :290 ][INFO    ][3345] {'host': 'kvm02.mcp-ovs-ha.local'}
2018-09-01 22:11:24,246 [salt.state       :1941][INFO    ][3345] Completed state [kvm02.mcp-ovs-ha.local] at time 22:11:24.246262 duration_in_ms=4.905
2018-09-01 22:11:24,247 [salt.state       :1770][INFO    ][3345] Running state [dbs] at time 22:11:24.247375
2018-09-01 22:11:24,248 [salt.state       :1803][INFO    ][3345] Executing state host.present for [dbs]
2018-09-01 22:11:24,251 [salt.state       :290 ][INFO    ][3345] {'host': 'dbs'}
2018-09-01 22:11:24,252 [salt.state       :1941][INFO    ][3345] Completed state [dbs] at time 22:11:24.252298 duration_in_ms=4.923
2018-09-01 22:11:24,253 [salt.state       :1770][INFO    ][3345] Running state [dbs.mcp-ovs-ha.local] at time 22:11:24.253476
2018-09-01 22:11:24,254 [salt.state       :1803][INFO    ][3345] Executing state host.present for [dbs.mcp-ovs-ha.local]
2018-09-01 22:11:24,257 [salt.state       :290 ][INFO    ][3345] {'host': 'dbs.mcp-ovs-ha.local'}
2018-09-01 22:11:24,258 [salt.state       :1941][INFO    ][3345] Completed state [dbs.mcp-ovs-ha.local] at time 22:11:24.258234 duration_in_ms=4.758
2018-09-01 22:11:24,259 [salt.state       :1770][INFO    ][3345] Running state [prx] at time 22:11:24.259275
2018-09-01 22:11:24,260 [salt.state       :1803][INFO    ][3345] Executing state host.present for [prx]
2018-09-01 22:11:24,263 [salt.state       :290 ][INFO    ][3345] {'host': 'prx'}
2018-09-01 22:11:24,264 [salt.state       :1941][INFO    ][3345] Completed state [prx] at time 22:11:24.264251 duration_in_ms=4.976
2018-09-01 22:11:24,265 [salt.state       :1770][INFO    ][3345] Running state [prx.mcp-ovs-ha.local] at time 22:11:24.265392
2018-09-01 22:11:24,266 [salt.state       :1803][INFO    ][3345] Executing state host.present for [prx.mcp-ovs-ha.local]
2018-09-01 22:11:24,269 [salt.state       :290 ][INFO    ][3345] {'host': 'prx.mcp-ovs-ha.local'}
2018-09-01 22:11:24,270 [salt.state       :1941][INFO    ][3345] Completed state [prx.mcp-ovs-ha.local] at time 22:11:24.270223 duration_in_ms=4.831
2018-09-01 22:11:24,271 [salt.state       :1770][INFO    ][3345] Running state [prx02] at time 22:11:24.271299
2018-09-01 22:11:24,272 [salt.state       :1803][INFO    ][3345] Executing state host.present for [prx02]
2018-09-01 22:11:24,274 [salt.state       :290 ][INFO    ][3345] {'host': 'prx02'}
2018-09-01 22:11:24,275 [salt.state       :1941][INFO    ][3345] Completed state [prx02] at time 22:11:24.275487 duration_in_ms=4.188
2018-09-01 22:11:24,276 [salt.state       :1770][INFO    ][3345] Running state [prx02.mcp-ovs-ha.local] at time 22:11:24.276515
2018-09-01 22:11:24,277 [salt.state       :1803][INFO    ][3345] Executing state host.present for [prx02.mcp-ovs-ha.local]
2018-09-01 22:11:24,278 [salt.state       :290 ][INFO    ][3345] {'host': 'prx02.mcp-ovs-ha.local'}
2018-09-01 22:11:24,278 [salt.state       :1941][INFO    ][3345] Completed state [prx02.mcp-ovs-ha.local] at time 22:11:24.278705 duration_in_ms=2.191
2018-09-01 22:11:24,279 [salt.state       :1770][INFO    ][3345] Running state [msg02] at time 22:11:24.279194
2018-09-01 22:11:24,279 [salt.state       :1803][INFO    ][3345] Executing state host.present for [msg02]
2018-09-01 22:11:24,282 [salt.state       :290 ][INFO    ][3345] {'host': 'msg02'}
2018-09-01 22:11:24,283 [salt.state       :1941][INFO    ][3345] Completed state [msg02] at time 22:11:24.283177 duration_in_ms=3.983
2018-09-01 22:11:24,283 [salt.state       :1770][INFO    ][3345] Running state [msg02.mcp-ovs-ha.local] at time 22:11:24.283672
2018-09-01 22:11:24,284 [salt.state       :1803][INFO    ][3345] Executing state host.present for [msg02.mcp-ovs-ha.local]
2018-09-01 22:11:24,426 [salt.state       :290 ][INFO    ][3345] {'host': 'msg02.mcp-ovs-ha.local'}
2018-09-01 22:11:24,427 [salt.state       :1941][INFO    ][3345] Completed state [msg02.mcp-ovs-ha.local] at time 22:11:24.427660 duration_in_ms=143.987
2018-09-01 22:11:24,428 [salt.state       :1770][INFO    ][3345] Running state [msg03] at time 22:11:24.428777
2018-09-01 22:11:24,429 [salt.state       :1803][INFO    ][3345] Executing state host.present for [msg03]
2018-09-01 22:11:24,432 [salt.state       :290 ][INFO    ][3345] {'host': 'msg03'}
2018-09-01 22:11:24,433 [salt.state       :1941][INFO    ][3345] Completed state [msg03] at time 22:11:24.433605 duration_in_ms=4.828
2018-09-01 22:11:24,434 [salt.state       :1770][INFO    ][3345] Running state [msg03.mcp-ovs-ha.local] at time 22:11:24.434764
2018-09-01 22:11:24,435 [salt.state       :1803][INFO    ][3345] Executing state host.present for [msg03.mcp-ovs-ha.local]
2018-09-01 22:11:24,438 [salt.state       :290 ][INFO    ][3345] {'host': 'msg03.mcp-ovs-ha.local'}
2018-09-01 22:11:24,439 [salt.state       :1941][INFO    ][3345] Completed state [msg03.mcp-ovs-ha.local] at time 22:11:24.439684 duration_in_ms=4.919
2018-09-01 22:11:24,440 [salt.state       :1770][INFO    ][3345] Running state [msg01] at time 22:11:24.440837
2018-09-01 22:11:24,441 [salt.state       :1803][INFO    ][3345] Executing state host.present for [msg01]
2018-09-01 22:11:24,444 [salt.state       :290 ][INFO    ][3345] {'host': 'msg01'}
2018-09-01 22:11:24,445 [salt.state       :1941][INFO    ][3345] Completed state [msg01] at time 22:11:24.445721 duration_in_ms=4.884
2018-09-01 22:11:24,446 [salt.state       :1770][INFO    ][3345] Running state [msg01.mcp-ovs-ha.local] at time 22:11:24.446875
2018-09-01 22:11:24,447 [salt.state       :1803][INFO    ][3345] Executing state host.present for [msg01.mcp-ovs-ha.local]
2018-09-01 22:11:24,450 [salt.state       :290 ][INFO    ][3345] {'host': 'msg01.mcp-ovs-ha.local'}
2018-09-01 22:11:24,451 [salt.state       :1941][INFO    ][3345] Completed state [msg01.mcp-ovs-ha.local] at time 22:11:24.451560 duration_in_ms=4.685
2018-09-01 22:11:24,452 [salt.state       :1770][INFO    ][3345] Running state [msg] at time 22:11:24.452539
2018-09-01 22:11:24,453 [salt.state       :1803][INFO    ][3345] Executing state host.present for [msg]
2018-09-01 22:11:24,456 [salt.state       :290 ][INFO    ][3345] {'host': 'msg'}
2018-09-01 22:11:24,457 [salt.state       :1941][INFO    ][3345] Completed state [msg] at time 22:11:24.457744 duration_in_ms=5.205
2018-09-01 22:11:24,459 [salt.state       :1770][INFO    ][3345] Running state [msg.mcp-ovs-ha.local] at time 22:11:24.458933
2018-09-01 22:11:24,459 [salt.state       :1803][INFO    ][3345] Executing state host.present for [msg.mcp-ovs-ha.local]
2018-09-01 22:11:24,462 [salt.state       :290 ][INFO    ][3345] {'host': 'msg.mcp-ovs-ha.local'}
2018-09-01 22:11:24,463 [salt.state       :1941][INFO    ][3345] Completed state [msg.mcp-ovs-ha.local] at time 22:11:24.463680 duration_in_ms=4.747
2018-09-01 22:11:24,464 [salt.state       :1770][INFO    ][3345] Running state [cfg01] at time 22:11:24.464839
2018-09-01 22:11:24,465 [salt.state       :1803][INFO    ][3345] Executing state host.present for [cfg01]
2018-09-01 22:11:24,467 [salt.state       :290 ][INFO    ][3345] Host cfg01 (10.167.4.11) already present
2018-09-01 22:11:24,468 [salt.state       :1941][INFO    ][3345] Completed state [cfg01] at time 22:11:24.468175 duration_in_ms=3.336
2018-09-01 22:11:24,469 [salt.state       :1770][INFO    ][3345] Running state [cfg01.mcp-ovs-ha.local] at time 22:11:24.469384
2018-09-01 22:11:24,470 [salt.state       :1803][INFO    ][3345] Executing state host.present for [cfg01.mcp-ovs-ha.local]
2018-09-01 22:11:24,471 [salt.state       :290 ][INFO    ][3345] Host cfg01.mcp-ovs-ha.local (10.167.4.11) already present
2018-09-01 22:11:24,472 [salt.state       :1941][INFO    ][3345] Completed state [cfg01.mcp-ovs-ha.local] at time 22:11:24.472699 duration_in_ms=3.315
2018-09-01 22:11:24,473 [salt.state       :1770][INFO    ][3345] Running state [cmp002] at time 22:11:24.473452
2018-09-01 22:11:24,473 [salt.state       :1803][INFO    ][3345] Executing state host.present for [cmp002]
2018-09-01 22:11:24,475 [salt.state       :290 ][INFO    ][3345] {'host': 'cmp002'}
2018-09-01 22:11:24,475 [salt.state       :1941][INFO    ][3345] Completed state [cmp002] at time 22:11:24.475637 duration_in_ms=2.185
2018-09-01 22:11:24,476 [salt.state       :1770][INFO    ][3345] Running state [cmp002.mcp-ovs-ha.local] at time 22:11:24.476155
2018-09-01 22:11:24,476 [salt.state       :1803][INFO    ][3345] Executing state host.present for [cmp002.mcp-ovs-ha.local]
2018-09-01 22:11:24,480 [salt.state       :290 ][INFO    ][3345] {'host': 'cmp002.mcp-ovs-ha.local'}
2018-09-01 22:11:24,481 [salt.state       :1941][INFO    ][3345] Completed state [cmp002.mcp-ovs-ha.local] at time 22:11:24.481004 duration_in_ms=4.849
2018-09-01 22:11:24,481 [salt.state       :1770][INFO    ][3345] Running state [cmp001] at time 22:11:24.481533
2018-09-01 22:11:24,482 [salt.state       :1803][INFO    ][3345] Executing state host.present for [cmp001]
2018-09-01 22:11:24,486 [salt.state       :290 ][INFO    ][3345] {'host': 'cmp001'}
2018-09-01 22:11:24,487 [salt.state       :1941][INFO    ][3345] Completed state [cmp001] at time 22:11:24.487691 duration_in_ms=6.157
2018-09-01 22:11:24,488 [salt.state       :1770][INFO    ][3345] Running state [cmp001.mcp-ovs-ha.local] at time 22:11:24.488827
2018-09-01 22:11:24,489 [salt.state       :1803][INFO    ][3345] Executing state host.present for [cmp001.mcp-ovs-ha.local]
2018-09-01 22:11:24,570 [salt.state       :290 ][INFO    ][3345] {'host': 'cmp001.mcp-ovs-ha.local'}
2018-09-01 22:11:24,571 [salt.state       :1941][INFO    ][3345] Completed state [cmp001.mcp-ovs-ha.local] at time 22:11:24.571556 duration_in_ms=82.729
2018-09-01 22:11:24,572 [salt.state       :1770][INFO    ][3345] Running state [dbs01] at time 22:11:24.572694
2018-09-01 22:11:24,573 [salt.state       :1803][INFO    ][3345] Executing state host.present for [dbs01]
2018-09-01 22:11:24,576 [salt.state       :290 ][INFO    ][3345] {'host': 'dbs01'}
2018-09-01 22:11:24,577 [salt.state       :1941][INFO    ][3345] Completed state [dbs01] at time 22:11:24.577564 duration_in_ms=4.87
2018-09-01 22:11:24,578 [salt.state       :1770][INFO    ][3345] Running state [dbs01.mcp-ovs-ha.local] at time 22:11:24.578801
2018-09-01 22:11:24,579 [salt.state       :1803][INFO    ][3345] Executing state host.present for [dbs01.mcp-ovs-ha.local]
2018-09-01 22:11:24,582 [salt.state       :290 ][INFO    ][3345] {'host': 'dbs01.mcp-ovs-ha.local'}
2018-09-01 22:11:24,583 [salt.state       :1941][INFO    ][3345] Completed state [dbs01.mcp-ovs-ha.local] at time 22:11:24.583437 duration_in_ms=4.637
2018-09-01 22:11:24,584 [salt.state       :1770][INFO    ][3345] Running state [dbs02] at time 22:11:24.584568
2018-09-01 22:11:24,585 [salt.state       :1803][INFO    ][3345] Executing state host.present for [dbs02]
2018-09-01 22:11:24,586 [salt.state       :290 ][INFO    ][3345] {'host': 'dbs02'}
2018-09-01 22:11:24,587 [salt.state       :1941][INFO    ][3345] Completed state [dbs02] at time 22:11:24.587014 duration_in_ms=2.447
2018-09-01 22:11:24,587 [salt.state       :1770][INFO    ][3345] Running state [dbs02.mcp-ovs-ha.local] at time 22:11:24.587524
2018-09-01 22:11:24,587 [salt.state       :1803][INFO    ][3345] Executing state host.present for [dbs02.mcp-ovs-ha.local]
2018-09-01 22:11:24,592 [salt.state       :290 ][INFO    ][3345] {'host': 'dbs02.mcp-ovs-ha.local'}
2018-09-01 22:11:24,593 [salt.state       :1941][INFO    ][3345] Completed state [dbs02.mcp-ovs-ha.local] at time 22:11:24.593593 duration_in_ms=6.069
2018-09-01 22:11:24,594 [salt.state       :1770][INFO    ][3345] Running state [dbs03] at time 22:11:24.594715
2018-09-01 22:11:24,595 [salt.state       :1803][INFO    ][3345] Executing state host.present for [dbs03]
2018-09-01 22:11:24,598 [salt.state       :290 ][INFO    ][3345] {'host': 'dbs03'}
2018-09-01 22:11:24,599 [salt.state       :1941][INFO    ][3345] Completed state [dbs03] at time 22:11:24.599654 duration_in_ms=4.939
2018-09-01 22:11:24,600 [salt.state       :1770][INFO    ][3345] Running state [dbs03.mcp-ovs-ha.local] at time 22:11:24.600851
2018-09-01 22:11:24,601 [salt.state       :1803][INFO    ][3345] Executing state host.present for [dbs03.mcp-ovs-ha.local]
2018-09-01 22:11:24,604 [salt.state       :290 ][INFO    ][3345] {'host': 'dbs03.mcp-ovs-ha.local'}
2018-09-01 22:11:24,605 [salt.state       :1941][INFO    ][3345] Completed state [dbs03.mcp-ovs-ha.local] at time 22:11:24.605797 duration_in_ms=4.947
2018-09-01 22:11:24,607 [salt.state       :1770][INFO    ][3345] Running state [mas01] at time 22:11:24.607022
2018-09-01 22:11:24,608 [salt.state       :1803][INFO    ][3345] Executing state host.present for [mas01]
2018-09-01 22:11:24,610 [salt.state       :290 ][INFO    ][3345] {'host': 'mas01'}
2018-09-01 22:11:24,611 [salt.state       :1941][INFO    ][3345] Completed state [mas01] at time 22:11:24.611062 duration_in_ms=4.043
2018-09-01 22:11:24,611 [salt.state       :1770][INFO    ][3345] Running state [mas01.mcp-ovs-ha.local] at time 22:11:24.611597
2018-09-01 22:11:24,612 [salt.state       :1803][INFO    ][3345] Executing state host.present for [mas01.mcp-ovs-ha.local]
2018-09-01 22:11:24,616 [salt.state       :290 ][INFO    ][3345] {'host': 'mas01.mcp-ovs-ha.local'}
2018-09-01 22:11:24,617 [salt.state       :1941][INFO    ][3345] Completed state [mas01.mcp-ovs-ha.local] at time 22:11:24.616964 duration_in_ms=5.367
2018-09-01 22:11:24,617 [salt.state       :1770][INFO    ][3345] Running state [ctl02] at time 22:11:24.617513
2018-09-01 22:11:24,617 [salt.state       :1803][INFO    ][3345] Executing state host.present for [ctl02]
2018-09-01 22:11:24,622 [salt.state       :290 ][INFO    ][3345] {'host': 'ctl02'}
2018-09-01 22:11:24,623 [salt.state       :1941][INFO    ][3345] Completed state [ctl02] at time 22:11:24.622981 duration_in_ms=5.468
2018-09-01 22:11:24,623 [salt.state       :1770][INFO    ][3345] Running state [ctl02.mcp-ovs-ha.local] at time 22:11:24.623569
2018-09-01 22:11:24,624 [salt.state       :1803][INFO    ][3345] Executing state host.present for [ctl02.mcp-ovs-ha.local]
2018-09-01 22:11:24,628 [salt.state       :290 ][INFO    ][3345] {'host': 'ctl02.mcp-ovs-ha.local'}
2018-09-01 22:11:24,629 [salt.state       :1941][INFO    ][3345] Completed state [ctl02.mcp-ovs-ha.local] at time 22:11:24.629264 duration_in_ms=5.695
2018-09-01 22:11:24,629 [salt.state       :1770][INFO    ][3345] Running state [ctl03] at time 22:11:24.629852
2018-09-01 22:11:24,630 [salt.state       :1803][INFO    ][3345] Executing state host.present for [ctl03]
2018-09-01 22:11:24,634 [salt.state       :290 ][INFO    ][3345] {'host': 'ctl03'}
2018-09-01 22:11:24,635 [salt.state       :1941][INFO    ][3345] Completed state [ctl03] at time 22:11:24.635131 duration_in_ms=5.279
2018-09-01 22:11:24,635 [salt.state       :1770][INFO    ][3345] Running state [ctl03.mcp-ovs-ha.local] at time 22:11:24.635800
2018-09-01 22:11:24,636 [salt.state       :1803][INFO    ][3345] Executing state host.present for [ctl03.mcp-ovs-ha.local]
2018-09-01 22:11:24,640 [salt.state       :290 ][INFO    ][3345] {'host': 'ctl03.mcp-ovs-ha.local'}
2018-09-01 22:11:24,641 [salt.state       :1941][INFO    ][3345] Completed state [ctl03.mcp-ovs-ha.local] at time 22:11:24.641291 duration_in_ms=5.491
2018-09-01 22:11:24,642 [salt.state       :1770][INFO    ][3345] Running state [ctl01] at time 22:11:24.641963
2018-09-01 22:11:24,642 [salt.state       :1803][INFO    ][3345] Executing state host.present for [ctl01]
2018-09-01 22:11:24,646 [salt.state       :290 ][INFO    ][3345] {'host': 'ctl01'}
2018-09-01 22:11:24,647 [salt.state       :1941][INFO    ][3345] Completed state [ctl01] at time 22:11:24.647677 duration_in_ms=5.713
2018-09-01 22:11:24,648 [salt.state       :1770][INFO    ][3345] Running state [ctl01.mcp-ovs-ha.local] at time 22:11:24.648853
2018-09-01 22:11:24,649 [salt.state       :1803][INFO    ][3345] Executing state host.present for [ctl01.mcp-ovs-ha.local]
2018-09-01 22:11:24,874 [salt.state       :290 ][INFO    ][3345] {'host': 'ctl01.mcp-ovs-ha.local'}
2018-09-01 22:11:24,875 [salt.state       :1941][INFO    ][3345] Completed state [ctl01.mcp-ovs-ha.local] at time 22:11:24.875457 duration_in_ms=226.604
2018-09-01 22:11:24,876 [salt.state       :1770][INFO    ][3345] Running state [ctl] at time 22:11:24.876627
2018-09-01 22:11:24,877 [salt.state       :1803][INFO    ][3345] Executing state host.present for [ctl]
2018-09-01 22:11:24,880 [salt.state       :290 ][INFO    ][3345] {'host': 'ctl'}
2018-09-01 22:11:24,881 [salt.state       :1941][INFO    ][3345] Completed state [ctl] at time 22:11:24.881170 duration_in_ms=4.543
2018-09-01 22:11:24,882 [salt.state       :1770][INFO    ][3345] Running state [ctl.mcp-ovs-ha.local] at time 22:11:24.882394
2018-09-01 22:11:24,883 [salt.state       :1803][INFO    ][3345] Executing state host.present for [ctl.mcp-ovs-ha.local]
2018-09-01 22:11:24,886 [salt.state       :290 ][INFO    ][3345] {'host': 'ctl.mcp-ovs-ha.local'}
2018-09-01 22:11:24,887 [salt.state       :1941][INFO    ][3345] Completed state [ctl.mcp-ovs-ha.local] at time 22:11:24.887224 duration_in_ms=4.83
2018-09-01 22:11:24,888 [salt.state       :1770][INFO    ][3345] Running state [/etc/network/interfaces.d/50-cloud-init.cfg] at time 22:11:24.888178
2018-09-01 22:11:24,889 [salt.state       :1803][INFO    ][3345] Executing state file.absent for [/etc/network/interfaces.d/50-cloud-init.cfg]
2018-09-01 22:11:24,889 [salt.state       :290 ][INFO    ][3345] {'removed': '/etc/network/interfaces.d/50-cloud-init.cfg'}
2018-09-01 22:11:24,890 [salt.state       :1941][INFO    ][3345] Completed state [/etc/network/interfaces.d/50-cloud-init.cfg] at time 22:11:24.890226 duration_in_ms=2.05
2018-09-01 22:11:24,891 [salt.state       :1770][INFO    ][3345] Running state [ens3] at time 22:11:24.891848
2018-09-01 22:11:24,892 [salt.state       :1803][INFO    ][3345] Executing state network.managed for [ens3]
2018-09-01 22:11:25,101 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['ifup', 'ens3'] in directory '/root'
2018-09-01 22:11:25,124 [salt.loaded.int.module.cmdmod:722 ][ERROR   ][3345] Command '['ifup', 'ens3']' failed with return code: 1
2018-09-01 22:11:25,180 [salt.loaded.int.module.cmdmod:724 ][ERROR   ][3345] stdout: RTNETLINK answers: File exists
Failed to bring up ens3.
2018-09-01 22:11:25,181 [salt.loaded.int.module.cmdmod:728 ][ERROR   ][3345] retcode: 1
2018-09-01 22:11:25,849 [salt.state       :290 ][INFO    ][3345] {'interface': 'Added network interface.', 'status': 'Interface ens3 is up'}
2018-09-01 22:11:25,849 [salt.state       :1941][INFO    ][3345] Completed state [ens3] at time 22:11:25.849803 duration_in_ms=957.954
2018-09-01 22:11:25,850 [salt.state       :1770][INFO    ][3345] Running state [linux_system_network] at time 22:11:25.850167
2018-09-01 22:11:25,850 [salt.state       :1803][INFO    ][3345] Executing state network.system for [linux_system_network]
2018-09-01 22:11:25,851 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['systemctl', 'status', 'networking.service', '-n', '0'] in directory '/root'
2018-09-01 22:11:25,867 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['systemctl', 'is-active', 'networking.service'] in directory '/root'
2018-09-01 22:11:26,028 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['systemctl', 'status', 'NetworkManager.service', '-n', '0'] in directory '/root'
2018-09-01 22:11:26,046 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['systemd-run', '--scope', 'systemctl', 'enable', 'networking.service'] in directory '/root'
2018-09-01 22:11:26,700 [salt.loaded.int.module.debian_ip:1970][WARNING ][3345] The network state sls is requiring a reboot of the system to properly apply network configuration.
2018-09-01 22:11:26,701 [salt.state       :290 ][INFO    ][3345] {'network_settings': u'--- \n+++ \n@@ -1,2 +1,4 @@\n NETWORKING=yes\n\n HOSTNAME=prx01\n\n+DOMAIN=mcp-ovs-ha.local\n\n+SEARCH=maas\n'}
2018-09-01 22:11:26,701 [salt.state       :1941][INFO    ][3345] Completed state [linux_system_network] at time 22:11:26.701717 duration_in_ms=851.549
2018-09-01 22:11:26,702 [salt.state       :1770][INFO    ][3345] Running state [ens2] at time 22:11:26.702050
2018-09-01 22:11:26,702 [salt.state       :1803][INFO    ][3345] Executing state network.managed for [ens2]
2018-09-01 22:11:26,727 [salt.state       :290 ][INFO    ][3345] {'interface': 'Added network interface.'}
2018-09-01 22:11:26,728 [salt.state       :1941][INFO    ][3345] Completed state [ens2] at time 22:11:26.728127 duration_in_ms=26.076
2018-09-01 22:11:26,728 [salt.state       :1770][INFO    ][3345] Running state [ens4] at time 22:11:26.728591
2018-09-01 22:11:26,729 [salt.state       :1803][INFO    ][3345] Executing state network.managed for [ens4]
2018-09-01 22:11:26,778 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['ifup', 'ens4'] in directory '/root'
2018-09-01 22:11:27,945 [salt.state       :290 ][INFO    ][3345] {'interface': 'Added network interface.', 'status': 'Interface ens4 is up'}
2018-09-01 22:11:27,945 [salt.state       :1941][INFO    ][3345] Completed state [ens4] at time 22:11:27.945679 duration_in_ms=1217.087
2018-09-01 22:11:27,945 [salt.state       :1770][INFO    ][3345] Running state [/etc/profile.d/proxy.sh] at time 22:11:27.945958
2018-09-01 22:11:27,946 [salt.state       :1803][INFO    ][3345] Executing state file.absent for [/etc/profile.d/proxy.sh]
2018-09-01 22:11:27,946 [salt.state       :290 ][INFO    ][3345] File /etc/profile.d/proxy.sh is not present
2018-09-01 22:11:27,946 [salt.state       :1941][INFO    ][3345] Completed state [/etc/profile.d/proxy.sh] at time 22:11:27.946766 duration_in_ms=0.807
2018-09-01 22:11:27,946 [salt.state       :1770][INFO    ][3345] Running state [/etc/apt/apt.conf.d/95proxies] at time 22:11:27.946956
2018-09-01 22:11:27,947 [salt.state       :1803][INFO    ][3345] Executing state file.absent for [/etc/apt/apt.conf.d/95proxies]
2018-09-01 22:11:27,947 [salt.state       :290 ][INFO    ][3345] File /etc/apt/apt.conf.d/95proxies is not present
2018-09-01 22:11:27,947 [salt.state       :1941][INFO    ][3345] Completed state [/etc/apt/apt.conf.d/95proxies] at time 22:11:27.947507 duration_in_ms=0.552
2018-09-01 22:11:27,948 [salt.state       :1770][INFO    ][3345] Running state [ntp] at time 22:11:27.948726
2018-09-01 22:11:27,948 [salt.state       :1803][INFO    ][3345] Executing state pkg.installed for [ntp]
2018-09-01 22:11:28,108 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-09-01 22:11:28,131 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'ntp'] in directory '/root'
2018-09-01 22:11:30,563 [salt.minion      :1307][INFO    ][3251] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221130546731
2018-09-01 22:11:30,578 [salt.minion      :1431][INFO    ][5905] Starting a new job with PID 5905
2018-09-01 22:11:30,600 [salt.minion      :1708][INFO    ][5905] Returning information for job: 20180901221130546731
2018-09-01 22:11:35,579 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 22:11:35,617 [salt.state       :290 ][INFO    ][3345] Made the following changes:
'ntp' changed from 'absent' to '1:4.2.8p4+dfsg-3ubuntu5.9'
'libopts25' changed from 'absent' to '1:5.18.7-3'

2018-09-01 22:11:35,632 [salt.state       :905 ][INFO    ][3345] Loading fresh modules for state activity
2018-09-01 22:11:35,666 [salt.state       :1941][INFO    ][3345] Completed state [ntp] at time 22:11:35.666427 duration_in_ms=7717.701
2018-09-01 22:11:35,670 [salt.state       :1770][INFO    ][3345] Running state [/etc/ntp.conf] at time 22:11:35.670090
2018-09-01 22:11:35,670 [salt.state       :1803][INFO    ][3345] Executing state file.managed for [/etc/ntp.conf]
2018-09-01 22:11:35,692 [salt.fileclient  :1215][INFO    ][3345] Fetching file from saltenv 'base', ** done ** 'ntp/files/ntp.conf'
2018-09-01 22:11:35,739 [salt.state       :290 ][INFO    ][3345] File changed:
--- 
+++ 
@@ -1,66 +1,25 @@
-# /etc/ntp.conf, configuration for ntpd; see ntp.conf(5) for help
 
-driftfile /var/lib/ntp/ntp.drift
 
-# Enable this if you want statistics to be logged.
-#statsdir /var/log/ntpstats/
+# ntpd will only synchronize your clock.
 
-statistics loopstats peerstats clockstats
-filegen loopstats file loopstats type day enable
-filegen peerstats file peerstats type day enable
-filegen clockstats file clockstats type day enable
+# For details, see:
+# - the ntp.conf man page
+# - http://support.ntp.org/bin/view/Support/GettingStarted
+# - https://wiki.archlinux.org/index.php/Network_Time_Protocol_daemon
 
-# Specify one or more NTP servers.
+# Associate to cloud NTP pool servers
+server 1.pool.ntp.org iburst
+server 0.pool.ntp.org
 
-# Use servers from the NTP Pool Project. Approved by Ubuntu Technical Board
-# on 2011-02-08 (LP: #104525). See http://www.pool.ntp.org/join.html for
-# more information.
-pool 0.ubuntu.pool.ntp.org iburst
-pool 1.ubuntu.pool.ntp.org iburst
-pool 2.ubuntu.pool.ntp.org iburst
-pool 3.ubuntu.pool.ntp.org iburst
+# Exchange time with everybody, but don't allow configuration.
+restrict -4 default kod nomodify notrap nopeer noquery
+restrict -6 default kod nomodify notrap nopeer noquery
 
-# Use Ubuntu's ntp server as a fallback.
-pool ntp.ubuntu.com
-
-# Access control configuration; see /usr/share/doc/ntp-doc/html/accopt.html for
-# details.  The web page <http://support.ntp.org/bin/view/Support/AccessRestrictions>
-# might also be helpful.
-#
-# Note that "restrict" applies to both servers and clients, so a configuration
-# that might be intended to block requests from certain clients could also end
-# up blocking replies from your own upstream servers.
-
-# By default, exchange time with everybody, but don't allow configuration.
-restrict -4 default kod notrap nomodify nopeer noquery limited
-restrict -6 default kod notrap nomodify nopeer noquery limited
-
-# Local users may interrogate the ntp server more closely.
+# Only allow read-only access from localhost
 restrict 127.0.0.1
 restrict ::1
 
-# Needed for adding pool entries
-restrict source notrap nomodify noquery
+# mode7 is required for collectd monitoring
 
-# Clients from this (example!) subnet have unlimited access, but only if
-# cryptographically authenticated.
-#restrict 192.168.123.0 mask 255.255.255.0 notrust
-
-
-# If you want to provide time to your local subnet, change the next line.
-# (Again, the address is an example only.)
-#broadcast 192.168.123.255
-
-# If you want to listen to time broadcasts on your local subnet, de-comment the
-# next lines.  Please do this only if you trust everybody on the network!
-#disable auth
-#broadcastclient
-
-#Changes recquired to use pps synchonisation as explained in documentation:
-#http://www.ntp.org/ntpfaq/NTP-s-config-adv.htm#AEN3918
-
-#server 127.127.8.1 mode 135 prefer    # Meinberg GPS167 with PPS
-#fudge 127.127.8.1 time1 0.0042        # relative to PPS for my hardware
-
-#server 127.127.22.1                   # ATOM(PPS)
-#fudge 127.127.22.1 flag3 1            # enable PPS API
+# Location of drift file
+driftfile /var/lib/ntp/ntp.drift

2018-09-01 22:11:35,740 [salt.state       :1941][INFO    ][3345] Completed state [/etc/ntp.conf] at time 22:11:35.740132 duration_in_ms=70.042
2018-09-01 22:11:36,059 [salt.state       :1770][INFO    ][3345] Running state [ntp] at time 22:11:36.059555
2018-09-01 22:11:36,059 [salt.state       :1803][INFO    ][3345] Executing state service.running for [ntp]
2018-09-01 22:11:36,060 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['systemctl', 'status', 'ntp.service', '-n', '0'] in directory '/root'
2018-09-01 22:11:36,074 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['systemctl', 'is-active', 'ntp.service'] in directory '/root'
2018-09-01 22:11:36,085 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['systemctl', 'is-enabled', 'ntp.service'] in directory '/root'
2018-09-01 22:11:36,098 [salt.state       :290 ][INFO    ][3345] The service ntp is already running
2018-09-01 22:11:36,099 [salt.state       :1941][INFO    ][3345] Completed state [ntp] at time 22:11:36.099247 duration_in_ms=39.692
2018-09-01 22:11:36,099 [salt.state       :1770][INFO    ][3345] Running state [ntp] at time 22:11:36.099551
2018-09-01 22:11:36,099 [salt.state       :1803][INFO    ][3345] Executing state service.mod_watch for [ntp]
2018-09-01 22:11:36,100 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['systemctl', 'is-active', 'ntp.service'] in directory '/root'
2018-09-01 22:11:36,114 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3345] Executing command ['systemd-run', '--scope', 'systemctl', 'restart', 'ntp.service'] in directory '/root'
2018-09-01 22:11:36,180 [salt.state       :290 ][INFO    ][3345] {'ntp': True}
2018-09-01 22:11:36,181 [salt.state       :1941][INFO    ][3345] Completed state [ntp] at time 22:11:36.181179 duration_in_ms=81.628
2018-09-01 22:11:36,187 [salt.minion      :1708][INFO    ][3345] Returning information for job: 20180901221024268410
2018-09-01 22:11:41,109 [salt.minion      :1307][INFO    ][3251] User sudo_ubuntu Executing command ssh.set_auth_key with jid 20180901221141092238
2018-09-01 22:11:41,135 [salt.minion      :1431][INFO    ][6777] Starting a new job with PID 6777
2018-09-01 22:11:41,153 [salt.minion      :1708][INFO    ][6777] Returning information for job: 20180901221141092238
2018-09-01 22:11:41,874 [salt.minion      :1307][INFO    ][3251] User sudo_ubuntu Executing command system.reboot with jid 20180901221141855449
2018-09-01 22:11:41,893 [salt.minion      :1431][INFO    ][6782] Starting a new job with PID 6782
2018-09-01 22:11:41,900 [salt.loader.192.168.11.2.int.module.cmdmod:395 ][INFO    ][6782] Executing command ['shutdown', '-r', 'now'] in directory '/root'
2018-09-01 22:11:42,029 [salt.utils.parsers:1051][WARNING ][3251] Minion received a SIGTERM. Exiting.
2018-09-01 22:11:42,029 [salt.cli.daemons :82  ][INFO    ][3251] Shutting down the Salt Minion
2018-09-01 22:11:43,033 [salt.minion      :1708][INFO    ][6782] Returning information for job: 20180901221141855449
2018-09-01 22:11:57,702 [salt.cli.daemons :293 ][INFO    ][1654] Setting up the Salt Minion "prx01.mcp-ovs-ha.local"
2018-09-01 22:11:58,462 [salt.cli.daemons :82  ][INFO    ][1654] Starting up the Salt Minion
2018-09-01 22:11:58,462 [salt.utils.event :1017][INFO    ][1654] Starting pull socket on /var/run/salt/minion/minion_event_ff902ec8d4_pull.ipc
2018-09-01 22:11:59,456 [salt.minion      :976 ][INFO    ][1654] Creating minion process manager
2018-09-01 22:12:00,664 [salt.loader.192.168.11.2.int.module.cmdmod:395 ][INFO    ][1654] Executing command ['date', '+%z'] in directory '/root'
2018-09-01 22:12:00,741 [salt.utils.schedule:568 ][INFO    ][1654] Updating job settings for scheduled job: __mine_interval
2018-09-01 22:12:00,744 [salt.minion      :1107][INFO    ][1654] Added mine.update to scheduler
2018-09-01 22:12:00,758 [salt.minion      :1965][INFO    ][1654] Minion is starting as user 'root'
2018-09-01 22:12:00,773 [salt.minion      :2324][INFO    ][1654] Minion is ready to receive requests!
2018-09-01 22:12:03,780 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221202627128
2018-09-01 22:12:03,805 [salt.minion      :1431][INFO    ][1792] Starting a new job with PID 1792
2018-09-01 22:12:03,887 [salt.minion      :1708][INFO    ][1792] Returning information for job: 20180901221202627128
2018-09-01 22:12:23,382 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command test.ping with jid 20180901221223373922
2018-09-01 22:12:23,401 [salt.minion      :1431][INFO    ][1797] Starting a new job with PID 1797
2018-09-01 22:12:23,472 [salt.minion      :1708][INFO    ][1797] Returning information for job: 20180901221223373922
2018-09-01 22:12:24,136 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command pkg.upgrade with jid 20180901221224128382
2018-09-01 22:12:24,157 [salt.minion      :1431][INFO    ][1802] Starting a new job with PID 1802
2018-09-01 22:12:24,328 [salt.loader.192.168.11.2.int.module.cmdmod:395 ][INFO    ][1802] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 22:12:25,044 [salt.loader.192.168.11.2.int.module.cmdmod:395 ][INFO    ][1802] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'dist-upgrade'] in directory '/root'
2018-09-01 22:12:29,244 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221229236054
2018-09-01 22:12:29,264 [salt.minion      :1431][INFO    ][1842] Starting a new job with PID 1842
2018-09-01 22:12:29,286 [salt.minion      :1708][INFO    ][1842] Returning information for job: 20180901221229236054
2018-09-01 22:12:39,418 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221239411245
2018-09-01 22:12:39,441 [salt.minion      :1431][INFO    ][1879] Starting a new job with PID 1879
2018-09-01 22:12:39,468 [salt.minion      :1708][INFO    ][1879] Returning information for job: 20180901221239411245
2018-09-01 22:12:49,545 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221249539023
2018-09-01 22:12:49,570 [salt.minion      :1431][INFO    ][1896] Starting a new job with PID 1896
2018-09-01 22:12:49,585 [salt.minion      :1708][INFO    ][1896] Returning information for job: 20180901221249539023
2018-09-01 22:12:59,752 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221259747836
2018-09-01 22:12:59,775 [salt.minion      :1431][INFO    ][1920] Starting a new job with PID 1920
2018-09-01 22:12:59,864 [salt.minion      :1708][INFO    ][1920] Returning information for job: 20180901221259747836
2018-09-01 22:13:09,869 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221309862012
2018-09-01 22:13:09,891 [salt.minion      :1431][INFO    ][1931] Starting a new job with PID 1931
2018-09-01 22:13:09,905 [salt.minion      :1708][INFO    ][1931] Returning information for job: 20180901221309862012
2018-09-01 22:13:20,054 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221320047537
2018-09-01 22:13:20,075 [salt.minion      :1431][INFO    ][1954] Starting a new job with PID 1954
2018-09-01 22:13:20,123 [salt.minion      :1708][INFO    ][1954] Returning information for job: 20180901221320047537
2018-09-01 22:13:30,227 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221330220870
2018-09-01 22:13:30,247 [salt.minion      :1431][INFO    ][1971] Starting a new job with PID 1971
2018-09-01 22:13:30,405 [salt.minion      :1708][INFO    ][1971] Returning information for job: 20180901221330220870
2018-09-01 22:13:40,358 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221340351888
2018-09-01 22:13:40,379 [salt.minion      :1431][INFO    ][2003] Starting a new job with PID 2003
2018-09-01 22:13:40,393 [salt.minion      :1708][INFO    ][2003] Returning information for job: 20180901221340351888
2018-09-01 22:13:50,430 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221350425072
2018-09-01 22:13:50,454 [salt.minion      :1431][INFO    ][2022] Starting a new job with PID 2022
2018-09-01 22:13:50,468 [salt.minion      :1708][INFO    ][2022] Returning information for job: 20180901221350425072
2018-09-01 22:14:00,571 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221400565809
2018-09-01 22:14:00,587 [salt.minion      :1431][INFO    ][2094] Starting a new job with PID 2094
2018-09-01 22:14:00,602 [salt.minion      :1708][INFO    ][2094] Returning information for job: 20180901221400565809
2018-09-01 22:14:10,655 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221410651093
2018-09-01 22:14:10,673 [salt.minion      :1431][INFO    ][2218] Starting a new job with PID 2218
2018-09-01 22:14:10,721 [salt.minion      :1708][INFO    ][2218] Returning information for job: 20180901221410651093
2018-09-01 22:14:20,757 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221420753114
2018-09-01 22:14:20,774 [salt.minion      :1431][INFO    ][2356] Starting a new job with PID 2356
2018-09-01 22:14:20,790 [salt.minion      :1708][INFO    ][2356] Returning information for job: 20180901221420753114
2018-09-01 22:14:30,787 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221430782986
2018-09-01 22:14:30,808 [salt.minion      :1431][INFO    ][2872] Starting a new job with PID 2872
2018-09-01 22:14:30,821 [salt.minion      :1708][INFO    ][2872] Returning information for job: 20180901221430782986
2018-09-01 22:14:40,294 [salt.loader.192.168.11.2.int.module.cmdmod:395 ][INFO    ][1802] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 22:14:40,331 [salt.minion      :1708][INFO    ][1802] Returning information for job: 20180901221224128382
2018-09-01 22:14:41,392 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command test.ping with jid 20180901221441389573
2018-09-01 22:14:41,418 [salt.minion      :1431][INFO    ][3175] Starting a new job with PID 3175
2018-09-01 22:14:41,435 [salt.minion      :1708][INFO    ][3175] Returning information for job: 20180901221441389573
2018-09-01 22:14:41,607 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command state.sls with jid 20180901221441602581
2018-09-01 22:14:41,626 [salt.minion      :1431][INFO    ][3180] Starting a new job with PID 3180
2018-09-01 22:14:45,454 [salt.state       :905 ][INFO    ][3180] Loading fresh modules for state activity
2018-09-01 22:14:45,535 [salt.fileclient  :1215][INFO    ][3180] Fetching file from saltenv 'base', ** done ** 'keepalived/init.sls'
2018-09-01 22:14:45,564 [salt.fileclient  :1215][INFO    ][3180] Fetching file from saltenv 'base', ** done ** 'keepalived/cluster.sls'
2018-09-01 22:14:46,715 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221446711482
2018-09-01 22:14:46,732 [salt.minion      :1431][INFO    ][3192] Starting a new job with PID 3192
2018-09-01 22:14:46,746 [salt.minion      :1708][INFO    ][3192] Returning information for job: 20180901221446711482
2018-09-01 22:14:47,627 [salt.state       :1770][INFO    ][3180] Running state [keepalived] at time 22:14:47.627298
2018-09-01 22:14:47,628 [salt.state       :1803][INFO    ][3180] Executing state pkg.installed for [keepalived]
2018-09-01 22:14:47,629 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3180] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 22:14:47,985 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3180] Executing command ['apt-cache', '-q', 'policy', 'keepalived'] in directory '/root'
2018-09-01 22:14:48,115 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3180] Executing command ['apt-get', '-q', 'update'] in directory '/root'
2018-09-01 22:14:50,290 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3180] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-09-01 22:14:50,315 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3180] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'keepalived'] in directory '/root'
2018-09-01 22:14:56,875 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901221456870959
2018-09-01 22:14:56,897 [salt.minion      :1431][INFO    ][4199] Starting a new job with PID 4199
2018-09-01 22:14:56,948 [salt.minion      :1708][INFO    ][4199] Returning information for job: 20180901221456870959
2018-09-01 22:15:00,149 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3180] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 22:15:00,187 [salt.state       :290 ][INFO    ][3180] Made the following changes:
'libsnmp30' changed from 'absent' to '5.7.3+dfsg-1ubuntu4.1'
'libsensors4' changed from 'absent' to '1:3.4.0-2'
'libsnmp-base' changed from 'absent' to '5.7.3+dfsg-1ubuntu4.1'
'keepalived' changed from 'absent' to '1:1.3.9-1ubuntu0.18.04.1~cloud0'
'ipvsadm' changed from 'absent' to '1:1.28-3'
'libnl-route-3-200' changed from 'absent' to '3.2.27-1ubuntu0.16.04.1'

2018-09-01 22:15:00,238 [salt.state       :905 ][INFO    ][3180] Loading fresh modules for state activity
2018-09-01 22:15:00,263 [salt.state       :1941][INFO    ][3180] Completed state [keepalived] at time 22:15:00.263272 duration_in_ms=12635.974
2018-09-01 22:15:00,267 [salt.state       :1770][INFO    ][3180] Running state [lsof] at time 22:15:00.267299
2018-09-01 22:15:00,267 [salt.state       :1803][INFO    ][3180] Executing state pkg.installed for [lsof]
2018-09-01 22:15:00,717 [salt.state       :290 ][INFO    ][3180] All specified packages are already installed
2018-09-01 22:15:00,718 [salt.state       :1941][INFO    ][3180] Completed state [lsof] at time 22:15:00.717956 duration_in_ms=450.656
2018-09-01 22:15:00,743 [salt.state       :1770][INFO    ][3180] Running state [/etc/keepalived/keepalived.conf] at time 22:15:00.743331
2018-09-01 22:15:00,744 [salt.state       :1803][INFO    ][3180] Executing state file.managed for [/etc/keepalived/keepalived.conf]
2018-09-01 22:15:00,771 [salt.fileclient  :1215][INFO    ][3180] Fetching file from saltenv 'base', ** done ** 'keepalived/files/keepalived.conf'
2018-09-01 22:15:00,807 [salt.state       :290 ][INFO    ][3180] File changed:
New file
2018-09-01 22:15:00,808 [salt.state       :1941][INFO    ][3180] Completed state [/etc/keepalived/keepalived.conf] at time 22:15:00.808017 duration_in_ms=64.688
2018-09-01 22:15:00,808 [salt.state       :1770][INFO    ][3180] Running state [/usr/local/bin/vrrp_script_check_pidof.sh] at time 22:15:00.808416
2018-09-01 22:15:00,808 [salt.state       :1803][INFO    ][3180] Executing state file.managed for [/usr/local/bin/vrrp_script_check_pidof.sh]
2018-09-01 22:15:00,950 [salt.fileclient  :1215][INFO    ][3180] Fetching file from saltenv 'base', ** done ** 'keepalived/files/vrrp_script_check_pidof.sh'
2018-09-01 22:15:00,957 [salt.state       :290 ][INFO    ][3180] File changed:
New file
2018-09-01 22:15:00,957 [salt.state       :1941][INFO    ][3180] Completed state [/usr/local/bin/vrrp_script_check_pidof.sh] at time 22:15:00.957687 duration_in_ms=149.269
2018-09-01 22:15:00,978 [salt.state       :1770][INFO    ][3180] Running state [keepalived] at time 22:15:00.978649
2018-09-01 22:15:00,979 [salt.state       :1803][INFO    ][3180] Executing state service.running for [keepalived]
2018-09-01 22:15:00,980 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3180] Executing command ['systemctl', 'status', 'keepalived.service', '-n', '0'] in directory '/root'
2018-09-01 22:15:01,004 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3180] Executing command ['systemctl', 'is-active', 'keepalived.service'] in directory '/root'
2018-09-01 22:15:01,018 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3180] Executing command ['systemctl', 'is-enabled', 'keepalived.service'] in directory '/root'
2018-09-01 22:15:01,032 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3180] Executing command ['systemd-run', '--scope', 'systemctl', 'start', 'keepalived.service'] in directory '/root'
2018-09-01 22:15:01,171 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3180] Executing command ['systemctl', 'is-active', 'keepalived.service'] in directory '/root'
2018-09-01 22:15:01,190 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3180] Executing command ['systemctl', 'is-enabled', 'keepalived.service'] in directory '/root'
2018-09-01 22:15:01,202 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3180] Executing command ['systemctl', 'is-enabled', 'keepalived.service'] in directory '/root'
2018-09-01 22:15:01,217 [salt.state       :290 ][INFO    ][3180] {'keepalived': True}
2018-09-01 22:15:01,218 [salt.state       :1941][INFO    ][3180] Completed state [keepalived] at time 22:15:01.218028 duration_in_ms=239.379
2018-09-01 22:15:01,219 [salt.minion      :1708][INFO    ][3180] Returning information for job: 20180901221441602581
2018-09-01 22:18:48,618 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command pillar.get with jid 20180901221848613405
2018-09-01 22:18:48,641 [salt.minion      :1431][INFO    ][4646] Starting a new job with PID 4646
2018-09-01 22:18:48,649 [salt.minion      :1708][INFO    ][4646] Returning information for job: 20180901221848613405
2018-09-01 22:31:13,391 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command state.sls with jid 20180901223113386300
2018-09-01 22:31:13,417 [salt.minion      :1431][INFO    ][4969] Starting a new job with PID 4969
2018-09-01 22:31:18,498 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901223118491130
2018-09-01 22:31:18,542 [salt.minion      :1431][INFO    ][4974] Starting a new job with PID 4974
2018-09-01 22:31:18,556 [salt.minion      :1708][INFO    ][4974] Returning information for job: 20180901223118491130
2018-09-01 22:31:19,608 [salt.state       :905 ][INFO    ][4969] Loading fresh modules for state activity
2018-09-01 22:31:20,104 [salt.fileclient  :1215][INFO    ][4969] Fetching file from saltenv 'base', ** done ** 'memcached/init.sls'
2018-09-01 22:31:20,135 [salt.fileclient  :1215][INFO    ][4969] Fetching file from saltenv 'base', ** done ** 'memcached/server.sls'
2018-09-01 22:31:20,156 [salt.fileclient  :1215][INFO    ][4969] Fetching file from saltenv 'base', ** done ** 'memcached/map.jinja'
2018-09-01 22:31:20,697 [salt.state       :1770][INFO    ][4969] Running state [memcached] at time 22:31:20.697200
2018-09-01 22:31:20,697 [salt.state       :1803][INFO    ][4969] Executing state pkg.installed for [memcached]
2018-09-01 22:31:20,698 [salt.loaded.int.module.cmdmod:395 ][INFO    ][4969] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 22:31:21,032 [salt.loaded.int.module.cmdmod:395 ][INFO    ][4969] Executing command ['apt-cache', '-q', 'policy', 'memcached'] in directory '/root'
2018-09-01 22:31:21,140 [salt.loaded.int.module.cmdmod:395 ][INFO    ][4969] Executing command ['apt-get', '-q', 'update'] in directory '/root'
2018-09-01 22:31:22,912 [salt.loaded.int.module.cmdmod:395 ][INFO    ][4969] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-09-01 22:31:22,933 [salt.loaded.int.module.cmdmod:395 ][INFO    ][4969] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'memcached'] in directory '/root'
2018-09-01 22:31:28,069 [salt.loaded.int.module.cmdmod:395 ][INFO    ][4969] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 22:31:28,109 [salt.state       :290 ][INFO    ][4969] Made the following changes:
'memcached' changed from 'absent' to '1.4.25-2ubuntu1.4'

2018-09-01 22:31:28,126 [salt.state       :905 ][INFO    ][4969] Loading fresh modules for state activity
2018-09-01 22:31:28,161 [salt.state       :1941][INFO    ][4969] Completed state [memcached] at time 22:31:28.161278 duration_in_ms=7464.077
2018-09-01 22:31:28,166 [salt.state       :1770][INFO    ][4969] Running state [python-memcache] at time 22:31:28.166056
2018-09-01 22:31:28,169 [salt.state       :1803][INFO    ][4969] Executing state pkg.installed for [python-memcache]
2018-09-01 22:31:28,641 [salt.loaded.int.module.cmdmod:395 ][INFO    ][4969] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-09-01 22:31:28,661 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901223128656973
2018-09-01 22:31:28,673 [salt.minion      :1431][INFO    ][5790] Starting a new job with PID 5790
2018-09-01 22:31:28,673 [salt.loaded.int.module.cmdmod:395 ][INFO    ][4969] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'python-memcache'] in directory '/root'
2018-09-01 22:31:28,684 [salt.minion      :1708][INFO    ][5790] Returning information for job: 20180901223128656973
2018-09-01 22:31:30,781 [salt.loaded.int.module.cmdmod:395 ][INFO    ][4969] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 22:31:30,820 [salt.state       :290 ][INFO    ][4969] Made the following changes:
'python-memcache' changed from 'absent' to '1.57+fixed-1~u16.04+mcp1'

2018-09-01 22:31:30,836 [salt.state       :905 ][INFO    ][4969] Loading fresh modules for state activity
2018-09-01 22:31:30,859 [salt.state       :1941][INFO    ][4969] Completed state [python-memcache] at time 22:31:30.859693 duration_in_ms=2693.635
2018-09-01 22:31:30,867 [salt.state       :1770][INFO    ][4969] Running state [/etc/memcached.conf] at time 22:31:30.866978
2018-09-01 22:31:30,867 [salt.state       :1803][INFO    ][4969] Executing state file.managed for [/etc/memcached.conf]
2018-09-01 22:31:30,890 [salt.fileclient  :1215][INFO    ][4969] Fetching file from saltenv 'base', ** done ** 'memcached/files/memcached.conf'
2018-09-01 22:31:30,914 [salt.state       :290 ][INFO    ][4969] File changed:
--- 
+++ 
@@ -1,11 +1,10 @@
+
 # memcached default config file
 # 2003 - Jay Bonci <jaybonci@debian.org>
-# This configuration file is read by the start-memcached script provided as
-# part of the Debian GNU/Linux distribution.
+# This configuration file is read by the start-memcached script provided as part of the Debian GNU/Linux distribution. 
 
 # Run memcached as a daemon. This command is implied, and is not needed for the
-# daemon to run. See the README.Debian that comes with this package for more
-# information.
+# daemon to run. See the README.Debian that comes with this package for more information.
 -d
 
 # Log memcached's output to /var/log/memcached
@@ -18,13 +17,13 @@
 # -vv
 
 # Start with a cap of 64 megs of memory. It's reasonable, and the daemon default
-# Note that the daemon will grow to this size, but does not start out holding this much
-# memory
+# Note that the daemon will grow to this size, but does not start out holding this much memory
 -m 64
 
 # Default connection port is 11211
 -p 11211
 
+-U 11211
 # Run the daemon as root. The start-memcached will default to running as root if no
 # -u command is present in this config file
 -u memcache
@@ -32,10 +31,12 @@
 # Specify which IP address to listen on. The default is to listen on all IP addresses
 # This parameter is one of the only security measures that memcached has, so make sure
 # it's listening on a firewalled interface.
--l 127.0.0.1
+-l 0.0.0.0
 
 # Limit the number of simultaneous incoming connections. The daemon default is 1024
 # -c 1024
+# Mirantis
+-c 8192
 
 # Lock down all paged memory. Consult with the README and homepage before you do this
 # -k
@@ -45,3 +46,9 @@
 
 # Maximize core file limit
 # -r
+
+# Number of threads to use to process incoming requests.
+-t 1
+
+# Set size of each slab page. Default value for this parameter is 1m, minimum is 1k, max is 128m.
+-I 1m

2018-09-01 22:31:30,919 [salt.state       :1941][INFO    ][4969] Completed state [/etc/memcached.conf] at time 22:31:30.919767 duration_in_ms=52.79
2018-09-01 22:31:31,237 [salt.state       :1770][INFO    ][4969] Running state [memcached] at time 22:31:31.237866
2018-09-01 22:31:31,238 [salt.state       :1803][INFO    ][4969] Executing state service.running for [memcached]
2018-09-01 22:31:31,238 [salt.loaded.int.module.cmdmod:395 ][INFO    ][4969] Executing command ['systemctl', 'status', 'memcached.service', '-n', '0'] in directory '/root'
2018-09-01 22:31:31,251 [salt.loaded.int.module.cmdmod:395 ][INFO    ][4969] Executing command ['systemctl', 'is-active', 'memcached.service'] in directory '/root'
2018-09-01 22:31:31,263 [salt.loaded.int.module.cmdmod:395 ][INFO    ][4969] Executing command ['systemctl', 'is-enabled', 'memcached.service'] in directory '/root'
2018-09-01 22:31:31,275 [salt.state       :290 ][INFO    ][4969] The service memcached is already running
2018-09-01 22:31:31,276 [salt.state       :1941][INFO    ][4969] Completed state [memcached] at time 22:31:31.276112 duration_in_ms=38.246
2018-09-01 22:31:31,276 [salt.state       :1770][INFO    ][4969] Running state [memcached] at time 22:31:31.276698
2018-09-01 22:31:31,277 [salt.state       :1803][INFO    ][4969] Executing state service.mod_watch for [memcached]
2018-09-01 22:31:31,278 [salt.loaded.int.module.cmdmod:395 ][INFO    ][4969] Executing command ['systemctl', 'is-active', 'memcached.service'] in directory '/root'
2018-09-01 22:31:31,294 [salt.loaded.int.module.cmdmod:395 ][INFO    ][4969] Executing command ['systemd-run', '--scope', 'systemctl', 'restart', 'memcached.service'] in directory '/root'
2018-09-01 22:31:31,315 [salt.state       :290 ][INFO    ][4969] {'memcached': True}
2018-09-01 22:31:31,316 [salt.state       :1941][INFO    ][4969] Completed state [memcached] at time 22:31:31.316610 duration_in_ms=39.911
2018-09-01 22:31:31,319 [salt.minion      :1708][INFO    ][4969] Returning information for job: 20180901223113386300
2018-09-01 23:12:01,777 [salt.utils.schedule:1375][INFO    ][1654] Running scheduled job: __mine_interval
2018-09-01 23:26:03,769 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command state.sls with jid 20180901232603765218
2018-09-01 23:26:03,792 [salt.minion      :1431][INFO    ][7294] Starting a new job with PID 7294
2018-09-01 23:26:07,642 [salt.state       :905 ][INFO    ][7294] Loading fresh modules for state activity
2018-09-01 23:26:08,032 [salt.fileclient  :1215][INFO    ][7294] Fetching file from saltenv 'base', ** done ** 'apache/init.sls'
2018-09-01 23:26:08,066 [salt.fileclient  :1215][INFO    ][7294] Fetching file from saltenv 'base', ** done ** 'apache/server/init.sls'
2018-09-01 23:26:08,442 [salt.fileclient  :1215][INFO    ][7294] Fetching file from saltenv 'base', ** done ** 'apache/server/service/init.sls'
2018-09-01 23:26:08,515 [salt.fileclient  :1215][INFO    ][7294] Fetching file from saltenv 'base', ** done ** 'apache/server/service/modules.sls'
2018-09-01 23:26:08,572 [salt.fileclient  :1215][INFO    ][7294] Fetching file from saltenv 'base', ** done ** 'apache/server/service/mpm.sls'
2018-09-01 23:26:08,633 [salt.fileclient  :1215][INFO    ][7294] Fetching file from saltenv 'base', ** done ** 'apache/server/site.sls'
2018-09-01 23:26:08,721 [salt.fileclient  :1215][INFO    ][7294] Fetching file from saltenv 'base', ** done ** 'apache/server/users.sls'
2018-09-01 23:26:08,800 [salt.fileclient  :1215][INFO    ][7294] Fetching file from saltenv 'base', ** done ** 'apache/server/robots.sls'
2018-09-01 23:26:08,857 [salt.fileclient  :1215][INFO    ][7294] Fetching file from saltenv 'base', ** done ** 'horizon/init.sls'
2018-09-01 23:26:08,880 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232608872791
2018-09-01 23:26:08,884 [salt.fileclient  :1215][INFO    ][7294] Fetching file from saltenv 'base', ** done ** 'horizon/server/init.sls'
2018-09-01 23:26:08,900 [salt.minion      :1431][INFO    ][7321] Starting a new job with PID 7321
2018-09-01 23:26:08,908 [salt.fileclient  :1215][INFO    ][7294] Fetching file from saltenv 'base', ** done ** 'horizon/server/service.sls'
2018-09-01 23:26:08,916 [salt.minion      :1708][INFO    ][7321] Returning information for job: 20180901232608872791
2018-09-01 23:26:08,979 [salt.fileclient  :1215][INFO    ][7294] Fetching file from saltenv 'base', ** done ** 'horizon/server/plugin.sls'
2018-09-01 23:26:09,586 [salt.state       :1770][INFO    ][7294] Running state [apache2] at time 23:26:09.586254
2018-09-01 23:26:09,586 [salt.state       :1803][INFO    ][7294] Executing state pkg.installed for [apache2]
2018-09-01 23:26:09,587 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7294] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 23:26:09,928 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7294] Executing command ['apt-cache', '-q', 'policy', 'apache2'] in directory '/root'
2018-09-01 23:26:10,041 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7294] Executing command ['apt-get', '-q', 'update'] in directory '/root'
2018-09-01 23:26:13,440 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7294] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-09-01 23:26:13,467 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7294] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'apache2'] in directory '/root'
2018-09-01 23:26:19,073 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232619066644
2018-09-01 23:26:19,094 [salt.minion      :1431][INFO    ][8250] Starting a new job with PID 8250
2018-09-01 23:26:19,110 [salt.minion      :1708][INFO    ][8250] Returning information for job: 20180901232619066644
2018-09-01 23:26:26,359 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7294] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 23:26:26,396 [salt.state       :290 ][INFO    ][7294] Made the following changes:
'apache2-data' changed from 'absent' to '2.4.18-2ubuntu3.9'
'httpd-cgi' changed from 'absent' to '1'
'apache2-utils' changed from 'absent' to '2.4.18-2ubuntu3.9'
'httpd' changed from 'absent' to '1'
'ssl-cert' changed from 'absent' to '1.0.37'
'apache2' changed from 'absent' to '2.4.18-2ubuntu3.9'

2018-09-01 23:26:26,412 [salt.state       :905 ][INFO    ][7294] Loading fresh modules for state activity
2018-09-01 23:26:26,437 [salt.state       :1941][INFO    ][7294] Completed state [apache2] at time 23:26:26.437573 duration_in_ms=16851.319
2018-09-01 23:26:26,446 [salt.state       :1770][INFO    ][7294] Running state [openssl] at time 23:26:26.446900
2018-09-01 23:26:26,447 [salt.state       :1803][INFO    ][7294] Executing state pkg.installed for [openssl]
2018-09-01 23:26:26,910 [salt.state       :290 ][INFO    ][7294] All specified packages are already installed
2018-09-01 23:26:26,911 [salt.state       :1941][INFO    ][7294] Completed state [openssl] at time 23:26:26.911187 duration_in_ms=464.286
2018-09-01 23:26:26,912 [salt.state       :1770][INFO    ][7294] Running state [a2enmod ssl] at time 23:26:26.912611
2018-09-01 23:26:26,912 [salt.state       :1803][INFO    ][7294] Executing state cmd.run for [a2enmod ssl]
2018-09-01 23:26:26,913 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7294] Executing command 'a2enmod ssl' in directory '/root'
2018-09-01 23:26:26,963 [salt.state       :290 ][INFO    ][7294] {'pid': 8769, 'retcode': 0, 'stderr': '', 'stdout': 'Considering dependency setenvif for ssl:\nModule setenvif already enabled\nConsidering dependency mime for ssl:\nModule mime already enabled\nConsidering dependency socache_shmcb for ssl:\nEnabling module socache_shmcb.\nEnabling module ssl.\nSee /usr/share/doc/apache2/README.Debian.gz on how to configure SSL and create self-signed certificates.\nTo activate the new configuration, you need to run:\n  service apache2 restart'}
2018-09-01 23:26:26,964 [salt.state       :1941][INFO    ][7294] Completed state [a2enmod ssl] at time 23:26:26.964283 duration_in_ms=51.673
2018-09-01 23:26:26,965 [salt.state       :1770][INFO    ][7294] Running state [a2enmod rewrite] at time 23:26:26.965217
2018-09-01 23:26:26,965 [salt.state       :1803][INFO    ][7294] Executing state cmd.run for [a2enmod rewrite]
2018-09-01 23:26:26,966 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7294] Executing command 'a2enmod rewrite' in directory '/root'
2018-09-01 23:26:27,009 [salt.state       :290 ][INFO    ][7294] {'pid': 8782, 'retcode': 0, 'stderr': '', 'stdout': 'Enabling module rewrite.\nTo activate the new configuration, you need to run:\n  service apache2 restart'}
2018-09-01 23:26:27,009 [salt.state       :1941][INFO    ][7294] Completed state [a2enmod rewrite] at time 23:26:27.009608 duration_in_ms=44.391
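The two `cmd.run` states above (`a2enmod ssl`, `a2enmod rewrite`) invoke `a2enmod` directly. A minimal SLS sketch under stated assumptions — the `unless` guards are hypothetical additions (not visible in the log) that would make the states idempotent by skipping `a2enmod` once the module is already enabled:

```yaml
# Hypothetical reconstruction of the module-enable states.
# The 'unless' guards are assumptions; on this first run they would
# fail (modules not yet enabled), so a2enmod still executes, which
# is consistent with the 'Enabling module ...' stdout logged above.
a2enmod ssl:
  cmd.run:
    - unless: apachectl -M | grep -q ssl_module

a2enmod rewrite:
  cmd.run:
    - unless: apachectl -M | grep -q rewrite_module
```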
2018-09-01 23:26:27,015 [salt.state       :1770][INFO    ][7294] Running state [/etc/apache2/mods-available/mpm_prefork.conf] at time 23:26:27.015396
2018-09-01 23:26:27,015 [salt.state       :1803][INFO    ][7294] Executing state file.managed for [/etc/apache2/mods-available/mpm_prefork.conf]
2018-09-01 23:26:27,038 [salt.fileclient  :1215][INFO    ][7294] Fetching file from saltenv 'base', ** done ** 'apache/files/mpm/mpm_prefork.conf'
2018-09-01 23:26:27,076 [salt.state       :290 ][INFO    ][7294] File changed:
--- 
+++ 
@@ -6,11 +6,12 @@
 # MaxConnectionsPerChild: maximum number of requests a server process serves
 
 <IfModule mpm_prefork_module>
-	StartServers			 5
-	MinSpareServers		  5
-	MaxSpareServers		 10
-	MaxRequestWorkers	  150
-	MaxConnectionsPerChild   0
+    StartServers            5
+    MinSpareServers         5
+    MaxSpareServers         10
+    MaxRequestWorkers       150
+    MaxConnectionsPerChild  0
+    ServerLimit             150
 </IfModule>
 
-# vim: syntax=apache ts=4 sw=4 sts=4 sr noet
+# vim: syntax=apache ts=4 sw=4 sts=4 sr et

2018-09-01 23:26:27,076 [salt.state       :1941][INFO    ][7294] Completed state [/etc/apache2/mods-available/mpm_prefork.conf] at time 23:26:27.076622 duration_in_ms=61.226
2018-09-01 23:26:27,076 [salt.state       :1770][INFO    ][7294] Running state [/etc/apache2/mods-enabled/mpm_worker.load] at time 23:26:27.076823
2018-09-01 23:26:27,077 [salt.state       :1803][INFO    ][7294] Executing state file.absent for [/etc/apache2/mods-enabled/mpm_worker.load]
2018-09-01 23:26:27,077 [salt.state       :290 ][INFO    ][7294] File /etc/apache2/mods-enabled/mpm_worker.load is not present
2018-09-01 23:26:27,077 [salt.state       :1941][INFO    ][7294] Completed state [/etc/apache2/mods-enabled/mpm_worker.load] at time 23:26:27.077437 duration_in_ms=0.614
2018-09-01 23:26:27,077 [salt.state       :1770][INFO    ][7294] Running state [/etc/apache2/mods-enabled/mpm_event.load] at time 23:26:27.077613
2018-09-01 23:26:27,077 [salt.state       :1803][INFO    ][7294] Executing state file.absent for [/etc/apache2/mods-enabled/mpm_event.load]
2018-09-01 23:26:27,078 [salt.state       :290 ][INFO    ][7294] {'removed': '/etc/apache2/mods-enabled/mpm_event.load'}
2018-09-01 23:26:27,078 [salt.state       :1941][INFO    ][7294] Completed state [/etc/apache2/mods-enabled/mpm_event.load] at time 23:26:27.078224 duration_in_ms=0.612
2018-09-01 23:26:27,078 [salt.state       :1770][INFO    ][7294] Running state [a2enmod mpm_prefork] at time 23:26:27.078750
2018-09-01 23:26:27,078 [salt.state       :1803][INFO    ][7294] Executing state cmd.run for [a2enmod mpm_prefork]
2018-09-01 23:26:27,079 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7294] Executing command 'a2enmod mpm_prefork' in directory '/root'
2018-09-01 23:26:27,118 [salt.state       :290 ][INFO    ][7294] {'pid': 8795, 'retcode': 0, 'stderr': '', 'stdout': 'Considering conflict mpm_event for mpm_prefork:\nConsidering conflict mpm_worker for mpm_prefork:\nEnabling module mpm_prefork.\nTo activate the new configuration, you need to run:\n  service apache2 restart'}
2018-09-01 23:26:27,119 [salt.state       :1941][INFO    ][7294] Completed state [a2enmod mpm_prefork] at time 23:26:27.119402 duration_in_ms=40.651
2018-09-01 23:26:27,120 [salt.state       :1770][INFO    ][7294] Running state [/etc/apache2/mods-enabled/mpm_worker.conf] at time 23:26:27.120492
2018-09-01 23:26:27,121 [salt.state       :1803][INFO    ][7294] Executing state file.absent for [/etc/apache2/mods-enabled/mpm_worker.conf]
2018-09-01 23:26:27,122 [salt.state       :290 ][INFO    ][7294] File /etc/apache2/mods-enabled/mpm_worker.conf is not present
2018-09-01 23:26:27,123 [salt.state       :1941][INFO    ][7294] Completed state [/etc/apache2/mods-enabled/mpm_worker.conf] at time 23:26:27.123275 duration_in_ms=2.783
2018-09-01 23:26:27,124 [salt.state       :1770][INFO    ][7294] Running state [/etc/apache2/mods-enabled/mpm_event.conf] at time 23:26:27.124004
2018-09-01 23:26:27,124 [salt.state       :1803][INFO    ][7294] Executing state file.absent for [/etc/apache2/mods-enabled/mpm_event.conf]
2018-09-01 23:26:27,125 [salt.state       :290 ][INFO    ][7294] {'removed': '/etc/apache2/mods-enabled/mpm_event.conf'}
2018-09-01 23:26:27,126 [salt.state       :1941][INFO    ][7294] Completed state [/etc/apache2/mods-enabled/mpm_event.conf] at time 23:26:27.126405 duration_in_ms=2.4
2018-09-01 23:26:27,132 [salt.state       :1770][INFO    ][7294] Running state [apache_server_service_task] at time 23:26:27.132513
2018-09-01 23:26:27,132 [salt.state       :1803][INFO    ][7294] Executing state test.show_notification for [apache_server_service_task]
2018-09-01 23:26:27,133 [salt.state       :290 ][INFO    ][7294] Running apache.server.service
2018-09-01 23:26:27,133 [salt.state       :1941][INFO    ][7294] Completed state [apache_server_service_task] at time 23:26:27.133621 duration_in_ms=1.108
2018-09-01 23:26:27,134 [salt.state       :1770][INFO    ][7294] Running state [/etc/apache2/ports.conf] at time 23:26:27.134262
2018-09-01 23:26:27,134 [salt.state       :1803][INFO    ][7294] Executing state file.managed for [/etc/apache2/ports.conf]
2018-09-01 23:26:27,151 [salt.fileclient  :1215][INFO    ][7294] Fetching file from saltenv 'base', ** done ** 'apache/files/ports.conf'
2018-09-01 23:26:27,186 [salt.state       :290 ][INFO    ][7294] File changed:
--- 
+++ 
@@ -2,14 +2,4 @@
 # have to change the VirtualHost statement in
 # /etc/apache2/sites-enabled/000-default.conf
 
-Listen 80
-
-<IfModule ssl_module>
-	Listen 443
-</IfModule>
-
-<IfModule mod_gnutls.c>
-	Listen 443
-</IfModule>
-
 # vim: syntax=apache ts=4 sw=4 sts=4 sr noet

2018-09-01 23:26:27,189 [salt.state       :1941][INFO    ][7294] Completed state [/etc/apache2/ports.conf] at time 23:26:27.189103 duration_in_ms=54.84
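The `file.managed` runs above fetch their sources from the `base` saltenv (`apache/files/mpm/mpm_prefork.conf`, `apache/files/ports.conf`) and rewrite the targets, producing the logged diffs. A sketch of the `ports.conf` state, assuming conventional ownership and mode (the `source:` path is taken from the "Fetching file from saltenv 'base'" line; `user`/`group`/`mode` are assumptions):

```yaml
# Hypothetical file.managed state matching the ports.conf log entries.
# source is derived from the fileclient fetch line; ownership and
# permissions below are assumed defaults, not read from the log.
/etc/apache2/ports.conf:
  file.managed:
    - source: salt://apache/files/ports.conf
    - user: root
    - group: root
    - mode: 644
```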
2018-09-01 23:26:27,189 [salt.state       :1770][INFO    ][7294] Running state [/etc/apache2/conf-available/security.conf] at time 23:26:27.189650
2018-09-01 23:26:27,190 [salt.state       :1803][INFO    ][7294] Executing state file.managed for [/etc/apache2/conf-available/security.conf]
2018-09-01 23:26:27,205 [salt.fileclient  :1215][INFO    ][7294] Fetching file from saltenv 'base', ** done ** 'apache/files/security.conf'
2018-09-01 23:26:27,298 [salt.state       :290 ][INFO    ][7294] File changed:
--- 
+++ 
@@ -1,73 +1,14 @@
-#
-# Disable access to the entire file system except for the directories that
-# are explicitly allowed later.
-#
-# This currently breaks the configurations that come with some web application
-# Debian packages.
-#
-#<Directory />
-#   AllowOverride None
-#   Require all denied
-#</Directory>
+ServerSignature Off
+TraceEnable Off
+ServerTokens Prod
+<DirectoryMatch "/\.svn">
+    Require all denied
+</DirectoryMatch>
 
+<DirectoryMatch "/\.git">
+    Require all denied
+</DirectoryMatch>
 
-# Changing the following options will not really affect the security of the
-# server, but might make attacks slightly more difficult in some cases.
-
-#
-# ServerTokens
-# This directive configures what you return as the Server HTTP response
-# Header. The default is 'Full' which sends information about the OS-Type
-# and compiled in modules.
-# Set to one of:  Full | OS | Minimal | Minor | Major | Prod
-# where Full conveys the most information, and Prod the least.
-#ServerTokens Minimal
-ServerTokens OS
-#ServerTokens Full
-
-#
-# Optionally add a line containing the server version and virtual host
-# name to server-generated pages (internal error documents, FTP directory
-# listings, mod_status and mod_info output etc., but not CGI generated
-# documents or custom error documents).
-# Set to "EMail" to also include a mailto: link to the ServerAdmin.
-# Set to one of:  On | Off | EMail
-#ServerSignature Off
-ServerSignature On
-
-#
-# Allow TRACE method
-#
-# Set to "extended" to also reflect the request body (only for testing and
-# diagnostic purposes).
-#
-# Set to one of:  On | Off | extended
-TraceEnable Off
-#TraceEnable On
-
-#
-# Forbid access to version control directories
-#
-# If you use version control systems in your document root, you should
-# probably deny access to their directories. For example, for subversion:
-#
-#<DirectoryMatch "/\.svn">
-#   Require all denied
-#</DirectoryMatch>
-
-#
-# Setting this header will prevent MSIE from interpreting files as something
-# else than declared by the content type in the HTTP headers.
-# Requires mod_headers to be enabled.
-#
-#Header set X-Content-Type-Options: "nosniff"
-
-#
-# Setting this header will prevent other sites from embedding pages from this
-# site as frames. This defends against clickjacking attacks.
-# Requires mod_headers to be enabled.
-#
-#Header set X-Frame-Options: "sameorigin"
-
-
-# vim: syntax=apache ts=4 sw=4 sts=4 sr noet
+<DirectoryMatch "/\.hg">
+    Require all denied
+</DirectoryMatch>

2018-09-01 23:26:27,301 [salt.state       :1941][INFO    ][7294] Completed state [/etc/apache2/conf-available/security.conf] at time 23:26:27.301365 duration_in_ms=111.713
2018-09-01 23:26:27,314 [salt.state       :1770][INFO    ][7294] Running state [/etc/apache2/sites-enabled/000-default.conf] at time 23:26:27.314021
2018-09-01 23:26:27,314 [salt.state       :1803][INFO    ][7294] Executing state file.absent for [/etc/apache2/sites-enabled/000-default.conf]
2018-09-01 23:26:27,314 [salt.state       :290 ][INFO    ][7294] {'removed': '/etc/apache2/sites-enabled/000-default.conf'}
2018-09-01 23:26:27,314 [salt.state       :1941][INFO    ][7294] Completed state [/etc/apache2/sites-enabled/000-default.conf] at time 23:26:27.314748 duration_in_ms=0.727
2018-09-01 23:26:27,316 [salt.state       :1770][INFO    ][7294] Running state [apache2] at time 23:26:27.316105
2018-09-01 23:26:27,316 [salt.state       :1803][INFO    ][7294] Executing state service.running for [apache2]
2018-09-01 23:26:27,316 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7294] Executing command ['systemctl', 'status', 'apache2.service', '-n', '0'] in directory '/root'
2018-09-01 23:26:27,334 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7294] Executing command ['systemctl', 'is-active', 'apache2.service'] in directory '/root'
2018-09-01 23:26:27,353 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7294] Executing command ['systemctl', 'is-enabled', 'apache2.service'] in directory '/root'
2018-09-01 23:26:27,374 [salt.state       :290 ][INFO    ][7294] The service apache2 is already running
2018-09-01 23:26:27,375 [salt.state       :1941][INFO    ][7294] Completed state [apache2] at time 23:26:27.375017 duration_in_ms=58.911
2018-09-01 23:26:27,375 [salt.state       :1770][INFO    ][7294] Running state [apache2] at time 23:26:27.375841
2018-09-01 23:26:27,376 [salt.state       :1803][INFO    ][7294] Executing state service.mod_watch for [apache2]
2018-09-01 23:26:27,378 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7294] Executing command ['systemctl', 'is-active', 'apache2.service'] in directory '/root'
2018-09-01 23:26:27,392 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7294] Executing command ['systemd-run', '--scope', 'systemctl', 'reload', 'apache2.service'] in directory '/root'
2018-09-01 23:26:27,592 [salt.state       :290 ][INFO    ][7294] {'apache2': True}
2018-09-01 23:26:27,593 [salt.state       :1941][INFO    ][7294] Completed state [apache2] at time 23:26:27.593479 duration_in_ms=217.638
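The pair of `apache2` states above — `service.running` reporting "already running", immediately followed by `service.mod_watch` issuing `systemctl reload apache2.service` — is the signature of a `watch` requisite firing after the config files changed earlier in the run. A sketch of a service state that would behave this way; the watched file list is an assumption based on the `file.managed` states that reported changes above:

```yaml
# Hypothetical service state matching the running-check plus
# mod_watch reload seen above. 'reload: True' makes a triggered
# watch issue 'systemctl reload' rather than a full restart,
# which matches the systemd-run reload command in the log.
apache2:
  service.running:
    - enable: True
    - reload: True
    - watch:
      - file: /etc/apache2/ports.conf
      - file: /etc/apache2/conf-available/security.conf
      - file: /etc/apache2/mods-available/mpm_prefork.conf
```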
2018-09-01 23:26:27,595 [salt.state       :1770][INFO    ][7294] Running state [/etc/apache2/conf-enabled/security.conf] at time 23:26:27.595098
2018-09-01 23:26:27,595 [salt.state       :1803][INFO    ][7294] Executing state file.symlink for [/etc/apache2/conf-enabled/security.conf]
2018-09-01 23:26:27,598 [salt.state       :290 ][INFO    ][7294] {'new': '/etc/apache2/conf-enabled/security.conf'}
2018-09-01 23:26:27,599 [salt.state       :1941][INFO    ][7294] Completed state [/etc/apache2/conf-enabled/security.conf] at time 23:26:27.599407 duration_in_ms=4.309
2018-09-01 23:26:27,600 [salt.state       :1770][INFO    ][7294] Running state [openstack-dashboard] at time 23:26:27.600094
2018-09-01 23:26:27,600 [salt.state       :1803][INFO    ][7294] Executing state pkg.installed for [openstack-dashboard]
2018-09-01 23:26:27,629 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7294] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-09-01 23:26:27,657 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7294] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'openstack-dashboard'] in directory '/root'
2018-09-01 23:26:29,266 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232629259730
2018-09-01 23:26:29,287 [salt.minion      :1431][INFO    ][8901] Starting a new job with PID 8901
2018-09-01 23:26:29,302 [salt.minion      :1708][INFO    ][8901] Returning information for job: 20180901232629259730
2018-09-01 23:26:39,466 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232639456740
2018-09-01 23:26:39,486 [salt.minion      :1431][INFO    ][8998] Starting a new job with PID 8998
2018-09-01 23:26:39,500 [salt.minion      :1708][INFO    ][8998] Returning information for job: 20180901232639456740
2018-09-01 23:26:49,685 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232649677329
2018-09-01 23:26:49,710 [salt.minion      :1431][INFO    ][9187] Starting a new job with PID 9187
2018-09-01 23:26:49,727 [salt.minion      :1708][INFO    ][9187] Returning information for job: 20180901232649677329
2018-09-01 23:26:59,891 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232659884234
2018-09-01 23:26:59,911 [salt.minion      :1431][INFO    ][9404] Starting a new job with PID 9404
2018-09-01 23:26:59,927 [salt.minion      :1708][INFO    ][9404] Returning information for job: 20180901232659884234
2018-09-01 23:27:09,961 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232709954170
2018-09-01 23:27:10,314 [salt.minion      :1431][INFO    ][9651] Starting a new job with PID 9651
2018-09-01 23:27:10,334 [salt.minion      :1708][INFO    ][9651] Returning information for job: 20180901232709954170
2018-09-01 23:27:20,142 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232720134742
2018-09-01 23:27:20,161 [salt.minion      :1431][INFO    ][9883] Starting a new job with PID 9883
2018-09-01 23:27:20,180 [salt.minion      :1708][INFO    ][9883] Returning information for job: 20180901232720134742
2018-09-01 23:27:30,352 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232730344309
2018-09-01 23:27:30,371 [salt.minion      :1431][INFO    ][10162] Starting a new job with PID 10162
2018-09-01 23:27:30,394 [salt.minion      :1708][INFO    ][10162] Returning information for job: 20180901232730344309
2018-09-01 23:27:40,560 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232740553565
2018-09-01 23:27:40,581 [salt.minion      :1431][INFO    ][10438] Starting a new job with PID 10438
2018-09-01 23:27:40,601 [salt.minion      :1708][INFO    ][10438] Returning information for job: 20180901232740553565
2018-09-01 23:27:50,763 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232750758196
2018-09-01 23:27:50,787 [salt.minion      :1431][INFO    ][10608] Starting a new job with PID 10608
2018-09-01 23:27:50,805 [salt.minion      :1708][INFO    ][10608] Returning information for job: 20180901232750758196
2018-09-01 23:28:00,967 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232800959711
2018-09-01 23:28:00,992 [salt.minion      :1431][INFO    ][11295] Starting a new job with PID 11295
2018-09-01 23:28:01,016 [salt.minion      :1708][INFO    ][11295] Returning information for job: 20180901232800959711
2018-09-01 23:28:11,187 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232811181712
2018-09-01 23:28:11,202 [salt.minion      :1431][INFO    ][11369] Starting a new job with PID 11369
2018-09-01 23:28:11,228 [salt.minion      :1708][INFO    ][11369] Returning information for job: 20180901232811181712
2018-09-01 23:28:21,409 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232821401517
2018-09-01 23:28:21,429 [salt.minion      :1431][INFO    ][11693] Starting a new job with PID 11693
2018-09-01 23:28:21,452 [salt.minion      :1708][INFO    ][11693] Returning information for job: 20180901232821401517
2018-09-01 23:28:31,627 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232831619612
2018-09-01 23:28:31,646 [salt.minion      :1431][INFO    ][12046] Starting a new job with PID 12046
2018-09-01 23:28:31,663 [salt.minion      :1708][INFO    ][12046] Returning information for job: 20180901232831619612
2018-09-01 23:28:41,841 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232841833815
2018-09-01 23:28:41,859 [salt.minion      :1431][INFO    ][12387] Starting a new job with PID 12387
2018-09-01 23:28:41,873 [salt.minion      :1708][INFO    ][12387] Returning information for job: 20180901232841833815
2018-09-01 23:28:52,057 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232852050086
2018-09-01 23:28:52,084 [salt.minion      :1431][INFO    ][12546] Starting a new job with PID 12546
2018-09-01 23:28:52,103 [salt.minion      :1708][INFO    ][12546] Returning information for job: 20180901232852050086
2018-09-01 23:29:02,285 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232902277717
2018-09-01 23:29:02,307 [salt.minion      :1431][INFO    ][12555] Starting a new job with PID 12555
2018-09-01 23:29:02,323 [salt.minion      :1708][INFO    ][12555] Returning information for job: 20180901232902277717
2018-09-01 23:29:12,508 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232912501134
2018-09-01 23:29:12,538 [salt.minion      :1431][INFO    ][12564] Starting a new job with PID 12564
2018-09-01 23:29:12,555 [salt.minion      :1708][INFO    ][12564] Returning information for job: 20180901232912501134
2018-09-01 23:29:19,194 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7294] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 23:29:19,240 [salt.state       :290 ][INFO    ][7294] Made the following changes:
'python-routes' changed from 'absent' to '2.4.1-1~u16.04+mcp2'
'python-retrying' changed from 'absent' to '1.3.3-1'
'libjs-angular-file-upload' changed from 'absent' to '12.0.4+dfsg1-2.1~u16.04+mcp2'
'python-os-service-types' changed from 'absent' to '1.1.0-1.0~u16.04+mcp1'
'python-kombu' changed from 'absent' to '4.1.0-1~u16.04+mcp1'
'python-oslo.concurrency' changed from 'absent' to '3.25.0-1.0~u16.04+mcp2'
'python-xstatic-angular-fileupload' changed from 'absent' to '12.0.4.0+dfsg1-1.1~u16.04+mcp2'
'python-sqlparse' changed from 'absent' to '0.2.2-1~u16.04+mcp1'
'python-pint' changed from 'absent' to '0.6-1ubuntu1'
'python-monotonic' changed from 'absent' to '0.6-2'
'python2.7-pymongo' changed from 'absent' to '1'
'python-openstacksdk' changed from 'absent' to '0.11.3+repack-1.0~u16.04+mcp2'
'python-deprecation' changed from 'absent' to '1.0.1-1~u16.04+mcp2'
'python2.7-bson' changed from 'absent' to '1'
'libtiff5' changed from 'absent' to '4.0.6-1ubuntu0.4'
'python-secretstorage' changed from 'absent' to '2.1.3-1'
'libjs-jsencrypt' changed from 'absent' to '2.3.0+dfsg2-1~u16.04+mcp2'
'python-glanceclient' changed from 'absent' to '1:2.10.0-1.0~u16.04+mcp3'
'python-formencode' changed from 'absent' to '1.3.0-0ubuntu5'
'twitter-bootstrap' changed from 'absent' to '1'
'libjs-term.js' changed from 'absent' to '0.0.7-1~u16.04+mcp2'
'python-cachetools' changed from 'absent' to '2.0.0-2.0~u16.04+mcp1'
'python-xstatic-jasmine' changed from 'absent' to '2.4.1.1+fixed1-1~u16.04+mcp1'
'python-semantic-version' changed from 'absent' to '2.3.1-1'
'python-blinker' changed from 'absent' to '1.3.dfsg2-1build1'
'python-django-common' changed from 'absent' to '1:1.11.7-1~u16.04+mcp2'
'python-roman' changed from 'absent' to '2.0.0-2'
'python-prettytable' changed from 'absent' to '0.7.2-3'
'python-bs4' changed from 'absent' to '4.6.0-1~u16.04+mcp1'
'python2.7-pymongo-ext' changed from 'absent' to '1'
'python-tenacity' changed from 'absent' to '4.8.0-1.0~u16.04+mcp1'
'python-unittest2' changed from 'absent' to '1.1.0-6.1'
'python-setuptools' changed from 'absent' to '39.0.1-2~cloud0'
'python2.7-django-appconf' changed from 'absent' to '1'
'python-stevedore' changed from 'absent' to '1:1.25.0-1~u16.04+mcp2'
'docutils-doc' changed from 'absent' to '0.12+dfsg-1'
'python-dbus' changed from 'absent' to '1.2.0-3'
'python-gridfs' changed from 'absent' to '3.2-1build1'
'python-fixtures' changed from 'absent' to '3.0.0-1.1~u16.04+mcp2'
'python-xstatic-jquery.tablesorter' changed from 'absent' to '2.14.5.1-2.0~u16.04+mcp1'
'libjs-twitter-bootstrap' changed from 'absent' to '2.0.2+dfsg-9'
'python-testtools' changed from 'absent' to '2.3.0-1.0~u16.04+mcp1'
'libjs-jquery-cookie' changed from 'absent' to '10-2ubuntu2'
'python-anyjson' changed from 'absent' to '0.3.3-1build1'
'libjs-angularjs-smart-table' changed from 'absent' to '1.4.13-1~u16.04+mcp2'
'python-xstatic-hogan' changed from 'absent' to '2.0.0.2-1'
'python-dogpile.cache' changed from 'absent' to '0.6.2-1.1~u16.04+mcp2'
'python-compressor' changed from 'absent' to '1'
'python-dnspython' changed from 'absent' to '1.14.0-3.1~u16.04+mcp2'
'libjs-spin.js' changed from 'absent' to '1.2.8+dfsg2-1'
'fonts-roboto-fontface' changed from 'absent' to '0.5.0-2~u16.04+mcp2'
'python-pil' changed from 'absent' to '3.1.2-0ubuntu1.1'
'docutils-common' changed from 'absent' to '0.12+dfsg-1'
'python2.7-lxml' changed from 'absent' to '1'
'python-pika' changed from 'absent' to '0.10.0-1'
'libpaper-utils' changed from 'absent' to '1.1.24+nmu4ubuntu1'
'python-fasteners' changed from 'absent' to '0.12.0-2ubuntu1'
'python-babel' changed from 'absent' to '2.3.4+dfsg.1-2.1~u16.04+mcp2'
'python-osc-lib' changed from 'absent' to '1.9.0-1.0~u16.04+mcp1'
'liblcms2-2' changed from 'absent' to '2.6-3ubuntu2'
'python2.7-simplejson' changed from 'absent' to '1'
'python-extras' changed from 'absent' to '1.0.0-2.0~u16.04+mcp1'
'python-xstatic-bootstrap-scss' changed from 'absent' to '3.3.7.1-2~u16.04+mcp3'
'python-xstatic-term.js' changed from 'absent' to '0.0.7.0-2~u16.04+mcp2'
'python-bson-ext' changed from 'absent' to '3.2-1build1'
'python-scgi' changed from 'absent' to '1.13-1.1build1'
'python2.7-pil' changed from 'absent' to '1'
'python-repoze.lru' changed from 'absent' to '0.6-6'
'python-posix-ipc' changed from 'absent' to '0.9.8-2build2'
'formencode-i18n' changed from 'absent' to '1.3.0-0ubuntu5'
'python-xstatic-angular-bootstrap' changed from 'absent' to '2.2.0.0-1.1~u16.04+mcp2'
'python2.7-testtools' changed from 'absent' to '1'
'docutils' changed from 'absent' to '1'
'python-django-pyscss' changed from 'absent' to '2.0.2-4'
'python-xstatic-bootstrap-datepicker' changed from 'absent' to '1.3.1.1-1~u16.04+mcp1'
'python2.7-dbus' changed from 'absent' to '1'
'python-oslo.middleware' changed from 'absent' to '3.34.0-1.0~u16.04+mcp2'
'fonts-materialdesignicons-webfont' changed from 'absent' to '1.4.57-1.1~u16.04+mcp2'
'python-xstatic-angular' changed from 'absent' to '1.5.8.0-1.1~u16.04+mcp2'
'python-pillow' changed from 'absent' to '1'
'python2.7-cinderclient' changed from 'absent' to '1'
'libpaperg' changed from 'absent' to '1'
'python2.7-netifaces' changed from 'absent' to '1'
'python-xstatic-mdi' changed from 'absent' to '1.4.57.0-1.1~u16.04+mcp2'
'python-xstatic-jquery' changed from 'absent' to '1.10.2.1-2~u16.04+mcp2'
'python-oslo.context' changed from 'absent' to '1:2.20.0-1.0~u16.04+mcp1'
'python-neutronclient' changed from 'absent' to '1:6.7.0-1.0~u16.04+mcp12'
'python-pymongo-ext' changed from 'absent' to '3.2-1build1'
'python-xstatic-angular-schema-form' changed from 'absent' to '0.8.13.0-1.1~u16.04+mcp2'
'python2.7-pyinotify' changed from 'absent' to '1'
'libjs-jquery-tablesorter' changed from 'absent' to '10-2ubuntu2'
'python-pyparsing' changed from 'absent' to '2.1.10+dfsg1-1.1~u16.04+mcp2'
'python-babel-localedata' changed from 'absent' to '2.3.4+dfsg.1-2.1~u16.04+mcp2'
'python-positional' changed from 'absent' to '1.1.1-3.1~u16.04+mcp2'
'python-appconf' changed from 'absent' to '1'
'python-cmd2' changed from 'absent' to '0.6.8-1'
'libjs-magic-search' changed from 'absent' to '0.2.5-1'
'python-distribute' changed from 'absent' to '1'
'python-xstatic-tv4' changed from 'absent' to '1.2.7.0-1.1~u16.04+mcp2'
'python-oslo-log' changed from 'absent' to '1'
'python-keystoneclient' changed from 'absent' to '1:3.15.0-1.0~u16.04+mcp2'
'python-xstatic-font-awesome' changed from 'absent' to '4.7.0.0-3~u16.04+mcp2'
'python-rjsmin' changed from 'absent' to '1.0.12+dfsg1-2ubuntu1'
'python-pygments' changed from 'absent' to '2.2.0+dfsg-1~u16.04+mcp2'
'python-pathlib' changed from 'absent' to '1.0.1-2'
'python-iso8601' changed from 'absent' to '0.1.11-1'
'python-xstatic-jsencrypt' changed from 'absent' to '2.3.1.1-2~u16.04+mcp2'
'python-jsonpatch' changed from 'absent' to '1.21-1~u16.04+mcp1'
'python-xstatic-d3' changed from 'absent' to '3.5.17.0-2~u16.04+mcp2'
'libwebpmux1' changed from 'absent' to '0.4.4-1'
'python-xstatic-roboto-fontface' changed from 'absent' to '0.5.0.0-2~u16.04+mcp2'
'python-oslo.policy' changed from 'absent' to '1.33.2-1.0~u16.04+mcp3'
'python-xstatic' changed from 'absent' to '1.0.0-4'
'python-paste' changed from 'absent' to '2.0.3+dfsg-4.1~u16.04+mcp1'
'python-xstatic-jquery-ui' changed from 'absent' to '1.12.0.1+debian+dfsg3-2~u16.04+mcp2'
'python-lxml' changed from 'absent' to '3.5.0-1build1'
'python-oslo.config' changed from 'absent' to '1:5.2.0-1.0~u16.04+mcp5'
'python-futurist' changed from 'absent' to '1.6.0-1.0~u16.04+mcp1'
'libpaper1' changed from 'absent' to '1.1.24+nmu4ubuntu1'
'python-webob' changed from 'absent' to '1:1.7.2-1~u16.04+mcp2'
'python2.7-gi' changed from 'absent' to '1'
'python-linecache2' changed from 'absent' to '1.0.0-2'
'python-xstatic-objectpath' changed from 'absent' to '1.2.1.0-2.1~u16.04+mcp2'
'python-pastedeploy-tpl' changed from 'absent' to '1.5.2-1'
'python-oauthlib' changed from 'absent' to '1.0.3-1'
'python-mimeparse' changed from 'absent' to '0.1.4-1build1'
'python-gi' changed from 'absent' to '3.20.0-0ubuntu1'
'python-xstatic-spin' changed from 'absent' to '1.2.8.0+dfsg1-1'
'python2.7-django-compressor' changed from 'absent' to '1'
'python-xstatic-angular-lrdragndrop' changed from 'absent' to '1.0.2.2-1'
'python-contextlib2' changed from 'absent' to '0.5.1-1'
'python-xstatic-bootswatch' changed from 'absent' to '3.3.7.0-2~u16.04+mcp2'
'python-xstatic-jquery-migrate' changed from 'absent' to '1.2.1.1+dfsg1-1'
'libjs-jquery.quicksearch' changed from 'absent' to '2.0.4-1'
'python-novaclient' changed from 'absent' to '2:9.1.1-1~u16.04+mcp6'
'python-oslo.utils' changed from 'absent' to '3.35.0-1.0~u16.04+mcp8'
'python-pika-pool' changed from 'absent' to '0.1.3-1ubuntu1'
'python-django' changed from 'absent' to '1:1.11.7-1~u16.04+mcp2'
'libjs-twitter-bootstrap-datepicker' changed from 'absent' to '1.3.1+dfsg1-1'
'python-debtcollector' changed from 'absent' to '1.3.0-2'
'python2.7-iso8601' changed from 'absent' to '1'
'python-bson' changed from 'absent' to '3.2-1build1'
'python-simplejson' changed from 'absent' to '3.8.1-1ubuntu2'
'fonts-font-awesome' changed from 'absent' to '4.7.0~dfsg-3~u16.04+mcp2'
'python-docutils' changed from 'absent' to '0.12+dfsg-1'
'python-openid' changed from 'absent' to '2.2.5-6'
'python-pastedeploy' changed from 'absent' to '1.5.2-1'
'python2.7-cmd2' changed from 'absent' to '1'
'libjs-jquery-ui' changed from 'absent' to '1.12.1+dfsg-5~u16.04+mcp2'
'python-tz' changed from 'absent' to '2014.10~dfsg1-0ubuntu2'
'python-pastescript' changed from 'absent' to '1.7.5-3build1'
'python-cliff' changed from 'absent' to '2.8.0-1~u16.04+mcp2'
'python-oslo.i18n' changed from 'absent' to '3.19.0-1.0~u16.04+mcp6'
'python-munch' changed from 'absent' to '2.2.0-1.0~u16.04+mcp1'
'python-xstatic-magic-search' changed from 'absent' to '0.2.5.1-1'
'python-appdirs' changed from 'absent' to '1.4.0-2'
'python2.7-pathlib' changed from 'absent' to '1'
'python-statsd' changed from 'absent' to '3.2.1-2~u16.04+mcp2'
'libjs-d3' changed from 'absent' to '3.5.17-2~u16.04+mcp2'
'python-keyring' changed from 'absent' to '8.5.1-1.1~u16.04+mcp2'
'python-django-appconf' changed from 'absent' to '1.0.1-4'
'python-xstatic-jquery.quicksearch' changed from 'absent' to '2.0.4.1-1'
'python-xstatic-smart-table' changed from 'absent' to '1.4.13.2-2~u16.04+mcp1'
'python-oslo-utils' changed from 'absent' to '1'
'python-oslo.serialization' changed from 'absent' to '2.24.0-1.0~u16.04+mcp1'
'python-django-babel' changed from 'absent' to '0.5.1-1.1~u16.04+mcp2'
'python-unicodecsv' changed from 'absent' to '0.14.1-1'
'python-wrapt' changed from 'absent' to '1.8.0-5build2'
'python-rfc3986' changed from 'absent' to '0.3.1-2.1~u16.04+mcp2'
'python-eventlet' changed from 'absent' to '0.20.0-4~u16.04+mcp2'
'python-django-horizon' changed from 'absent' to '3:13.0.1-4~u16.04+mcp46'
'python2.7-pyparsing' changed from 'absent' to '1'
'python-oslo.log' changed from 'absent' to '3.36.0-1.0~u16.04+mcp6'
'python-pyscss' changed from 'absent' to '1.3.4-5'
'python-pyinotify' changed from 'absent' to '0.9.6-1.1~u16.04+mcp2'
'libjpeg-turbo8' changed from 'absent' to '1.4.2-0ubuntu3.1'
'libjs-angularjs' changed from 'absent' to '1.5.10-1.1~u16.04+mcp2'
'libjpeg8' changed from 'absent' to '8c-2ubuntu8'
'python-amqp' changed from 'absent' to '2.2.1-1~exp1~u16.04+mcp1'
'libjs-bootswatch' changed from 'absent' to '3.3.7+dfsg2-1~u16.04+mcp2'
'libwebp5' changed from 'absent' to '0.4.4-1'
'python-vine' changed from 'absent' to '1.1.3+dfsg-2~u16.04+mcp3'
'python-django-compressor' changed from 'absent' to '2.1-1~u16.04+mcp2'
'python-netifaces' changed from 'absent' to '0.10.4-0.1build2'
'python-decorator' changed from 'absent' to '4.0.6-1'
'python-osprofiler' changed from 'absent' to '1.15.2-1.0~u16.04+mcp3'
'python-os-client-config' changed from 'absent' to '1.29.0-1.0~u16.04+mcp2'
'python-oslo.messaging' changed from 'absent' to '5.35.1-1.0~u16.04+mcp16'
'python-warlock' changed from 'absent' to '1.2.0-2.0~u16.04+mcp1'
'python-tempita' changed from 'absent' to '0.5.2-1build1'
'python-keyrings.alt' changed from 'absent' to '1.1.1-1'
'openstack-dashboard' changed from 'absent' to '3:13.0.1-4~u16.04+mcp46'
'python-json-pointer' changed from 'absent' to '1.9-3'
'libjs-lrdragndrop' changed from 'absent' to '1.0.2-2'
'python-html5lib' changed from 'absent' to '0.999-4'
'python-swiftclient' changed from 'absent' to '1:3.4.0-1~u16.04+mcp2'
'python-jwt' changed from 'absent' to '1.3.0-1ubuntu0.1'
'python2.7-gridfs' changed from 'absent' to '1'
'python-greenlet' changed from 'absent' to '0.4.12-2.0~u16.04+mcp1'
'python-oslo.service' changed from 'absent' to '1.29.0-1.0~u16.04+mcp1'
'python-rcssmin' changed from 'absent' to '1.0.6-1ubuntu1'
'python-ceilometerclient' changed from 'absent' to '2.9.0-2~u16.04+mcp1'
'python-csscompressor' changed from 'absent' to '0.9.4-2'
'python-traceback2' changed from 'absent' to '1.4.0-3'
'python-jmespath' changed from 'absent' to '0.9.0-2'
'python-keystoneauth1' changed from 'absent' to '3.4.0-1.0~u16.04+mcp7'
'libjs-angular-gettext' changed from 'absent' to '2.3.8-2~u16.04+mcp2'
'python-pymongo' changed from 'absent' to '3.2-1build1'
'libjs-jquery-metadata' changed from 'absent' to '10-2ubuntu2'
'libjs-rickshaw' changed from 'absent' to '1.5.1.dfsg-1'
'python-xstatic-rickshaw' changed from 'absent' to '1.5.0.2-2'
'python-cinderclient' changed from 'absent' to '1:3.5.0-1.0~u16.04+mcp1'
'python-requestsexceptions' changed from 'absent' to '1.3.0-3~u16.04+mcp2'
'python-oslo-context' changed from 'absent' to '1'
'python2.7-bson-ext' changed from 'absent' to '1'
'python-xstatic-angular-gettext' changed from 'absent' to '2.3.8.0-2~u16.04+mcp2'
'libjbig0' changed from 'absent' to '2.1-3.1'

2018-09-01 23:29:19,261 [salt.state       :905 ][INFO    ][7294] Loading fresh modules for state activity
2018-09-01 23:29:19,285 [salt.state       :1941][INFO    ][7294] Completed state [openstack-dashboard] at time 23:29:19.285262 duration_in_ms=171685.168
2018-09-01 23:29:19,289 [salt.state       :1770][INFO    ][7294] Running state [python-lesscpy] at time 23:29:19.289475
2018-09-01 23:29:19,289 [salt.state       :1803][INFO    ][7294] Executing state pkg.installed for [python-lesscpy]
2018-09-01 23:29:20,627 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7294] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-09-01 23:29:20,658 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7294] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'python-lesscpy'] in directory '/root'
2018-09-01 23:29:22,567 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232922562529
2018-09-01 23:29:22,583 [salt.minion      :1431][INFO    ][12706] Starting a new job with PID 12706
2018-09-01 23:29:22,598 [salt.minion      :1708][INFO    ][12706] Returning information for job: 20180901232922562529
2018-09-01 23:29:24,381 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7294] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 23:29:24,428 [salt.state       :290 ][INFO    ][7294] Made the following changes:
'python-lesscpy' changed from 'absent' to '0.10-1'

2018-09-01 23:29:24,449 [salt.state       :905 ][INFO    ][7294] Loading fresh modules for state activity
2018-09-01 23:29:24,565 [salt.state       :1941][INFO    ][7294] Completed state [python-lesscpy] at time 23:29:24.565804 duration_in_ms=5276.327
2018-09-01 23:29:24,569 [salt.state       :1770][INFO    ][7294] Running state [python-memcache] at time 23:29:24.569580
2018-09-01 23:29:24,569 [salt.state       :1803][INFO    ][7294] Executing state pkg.installed for [python-memcache]
2018-09-01 23:29:25,023 [salt.state       :290 ][INFO    ][7294] All specified packages are already installed
2018-09-01 23:29:25,023 [salt.state       :1941][INFO    ][7294] Completed state [python-memcache] at time 23:29:25.023568 duration_in_ms=453.988
2018-09-01 23:29:25,024 [salt.state       :1770][INFO    ][7294] Running state [gettext-base] at time 23:29:25.023990
2018-09-01 23:29:25,024 [salt.state       :1803][INFO    ][7294] Executing state pkg.installed for [gettext-base]
2018-09-01 23:29:25,029 [salt.state       :290 ][INFO    ][7294] All specified packages are already installed
2018-09-01 23:29:25,030 [salt.state       :1941][INFO    ][7294] Completed state [gettext-base] at time 23:29:25.030090 duration_in_ms=6.1
2018-09-01 23:29:25,030 [salt.state       :1770][INFO    ][7294] Running state [openstack-dashboard-apache] at time 23:29:25.030909
2018-09-01 23:29:25,031 [salt.state       :1803][INFO    ][7294] Executing state pkg.purged for [openstack-dashboard-apache]
2018-09-01 23:29:25,040 [salt.state       :290 ][INFO    ][7294] All specified packages are already absent
2018-09-01 23:29:25,040 [salt.state       :1941][INFO    ][7294] Completed state [openstack-dashboard-apache] at time 23:29:25.040703 duration_in_ms=9.794
2018-09-01 23:29:25,042 [salt.state       :1770][INFO    ][7294] Running state [/etc/openstack-dashboard/local_settings.py] at time 23:29:25.042436
2018-09-01 23:29:25,042 [salt.state       :1803][INFO    ][7294] Executing state file.managed for [/etc/openstack-dashboard/local_settings.py]
2018-09-01 23:29:25,060 [salt.fileclient  :1215][INFO    ][7294] Fetching file from saltenv 'base', ** done ** 'horizon/files/local_settings/queens_settings.py'
2018-09-01 23:29:25,103 [salt.fileclient  :1215][INFO    ][7294] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_local_settings.py'
2018-09-01 23:29:25,147 [salt.fileclient  :1215][INFO    ][7294] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_horizon_settings.py'
2018-09-01 23:29:25,175 [salt.fileclient  :1215][INFO    ][7294] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_keystone_settings.py'
2018-09-01 23:29:25,202 [salt.fileclient  :1215][INFO    ][7294] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_nova_settings.py'
2018-09-01 23:29:25,219 [salt.fileclient  :1215][INFO    ][7294] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_glance_settings.py'
2018-09-01 23:29:25,238 [salt.fileclient  :1215][INFO    ][7294] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_neutron_settings.py'
2018-09-01 23:29:25,257 [salt.fileclient  :1215][INFO    ][7294] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_heat_settings.py'
2018-09-01 23:29:25,276 [salt.fileclient  :1215][INFO    ][7294] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_websso_settings.py'
2018-09-01 23:29:25,304 [salt.fileclient  :1215][INFO    ][7294] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_ssl_settings.py'
2018-09-01 23:29:25,312 [salt.state       :290 ][INFO    ][7294] File changed:
--- 
+++ 
@@ -1,173 +1,83 @@
-# -*- coding: utf-8 -*-
-
 import os
 
+from django.utils.translation import pgettext_lazy
 from django.utils.translation import ugettext_lazy as _
-
-from horizon.utils import secret_key
-
-from openstack_dashboard.settings import HORIZON_CONFIG
-
-DEBUG = True
-
-# This setting controls whether or not compression is enabled. Disabling
-# compression makes Horizon considerably slower, but makes it much easier
-# to debug JS and CSS changes
-#COMPRESS_ENABLED = not DEBUG
-
-# This setting controls whether compression happens on the fly, or offline
-# with `python manage.py compress`
-# See https://django-compressor.readthedocs.io/en/latest/usage/#offline-compression
-# for more information
-#COMPRESS_OFFLINE = not DEBUG
-
-# WEBROOT is the location relative to Webserver root
-# should end with a slash.
-WEBROOT = '/'
-#LOGIN_URL = WEBROOT + 'auth/login/'
-#LOGOUT_URL = WEBROOT + 'auth/logout/'
-#
-# LOGIN_REDIRECT_URL can be used as an alternative for
-# HORIZON_CONFIG.user_home, if user_home is not set.
-# Do not set it to '/home/', as this will cause circular redirect loop
-#LOGIN_REDIRECT_URL = WEBROOT
-
-# If horizon is running in production (DEBUG is False), set this
-# with the list of host/domain names that the application can serve.
-# For more information see:
-# https://docs.djangoproject.com/en/dev/ref/settings/#allowed-hosts
-ALLOWED_HOSTS = [ 'prx01', 'localhost', ]
-
-# Set SSL proxy settings:
-# Pass this header from the proxy after terminating the SSL,
-# and don't forget to strip it from the client's request.
-# For more information see:
-# https://docs.djangoproject.com/en/dev/ref/settings/#secure-proxy-ssl-header
-#SECURE_PROXY_SSL_HEADER = ('HTTP_X_FORWARDED_PROTO', 'https')
-
-# If Horizon is being served through SSL, then uncomment the following two
-# settings to better secure the cookies from security exploits
-#CSRF_COOKIE_SECURE = True
-#SESSION_COOKIE_SECURE = True
-
-# The absolute path to the directory where message files are collected.
-# The message file must have a .json file extension. When the user logins to
-# horizon, the message files collected are processed and displayed to the user.
-#MESSAGES_PATH=None
-
-# Overrides for OpenStack API versions. Use this setting to force the
-# OpenStack dashboard to use a specific API version for a given service API.
-# Versions specified here should be integers or floats, not strings.
-# NOTE: The version should be formatted as it appears in the URL for the
-# service API. For example, The identity service APIs have inconsistent
-# use of the decimal point, so valid options would be 2.0 or 3.
-# Minimum compute version to get the instance locked status is 2.9.
-#OPENSTACK_API_VERSIONS = {
-#    "data-processing": 1.1,
-#    "identity": 3,
-#    "image": 2,
-#    "volume": 2,
-#    "compute": 2,
-#}
-
-# Set this to True if running on a multi-domain model. When this is enabled, it
-# will require the user to enter the Domain name in addition to the username
-# for login.
-#OPENSTACK_KEYSTONE_MULTIDOMAIN_SUPPORT = False
-
-# Set this to True if you want available domains displayed as a dropdown menu
-# on the login screen. It is strongly advised NOT to enable this for public
-# clouds, as advertising enabled domains to unauthenticated customers
-# irresponsibly exposes private information. This should only be used for
-# private clouds where the dashboard sits behind a corporate firewall.
-#OPENSTACK_KEYSTONE_DOMAIN_DROPDOWN = False
-
-# If OPENSTACK_KEYSTONE_DOMAIN_DROPDOWN is enabled, this option can be used to
-# set the available domains to choose from. This is a list of pairs whose first
-# value is the domain name and the second is the display name.
-#OPENSTACK_KEYSTONE_DOMAIN_CHOICES = (
-#  ('Default', 'Default'),
-#)
-
-# Overrides the default domain used when running on single-domain model
-# with Keystone V3. All entities will be created in the default domain.
-# NOTE: This value must be the name of the default domain, NOT the ID.
-# Also, you will most likely have a value in the keystone policy file like this
-#    "cloud_admin": "rule:admin_required and domain_id:<your domain id>"
-# This value must be the name of the domain whose ID is specified there.
-#OPENSTACK_KEYSTONE_DEFAULT_DOMAIN = 'Default'
-
-# Set this to True to enable panels that provide the ability for users to
-# manage Identity Providers (IdPs) and establish a set of rules to map
-# federation protocol attributes to Identity API attributes.
-# This extension requires v3.0+ of the Identity API.
-#OPENSTACK_KEYSTONE_FEDERATION_MANAGEMENT = False
-
-# Set Console type:
-# valid options are "AUTO"(default), "VNC", "SPICE", "RDP", "SERIAL", "MKS"
-# or None. Set to None explicitly if you want to deactivate the console.
-#CONSOLE_TYPE = "AUTO"
-
-# Toggle showing the openrc file for Keystone V2.
-# If set to false the link will be removed from the user dropdown menu
-# and the API Access page
-#SHOW_KEYSTONE_V2_RC = True
-
-# If provided, a "Report Bug" link will be displayed in the site header
-# which links to the value of this setting (ideally a URL containing
-# information on how to report issues).
-#HORIZON_CONFIG["bug_url"] = "http://bug-report.example.com"
-
-# Show backdrop element outside the modal, do not close the modal
-# after clicking on backdrop.
-#HORIZON_CONFIG["modal_backdrop"] = "static"
-
-# Specify a regular expression to validate user passwords.
-#HORIZON_CONFIG["password_validator"] = {
-#    "regex": '.*',
-#    "help_text": _("Your password does not meet the requirements."),
-#}
-
-# Disable simplified floating IP address management for deployments with
-# multiple floating IP pools or complex network requirements.
-#HORIZON_CONFIG["simple_ip_management"] = False
-
-# Turn off browser autocompletion for forms including the login form and
-# the database creation workflow if so desired.
-#HORIZON_CONFIG["password_autocomplete"] = "off"
-
-# Setting this to True will disable the reveal button for password fields,
-# including on the login form.
-#HORIZON_CONFIG["disable_password_reveal"] = False
+from openstack_dashboard import exceptions
+
+HORIZON_CONFIG = {
+    'user_home': 'openstack_dashboard.views.get_user_home',
+    'ajax_queue_limit': 10,
+    'auto_fade_alerts': {
+        'delay': 3000,
+        'fade_duration': 1500,
+        'types': ['alert-success', 'alert-info']
+    },
+    'help_url': "http://docs.openstack.org",
+    'exceptions': {'recoverable': exceptions.RECOVERABLE,
+                   'not_found': exceptions.NOT_FOUND,
+                   'unauthorized': exceptions.UNAUTHORIZED},
+    'modal_backdrop': 'static',
+    'angular_modules': [],
+    'js_files': [],
+    'js_spec_files': [],
+    'disable_password_reveal': True,
+    'password_autocomplete': 'off'
+}
+# 'key', 'label', 'path'
+AVAILABLE_THEMES = [
+    (
+        "default",
+        pgettext_lazy("Default style theme", "Default"),
+        "themes/default"
+    ),
+    (
+        "material",
+        pgettext_lazy("Google's Material Design style theme", "Material"),
+        "themes/material"
+    ),
+]
+
+# The default theme if no cookie is present
+DEFAULT_THEME = 'default'
+
+# Theme Static Directory
+THEME_COLLECTION_DIR = 'themes'
+
+# Theme Cookie Name
+THEME_COOKIE_NAME = 'theme'
+
+INSTALLED_APPS = (
+    'openstack_dashboard',
+    'django.contrib.contenttypes',
+    'django.contrib.auth',
+    'django.contrib.sessions',
+    'django.contrib.messages',
+    'django.contrib.staticfiles',
+    'django.contrib.humanize',
+    'compressor',
+    'horizon',
+    'openstack_auth',
+)
+
+
+
+DEBUG = False
+
+TEMPLATE_DEBUG = DEBUG
+
+ALLOWED_HOSTS = ['*']
+
+AUTHENTICATION_URLS = ['openstack_auth.urls']
 
 LOCAL_PATH = os.path.dirname(os.path.abspath(__file__))
 
-# Set custom secret key:
-# You can either set it to a specific value or you can let horizon generate a
-# default secret key that is unique on this machine, e.i. regardless of the
-# amount of Python WSGI workers (if used behind Apache+mod_wsgi): However,
-# there may be situations where you would want to set this explicitly, e.g.
-# when multiple dashboard instances are distributed on different machines
-# (usually behind a load-balancer). Either you have to make sure that a session
-# gets all requests routed to the same dashboard instance or you set the same
-# SECRET_KEY for all of them.
-SECRET_KEY = secret_key.generate_or_read_from_file(
-    os.path.join("/","var","lib","openstack-dashboard","secret-key", '.secret_key_store'))
-
-# We recommend you use memcached for development; otherwise after every reload
-# of the django development server, you will have to login again. To use
-# memcached set CACHES to something like
-#CACHES = {
-#    'default': {
-#        'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
-#        'LOCATION': '127.0.0.1:11211',
-#    },
-#}
+SECRET_KEY = 'opaesee8Que2yahJoh9fo0eefo1Aeyo6ahyei8zeiboh3aeth5loth7ieNa5xi5e'
 
 CACHES = {
     'default': {
-        'BACKEND': 'django.core.cache.backends.locmem.LocMemCache',
-    },
+        'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
+        'LOCATION': "172.30.10.102:11211"
+    }
 }
 
 # Send email to the console by default
@@ -176,76 +86,249 @@
 #EMAIL_BACKEND = 'django.core.mail.backends.dummy.EmailBackend'
 
 # Configure these for your outgoing email host
-#EMAIL_HOST = 'smtp.my-company.com'
-#EMAIL_PORT = 25
-#EMAIL_HOST_USER = 'djangomail'
-#EMAIL_HOST_PASSWORD = 'top-secret!'
+# EMAIL_HOST = 'smtp.my-company.com'
+# EMAIL_PORT = 25
+# EMAIL_HOST_USER = 'djangomail'
+# EMAIL_HOST_PASSWORD = 'top-secret!'
+
+# The number of objects (Swift containers/objects or images) to display
+# on a single page before providing a paging element (a "more" link)
+# to paginate results.
+API_RESULT_LIMIT = 1000
+API_RESULT_PAGE_SIZE = 20
+
+# The timezone of the server. This should correspond with the timezone
+# of your entire OpenStack installation, and hopefully be in UTC.
+TIME_ZONE = "UTC"
+
+COMPRESS_OFFLINE = True
+
+# Trove user and database extension support. By default support for
+# creating users and databases on database instances is turned on.
+# To disable these extensions set the permission here to something
+# unusable such as ["!"].
+# TROVE_ADD_USER_PERMS = []
+# TROVE_ADD_DATABASE_PERMS = []
+
+SITE_BRANDING = 'OpenStack Dashboard'
+SESSION_COOKIE_HTTPONLY = True
+BOOT_ONLY_FROM_VOLUME = True
+
+REST_API_REQUIRED_SETTINGS = ['OPENSTACK_HYPERVISOR_FEATURES',
+                             'LAUNCH_INSTANCE_DEFAULTS',
+                             'OPENSTACK_IMAGE_FORMATS']
+
+
+# Specify a regular expression to validate user passwords.
+# HORIZON_CONFIG["password_validator"] = {
+#     "regex": '.*',
+#     "help_text": _("Your password does not meet the requirements.")
+# }
+
+# Turn off browser autocompletion for the login form if so desired.
+# HORIZON_CONFIG["password_autocomplete"] = "off"
+
+# The Horizon Policy Enforcement engine uses these values to load per service
+# policy rule files. The content of these files should match the files the
+# OpenStack services are using to determine role based access control in the
+# target installation.
+
+SESSION_TIMEOUT = 43200
+SESSION_ENGINE = "django.contrib.sessions.backends.cache"
+DROPDOWN_MAX_ITEMS = 30
+# A dictionary of settings which can be used to provide the default values for
+# properties found in the Launch Instance modal.
+
+# Path to directory containing policy.json files
+POLICY_FILES_PATH = "/usr/share/openstack-dashboard/openstack_dashboard/conf"
+# Map of local copy of service policy files
+POLICY_FILES = {
+    "compute": "nova_policy.json",
+    "network": "neutron_policy.json",
+    "image": "glance_policy.json",
+    "telemetry": "ceilometer_policy.json",
+    "volume": "cinder_policy.json",
+    "orchestration": "heat_policy.json",
+    "identity": "keystone_policy.json",
+}
+
+LOGGING = {
+    'version': 1,
+    # When set to True this will disable all logging except
+    # for loggers specified in this configuration dictionary. Note that
+    # if nothing is specified here and disable_existing_loggers is True,
+    # django.db.backends will still log unless it is disabled explicitly.
+    'disable_existing_loggers': False,
+    'handlers': {
+        'null': {
+            'level': 'DEBUG',
+            'class': 'logging.NullHandler',
+        },
+        'console': {
+            # Set the level to "DEBUG" for verbose output logging.
+            'level': 'INFO',
+            'class': 'logging.StreamHandler',
+        },
+        'file': {
+            'level': 'DEBUG',
+            'class': 'logging.FileHandler',
+            'filename': '/var/log/horizon/horizon.log',
+        },
+    },
+    'loggers': {
+        # Logging from django.db.backends is VERY verbose, send to null
+        # by default.
+        'django.db.backends': {
+            'handlers': ['null'],
+            'propagate': False,
+        },
+        # DEBUG level for django.template starting Pike has some false positive traces, set it to INFO
+        # by default. Caused by bug PROD-17558.
+        'django.template': {
+            'handlers': ['file'],
+            'level': 'INFO',
+            'propagate': True,
+        },
+        'requests': {
+            'handlers': ['null'],
+            'propagate': False,
+        },
+        'horizon': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'openstack_dashboard': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'novaclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'cinderclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'keystoneclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'glanceclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'neutronclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'heatclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'ceilometerclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'troveclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'mistralclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'swiftclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'openstack_auth': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'scss.expression': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'nose.plugins.manager': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'django': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'iso8601': {
+            'handlers': ['null'],
+            'propagate': False,
+        },
+    }
+}
+
+
+# Overrides for OpenStack API versions. Use this setting to force the
+# OpenStack dashboard to use a specific API version for a given service API.
+# NOTE: The version should be formatted as it appears in the URL for the
+# service API. For example, The identity service APIs have inconsistent
+# use of the decimal point, so valid options would be "2.0" or "3".
+OPENSTACK_API_VERSIONS = {
+    "identity": 3
+}
+# Set this to True if running on a multi-domain model. When this is enabled, it
+# will require the user to enter the Domain name in addition to the username for login.
+# OPENSTACK_KEYSTONE_MULTIDOMAIN_SUPPORT = False
+
+# Overrides the default domain used when running on single-domain model
+# with Keystone V3. All entities will be created in the default domain.
+# OPENSTACK_KEYSTONE_DEFAULT_DOMAIN = 'Default'
 
 # For multiple regions uncomment this configuration, and add (endpoint, title).
-#AVAILABLE_REGIONS = [
-#    ('http://cluster1.example.com:5000/v3', 'cluster1'),
-#    ('http://cluster2.example.com:5000/v3', 'cluster2'),
-#]
-
-OPENSTACK_HOST = "127.0.0.1"
+# AVAILABLE_REGIONS = [
+#     ('http://cluster1.example.com:5000/v2.0', 'cluster1'),
+#     ('http://cluster2.example.com:5000/v2.0', 'cluster2'),
+# ]
+
+
+OPENSTACK_HOST = "10.167.4.35"
 OPENSTACK_KEYSTONE_URL = "http://%s:5000/v3" % OPENSTACK_HOST
-OPENSTACK_KEYSTONE_DEFAULT_ROLE = "_member_"
-
-# For setting the default service region on a per-endpoint basis. Note that the
-# default value for this setting is {}, and below is just an example of how it
-# should be specified.
-#DEFAULT_SERVICE_REGIONS = {
-#    OPENSTACK_KEYSTONE_URL: 'RegionOne'
-#}
-
-# Enables keystone web single-sign-on if set to True.
-#WEBSSO_ENABLED = False
-
-# Authentication mechanism to be selected as default.
-# The value must be a key from WEBSSO_CHOICES.
-#WEBSSO_INITIAL_CHOICE = "credentials"
-
-# The list of authentication mechanisms which include keystone
-# federation protocols and identity provider/federation protocol
-# mapping keys (WEBSSO_IDP_MAPPING). Current supported protocol
-# IDs are 'saml2' and 'oidc'  which represent SAML 2.0, OpenID
-# Connect respectively.
-# Do not remove the mandatory credentials mechanism.
-# Note: The last two tuples are sample mapping keys to a identity provider
-# and federation protocol combination (WEBSSO_IDP_MAPPING).
-#WEBSSO_CHOICES = (
-#    ("credentials", _("Keystone Credentials")),
-#    ("oidc", _("OpenID Connect")),
-#    ("saml2", _("Security Assertion Markup Language")),
-#    ("acme_oidc", "ACME - OpenID Connect"),
-#    ("acme_saml2", "ACME - SAML2"),
-#)
-
-# A dictionary of specific identity provider and federation protocol
-# combinations. From the selected authentication mechanism, the value
-# will be looked up as keys in the dictionary. If a match is found,
-# it will redirect the user to a identity provider and federation protocol
-# specific WebSSO endpoint in keystone, otherwise it will use the value
-# as the protocol_id when redirecting to the WebSSO by protocol endpoint.
-# NOTE: The value is expected to be a tuple formatted as: (<idp_id>, <protocol_id>).
-#WEBSSO_IDP_MAPPING = {
-#    "acme_oidc": ("acme", "oidc"),
-#    "acme_saml2": ("acme", "saml2"),
-#}
-
-# The Keystone Provider drop down uses Keystone to Keystone federation
-# to switch between Keystone service providers.
-# Set display name for Identity Provider (dropdown display name)
-#KEYSTONE_PROVIDER_IDP_NAME = "Local Keystone"
-# This id is used for only for comparison with the service provider IDs. This ID
-# should not match any service provider IDs.
-#KEYSTONE_PROVIDER_IDP_ID = "localkeystone"
+
+OPENSTACK_KEYSTONE_MULTIDOMAIN_SUPPORT = False
+OPENSTACK_KEYSTONE_DEFAULT_DOMAIN = "default"
+
+OPENSTACK_KEYSTONE_DEFAULT_ROLE = "Member"
 
 # Disable SSL certificate checks (useful for self-signed certificates):
-#OPENSTACK_SSL_NO_VERIFY = True
 
 # The CA certificate to use to verify SSL connections
-#OPENSTACK_SSL_CACERT = '/path/to/cacert.pem'
+# OPENSTACK_SSL_CACERT = '/path/to/cacert.pem'
+
+# OPENSTACK_ENDPOINT_TYPE specifies the endpoint type to use for the endpoints
+# in the Keystone service catalog. Use this setting when Horizon is running
+# external to the OpenStack environment. The default is 'publicURL'.
+OPENSTACK_ENDPOINT_TYPE = "internalURL"
+
+# SECONDARY_ENDPOINT_TYPE specifies the fallback endpoint type to use in the
+# case that OPENSTACK_ENDPOINT_TYPE is not present in the endpoints
+# in the Keystone service catalog. Use this setting when Horizon is running
+# external to the OpenStack environment. The default is None.  This
+# value should differ from OPENSTACK_ENDPOINT_TYPE if used.
+#SECONDARY_ENDPOINT_TYPE = "publicURL"
 
 # The OPENSTACK_KEYSTONE_BACKEND settings can be used to identify the
 # capabilities of the auth backend for Keystone.
@@ -259,43 +342,13 @@
     'can_edit_group': True,
     'can_edit_project': True,
     'can_edit_domain': True,
-    'can_edit_role': True,
-}
-
-# Setting this to True, will add a new "Retrieve Password" action on instance,
-# allowing Admin session password retrieval/decryption.
-#OPENSTACK_ENABLE_PASSWORD_RETRIEVE = False
-
-# This setting allows deployers to control whether a token is deleted on log
-# out. This can be helpful when there are often long running processes being
-# run in the Horizon environment.
-#TOKEN_DELETION_DISABLED = False
-
-# The Launch Instance user experience has been significantly enhanced.
-# You can choose whether to enable the new launch instance experience,
-# the legacy experience, or both. The legacy experience will be removed
-# in a future release, but is available as a temporary backup setting to ensure
-# compatibility with existing deployments. Further development will not be
-# done on the legacy experience. Please report any problems with the new
-# experience via the Launchpad tracking system.
-#
-# Toggle LAUNCH_INSTANCE_LEGACY_ENABLED and LAUNCH_INSTANCE_NG_ENABLED to
-# determine the experience to enable.  Set them both to true to enable
-# both.
-#LAUNCH_INSTANCE_LEGACY_ENABLED = True
-#LAUNCH_INSTANCE_NG_ENABLED = False
-
-# A dictionary of settings which can be used to provide the default values for
-# properties found in the Launch Instance modal.
-#LAUNCH_INSTANCE_DEFAULTS = {
-#    'config_drive': False,
-#    'enable_scheduler_hints': True,
-#    'disable_image': False,
-#    'disable_instance_snapshot': False,
-#    'disable_volume': False,
-#    'disable_volume_snapshot': False,
-#    'create_volume': True,
-#}
+    'can_edit_role': True
+}
+
+
+# Set Console type:
+# valid options would be "AUTO", "VNC" or "SPICE"
+# CONSOLE_TYPE = "AUTO"
 
 # The Xen Hypervisor has the ability to set the mount point for volumes
 # attached to instances (other Hypervisors currently do not). Setting
@@ -304,102 +357,52 @@
 OPENSTACK_HYPERVISOR_FEATURES = {
     'can_set_mount_point': False,
     'can_set_password': False,
-    'requires_keypair': False,
-    'enable_quotas': True
-}
-
-# This settings controls whether IP addresses of servers are retrieved from
-# neutron in the project instance table. Setting this to ``False`` may mitigate
-# a performance issue in the project instance table in large deployments.
-#OPENSTACK_INSTANCE_RETRIEVE_IP_ADDRESSES = True
-
-# The OPENSTACK_CINDER_FEATURES settings can be used to enable optional
-# services provided by cinder that is not exposed by its extension API.
-OPENSTACK_CINDER_FEATURES = {
-    'enable_backup': False,
-}
-
-# The OPENSTACK_NEUTRON_NETWORK settings can be used to enable optional
-# services provided by neutron. Options currently available are load
-# balancer service, security groups, quotas, VPN service.
-OPENSTACK_NEUTRON_NETWORK = {
-    'enable_router': True,
-    'enable_quotas': True,
-    'enable_ipv6': True,
-    'enable_distributed_router': False,
-    'enable_ha_router': False,
-    'enable_fip_topology_check': True,
-
-    # Default dns servers you would like to use when a subnet is
-    # created.  This is only a default, users can still choose a different
-    # list of dns servers when creating a new subnet.
-    # The entries below are examples only, and are not appropriate for
-    # real deployments
-    # 'default_dns_nameservers': ["8.8.8.8", "8.8.4.4", "208.67.222.222"],
-
-    # Set which provider network types are supported. Only the network types
-    # in this list will be available to choose from when creating a network.
-    # Network types include local, flat, vlan, gre, vxlan and geneve.
-    # 'supported_provider_types': ['*'],
-
-    # You can configure available segmentation ID range per network type
-    # in your deployment.
-    # 'segmentation_id_range': {
-    #     'vlan': [1024, 2048],
-    #     'vxlan': [4094, 65536],
-    # },
-
-    # You can define additional provider network types here.
-    # 'extra_provider_types': {
-    #     'awesome_type': {
-    #         'display_name': 'Awesome New Type',
-    #         'require_physical_network': False,
-    #         'require_segmentation_id': True,
-    #     }
-    # },
-
-    # Set which VNIC types are supported for port binding. Only the VNIC
-    # types in this list will be available to choose from when creating a
-    # port.
-    # VNIC types include 'normal', 'direct', 'direct-physical', 'macvtap',
-    # 'baremetal' and 'virtio-forwarder'
-    # Set to empty list or None to disable VNIC type selection.
-    'supported_vnic_types': ['*'],
-
-    # Set list of available physical networks to be selected in the physical
-    # network field on the admin create network modal. If it's set to an empty
-    # list, the field will be a regular input field.
-    # e.g. ['default', 'test']
-    'physical_networks': [],
-
-}
-
-# The OPENSTACK_HEAT_STACK settings can be used to disable password
-# field required while launching the stack.
-OPENSTACK_HEAT_STACK = {
-    'enable_user_pass': True,
-}
+}
+
+# When set, enables the instance action "Retrieve password"
+# allowing password retrieval
+OPENSTACK_ENABLE_PASSWORD_RETRIEVE = False
+
+# When launching an instance, the menu of available flavors is
+# sorted by RAM usage, ascending.  Provide a callback method here
+# (and/or a flag for reverse sort) for the sorted() method if you'd
+# like a different behaviour.  For more info, see
+# http://docs.python.org/2/library/functions.html#sorted
+# CREATE_INSTANCE_FLAVOR_SORT = {
+#     'key': my_awesome_callback_method,
+#     'reverse': False,
+# }
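The commented block above passes its `'key'` entry straight to Python's built-in `sorted()`, so the callback receives one flavor at a time and returns the value to sort by. A minimal sketch, using plain dicts as stand-ins for real nova flavor objects (the field names `vcpus`/`ram` and the callback name are illustrative assumptions, not part of the file above):

```python
def sort_by_vcpus_then_ram(flavor):
    # Sort primarily by vCPU count, then by RAM (MB) to break ties.
    return (flavor['vcpus'], flavor['ram'])

# Plain dicts standing in for nova flavor objects.
flavors = [
    {'name': 'm1.large', 'vcpus': 4, 'ram': 8192},
    {'name': 'm1.tiny', 'vcpus': 1, 'ram': 512},
    {'name': 'm1.small', 'vcpus': 1, 'ram': 2048},
]

# 'key' and 'reverse' here mirror the CREATE_INSTANCE_FLAVOR_SORT entries.
ordered = sorted(flavors, key=sort_by_vcpus_then_ram, reverse=False)
print([f['name'] for f in ordered])  # prints ['m1.tiny', 'm1.small', 'm1.large']
```

Any callable with this one-argument shape can be dropped in as `'key'`; setting `'reverse': True` flips the order, exactly as with `sorted()` itself.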
+
+FLAVOR_EXTRA_KEYS = {
+    'flavor_keys': [
+        ('quota:read_bytes_sec', _('Quota: Read bytes')),
+        ('quota:write_bytes_sec', _('Quota: Write bytes')),
+        ('quota:cpu_quota', _('Quota: CPU')),
+        ('quota:cpu_period', _('Quota: CPU period')),
+        ('quota:inbound_average', _('Quota: Inbound average')),
+        ('quota:outbound_average', _('Quota: Outbound average')),
+    ]
+}
+
 
 # The OPENSTACK_IMAGE_BACKEND settings can be used to customize features
 # in the OpenStack Dashboard related to the Image service, such as the list
 # of supported image formats.
-#OPENSTACK_IMAGE_BACKEND = {
-#    'image_formats': [
-#        ('', _('Select format')),
-#        ('aki', _('AKI - Amazon Kernel Image')),
-#        ('ami', _('AMI - Amazon Machine Image')),
-#        ('ari', _('ARI - Amazon Ramdisk Image')),
-#        ('docker', _('Docker')),
-#        ('iso', _('ISO - Optical Disk Image')),
-#        ('ova', _('OVA - Open Virtual Appliance')),
-#        ('qcow2', _('QCOW2 - QEMU Emulator')),
-#        ('raw', _('Raw')),
-#        ('vdi', _('VDI - Virtual Disk Image')),
-#        ('vhd', _('VHD - Virtual Hard Disk')),
-#        ('vhdx', _('VHDX - Large Virtual Hard Disk')),
-#        ('vmdk', _('VMDK - Virtual Machine Disk')),
-#    ],
-#}
+OPENSTACK_IMAGE_BACKEND = {
+    'image_formats': [
+        ('', ''),
+        ('aki', _('AKI - Amazon Kernel Image')),
+        ('ami', _('AMI - Amazon Machine Image')),
+        ('ari', _('ARI - Amazon Ramdisk Image')),
+        ('iso', _('ISO - Optical Disk Image')),
+        ('qcow2', _('QCOW2 - QEMU Emulator')),
+        ('raw', _('Raw')),
+        ('vdi', _('VDI')),
+        ('vhd', _('VHD')),
+        ('vmdk', _('VMDK')),
+        ('docker', _('Docker Container'))
+    ]
+}
 
 # The IMAGE_CUSTOM_PROPERTY_TITLES settings is used to customize the titles for
 # image custom property attributes that appear on image detail pages.
@@ -409,285 +412,53 @@
     "ramdisk_id": _("Ramdisk ID"),
     "image_state": _("Euca2ools state"),
     "project_id": _("Project ID"),
-    "image_type": _("Image Type"),
-}
-
-# The IMAGE_RESERVED_CUSTOM_PROPERTIES setting is used to specify which image
-# custom properties should not be displayed in the Image Custom Properties
-# table.
-IMAGE_RESERVED_CUSTOM_PROPERTIES = []
-
-# Set to 'legacy' or 'direct' to allow users to upload images to glance via
-# Horizon server. When enabled, a file form field will appear on the create
-# image form. If set to 'off', there will be no file form field on the create
-# image form. See documentation for deployment considerations.
-#HORIZON_IMAGES_UPLOAD_MODE = 'legacy'
-
-# Allow a location to be set when creating or updating Glance images.
-# If using Glance V2, this value should be False unless the Glance
-# configuration and policies allow setting locations.
-#IMAGES_ALLOW_LOCATION = False
-
-# A dictionary of default settings for create image modal.
-#CREATE_IMAGE_DEFAULTS = {
-#    'image_visibility': "public",
-#}
-
-# OPENSTACK_ENDPOINT_TYPE specifies the endpoint type to use for the endpoints
-# in the Keystone service catalog. Use this setting when Horizon is running
-# external to the OpenStack environment. The default is 'publicURL'.
-#OPENSTACK_ENDPOINT_TYPE = "publicURL"
-
-# SECONDARY_ENDPOINT_TYPE specifies the fallback endpoint type to use in the
-# case that OPENSTACK_ENDPOINT_TYPE is not present in the endpoints
-# in the Keystone service catalog. Use this setting when Horizon is running
-# external to the OpenStack environment. The default is None. This
-# value should differ from OPENSTACK_ENDPOINT_TYPE if used.
-#SECONDARY_ENDPOINT_TYPE = None
-
-# The number of objects (Swift containers/objects or images) to display
-# on a single page before providing a paging element (a "more" link)
-# to paginate results.
-API_RESULT_LIMIT = 1000
-API_RESULT_PAGE_SIZE = 20
-
-# The size of chunk in bytes for downloading objects from Swift
-SWIFT_FILE_TRANSFER_CHUNK_SIZE = 512 * 1024
-
-# The default number of lines displayed for instance console log.
-INSTANCE_LOG_LENGTH = 35
-
-# Specify a maximum number of items to display in a dropdown.
-DROPDOWN_MAX_ITEMS = 30
-
-# The timezone of the server. This should correspond with the timezone
-# of your entire OpenStack installation, and hopefully be in UTC.
-TIME_ZONE = "UTC"
-
-# When launching an instance, the menu of available flavors is
-# sorted by RAM usage, ascending. If you would like a different sort order,
-# you can provide another flavor attribute as sorting key. Alternatively, you
-# can provide a custom callback method to use for sorting. You can also provide
-# a flag for reverse sort. For more info, see
-# http://docs.python.org/2/library/functions.html#sorted
-#CREATE_INSTANCE_FLAVOR_SORT = {
-#    'key': 'name',
-#     # or
-#    'key': my_awesome_callback_method,
-#    'reverse': False,
-#}
-
-# Set this to True to display an 'Admin Password' field on the Change Password
-# form to verify that it is indeed the admin logged-in who wants to change
-# the password.
-#ENFORCE_PASSWORD_CHECK = False
-
-# Modules that provide /auth routes that can be used to handle different types
-# of user authentication. Add auth plugins that require extra route handling to
-# this list.
-#AUTHENTICATION_URLS = [
-#    'openstack_auth.urls',
-#]
-
-# The Horizon Policy Enforcement engine uses these values to load per service
-# policy rule files. The content of these files should match the files the
-# OpenStack services are using to determine role based access control in the
-# target installation.
-
-# Path to directory containing policy.json files
-#POLICY_FILES_PATH = os.path.join(ROOT_PATH, "conf")
-
-# Map of local copy of service policy files.
-# Please insure that your identity policy file matches the one being used on
-# your keystone servers. There is an alternate policy file that may be used
-# in the Keystone v3 multi-domain case, policy.v3cloudsample.json.
-# This file is not included in the Horizon repository by default but can be
-# found at
-# http://git.openstack.org/cgit/openstack/keystone/tree/etc/ \
-# policy.v3cloudsample.json
-# Having matching policy files on the Horizon and Keystone servers is essential
-# for normal operation. This holds true for all services and their policy files.
-#POLICY_FILES = {
-#    'identity': 'keystone_policy.json',
-#    'compute': 'nova_policy.json',
-#    'volume': 'cinder_policy.json',
-#    'image': 'glance_policy.json',
-#    'network': 'neutron_policy.json',
-#}
-
-# TODO: (david-lyle) remove when plugins support adding settings.
-# Note: Only used when trove-dashboard plugin is configured to be used by
-# Horizon.
-# Trove user and database extension support. By default support for
-# creating users and databases on database instances is turned on.
-# To disable these extensions set the permission here to something
-# unusable such as ["!"].
-#TROVE_ADD_USER_PERMS = []
-#TROVE_ADD_DATABASE_PERMS = []
-
-# Change this patch to the appropriate list of tuples containing
-# a key, label and static directory containing two files:
-# _variables.scss and _styles.scss
-#AVAILABLE_THEMES = [
-#    ('default', 'Default', 'themes/default'),
-#    ('material', 'Material', 'themes/material'),
-#]
-
-LOGGING = {
-    'version': 1,
-    # When set to True this will disable all logging except
-    # for loggers specified in this configuration dictionary. Note that
-    # if nothing is specified here and disable_existing_loggers is True,
-    # django.db.backends will still log unless it is disabled explicitly.
-    'disable_existing_loggers': False,
-    # If apache2 mod_wsgi is used to deploy OpenStack dashboard
-    # timestamp is output by mod_wsgi. If WSGI framework you use does not
-    # output timestamp for logging, add %(asctime)s in the following
-    # format definitions.
-    'formatters': {
-        'console': {
-            'format': '%(levelname)s %(name)s %(message)s'
-        },
-        'operation': {
-            # The format of "%(message)s" is defined by
-            # OPERATION_LOG_OPTIONS['format']
-            'format': '%(message)s'
-        },
-    },
-    'handlers': {
-        'null': {
-            'level': 'DEBUG',
-            'class': 'logging.NullHandler',
-        },
-        'console': {
-            # Set the level to "DEBUG" for verbose output logging.
-            'level': 'INFO',
-            'class': 'logging.StreamHandler',
-            'formatter': 'console',
-        },
-        'operation': {
-            'level': 'INFO',
-            'class': 'logging.StreamHandler',
-            'formatter': 'operation',
-        },
-    },
-    'loggers': {
-        'horizon': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'horizon.operation_log': {
-            'handlers': ['operation'],
-            'level': 'INFO',
-            'propagate': False,
-        },
-        'openstack_dashboard': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'novaclient': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'cinderclient': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'keystoneauth': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'keystoneclient': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'glanceclient': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'neutronclient': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'swiftclient': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'oslo_policy': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'openstack_auth': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'nose.plugins.manager': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'django': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        # Logging from django.db.backends is VERY verbose, send to null
-        # by default.
-        'django.db.backends': {
-            'handlers': ['null'],
-            'propagate': False,
-        },
-        'requests': {
-            'handlers': ['null'],
-            'propagate': False,
-        },
-        'urllib3': {
-            'handlers': ['null'],
-            'propagate': False,
-        },
-        'chardet.charsetprober': {
-            'handlers': ['null'],
-            'propagate': False,
-        },
-        'iso8601': {
-            'handlers': ['null'],
-            'propagate': False,
-        },
-        'scss': {
-            'handlers': ['null'],
-            'propagate': False,
-        },
-    },
+    "image_type": _("Image Type")
+}
+
+HORIZON_IMAGES_UPLOAD_MODE = "legacy"
+IMAGES_ALLOW_LOCATION = True
+
+
+# Disable simplified floating IP address management for deployments with
+# multiple floating IP pools or complex network requirements.
+# HORIZON_CONFIG["simple_ip_management"] = False
+
+# The OPENSTACK_NEUTRON_NETWORK settings can be used to enable optional
+# services provided by neutron. Options currently available are load
+# balancer service, security groups, quotas, VPN service.
+
+OPENSTACK_NEUTRON_NETWORK = {
+    'enable_lb': True,
+    'enable_firewall': False,
+    'enable_quotas': True,
+    'enable_security_group': True,
+    'enable_vpn': False,
+    # The profile_support option is used to detect if an external router can be
+    # configured via the dashboard. When using specific plugins the
+    # profile_support can be turned on if needed.
+    'profile_support': None,
+    'enable_fip_topology_check': True,
+
+    #'profile_support': 'cisco',
 }
 
 # 'direction' should not be specified for all_tcp/udp/icmp.
 # It is specified in the form.
 SECURITY_GROUP_RULES = {
     'all_tcp': {
-        'name': _('All TCP'),
+        'name': 'ALL TCP',
         'ip_protocol': 'tcp',
         'from_port': '1',
         'to_port': '65535',
     },
     'all_udp': {
-        'name': _('All UDP'),
+        'name': 'ALL UDP',
         'ip_protocol': 'udp',
         'from_port': '1',
         'to_port': '65535',
     },
     'all_icmp': {
-        'name': _('All ICMP'),
+        'name': 'ALL ICMP',
         'ip_protocol': 'icmp',
         'from_port': '-1',
         'to_port': '-1',
@@ -778,144 +549,12 @@
     },
 }
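Each SECURITY_GROUP_RULES entry above names an IP protocol and a port range, and, as the comment notes, omits 'direction' because that is chosen in the form. A sketch of what one additional site-specific entry could look like (the 'https' key and port are illustrative assumptions, not part of the file above):

```python
# Hypothetical extra SECURITY_GROUP_RULES entry; 'direction' is deliberately
# absent, matching the all_tcp/all_udp/all_icmp entries above.
custom_rule = {
    'https': {
        'name': 'HTTPS',
        'ip_protocol': 'tcp',
        'from_port': '443',
        'to_port': '443',
    },
}

# Ports are stored as strings in this dict; check the range is well-formed.
rule = custom_rule['https']
assert 1 <= int(rule['from_port']) <= int(rule['to_port']) <= 65535
```

Such an entry would be merged into the SECURITY_GROUP_RULES dict alongside the all_tcp/all_udp/all_icmp defaults, giving users a one-click rule in the security group form.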
 
-# Deprecation Notice:
-#
-# The setting FLAVOR_EXTRA_KEYS has been deprecated.
-# Please load extra spec metadata into the Glance Metadata Definition Catalog.
-#
-# The sample quota definitions can be found in:
-# <glance_source>/etc/metadefs/compute-quota.json
-#
-# The metadata definition catalog supports CLI and API:
-#  $glance --os-image-api-version 2 help md-namespace-import
-#  $glance-manage db_load_metadefs <directory_with_definition_files>
-#
-# See Metadata Definitions on: http://docs.openstack.org/developer/glance/
-
-# TODO: (david-lyle) remove when plugins support settings natively
-# Note: This is only used when the Sahara plugin is configured and enabled
-# for use in Horizon.
-# Indicate to the Sahara data processing service whether or not
-# automatic floating IP allocation is in effect.  If it is not
-# in effect, the user will be prompted to choose a floating IP
-# pool for use in their cluster.  False by default.  You would want
-# to set this to True if you were running Nova Networking with
-# auto_assign_floating_ip = True.
-#SAHARA_AUTO_IP_ALLOCATION_ENABLED = False
-
-# The hash algorithm to use for authentication tokens. This must
-# match the hash algorithm that the identity server and the
-# auth_token middleware are using. Allowed values are the
-# algorithms supported by Python's hashlib library.
-#OPENSTACK_TOKEN_HASH_ALGORITHM = 'md5'
-
-# AngularJS requires some settings to be made available to
-# the client side. Some settings are required by in-tree / built-in horizon
-# features. These settings must be added to REST_API_REQUIRED_SETTINGS in the
-# form of ['SETTING_1','SETTING_2'], etc.
-#
-# You may remove settings from this list for security purposes, but do so at
-# the risk of breaking a built-in horizon feature. These settings are required
-# for horizon to function properly. Only remove them if you know what you
-# are doing. These settings may in the future be moved to be defined within
-# the enabled panel configuration.
-# You should not add settings to this list for out of tree extensions.
-# See: https://wiki.openstack.org/wiki/Horizon/RESTAPI
-REST_API_REQUIRED_SETTINGS = ['OPENSTACK_HYPERVISOR_FEATURES',
-                              'LAUNCH_INSTANCE_DEFAULTS',
-                              'OPENSTACK_IMAGE_FORMATS',
-                              'OPENSTACK_KEYSTONE_DEFAULT_DOMAIN',
-                              'CREATE_IMAGE_DEFAULTS',
-                              'ENFORCE_PASSWORD_CHECK']
-
-# Additional settings can be made available to the client side for
-# extensibility by specifying them in REST_API_ADDITIONAL_SETTINGS
-# !! Please use extreme caution as the settings are transferred via HTTP/S
-# and are not encrypted on the browser. This is an experimental API and
-# may be deprecated in the future without notice.
-#REST_API_ADDITIONAL_SETTINGS = []
-
-# DISALLOW_IFRAME_EMBED can be used to prevent Horizon from being embedded
-# within an iframe. Legacy browsers are still vulnerable to a Cross-Frame
-# Scripting (XFS) vulnerability, so this option allows extra security hardening
-# where iframes are not used in deployment. Default setting is True.
-# For more information see:
-# http://tinyurl.com/anticlickjack
-#DISALLOW_IFRAME_EMBED = True
-
-# Help URL can be made available for the client. To provide a help URL, edit the
-# following attribute to the URL of your choice.
-#HORIZON_CONFIG["help_url"] = "http://openstack.mycompany.org"
-
-# Settings for OperationLogMiddleware
-# OPERATION_LOG_ENABLED is flag to use the function to log an operation on
-# Horizon.
-# mask_targets is arrangement for appointing a target to mask.
-# method_targets is arrangement of HTTP method to output log.
-# format is the log contents.
-#OPERATION_LOG_ENABLED = False
-#OPERATION_LOG_OPTIONS = {
-#    'mask_fields': ['password'],
-#    'target_methods': ['POST'],
-#    'ignored_urls': ['/js/', '/static/', '^/api/'],
-#    'format': ("[%(client_ip)s] [%(domain_name)s]"
-#        " [%(domain_id)s] [%(project_name)s]"
-#        " [%(project_id)s] [%(user_name)s] [%(user_id)s] [%(request_scheme)s]"
-#        " [%(referer_url)s] [%(request_url)s] [%(message)s] [%(method)s]"
-#        " [%(http_status)s] [%(param)s]"),
-#}
-
-# The default date range in the Overview panel meters - either <today> minus N
-# days (if the value is integer N), or from the beginning of the current month
-# until today (if set to None). This setting should be used to limit the amount
-# of data fetched by default when rendering the Overview panel.
-#OVERVIEW_DAYS_RANGE = 1
-
-# To allow operators to require users provide a search criteria first
-# before loading any data into the views, set the following dict
-# attributes to True in each one of the panels you want to enable this feature.
-# Follow the convention <dashboard>.<view>
-#FILTER_DATA_FIRST = {
-#    'admin.instances': False,
-#    'admin.images': False,
-#    'admin.networks': False,
-#    'admin.routers': False,
-#    'admin.volumes': False,
-#    'identity.users': False,
-#    'identity.projects': False,
-#    'identity.groups': False,
-#    'identity.roles': False
-#}
-
-# Dict used to restrict user private subnet cidr range.
-# An empty list means that user input will not be restricted
-# for a corresponding IP version. By default, there is
-# no restriction for IPv4 or IPv6. To restrict
-# user private subnet cidr range set ALLOWED_PRIVATE_SUBNET_CIDR
-# to something like
-#ALLOWED_PRIVATE_SUBNET_CIDR = {
-#    'ipv4': ['10.0.0.0/8', '192.168.0.0/16'],
-#    'ipv6': ['fc00::/7']
-#}
-ALLOWED_PRIVATE_SUBNET_CIDR = {'ipv4': [], 'ipv6': []}
-
-# Projects and users can have extra attributes as defined by keystone v3.
-# Horizon has the ability to display these extra attributes via this setting.
-# If you'd like to display extra data in the project or user tables, set the
-# corresponding dict key to the attribute name, followed by the display name.
-# For more information, see horizon's customization (http://docs.openstack.org/developer/horizon/topics/customizing.html#horizon-customization-module-overrides)
-#PROJECT_TABLE_EXTRA_INFO = {
-#   'phone_num': _('Phone Number'),
-#}
-#USER_TABLE_EXTRA_INFO = {
-#   'phone_num': _('Phone Number'),
-#}
-
-# Password will have an expiration date when using keystone v3 and enabling the
-# feature.
-# This setting allows you to set the number of days that the user will be alerted
-# prior to the password expiration.
-# Once the password expires keystone will deny the access and users must
-# contact an admin to change their password.
-#PASSWORD_EXPIRES_WARNING_THRESHOLD_DAYS = 0
-COMPRESS_OFFLINE=True
+
+
+
+
+
+USE_SSL = True
+CSRF_COOKIE_SECURE = True
+SESSION_COOKIE_HTTPONLY = True

2018-09-01 23:29:25,351 [salt.state       :905 ][INFO    ][7294] Loading fresh modules for state activity
2018-09-01 23:29:25,392 [salt.state       :1941][INFO    ][7294] Completed state [/etc/openstack-dashboard/local_settings.py] at time 23:29:25.392827 duration_in_ms=350.389
2018-09-01 23:29:25,401 [salt.state       :1770][INFO    ][7294] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/nova_policy.json] at time 23:29:25.401446
2018-09-01 23:29:25,402 [salt.state       :1803][INFO    ][7294] Executing state file.managed for [/usr/share/openstack-dashboard/openstack_dashboard/conf/nova_policy.json]
2018-09-01 23:29:25,441 [salt.fileclient  :1215][INFO    ][7294] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/queens/nova_policy.json'
2018-09-01 23:29:25,444 [salt.state       :290 ][INFO    ][7294] File changed:
--- 
+++ 
@@ -1,174 +1,500 @@
 {
-    "context_is_admin": "role:admin",
-    "admin_or_owner": "is_admin:True or project_id:%(project_id)s",
+    "context_is_admin":  "role:admin",
+    "admin_or_owner":  "is_admin:True or project_id:%(project_id)s",
+    "default": "rule:admin_or_owner",
+
+    "cells_scheduler_filter:TargetCellFilter": "is_admin:True",
+
+    "compute:create": "rule:admin_or_owner",
+    "compute:create:attach_network": "rule:admin_or_owner",
+    "compute:create:attach_volume": "rule:admin_or_owner",
+    "compute:create:forced_host": "is_admin:True",
+
+    "compute:get": "rule:admin_or_owner",
+    "compute:get_all": "rule:admin_or_owner",
+    "compute:get_all_tenants": "is_admin:True",
+
+    "compute:update": "rule:admin_or_owner",
+
+    "compute:get_instance_metadata": "rule:admin_or_owner",
+    "compute:get_all_instance_metadata": "rule:admin_or_owner",
+    "compute:get_all_instance_system_metadata": "rule:admin_or_owner",
+    "compute:update_instance_metadata": "rule:admin_or_owner",
+    "compute:delete_instance_metadata": "rule:admin_or_owner",
+
+    "compute:get_diagnostics": "rule:admin_or_owner",
+    "compute:get_instance_diagnostics": "rule:admin_or_owner",
+
+    "compute:start": "rule:admin_or_owner",
+    "compute:stop": "rule:admin_or_owner",
+
+    "compute:lock": "rule:admin_or_owner",
+    "compute:unlock": "rule:admin_or_owner",
+    "compute:unlock_override": "rule:admin_api",
+
+    "compute:get_vnc_console": "rule:admin_or_owner",
+    "compute:get_spice_console": "rule:admin_or_owner",
+    "compute:get_rdp_console": "rule:admin_or_owner",
+    "compute:get_serial_console": "rule:admin_or_owner",
+    "compute:get_mks_console": "rule:admin_or_owner",
+    "compute:get_console_output": "rule:admin_or_owner",
+
+    "compute:reset_network": "rule:admin_or_owner",
+    "compute:inject_network_info": "rule:admin_or_owner",
+    "compute:add_fixed_ip": "rule:admin_or_owner",
+    "compute:remove_fixed_ip": "rule:admin_or_owner",
+
+    "compute:attach_volume": "rule:admin_or_owner",
+    "compute:detach_volume": "rule:admin_or_owner",
+    "compute:swap_volume": "rule:admin_api",
+
+    "compute:attach_interface": "rule:admin_or_owner",
+    "compute:detach_interface": "rule:admin_or_owner",
+
+    "compute:set_admin_password": "rule:admin_or_owner",
+
+    "compute:rescue": "rule:admin_or_owner",
+    "compute:unrescue": "rule:admin_or_owner",
+
+    "compute:suspend": "rule:admin_or_owner",
+    "compute:resume": "rule:admin_or_owner",
+
+    "compute:pause": "rule:admin_or_owner",
+    "compute:unpause": "rule:admin_or_owner",
+
+    "compute:shelve": "rule:admin_or_owner",
+    "compute:shelve_offload": "rule:admin_or_owner",
+    "compute:unshelve": "rule:admin_or_owner",
+
+    "compute:snapshot": "rule:admin_or_owner",
+    "compute:snapshot_volume_backed": "rule:admin_or_owner",
+    "compute:backup": "rule:admin_or_owner",
+
+    "compute:resize": "rule:admin_or_owner",
+    "compute:confirm_resize": "rule:admin_or_owner",
+    "compute:revert_resize": "rule:admin_or_owner",
+
+    "compute:rebuild": "rule:admin_or_owner",
+    "compute:reboot": "rule:admin_or_owner",
+    "compute:delete": "rule:admin_or_owner",
+    "compute:soft_delete": "rule:admin_or_owner",
+    "compute:force_delete": "rule:admin_or_owner",
+
+    "compute:security_groups:add_to_instance": "rule:admin_or_owner",
+    "compute:security_groups:remove_from_instance": "rule:admin_or_owner",
+
+    "compute:restore": "rule:admin_or_owner",
+
+    "compute:volume_snapshot_create": "rule:admin_or_owner",
+    "compute:volume_snapshot_delete": "rule:admin_or_owner",
+
     "admin_api": "is_admin:True",
-    "os_compute_api:os-admin-actions:reset_state": "rule:admin_api",
-    "os_compute_api:os-admin-actions:inject_network_info": "rule:admin_api",
-    "os_compute_api:os-admin-actions:reset_network": "rule:admin_api",
-    "os_compute_api:os-admin-password": "rule:admin_or_owner",
-    "os_compute_api:os-agents": "rule:admin_api",
-    "os_compute_api:os-aggregates:set_metadata": "rule:admin_api",
-    "os_compute_api:os-aggregates:add_host": "rule:admin_api",
-    "os_compute_api:os-aggregates:create": "rule:admin_api",
-    "os_compute_api:os-aggregates:remove_host": "rule:admin_api",
-    "os_compute_api:os-aggregates:update": "rule:admin_api",
-    "os_compute_api:os-aggregates:index": "rule:admin_api",
-    "os_compute_api:os-aggregates:delete": "rule:admin_api",
-    "os_compute_api:os-aggregates:show": "rule:admin_api",
-    "os_compute_api:os-assisted-volume-snapshots:create": "rule:admin_api",
-    "os_compute_api:os-assisted-volume-snapshots:delete": "rule:admin_api",
-    "os_compute_api:os-attach-interfaces": "rule:admin_or_owner",
-    "os_compute_api:os-attach-interfaces:create": "rule:admin_or_owner",
-    "os_compute_api:os-attach-interfaces:delete": "rule:admin_or_owner",
-    "os_compute_api:os-availability-zone:list": "rule:admin_or_owner",
-    "os_compute_api:os-availability-zone:detail": "rule:admin_api",
-    "os_compute_api:os-baremetal-nodes": "rule:admin_api",
-    "os_compute_api:os-cells:update": "rule:admin_api",
-    "os_compute_api:os-cells:create": "rule:admin_api",
-    "os_compute_api:os-cells": "rule:admin_api",
-    "os_compute_api:os-cells:sync_instances": "rule:admin_api",
-    "os_compute_api:os-cells:delete": "rule:admin_api",
-    "cells_scheduler_filter:DifferentCellFilter": "is_admin:True",
-    "cells_scheduler_filter:TargetCellFilter": "is_admin:True",
-    "os_compute_api:os-config-drive": "rule:admin_or_owner",
-    "os_compute_api:os-console-auth-tokens": "rule:admin_api",
-    "os_compute_api:os-console-output": "rule:admin_or_owner",
-    "os_compute_api:os-consoles:create": "rule:admin_or_owner",
-    "os_compute_api:os-consoles:show": "rule:admin_or_owner",
-    "os_compute_api:os-consoles:delete": "rule:admin_or_owner",
-    "os_compute_api:os-consoles:index": "rule:admin_or_owner",
-    "os_compute_api:os-create-backup": "rule:admin_or_owner",
-    "os_compute_api:os-deferred-delete": "rule:admin_or_owner",
-    "os_compute_api:os-evacuate": "rule:admin_api",
-    "os_compute_api:os-extended-availability-zone": "rule:admin_or_owner",
-    "os_compute_api:os-extended-server-attributes": "rule:admin_api",
-    "os_compute_api:os-extended-status": "rule:admin_or_owner",
-    "os_compute_api:os-extended-volumes": "rule:admin_or_owner",
-    "os_compute_api:extensions": "rule:admin_or_owner",
-    "os_compute_api:os-fixed-ips": "rule:admin_api",
-    "os_compute_api:os-flavor-access:add_tenant_access": "rule:admin_api",
-    "os_compute_api:os-flavor-access:remove_tenant_access": "rule:admin_api",
-    "os_compute_api:os-flavor-access": "rule:admin_or_owner",
-    "os_compute_api:os-flavor-extra-specs:show": "rule:admin_or_owner",
-    "os_compute_api:os-flavor-extra-specs:create": "rule:admin_api",
-    "os_compute_api:os-flavor-extra-specs:update": "rule:admin_api",
-    "os_compute_api:os-flavor-extra-specs:delete": "rule:admin_api",
-    "os_compute_api:os-flavor-extra-specs:index": "rule:admin_or_owner",
-    "os_compute_api:os-flavor-manage": "rule:admin_api",
-    "os_compute_api:os-flavor-manage:create": "rule:os_compute_api:os-flavor-manage",
-    "os_compute_api:os-flavor-manage:update": "rule:admin_api",
-    "os_compute_api:os-flavor-manage:delete": "rule:os_compute_api:os-flavor-manage",
-    "os_compute_api:os-flavor-rxtx": "rule:admin_or_owner",
-    "os_compute_api:flavors": "rule:admin_or_owner",
-    "os_compute_api:os-floating-ip-dns": "rule:admin_or_owner",
-    "os_compute_api:os-floating-ip-dns:domain:update": "rule:admin_api",
-    "os_compute_api:os-floating-ip-dns:domain:delete": "rule:admin_api",
-    "os_compute_api:os-floating-ip-pools": "rule:admin_or_owner",
-    "os_compute_api:os-floating-ips": "rule:admin_or_owner",
-    "os_compute_api:os-floating-ips-bulk": "rule:admin_api",
-    "os_compute_api:os-fping:all_tenants": "rule:admin_api",
-    "os_compute_api:os-fping": "rule:admin_or_owner",
-    "os_compute_api:os-hide-server-addresses": "is_admin:False",
-    "os_compute_api:os-hosts": "rule:admin_api",
-    "os_compute_api:os-hypervisors": "rule:admin_api",
-    "os_compute_api:image-size": "rule:admin_or_owner",
-    "os_compute_api:os-instance-actions:events": "rule:admin_api",
-    "os_compute_api:os-instance-actions": "rule:admin_or_owner",
-    "os_compute_api:os-instance-usage-audit-log": "rule:admin_api",
-    "os_compute_api:ips:show": "rule:admin_or_owner",
-    "os_compute_api:ips:index": "rule:admin_or_owner",
-    "os_compute_api:os-keypairs:index": "rule:admin_api or user_id:%(user_id)s",
-    "os_compute_api:os-keypairs:create": "rule:admin_api or user_id:%(user_id)s",
-    "os_compute_api:os-keypairs:delete": "rule:admin_api or user_id:%(user_id)s",
-    "os_compute_api:os-keypairs:show": "rule:admin_api or user_id:%(user_id)s",
-    "os_compute_api:os-keypairs": "rule:admin_or_owner",
-    "os_compute_api:limits": "rule:admin_or_owner",
-    "os_compute_api:os-lock-server:lock": "rule:admin_or_owner",
-    "os_compute_api:os-lock-server:unlock": "rule:admin_or_owner",
-    "os_compute_api:os-lock-server:unlock:unlock_override": "rule:admin_api",
-    "os_compute_api:os-migrate-server:migrate": "rule:admin_api",
-    "os_compute_api:os-migrate-server:migrate_live": "rule:admin_api",
-    "os_compute_api:os-migrations:index": "rule:admin_api",
-    "os_compute_api:os-multinic": "rule:admin_or_owner",
-    "os_compute_api:os-networks": "rule:admin_api",
-    "os_compute_api:os-networks:view": "rule:admin_or_owner",
-    "os_compute_api:os-networks-associate": "rule:admin_api",
-    "os_compute_api:os-pause-server:pause": "rule:admin_or_owner",
-    "os_compute_api:os-pause-server:unpause": "rule:admin_or_owner",
-    "os_compute_api:os-quota-class-sets:show": "is_admin:True or quota_class:%(quota_class)s",
-    "os_compute_api:os-quota-class-sets:update": "rule:admin_api",
-    "os_compute_api:os-quota-sets:update": "rule:admin_api",
-    "os_compute_api:os-quota-sets:defaults": "@",
-    "os_compute_api:os-quota-sets:show": "rule:admin_or_owner",
-    "os_compute_api:os-quota-sets:delete": "rule:admin_api",
-    "os_compute_api:os-quota-sets:detail": "rule:admin_or_owner",
-    "os_compute_api:os-remote-consoles": "rule:admin_or_owner",
-    "os_compute_api:os-rescue": "rule:admin_or_owner",
-    "os_compute_api:os-security-group-default-rules": "rule:admin_api",
-    "os_compute_api:os-security-groups": "rule:admin_or_owner",
-    "os_compute_api:os-server-diagnostics": "rule:admin_api",
-    "os_compute_api:os-server-external-events:create": "rule:admin_api",
-    "os_compute_api:os-server-groups": "rule:admin_or_owner",
-    "os_compute_api:os-server-groups:create": "rule:os_compute_api:os-server-groups",
-    "os_compute_api:os-server-groups:delete": "rule:os_compute_api:os-server-groups",
-    "os_compute_api:os-server-groups:index": "rule:os_compute_api:os-server-groups",
-    "os_compute_api:os-server-groups:show": "rule:os_compute_api:os-server-groups",
-    "os_compute_api:server-metadata:index": "rule:admin_or_owner",
-    "os_compute_api:server-metadata:show": "rule:admin_or_owner",
-    "os_compute_api:server-metadata:create": "rule:admin_or_owner",
-    "os_compute_api:server-metadata:update_all": "rule:admin_or_owner",
-    "os_compute_api:server-metadata:update": "rule:admin_or_owner",
-    "os_compute_api:server-metadata:delete": "rule:admin_or_owner",
-    "os_compute_api:os-server-password": "rule:admin_or_owner",
-    "os_compute_api:os-server-tags:delete_all": "rule:admin_or_owner",
-    "os_compute_api:os-server-tags:index": "rule:admin_or_owner",
-    "os_compute_api:os-server-tags:update_all": "rule:admin_or_owner",
-    "os_compute_api:os-server-tags:delete": "rule:admin_or_owner",
-    "os_compute_api:os-server-tags:update": "rule:admin_or_owner",
-    "os_compute_api:os-server-tags:show": "rule:admin_or_owner",
-    "os_compute_api:os-server-usage": "rule:admin_or_owner",
+    "compute_extension:accounts": "rule:admin_api",
+    "compute_extension:admin_actions": "rule:admin_api",
+    "compute_extension:admin_actions:pause": "rule:admin_or_owner",
+    "compute_extension:admin_actions:unpause": "rule:admin_or_owner",
+    "compute_extension:admin_actions:suspend": "rule:admin_or_owner",
+    "compute_extension:admin_actions:resume": "rule:admin_or_owner",
+    "compute_extension:admin_actions:lock": "rule:admin_or_owner",
+    "compute_extension:admin_actions:unlock": "rule:admin_or_owner",
+    "compute_extension:admin_actions:resetNetwork": "rule:admin_api",
+    "compute_extension:admin_actions:injectNetworkInfo": "rule:admin_api",
+    "compute_extension:admin_actions:createBackup": "rule:admin_or_owner",
+    "compute_extension:admin_actions:migrateLive": "rule:admin_api",
+    "compute_extension:admin_actions:resetState": "rule:admin_api",
+    "compute_extension:admin_actions:migrate": "rule:admin_api",
+    "compute_extension:aggregates": "rule:admin_api",
+    "compute_extension:agents": "rule:admin_api",
+    "compute_extension:attach_interfaces": "rule:admin_or_owner",
+    "compute_extension:baremetal_nodes": "rule:admin_api",
+    "compute_extension:cells": "rule:admin_api",
+    "compute_extension:cells:create": "rule:admin_api",
+    "compute_extension:cells:delete": "rule:admin_api",
+    "compute_extension:cells:update": "rule:admin_api",
+    "compute_extension:cells:sync_instances": "rule:admin_api",
+    "compute_extension:certificates": "rule:admin_or_owner",
+    "compute_extension:cloudpipe": "rule:admin_api",
+    "compute_extension:cloudpipe_update": "rule:admin_api",
+    "compute_extension:config_drive": "rule:admin_or_owner",
+    "compute_extension:console_output": "rule:admin_or_owner",
+    "compute_extension:consoles": "rule:admin_or_owner",
+    "compute_extension:createserverext": "rule:admin_or_owner",
+    "compute_extension:deferred_delete": "rule:admin_or_owner",
+    "compute_extension:disk_config": "rule:admin_or_owner",
+    "compute_extension:evacuate": "rule:admin_api",
+    "compute_extension:extended_server_attributes": "rule:admin_api",
+    "compute_extension:extended_status": "rule:admin_or_owner",
+    "compute_extension:extended_availability_zone": "rule:admin_or_owner",
+    "compute_extension:extended_ips": "rule:admin_or_owner",
+    "compute_extension:extended_ips_mac": "rule:admin_or_owner",
+    "compute_extension:extended_vif_net": "rule:admin_or_owner",
+    "compute_extension:extended_volumes": "rule:admin_or_owner",
+    "compute_extension:fixed_ips": "rule:admin_api",
+    "compute_extension:flavor_access": "rule:admin_or_owner",
+    "compute_extension:flavor_access:addTenantAccess": "rule:admin_api",
+    "compute_extension:flavor_access:removeTenantAccess": "rule:admin_api",
+    "compute_extension:flavor_disabled": "rule:admin_or_owner",
+    "compute_extension:flavor_rxtx": "rule:admin_or_owner",
+    "compute_extension:flavor_swap": "rule:admin_or_owner",
+    "compute_extension:flavorextradata": "rule:admin_or_owner",
+    "compute_extension:flavorextraspecs:index": "rule:admin_or_owner",
+    "compute_extension:flavorextraspecs:show": "rule:admin_or_owner",
+    "compute_extension:flavorextraspecs:create": "rule:admin_api",
+    "compute_extension:flavorextraspecs:update": "rule:admin_api",
+    "compute_extension:flavorextraspecs:delete": "rule:admin_api",
+    "compute_extension:flavormanage": "rule:admin_api",
+    "compute_extension:floating_ip_dns": "rule:admin_or_owner",
+    "compute_extension:floating_ip_pools": "rule:admin_or_owner",
+    "compute_extension:floating_ips": "rule:admin_or_owner",
+    "compute_extension:floating_ips_bulk": "rule:admin_api",
+    "compute_extension:fping": "rule:admin_or_owner",
+    "compute_extension:fping:all_tenants": "rule:admin_api",
+    "compute_extension:hide_server_addresses": "is_admin:False",
+    "compute_extension:hosts": "rule:admin_api",
+    "compute_extension:hypervisors": "rule:admin_api",
+    "compute_extension:image_size": "rule:admin_or_owner",
+    "compute_extension:instance_actions": "rule:admin_or_owner",
+    "compute_extension:instance_actions:events": "rule:admin_api",
+    "compute_extension:instance_usage_audit_log": "rule:admin_api",
+    "compute_extension:keypairs": "rule:admin_or_owner",
+    "compute_extension:keypairs:index": "rule:admin_or_owner",
+    "compute_extension:keypairs:show": "rule:admin_or_owner",
+    "compute_extension:keypairs:create": "rule:admin_or_owner",
+    "compute_extension:keypairs:delete": "rule:admin_or_owner",
+    "compute_extension:multinic": "rule:admin_or_owner",
+    "compute_extension:networks": "rule:admin_api",
+    "compute_extension:networks:view": "rule:admin_or_owner",
+    "compute_extension:networks_associate": "rule:admin_api",
+    "compute_extension:os-tenant-networks": "rule:admin_or_owner",
+    "compute_extension:quotas:show": "rule:admin_or_owner",
+    "compute_extension:quotas:update": "rule:admin_api",
+    "compute_extension:quotas:delete": "rule:admin_api",
+    "compute_extension:quota_classes": "rule:admin_or_owner",
+    "compute_extension:rescue": "rule:admin_or_owner",
+    "compute_extension:security_group_default_rules": "rule:admin_api",
+    "compute_extension:security_groups": "rule:admin_or_owner",
+    "compute_extension:server_diagnostics": "rule:admin_api",
+    "compute_extension:server_groups": "rule:admin_or_owner",
+    "compute_extension:server_password": "rule:admin_or_owner",
+    "compute_extension:server_usage": "rule:admin_or_owner",
+    "compute_extension:services": "rule:admin_api",
+    "compute_extension:shelve": "rule:admin_or_owner",
+    "compute_extension:shelveOffload": "rule:admin_api",
+    "compute_extension:simple_tenant_usage:show": "rule:admin_or_owner",
+    "compute_extension:simple_tenant_usage:list": "rule:admin_api",
+    "compute_extension:unshelve": "rule:admin_or_owner",
+    "compute_extension:users": "rule:admin_api",
+    "compute_extension:virtual_interfaces": "rule:admin_or_owner",
+    "compute_extension:virtual_storage_arrays": "rule:admin_or_owner",
+    "compute_extension:volumes": "rule:admin_or_owner",
+    "compute_extension:volume_attachments:index": "rule:admin_or_owner",
+    "compute_extension:volume_attachments:show": "rule:admin_or_owner",
+    "compute_extension:volume_attachments:create": "rule:admin_or_owner",
+    "compute_extension:volume_attachments:update": "rule:admin_api",
+    "compute_extension:volume_attachments:delete": "rule:admin_or_owner",
+    "compute_extension:volumetypes": "rule:admin_or_owner",
+    "compute_extension:availability_zone:list": "rule:admin_or_owner",
+    "compute_extension:availability_zone:detail": "rule:admin_api",
+    "compute_extension:used_limits_for_admin": "rule:admin_api",
+    "compute_extension:migrations:index": "rule:admin_api",
+    "compute_extension:os-assisted-volume-snapshots:create": "rule:admin_api",
+    "compute_extension:os-assisted-volume-snapshots:delete": "rule:admin_api",
+    "compute_extension:console_auth_tokens": "rule:admin_api",
+    "compute_extension:os-server-external-events:create": "rule:admin_api",
+
+    "network:get_all": "rule:admin_or_owner",
+    "network:get": "rule:admin_or_owner",
+    "network:create": "rule:admin_or_owner",
+    "network:delete": "rule:admin_or_owner",
+    "network:associate": "rule:admin_or_owner",
+    "network:disassociate": "rule:admin_or_owner",
+    "network:get_vifs_by_instance": "rule:admin_or_owner",
+    "network:allocate_for_instance": "rule:admin_or_owner",
+    "network:deallocate_for_instance": "rule:admin_or_owner",
+    "network:validate_networks": "rule:admin_or_owner",
+    "network:get_instance_uuids_by_ip_filter": "rule:admin_or_owner",
+    "network:get_instance_id_by_floating_address": "rule:admin_or_owner",
+    "network:setup_networks_on_host": "rule:admin_or_owner",
+    "network:get_backdoor_port": "rule:admin_or_owner",
+
+    "network:get_floating_ip": "rule:admin_or_owner",
+    "network:get_floating_ip_pools": "rule:admin_or_owner",
+    "network:get_floating_ip_by_address": "rule:admin_or_owner",
+    "network:get_floating_ips_by_project": "rule:admin_or_owner",
+    "network:get_floating_ips_by_fixed_address": "rule:admin_or_owner",
+    "network:allocate_floating_ip": "rule:admin_or_owner",
+    "network:associate_floating_ip": "rule:admin_or_owner",
+    "network:disassociate_floating_ip": "rule:admin_or_owner",
+    "network:release_floating_ip": "rule:admin_or_owner",
+    "network:migrate_instance_start": "rule:admin_or_owner",
+    "network:migrate_instance_finish": "rule:admin_or_owner",
+
+    "network:get_fixed_ip": "rule:admin_or_owner",
+    "network:get_fixed_ip_by_address": "rule:admin_or_owner",
+    "network:add_fixed_ip_to_instance": "rule:admin_or_owner",
+    "network:remove_fixed_ip_from_instance": "rule:admin_or_owner",
+    "network:add_network_to_project": "rule:admin_or_owner",
+    "network:get_instance_nw_info": "rule:admin_or_owner",
+
+    "network:get_dns_domains": "rule:admin_or_owner",
+    "network:add_dns_entry": "rule:admin_or_owner",
+    "network:modify_dns_entry": "rule:admin_or_owner",
+    "network:delete_dns_entry": "rule:admin_or_owner",
+    "network:get_dns_entries_by_address": "rule:admin_or_owner",
+    "network:get_dns_entries_by_name": "rule:admin_or_owner",
+    "network:create_private_dns_domain": "rule:admin_or_owner",
+    "network:create_public_dns_domain": "rule:admin_or_owner",
+    "network:delete_dns_domain": "rule:admin_or_owner",
+    "network:attach_external_network": "rule:admin_api",
+    "network:get_vif_by_mac_address": "rule:admin_or_owner",
+
+    "os_compute_api:servers:detail:get_all_tenants": "is_admin:True",
+    "os_compute_api:servers:index:get_all_tenants": "is_admin:True",
+    "os_compute_api:servers:confirm_resize": "rule:admin_or_owner",
+    "os_compute_api:servers:create": "rule:admin_or_owner",
+    "os_compute_api:servers:create:attach_network": "rule:admin_or_owner",
+    "os_compute_api:servers:create:attach_volume": "rule:admin_or_owner",
+    "os_compute_api:servers:create:forced_host": "rule:admin_api",
+    "os_compute_api:servers:delete": "rule:admin_or_owner",
+    "os_compute_api:servers:update": "rule:admin_or_owner",
+    "os_compute_api:servers:detail": "rule:admin_or_owner",
     "os_compute_api:servers:index": "rule:admin_or_owner",
-    "os_compute_api:servers:detail": "rule:admin_or_owner",
-    "os_compute_api:servers:index:get_all_tenants": "rule:admin_api",
-    "os_compute_api:servers:detail:get_all_tenants": "rule:admin_api",
+    "os_compute_api:servers:reboot": "rule:admin_or_owner",
+    "os_compute_api:servers:rebuild": "rule:admin_or_owner",
+    "os_compute_api:servers:resize": "rule:admin_or_owner",
+    "os_compute_api:servers:revert_resize": "rule:admin_or_owner",
     "os_compute_api:servers:show": "rule:admin_or_owner",
     "os_compute_api:servers:show:host_status": "rule:admin_api",
-    "os_compute_api:servers:create": "rule:admin_or_owner",
-    "os_compute_api:servers:create:forced_host": "rule:admin_api",
-    "os_compute_api:servers:create:attach_volume": "rule:admin_or_owner",
-    "os_compute_api:servers:create:attach_network": "rule:admin_or_owner",
-    "network:attach_external_network": "is_admin:True",
-    "os_compute_api:servers:delete": "rule:admin_or_owner",
-    "os_compute_api:servers:update": "rule:admin_or_owner",
-    "os_compute_api:servers:confirm_resize": "rule:admin_or_owner",
-    "os_compute_api:servers:revert_resize": "rule:admin_or_owner",
-    "os_compute_api:servers:reboot": "rule:admin_or_owner",
-    "os_compute_api:servers:resize": "rule:admin_or_owner",
-    "os_compute_api:servers:rebuild": "rule:admin_or_owner",
     "os_compute_api:servers:create_image": "rule:admin_or_owner",
     "os_compute_api:servers:create_image:allow_volume_backed": "rule:admin_or_owner",
     "os_compute_api:servers:start": "rule:admin_or_owner",
     "os_compute_api:servers:stop": "rule:admin_or_owner",
     "os_compute_api:servers:trigger_crash_dump": "rule:admin_or_owner",
-    "os_compute_api:servers:migrations:show": "rule:admin_api",
     "os_compute_api:servers:migrations:force_complete": "rule:admin_api",
     "os_compute_api:servers:migrations:delete": "rule:admin_api",
+    "os_compute_api:servers:discoverable": "@",
     "os_compute_api:servers:migrations:index": "rule:admin_api",
+    "os_compute_api:servers:migrations:show": "rule:admin_api",
+    "os_compute_api:os-access-ips:discoverable": "@",
+    "os_compute_api:os-access-ips": "rule:admin_or_owner",
+    "os_compute_api:os-admin-actions": "rule:admin_api",
+    "os_compute_api:os-admin-actions:discoverable": "@",
+    "os_compute_api:os-admin-actions:reset_network": "rule:admin_api",
+    "os_compute_api:os-admin-actions:inject_network_info": "rule:admin_api",
+    "os_compute_api:os-admin-actions:reset_state": "rule:admin_api",
+    "os_compute_api:os-admin-password": "rule:admin_or_owner",
+    "os_compute_api:os-admin-password:discoverable": "@",
+    "os_compute_api:os-aggregates:discoverable": "@",
+    "os_compute_api:os-aggregates:index": "rule:admin_api",
+    "os_compute_api:os-aggregates:create": "rule:admin_api",
+    "os_compute_api:os-aggregates:show": "rule:admin_api",
+    "os_compute_api:os-aggregates:update": "rule:admin_api",
+    "os_compute_api:os-aggregates:delete": "rule:admin_api",
+    "os_compute_api:os-aggregates:add_host": "rule:admin_api",
+    "os_compute_api:os-aggregates:remove_host": "rule:admin_api",
+    "os_compute_api:os-aggregates:set_metadata": "rule:admin_api",
+    "os_compute_api:os-agents": "rule:admin_api",
+    "os_compute_api:os-agents:discoverable": "@",
+    "os_compute_api:os-attach-interfaces": "rule:admin_or_owner",
+    "os_compute_api:os-attach-interfaces:discoverable": "@",
+    "os_compute_api:os-baremetal-nodes": "rule:admin_api",
+    "os_compute_api:os-baremetal-nodes:discoverable": "@",
+    "os_compute_api:os-block-device-mapping-v1:discoverable": "@",
+    "os_compute_api:os-cells": "rule:admin_api",
+    "os_compute_api:os-cells:create": "rule:admin_api",
+    "os_compute_api:os-cells:delete": "rule:admin_api",
+    "os_compute_api:os-cells:update": "rule:admin_api",
+    "os_compute_api:os-cells:sync_instances": "rule:admin_api",
+    "os_compute_api:os-cells:discoverable": "@",
+    "os_compute_api:os-certificates:create": "rule:admin_or_owner",
+    "os_compute_api:os-certificates:show": "rule:admin_or_owner",
+    "os_compute_api:os-certificates:discoverable": "@",
+    "os_compute_api:os-cloudpipe": "rule:admin_api",
+    "os_compute_api:os-cloudpipe:discoverable": "@",
+    "os_compute_api:os-config-drive": "rule:admin_or_owner",
+    "os_compute_api:os-config-drive:discoverable": "@",
+    "os_compute_api:os-consoles:discoverable": "@",
+    "os_compute_api:os-consoles:create": "rule:admin_or_owner",
+    "os_compute_api:os-consoles:delete": "rule:admin_or_owner",
+    "os_compute_api:os-consoles:index": "rule:admin_or_owner",
+    "os_compute_api:os-consoles:show": "rule:admin_or_owner",
+    "os_compute_api:os-console-output:discoverable": "@",
+    "os_compute_api:os-console-output": "rule:admin_or_owner",
+    "os_compute_api:os-remote-consoles": "rule:admin_or_owner",
+    "os_compute_api:os-remote-consoles:discoverable": "@",
+    "os_compute_api:os-create-backup:discoverable": "@",
+    "os_compute_api:os-create-backup": "rule:admin_or_owner",
+    "os_compute_api:os-deferred-delete": "rule:admin_or_owner",
+    "os_compute_api:os-deferred-delete:discoverable": "@",
+    "os_compute_api:os-disk-config": "rule:admin_or_owner",
+    "os_compute_api:os-disk-config:discoverable": "@",
+    "os_compute_api:os-evacuate": "rule:admin_api",
+    "os_compute_api:os-evacuate:discoverable": "@",
+    "os_compute_api:os-extended-server-attributes": "rule:admin_api",
+    "os_compute_api:os-extended-server-attributes:discoverable": "@",
+    "os_compute_api:os-extended-status": "rule:admin_or_owner",
+    "os_compute_api:os-extended-status:discoverable": "@",
+    "os_compute_api:os-extended-availability-zone": "rule:admin_or_owner",
+    "os_compute_api:os-extended-availability-zone:discoverable": "@",
+    "os_compute_api:extensions": "rule:admin_or_owner",
+    "os_compute_api:extensions:discoverable": "@",
+    "os_compute_api:extension_info:discoverable": "@",
+    "os_compute_api:os-extended-volumes": "rule:admin_or_owner",
+    "os_compute_api:os-extended-volumes:discoverable": "@",
+    "os_compute_api:os-fixed-ips": "rule:admin_api",
+    "os_compute_api:os-fixed-ips:discoverable": "@",
+    "os_compute_api:os-flavor-access": "rule:admin_or_owner",
+    "os_compute_api:os-flavor-access:discoverable": "@",
+    "os_compute_api:os-flavor-access:remove_tenant_access": "rule:admin_api",
+    "os_compute_api:os-flavor-access:add_tenant_access": "rule:admin_api",
+    "os_compute_api:os-flavor-rxtx": "rule:admin_or_owner",
+    "os_compute_api:os-flavor-rxtx:discoverable": "@",
+    "os_compute_api:flavors": "rule:admin_or_owner",
+    "os_compute_api:flavors:discoverable": "@",
+    "os_compute_api:os-flavor-extra-specs:discoverable": "@",
+    "os_compute_api:os-flavor-extra-specs:index": "rule:admin_or_owner",
+    "os_compute_api:os-flavor-extra-specs:show": "rule:admin_or_owner",
+    "os_compute_api:os-flavor-extra-specs:create": "rule:admin_api",
+    "os_compute_api:os-flavor-extra-specs:update": "rule:admin_api",
+    "os_compute_api:os-flavor-extra-specs:delete": "rule:admin_api",
+    "os_compute_api:os-flavor-manage:discoverable": "@",
+    "os_compute_api:os-flavor-manage": "rule:admin_api",
+    "os_compute_api:os-floating-ip-dns": "rule:admin_or_owner",
+    "os_compute_api:os-floating-ip-dns:discoverable": "@",
+    "os_compute_api:os-floating-ip-dns:domain:update": "rule:admin_api",
+    "os_compute_api:os-floating-ip-dns:domain:delete": "rule:admin_api",
+    "os_compute_api:os-floating-ip-pools": "rule:admin_or_owner",
+    "os_compute_api:os-floating-ip-pools:discoverable": "@",
+    "os_compute_api:os-floating-ips": "rule:admin_or_owner",
+    "os_compute_api:os-floating-ips:discoverable": "@",
+    "os_compute_api:os-floating-ips-bulk": "rule:admin_api",
+    "os_compute_api:os-floating-ips-bulk:discoverable": "@",
+    "os_compute_api:os-fping": "rule:admin_or_owner",
+    "os_compute_api:os-fping:discoverable": "@",
+    "os_compute_api:os-fping:all_tenants": "rule:admin_api",
+    "os_compute_api:os-hide-server-addresses": "is_admin:False",
+    "os_compute_api:os-hide-server-addresses:discoverable": "@",
+    "os_compute_api:os-hosts": "rule:admin_api",
+    "os_compute_api:os-hosts:discoverable": "@",
+    "os_compute_api:os-hypervisors": "rule:admin_api",
+    "os_compute_api:os-hypervisors:discoverable": "@",
+    "os_compute_api:images:discoverable": "@",
+    "os_compute_api:image-size": "rule:admin_or_owner",
+    "os_compute_api:image-size:discoverable": "@",
+    "os_compute_api:os-instance-actions": "rule:admin_or_owner",
+    "os_compute_api:os-instance-actions:discoverable": "@",
+    "os_compute_api:os-instance-actions:events": "rule:admin_api",
+    "os_compute_api:os-instance-usage-audit-log": "rule:admin_api",
+    "os_compute_api:os-instance-usage-audit-log:discoverable": "@",
+    "os_compute_api:ips:discoverable": "@",
+    "os_compute_api:ips:index": "rule:admin_or_owner",
+    "os_compute_api:ips:show": "rule:admin_or_owner",
+    "os_compute_api:os-keypairs:discoverable": "@",
+    "os_compute_api:os-keypairs": "rule:admin_or_owner",
+    "os_compute_api:os-keypairs:index": "rule:admin_api or user_id:%(user_id)s",
+    "os_compute_api:os-keypairs:show": "rule:admin_api or user_id:%(user_id)s",
+    "os_compute_api:os-keypairs:create": "rule:admin_api or user_id:%(user_id)s",
+    "os_compute_api:os-keypairs:delete": "rule:admin_api or user_id:%(user_id)s",
+    "os_compute_api:limits:discoverable": "@",
+    "os_compute_api:limits": "rule:admin_or_owner",
+    "os_compute_api:os-lock-server:discoverable": "@",
+    "os_compute_api:os-lock-server:lock": "rule:admin_or_owner",
+    "os_compute_api:os-lock-server:unlock": "rule:admin_or_owner",
+    "os_compute_api:os-lock-server:unlock:unlock_override": "rule:admin_api",
+    "os_compute_api:os-migrate-server:discoverable": "@",
+    "os_compute_api:os-migrate-server:migrate": "rule:admin_api",
+    "os_compute_api:os-migrate-server:migrate_live": "rule:admin_api",
+    "os_compute_api:os-multinic": "rule:admin_or_owner",
+    "os_compute_api:os-multinic:discoverable": "@",
+    "os_compute_api:os-networks": "rule:admin_api",
+    "os_compute_api:os-networks:view": "rule:admin_or_owner",
+    "os_compute_api:os-networks:discoverable": "@",
+    "os_compute_api:os-networks-associate": "rule:admin_api",
+    "os_compute_api:os-networks-associate:discoverable": "@",
+    "os_compute_api:os-pause-server:discoverable": "@",
+    "os_compute_api:os-pause-server:pause": "rule:admin_or_owner",
+    "os_compute_api:os-pause-server:unpause": "rule:admin_or_owner",
+    "os_compute_api:os-pci:pci_servers": "rule:admin_or_owner",
+    "os_compute_api:os-pci:discoverable": "@",
+    "os_compute_api:os-pci:index": "rule:admin_api",
+    "os_compute_api:os-pci:detail": "rule:admin_api",
+    "os_compute_api:os-pci:show": "rule:admin_api",
+    "os_compute_api:os-personality:discoverable": "@",
+    "os_compute_api:os-preserve-ephemeral-rebuild:discoverable": "@",
+    "os_compute_api:os-quota-sets:discoverable": "@",
+    "os_compute_api:os-quota-sets:show": "rule:admin_or_owner",
+    "os_compute_api:os-quota-sets:defaults": "@",
+    "os_compute_api:os-quota-sets:update": "rule:admin_api",
+    "os_compute_api:os-quota-sets:delete": "rule:admin_api",
+    "os_compute_api:os-quota-sets:detail": "rule:admin_api",
+    "os_compute_api:os-quota-class-sets:update": "rule:admin_api",
+    "os_compute_api:os-quota-class-sets:show": "is_admin:True or quota_class:%(quota_class)s",
+    "os_compute_api:os-quota-class-sets:discoverable": "@",
+    "os_compute_api:os-rescue": "rule:admin_or_owner",
+    "os_compute_api:os-rescue:discoverable": "@",
+    "os_compute_api:os-scheduler-hints:discoverable": "@",
+    "os_compute_api:os-security-group-default-rules:discoverable": "@",
+    "os_compute_api:os-security-group-default-rules": "rule:admin_api",
+    "os_compute_api:os-security-groups": "rule:admin_or_owner",
+    "os_compute_api:os-security-groups:discoverable": "@",
+    "os_compute_api:os-server-diagnostics": "rule:admin_api",
+    "os_compute_api:os-server-diagnostics:discoverable": "@",
+    "os_compute_api:os-server-password": "rule:admin_or_owner",
+    "os_compute_api:os-server-password:discoverable": "@",
+    "os_compute_api:os-server-usage": "rule:admin_or_owner",
+    "os_compute_api:os-server-usage:discoverable": "@",
+    "os_compute_api:os-server-groups": "rule:admin_or_owner",
+    "os_compute_api:os-server-groups:discoverable": "@",
+    "os_compute_api:os-server-tags:index": "@",
+    "os_compute_api:os-server-tags:show": "@",
+    "os_compute_api:os-server-tags:update": "@",
+    "os_compute_api:os-server-tags:update_all": "@",
+    "os_compute_api:os-server-tags:delete": "@",
+    "os_compute_api:os-server-tags:delete_all": "@",
     "os_compute_api:os-services": "rule:admin_api",
+    "os_compute_api:os-services:discoverable": "@",
+    "os_compute_api:server-metadata:discoverable": "@",
+    "os_compute_api:server-metadata:index": "rule:admin_or_owner",
+    "os_compute_api:server-metadata:show": "rule:admin_or_owner",
+    "os_compute_api:server-metadata:delete": "rule:admin_or_owner",
+    "os_compute_api:server-metadata:create": "rule:admin_or_owner",
+    "os_compute_api:server-metadata:update": "rule:admin_or_owner",
+    "os_compute_api:server-metadata:update_all": "rule:admin_or_owner",
     "os_compute_api:os-shelve:shelve": "rule:admin_or_owner",
-    "os_compute_api:os-shelve:unshelve": "rule:admin_or_owner",
+    "os_compute_api:os-shelve:shelve:discoverable": "@",
     "os_compute_api:os-shelve:shelve_offload": "rule:admin_api",
+    "os_compute_api:os-simple-tenant-usage:discoverable": "@",
     "os_compute_api:os-simple-tenant-usage:show": "rule:admin_or_owner",
     "os_compute_api:os-simple-tenant-usage:list": "rule:admin_api",
+    "os_compute_api:os-suspend-server:discoverable": "@",
+    "os_compute_api:os-suspend-server:suspend": "rule:admin_or_owner",
     "os_compute_api:os-suspend-server:resume": "rule:admin_or_owner",
-    "os_compute_api:os-suspend-server:suspend": "rule:admin_or_owner",
     "os_compute_api:os-tenant-networks": "rule:admin_or_owner",
+    "os_compute_api:os-tenant-networks:discoverable": "@",
+    "os_compute_api:os-shelve:unshelve": "rule:admin_or_owner",
+    "os_compute_api:os-user-data:discoverable": "@",
+    "os_compute_api:os-virtual-interfaces": "rule:admin_or_owner",
+    "os_compute_api:os-virtual-interfaces:discoverable": "@",
+    "os_compute_api:os-volumes": "rule:admin_or_owner",
+    "os_compute_api:os-volumes:discoverable": "@",
+    "os_compute_api:os-volumes-attachments:index": "rule:admin_or_owner",
+    "os_compute_api:os-volumes-attachments:show": "rule:admin_or_owner",
+    "os_compute_api:os-volumes-attachments:create": "rule:admin_or_owner",
+    "os_compute_api:os-volumes-attachments:update": "rule:admin_api",
+    "os_compute_api:os-volumes-attachments:delete": "rule:admin_or_owner",
+    "os_compute_api:os-volumes-attachments:discoverable": "@",
+    "os_compute_api:os-availability-zone:list": "rule:admin_or_owner",
+    "os_compute_api:os-availability-zone:discoverable": "@",
+    "os_compute_api:os-availability-zone:detail": "rule:admin_api",
     "os_compute_api:os-used-limits": "rule:admin_api",
-    "os_compute_api:os-virtual-interfaces": "rule:admin_or_owner",
-    "os_compute_api:os-volumes": "rule:admin_or_owner",
-    "os_compute_api:os-volumes-attachments:index": "rule:admin_or_owner",
-    "os_compute_api:os-volumes-attachments:create": "rule:admin_or_owner",
-    "os_compute_api:os-volumes-attachments:show": "rule:admin_or_owner",
-    "os_compute_api:os-volumes-attachments:update": "rule:admin_api",
-    "os_compute_api:os-volumes-attachments:delete": "rule:admin_or_owner"
+    "os_compute_api:os-used-limits:discoverable": "@",
+    "os_compute_api:os-migrations:index": "rule:admin_api",
+    "os_compute_api:os-migrations:discoverable": "@",
+    "os_compute_api:os-assisted-volume-snapshots:create": "rule:admin_api",
+    "os_compute_api:os-assisted-volume-snapshots:delete": "rule:admin_api",
+    "os_compute_api:os-assisted-volume-snapshots:discoverable": "@",
+    "os_compute_api:os-console-auth-tokens": "rule:admin_api",
+    "os_compute_api:os-console-auth-tokens:discoverable": "@",
+    "os_compute_api:os-server-external-events:create": "rule:admin_api",
+    "os_compute_api:os-server-external-events:discoverable": "@"
 }

2018-09-01 23:29:25,448 [salt.state       :1941][INFO    ][7294] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/nova_policy.json] at time 23:29:25.448642 duration_in_ms=47.195
2018-09-01 23:29:25,449 [salt.state       :1770][INFO    ][7294] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/neutron_policy.json] at time 23:29:25.449465
2018-09-01 23:29:25,449 [salt.state       :1803][INFO    ][7294] Executing state file.managed for [/usr/share/openstack-dashboard/openstack_dashboard/conf/neutron_policy.json]
2018-09-01 23:29:25,466 [salt.fileclient  :1215][INFO    ][7294] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/queens/neutron_policy.json'
2018-09-01 23:29:25,469 [salt.state       :290 ][INFO    ][7294] File changed:
--- 
+++ 
@@ -7,8 +7,9 @@
     "admin_owner_or_network_owner": "rule:owner or rule:admin_or_network_owner",
     "admin_only": "rule:context_is_admin",
     "regular_user": "",
-    "admin_or_data_plane_int": "rule:context_is_admin or role:data_plane_integrator",
     "shared": "field:networks:shared=True",
+    "shared_firewalls": "field:firewalls:shared=True",
+    "shared_firewall_policies": "field:firewall_policies:shared=True",
     "shared_subnetpools": "field:subnetpools:shared=True",
     "shared_address_scopes": "field:address_scopes:shared=True",
     "external": "field:networks:router:external=True",
@@ -16,11 +17,9 @@
 
     "create_subnet": "rule:admin_or_network_owner",
     "create_subnet:segment_id": "rule:admin_only",
-    "create_subnet:service_types": "rule:admin_only",
     "get_subnet": "rule:admin_or_owner or rule:shared",
     "get_subnet:segment_id": "rule:admin_only",
     "update_subnet": "rule:admin_or_network_owner",
-    "update_subnet:service_types": "rule:admin_only",
     "delete_subnet": "rule:admin_or_network_owner",
 
     "create_subnetpool": "",
@@ -94,7 +93,6 @@
     "update_port:binding:profile": "rule:admin_only",
     "update_port:mac_learning_enabled": "rule:context_is_advsvc or rule:admin_or_network_owner",
     "update_port:allowed_address_pairs": "rule:admin_or_network_owner",
-    "update_port:data_plane_status": "rule:admin_or_data_plane_int",
     "delete_port": "rule:context_is_advsvc or rule:admin_owner_or_network_owner",
 
     "get_router:ha": "rule:admin_only",
@@ -104,9 +102,6 @@
     "create_router:ha": "rule:admin_only",
     "get_router": "rule:admin_or_owner",
     "get_router:distributed": "rule:admin_only",
-    "update_router": "rule:admin_or_owner",
-    "update_router:external_gateway_info": "rule:admin_or_owner",
-    "update_router:external_gateway_info:network_id": "rule:admin_or_owner",
     "update_router:external_gateway_info:enable_snat": "rule:admin_only",
     "update_router:distributed": "rule:admin_only",
     "update_router:ha": "rule:admin_only",
@@ -117,6 +112,28 @@
 
     "create_router:external_gateway_info:external_fixed_ips": "rule:admin_only",
     "update_router:external_gateway_info:external_fixed_ips": "rule:admin_only",
+
+    "create_firewall": "",
+    "get_firewall": "rule:admin_or_owner",
+    "create_firewall:shared": "rule:admin_only",
+    "get_firewall:shared": "rule:admin_only",
+    "update_firewall": "rule:admin_or_owner",
+    "update_firewall:shared": "rule:admin_only",
+    "delete_firewall": "rule:admin_or_owner",
+
+    "create_firewall_policy": "",
+    "get_firewall_policy": "rule:admin_or_owner or rule:shared_firewall_policies",
+    "create_firewall_policy:shared": "rule:admin_or_owner",
+    "update_firewall_policy": "rule:admin_or_owner",
+    "delete_firewall_policy": "rule:admin_or_owner",
+
+    "insert_rule": "rule:admin_or_owner",
+    "remove_rule": "rule:admin_or_owner",
+
+    "create_firewall_rule": "",
+    "get_firewall_rule": "rule:admin_or_owner or rule:shared_firewalls",
+    "update_firewall_rule": "rule:admin_or_owner",
+    "delete_firewall_rule": "rule:admin_or_owner",
 
     "create_qos_queue": "rule:admin_only",
     "get_qos_queue": "rule:admin_only",
@@ -189,10 +206,6 @@
     "delete_policy_dscp_marking_rule": "rule:admin_only",
     "update_policy_dscp_marking_rule": "rule:admin_only",
     "get_rule_type": "rule:regular_user",
-    "get_policy_minimum_bandwidth_rule": "rule:regular_user",
-    "create_policy_minimum_bandwidth_rule": "rule:admin_only",
-    "delete_policy_minimum_bandwidth_rule": "rule:admin_only",
-    "update_policy_minimum_bandwidth_rule": "rule:admin_only",
 
     "restrict_wildcard": "(not field:rbac_policy:target_tenant=*) or rule:admin_only",
     "create_rbac_policy": "",
@@ -205,29 +218,5 @@
     "create_flavor_service_profile": "rule:admin_only",
     "delete_flavor_service_profile": "rule:admin_only",
     "get_flavor_service_profile": "rule:regular_user",
-    "get_auto_allocated_topology": "rule:admin_or_owner",
-
-    "create_trunk": "rule:regular_user",
-    "get_trunk": "rule:admin_or_owner",
-    "delete_trunk": "rule:admin_or_owner",
-    "get_subports": "",
-    "add_subports": "rule:admin_or_owner",
-    "remove_subports": "rule:admin_or_owner",
-
-    "get_security_groups": "rule:admin_or_owner",
-    "get_security_group": "rule:admin_or_owner",
-    "create_security_group": "rule:admin_or_owner",
-    "update_security_group": "rule:admin_or_owner",
-    "delete_security_group": "rule:admin_or_owner",
-    "get_security_group_rules": "rule:admin_or_owner",
-    "get_security_group_rule": "rule:admin_or_owner",
-    "create_security_group_rule": "rule:admin_or_owner",
-    "delete_security_group_rule": "rule:admin_or_owner",
-
-    "get_loggable_resources": "rule:admin_only",
-    "create_log": "rule:admin_only",
-    "update_log": "rule:admin_only",
-    "delete_log": "rule:admin_only",
-    "get_logs": "rule:admin_only",
-    "get_log": "rule:admin_only"
+    "get_auto_allocated_topology": "rule:admin_or_owner"
 }

2018-09-01 23:29:25,470 [salt.state       :1941][INFO    ][7294] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/neutron_policy.json] at time 23:29:25.469976 duration_in_ms=20.51
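The neutron policy written above is built almost entirely from `rule:<alias>` references (`rule:admin_or_owner`, `rule:shared_firewall_policies`, and so on) that resolve against other entries in the same file. As a minimal illustration of that aliasing — plain Python, not oslo.policy or Salt, with a toy policy mapping rather than the real file:

```python
import json

# Toy excerpt in the same shape as the neutron_policy.json above;
# the actual file defines many more rules.
policy = json.loads("""
{
    "context_is_admin": "role:admin",
    "owner": "tenant_id:%(tenant_id)s",
    "admin_or_owner": "rule:context_is_admin or rule:owner",
    "update_firewall": "rule:admin_or_owner"
}
""")

def expand(rule, depth=0):
    """Recursively replace rule:<alias> tokens with their definitions."""
    if depth > 10:  # guard against cyclic aliases
        raise ValueError("alias cycle")
    out = []
    for token in rule.split():
        if token.startswith("rule:"):
            out.append("(" + expand(policy[token[5:]], depth + 1) + ")")
        else:
            out.append(token)
    return " ".join(out)

print(expand(policy["update_firewall"]))
# -> ((role:admin) or (tenant_id:%(tenant_id)s))
```

This is only a sketch of the aliasing idea; the real evaluator (oslo.policy) parses `and`/`or`/`not` into an expression tree and checks credentials against each atom.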
2018-09-01 23:29:25,470 [salt.state       :1770][INFO    ][7294] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/glance_policy.json] at time 23:29:25.470430
2018-09-01 23:29:25,470 [salt.state       :1803][INFO    ][7294] Executing state file.managed for [/usr/share/openstack-dashboard/openstack_dashboard/conf/glance_policy.json]
2018-09-01 23:29:25,486 [salt.fileclient  :1215][INFO    ][7294] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/queens/glance_policy.json'
2018-09-01 23:29:25,487 [salt.state       :290 ][INFO    ][7294] File changed:
--- 
+++ 
@@ -8,7 +8,6 @@
     "get_images": "",
     "modify_image": "",
     "publicize_image": "role:admin",
-    "communitize_image": "",
     "copy_from": "",
 
     "download_image": "",
@@ -26,11 +25,10 @@
 
     "manage_image_cache": "role:admin",
 
-    "get_task": "",
-    "get_tasks": "",
-    "add_task": "",
-    "modify_task": "",
-    "tasks_api_access": "role:admin",
+    "get_task": "role:admin",
+    "get_tasks": "role:admin",
+    "add_task": "role:admin",
+    "modify_task": "role:admin",
 
     "deactivate": "",
     "reactivate": "",

2018-09-01 23:29:25,487 [salt.state       :1941][INFO    ][7294] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/glance_policy.json] at time 23:29:25.487869 duration_in_ms=17.438
2018-09-01 23:29:25,488 [salt.state       :1770][INFO    ][7294] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/ceilometer_policy.json] at time 23:29:25.488341
2018-09-01 23:29:25,488 [salt.state       :1803][INFO    ][7294] Executing state file.managed for [/usr/share/openstack-dashboard/openstack_dashboard/conf/ceilometer_policy.json]
2018-09-01 23:29:25,505 [salt.fileclient  :1215][INFO    ][7294] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/queens/ceilometer_policy.json'
2018-09-01 23:29:25,507 [salt.state       :290 ][INFO    ][7294] File changed:
New file
2018-09-01 23:29:25,507 [salt.state       :1941][INFO    ][7294] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/ceilometer_policy.json] at time 23:29:25.507235 duration_in_ms=18.894
2018-09-01 23:29:25,507 [salt.state       :1770][INFO    ][7294] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/cinder_policy.json] at time 23:29:25.507715
2018-09-01 23:29:25,507 [salt.state       :1803][INFO    ][7294] Executing state file.managed for [/usr/share/openstack-dashboard/openstack_dashboard/conf/cinder_policy.json]
2018-09-01 23:29:25,527 [salt.fileclient  :1215][INFO    ][7294] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/queens/cinder_policy.json'
2018-09-01 23:29:25,530 [salt.state       :290 ][INFO    ][7294] File changed:
--- 
+++ 
@@ -1,136 +1,113 @@
 {
     "context_is_admin": "role:admin",
-    "admin_or_owner": "is_admin:True or (role:admin and is_admin_project:True) or  project_id:%(project_id)s",
-    "admin_api": "is_admin:True or (role:admin and is_admin_project:True)",
-    "volume:attachment_create": "",
-    "volume:attachment_update": "rule:admin_or_owner",
-    "volume:attachment_delete": "rule:admin_or_owner",
-    "message:get_all": "rule:admin_or_owner",
-    "message:get": "rule:admin_or_owner",
-    "message:delete": "rule:admin_or_owner",
-    "clusters:get_all": "rule:admin_api",
-    "clusters:get": "rule:admin_api",
-    "clusters:update": "rule:admin_api",
-    "workers:cleanup": "rule:admin_api",
+    "admin_or_owner":  "is_admin:True or project_id:%(project_id)s",
+    "default": "rule:admin_or_owner",
+
+    "admin_api": "is_admin:True",
+
+    "volume:create": "",
+    "volume:delete": "rule:admin_or_owner",
+    "volume:get": "rule:admin_or_owner",
+    "volume:get_all": "rule:admin_or_owner",
+    "volume:get_volume_metadata": "rule:admin_or_owner",
+    "volume:delete_volume_metadata": "rule:admin_or_owner",
+    "volume:update_volume_metadata": "rule:admin_or_owner",
+    "volume:get_volume_admin_metadata": "rule:admin_api",
+    "volume:update_volume_admin_metadata": "rule:admin_api",
+    "volume:get_snapshot": "rule:admin_or_owner",
+    "volume:get_all_snapshots": "rule:admin_or_owner",
+    "volume:create_snapshot": "rule:admin_or_owner",
+    "volume:delete_snapshot": "rule:admin_or_owner",
+    "volume:update_snapshot": "rule:admin_or_owner",
     "volume:get_snapshot_metadata": "rule:admin_or_owner",
+    "volume:delete_snapshot_metadata": "rule:admin_or_owner",
     "volume:update_snapshot_metadata": "rule:admin_or_owner",
-    "volume:delete_snapshot_metadata": "rule:admin_or_owner",
-    "volume:get_all_snapshots": "rule:admin_or_owner",
-    "volume_extension:extended_snapshot_attributes": "rule:admin_or_owner",
-    "volume:create_snapshot": "rule:admin_or_owner",
-    "volume:get_snapshot": "rule:admin_or_owner",
-    "volume:update_snapshot": "rule:admin_or_owner",
-    "volume:delete_snapshot": "rule:admin_or_owner",
-    "volume_extension:snapshot_admin_actions:reset_status": "rule:admin_api",
-    "snapshot_extension:snapshot_actions:update_snapshot_status": "",
-    "volume_extension:snapshot_admin_actions:force_delete": "rule:admin_api",
-    "snapshot_extension:list_manageable": "rule:admin_api",
-    "snapshot_extension:snapshot_manage": "rule:admin_api",
-    "snapshot_extension:snapshot_unmanage": "rule:admin_api",
-    "backup:get_all": "rule:admin_or_owner",
-    "backup:backup_project_attribute": "rule:admin_api",
-    "backup:create": "",
-    "backup:get": "rule:admin_or_owner",
-    "backup:update": "rule:admin_or_owner",
-    "backup:delete": "rule:admin_or_owner",
-    "backup:restore": "rule:admin_or_owner",
-    "backup:backup-import": "rule:admin_api",
-    "backup:export-import": "rule:admin_api",
-    "volume_extension:backup_admin_actions:reset_status": "rule:admin_api",
-    "volume_extension:backup_admin_actions:force_delete": "rule:admin_api",
-    "group:get_all": "rule:admin_or_owner",
-    "group:create": "",
-    "group:get": "rule:admin_or_owner",
-    "group:update": "rule:admin_or_owner",
-    "group:group_types_manage": "rule:admin_api",
-    "group:access_group_types_specs": "rule:admin_api",
-    "group:group_types_specs": "rule:admin_api",
-    "group:get_all_group_snapshots": "rule:admin_or_owner",
-    "group:create_group_snapshot": "",
-    "group:get_group_snapshot": "rule:admin_or_owner",
-    "group:delete_group_snapshot": "rule:admin_or_owner",
-    "group:update_group_snapshot": "rule:admin_or_owner",
-    "group:reset_group_snapshot_status": "rule:admin_or_owner",
-    "group:delete": "rule:admin_or_owner",
-    "group:reset_status": "rule:admin_api",
-    "group:enable_replication": "rule:admin_or_owner",
-    "group:disable_replication": "rule:admin_or_owner",
-    "group:failover_replication": "rule:admin_or_owner",
-    "group:list_replication_targets": "rule:admin_or_owner",
-    "volume_extension:qos_specs_manage:get_all": "rule:admin_api",
-    "volume_extension:qos_specs_manage:get": "rule:admin_api",
-    "volume_extension:qos_specs_manage:create": "rule:admin_api",
-    "volume_extension:qos_specs_manage:update": "rule:admin_api",
-    "volume_extension:qos_specs_manage:delete": "rule:admin_api",
-    "volume_extension:quota_classes": "rule:admin_api",
-    "volume_extension:quotas:show": "rule:admin_or_owner",
-    "volume_extension:quotas:update": "rule:admin_api",
-    "volume_extension:quotas:delete": "rule:admin_api",
-    "volume_extension:quota_classes:validate_setup_for_nested_quota_use": "rule:admin_api",
-    "volume_extension:capabilities": "rule:admin_api",
-    "volume_extension:services:index": "rule:admin_api",
-    "volume_extension:services:update": "rule:admin_api",
-    "volume:freeze_host": "rule:admin_api",
-    "volume:thaw_host": "rule:admin_api",
-    "volume:failover_host": "rule:admin_api",
-    "scheduler_extension:scheduler_stats:get_pools": "rule:admin_api",
-    "volume_extension:hosts": "rule:admin_api",
-    "limits_extension:used_limits": "rule:admin_or_owner",
-    "volume_extension:list_manageable": "rule:admin_api",
-    "volume_extension:volume_manage": "rule:admin_api",
-    "volume_extension:volume_unmanage": "rule:admin_api",
+    "volume:extend": "rule:admin_or_owner",
+    "volume:update_readonly_flag": "rule:admin_or_owner",
+    "volume:retype": "rule:admin_or_owner",
+    "volume:update": "rule:admin_or_owner",
+
     "volume_extension:types_manage": "rule:admin_api",
-    "volume_extension:volume_type_encryption": "rule:admin_api",
+    "volume_extension:types_extra_specs": "rule:admin_api",
+    "volume_extension:access_types_qos_specs_id": "rule:admin_api",
     "volume_extension:access_types_extra_specs": "rule:admin_api",
-    "volume_extension:access_types_qos_specs_id": "rule:admin_api",
     "volume_extension:volume_type_access": "rule:admin_or_owner",
     "volume_extension:volume_type_access:addProjectAccess": "rule:admin_api",
     "volume_extension:volume_type_access:removeProjectAccess": "rule:admin_api",
-    "volume:extend": "rule:admin_or_owner",
-    "volume:extend_attached_volume": "rule:admin_or_owner",
-    "volume:revert_to_snapshot": "rule:admin_or_owner",
+    "volume_extension:volume_type_encryption": "rule:admin_api",
+    "volume_extension:volume_encryption_metadata": "rule:admin_or_owner",
+    "volume_extension:extended_snapshot_attributes": "rule:admin_or_owner",
+    "volume_extension:volume_image_metadata": "rule:admin_or_owner",
+
+    "volume_extension:quotas:show": "",
+    "volume_extension:quotas:update": "rule:admin_api",
+    "volume_extension:quotas:delete": "rule:admin_api",
+    "volume_extension:quota_classes": "rule:admin_api",
+    "volume_extension:quota_classes:validate_setup_for_nested_quota_use": "rule:admin_api",
+
     "volume_extension:volume_admin_actions:reset_status": "rule:admin_api",
-    "volume:retype": "rule:admin_or_owner",
-    "volume:update_readonly_flag": "rule:admin_or_owner",
+    "volume_extension:snapshot_admin_actions:reset_status": "rule:admin_api",
+    "volume_extension:backup_admin_actions:reset_status": "rule:admin_api",
     "volume_extension:volume_admin_actions:force_delete": "rule:admin_api",
+    "volume_extension:volume_admin_actions:force_detach": "rule:admin_api",
+    "volume_extension:snapshot_admin_actions:force_delete": "rule:admin_api",
+    "volume_extension:backup_admin_actions:force_delete": "rule:admin_api",
+    "volume_extension:volume_admin_actions:migrate_volume": "rule:admin_api",
+    "volume_extension:volume_admin_actions:migrate_volume_completion": "rule:admin_api",
+
     "volume_extension:volume_actions:upload_public": "rule:admin_api",
     "volume_extension:volume_actions:upload_image": "rule:admin_or_owner",
-    "volume_extension:volume_admin_actions:force_detach": "rule:admin_api",
-    "volume_extension:volume_admin_actions:migrate_volume": "rule:admin_api",
-    "volume_extension:volume_admin_actions:migrate_volume_completion": "rule:admin_api",
-    "volume_extension:volume_actions:initialize_connection": "rule:admin_or_owner",
-    "volume_extension:volume_actions:terminate_connection": "rule:admin_or_owner",
-    "volume_extension:volume_actions:roll_detaching": "rule:admin_or_owner",
-    "volume_extension:volume_actions:reserve": "rule:admin_or_owner",
-    "volume_extension:volume_actions:unreserve": "rule:admin_or_owner",
-    "volume_extension:volume_actions:begin_detaching": "rule:admin_or_owner",
-    "volume_extension:volume_actions:attach": "rule:admin_or_owner",
-    "volume_extension:volume_actions:detach": "rule:admin_or_owner",
-    "volume:get_all_transfers": "rule:admin_or_owner",
-    "volume:create_transfer": "rule:admin_or_owner",
-    "volume:get_transfer": "rule:admin_or_owner",
-    "volume:accept_transfer": "",
-    "volume:delete_transfer": "rule:admin_or_owner",
-    "volume:get_volume_metadata": "rule:admin_or_owner",
-    "volume:create_volume_metadata": "rule:admin_or_owner",
-    "volume:update_volume_metadata": "rule:admin_or_owner",
-    "volume:delete_volume_metadata": "rule:admin_or_owner",
-    "volume_extension:volume_image_metadata": "rule:admin_or_owner",
-    "volume:update_volume_admin_metadata": "rule:admin_api",
-    "volume_extension:types_extra_specs:index": "rule:admin_api",
-    "volume_extension:types_extra_specs:create": "rule:admin_api",
-    "volume_extension:types_extra_specs:show": "rule:admin_api",
-    "volume_extension:types_extra_specs:update": "rule:admin_api",
-    "volume_extension:types_extra_specs:delete": "rule:admin_api",
-    "volume:create": "",
-    "volume:create_from_image": "",
-    "volume:get": "rule:admin_or_owner",
-    "volume:get_all": "rule:admin_or_owner",
-    "volume:update": "rule:admin_or_owner",
-    "volume:delete": "rule:admin_or_owner",
-    "volume:force_delete": "rule:admin_api",
+
     "volume_extension:volume_host_attribute": "rule:admin_api",
     "volume_extension:volume_tenant_attribute": "rule:admin_or_owner",
     "volume_extension:volume_mig_status_attribute": "rule:admin_api",
-    "volume_extension:volume_encryption_metadata": "rule:admin_or_owner"
+    "volume_extension:hosts": "rule:admin_api",
+    "volume_extension:services:index": "rule:admin_api",
+    "volume_extension:services:update" : "rule:admin_api",
+
+    "volume_extension:volume_manage": "rule:admin_api",
+    "volume_extension:volume_unmanage": "rule:admin_api",
+
+    "volume_extension:capabilities": "rule:admin_api",
+
+    "volume:create_transfer": "rule:admin_or_owner",
+    "volume:accept_transfer": "",
+    "volume:delete_transfer": "rule:admin_or_owner",
+    "volume:get_transfer": "rule:admin_or_owner",
+    "volume:get_all_transfers": "rule:admin_or_owner",
+
+    "volume_extension:replication:promote": "rule:admin_api",
+    "volume_extension:replication:reenable": "rule:admin_api",
+
+    "volume:failover_host": "rule:admin_api",
+    "volume:freeze_host": "rule:admin_api",
+    "volume:thaw_host": "rule:admin_api",
+
+    "backup:create" : "",
+    "backup:delete": "rule:admin_or_owner",
+    "backup:get": "rule:admin_or_owner",
+    "backup:get_all": "rule:admin_or_owner",
+    "backup:restore": "rule:admin_or_owner",
+    "backup:backup-import": "rule:admin_api",
+    "backup:backup-export": "rule:admin_api",
+
+    "snapshot_extension:snapshot_actions:update_snapshot_status": "",
+    "snapshot_extension:snapshot_manage": "rule:admin_api",
+    "snapshot_extension:snapshot_unmanage": "rule:admin_api",
+
+    "consistencygroup:create" : "group:nobody",
+    "consistencygroup:delete": "group:nobody",
+    "consistencygroup:update": "group:nobody",
+    "consistencygroup:get": "group:nobody",
+    "consistencygroup:get_all": "group:nobody",
+
+    "consistencygroup:create_cgsnapshot" : "group:nobody",
+    "consistencygroup:delete_cgsnapshot": "group:nobody",
+    "consistencygroup:get_cgsnapshot": "group:nobody",
+    "consistencygroup:get_all_cgsnapshots": "group:nobody",
+
+    "scheduler_extension:scheduler_stats:get_pools" : "rule:admin_api",
+    "message:delete": "rule:admin_or_owner",
+    "message:get": "rule:admin_or_owner",
+    "message:get_all": "rule:admin_or_owner"
 }

2018-09-01 23:29:25,531 [salt.state       :1941][INFO    ][7294] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/cinder_policy.json] at time 23:29:25.531187 duration_in_ms=23.472
2018-09-01 23:29:25,531 [salt.state       :1770][INFO    ][7294] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/heat_policy.json] at time 23:29:25.531894
2018-09-01 23:29:25,532 [salt.state       :1803][INFO    ][7294] Executing state file.managed for [/usr/share/openstack-dashboard/openstack_dashboard/conf/heat_policy.json]
2018-09-01 23:29:25,553 [salt.fileclient  :1215][INFO    ][7294] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/queens/heat_policy.json'
2018-09-01 23:29:25,554 [salt.state       :290 ][INFO    ][7294] File changed:
New file
2018-09-01 23:29:25,554 [salt.state       :1941][INFO    ][7294] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/heat_policy.json] at time 23:29:25.554894 duration_in_ms=22.999
2018-09-01 23:29:25,555 [salt.state       :1770][INFO    ][7294] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/keystone_policy.json] at time 23:29:25.555349
2018-09-01 23:29:25,555 [salt.state       :1803][INFO    ][7294] Executing state file.managed for [/usr/share/openstack-dashboard/openstack_dashboard/conf/keystone_policy.json]
2018-09-01 23:29:25,574 [salt.fileclient  :1215][INFO    ][7294] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/queens/keystone_policy.json'
2018-09-01 23:29:25,576 [salt.state       :290 ][INFO    ][7294] File changed:
--- 
+++ 
@@ -2,50 +2,137 @@
     "admin_required": "role:admin or is_admin:1",
     "service_role": "role:service",
     "service_or_admin": "rule:admin_required or rule:service_role",
-    "owner": "user_id:%(user_id)s",
+    "owner" : "user_id:%(user_id)s",
     "admin_or_owner": "rule:admin_required or rule:owner",
     "token_subject": "user_id:%(target.token.user_id)s",
     "admin_or_token_subject": "rule:admin_required or rule:token_subject",
     "service_admin_or_token_subject": "rule:service_or_admin or rule:token_subject",
-    "identity:authorize_request_token": "rule:admin_required",
-    "identity:get_access_token": "rule:admin_required",
-    "identity:get_access_token_role": "rule:admin_required",
-    "identity:list_access_tokens": "rule:admin_required",
-    "identity:list_access_token_roles": "rule:admin_required",
-    "identity:delete_access_token": "rule:admin_required",
-    "identity:get_auth_catalog": "",
-    "identity:get_auth_projects": "",
-    "identity:get_auth_domains": "",
-    "identity:get_consumer": "rule:admin_required",
-    "identity:list_consumers": "rule:admin_required",
-    "identity:create_consumer": "rule:admin_required",
-    "identity:update_consumer": "rule:admin_required",
-    "identity:delete_consumer": "rule:admin_required",
+
+    "default": "rule:admin_required",
+
+    "identity:get_region": "",
+    "identity:list_regions": "",
+    "identity:create_region": "rule:admin_required",
+    "identity:update_region": "rule:admin_required",
+    "identity:delete_region": "rule:admin_required",
+
+    "identity:get_service": "rule:admin_required",
+    "identity:list_services": "rule:admin_required",
+    "identity:create_service": "rule:admin_required",
+    "identity:update_service": "rule:admin_required",
+    "identity:delete_service": "rule:admin_required",
+
+    "identity:get_endpoint": "rule:admin_required",
+    "identity:list_endpoints": "rule:admin_required",
+    "identity:create_endpoint": "rule:admin_required",
+    "identity:update_endpoint": "rule:admin_required",
+    "identity:delete_endpoint": "rule:admin_required",
+
+    "identity:get_domain": "rule:admin_required",
+    "identity:list_domains": "rule:admin_required",
+    "identity:create_domain": "rule:admin_required",
+    "identity:update_domain": "rule:admin_required",
+    "identity:delete_domain": "rule:admin_required",
+
+    "identity:get_project": "rule:admin_required or project_id:%(target.project.id)s",
+    "identity:list_projects": "rule:admin_required",
+    "identity:list_user_projects": "rule:admin_or_owner",
+    "identity:create_project": "rule:admin_required",
+    "identity:update_project": "rule:admin_required",
+    "identity:delete_project": "rule:admin_required",
+
+    "identity:get_user": "rule:admin_required",
+    "identity:list_users": "rule:admin_required",
+    "identity:create_user": "rule:admin_required",
+    "identity:update_user": "rule:admin_required",
+    "identity:delete_user": "rule:admin_required",
+    "identity:change_password": "rule:admin_or_owner",
+
+    "identity:get_group": "rule:admin_required",
+    "identity:list_groups": "rule:admin_required",
+    "identity:list_groups_for_user": "rule:admin_or_owner",
+    "identity:create_group": "rule:admin_required",
+    "identity:update_group": "rule:admin_required",
+    "identity:delete_group": "rule:admin_required",
+    "identity:list_users_in_group": "rule:admin_required",
+    "identity:remove_user_from_group": "rule:admin_required",
+    "identity:check_user_in_group": "rule:admin_required",
+    "identity:add_user_to_group": "rule:admin_required",
+
     "identity:get_credential": "rule:admin_required",
     "identity:list_credentials": "rule:admin_required",
     "identity:create_credential": "rule:admin_required",
     "identity:update_credential": "rule:admin_required",
     "identity:delete_credential": "rule:admin_required",
-    "identity:get_domain": "rule:admin_required or token.project.domain.id:%(target.domain.id)s",
-    "identity:list_domains": "rule:admin_required",
-    "identity:create_domain": "rule:admin_required",
-    "identity:update_domain": "rule:admin_required",
-    "identity:delete_domain": "rule:admin_required",
-    "identity:create_domain_config": "rule:admin_required",
-    "identity:get_domain_config": "rule:admin_required",
-    "identity:get_security_compliance_domain_config": "",
-    "identity:update_domain_config": "rule:admin_required",
-    "identity:delete_domain_config": "rule:admin_required",
-    "identity:get_domain_config_default": "rule:admin_required",
+
     "identity:ec2_get_credential": "rule:admin_required or (rule:owner and user_id:%(target.credential.user_id)s)",
     "identity:ec2_list_credentials": "rule:admin_or_owner",
     "identity:ec2_create_credential": "rule:admin_or_owner",
     "identity:ec2_delete_credential": "rule:admin_required or (rule:owner and user_id:%(target.credential.user_id)s)",
-    "identity:get_endpoint": "rule:admin_required",
-    "identity:list_endpoints": "rule:admin_required",
-    "identity:create_endpoint": "rule:admin_required",
-    "identity:update_endpoint": "rule:admin_required",
-    "identity:delete_endpoint": "rule:admin_required",
+
+    "identity:get_role": "rule:admin_required",
+    "identity:list_roles": "rule:admin_required",
+    "identity:create_role": "rule:admin_required",
+    "identity:update_role": "rule:admin_required",
+    "identity:delete_role": "rule:admin_required",
+    "identity:get_domain_role": "rule:admin_required",
+    "identity:list_domain_roles": "rule:admin_required",
+    "identity:create_domain_role": "rule:admin_required",
+    "identity:update_domain_role": "rule:admin_required",
+    "identity:delete_domain_role": "rule:admin_required",
+
+    "identity:get_implied_role": "rule:admin_required ",
+    "identity:list_implied_roles": "rule:admin_required",
+    "identity:create_implied_role": "rule:admin_required",
+    "identity:delete_implied_role": "rule:admin_required",
+    "identity:list_role_inference_rules": "rule:admin_required",
+    "identity:check_implied_role": "rule:admin_required",
+
+    "identity:check_grant": "rule:admin_required",
+    "identity:list_grants": "rule:admin_required",
+    "identity:create_grant": "rule:admin_required",
+    "identity:revoke_grant": "rule:admin_required",
+
+    "identity:list_role_assignments": "rule:admin_required",
+    "identity:list_role_assignments_for_tree": "rule:admin_required",
+
+    "identity:get_policy": "rule:admin_required",
+    "identity:list_policies": "rule:admin_required",
+    "identity:create_policy": "rule:admin_required",
+    "identity:update_policy": "rule:admin_required",
+    "identity:delete_policy": "rule:admin_required",
+
+    "identity:check_token": "rule:admin_or_token_subject",
+    "identity:validate_token": "rule:service_admin_or_token_subject",
+    "identity:validate_token_head": "rule:service_or_admin",
+    "identity:revocation_list": "rule:service_or_admin",
+    "identity:revoke_token": "rule:admin_or_token_subject",
+
+    "identity:create_trust": "user_id:%(trust.trustor_user_id)s",
+    "identity:list_trusts": "",
+    "identity:list_roles_for_trust": "",
+    "identity:get_role_for_trust": "",
+    "identity:delete_trust": "",
+
+    "identity:create_consumer": "rule:admin_required",
+    "identity:get_consumer": "rule:admin_required",
+    "identity:list_consumers": "rule:admin_required",
+    "identity:delete_consumer": "rule:admin_required",
+    "identity:update_consumer": "rule:admin_required",
+
+    "identity:authorize_request_token": "rule:admin_required",
+    "identity:list_access_token_roles": "rule:admin_required",
+    "identity:get_access_token_role": "rule:admin_required",
+    "identity:list_access_tokens": "rule:admin_required",
+    "identity:get_access_token": "rule:admin_required",
+    "identity:delete_access_token": "rule:admin_required",
+
+    "identity:list_projects_for_endpoint": "rule:admin_required",
+    "identity:add_endpoint_to_project": "rule:admin_required",
+    "identity:check_endpoint_in_project": "rule:admin_required",
+    "identity:list_endpoints_for_project": "rule:admin_required",
+    "identity:remove_endpoint_from_project": "rule:admin_required",
+
     "identity:create_endpoint_group": "rule:admin_required",
     "identity:list_endpoint_groups": "rule:admin_required",
     "identity:get_endpoint_group": "rule:admin_required",
@@ -57,41 +144,40 @@
     "identity:list_endpoint_groups_for_project": "rule:admin_required",
     "identity:add_endpoint_group_to_project": "rule:admin_required",
     "identity:remove_endpoint_group_from_project": "rule:admin_required",
-    "identity:check_grant": "rule:admin_required",
-    "identity:list_grants": "rule:admin_required",
-    "identity:create_grant": "rule:admin_required",
-    "identity:revoke_grant": "rule:admin_required",
-    "identity:get_group": "rule:admin_required",
-    "identity:list_groups": "rule:admin_required",
-    "identity:list_groups_for_user": "rule:admin_or_owner",
-    "identity:create_group": "rule:admin_required",
-    "identity:update_group": "rule:admin_required",
-    "identity:delete_group": "rule:admin_required",
-    "identity:list_users_in_group": "rule:admin_required",
-    "identity:remove_user_from_group": "rule:admin_required",
-    "identity:check_user_in_group": "rule:admin_required",
-    "identity:add_user_to_group": "rule:admin_required",
+
     "identity:create_identity_provider": "rule:admin_required",
     "identity:list_identity_providers": "rule:admin_required",
-    "identity:get_identity_provider": "rule:admin_required",
+    "identity:get_identity_providers": "rule:admin_required",
     "identity:update_identity_provider": "rule:admin_required",
     "identity:delete_identity_provider": "rule:admin_required",
-    "identity:get_implied_role": "rule:admin_required",
-    "identity:list_implied_roles": "rule:admin_required",
-    "identity:create_implied_role": "rule:admin_required",
-    "identity:delete_implied_role": "rule:admin_required",
-    "identity:list_role_inference_rules": "rule:admin_required",
-    "identity:check_implied_role": "rule:admin_required",
+
+    "identity:create_protocol": "rule:admin_required",
+    "identity:update_protocol": "rule:admin_required",
+    "identity:get_protocol": "rule:admin_required",
+    "identity:list_protocols": "rule:admin_required",
+    "identity:delete_protocol": "rule:admin_required",
+
     "identity:create_mapping": "rule:admin_required",
     "identity:get_mapping": "rule:admin_required",
     "identity:list_mappings": "rule:admin_required",
     "identity:delete_mapping": "rule:admin_required",
     "identity:update_mapping": "rule:admin_required",
-    "identity:get_policy": "rule:admin_required",
-    "identity:list_policies": "rule:admin_required",
-    "identity:create_policy": "rule:admin_required",
-    "identity:update_policy": "rule:admin_required",
-    "identity:delete_policy": "rule:admin_required",
+
+    "identity:create_service_provider": "rule:admin_required",
+    "identity:list_service_providers": "rule:admin_required",
+    "identity:get_service_provider": "rule:admin_required",
+    "identity:update_service_provider": "rule:admin_required",
+    "identity:delete_service_provider": "rule:admin_required",
+
+    "identity:get_auth_catalog": "",
+    "identity:get_auth_projects": "",
+    "identity:get_auth_domains": "",
+
+    "identity:list_projects_for_groups": "",
+    "identity:list_domains_for_groups": "",
+
+    "identity:list_revoke_events": "",
+
     "identity:create_policy_association_for_endpoint": "rule:admin_required",
     "identity:check_policy_association_for_endpoint": "rule:admin_required",
     "identity:delete_policy_association_for_endpoint": "rule:admin_required",
@@ -103,72 +189,10 @@
     "identity:delete_policy_association_for_region_and_service": "rule:admin_required",
     "identity:get_policy_for_endpoint": "rule:admin_required",
     "identity:list_endpoints_for_policy": "rule:admin_required",
-    "identity:get_project": "rule:admin_required or project_id:%(target.project.id)s",
-    "identity:list_projects": "rule:admin_required",
-    "identity:list_user_projects": "rule:admin_or_owner",
-    "identity:create_project": "rule:admin_required",
-    "identity:update_project": "rule:admin_required",
-    "identity:delete_project": "rule:admin_required",
-    "identity:list_project_tags": "rule:admin_required or project_id:%(target.project.id)s",
-    "identity:get_project_tag": "rule:admin_required or project_id:%(target.project.id)s",
-    "identity:update_project_tags": "rule:admin_required",
-    "identity:create_project_tag": "rule:admin_required",
-    "identity:delete_project_tags": "rule:admin_required",
-    "identity:delete_project_tag": "rule:admin_required",
-    "identity:list_projects_for_endpoint": "rule:admin_required",
-    "identity:add_endpoint_to_project": "rule:admin_required",
-    "identity:check_endpoint_in_project": "rule:admin_required",
-    "identity:list_endpoints_for_project": "rule:admin_required",
-    "identity:remove_endpoint_from_project": "rule:admin_required",
-    "identity:create_protocol": "rule:admin_required",
-    "identity:update_protocol": "rule:admin_required",
-    "identity:get_protocol": "rule:admin_required",
-    "identity:list_protocols": "rule:admin_required",
-    "identity:delete_protocol": "rule:admin_required",
-    "identity:get_region": "",
-    "identity:list_regions": "",
-    "identity:create_region": "rule:admin_required",
-    "identity:update_region": "rule:admin_required",
-    "identity:delete_region": "rule:admin_required",
-    "identity:list_revoke_events": "rule:service_or_admin",
-    "identity:get_role": "rule:admin_required",
-    "identity:list_roles": "rule:admin_required",
-    "identity:create_role": "rule:admin_required",
-    "identity:update_role": "rule:admin_required",
-    "identity:delete_role": "rule:admin_required",
-    "identity:get_domain_role": "rule:admin_required",
-    "identity:list_domain_roles": "rule:admin_required",
-    "identity:create_domain_role": "rule:admin_required",
-    "identity:update_domain_role": "rule:admin_required",
-    "identity:delete_domain_role": "rule:admin_required",
-    "identity:list_role_assignments": "rule:admin_required",
-    "identity:list_role_assignments_for_tree": "rule:admin_required",
-    "identity:get_service": "rule:admin_required",
-    "identity:list_services": "rule:admin_required",
-    "identity:create_service": "rule:admin_required",
-    "identity:update_service": "rule:admin_required",
-    "identity:delete_service": "rule:admin_required",
-    "identity:create_service_provider": "rule:admin_required",
-    "identity:list_service_providers": "rule:admin_required",
-    "identity:get_service_provider": "rule:admin_required",
-    "identity:update_service_provider": "rule:admin_required",
-    "identity:delete_service_provider": "rule:admin_required",
-    "identity:revocation_list": "rule:service_or_admin",
-    "identity:check_token": "rule:admin_or_token_subject",
-    "identity:validate_token": "rule:service_admin_or_token_subject",
-    "identity:validate_token_head": "rule:service_or_admin",
-    "identity:revoke_token": "rule:admin_or_token_subject",
-    "identity:create_trust": "user_id:%(trust.trustor_user_id)s",
-    "identity:list_trusts": "",
-    "identity:list_roles_for_trust": "",
-    "identity:get_role_for_trust": "",
-    "identity:delete_trust": "",
-    "identity:get_trust": "",
-    "identity:get_user": "rule:admin_or_owner",
-    "identity:list_users": "rule:admin_required",
-    "identity:list_projects_for_user": "",
-    "identity:list_domains_for_user": "",
-    "identity:create_user": "rule:admin_required",
-    "identity:update_user": "rule:admin_required",
-    "identity:delete_user": "rule:admin_required"
+
+    "identity:create_domain_config": "rule:admin_required",
+    "identity:get_domain_config": "rule:admin_required",
+    "identity:update_domain_config": "rule:admin_required",
+    "identity:delete_domain_config": "rule:admin_required",
+    "identity:get_domain_config_default": "rule:admin_required"
 }
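
The diff above ends with Salt rewriting Horizon's `keystone_policy.json`; in oslo.policy-style files an empty check string (e.g. `"identity:get_auth_catalog": ""`) means the action is unrestricted. As an annotation on that diff, a minimal sketch (hypothetical helper name, in-memory sample mirroring a few rules from the diff) that loads such a file and reports the unrestricted rules:

```python
import json
import tempfile

def unrestricted_rules(path):
    """Load an oslo.policy-style JSON file and return the sorted rule
    names whose check string is empty, i.e. not gated by any rule."""
    with open(path) as fh:
        policy = json.load(fh)
    return sorted(name for name, check in policy.items() if check == "")

# Sample fragment mirroring rules from the diff above.
sample = {
    "identity:get_auth_catalog": "",
    "identity:list_revoke_events": "",
    "identity:create_mapping": "rule:admin_required",
}
tmp = tempfile.NamedTemporaryFile("w", suffix=".json", delete=False)
json.dump(sample, tmp)
tmp.close()
print(unrestricted_rules(tmp.name))
```

Running this against the full deployed file would flag every policy left open to any authenticated caller.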

2018-09-01 23:29:25,577 [salt.state       :1941][INFO    ][7294] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/keystone_policy.json] at time 23:29:25.577287 duration_in_ms=21.938
2018-09-01 23:29:25,577 [salt.state       :1770][INFO    ][7294] Running state [/etc/apache2/conf-available/openstack-dashboard.conf] at time 23:29:25.577855
2018-09-01 23:29:25,578 [salt.state       :1803][INFO    ][7294] Executing state file.managed for [/etc/apache2/conf-available/openstack-dashboard.conf]
2018-09-01 23:29:25,596 [salt.fileclient  :1215][INFO    ][7294] Fetching file from saltenv 'base', ** done ** 'horizon/files/openstack-dashboard.conf.Debian'
2018-09-01 23:29:25,628 [salt.state       :290 ][INFO    ][7294] File changed:
New file
2018-09-01 23:29:25,628 [salt.state       :1941][INFO    ][7294] Completed state [/etc/apache2/conf-available/openstack-dashboard.conf] at time 23:29:25.628808 duration_in_ms=50.953
2018-09-01 23:29:25,639 [salt.state       :1770][INFO    ][7294] Running state [wsgi] at time 23:29:25.638982
2018-09-01 23:29:25,639 [salt.state       :1803][INFO    ][7294] Executing state apache_module.enabled for [wsgi]
2018-09-01 23:29:25,642 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7294] Executing command ['a2enmod', 'wsgi'] in directory '/root'
2018-09-01 23:29:25,690 [salt.state       :290 ][INFO    ][7294] {'new': 'wsgi', 'old': None}
2018-09-01 23:29:25,691 [salt.state       :1941][INFO    ][7294] Completed state [wsgi] at time 23:29:25.691281 duration_in_ms=52.298
2018-09-01 23:29:25,696 [salt.state       :1770][INFO    ][7294] Running state [openstack-dashboard] at time 23:29:25.696054
2018-09-01 23:29:25,696 [salt.state       :1803][INFO    ][7294] Executing state apache_conf.enabled for [openstack-dashboard]
2018-09-01 23:29:25,697 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7294] Executing command ['a2enconf', 'openstack-dashboard'] in directory '/root'
2018-09-01 23:29:25,739 [salt.state       :290 ][INFO    ][7294] {'new': 'openstack-dashboard', 'old': None}
2018-09-01 23:29:25,739 [salt.state       :1941][INFO    ][7294] Completed state [openstack-dashboard] at time 23:29:25.739786 duration_in_ms=43.731
2018-09-01 23:29:26,169 [salt.state       :1770][INFO    ][7294] Running state [/var/log/horizon] at time 23:29:26.168945
2018-09-01 23:29:26,169 [salt.state       :1803][INFO    ][7294] Executing state file.directory for [/var/log/horizon]
2018-09-01 23:29:26,174 [salt.state       :290 ][INFO    ][7294] {'/var/log/horizon': 'New Dir'}
2018-09-01 23:29:26,175 [salt.state       :1941][INFO    ][7294] Completed state [/var/log/horizon] at time 23:29:26.175033 duration_in_ms=6.087
2018-09-01 23:29:26,176 [salt.state       :1770][INFO    ][7294] Running state [/var/log/horizon/horizon.log] at time 23:29:26.176460
2018-09-01 23:29:26,177 [salt.state       :1803][INFO    ][7294] Executing state file.managed for [/var/log/horizon/horizon.log]
2018-09-01 23:29:26,178 [salt.loaded.int.states.file:2150][WARNING ][7294] State for file: /var/log/horizon/horizon.log - Neither 'source' nor 'contents' nor 'contents_pillar' nor 'contents_grains' was defined, yet 'replace' was set to 'True'. As there is no source to replace the file with, 'replace' has been set to 'False' to avoid reading the file unnecessarily.
2018-09-01 23:29:26,182 [salt.state       :290 ][INFO    ][7294] {'new': 'file /var/log/horizon/horizon.log created', 'group': 'adm', 'mode': '0640', 'user': 'horizon'}
2018-09-01 23:29:26,182 [salt.state       :1941][INFO    ][7294] Completed state [/var/log/horizon/horizon.log] at time 23:29:26.182697 duration_in_ms=6.237
2018-09-01 23:29:26,185 [salt.state       :1770][INFO    ][7294] Running state [apache2] at time 23:29:26.184932
2018-09-01 23:29:26,185 [salt.state       :1803][INFO    ][7294] Executing state service.running for [apache2]
2018-09-01 23:29:26,187 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7294] Executing command ['systemctl', 'is-active', 'apache2.service'] in directory '/root'
2018-09-01 23:29:26,203 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7294] Executing command ['systemctl', 'is-enabled', 'apache2.service'] in directory '/root'
2018-09-01 23:29:26,220 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7294] Executing command ['systemd-run', '--scope', 'systemctl', 'start', 'apache2.service'] in directory '/root'
2018-09-01 23:29:27,444 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7294] Executing command ['systemctl', 'is-active', 'apache2.service'] in directory '/root'
2018-09-01 23:29:27,468 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7294] Executing command ['systemctl', 'is-enabled', 'apache2.service'] in directory '/root'
2018-09-01 23:29:27,494 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7294] Executing command ['systemctl', 'is-enabled', 'apache2.service'] in directory '/root'
2018-09-01 23:29:27,516 [salt.state       :290 ][INFO    ][7294] {'apache2': True}
2018-09-01 23:29:27,517 [salt.state       :1941][INFO    ][7294] Completed state [apache2] at time 23:29:27.517629 duration_in_ms=1332.697
2018-09-01 23:29:27,522 [salt.minion      :1708][INFO    ][7294] Returning information for job: 20180901232603765218
2018-09-01 23:29:30,808 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command state.sls with jid 20180901232930804446
2018-09-01 23:29:30,830 [salt.minion      :1431][INFO    ][12954] Starting a new job with PID 12954
2018-09-01 23:29:35,467 [salt.state       :905 ][INFO    ][12954] Loading fresh modules for state activity
2018-09-01 23:29:35,509 [salt.fileclient  :1215][INFO    ][12954] Fetching file from saltenv 'base', ** done ** 'nginx/init.sls'
2018-09-01 23:29:35,535 [salt.fileclient  :1215][INFO    ][12954] Fetching file from saltenv 'base', ** done ** 'nginx/server.sls'
2018-09-01 23:29:35,597 [salt.fileclient  :1215][INFO    ][12954] Fetching file from saltenv 'base', ** done ** 'nginx/server/users.sls'
2018-09-01 23:29:35,640 [salt.fileclient  :1215][INFO    ][12954] Fetching file from saltenv 'base', ** done ** 'nginx/server/sites.sls'
2018-09-01 23:29:35,695 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12954] Executing command 'cat /etc/ssl/certs/172.30.10.101-with-chain.crt' in directory '/root'
2018-09-01 23:29:35,725 [salt.loaded.int.module.cmdmod:722 ][ERROR   ][12954] Command 'cat /etc/ssl/certs/172.30.10.101-with-chain.crt' failed with return code: 1
2018-09-01 23:29:35,726 [salt.loaded.int.module.cmdmod:724 ][ERROR   ][12954] stdout: cat: /etc/ssl/certs/172.30.10.101-with-chain.crt: No such file or directory
2018-09-01 23:29:35,726 [salt.loaded.int.module.cmdmod:728 ][ERROR   ][12954] retcode: 1
2018-09-01 23:29:35,727 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12954] Executing command 'cat /etc/ssl/certs/172.30.10.101.crt /etc/ssl/certs/ca-salt_master_ca.crt' in directory '/root'
2018-09-01 23:29:35,817 [salt.state       :1770][INFO    ][12954] Running state [cat /etc/ssl/certs/172.30.10.101.crt /etc/ssl/certs/ca-salt_master_ca.crt > /etc/ssl/certs/172.30.10.101-with-chain.crt] at time 23:29:35.817649
2018-09-01 23:29:35,818 [salt.state       :1803][INFO    ][12954] Executing state cmd.run for [cat /etc/ssl/certs/172.30.10.101.crt /etc/ssl/certs/ca-salt_master_ca.crt > /etc/ssl/certs/172.30.10.101-with-chain.crt]
2018-09-01 23:29:35,819 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12954] Executing command '/bin/true' in directory '/root'
2018-09-01 23:29:35,844 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12954] Executing command 'cat /etc/ssl/certs/172.30.10.101.crt /etc/ssl/certs/ca-salt_master_ca.crt > /etc/ssl/certs/172.30.10.101-with-chain.crt' in directory '/root'
2018-09-01 23:29:35,861 [salt.state       :290 ][INFO    ][12954] {'pid': 12977, 'retcode': 0, 'stderr': '', 'stdout': ''}
2018-09-01 23:29:35,862 [salt.state       :1941][INFO    ][12954] Completed state [cat /etc/ssl/certs/172.30.10.101.crt /etc/ssl/certs/ca-salt_master_ca.crt > /etc/ssl/certs/172.30.10.101-with-chain.crt] at time 23:29:35.862106 duration_in_ms=44.46
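
The cmd.run state above first runs `cat .../172.30.10.101-with-chain.crt` as a guard (it fails because the chain file does not exist yet), then concatenates the host certificate and the salt master CA certificate into the chain file. A sketch of the same guard-then-concatenate logic (hypothetical function name, temp-file demo paths):

```python
import os
import pathlib
import tempfile

def ensure_chain(cert_path, ca_path, chain_path):
    """Build <host>-with-chain.crt by concatenating the host cert and the
    CA cert, but only when the chain file is absent -- the failing `cat`
    in the log above plays the role of this existence check."""
    if os.path.exists(chain_path):
        return False  # guard passed: chain already present, nothing to do
    with open(chain_path, "wb") as out:
        for src in (cert_path, ca_path):
            with open(src, "rb") as fh:
                out.write(fh.read())
    return True

# Demo with throwaway files standing in for the real certs.
d = pathlib.Path(tempfile.mkdtemp())
(d / "host.crt").write_text("HOST\n")
(d / "ca.crt").write_text("CA\n")
created = ensure_chain(d / "host.crt", d / "ca.crt", d / "chain.crt")
print(created, (d / "chain.crt").read_text())
```

A second call returns `False` without rewriting the file, matching the idempotence the `unless`-style guard gives the Salt state.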
2018-09-01 23:29:35,913 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232935906239
2018-09-01 23:29:35,935 [salt.minion      :1431][INFO    ][12981] Starting a new job with PID 12981
2018-09-01 23:29:35,951 [salt.minion      :1708][INFO    ][12981] Returning information for job: 20180901232935906239
2018-09-01 23:29:37,149 [salt.state       :1770][INFO    ][12954] Running state [nginx] at time 23:29:37.149104
2018-09-01 23:29:37,149 [salt.state       :1803][INFO    ][12954] Executing state pkg.installed for [nginx]
2018-09-01 23:29:37,150 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12954] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 23:29:37,552 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12954] Executing command ['apt-cache', '-q', 'policy', 'nginx'] in directory '/root'
2018-09-01 23:29:37,677 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12954] Executing command ['apt-get', '-q', 'update'] in directory '/root'
2018-09-01 23:29:39,442 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12954] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2018-09-01 23:29:39,471 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12954] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'nginx'] in directory '/root'
2018-09-01 23:29:46,107 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232946100957
2018-09-01 23:29:46,127 [salt.minion      :1431][INFO    ][13716] Starting a new job with PID 13716
2018-09-01 23:29:46,144 [salt.minion      :1708][INFO    ][13716] Returning information for job: 20180901232946100957
2018-09-01 23:29:52,386 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12954] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2018-09-01 23:29:52,430 [salt.state       :290 ][INFO    ][12954] Made the following changes:
'libgd3' changed from 'absent' to '2.1.1-4ubuntu0.16.04.10'
'nginx-core' changed from 'absent' to '1.10.3-0ubuntu0.16.04.2'
'libxpm4' changed from 'absent' to '1:3.5.11-1ubuntu0.16.04.1'
'nginx' changed from 'absent' to '1.10.3-0ubuntu0.16.04.2'
'nginx-common' changed from 'absent' to '1.10.3-0ubuntu0.16.04.2'
'libfontconfig' changed from 'absent' to '1'
'fonts-dejavu-core' changed from 'absent' to '2.35-1'
'fontconfig-config' changed from 'absent' to '2.11.94-0ubuntu1.1'
'libvpx3' changed from 'absent' to '1.5.0-2ubuntu1'
'libfontconfig1' changed from 'absent' to '2.11.94-0ubuntu1.1'

2018-09-01 23:29:52,456 [salt.state       :905 ][INFO    ][12954] Loading fresh modules for state activity
2018-09-01 23:29:52,588 [salt.state       :1941][INFO    ][12954] Completed state [nginx] at time 23:29:52.588770 duration_in_ms=15439.666
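
The pkg.installed change report above logs one `'<pkg>' changed from '<old>' to '<new>'` line per package. A small sketch (hypothetical helper, sample lines copied from the report) that parses those lines back into a mapping:

```python
import re

CHANGE_RE = re.compile(r"'(?P<pkg>[^']+)' changed from '(?P<old>[^']*)' to '(?P<new>[^']*)'")

def parse_pkg_changes(report):
    """Turn Salt's package-change lines (as logged above) into
    {package: (old_version, new_version)}; non-matching lines are skipped."""
    changes = {}
    for line in report.splitlines():
        m = CHANGE_RE.search(line)
        if m:
            changes[m.group("pkg")] = (m.group("old"), m.group("new"))
    return changes

report = """'nginx' changed from 'absent' to '1.10.3-0ubuntu0.16.04.2'
'nginx-common' changed from 'absent' to '1.10.3-0ubuntu0.16.04.2'"""
print(parse_pkg_changes(report))
```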
2018-09-01 23:29:52,593 [salt.state       :1770][INFO    ][12954] Running state [apache2-utils] at time 23:29:52.593369
2018-09-01 23:29:52,593 [salt.state       :1803][INFO    ][12954] Executing state pkg.installed for [apache2-utils]
2018-09-01 23:29:53,062 [salt.state       :290 ][INFO    ][12954] All specified packages are already installed
2018-09-01 23:29:53,063 [salt.state       :1941][INFO    ][12954] Completed state [apache2-utils] at time 23:29:53.063144 duration_in_ms=469.774
2018-09-01 23:29:53,063 [salt.state       :1770][INFO    ][12954] Running state [openssl] at time 23:29:53.063621
2018-09-01 23:29:53,064 [salt.state       :1803][INFO    ][12954] Executing state pkg.installed for [openssl]
2018-09-01 23:29:53,070 [salt.state       :290 ][INFO    ][12954] All specified packages are already installed
2018-09-01 23:29:53,070 [salt.state       :1941][INFO    ][12954] Completed state [openssl] at time 23:29:53.070389 duration_in_ms=6.767
2018-09-01 23:29:53,072 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_keystone_private.conf] at time 23:29:53.072327
2018-09-01 23:29:53,072 [salt.state       :1803][INFO    ][12954] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_openstack_api_keystone_private.conf]
2018-09-01 23:29:53,093 [salt.fileclient  :1215][INFO    ][12954] Fetching file from saltenv 'base', ** done ** 'nginx/files/proxy.conf'
2018-09-01 23:29:53,140 [salt.fileclient  :1215][INFO    ][12954] Fetching file from saltenv 'base', ** done ** 'nginx/files/_limit.conf'
2018-09-01 23:29:53,167 [salt.fileclient  :1215][INFO    ][12954] Fetching file from saltenv 'base', ** done ** 'nginx/files/headers/_strict_transport_security.conf'
2018-09-01 23:29:53,185 [salt.fileclient  :1215][INFO    ][12954] Fetching file from saltenv 'base', ** done ** 'nginx/files/_name.conf'
2018-09-01 23:29:53,203 [salt.fileclient  :1215][INFO    ][12954] Fetching file from saltenv 'base', ** done ** 'nginx/files/_ssl.conf'
2018-09-01 23:29:53,254 [salt.fileclient  :1215][INFO    ][12954] Fetching file from saltenv 'base', ** done ** 'nginx/files/_ssl_secure.conf'
2018-09-01 23:29:53,270 [salt.fileclient  :1215][INFO    ][12954] Fetching file from saltenv 'base', ** done ** 'nginx/files/_auth.conf'
2018-09-01 23:29:53,287 [salt.fileclient  :1215][INFO    ][12954] Fetching file from saltenv 'base', ** done ** 'nginx/files/_access_policy.conf'
2018-09-01 23:29:53,294 [salt.state       :290 ][INFO    ][12954] File changed:
New file
2018-09-01 23:29:53,294 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_keystone_private.conf] at time 23:29:53.294858 duration_in_ms=222.529
2018-09-01 23:29:53,295 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone_private.conf] at time 23:29:53.295475
2018-09-01 23:29:53,296 [salt.state       :1803][INFO    ][12954] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone_private.conf]
2018-09-01 23:29:53,298 [salt.state       :290 ][INFO    ][12954] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone_private.conf'}
2018-09-01 23:29:53,298 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone_private.conf] at time 23:29:53.298483 duration_in_ms=3.008
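
The file.managed/file.symlink pairs above (and the similar pairs that follow) implement Debian's sites-available/sites-enabled convention: the config is written once, then enabled by symlinking it into sites-enabled. A sketch of that enable step (hypothetical function, temp directory standing in for /etc/nginx):

```python
import os
import pathlib
import tempfile

def enable_site(nginx_root, conf_name):
    """Create sites-enabled/<conf> -> sites-available/<conf>, like the
    file.symlink states above. Returns a change dict shaped like Salt's
    log line, or None when the correct link already exists."""
    avail = pathlib.Path(nginx_root) / "sites-available" / conf_name
    enabled = pathlib.Path(nginx_root) / "sites-enabled" / conf_name
    if enabled.is_symlink() and os.readlink(enabled) == str(avail):
        return None  # already enabled, nothing to change
    enabled.parent.mkdir(parents=True, exist_ok=True)
    enabled.symlink_to(avail)
    return {"new": str(enabled)}

# Demo against a throwaway directory tree.
root = tempfile.mkdtemp()
(pathlib.Path(root) / "sites-available").mkdir()
(pathlib.Path(root) / "sites-available" / "proxy.conf").write_text("server {}\n")
print(enable_site(root, "proxy.conf"))
```

On a rerun the function returns `None`, mirroring how an unchanged file.symlink state reports no changes.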
2018-09-01 23:29:53,298 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat_cloudwatch.conf] at time 23:29:53.298914
2018-09-01 23:29:53,299 [salt.state       :1803][INFO    ][12954] Executing state file.absent for [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat_cloudwatch.conf]
2018-09-01 23:29:53,299 [salt.state       :290 ][INFO    ][12954] File /etc/nginx/sites-available/nginx_proxy_openstack_api_heat_cloudwatch.conf is not present
2018-09-01 23:29:53,300 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat_cloudwatch.conf] at time 23:29:53.300213 duration_in_ms=1.299
2018-09-01 23:29:53,300 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cloudwatch.conf] at time 23:29:53.300604
2018-09-01 23:29:53,301 [salt.state       :1803][INFO    ][12954] Executing state file.absent for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cloudwatch.conf]
2018-09-01 23:29:53,301 [salt.state       :290 ][INFO    ][12954] File /etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cloudwatch.conf is not present
2018-09-01 23:29:53,301 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cloudwatch.conf] at time 23:29:53.301831 duration_in_ms=1.226
2018-09-01 23:29:53,302 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_aodh.conf] at time 23:29:53.302428
2018-09-01 23:29:53,302 [salt.state       :1803][INFO    ][12954] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_openstack_api_aodh.conf]
2018-09-01 23:29:53,466 [salt.state       :290 ][INFO    ][12954] File changed:
New file
2018-09-01 23:29:53,467 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_aodh.conf] at time 23:29:53.467465 duration_in_ms=165.037
2018-09-01 23:29:53,468 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_aodh.conf] at time 23:29:53.467962
2018-09-01 23:29:53,468 [salt.state       :1803][INFO    ][12954] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_aodh.conf]
2018-09-01 23:29:53,470 [salt.state       :290 ][INFO    ][12954] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_aodh.conf'}
2018-09-01 23:29:53,470 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_aodh.conf] at time 23:29:53.470432 duration_in_ms=2.469
2018-09-01 23:29:53,471 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_neutron.conf] at time 23:29:53.471127
2018-09-01 23:29:53,471 [salt.state       :1803][INFO    ][12954] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_openstack_api_neutron.conf]
2018-09-01 23:29:53,621 [salt.state       :290 ][INFO    ][12954] File changed:
New file
2018-09-01 23:29:53,622 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_neutron.conf] at time 23:29:53.622425 duration_in_ms=151.298
2018-09-01 23:29:53,622 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_neutron.conf] at time 23:29:53.622891
2018-09-01 23:29:53,623 [salt.state       :1803][INFO    ][12954] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_neutron.conf]
2018-09-01 23:29:53,624 [salt.state       :290 ][INFO    ][12954] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_neutron.conf'}
2018-09-01 23:29:53,625 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_neutron.conf] at time 23:29:53.625282 duration_in_ms=2.391
2018-09-01 23:29:53,626 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_web.conf] at time 23:29:53.625964
2018-09-01 23:29:53,626 [salt.state       :1803][INFO    ][12954] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_openstack_web.conf]
2018-09-01 23:29:53,776 [salt.state       :290 ][INFO    ][12954] File changed:
New file
2018-09-01 23:29:53,776 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_web.conf] at time 23:29:53.776748 duration_in_ms=150.783
2018-09-01 23:29:53,777 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_web.conf] at time 23:29:53.777203
2018-09-01 23:29:53,777 [salt.state       :1803][INFO    ][12954] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_openstack_web.conf]
2018-09-01 23:29:53,779 [salt.state       :290 ][INFO    ][12954] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_web.conf'}
2018-09-01 23:29:53,779 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_web.conf] at time 23:29:53.779577 duration_in_ms=2.374
2018-09-01 23:29:53,780 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_glance.conf] at time 23:29:53.780254
2018-09-01 23:29:53,780 [salt.state       :1803][INFO    ][12954] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_openstack_api_glance.conf]
2018-09-01 23:29:53,941 [salt.state       :290 ][INFO    ][12954] File changed:
New file
2018-09-01 23:29:53,942 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_glance.conf] at time 23:29:53.942598 duration_in_ms=162.343
2018-09-01 23:29:53,943 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_glance.conf] at time 23:29:53.943063
2018-09-01 23:29:53,943 [salt.state       :1803][INFO    ][12954] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_glance.conf]
2018-09-01 23:29:53,945 [salt.state       :290 ][INFO    ][12954] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_glance.conf'}
2018-09-01 23:29:53,945 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_glance.conf] at time 23:29:53.945592 duration_in_ms=2.529
2018-09-01 23:29:53,946 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-available/nginx_proxy_novnc.conf] at time 23:29:53.946221
2018-09-01 23:29:53,946 [salt.state       :1803][INFO    ][12954] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_novnc.conf]
2018-09-01 23:29:54,105 [salt.state       :290 ][INFO    ][12954] File changed:
New file
2018-09-01 23:29:54,106 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-available/nginx_proxy_novnc.conf] at time 23:29:54.106075 duration_in_ms=159.852
2018-09-01 23:29:54,106 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-enabled/nginx_proxy_novnc.conf] at time 23:29:54.106604
2018-09-01 23:29:54,107 [salt.state       :1803][INFO    ][12954] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_novnc.conf]
2018-09-01 23:29:54,109 [salt.state       :290 ][INFO    ][12954] {'new': '/etc/nginx/sites-enabled/nginx_proxy_novnc.conf'}
2018-09-01 23:29:54,109 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-enabled/nginx_proxy_novnc.conf] at time 23:29:54.109482 duration_in_ms=2.878
2018-09-01 23:29:54,110 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_keystone.conf] at time 23:29:54.110203
2018-09-01 23:29:54,110 [salt.state       :1803][INFO    ][12954] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_openstack_api_keystone.conf]
2018-09-01 23:29:54,270 [salt.state       :290 ][INFO    ][12954] File changed:
New file
2018-09-01 23:29:54,270 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_keystone.conf] at time 23:29:54.270921 duration_in_ms=160.717
2018-09-01 23:29:54,271 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone.conf] at time 23:29:54.271403
2018-09-01 23:29:54,271 [salt.state       :1803][INFO    ][12954] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone.conf]
2018-09-01 23:29:54,273 [salt.state       :290 ][INFO    ][12954] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone.conf'}
2018-09-01 23:29:54,273 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone.conf] at time 23:29:54.273925 duration_in_ms=2.522
2018-09-01 23:29:54,274 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_cinder.conf] at time 23:29:54.274528
2018-09-01 23:29:54,274 [salt.state       :1803][INFO    ][12954] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_openstack_api_cinder.conf]
2018-09-01 23:29:54,541 [salt.state       :290 ][INFO    ][12954] File changed:
New file
2018-09-01 23:29:54,542 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_cinder.conf] at time 23:29:54.542791 duration_in_ms=268.262
2018-09-01 23:29:54,543 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_cinder.conf] at time 23:29:54.543439
2018-09-01 23:29:54,544 [salt.state       :1803][INFO    ][12954] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_cinder.conf]
2018-09-01 23:29:54,546 [salt.state       :290 ][INFO    ][12954] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_cinder.conf'}
2018-09-01 23:29:54,547 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_cinder.conf] at time 23:29:54.546939 duration_in_ms=3.5
2018-09-01 23:29:54,548 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat_cfn.conf] at time 23:29:54.547946
2018-09-01 23:29:54,548 [salt.state       :1803][INFO    ][12954] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat_cfn.conf]
2018-09-01 23:29:54,711 [salt.state       :290 ][INFO    ][12954] File changed:
New file
2018-09-01 23:29:54,712 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat_cfn.conf] at time 23:29:54.712217 duration_in_ms=164.27
2018-09-01 23:29:54,712 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cfn.conf] at time 23:29:54.712735
2018-09-01 23:29:54,713 [salt.state       :1803][INFO    ][12954] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cfn.conf]
2018-09-01 23:29:54,715 [salt.state       :290 ][INFO    ][12954] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cfn.conf'}
2018-09-01 23:29:54,715 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cfn.conf] at time 23:29:54.715593 duration_in_ms=2.858
2018-09-01 23:29:54,716 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-available/nginx_redirect_openstack_web_redirect.conf] at time 23:29:54.716394
2018-09-01 23:29:54,716 [salt.state       :1803][INFO    ][12954] Executing state file.managed for [/etc/nginx/sites-available/nginx_redirect_openstack_web_redirect.conf]
2018-09-01 23:29:54,734 [salt.fileclient  :1215][INFO    ][12954] Fetching file from saltenv 'base', ** done ** 'nginx/files/redirect.conf'
2018-09-01 23:29:54,746 [salt.state       :290 ][INFO    ][12954] File changed:
New file
2018-09-01 23:29:54,746 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-available/nginx_redirect_openstack_web_redirect.conf] at time 23:29:54.746799 duration_in_ms=30.405
2018-09-01 23:29:54,747 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-enabled/nginx_redirect_openstack_web_redirect.conf] at time 23:29:54.747244
2018-09-01 23:29:54,747 [salt.state       :1803][INFO    ][12954] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_redirect_openstack_web_redirect.conf]
2018-09-01 23:29:54,749 [salt.state       :290 ][INFO    ][12954] {'new': '/etc/nginx/sites-enabled/nginx_redirect_openstack_web_redirect.conf'}
2018-09-01 23:29:54,749 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-enabled/nginx_redirect_openstack_web_redirect.conf] at time 23:29:54.749646 duration_in_ms=2.402
2018-09-01 23:29:54,750 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-available/nginx_static_reclass_doc.conf] at time 23:29:54.750328
2018-09-01 23:29:54,750 [salt.state       :1803][INFO    ][12954] Executing state file.managed for [/etc/nginx/sites-available/nginx_static_reclass_doc.conf]
2018-09-01 23:29:54,768 [salt.fileclient  :1215][INFO    ][12954] Fetching file from saltenv 'base', ** done ** 'nginx/files/static.conf'
2018-09-01 23:29:54,807 [salt.fileclient  :1215][INFO    ][12954] Fetching file from saltenv 'base', ** done ** 'nginx/files/_log.conf'
2018-09-01 23:29:54,899 [salt.state       :290 ][INFO    ][12954] File changed:
New file
2018-09-01 23:29:54,899 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-available/nginx_static_reclass_doc.conf] at time 23:29:54.899659 duration_in_ms=149.331
2018-09-01 23:29:54,900 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-enabled/nginx_static_reclass_doc.conf] at time 23:29:54.900137
2018-09-01 23:29:54,900 [salt.state       :1803][INFO    ][12954] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_static_reclass_doc.conf]
2018-09-01 23:29:54,902 [salt.state       :290 ][INFO    ][12954] {'new': '/etc/nginx/sites-enabled/nginx_static_reclass_doc.conf'}
2018-09-01 23:29:54,902 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-enabled/nginx_static_reclass_doc.conf] at time 23:29:54.902748 duration_in_ms=2.611
2018-09-01 23:29:54,903 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat.conf] at time 23:29:54.903472
2018-09-01 23:29:54,903 [salt.state       :1803][INFO    ][12954] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat.conf]
2018-09-01 23:29:55,059 [salt.state       :290 ][INFO    ][12954] File changed:
New file
2018-09-01 23:29:55,060 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat.conf] at time 23:29:55.060517 duration_in_ms=157.041
2018-09-01 23:29:55,061 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat.conf] at time 23:29:55.060990
2018-09-01 23:29:55,061 [salt.state       :1803][INFO    ][12954] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat.conf]
2018-09-01 23:29:55,063 [salt.state       :290 ][INFO    ][12954] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat.conf'}
2018-09-01 23:29:55,063 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat.conf] at time 23:29:55.063388 duration_in_ms=2.398
2018-09-01 23:29:55,064 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-available/nginx_stats_stats.conf] at time 23:29:55.064054
2018-09-01 23:29:55,064 [salt.state       :1803][INFO    ][12954] Executing state file.managed for [/etc/nginx/sites-available/nginx_stats_stats.conf]
2018-09-01 23:29:55,081 [salt.fileclient  :1215][INFO    ][12954] Fetching file from saltenv 'base', ** done ** 'nginx/files/stats.conf'
2018-09-01 23:29:55,089 [salt.state       :290 ][INFO    ][12954] File changed:
New file
2018-09-01 23:29:55,089 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-available/nginx_stats_stats.conf] at time 23:29:55.089929 duration_in_ms=25.874
2018-09-01 23:29:55,090 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-enabled/nginx_stats_stats.conf] at time 23:29:55.090403
2018-09-01 23:29:55,090 [salt.state       :1803][INFO    ][12954] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_stats_stats.conf]
2018-09-01 23:29:55,092 [salt.state       :290 ][INFO    ][12954] {'new': '/etc/nginx/sites-enabled/nginx_stats_stats.conf'}
2018-09-01 23:29:55,092 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-enabled/nginx_stats_stats.conf] at time 23:29:55.092907 duration_in_ms=2.503
2018-09-01 23:29:55,093 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_nova.conf] at time 23:29:55.093685
2018-09-01 23:29:55,094 [salt.state       :1803][INFO    ][12954] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_openstack_api_nova.conf]
2018-09-01 23:29:55,249 [salt.state       :290 ][INFO    ][12954] File changed:
New file
2018-09-01 23:29:55,249 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_nova.conf] at time 23:29:55.249759 duration_in_ms=156.074
2018-09-01 23:29:55,250 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_nova.conf] at time 23:29:55.250211
2018-09-01 23:29:55,250 [salt.state       :1803][INFO    ][12954] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_nova.conf]
2018-09-01 23:29:55,252 [salt.state       :290 ][INFO    ][12954] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_nova.conf'}
2018-09-01 23:29:55,252 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_nova.conf] at time 23:29:55.252483 duration_in_ms=2.271
2018-09-01 23:29:55,253 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_ceilometer.conf] at time 23:29:55.253165
2018-09-01 23:29:55,253 [salt.state       :1803][INFO    ][12954] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_openstack_api_ceilometer.conf]
2018-09-01 23:29:55,412 [salt.state       :290 ][INFO    ][12954] File changed:
New file
2018-09-01 23:29:55,413 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_ceilometer.conf] at time 23:29:55.413848 duration_in_ms=160.68
2018-09-01 23:29:55,414 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_ceilometer.conf] at time 23:29:55.414542
2018-09-01 23:29:55,415 [salt.state       :1803][INFO    ][12954] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_ceilometer.conf]
2018-09-01 23:29:55,418 [salt.state       :290 ][INFO    ][12954] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_ceilometer.conf'}
2018-09-01 23:29:55,418 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_ceilometer.conf] at time 23:29:55.418830 duration_in_ms=4.289
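The repeated file.managed / file.symlink pairs above follow the Debian-style sites-available / sites-enabled convention: each vhost is rendered into sites-available, then enabled with a symlink. A minimal SLS sketch of that pattern (the site name and source path below are illustrative, not taken from this run):

```yaml
# Render the vhost into sites-available, then enable it via symlink.
/etc/nginx/sites-available/example_site.conf:
  file.managed:
    - source: salt://nginx/files/proxy.conf   # hypothetical source path
    - template: jinja
    - user: root
    - group: root
    - mode: 644

/etc/nginx/sites-enabled/example_site.conf:
  file.symlink:
    - target: /etc/nginx/sites-available/example_site.conf
    - require:
      - file: /etc/nginx/sites-available/example_site.conf
```

The `require` requisite guarantees the symlink is only created after the target file exists, matching the managed-then-symlink ordering visible in the log.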
2018-09-01 23:29:55,419 [salt.state       :1770][INFO    ][12954] Running state [/usr/sbin/policy-rc.d] at time 23:29:55.419294
2018-09-01 23:29:55,419 [salt.state       :1803][INFO    ][12954] Executing state file.managed for [/usr/sbin/policy-rc.d]
2018-09-01 23:29:55,424 [salt.state       :290 ][INFO    ][12954] File changed:
New file
2018-09-01 23:29:55,425 [salt.state       :1941][INFO    ][12954] Completed state [/usr/sbin/policy-rc.d] at time 23:29:55.425249 duration_in_ms=5.865
2018-09-01 23:29:55,426 [salt.state       :1770][INFO    ][12954] Running state [/usr/sbin/policy-rc.d] at time 23:29:55.426122
2018-09-01 23:29:55,426 [salt.state       :1803][INFO    ][12954] Executing state file.absent for [/usr/sbin/policy-rc.d]
2018-09-01 23:29:55,427 [salt.state       :290 ][INFO    ][12954] {'removed': '/usr/sbin/policy-rc.d'}
2018-09-01 23:29:55,427 [salt.state       :1941][INFO    ][12954] Completed state [/usr/sbin/policy-rc.d] at time 23:29:55.427677 duration_in_ms=1.556
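The /usr/sbin/policy-rc.d create-then-remove pair above is the standard Debian mechanism for suppressing service autostart during package installation: invoke-rc.d consults this script and treats exit code 101 as "action denied". The file's contents are not shown in this log; the conventional body is assumed in this sketch (state IDs are illustrative):

```yaml
# Deny all invoke-rc.d service actions while packages install.
# Exit code 101 = "action forbidden by policy" (assumed conventional body).
policy_rcd_present:
  file.managed:
    - name: /usr/sbin/policy-rc.d
    - mode: 755
    - contents: |
        #!/bin/sh
        exit 101

# ... package installation states would run between these two ...

policy_rcd_absent:
  file.absent:
    - name: /usr/sbin/policy-rc.d
```

Removing the file afterwards restores normal service start behavior, which is why the log shows file.managed immediately followed by file.absent for the same path.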
2018-09-01 23:29:55,428 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/conf.d/default.conf] at time 23:29:55.428527
2018-09-01 23:29:55,429 [salt.state       :1803][INFO    ][12954] Executing state file.absent for [/etc/nginx/conf.d/default.conf]
2018-09-01 23:29:55,429 [salt.state       :290 ][INFO    ][12954] File /etc/nginx/conf.d/default.conf is not present
2018-09-01 23:29:55,430 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/conf.d/default.conf] at time 23:29:55.429965 duration_in_ms=1.438
2018-09-01 23:29:55,430 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-enabled/default] at time 23:29:55.430465
2018-09-01 23:29:55,430 [salt.state       :1803][INFO    ][12954] Executing state file.absent for [/etc/nginx/sites-enabled/default]
2018-09-01 23:29:55,431 [salt.state       :290 ][INFO    ][12954] {'removed': '/etc/nginx/sites-enabled/default'}
2018-09-01 23:29:55,431 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-enabled/default] at time 23:29:55.431408 duration_in_ms=0.943
2018-09-01 23:29:55,431 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/sites-available/default] at time 23:29:55.431908
2018-09-01 23:29:55,432 [salt.state       :1803][INFO    ][12954] Executing state file.absent for [/etc/nginx/sites-available/default]
2018-09-01 23:29:55,432 [salt.state       :290 ][INFO    ][12954] {'removed': '/etc/nginx/sites-available/default'}
2018-09-01 23:29:55,432 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/sites-available/default] at time 23:29:55.432863 duration_in_ms=0.955
2018-09-01 23:29:55,433 [salt.state       :1770][INFO    ][12954] Running state [/etc/nginx/nginx.conf] at time 23:29:55.433372
2018-09-01 23:29:55,433 [salt.state       :1803][INFO    ][12954] Executing state file.managed for [/etc/nginx/nginx.conf]
2018-09-01 23:29:55,449 [salt.fileclient  :1215][INFO    ][12954] Fetching file from saltenv 'base', ** done ** 'nginx/files/nginx.conf'
2018-09-01 23:29:55,475 [salt.state       :290 ][INFO    ][12954] File changed:
--- 
+++ 
@@ -1,85 +1,102 @@
 user www-data;
 worker_processes auto;
+worker_rlimit_nofile 20000;
 pid /run/nginx.pid;
 
+
 events {
-	worker_connections 768;
-	# multi_accept on;
+        worker_connections 1024;
+        # multi_accept on;
 }
 
 http {
 
-	##
-	# Basic Settings
-	##
+        ##
+        # Basic Settings
+        ##
 
-	sendfile on;
-	tcp_nopush on;
-	tcp_nodelay on;
-	keepalive_timeout 65;
-	types_hash_max_size 2048;
-	# server_tokens off;
+        sendfile on;
+        tcp_nopush on;
+        tcp_nodelay on;
+        keepalive_timeout 65;
+        types_hash_max_size 2048;
+        server_tokens off;
 
-	# server_names_hash_bucket_size 64;
-	# server_name_in_redirect off;
+        server_names_hash_bucket_size 128;
+        # server_name_in_redirect off;
 
-	include /etc/nginx/mime.types;
-	default_type application/octet-stream;
+        variables_hash_bucket_size 128;
 
-	##
-	# SSL Settings
-	##
+        include /etc/nginx/mime.types;
+        default_type application/octet-stream;
 
-	ssl_protocols TLSv1 TLSv1.1 TLSv1.2; # Dropping SSLv3, ref: POODLE
-	ssl_prefer_server_ciphers on;
+        ##
+        # Logging Settings
+        ##
 
-	##
-	# Logging Settings
-	##
+        access_log /var/log/nginx/access.log;
+        error_log /var/log/nginx/error.log;
 
-	access_log /var/log/nginx/access.log;
-	error_log /var/log/nginx/error.log;
+        ##
+        # Gzip Settings
+        ##
 
-	##
-	# Gzip Settings
-	##
+        gzip on;
+        gzip_disable "msie6";
 
-	gzip on;
-	gzip_disable "msie6";
+        # gzip_vary on;
+        # gzip_proxied any;
+        # gzip_comp_level 6;
+        # gzip_buffers 16 8k;
+        # gzip_http_version 1.1;
+        # gzip_types text/plain text/css application/json application/x-javascript text/xml application/xml application/xml+rss text/javascript;
 
-	# gzip_vary on;
-	# gzip_proxied any;
-	# gzip_comp_level 6;
-	# gzip_buffers 16 8k;
-	# gzip_http_version 1.1;
-	# gzip_types text/plain text/css application/json application/javascript text/xml application/xml application/xml+rss text/javascript;
+        ##
+        # nginx-naxsi config
+        ##
+        # Uncomment it if you installed nginx-naxsi
+        ##
 
-	##
-	# Virtual Host Configs
-	##
+        #include /etc/nginx/naxsi_core.rules;
 
-	include /etc/nginx/conf.d/*.conf;
-	include /etc/nginx/sites-enabled/*;
+        ##
+        # nginx-passenger config
+        ##
+        # Uncomment it if you installed nginx-passenger
+        ##
+
+        #passenger_root /usr;
+        #passenger_ruby /usr/bin/ruby;
+
+
+
+        ##
+        # Virtual Host Configs
+        ##
+
+        include /etc/nginx/conf.d/*.conf;
+        include /etc/nginx/sites-enabled/*.conf;
 }
 
 
+
 #mail {
-#	# See sample authentication script at:
-#	# http://wiki.nginx.org/ImapAuthenticateWithApachePhpScript
-# 
-#	# auth_http localhost/auth.php;
-#	# pop3_capabilities "TOP" "USER";
-#	# imap_capabilities "IMAP4rev1" "UIDPLUS";
-# 
-#	server {
-#		listen     localhost:110;
-#		protocol   pop3;
-#		proxy      on;
-#	}
-# 
-#	server {
-#		listen     localhost:143;
-#		protocol   imap;
-#		proxy      on;
-#	}
+#       # See sample authentication script at:
+#       # http://wiki.nginx.org/ImapAuthenticateWithApachePhpScript
+#
+#       # auth_http localhost/auth.php;
+#       # pop3_capabilities "TOP" "USER";
+#       # imap_capabilities "IMAP4rev1" "UIDPLUS";
+#
+#       server {
+#               listen     localhost:110;
+#               protocol   pop3;
+#               proxy      on;
+#       }
+#
+#       server {
+#               listen     localhost:143;
+#               protocol   imap;
+#               proxy      on;
+#       }
 #}

2018-09-01 23:29:55,475 [salt.state       :1941][INFO    ][12954] Completed state [/etc/nginx/nginx.conf] at time 23:29:55.475599 duration_in_ms=42.225
2018-09-01 23:29:55,476 [salt.state       :1770][INFO    ][12954] Running state [/etc/ssl/private] at time 23:29:55.476101
2018-09-01 23:29:55,476 [salt.state       :1803][INFO    ][12954] Executing state file.directory for [/etc/ssl/private]
2018-09-01 23:29:55,477 [salt.state       :290 ][INFO    ][12954] Directory /etc/ssl/private is in the correct state
Directory /etc/ssl/private updated
2018-09-01 23:29:55,477 [salt.state       :1941][INFO    ][12954] Completed state [/etc/ssl/private] at time 23:29:55.477415 duration_in_ms=1.314
2018-09-01 23:29:55,489 [salt.state       :1770][INFO    ][12954] Running state [openssl dhparam -out /etc/ssl/dhparams.pem 2048] at time 23:29:55.489646
2018-09-01 23:29:55,490 [salt.state       :1803][INFO    ][12954] Executing state cmd.run for [openssl dhparam -out /etc/ssl/dhparams.pem 2048]
2018-09-01 23:29:55,491 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12954] Executing command 'openssl dhparam -out /etc/ssl/dhparams.pem 2048' in directory '/root'
2018-09-01 23:29:56,303 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901232956296028
2018-09-01 23:29:56,323 [salt.minion      :1431][INFO    ][14034] Starting a new job with PID 14034
2018-09-01 23:29:56,336 [salt.minion      :1708][INFO    ][14034] Returning information for job: 20180901232956296028
2018-09-01 23:30:06,322 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901233006315402
2018-09-01 23:30:06,344 [salt.minion      :1431][INFO    ][14043] Starting a new job with PID 14043
2018-09-01 23:30:06,360 [salt.minion      :1708][INFO    ][14043] Returning information for job: 20180901233006315402
2018-09-01 23:30:16,522 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901233016514281
2018-09-01 23:30:16,542 [salt.minion      :1431][INFO    ][14052] Starting a new job with PID 14052
2018-09-01 23:30:16,556 [salt.minion      :1708][INFO    ][14052] Returning information for job: 20180901233016514281
2018-09-01 23:30:26,719 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command saltutil.find_job with jid 20180901233026712245
2018-09-01 23:30:26,743 [salt.minion      :1431][INFO    ][14061] Starting a new job with PID 14061
2018-09-01 23:30:26,758 [salt.minion      :1708][INFO    ][14061] Returning information for job: 20180901233026712245
2018-09-01 23:30:32,401 [salt.state       :290 ][INFO    ][12954] {'pid': 14030, 'retcode': 0, 'stderr': "Generating DH parameters, 2048 bit long safe prime, generator 2\nThis is going to take a long time\n[... several thousand characters of openssl '.' and '+' progress output elided ...]\nunable to write 'random state'", 'stdout': ''}
2018-09-01 23:30:32,403 [salt.state       :1941][INFO    ][12954] Completed state [openssl dhparam -out /etc/ssl/dhparams.pem 2048] at time 23:30:32.403539 duration_in_ms=36913.893
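Generating the 2048-bit DH parameters took roughly 37 s here (duration_in_ms=36913.893), during which the master polled the still-running job with saltutil.find_job every ~10 s. A cmd.run state can be guarded so this expensive step only runs once; a sketch assuming a `creates` guard (the actual state definition is not visible in this log):

```yaml
# Generate DH params once; 'creates' makes the state a no-op
# on subsequent runs while the output file exists.
openssl dhparam -out /etc/ssl/dhparams.pem 2048:
  cmd.run:
    - creates: /etc/ssl/dhparams.pem
```

The "unable to write 'random state'" line in stderr is openssl failing to update its ~/.rnd seed file; it is non-fatal, consistent with the retcode of 0.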
2018-09-01 23:30:32,406 [salt.state       :1770][INFO    ][12954] Running state [nginx] at time 23:30:32.406484
2018-09-01 23:30:32,406 [salt.state       :1803][INFO    ][12954] Executing state service.running for [nginx]
2018-09-01 23:30:32,407 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12954] Executing command ['systemctl', 'status', 'nginx.service', '-n', '0'] in directory '/root'
2018-09-01 23:30:32,422 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12954] Executing command ['systemctl', 'is-active', 'nginx.service'] in directory '/root'
2018-09-01 23:30:32,439 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12954] Executing command ['systemctl', 'is-enabled', 'nginx.service'] in directory '/root'
2018-09-01 23:30:32,455 [salt.state       :290 ][INFO    ][12954] The service nginx is already running
2018-09-01 23:30:32,455 [salt.state       :1941][INFO    ][12954] Completed state [nginx] at time 23:30:32.455762 duration_in_ms=49.277
2018-09-01 23:30:32,456 [salt.state       :1770][INFO    ][12954] Running state [nginx] at time 23:30:32.455978
2018-09-01 23:30:32,456 [salt.state       :1803][INFO    ][12954] Executing state service.mod_watch for [nginx]
2018-09-01 23:30:32,456 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12954] Executing command ['systemctl', 'is-active', 'nginx.service'] in directory '/root'
2018-09-01 23:30:32,473 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12954] Executing command ['systemd-run', '--scope', 'systemctl', 'restart', 'nginx.service'] in directory '/root'
2018-09-01 23:30:32,592 [salt.state       :290 ][INFO    ][12954] {'nginx': True}
2018-09-01 23:30:32,593 [salt.state       :1941][INFO    ][12954] Completed state [nginx] at time 23:30:32.593237 duration_in_ms=137.258
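The service.running check ("already running") followed by service.mod_watch is a watch requisite firing: because /etc/nginx/nginx.conf changed earlier in the run, Salt restarted nginx via `systemd-run --scope systemctl restart`. A sketch of the state shape that produces this sequence (state ID illustrative):

```yaml
nginx_service:
  service.running:
    - name: nginx
    - enable: True
    - watch:
      # Any change reported by a watched state triggers
      # mod_watch, i.e. a service restart.
      - file: /etc/nginx/nginx.conf
```

Without the watch requisite the run would have stopped at "The service nginx is already running" and the new configuration would not have been loaded.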
2018-09-01 23:30:32,595 [salt.minion      :1708][INFO    ][12954] Returning information for job: 20180901232930804446
2018-09-01 23:30:47,299 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command pillar.get with jid 20180901233047294059
2018-09-01 23:30:47,320 [salt.minion      :1431][INFO    ][14142] Starting a new job with PID 14142
2018-09-01 23:30:47,326 [salt.minion      :1708][INFO    ][14142] Returning information for job: 20180901233047294059
2018-09-01 23:30:47,937 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command cp.push with jid 20180901233047932631
2018-09-01 23:30:47,956 [salt.minion      :1431][INFO    ][14147] Starting a new job with PID 14147
2018-09-01 23:30:47,978 [salt.minion      :1708][INFO    ][14147] Returning information for job: 20180901233047932631
2018-09-01 23:31:53,875 [salt.minion      :1307][INFO    ][1654] User sudo_ubuntu Executing command cp.push_dir with jid 20180901233153869249
2018-09-01 23:31:53,899 [salt.minion      :1431][INFO    ][14182] Starting a new job with PID 14182
