2019-04-30 21:43:05,680 [salt.utils.decorators:82  ][ERROR   ][1971] Exception encountered when attempting to inspect frame in dependency decorator: list index out of range
2019-04-30 21:43:49,598 [salt.utils.decorators:613 ][WARNING ][1971] The function "module.run" is using its deprecated version and will expire in version "Sodium".
2019-04-30 21:43:51,433 [salt.loaded.int.states.file:2298][WARNING ][1971] State for file: /etc/ssl/certs/ca-salt_master_ca.crt - Neither 'source' nor 'contents' nor 'contents_pillar' nor 'contents_grains' was defined, yet 'replace' was set to 'True'. As there is no source to replace the file with, 'replace' has been set to 'False' to avoid reading the file unnecessarily.
2019-04-30 21:43:54,205 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3331] Executing command ['systemctl', 'status', 'salt-minion.service', '-n', '0'] in directory '/root'
2019-04-30 21:43:54,226 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3331] Executing command ['systemd-run', '--scope', 'systemctl', 'restart', 'salt-minion.service'] in directory '/root'
2019-04-30 21:43:54,255 [salt.utils.parsers:1051][WARNING ][1600] Minion received a SIGTERM. Exiting.
2019-04-30 21:43:55,057 [salt.cli.daemons :293 ][INFO    ][3390] Setting up the Salt Minion "prx01.mcp-ovs-ha.local"
2019-04-30 21:43:55,133 [salt.cli.daemons :82  ][INFO    ][3390] Starting up the Salt Minion
2019-04-30 21:43:55,134 [salt.utils.event :1017][INFO    ][3390] Starting pull socket on /var/run/salt/minion/minion_event_ff902ec8d4_pull.ipc
2019-04-30 21:43:55,535 [salt.minion      :976 ][INFO    ][3390] Creating minion process manager
2019-04-30 21:43:56,454 [salt.loader.192.168.11.2.int.module.cmdmod:395 ][INFO    ][3390] Executing command ['date', '+%z'] in directory '/root'
2019-04-30 21:43:56,467 [salt.utils.schedule:568 ][INFO    ][3390] Updating job settings for scheduled job: __mine_interval
2019-04-30 21:43:56,523 [salt.minion      :1108][INFO    ][3390] Added mine.update to scheduler
2019-04-30 21:43:56,537 [salt.minion      :1975][INFO    ][3390] Minion is starting as user 'root'
2019-04-30 21:43:56,548 [salt.minion      :2336][INFO    ][3390] Minion is ready to receive requests!
2019-04-30 21:44:52,910 [salt.minion      :1308][INFO    ][3390] User sudo_ubuntu Executing command state.apply with jid 20190430214452901770
2019-04-30 21:44:52,920 [salt.minion      :1432][INFO    ][3479] Starting a new job with PID 3479
2019-04-30 21:44:56,645 [salt.state       :915 ][INFO    ][3479] Loading fresh modules for state activity
2019-04-30 21:44:56,849 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/init.sls'
2019-04-30 21:44:57,086 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/system/init.sls'
2019-04-30 21:44:57,871 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/system/env.sls'
2019-04-30 21:44:58,002 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/system/profile.sls'
2019-04-30 21:44:58,078 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/system/shell.sls'
2019-04-30 21:44:58,153 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/system/login_defs.sls'
2019-04-30 21:44:58,225 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/system/at.sls'
2019-04-30 21:44:58,843 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/system/cron.sls'
2019-04-30 21:44:58,922 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/system/repo.sls'
2019-04-30 21:44:59,038 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/system/package.sls'
2019-04-30 21:44:59,116 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/system/timezone.sls'
2019-04-30 21:44:59,756 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/system/kernel.sls'
2019-04-30 21:44:59,891 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/system/grub.sls'
2019-04-30 21:44:59,911 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/system/cpu.sls'
2019-04-30 21:44:59,987 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/system/sysfs.sls'
2019-04-30 21:45:00,065 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/system/locale.sls'
2019-04-30 21:45:00,137 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/system/user.sls'
2019-04-30 21:45:00,228 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/system/group.sls'
2019-04-30 21:45:00,304 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/system/limit.sls'
2019-04-30 21:45:01,054 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/system/service.sls'
2019-04-30 21:45:01,797 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/system/systemd.sls'
2019-04-30 21:45:01,913 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/system/file.sls'
2019-04-30 21:45:02,003 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/system/apt.sls'
2019-04-30 21:45:02,728 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/system/banner.sls'
2019-04-30 21:45:02,809 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/network/init.sls'
2019-04-30 21:45:02,897 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/network/hostname.sls'
2019-04-30 21:45:02,970 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/network/host.sls'
2019-04-30 21:45:03,683 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/network/interface.sls'
2019-04-30 21:45:03,840 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/network/proxy.sls'
2019-04-30 21:45:04,427 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/storage/init.sls'
2019-04-30 21:45:04,505 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/storage/mount.sls'
2019-04-30 21:45:04,591 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'ntp/init.sls'
2019-04-30 21:45:04,608 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'ntp/client.sls'
2019-04-30 21:45:04,639 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'ntp/server.sls'
2019-04-30 21:45:04,668 [salt.state       :1780][INFO    ][3479] Running state [/etc/environment] at time 21:45:04.668594
2019-04-30 21:45:04,668 [salt.state       :1813][INFO    ][3479] Executing state file.blockreplace for [/etc/environment]
2019-04-30 21:45:04,674 [salt.state       :300 ][INFO    ][3479] File changed:
--- 
+++ 
@@ -1 +1,4 @@
 PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games"
+# SALT MANAGED VARIABLES - DO NOT EDIT - START
+# 
+# SALT MANAGED VARIABLES - END

2019-04-30 21:45:04,674 [salt.state       :1951][INFO    ][3479] Completed state [/etc/environment] at time 21:45:04.674258 duration_in_ms=5.665
2019-04-30 21:45:04,674 [salt.state       :1780][INFO    ][3479] Running state [/etc/profile.d] at time 21:45:04.674435
2019-04-30 21:45:04,674 [salt.state       :1813][INFO    ][3479] Executing state file.directory for [/etc/profile.d]
2019-04-30 21:45:04,700 [salt.state       :300 ][INFO    ][3479] Directory /etc/profile.d is in the correct state
Directory /etc/profile.d updated
2019-04-30 21:45:04,700 [salt.state       :1951][INFO    ][3479] Completed state [/etc/profile.d] at time 21:45:04.700392 duration_in_ms=25.957
2019-04-30 21:45:04,700 [salt.state       :1780][INFO    ][3479] Running state [/etc/bash.bashrc] at time 21:45:04.700608
2019-04-30 21:45:04,700 [salt.state       :1813][INFO    ][3479] Executing state file.blockreplace for [/etc/bash.bashrc]
2019-04-30 21:45:05,246 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['git', '--version'] in directory '/root'
2019-04-30 21:45:05,353 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'test -f /etc/bash.bashrc' in directory '/root'
2019-04-30 21:45:05,370 [salt.state       :300 ][INFO    ][3479] File changed:
--- 
+++ 
@@ -66,3 +66,6 @@
 		fi
 	}
 fi
+# BEGIN CIS 5.4.4 default user umask
+umask 027
+# END CIS 5.4.4 default user umask

2019-04-30 21:45:05,370 [salt.state       :1951][INFO    ][3479] Completed state [/etc/bash.bashrc] at time 21:45:05.370474 duration_in_ms=669.864
2019-04-30 21:45:05,370 [salt.state       :1780][INFO    ][3479] Running state [/etc/profile] at time 21:45:05.370858
2019-04-30 21:45:05,371 [salt.state       :1813][INFO    ][3479] Executing state file.blockreplace for [/etc/profile]
2019-04-30 21:45:05,375 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'test -f /etc/profile' in directory '/root'
2019-04-30 21:45:05,387 [salt.state       :300 ][INFO    ][3479] File changed:
--- 
+++ 
@@ -25,3 +25,6 @@
   done
   unset i
 fi
+# BEGIN CIS 5.4.4 default user umask
+umask 027
+# END CIS 5.4.4 default user umask

2019-04-30 21:45:05,387 [salt.state       :1951][INFO    ][3479] Completed state [/etc/profile] at time 21:45:05.387861 duration_in_ms=17.003
2019-04-30 21:45:05,388 [salt.state       :1780][INFO    ][3479] Running state [/etc/login.defs] at time 21:45:05.388215
2019-04-30 21:45:05,388 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/login.defs]
2019-04-30 21:45:05,408 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/files/login.defs.jinja'
2019-04-30 21:45:05,489 [salt.state       :300 ][INFO    ][3479] File changed:
--- 
+++ 
@@ -1,341 +1,28 @@
-#
-# /etc/login.defs - Configuration control definitions for the login package.
-#
-# Three items must be defined:  MAIL_DIR, ENV_SUPATH, and ENV_PATH.
-# If unspecified, some arbitrary (and possibly incorrect) value will
-# be assumed.  All other items are optional - if not specified then
-# the described action or option will be inhibited.
-#
-# Comment lines (lines beginning with "#") and blank lines are ignored.
-#
-# Modified for Linux.  --marekm
-
-# REQUIRED for useradd/userdel/usermod
-#   Directory where mailboxes reside, _or_ name of file, relative to the
-#   home directory.  If you _do_ define MAIL_DIR and MAIL_FILE,
-#   MAIL_DIR takes precedence.
-#
-#   Essentially:
-#      - MAIL_DIR defines the location of users mail spool files
-#        (for mbox use) by appending the username to MAIL_DIR as defined
-#        below.
-#      - MAIL_FILE defines the location of the users mail spool files as the
-#        fully-qualified filename obtained by prepending the user home
-#        directory before $MAIL_FILE
-#
-# NOTE: This is no more used for setting up users MAIL environment variable
-#       which is, starting from shadow 4.0.12-1 in Debian, entirely the
-#       job of the pam_mail PAM modules
-#       See default PAM configuration files provided for
-#       login, su, etc.
-#
-# This is a temporary situation: setting these variables will soon
-# move to /etc/default/useradd and the variables will then be
-# no more supported
-MAIL_DIR        /var/mail
-#MAIL_FILE      .mail
-
-#
-# Enable logging and display of /var/log/faillog login failure info.
-# This option conflicts with the pam_tally PAM module.
-#
-FAILLOG_ENAB		yes
-
-#
-# Enable display of unknown usernames when login failures are recorded.
-#
-# WARNING: Unknown usernames may become world readable. 
-# See #290803 and #298773 for details about how this could become a security
-# concern
-LOG_UNKFAIL_ENAB	no
-
-#
-# Enable logging of successful logins
-#
-LOG_OK_LOGINS		no
-
-#
-# Enable "syslog" logging of su activity - in addition to sulog file logging.
-# SYSLOG_SG_ENAB does the same for newgrp and sg.
-#
-SYSLOG_SU_ENAB		yes
-SYSLOG_SG_ENAB		yes
-
-#
-# If defined, all su activity is logged to this file.
-#
-#SULOG_FILE	/var/log/sulog
-
-#
-# If defined, file which maps tty line to TERM environment parameter.
-# Each line of the file is in a format something like "vt100  tty01".
-#
-#TTYTYPE_FILE	/etc/ttytype
-
-#
-# If defined, login failures will be logged here in a utmp format
-# last, when invoked as lastb, will read /var/log/btmp, so...
-#
-FTMP_FILE	/var/log/btmp
-
-#
-# If defined, the command name to display when running "su -".  For
-# example, if this is defined as "su" then a "ps" will display the
-# command is "-su".  If not defined, then "ps" would display the
-# name of the shell actually being run, e.g. something like "-sh".
-#
-SU_NAME		su
-
-#
-# If defined, file which inhibits all the usual chatter during the login
-# sequence.  If a full pathname, then hushed mode will be enabled if the
-# user's name or shell are found in the file.  If not a full pathname, then
-# hushed mode will be enabled if the file exists in the user's home directory.
-#
-HUSHLOGIN_FILE	.hushlogin
-#HUSHLOGIN_FILE	/etc/hushlogins
-
-#
-# *REQUIRED*  The default PATH settings, for superuser and normal users.
-#
-# (they are minimal, add the rest in the shell startup files)
-ENV_SUPATH	PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
-ENV_PATH	PATH=/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
-
-#
-# Terminal permissions
-#
-#	TTYGROUP	Login tty will be assigned this group ownership.
-#	TTYPERM		Login tty will be set to this permission.
-#
-# If you have a "write" program which is "setgid" to a special group
-# which owns the terminals, define TTYGROUP to the group number and
-# TTYPERM to 0620.  Otherwise leave TTYGROUP commented out and assign
-# TTYPERM to either 622 or 600.
-#
-# In Debian /usr/bin/bsd-write or similar programs are setgid tty
-# However, the default and recommended value for TTYPERM is still 0600
-# to not allow anyone to write to anyone else console or terminal
-
-# Users can still allow other people to write them by issuing 
-# the "mesg y" command.
-
-TTYGROUP	tty
-TTYPERM		0600
-
-#
-# Login configuration initializations:
-#
-#	ERASECHAR	Terminal ERASE character ('\010' = backspace).
-#	KILLCHAR	Terminal KILL character ('\025' = CTRL/U).
-#	UMASK		Default "umask" value.
-#
-# The ERASECHAR and KILLCHAR are used only on System V machines.
-# 
-# UMASK is the default umask value for pam_umask and is used by
-# useradd and newusers to set the mode of the new home directories.
-# 022 is the "historical" value in Debian for UMASK
-# 027, or even 077, could be considered better for privacy
-# There is no One True Answer here : each sysadmin must make up his/her
-# mind.
-#
-# If USERGROUPS_ENAB is set to "yes", that will modify this UMASK default value
-# for private user groups, i. e. the uid is the same as gid, and username is
-# the same as the primary group name: for these, the user permissions will be
-# used as group permissions, e. g. 022 will become 002.
-#
-# Prefix these values with "0" to get octal, "0x" to get hexadecimal.
-#
-ERASECHAR	0177
-KILLCHAR	025
-UMASK		022
-
-#
-# Password aging controls:
-#
-#	PASS_MAX_DAYS	Maximum number of days a password may be used.
-#	PASS_MIN_DAYS	Minimum number of days allowed between password changes.
-#	PASS_WARN_AGE	Number of days warning given before a password expires.
-#
-PASS_MAX_DAYS	99999
-PASS_MIN_DAYS	0
-PASS_WARN_AGE	7
-
-#
-# Min/max values for automatic uid selection in useradd
-#
-UID_MIN			 1000
-UID_MAX			60000
-# System accounts
-#SYS_UID_MIN		  100
-#SYS_UID_MAX		  999
-
-#
-# Min/max values for automatic gid selection in groupadd
-#
-GID_MIN			 1000
-GID_MAX			60000
-# System accounts
-#SYS_GID_MIN		  100
-#SYS_GID_MAX		  999
-
-#
-# Max number of login retries if password is bad. This will most likely be
-# overriden by PAM, since the default pam_unix module has it's own built
-# in of 3 retries. However, this is a safe fallback in case you are using
-# an authentication module that does not enforce PAM_MAXTRIES.
-#
-LOGIN_RETRIES		5
-
-#
-# Max time in seconds for login
-#
-LOGIN_TIMEOUT		60
-
-#
-# Which fields may be changed by regular users using chfn - use
-# any combination of letters "frwh" (full name, room number, work
-# phone, home phone).  If not defined, no changes are allowed.
-# For backward compatibility, "yes" = "rwh" and "no" = "frwh".
-# 
-CHFN_RESTRICT		rwh
-
-#
-# Should login be allowed if we can't cd to the home directory?
-# Default in no.
-#
-DEFAULT_HOME	yes
-
-#
-# If defined, this command is run when removing a user.
-# It should remove any at/cron/print jobs etc. owned by
-# the user to be removed (passed as the first argument).
-#
-#USERDEL_CMD	/usr/sbin/userdel_local
-
-#
-# Enable setting of the umask group bits to be the same as owner bits
-# (examples: 022 -> 002, 077 -> 007) for non-root users, if the uid is
-# the same as gid, and username is the same as the primary group name.
-#
-# If set to yes, userdel will remove the user´s group if it contains no
-# more members, and useradd will create by default a group with the name
-# of the user.
-#
-USERGROUPS_ENAB yes
-
-#
-# Instead of the real user shell, the program specified by this parameter
-# will be launched, although its visible name (argv[0]) will be the shell's.
-# The program may do whatever it wants (logging, additional authentification,
-# banner, ...) before running the actual shell.
-#
-# FAKE_SHELL /bin/fakeshell
-
-#
-# If defined, either full pathname of a file containing device names or
-# a ":" delimited list of device names.  Root logins will be allowed only
-# upon these devices.
-#
-# This variable is used by login and su.
-#
-#CONSOLE	/etc/consoles
-#CONSOLE	console:tty01:tty02:tty03:tty04
-
-#
-# List of groups to add to the user's supplementary group set
-# when logging in on the console (as determined by the CONSOLE
-# setting).  Default is none.
-#
-# Use with caution - it is possible for users to gain permanent
-# access to these groups, even when not logged in on the console.
-# How to do it is left as an exercise for the reader...
-#
-# This variable is used by login and su.
-#
-#CONSOLE_GROUPS		floppy:audio:cdrom
-
-#
-# If set to "yes", new passwords will be encrypted using the MD5-based
-# algorithm compatible with the one used by recent releases of FreeBSD.
-# It supports passwords of unlimited length and longer salt strings.
-# Set to "no" if you need to copy encrypted passwords to other systems
-# which don't understand the new algorithm.  Default is "no".
-#
-# This variable is deprecated. You should use ENCRYPT_METHOD.
-#
-#MD5_CRYPT_ENAB	no
-
-#
-# If set to MD5 , MD5-based algorithm will be used for encrypting password
-# If set to SHA256, SHA256-based algorithm will be used for encrypting password
-# If set to SHA512, SHA512-based algorithm will be used for encrypting password
-# If set to DES, DES-based algorithm will be used for encrypting password (default)
-# Overrides the MD5_CRYPT_ENAB option
-#
-# Note: It is recommended to use a value consistent with
-# the PAM modules configuration.
-#
-ENCRYPT_METHOD SHA512
-
-#
-# Only used if ENCRYPT_METHOD is set to SHA256 or SHA512.
-#
-# Define the number of SHA rounds.
-# With a lot of rounds, it is more difficult to brute forcing the password.
-# But note also that it more CPU resources will be needed to authenticate
-# users.
-#
-# If not specified, the libc will choose the default number of rounds (5000).
-# The values must be inside the 1000-999999999 range.
-# If only one of the MIN or MAX values is set, then this value will be used.
-# If MIN > MAX, the highest value will be used.
-#
-# SHA_CRYPT_MIN_ROUNDS 5000
-# SHA_CRYPT_MAX_ROUNDS 5000
-
-################# OBSOLETED BY PAM ##############
-#						#
-# These options are now handled by PAM. Please	#
-# edit the appropriate file in /etc/pam.d/ to	#
-# enable the equivelants of them.
-#
-###############
-
-#MOTD_FILE
-#DIALUPS_CHECK_ENAB
-#LASTLOG_ENAB
-#MAIL_CHECK_ENAB
-#OBSCURE_CHECKS_ENAB
-#PORTTIME_CHECKS_ENAB
-#SU_WHEEL_ONLY
-#CRACKLIB_DICTPATH
-#PASS_CHANGE_TRIES
-#PASS_ALWAYS_WARN
-#ENVIRON_FILE
-#NOLOGINS_FILE
-#ISSUE_FILE
-#PASS_MIN_LEN
-#PASS_MAX_LEN
-#ULIMIT
-#ENV_HZ
-#CHFN_AUTH
-#CHSH_AUTH
-#FAIL_DELAY
-
-################# OBSOLETED #######################
-#						  #
-# These options are no more handled by shadow.    #
-#                                                 #
-# Shadow utilities will display a warning if they #
-# still appear.                                   #
-#                                                 #
-###################################################
-
-# CLOSE_SESSIONS
-# LOGIN_STRING
-# NO_PASSWORD_CONSOLE
-# QMAIL_DIR
-
-
-
+# This file is managed by Salt, do not edit
+CHFN_RESTRICT        rwh
+DEFAULT_HOME         yes
+ENCRYPT_METHOD       SHA512
+ENV_PATH             PATH=/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ENV_SUPATH           PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
+ERASECHAR            0177
+GID_MAX              60000
+GID_MIN              1000
+HUSHLOGIN_FILE       .hushlogin
+KILLCHAR             025
+LOG_OK_LOGINS        no
+LOG_UNKFAIL_ENAB     no
+LOGIN_RETRIES        5
+LOGIN_TIMEOUT        60
+MAIL_DIR             /var/mail
+PASS_MAX_DAYS        90
+PASS_MIN_DAYS        7
+PASS_WARN_AGE        7
+SU_NAME              su
+SYSLOG_SG_ENAB       yes
+SYSLOG_SU_ENAB       yes
+TTYGROUP             tty
+TTYPERM              0600
+UID_MAX              60000
+UID_MIN              1000
+UMASK                022
+USERGROUPS_ENAB      yes

2019-04-30 21:45:05,490 [salt.state       :1951][INFO    ][3479] Completed state [/etc/login.defs] at time 21:45:05.490489 duration_in_ms=102.274
2019-04-30 21:45:05,495 [salt.state       :1780][INFO    ][3479] Running state [at] at time 21:45:05.495667
2019-04-30 21:45:05,495 [salt.state       :1813][INFO    ][3479] Executing state pkg.installed for [at]
2019-04-30 21:45:05,496 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2019-04-30 21:45:05,781 [salt.state       :300 ][INFO    ][3479] All specified packages are already installed
2019-04-30 21:45:05,782 [salt.state       :1951][INFO    ][3479] Completed state [at] at time 21:45:05.782011 duration_in_ms=286.344
2019-04-30 21:45:05,783 [salt.state       :1780][INFO    ][3479] Running state [atd] at time 21:45:05.783502
2019-04-30 21:45:05,783 [salt.state       :1813][INFO    ][3479] Executing state service.running for [atd]
2019-04-30 21:45:05,784 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['systemctl', 'status', 'atd.service', '-n', '0'] in directory '/root'
2019-04-30 21:45:05,802 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['systemctl', 'is-active', 'atd.service'] in directory '/root'
2019-04-30 21:45:05,812 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['systemctl', 'is-enabled', 'atd.service'] in directory '/root'
2019-04-30 21:45:05,821 [salt.state       :300 ][INFO    ][3479] The service atd is already running
2019-04-30 21:45:05,821 [salt.state       :1951][INFO    ][3479] Completed state [atd] at time 21:45:05.821463 duration_in_ms=37.961
2019-04-30 21:45:05,823 [salt.state       :1780][INFO    ][3479] Running state [cron] at time 21:45:05.823241
2019-04-30 21:45:05,823 [salt.state       :1813][INFO    ][3479] Executing state pkg.installed for [cron]
2019-04-30 21:45:05,829 [salt.state       :300 ][INFO    ][3479] All specified packages are already installed
2019-04-30 21:45:05,829 [salt.state       :1951][INFO    ][3479] Completed state [cron] at time 21:45:05.829381 duration_in_ms=6.14
2019-04-30 21:45:05,830 [salt.state       :1780][INFO    ][3479] Running state [/etc/at.allow] at time 21:45:05.830022
2019-04-30 21:45:05,830 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/at.allow]
2019-04-30 21:45:05,844 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/files/cron_users.jinja'
2019-04-30 21:45:05,850 [salt.state       :300 ][INFO    ][3479] File changed:
New file
2019-04-30 21:45:05,851 [salt.state       :1951][INFO    ][3479] Completed state [/etc/at.allow] at time 21:45:05.851027 duration_in_ms=21.005
2019-04-30 21:45:05,851 [salt.state       :1780][INFO    ][3479] Running state [/etc/at.deny] at time 21:45:05.851206
2019-04-30 21:45:05,851 [salt.state       :1813][INFO    ][3479] Executing state file.absent for [/etc/at.deny]
2019-04-30 21:45:05,851 [salt.state       :300 ][INFO    ][3479] {'removed': '/etc/at.deny'}
2019-04-30 21:45:05,851 [salt.state       :1951][INFO    ][3479] Completed state [/etc/at.deny] at time 21:45:05.851803 duration_in_ms=0.597
2019-04-30 21:45:05,852 [salt.state       :1780][INFO    ][3479] Running state [cron] at time 21:45:05.852564
2019-04-30 21:45:05,852 [salt.state       :1813][INFO    ][3479] Executing state service.running for [cron]
2019-04-30 21:45:05,853 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['systemctl', 'status', 'cron.service', '-n', '0'] in directory '/root'
2019-04-30 21:45:05,864 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['systemctl', 'is-active', 'cron.service'] in directory '/root'
2019-04-30 21:45:05,874 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['systemctl', 'is-enabled', 'cron.service'] in directory '/root'
2019-04-30 21:45:05,885 [salt.state       :300 ][INFO    ][3479] The service cron is already running
2019-04-30 21:45:05,885 [salt.state       :1951][INFO    ][3479] Completed state [cron] at time 21:45:05.885385 duration_in_ms=32.821
2019-04-30 21:45:05,886 [salt.state       :1780][INFO    ][3479] Running state [/etc/cron.allow] at time 21:45:05.886605
2019-04-30 21:45:05,886 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/cron.allow]
2019-04-30 21:45:05,903 [salt.state       :300 ][INFO    ][3479] File changed:
New file
2019-04-30 21:45:05,903 [salt.state       :1951][INFO    ][3479] Completed state [/etc/cron.allow] at time 21:45:05.903316 duration_in_ms=16.711
2019-04-30 21:45:05,903 [salt.state       :1780][INFO    ][3479] Running state [/etc/cron.deny] at time 21:45:05.903495
2019-04-30 21:45:05,903 [salt.state       :1813][INFO    ][3479] Executing state file.absent for [/etc/cron.deny]
2019-04-30 21:45:05,903 [salt.state       :300 ][INFO    ][3479] File /etc/cron.deny is not present
2019-04-30 21:45:05,904 [salt.state       :1951][INFO    ][3479] Completed state [/etc/cron.deny] at time 21:45:05.904064 duration_in_ms=0.569
2019-04-30 21:45:05,904 [salt.state       :1780][INFO    ][3479] Running state [/etc/crontab] at time 21:45:05.904726
2019-04-30 21:45:05,904 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/crontab]
2019-04-30 21:45:05,905 [salt.state       :300 ][INFO    ][3479] {'mode': '0600'}
2019-04-30 21:45:05,905 [salt.state       :1951][INFO    ][3479] Completed state [/etc/crontab] at time 21:45:05.905741 duration_in_ms=1.015
2019-04-30 21:45:05,906 [salt.state       :1780][INFO    ][3479] Running state [/etc/cron.d] at time 21:45:05.906401
2019-04-30 21:45:05,906 [salt.state       :1813][INFO    ][3479] Executing state file.directory for [/etc/cron.d]
2019-04-30 21:45:05,907 [salt.state       :300 ][INFO    ][3479] {'mode': '0600'}
2019-04-30 21:45:05,907 [salt.state       :1951][INFO    ][3479] Completed state [/etc/cron.d] at time 21:45:05.907595 duration_in_ms=1.194
2019-04-30 21:45:05,908 [salt.state       :1780][INFO    ][3479] Running state [/etc/cron.daily] at time 21:45:05.908229
2019-04-30 21:45:05,908 [salt.state       :1813][INFO    ][3479] Executing state file.directory for [/etc/cron.daily]
2019-04-30 21:45:05,915 [salt.state       :300 ][INFO    ][3479] {'mode': '0600'}
2019-04-30 21:45:05,915 [salt.state       :1951][INFO    ][3479] Completed state [/etc/cron.daily] at time 21:45:05.915770 duration_in_ms=7.542
2019-04-30 21:45:05,916 [salt.state       :1780][INFO    ][3479] Running state [/etc/cron.hourly] at time 21:45:05.916417
2019-04-30 21:45:05,916 [salt.state       :1813][INFO    ][3479] Executing state file.directory for [/etc/cron.hourly]
2019-04-30 21:45:05,921 [salt.state       :300 ][INFO    ][3479] {'mode': '0600'}
2019-04-30 21:45:05,921 [salt.state       :1951][INFO    ][3479] Completed state [/etc/cron.hourly] at time 21:45:05.921360 duration_in_ms=4.943
2019-04-30 21:45:05,922 [salt.state       :1780][INFO    ][3479] Running state [/etc/cron.monthly] at time 21:45:05.922003
2019-04-30 21:45:05,922 [salt.state       :1813][INFO    ][3479] Executing state file.directory for [/etc/cron.monthly]
2019-04-30 21:45:05,923 [salt.state       :300 ][INFO    ][3479] {'mode': '0600'}
2019-04-30 21:45:05,923 [salt.state       :1951][INFO    ][3479] Completed state [/etc/cron.monthly] at time 21:45:05.923142 duration_in_ms=1.139
2019-04-30 21:45:05,923 [salt.state       :1780][INFO    ][3479] Running state [/etc/cron.weekly] at time 21:45:05.923794
2019-04-30 21:45:05,923 [salt.state       :1813][INFO    ][3479] Executing state file.directory for [/etc/cron.weekly]
2019-04-30 21:45:05,928 [salt.state       :300 ][INFO    ][3479] {'mode': '0600'}
2019-04-30 21:45:05,928 [salt.state       :1951][INFO    ][3479] Completed state [/etc/cron.weekly] at time 21:45:05.928591 duration_in_ms=4.797
2019-04-30 21:45:05,930 [salt.state       :1780][INFO    ][3479] Running state [/etc/apt/apt.conf.d/99prefer_ipv4-salt] at time 21:45:05.930741
2019-04-30 21:45:05,930 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/apt/apt.conf.d/99prefer_ipv4-salt]
2019-04-30 21:45:06,126 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/files/apt.conf'
2019-04-30 21:45:06,135 [salt.state       :300 ][INFO    ][3479] File changed:
New file
2019-04-30 21:45:06,135 [salt.state       :1951][INFO    ][3479] Completed state [/etc/apt/apt.conf.d/99prefer_ipv4-salt] at time 21:45:06.135164 duration_in_ms=204.423
2019-04-30 21:45:06,135 [salt.state       :1780][INFO    ][3479] Running state [/etc/apt/apt.conf.d/99allow_downgrades-salt] at time 21:45:06.135368
2019-04-30 21:45:06,135 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/apt/apt.conf.d/99allow_downgrades-salt]
2019-04-30 21:45:06,150 [salt.state       :300 ][INFO    ][3479] File changed:
New file
2019-04-30 21:45:06,151 [salt.state       :1951][INFO    ][3479] Completed state [/etc/apt/apt.conf.d/99allow_downgrades-salt] at time 21:45:06.150974 duration_in_ms=15.606
2019-04-30 21:45:06,152 [salt.state       :1780][INFO    ][3479] Running state [linux_repo_prereq_pkgs] at time 21:45:06.152124
2019-04-30 21:45:06,152 [salt.state       :1813][INFO    ][3479] Executing state pkg.installed for [linux_repo_prereq_pkgs]
2019-04-30 21:45:06,158 [salt.state       :300 ][INFO    ][3479] All specified packages are already installed
2019-04-30 21:45:06,158 [salt.state       :1951][INFO    ][3479] Completed state [linux_repo_prereq_pkgs] at time 21:45:06.158667 duration_in_ms=6.543
2019-04-30 21:45:06,158 [salt.state       :1780][INFO    ][3479] Running state [/etc/apt/apt.conf.d/99proxies-salt] at time 21:45:06.158851
2019-04-30 21:45:06,159 [salt.state       :1813][INFO    ][3479] Executing state file.absent for [/etc/apt/apt.conf.d/99proxies-salt]
2019-04-30 21:45:06,159 [salt.state       :300 ][INFO    ][3479] File /etc/apt/apt.conf.d/99proxies-salt is not present
2019-04-30 21:45:06,159 [salt.state       :1951][INFO    ][3479] Completed state [/etc/apt/apt.conf.d/99proxies-salt] at time 21:45:06.159385 duration_in_ms=0.534
2019-04-30 21:45:06,159 [salt.state       :1780][INFO    ][3479] Running state [/etc/apt/apt.conf.d/99proxies-salt-mirantis_openstack] at time 21:45:06.159542
2019-04-30 21:45:06,159 [salt.state       :1813][INFO    ][3479] Executing state file.absent for [/etc/apt/apt.conf.d/99proxies-salt-mirantis_openstack]
2019-04-30 21:45:06,159 [salt.state       :300 ][INFO    ][3479] File /etc/apt/apt.conf.d/99proxies-salt-mirantis_openstack is not present
2019-04-30 21:45:06,160 [salt.state       :1951][INFO    ][3479] Completed state [/etc/apt/apt.conf.d/99proxies-salt-mirantis_openstack] at time 21:45:06.160051 duration_in_ms=0.509
2019-04-30 21:45:06,160 [salt.state       :1780][INFO    ][3479] Running state [/etc/apt/preferences.d/mirantis_openstack] at time 21:45:06.160207
2019-04-30 21:45:06,160 [salt.state       :1813][INFO    ][3479] Executing state file.absent for [/etc/apt/preferences.d/mirantis_openstack]
2019-04-30 21:45:06,160 [salt.state       :300 ][INFO    ][3479] File /etc/apt/preferences.d/mirantis_openstack is not present
2019-04-30 21:45:06,160 [salt.state       :1951][INFO    ][3479] Completed state [/etc/apt/preferences.d/mirantis_openstack] at time 21:45:06.160716 duration_in_ms=0.509
2019-04-30 21:45:06,160 [salt.state       :1780][INFO    ][3479] Running state [echo 'LS0tLS1CRUdJTiBQR1AgUFVCTElDIEtFWSBCTE9DSy0tLS0tClZlcnNpb246IEdudVBHIHYxCgptUUVOQkZXdDhvZ0JDQUN0VC9qNFdNR3VoRUk0ODZWdjl6VlYwR1dHZWZIRTVoQmxnSlNqU2dyRXhMRnFRMkZvClNjYUFCQ2Z2elVldVhITm9oL2MyZUxqeDNZRTZvRnJkaXc1dGFtME5GbFpNTStQU3VmY2lUeFF6OHZyWEhHeDcKVkI1cmcyVFhLb3FPdjljVzY5MEZzUkFlT3RLVHRCeFp2WVZUTEVQbjJHSlcwOVh5OUNCYStuMjNYQkhUQnZLcwpqM2h4a24yNU95NzBXZ3hrL0JKcXB5blhHbm8rTnp1QW5JYmIrZitYN2k2ZmlYd3J2dHA1ek9ZT0plVXdTK2ZVCklNL21YYmV0T2Qvc0h0SnFjOU5VWXBUaXA0bkVsRXFBWVJDc1hEVGJ1TU5kelNyOFZsU01NOGI2MW1CR2VsTEgKWEplK0VQUCtMb2djNUtYTzhhZG9HZ1docWxiRDZuN3creW5IQUJFQkFBRzBMbVoxWld3dGFXNW1jbUVnS0VWNApZVzF3YkdVZ2EyVjVLU0E4WkdWMmIzQnpRRzFwY21GdWRHbHpMbU52YlQ2SkFUZ0VFd0VDQUNJRkFsV3Q4b2dDCkd3TUdDd2tJQndNQ0JoVUlBZ2tLQ3dRV0FnTUJBaDRCQWhlQUFBb0pFTHpsekVZZm9pc0lrdVFJQUpsMGNGSjUKQlNLTVhIaFJZZjBCZUR6aGRoM3BtY09Ycy9qU3puVEl4QjRPRTVPZHdyTWdLeW9Ja1NJUDhBRXR0dkIrQnVPdgpCSG1oVEw3a3ZSaFA1eGlLZGJDd21EdG9FUm9hcXhoUlJiWkpjSitwSHZsN21rRXU4R2oyS1plMmxmRTRaNlpGCjZxMDBHeDlIWWZzZTErVmdVUjV5bWg0MW5aQ3ZSVE5FbllCcDFSUWNQb2dpTHkycll2WmJ4WW5VdGc0amFEN0QKdnV1RVF3cmZFSGRLRlVsV0JDSVZibCtlM0s2WlNuaU9jcXF5SEs3Mi9ISTBTWXVacEdmQ3p6dzVkZU9EY2pXbQpHejRuWnI0MWNCM2VIWGtmbUczbmdkaG1iMk1wVnI4M3UrSmViT292anp1c2Y3MW9JZFpCVEZOWXNaTlNWS3JuCmwwcnJSdURJTUhiUU11UzVBUTBFVmEzeWlBRUlBTFpxZExHWFNHWkFnVVhsN3poUEg1d25JUXRkbzZpTUlvdloKelFOVzk1UkRUMm5tLzNZZGRpUnk2RnVPVGJhSFh3MDdENFpVbDRkR1ZIekV3QmxsaFVMeGNIVjNPT2RRM2dWcAo0bUJBWjhrdjBFZWx6cVBmRFFXUjJDcTBoaTdJSjRRNGVQcFpoUUZpYXN6OHFiVjdEN0NZYlpkREFtUUt4cUFrCjBYWU9qYkIzanpCMnI2TUhmbEFLbUp6VHAzK05BRTliRExBd1hhMG90MlRIRGJwUGRCNFI2cHhwRDZZM2p3ZVcKdUxVQ25JZnZ5SUJ3aEhvYmFVMjhwdy9CQSswZGtDOWpuTG5vTytUcnpCOVlENTgzOUxjM2N0cmRQQkxpRlBNRwp3ZGZBVlJDeWZnTGpPeVVMcWpUdWR4MU1vK0RnejkreHJjVEZvZWhJN1VZb1pucmFFS2tBRVFFQUFZa0JId1FZCkFRSUFDUVVDVmEzeWlBSWJEQUFLQ1JDODVjeEdINklyQ1BINUIvMFVjK09oTVNDa1JvczFZdjV0QTRic0VjanQKOCtzSjJTNnBVcUNiWnhtWHB6S3NwS3BuanAzREpqbVFLREIycTRVUERWRWxWRE1NZEJsc3RUeDFSUlpEZjh5awpuRHZSQlN6YXdrN1hoZmxvcm84TjJMeHY2Z1doaE12SFVZSXR5TzZLTWJBWnVaMk0xSTEvT0ZIRy9mLy83b1BNCjBRcE5iaWhmK0dxRS9kV1J6OVpEeit4bFNGbGk2QVIvM2xkcTdONmdrQ3NFRmRpM2o2WkRmMHFMc1pwYXpQVUkKd2lDQy9hQVlMa1JEdFRKVjFHNkVzV2lqbU9UTk5sQ0VGUy9YRExRM04yRXYvMXNnQU8wQWxCTWRYcVNucVVJMQoxaC9lU0tDaUdta3dGV2xDZi80SG5KVlA3UXBTZVJQTHl3Nzg1RnZ0M3A5dlQrNjRpc1owWks2Y3BjajgKPTBhUUQKLS0tLS1FTkQgUEdQIFBVQkxJQyBLRVkgQkxPQ0stLS0tLQotLS0tLUJFR0lOIFBHUCBQVUJMSUMgS0VZIEJMT0NLLS0tLS0KVmVyc2lvbjogR251UEcgdjEKCm1RRU5CRnRZVlk4QkNBQzNvbGk5M2h1c0cwWlZ0di9MOEk0L2JjVzYwTEZDeUIwRHV3RXpuR2xTYWoxZmpPUXUKQzdRWDl3dkdScThtUlo4bWZaNnNieEdtZ3MwTG5WNVFJQmxlMWw1STNCK0FNR2tzZjZVR0VXZ29OL3ZxODZnKwowSmc2a0pQL0Qwc2pHWHZkbGZ5K2JnQXFqc3gyYldPTGpRR3RIU0l4aGU0Y0U5SFBCZk1pWXNGd0dRdWEzWE4zCnRpR0tjaWZzenZEQTZ1cWRqUzZEdVRFUEN6eUtpU3lVZXZuV3RCaDBvVXRVdC8vWDRsRzJNeDBsVTkxdVVRR2oKS2VaK2ZZWE9McWdabS9GeExWVDV3M2cvVUdLOUNiejVoNGtHQ0pPZmswRXdJWnAwSVJSczFwaE9DNmdWTXdvVgp5V0tDdGRIbWc3T2I4STRBWjhPVzVISm4xVVBIVHByeGNIQm5BQkVCQUFHMExFRjFkRzlpZFdsc1pHVnlJRHhwCmJtWnlZU3RoZFhScFluVnBiR1JsY2tCdGFYSmhiblJwY3k1amIyMCtpUUU0QkJNQkFnQWlCUUpiV0ZXUEFoc0QKQmdzSkNBY0RBZ1lWQ0FJSkNnc0VGZ0lEQVFJZUFRSVhnQUFLQ1JDUlpWcDVURktKNzBjSkIvOUFyV3JTRnlFeApxczdUeW85TTVXQ1BqcXc3eTJGN2pkNEV0M2hxd2M1ang2S2x4R3BnMTdTSHQ0b1djbXRNTDNWQngremlCQWkwCjVSeTRaNHcwUXFGVzZnQXFRZXBlVzc2WXEvT1A1U29xRUk5c1V3ekxmVVk3cmFLL1AxYnV2WEIxZVpoNG1NdzQKVEZmNEhnbzh5VVEzZ2VZTm5VQkJmYVNma21peUJKR3NNWEJmVzJ6aGxwVkl5QjZDeWU1UjgyM0Z4R05KZStsaQpoZ2dOQ1FuS1lxckd0cjU1Uk82eFlJMXY4OWNnR3JPMkVWd1BrRkxBL01VblFFYjQzM0NrK3NqcDFOWkRVZnVKClUzZ2c4UzBoVCtDZjVYaWtuVC94cUloaFRZL0t6bE5teW5adC81MUR6WnpzYk0rUk82SlpGWUpMMkx1QzY5Z0IKK1I1anJtYUd1OWZHCj1zcUluCi0tLS0tRU5EIFBHUCBQVUJMSUMgS0VZIEJMT0NLLS0tLS0=' | base64 -d | apt-key add -] at time 21:45:06.160886
2019-04-30 21:45:06,161 [salt.state       :1813][INFO    ][3479] Executing state cmd.run for [echo 'LS0tLS1CRUdJTiBQR1AgUFVCTElDIEtFWSBCTE9DSy0tLS0tClZlcnNpb246IEdudVBHIHYxCgptUUVOQkZXdDhvZ0JDQUN0VC9qNFdNR3VoRUk0ODZWdjl6VlYwR1dHZWZIRTVoQmxnSlNqU2dyRXhMRnFRMkZvClNjYUFCQ2Z2elVldVhITm9oL2MyZUxqeDNZRTZvRnJkaXc1dGFtME5GbFpNTStQU3VmY2lUeFF6OHZyWEhHeDcKVkI1cmcyVFhLb3FPdjljVzY5MEZzUkFlT3RLVHRCeFp2WVZUTEVQbjJHSlcwOVh5OUNCYStuMjNYQkhUQnZLcwpqM2h4a24yNU95NzBXZ3hrL0JKcXB5blhHbm8rTnp1QW5JYmIrZitYN2k2ZmlYd3J2dHA1ek9ZT0plVXdTK2ZVCklNL21YYmV0T2Qvc0h0SnFjOU5VWXBUaXA0bkVsRXFBWVJDc1hEVGJ1TU5kelNyOFZsU01NOGI2MW1CR2VsTEgKWEplK0VQUCtMb2djNUtYTzhhZG9HZ1docWxiRDZuN3creW5IQUJFQkFBRzBMbVoxWld3dGFXNW1jbUVnS0VWNApZVzF3YkdVZ2EyVjVLU0E4WkdWMmIzQnpRRzFwY21GdWRHbHpMbU52YlQ2SkFUZ0VFd0VDQUNJRkFsV3Q4b2dDCkd3TUdDd2tJQndNQ0JoVUlBZ2tLQ3dRV0FnTUJBaDRCQWhlQUFBb0pFTHpsekVZZm9pc0lrdVFJQUpsMGNGSjUKQlNLTVhIaFJZZjBCZUR6aGRoM3BtY09Ycy9qU3puVEl4QjRPRTVPZHdyTWdLeW9Ja1NJUDhBRXR0dkIrQnVPdgpCSG1oVEw3a3ZSaFA1eGlLZGJDd21EdG9FUm9hcXhoUlJiWkpjSitwSHZsN21rRXU4R2oyS1plMmxmRTRaNlpGCjZxMDBHeDlIWWZzZTErVmdVUjV5bWg0MW5aQ3ZSVE5FbllCcDFSUWNQb2dpTHkycll2WmJ4WW5VdGc0amFEN0QKdnV1RVF3cmZFSGRLRlVsV0JDSVZibCtlM0s2WlNuaU9jcXF5SEs3Mi9ISTBTWXVacEdmQ3p6dzVkZU9EY2pXbQpHejRuWnI0MWNCM2VIWGtmbUczbmdkaG1iMk1wVnI4M3UrSmViT292anp1c2Y3MW9JZFpCVEZOWXNaTlNWS3JuCmwwcnJSdURJTUhiUU11UzVBUTBFVmEzeWlBRUlBTFpxZExHWFNHWkFnVVhsN3poUEg1d25JUXRkbzZpTUlvdloKelFOVzk1UkRUMm5tLzNZZGRpUnk2RnVPVGJhSFh3MDdENFpVbDRkR1ZIekV3QmxsaFVMeGNIVjNPT2RRM2dWcAo0bUJBWjhrdjBFZWx6cVBmRFFXUjJDcTBoaTdJSjRRNGVQcFpoUUZpYXN6OHFiVjdEN0NZYlpkREFtUUt4cUFrCjBYWU9qYkIzanpCMnI2TUhmbEFLbUp6VHAzK05BRTliRExBd1hhMG90MlRIRGJwUGRCNFI2cHhwRDZZM2p3ZVcKdUxVQ25JZnZ5SUJ3aEhvYmFVMjhwdy9CQSswZGtDOWpuTG5vTytUcnpCOVlENTgzOUxjM2N0cmRQQkxpRlBNRwp3ZGZBVlJDeWZnTGpPeVVMcWpUdWR4MU1vK0RnejkreHJjVEZvZWhJN1VZb1pucmFFS2tBRVFFQUFZa0JId1FZCkFRSUFDUVVDVmEzeWlBSWJEQUFLQ1JDODVjeEdINklyQ1BINUIvMFVjK09oTVNDa1JvczFZdjV0QTRic0VjanQKOCtzSjJTNnBVcUNiWnhtWHB6S3NwS3BuanAzREpqbVFLREIycTRVUERWRWxWRE1NZEJsc3RUeDFSUlpEZjh5awpuRHZSQlN6YXdrN1hoZmxvcm84TjJMeHY2Z1doaE12SFVZSXR5TzZLTWJBWnVaMk0xSTEvT0ZIRy9mLy83b1BNCjBRcE5iaWhmK0dxRS9kV1J6OVpEeit4bFNGbGk2QVIvM2xkcTdONmdrQ3NFRmRpM2o2WkRmMHFMc1pwYXpQVUkKd2lDQy9hQVlMa1JEdFRKVjFHNkVzV2lqbU9UTk5sQ0VGUy9YRExRM04yRXYvMXNnQU8wQWxCTWRYcVNucVVJMQoxaC9lU0tDaUdta3dGV2xDZi80SG5KVlA3UXBTZVJQTHl3Nzg1RnZ0M3A5dlQrNjRpc1owWks2Y3BjajgKPTBhUUQKLS0tLS1FTkQgUEdQIFBVQkxJQyBLRVkgQkxPQ0stLS0tLQotLS0tLUJFR0lOIFBHUCBQVUJMSUMgS0VZIEJMT0NLLS0tLS0KVmVyc2lvbjogR251UEcgdjEKCm1RRU5CRnRZVlk4QkNBQzNvbGk5M2h1c0cwWlZ0di9MOEk0L2JjVzYwTEZDeUIwRHV3RXpuR2xTYWoxZmpPUXUKQzdRWDl3dkdScThtUlo4bWZaNnNieEdtZ3MwTG5WNVFJQmxlMWw1STNCK0FNR2tzZjZVR0VXZ29OL3ZxODZnKwowSmc2a0pQL0Qwc2pHWHZkbGZ5K2JnQXFqc3gyYldPTGpRR3RIU0l4aGU0Y0U5SFBCZk1pWXNGd0dRdWEzWE4zCnRpR0tjaWZzenZEQTZ1cWRqUzZEdVRFUEN6eUtpU3lVZXZuV3RCaDBvVXRVdC8vWDRsRzJNeDBsVTkxdVVRR2oKS2VaK2ZZWE9McWdabS9GeExWVDV3M2cvVUdLOUNiejVoNGtHQ0pPZmswRXdJWnAwSVJSczFwaE9DNmdWTXdvVgp5V0tDdGRIbWc3T2I4STRBWjhPVzVISm4xVVBIVHByeGNIQm5BQkVCQUFHMExFRjFkRzlpZFdsc1pHVnlJRHhwCmJtWnlZU3RoZFhScFluVnBiR1JsY2tCdGFYSmhiblJwY3k1amIyMCtpUUU0QkJNQkFnQWlCUUpiV0ZXUEFoc0QKQmdzSkNBY0RBZ1lWQ0FJSkNnc0VGZ0lEQVFJZUFRSVhnQUFLQ1JDUlpWcDVURktKNzBjSkIvOUFyV3JTRnlFeApxczdUeW85TTVXQ1BqcXc3eTJGN2pkNEV0M2hxd2M1ang2S2x4R3BnMTdTSHQ0b1djbXRNTDNWQngremlCQWkwCjVSeTRaNHcwUXFGVzZnQXFRZXBlVzc2WXEvT1A1U29xRUk5c1V3ekxmVVk3cmFLL1AxYnV2WEIxZVpoNG1NdzQKVEZmNEhnbzh5VVEzZ2VZTm5VQkJmYVNma21peUJKR3NNWEJmVzJ6aGxwVkl5QjZDeWU1UjgyM0Z4R05KZStsaQpoZ2dOQ1FuS1lxckd0cjU1Uk82eFlJMXY4OWNnR3JPMkVWd1BrRkxBL01VblFFYjQzM0NrK3NqcDFOWkRVZnVKClUzZ2c4UzBoVCtDZjVYaWtuVC94cUloaFRZL0t6bE5teW5adC81MUR6WnpzYk0rUk82SlpGWUpMMkx1QzY5Z0IKK1I1anJtYUd1OWZHCj1zcUluCi0tLS0tRU5EIFBHUCBQVUJMSUMgS0VZIEJMT0NLLS0tLS0=' | base64 -d | apt-key add -]
2019-04-30 21:45:06,161 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'echo 'LS0tLS1CRUdJTiBQR1AgUFVCTElDIEtFWSBCTE9DSy0tLS0tClZlcnNpb246IEdudVBHIHYxCgptUUVOQkZXdDhvZ0JDQUN0VC9qNFdNR3VoRUk0ODZWdjl6VlYwR1dHZWZIRTVoQmxnSlNqU2dyRXhMRnFRMkZvClNjYUFCQ2Z2elVldVhITm9oL2MyZUxqeDNZRTZvRnJkaXc1dGFtME5GbFpNTStQU3VmY2lUeFF6OHZyWEhHeDcKVkI1cmcyVFhLb3FPdjljVzY5MEZzUkFlT3RLVHRCeFp2WVZUTEVQbjJHSlcwOVh5OUNCYStuMjNYQkhUQnZLcwpqM2h4a24yNU95NzBXZ3hrL0JKcXB5blhHbm8rTnp1QW5JYmIrZitYN2k2ZmlYd3J2dHA1ek9ZT0plVXdTK2ZVCklNL21YYmV0T2Qvc0h0SnFjOU5VWXBUaXA0bkVsRXFBWVJDc1hEVGJ1TU5kelNyOFZsU01NOGI2MW1CR2VsTEgKWEplK0VQUCtMb2djNUtYTzhhZG9HZ1docWxiRDZuN3creW5IQUJFQkFBRzBMbVoxWld3dGFXNW1jbUVnS0VWNApZVzF3YkdVZ2EyVjVLU0E4WkdWMmIzQnpRRzFwY21GdWRHbHpMbU52YlQ2SkFUZ0VFd0VDQUNJRkFsV3Q4b2dDCkd3TUdDd2tJQndNQ0JoVUlBZ2tLQ3dRV0FnTUJBaDRCQWhlQUFBb0pFTHpsekVZZm9pc0lrdVFJQUpsMGNGSjUKQlNLTVhIaFJZZjBCZUR6aGRoM3BtY09Ycy9qU3puVEl4QjRPRTVPZHdyTWdLeW9Ja1NJUDhBRXR0dkIrQnVPdgpCSG1oVEw3a3ZSaFA1eGlLZGJDd21EdG9FUm9hcXhoUlJiWkpjSitwSHZsN21rRXU4R2oyS1plMmxmRTRaNlpGCjZxMDBHeDlIWWZzZTErVmdVUjV5bWg0MW5aQ3ZSVE5FbllCcDFSUWNQb2dpTHkycll2WmJ4WW5VdGc0amFEN0QKdnV1RVF3cmZFSGRLRlVsV0JDSVZibCtlM0s2WlNuaU9jcXF5SEs3Mi9ISTBTWXVacEdmQ3p6dzVkZU9EY2pXbQpHejRuWnI0MWNCM2VIWGtmbUczbmdkaG1iMk1wVnI4M3UrSmViT292anp1c2Y3MW9JZFpCVEZOWXNaTlNWS3JuCmwwcnJSdURJTUhiUU11UzVBUTBFVmEzeWlBRUlBTFpxZExHWFNHWkFnVVhsN3poUEg1d25JUXRkbzZpTUlvdloKelFOVzk1UkRUMm5tLzNZZGRpUnk2RnVPVGJhSFh3MDdENFpVbDRkR1ZIekV3QmxsaFVMeGNIVjNPT2RRM2dWcAo0bUJBWjhrdjBFZWx6cVBmRFFXUjJDcTBoaTdJSjRRNGVQcFpoUUZpYXN6OHFiVjdEN0NZYlpkREFtUUt4cUFrCjBYWU9qYkIzanpCMnI2TUhmbEFLbUp6VHAzK05BRTliRExBd1hhMG90MlRIRGJwUGRCNFI2cHhwRDZZM2p3ZVcKdUxVQ25JZnZ5SUJ3aEhvYmFVMjhwdy9CQSswZGtDOWpuTG5vTytUcnpCOVlENTgzOUxjM2N0cmRQQkxpRlBNRwp3ZGZBVlJDeWZnTGpPeVVMcWpUdWR4MU1vK0RnejkreHJjVEZvZWhJN1VZb1pucmFFS2tBRVFFQUFZa0JId1FZCkFRSUFDUVVDVmEzeWlBSWJEQUFLQ1JDODVjeEdINklyQ1BINUIvMFVjK09oTVNDa1JvczFZdjV0QTRic0VjanQKOCtzSjJTNnBVcUNiWnhtWHB6S3NwS3BuanAzREpqbVFLREIycTRVUERWRWxWRE1NZEJsc3RUeDFSUlpEZjh5awpuRHZSQlN6YXdrN1hoZmxvcm84TjJMeHY2Z1doaE12SFVZSXR5TzZLTWJBWnVaMk0xSTEvT0ZIRy9mLy83b1BNCjBRcE5iaWhmK0dxRS9kV1J6OVpEeit4bFNGbGk2QVIvM2xkcTdONmdrQ3NFRmRpM2o2WkRmMHFMc1pwYXpQVUkKd2lDQy9hQVlMa1JEdFRKVjFHNkVzV2lqbU9UTk5sQ0VGUy9YRExRM04yRXYvMXNnQU8wQWxCTWRYcVNucVVJMQoxaC9lU0tDaUdta3dGV2xDZi80SG5KVlA3UXBTZVJQTHl3Nzg1RnZ0M3A5dlQrNjRpc1owWks2Y3BjajgKPTBhUUQKLS0tLS1FTkQgUEdQIFBVQkxJQyBLRVkgQkxPQ0stLS0tLQotLS0tLUJFR0lOIFBHUCBQVUJMSUMgS0VZIEJMT0NLLS0tLS0KVmVyc2lvbjogR251UEcgdjEKCm1RRU5CRnRZVlk4QkNBQzNvbGk5M2h1c0cwWlZ0di9MOEk0L2JjVzYwTEZDeUIwRHV3RXpuR2xTYWoxZmpPUXUKQzdRWDl3dkdScThtUlo4bWZaNnNieEdtZ3MwTG5WNVFJQmxlMWw1STNCK0FNR2tzZjZVR0VXZ29OL3ZxODZnKwowSmc2a0pQL0Qwc2pHWHZkbGZ5K2JnQXFqc3gyYldPTGpRR3RIU0l4aGU0Y0U5SFBCZk1pWXNGd0dRdWEzWE4zCnRpR0tjaWZzenZEQTZ1cWRqUzZEdVRFUEN6eUtpU3lVZXZuV3RCaDBvVXRVdC8vWDRsRzJNeDBsVTkxdVVRR2oKS2VaK2ZZWE9McWdabS9GeExWVDV3M2cvVUdLOUNiejVoNGtHQ0pPZmswRXdJWnAwSVJSczFwaE9DNmdWTXdvVgp5V0tDdGRIbWc3T2I4STRBWjhPVzVISm4xVVBIVHByeGNIQm5BQkVCQUFHMExFRjFkRzlpZFdsc1pHVnlJRHhwCmJtWnlZU3RoZFhScFluVnBiR1JsY2tCdGFYSmhiblJwY3k1amIyMCtpUUU0QkJNQkFnQWlCUUpiV0ZXUEFoc0QKQmdzSkNBY0RBZ1lWQ0FJSkNnc0VGZ0lEQVFJZUFRSVhnQUFLQ1JDUlpWcDVURktKNzBjSkIvOUFyV3JTRnlFeApxczdUeW85TTVXQ1BqcXc3eTJGN2pkNEV0M2hxd2M1ang2S2x4R3BnMTdTSHQ0b1djbXRNTDNWQngremlCQWkwCjVSeTRaNHcwUXFGVzZnQXFRZXBlVzc2WXEvT1A1U29xRUk5c1V3ekxmVVk3cmFLL1AxYnV2WEIxZVpoNG1NdzQKVEZmNEhnbzh5VVEzZ2VZTm5VQkJmYVNma21peUJKR3NNWEJmVzJ6aGxwVkl5QjZDeWU1UjgyM0Z4R05KZStsaQpoZ2dOQ1FuS1lxckd0cjU1Uk82eFlJMXY4OWNnR3JPMkVWd1BrRkxBL01VblFFYjQzM0NrK3NqcDFOWkRVZnVKClUzZ2c4UzBoVCtDZjVYaWtuVC94cUloaFRZL0t6bE5teW5adC81MUR6WnpzYk0rUk82SlpGWUpMMkx1QzY5Z0IKK1I1anJtYUd1OWZHCj1zcUluCi0tLS0tRU5EIFBHUCBQVUJMSUMgS0VZIEJMT0NLLS0tLS0=' | base64 -d | apt-key add -' in directory '/root'
2019-04-30 21:45:06,799 [salt.state       :300 ][INFO    ][3479] {'pid': 3594, 'retcode': 0, 'stderr': '', 'stdout': 'OK'}
2019-04-30 21:45:06,800 [salt.state       :1951][INFO    ][3479] Completed state [echo 'LS0tLS1CRUdJTiBQR1AgUFVCTElDIEtFWSBCTE9DSy0tLS0tClZlcnNpb246IEdudVBHIHYxCgptUUVOQkZXdDhvZ0JDQUN0VC9qNFdNR3VoRUk0ODZWdjl6VlYwR1dHZWZIRTVoQmxnSlNqU2dyRXhMRnFRMkZvClNjYUFCQ2Z2elVldVhITm9oL2MyZUxqeDNZRTZvRnJkaXc1dGFtME5GbFpNTStQU3VmY2lUeFF6OHZyWEhHeDcKVkI1cmcyVFhLb3FPdjljVzY5MEZzUkFlT3RLVHRCeFp2WVZUTEVQbjJHSlcwOVh5OUNCYStuMjNYQkhUQnZLcwpqM2h4a24yNU95NzBXZ3hrL0JKcXB5blhHbm8rTnp1QW5JYmIrZitYN2k2ZmlYd3J2dHA1ek9ZT0plVXdTK2ZVCklNL21YYmV0T2Qvc0h0SnFjOU5VWXBUaXA0bkVsRXFBWVJDc1hEVGJ1TU5kelNyOFZsU01NOGI2MW1CR2VsTEgKWEplK0VQUCtMb2djNUtYTzhhZG9HZ1docWxiRDZuN3creW5IQUJFQkFBRzBMbVoxWld3dGFXNW1jbUVnS0VWNApZVzF3YkdVZ2EyVjVLU0E4WkdWMmIzQnpRRzFwY21GdWRHbHpMbU52YlQ2SkFUZ0VFd0VDQUNJRkFsV3Q4b2dDCkd3TUdDd2tJQndNQ0JoVUlBZ2tLQ3dRV0FnTUJBaDRCQWhlQUFBb0pFTHpsekVZZm9pc0lrdVFJQUpsMGNGSjUKQlNLTVhIaFJZZjBCZUR6aGRoM3BtY09Ycy9qU3puVEl4QjRPRTVPZHdyTWdLeW9Ja1NJUDhBRXR0dkIrQnVPdgpCSG1oVEw3a3ZSaFA1eGlLZGJDd21EdG9FUm9hcXhoUlJiWkpjSitwSHZsN21rRXU4R2oyS1plMmxmRTRaNlpGCjZxMDBHeDlIWWZzZTErVmdVUjV5bWg0MW5aQ3ZSVE5FbllCcDFSUWNQb2dpTHkycll2WmJ4WW5VdGc0amFEN0QKdnV1RVF3cmZFSGRLRlVsV0JDSVZibCtlM0s2WlNuaU9jcXF5SEs3Mi9ISTBTWXVacEdmQ3p6dzVkZU9EY2pXbQpHejRuWnI0MWNCM2VIWGtmbUczbmdkaG1iMk1wVnI4M3UrSmViT292anp1c2Y3MW9JZFpCVEZOWXNaTlNWS3JuCmwwcnJSdURJTUhiUU11UzVBUTBFVmEzeWlBRUlBTFpxZExHWFNHWkFnVVhsN3poUEg1d25JUXRkbzZpTUlvdloKelFOVzk1UkRUMm5tLzNZZGRpUnk2RnVPVGJhSFh3MDdENFpVbDRkR1ZIekV3QmxsaFVMeGNIVjNPT2RRM2dWcAo0bUJBWjhrdjBFZWx6cVBmRFFXUjJDcTBoaTdJSjRRNGVQcFpoUUZpYXN6OHFiVjdEN0NZYlpkREFtUUt4cUFrCjBYWU9qYkIzanpCMnI2TUhmbEFLbUp6VHAzK05BRTliRExBd1hhMG90MlRIRGJwUGRCNFI2cHhwRDZZM2p3ZVcKdUxVQ25JZnZ5SUJ3aEhvYmFVMjhwdy9CQSswZGtDOWpuTG5vTytUcnpCOVlENTgzOUxjM2N0cmRQQkxpRlBNRwp3ZGZBVlJDeWZnTGpPeVVMcWpUdWR4MU1vK0RnejkreHJjVEZvZWhJN1VZb1pucmFFS2tBRVFFQUFZa0JId1FZCkFRSUFDUVVDVmEzeWlBSWJEQUFLQ1JDODVjeEdINklyQ1BINUIvMFVjK09oTVNDa1JvczFZdjV0QTRic0VjanQKOCtzSjJTNnBVcUNiWnhtWHB6S3NwS3BuanAzREpqbVFLREIycTRVUERWRWxWRE1NZEJsc3RUeDFSUlpEZjh5awpuRHZSQlN6YXdrN1hoZmxvcm84TjJMeHY2Z1doaE12SFVZSXR5TzZLTWJBWnVaMk0xSTEvT0ZIRy9mLy83b1BNCjBRcE5iaWhmK0dxRS9kV1J6OVpEeit4bFNGbGk2QVIvM2xkcTdONmdrQ3NFRmRpM2o2WkRmMHFMc1pwYXpQVUkKd2lDQy9hQVlMa1JEdFRKVjFHNkVzV2lqbU9UTk5sQ0VGUy9YRExRM04yRXYvMXNnQU8wQWxCTWRYcVNucVVJMQoxaC9lU0tDaUdta3dGV2xDZi80SG5KVlA3UXBTZVJQTHl3Nzg1RnZ0M3A5dlQrNjRpc1owWks2Y3BjajgKPTBhUUQKLS0tLS1FTkQgUEdQIFBVQkxJQyBLRVkgQkxPQ0stLS0tLQotLS0tLUJFR0lOIFBHUCBQVUJMSUMgS0VZIEJMT0NLLS0tLS0KVmVyc2lvbjogR251UEcgdjEKCm1RRU5CRnRZVlk4QkNBQzNvbGk5M2h1c0cwWlZ0di9MOEk0L2JjVzYwTEZDeUIwRHV3RXpuR2xTYWoxZmpPUXUKQzdRWDl3dkdScThtUlo4bWZaNnNieEdtZ3MwTG5WNVFJQmxlMWw1STNCK0FNR2tzZjZVR0VXZ29OL3ZxODZnKwowSmc2a0pQL0Qwc2pHWHZkbGZ5K2JnQXFqc3gyYldPTGpRR3RIU0l4aGU0Y0U5SFBCZk1pWXNGd0dRdWEzWE4zCnRpR0tjaWZzenZEQTZ1cWRqUzZEdVRFUEN6eUtpU3lVZXZuV3RCaDBvVXRVdC8vWDRsRzJNeDBsVTkxdVVRR2oKS2VaK2ZZWE9McWdabS9GeExWVDV3M2cvVUdLOUNiejVoNGtHQ0pPZmswRXdJWnAwSVJSczFwaE9DNmdWTXdvVgp5V0tDdGRIbWc3T2I4STRBWjhPVzVISm4xVVBIVHByeGNIQm5BQkVCQUFHMExFRjFkRzlpZFdsc1pHVnlJRHhwCmJtWnlZU3RoZFhScFluVnBiR1JsY2tCdGFYSmhiblJwY3k1amIyMCtpUUU0QkJNQkFnQWlCUUpiV0ZXUEFoc0QKQmdzSkNBY0RBZ1lWQ0FJSkNnc0VGZ0lEQVFJZUFRSVhnQUFLQ1JDUlpWcDVURktKNzBjSkIvOUFyV3JTRnlFeApxczdUeW85TTVXQ1BqcXc3eTJGN2pkNEV0M2hxd2M1ang2S2x4R3BnMTdTSHQ0b1djbXRNTDNWQngremlCQWkwCjVSeTRaNHcwUXFGVzZnQXFRZXBlVzc2WXEvT1A1U29xRUk5c1V3ekxmVVk3cmFLL1AxYnV2WEIxZVpoNG1NdzQKVEZmNEhnbzh5VVEzZ2VZTm5VQkJmYVNma21peUJKR3NNWEJmVzJ6aGxwVkl5QjZDeWU1UjgyM0Z4R05KZStsaQpoZ2dOQ1FuS1lxckd0cjU1Uk82eFlJMXY4OWNnR3JPMkVWd1BrRkxBL01VblFFYjQzM0NrK3NqcDFOWkRVZnVKClUzZ2c4UzBoVCtDZjVYaWtuVC94cUloaFRZL0t6bE5teW5adC81MUR6WnpzYk0rUk82SlpGWUpMMkx1QzY5Z0IKK1I1anJtYUd1OWZHCj1zcUluCi0tLS0tRU5EIFBHUCBQVUJMSUMgS0VZIEJMT0NLLS0tLS0=' | base64 -d | apt-key add -] at time 21:45:06.800072 duration_in_ms=639.186
2019-04-30 21:45:06,803 [salt.state       :1780][INFO    ][3479] Running state [deb http://mirror.mirantis.com/nightly//openstack-rocky//xenial xenial main] at time 21:45:06.803732
2019-04-30 21:45:06,804 [salt.state       :1813][INFO    ][3479] Executing state pkgrepo.managed for [deb http://mirror.mirantis.com/nightly//openstack-rocky//xenial xenial main]
2019-04-30 21:45:06,879 [salt.state       :300 ][INFO    ][3479] {'repo': 'deb http://mirror.mirantis.com/nightly//openstack-rocky//xenial xenial main'}
2019-04-30 21:45:06,879 [salt.state       :1951][INFO    ][3479] Completed state [deb http://mirror.mirantis.com/nightly//openstack-rocky//xenial xenial main] at time 21:45:06.879762 duration_in_ms=76.03
2019-04-30 21:45:06,880 [salt.state       :1780][INFO    ][3479] Running state [/etc/apt/apt.conf.d/99proxies-salt-mirantis_openstack_backports] at time 21:45:06.880043
2019-04-30 21:45:06,880 [salt.state       :1813][INFO    ][3479] Executing state file.absent for [/etc/apt/apt.conf.d/99proxies-salt-mirantis_openstack_backports]
2019-04-30 21:45:06,880 [salt.state       :300 ][INFO    ][3479] File /etc/apt/apt.conf.d/99proxies-salt-mirantis_openstack_backports is not present
2019-04-30 21:45:06,880 [salt.state       :1951][INFO    ][3479] Completed state [/etc/apt/apt.conf.d/99proxies-salt-mirantis_openstack_backports] at time 21:45:06.880945 duration_in_ms=0.902
2019-04-30 21:45:06,881 [salt.state       :1780][INFO    ][3479] Running state [/etc/apt/preferences.d/mirantis_openstack_backports] at time 21:45:06.881183
2019-04-30 21:45:06,881 [salt.state       :1813][INFO    ][3479] Executing state file.absent for [/etc/apt/preferences.d/mirantis_openstack_backports]
2019-04-30 21:45:06,881 [salt.state       :300 ][INFO    ][3479] File /etc/apt/preferences.d/mirantis_openstack_backports is not present
2019-04-30 21:45:06,882 [salt.state       :1951][INFO    ][3479] Completed state [/etc/apt/preferences.d/mirantis_openstack_backports] at time 21:45:06.881955 duration_in_ms=0.772
2019-04-30 21:45:06,882 [salt.state       :1780][INFO    ][3479] Running state [echo 'LS0tLS1CRUdJTiBQR1AgUFVCTElDIEtFWSBCTE9DSy0tLS0tClZlcnNpb246IEdudVBHIHYxCgptUUVOQkZXdDhvZ0JDQUN0VC9qNFdNR3VoRUk0ODZWdjl6VlYwR1dHZWZIRTVoQmxnSlNqU2dyRXhMRnFRMkZvClNjYUFCQ2Z2elVldVhITm9oL2MyZUxqeDNZRTZvRnJkaXc1dGFtME5GbFpNTStQU3VmY2lUeFF6OHZyWEhHeDcKVkI1cmcyVFhLb3FPdjljVzY5MEZzUkFlT3RLVHRCeFp2WVZUTEVQbjJHSlcwOVh5OUNCYStuMjNYQkhUQnZLcwpqM2h4a24yNU95NzBXZ3hrL0JKcXB5blhHbm8rTnp1QW5JYmIrZitYN2k2ZmlYd3J2dHA1ek9ZT0plVXdTK2ZVCklNL21YYmV0T2Qvc0h0SnFjOU5VWXBUaXA0bkVsRXFBWVJDc1hEVGJ1TU5kelNyOFZsU01NOGI2MW1CR2VsTEgKWEplK0VQUCtMb2djNUtYTzhhZG9HZ1docWxiRDZuN3creW5IQUJFQkFBRzBMbVoxWld3dGFXNW1jbUVnS0VWNApZVzF3YkdVZ2EyVjVLU0E4WkdWMmIzQnpRRzFwY21GdWRHbHpMbU52YlQ2SkFUZ0VFd0VDQUNJRkFsV3Q4b2dDCkd3TUdDd2tJQndNQ0JoVUlBZ2tLQ3dRV0FnTUJBaDRCQWhlQUFBb0pFTHpsekVZZm9pc0lrdVFJQUpsMGNGSjUKQlNLTVhIaFJZZjBCZUR6aGRoM3BtY09Ycy9qU3puVEl4QjRPRTVPZHdyTWdLeW9Ja1NJUDhBRXR0dkIrQnVPdgpCSG1oVEw3a3ZSaFA1eGlLZGJDd21EdG9FUm9hcXhoUlJiWkpjSitwSHZsN21rRXU4R2oyS1plMmxmRTRaNlpGCjZxMDBHeDlIWWZzZTErVmdVUjV5bWg0MW5aQ3ZSVE5FbllCcDFSUWNQb2dpTHkycll2WmJ4WW5VdGc0amFEN0QKdnV1RVF3cmZFSGRLRlVsV0JDSVZibCtlM0s2WlNuaU9jcXF5SEs3Mi9ISTBTWXVacEdmQ3p6dzVkZU9EY2pXbQpHejRuWnI0MWNCM2VIWGtmbUczbmdkaG1iMk1wVnI4M3UrSmViT292anp1c2Y3MW9JZFpCVEZOWXNaTlNWS3JuCmwwcnJSdURJTUhiUU11UzVBUTBFVmEzeWlBRUlBTFpxZExHWFNHWkFnVVhsN3poUEg1d25JUXRkbzZpTUlvdloKelFOVzk1UkRUMm5tLzNZZGRpUnk2RnVPVGJhSFh3MDdENFpVbDRkR1ZIekV3QmxsaFVMeGNIVjNPT2RRM2dWcAo0bUJBWjhrdjBFZWx6cVBmRFFXUjJDcTBoaTdJSjRRNGVQcFpoUUZpYXN6OHFiVjdEN0NZYlpkREFtUUt4cUFrCjBYWU9qYkIzanpCMnI2TUhmbEFLbUp6VHAzK05BRTliRExBd1hhMG90MlRIRGJwUGRCNFI2cHhwRDZZM2p3ZVcKdUxVQ25JZnZ5SUJ3aEhvYmFVMjhwdy9CQSswZGtDOWpuTG5vTytUcnpCOVlENTgzOUxjM2N0cmRQQkxpRlBNRwp3ZGZBVlJDeWZnTGpPeVVMcWpUdWR4MU1vK0RnejkreHJjVEZvZWhJN1VZb1pucmFFS2tBRVFFQUFZa0JId1FZCkFRSUFDUVVDVmEzeWlBSWJEQUFLQ1JDODVjeEdINklyQ1BINUIvMFVjK09oTVNDa1JvczFZdjV0QTRic0VjanQKOCtzSjJTNnBVcUNiWnhtWHB6S3NwS3BuanAzREpqbVFLREIycTRVUERWRWxWRE1NZEJsc3RUeDFSUlpEZjh5awpuRHZSQlN6YXdrN1hoZmxvcm84TjJMeHY2Z1doaE12SFVZSXR5TzZLTWJBWnVaMk0xSTEvT0ZIRy9mLy83b1BNCjBRcE5iaWhmK0dxRS9kV1J6OVpEeit4bFNGbGk2QVIvM2xkcTdONmdrQ3NFRmRpM2o2WkRmMHFMc1pwYXpQVUkKd2lDQy9hQVlMa1JEdFRKVjFHNkVzV2lqbU9UTk5sQ0VGUy9YRExRM04yRXYvMXNnQU8wQWxCTWRYcVNucVVJMQoxaC9lU0tDaUdta3dGV2xDZi80SG5KVlA3UXBTZVJQTHl3Nzg1RnZ0M3A5dlQrNjRpc1owWks2Y3BjajgKPTBhUUQKLS0tLS1FTkQgUEdQIFBVQkxJQyBLRVkgQkxPQ0stLS0tLQotLS0tLUJFR0lOIFBHUCBQVUJMSUMgS0VZIEJMT0NLLS0tLS0KVmVyc2lvbjogR251UEcgdjEKCm1RRU5CRnRZVlk4QkNBQzNvbGk5M2h1c0cwWlZ0di9MOEk0L2JjVzYwTEZDeUIwRHV3RXpuR2xTYWoxZmpPUXUKQzdRWDl3dkdScThtUlo4bWZaNnNieEdtZ3MwTG5WNVFJQmxlMWw1STNCK0FNR2tzZjZVR0VXZ29OL3ZxODZnKwowSmc2a0pQL0Qwc2pHWHZkbGZ5K2JnQXFqc3gyYldPTGpRR3RIU0l4aGU0Y0U5SFBCZk1pWXNGd0dRdWEzWE4zCnRpR0tjaWZzenZEQTZ1cWRqUzZEdVRFUEN6eUtpU3lVZXZuV3RCaDBvVXRVdC8vWDRsRzJNeDBsVTkxdVVRR2oKS2VaK2ZZWE9McWdabS9GeExWVDV3M2cvVUdLOUNiejVoNGtHQ0pPZmswRXdJWnAwSVJSczFwaE9DNmdWTXdvVgp5V0tDdGRIbWc3T2I4STRBWjhPVzVISm4xVVBIVHByeGNIQm5BQkVCQUFHMExFRjFkRzlpZFdsc1pHVnlJRHhwCmJtWnlZU3RoZFhScFluVnBiR1JsY2tCdGFYSmhiblJwY3k1amIyMCtpUUU0QkJNQkFnQWlCUUpiV0ZXUEFoc0QKQmdzSkNBY0RBZ1lWQ0FJSkNnc0VGZ0lEQVFJZUFRSVhnQUFLQ1JDUlpWcDVURktKNzBjSkIvOUFyV3JTRnlFeApxczdUeW85TTVXQ1BqcXc3eTJGN2pkNEV0M2hxd2M1ang2S2x4R3BnMTdTSHQ0b1djbXRNTDNWQngremlCQWkwCjVSeTRaNHcwUXFGVzZnQXFRZXBlVzc2WXEvT1A1U29xRUk5c1V3ekxmVVk3cmFLL1AxYnV2WEIxZVpoNG1NdzQKVEZmNEhnbzh5VVEzZ2VZTm5VQkJmYVNma21peUJKR3NNWEJmVzJ6aGxwVkl5QjZDeWU1UjgyM0Z4R05KZStsaQpoZ2dOQ1FuS1lxckd0cjU1Uk82eFlJMXY4OWNnR3JPMkVWd1BrRkxBL01VblFFYjQzM0NrK3NqcDFOWkRVZnVKClUzZ2c4UzBoVCtDZjVYaWtuVC94cUloaFRZL0t6bE5teW5adC81MUR6WnpzYk0rUk82SlpGWUpMMkx1QzY5Z0IKK1I1anJtYUd1OWZHCj1zcUluCi0tLS0tRU5EIFBHUCBQVUJMSUMgS0VZIEJMT0NLLS0tLS0=' | base64 -d | apt-key add -] at time 21:45:06.882204
2019-04-30 21:45:06,882 [salt.state       :1813][INFO    ][3479] Executing state cmd.run for [echo 'LS0tLS1CRUdJTiBQR1AgUFVCTElDIEtFWSBCTE9DSy0tLS0tClZlcnNpb246IEdudVBHIHYxCgptUUVOQkZXdDhvZ0JDQUN0VC9qNFdNR3VoRUk0ODZWdjl6VlYwR1dHZWZIRTVoQmxnSlNqU2dyRXhMRnFRMkZvClNjYUFCQ2Z2elVldVhITm9oL2MyZUxqeDNZRTZvRnJkaXc1dGFtME5GbFpNTStQU3VmY2lUeFF6OHZyWEhHeDcKVkI1cmcyVFhLb3FPdjljVzY5MEZzUkFlT3RLVHRCeFp2WVZUTEVQbjJHSlcwOVh5OUNCYStuMjNYQkhUQnZLcwpqM2h4a24yNU95NzBXZ3hrL0JKcXB5blhHbm8rTnp1QW5JYmIrZitYN2k2ZmlYd3J2dHA1ek9ZT0plVXdTK2ZVCklNL21YYmV0T2Qvc0h0SnFjOU5VWXBUaXA0bkVsRXFBWVJDc1hEVGJ1TU5kelNyOFZsU01NOGI2MW1CR2VsTEgKWEplK0VQUCtMb2djNUtYTzhhZG9HZ1docWxiRDZuN3creW5IQUJFQkFBRzBMbVoxWld3dGFXNW1jbUVnS0VWNApZVzF3YkdVZ2EyVjVLU0E4WkdWMmIzQnpRRzFwY21GdWRHbHpMbU52YlQ2SkFUZ0VFd0VDQUNJRkFsV3Q4b2dDCkd3TUdDd2tJQndNQ0JoVUlBZ2tLQ3dRV0FnTUJBaDRCQWhlQUFBb0pFTHpsekVZZm9pc0lrdVFJQUpsMGNGSjUKQlNLTVhIaFJZZjBCZUR6aGRoM3BtY09Ycy9qU3puVEl4QjRPRTVPZHdyTWdLeW9Ja1NJUDhBRXR0dkIrQnVPdgpCSG1oVEw3a3ZSaFA1eGlLZGJDd21EdG9FUm9hcXhoUlJiWkpjSitwSHZsN21rRXU4R2oyS1plMmxmRTRaNlpGCjZxMDBHeDlIWWZzZTErVmdVUjV5bWg0MW5aQ3ZSVE5FbllCcDFSUWNQb2dpTHkycll2WmJ4WW5VdGc0amFEN0QKdnV1RVF3cmZFSGRLRlVsV0JDSVZibCtlM0s2WlNuaU9jcXF5SEs3Mi9ISTBTWXVacEdmQ3p6dzVkZU9EY2pXbQpHejRuWnI0MWNCM2VIWGtmbUczbmdkaG1iMk1wVnI4M3UrSmViT292anp1c2Y3MW9JZFpCVEZOWXNaTlNWS3JuCmwwcnJSdURJTUhiUU11UzVBUTBFVmEzeWlBRUlBTFpxZExHWFNHWkFnVVhsN3poUEg1d25JUXRkbzZpTUlvdloKelFOVzk1UkRUMm5tLzNZZGRpUnk2RnVPVGJhSFh3MDdENFpVbDRkR1ZIekV3QmxsaFVMeGNIVjNPT2RRM2dWcAo0bUJBWjhrdjBFZWx6cVBmRFFXUjJDcTBoaTdJSjRRNGVQcFpoUUZpYXN6OHFiVjdEN0NZYlpkREFtUUt4cUFrCjBYWU9qYkIzanpCMnI2TUhmbEFLbUp6VHAzK05BRTliRExBd1hhMG90MlRIRGJwUGRCNFI2cHhwRDZZM2p3ZVcKdUxVQ25JZnZ5SUJ3aEhvYmFVMjhwdy9CQSswZGtDOWpuTG5vTytUcnpCOVlENTgzOUxjM2N0cmRQQkxpRlBNRwp3ZGZBVlJDeWZnTGpPeVVMcWpUdWR4MU1vK0RnejkreHJjVEZvZWhJN1VZb1pucmFFS2tBRVFFQUFZa0JId1FZCkFRSUFDUVVDVmEzeWlBSWJEQUFLQ1JDODVjeEdINklyQ1BINUIvMFVjK09oTVNDa1JvczFZdjV0QTRic0VjanQKOCtzSjJTNnBVcUNiWnhtWHB6S3NwS3BuanAzREpqbVFLREIycTRVUERWRWxWRE1NZEJsc3RUeDFSUlpEZjh5awpuRHZSQlN6YXdrN1hoZmxvcm84TjJMeHY2Z1doaE12SFVZSXR5TzZLTWJBWnVaMk0xSTEvT0ZIRy9mLy83b1BNCjBRcE5iaWhmK0dxRS9kV1J6OVpEeit4bFNGbGk2QVIvM2xkcTdONmdrQ3NFRmRpM2o2WkRmMHFMc1pwYXpQVUkKd2lDQy9hQVlMa1JEdFRKVjFHNkVzV2lqbU9UTk5sQ0VGUy9YRExRM04yRXYvMXNnQU8wQWxCTWRYcVNucVVJMQoxaC9lU0tDaUdta3dGV2xDZi80SG5KVlA3UXBTZVJQTHl3Nzg1RnZ0M3A5dlQrNjRpc1owWks2Y3BjajgKPTBhUUQKLS0tLS1FTkQgUEdQIFBVQkxJQyBLRVkgQkxPQ0stLS0tLQotLS0tLUJFR0lOIFBHUCBQVUJMSUMgS0VZIEJMT0NLLS0tLS0KVmVyc2lvbjogR251UEcgdjEKCm1RRU5CRnRZVlk4QkNBQzNvbGk5M2h1c0cwWlZ0di9MOEk0L2JjVzYwTEZDeUIwRHV3RXpuR2xTYWoxZmpPUXUKQzdRWDl3dkdScThtUlo4bWZaNnNieEdtZ3MwTG5WNVFJQmxlMWw1STNCK0FNR2tzZjZVR0VXZ29OL3ZxODZnKwowSmc2a0pQL0Qwc2pHWHZkbGZ5K2JnQXFqc3gyYldPTGpRR3RIU0l4aGU0Y0U5SFBCZk1pWXNGd0dRdWEzWE4zCnRpR0tjaWZzenZEQTZ1cWRqUzZEdVRFUEN6eUtpU3lVZXZuV3RCaDBvVXRVdC8vWDRsRzJNeDBsVTkxdVVRR2oKS2VaK2ZZWE9McWdabS9GeExWVDV3M2cvVUdLOUNiejVoNGtHQ0pPZmswRXdJWnAwSVJSczFwaE9DNmdWTXdvVgp5V0tDdGRIbWc3T2I4STRBWjhPVzVISm4xVVBIVHByeGNIQm5BQkVCQUFHMExFRjFkRzlpZFdsc1pHVnlJRHhwCmJtWnlZU3RoZFhScFluVnBiR1JsY2tCdGFYSmhiblJwY3k1amIyMCtpUUU0QkJNQkFnQWlCUUpiV0ZXUEFoc0QKQmdzSkNBY0RBZ1lWQ0FJSkNnc0VGZ0lEQVFJZUFRSVhnQUFLQ1JDUlpWcDVURktKNzBjSkIvOUFyV3JTRnlFeApxczdUeW85TTVXQ1BqcXc3eTJGN2pkNEV0M2hxd2M1ang2S2x4R3BnMTdTSHQ0b1djbXRNTDNWQngremlCQWkwCjVSeTRaNHcwUXFGVzZnQXFRZXBlVzc2WXEvT1A1U29xRUk5c1V3ekxmVVk3cmFLL1AxYnV2WEIxZVpoNG1NdzQKVEZmNEhnbzh5VVEzZ2VZTm5VQkJmYVNma21peUJKR3NNWEJmVzJ6aGxwVkl5QjZDeWU1UjgyM0Z4R05KZStsaQpoZ2dOQ1FuS1lxckd0cjU1Uk82eFlJMXY4OWNnR3JPMkVWd1BrRkxBL01VblFFYjQzM0NrK3NqcDFOWkRVZnVKClUzZ2c4UzBoVCtDZjVYaWtuVC94cUloaFRZL0t6bE5teW5adC81MUR6WnpzYk0rUk82SlpGWUpMMkx1QzY5Z0IKK1I1anJtYUd1OWZHCj1zcUluCi0tLS0tRU5EIFBHUCBQVUJMSUMgS0VZIEJMT0NLLS0tLS0=' | base64 -d | apt-key add -]
2019-04-30 21:45:06,883 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'echo 'LS0tLS1CRUdJTiBQR1AgUFVCTElDIEtFWSBCTE9DSy0tLS0tClZlcnNpb246IEdudVBHIHYxCgptUUVOQkZXdDhvZ0JDQUN0VC9qNFdNR3VoRUk0ODZWdjl6VlYwR1dHZWZIRTVoQmxnSlNqU2dyRXhMRnFRMkZvClNjYUFCQ2Z2elVldVhITm9oL2MyZUxqeDNZRTZvRnJkaXc1dGFtME5GbFpNTStQU3VmY2lUeFF6OHZyWEhHeDcKVkI1cmcyVFhLb3FPdjljVzY5MEZzUkFlT3RLVHRCeFp2WVZUTEVQbjJHSlcwOVh5OUNCYStuMjNYQkhUQnZLcwpqM2h4a24yNU95NzBXZ3hrL0JKcXB5blhHbm8rTnp1QW5JYmIrZitYN2k2ZmlYd3J2dHA1ek9ZT0plVXdTK2ZVCklNL21YYmV0T2Qvc0h0SnFjOU5VWXBUaXA0bkVsRXFBWVJDc1hEVGJ1TU5kelNyOFZsU01NOGI2MW1CR2VsTEgKWEplK0VQUCtMb2djNUtYTzhhZG9HZ1docWxiRDZuN3creW5IQUJFQkFBRzBMbVoxWld3dGFXNW1jbUVnS0VWNApZVzF3YkdVZ2EyVjVLU0E4WkdWMmIzQnpRRzFwY21GdWRHbHpMbU52YlQ2SkFUZ0VFd0VDQUNJRkFsV3Q4b2dDCkd3TUdDd2tJQndNQ0JoVUlBZ2tLQ3dRV0FnTUJBaDRCQWhlQUFBb0pFTHpsekVZZm9pc0lrdVFJQUpsMGNGSjUKQlNLTVhIaFJZZjBCZUR6aGRoM3BtY09Ycy9qU3puVEl4QjRPRTVPZHdyTWdLeW9Ja1NJUDhBRXR0dkIrQnVPdgpCSG1oVEw3a3ZSaFA1eGlLZGJDd21EdG9FUm9hcXhoUlJiWkpjSitwSHZsN21rRXU4R2oyS1plMmxmRTRaNlpGCjZxMDBHeDlIWWZzZTErVmdVUjV5bWg0MW5aQ3ZSVE5FbllCcDFSUWNQb2dpTHkycll2WmJ4WW5VdGc0amFEN0QKdnV1RVF3cmZFSGRLRlVsV0JDSVZibCtlM0s2WlNuaU9jcXF5SEs3Mi9ISTBTWXVacEdmQ3p6dzVkZU9EY2pXbQpHejRuWnI0MWNCM2VIWGtmbUczbmdkaG1iMk1wVnI4M3UrSmViT292anp1c2Y3MW9JZFpCVEZOWXNaTlNWS3JuCmwwcnJSdURJTUhiUU11UzVBUTBFVmEzeWlBRUlBTFpxZExHWFNHWkFnVVhsN3poUEg1d25JUXRkbzZpTUlvdloKelFOVzk1UkRUMm5tLzNZZGRpUnk2RnVPVGJhSFh3MDdENFpVbDRkR1ZIekV3QmxsaFVMeGNIVjNPT2RRM2dWcAo0bUJBWjhrdjBFZWx6cVBmRFFXUjJDcTBoaTdJSjRRNGVQcFpoUUZpYXN6OHFiVjdEN0NZYlpkREFtUUt4cUFrCjBYWU9qYkIzanpCMnI2TUhmbEFLbUp6VHAzK05BRTliRExBd1hhMG90MlRIRGJwUGRCNFI2cHhwRDZZM2p3ZVcKdUxVQ25JZnZ5SUJ3aEhvYmFVMjhwdy9CQSswZGtDOWpuTG5vTytUcnpCOVlENTgzOUxjM2N0cmRQQkxpRlBNRwp3ZGZBVlJDeWZnTGpPeVVMcWpUdWR4MU1vK0RnejkreHJjVEZvZWhJN1VZb1pucmFFS2tBRVFFQUFZa0JId1FZCkFRSUFDUVVDVmEzeWlBSWJEQUFLQ1JDODVjeEdINklyQ1BINUIvMFVjK09oTVNDa1JvczFZdjV0QTRic0VjanQKOCtzSjJTNnBVcUNiWnhtWHB6S3NwS3BuanAzREpqbVFLREIycTRVUERWRWxWRE1NZEJsc3RUeDFSUlpEZjh5awpuRH
ZSQlN6YXdrN1hoZmxvcm84TjJMeHY2Z1doaE12SFVZSXR5TzZLTWJBWnVaMk0xSTEvT0ZIRy9mLy83b1BNCjBRcE5iaWhmK0dxRS9kV1J6OVpEeit4bFNGbGk2QVIvM2xkcTdONmdrQ3NFRmRpM2o2WkRmMHFMc1pwYXpQVUkKd2lDQy9hQVlMa1JEdFRKVjFHNkVzV2lqbU9UTk5sQ0VGUy9YRExRM04yRXYvMXNnQU8wQWxCTWRYcVNucVVJMQoxaC9lU0tDaUdta3dGV2xDZi80SG5KVlA3UXBTZVJQTHl3Nzg1RnZ0M3A5dlQrNjRpc1owWks2Y3BjajgKPTBhUUQKLS0tLS1FTkQgUEdQIFBVQkxJQyBLRVkgQkxPQ0stLS0tLQotLS0tLUJFR0lOIFBHUCBQVUJMSUMgS0VZIEJMT0NLLS0tLS0KVmVyc2lvbjogR251UEcgdjEKCm1RRU5CRnRZVlk4QkNBQzNvbGk5M2h1c0cwWlZ0di9MOEk0L2JjVzYwTEZDeUIwRHV3RXpuR2xTYWoxZmpPUXUKQzdRWDl3dkdScThtUlo4bWZaNnNieEdtZ3MwTG5WNVFJQmxlMWw1STNCK0FNR2tzZjZVR0VXZ29OL3ZxODZnKwowSmc2a0pQL0Qwc2pHWHZkbGZ5K2JnQXFqc3gyYldPTGpRR3RIU0l4aGU0Y0U5SFBCZk1pWXNGd0dRdWEzWE4zCnRpR0tjaWZzenZEQTZ1cWRqUzZEdVRFUEN6eUtpU3lVZXZuV3RCaDBvVXRVdC8vWDRsRzJNeDBsVTkxdVVRR2oKS2VaK2ZZWE9McWdabS9GeExWVDV3M2cvVUdLOUNiejVoNGtHQ0pPZmswRXdJWnAwSVJSczFwaE9DNmdWTXdvVgp5V0tDdGRIbWc3T2I4STRBWjhPVzVISm4xVVBIVHByeGNIQm5BQkVCQUFHMExFRjFkRzlpZFdsc1pHVnlJRHhwCmJtWnlZU3RoZFhScFluVnBiR1JsY2tCdGFYSmhiblJwY3k1amIyMCtpUUU0QkJNQkFnQWlCUUpiV0ZXUEFoc0QKQmdzSkNBY0RBZ1lWQ0FJSkNnc0VGZ0lEQVFJZUFRSVhnQUFLQ1JDUlpWcDVURktKNzBjSkIvOUFyV3JTRnlFeApxczdUeW85TTVXQ1BqcXc3eTJGN2pkNEV0M2hxd2M1ang2S2x4R3BnMTdTSHQ0b1djbXRNTDNWQngremlCQWkwCjVSeTRaNHcwUXFGVzZnQXFRZXBlVzc2WXEvT1A1U29xRUk5c1V3ekxmVVk3cmFLL1AxYnV2WEIxZVpoNG1NdzQKVEZmNEhnbzh5VVEzZ2VZTm5VQkJmYVNma21peUJKR3NNWEJmVzJ6aGxwVkl5QjZDeWU1UjgyM0Z4R05KZStsaQpoZ2dOQ1FuS1lxckd0cjU1Uk82eFlJMXY4OWNnR3JPMkVWd1BrRkxBL01VblFFYjQzM0NrK3NqcDFOWkRVZnVKClUzZ2c4UzBoVCtDZjVYaWtuVC94cUloaFRZL0t6bE5teW5adC81MUR6WnpzYk0rUk82SlpGWUpMMkx1QzY5Z0IKK1I1anJtYUd1OWZHCj1zcUluCi0tLS0tRU5EIFBHUCBQVUJMSUMgS0VZIEJMT0NLLS0tLS0=' | base64 -d | apt-key add -' in directory '/root'
2019-04-30 21:45:06,974 [salt.state       :300 ][INFO    ][3479] {'pid': 3754, 'retcode': 0, 'stderr': '', 'stdout': 'OK'}
2019-04-30 21:45:06,974 [salt.state       :1951][INFO    ][3479] Completed state [echo 'LS0tLS1CRUdJTiBQR1AgUFVCTElDIEtFWSBCTE9DSy0tLS0tClZlcnNpb246IEdudVBHIHYxCgptUUVOQkZXdDhvZ0JDQUN0VC9qNFdNR3VoRUk0ODZWdjl6VlYwR1dHZWZIRTVoQmxnSlNqU2dyRXhMRnFRMkZvClNjYUFCQ2Z2elVldVhITm9oL2MyZUxqeDNZRTZvRnJkaXc1dGFtME5GbFpNTStQU3VmY2lUeFF6OHZyWEhHeDcKVkI1cmcyVFhLb3FPdjljVzY5MEZzUkFlT3RLVHRCeFp2WVZUTEVQbjJHSlcwOVh5OUNCYStuMjNYQkhUQnZLcwpqM2h4a24yNU95NzBXZ3hrL0JKcXB5blhHbm8rTnp1QW5JYmIrZitYN2k2ZmlYd3J2dHA1ek9ZT0plVXdTK2ZVCklNL21YYmV0T2Qvc0h0SnFjOU5VWXBUaXA0bkVsRXFBWVJDc1hEVGJ1TU5kelNyOFZsU01NOGI2MW1CR2VsTEgKWEplK0VQUCtMb2djNUtYTzhhZG9HZ1docWxiRDZuN3creW5IQUJFQkFBRzBMbVoxWld3dGFXNW1jbUVnS0VWNApZVzF3YkdVZ2EyVjVLU0E4WkdWMmIzQnpRRzFwY21GdWRHbHpMbU52YlQ2SkFUZ0VFd0VDQUNJRkFsV3Q4b2dDCkd3TUdDd2tJQndNQ0JoVUlBZ2tLQ3dRV0FnTUJBaDRCQWhlQUFBb0pFTHpsekVZZm9pc0lrdVFJQUpsMGNGSjUKQlNLTVhIaFJZZjBCZUR6aGRoM3BtY09Ycy9qU3puVEl4QjRPRTVPZHdyTWdLeW9Ja1NJUDhBRXR0dkIrQnVPdgpCSG1oVEw3a3ZSaFA1eGlLZGJDd21EdG9FUm9hcXhoUlJiWkpjSitwSHZsN21rRXU4R2oyS1plMmxmRTRaNlpGCjZxMDBHeDlIWWZzZTErVmdVUjV5bWg0MW5aQ3ZSVE5FbllCcDFSUWNQb2dpTHkycll2WmJ4WW5VdGc0amFEN0QKdnV1RVF3cmZFSGRLRlVsV0JDSVZibCtlM0s2WlNuaU9jcXF5SEs3Mi9ISTBTWXVacEdmQ3p6dzVkZU9EY2pXbQpHejRuWnI0MWNCM2VIWGtmbUczbmdkaG1iMk1wVnI4M3UrSmViT292anp1c2Y3MW9JZFpCVEZOWXNaTlNWS3JuCmwwcnJSdURJTUhiUU11UzVBUTBFVmEzeWlBRUlBTFpxZExHWFNHWkFnVVhsN3poUEg1d25JUXRkbzZpTUlvdloKelFOVzk1UkRUMm5tLzNZZGRpUnk2RnVPVGJhSFh3MDdENFpVbDRkR1ZIekV3QmxsaFVMeGNIVjNPT2RRM2dWcAo0bUJBWjhrdjBFZWx6cVBmRFFXUjJDcTBoaTdJSjRRNGVQcFpoUUZpYXN6OHFiVjdEN0NZYlpkREFtUUt4cUFrCjBYWU9qYkIzanpCMnI2TUhmbEFLbUp6VHAzK05BRTliRExBd1hhMG90MlRIRGJwUGRCNFI2cHhwRDZZM2p3ZVcKdUxVQ25JZnZ5SUJ3aEhvYmFVMjhwdy9CQSswZGtDOWpuTG5vTytUcnpCOVlENTgzOUxjM2N0cmRQQkxpRlBNRwp3ZGZBVlJDeWZnTGpPeVVMcWpUdWR4MU1vK0RnejkreHJjVEZvZWhJN1VZb1pucmFFS2tBRVFFQUFZa0JId1FZCkFRSUFDUVVDVmEzeWlBSWJEQUFLQ1JDODVjeEdINklyQ1BINUIvMFVjK09oTVNDa1JvczFZdjV0QTRic0VjanQKOCtzSjJTNnBVcUNiWnhtWHB6S3NwS3BuanAzREpqbVFLREIycTRVUERWRWxWRE1NZEJsc3RUeDFSUlpEZjh5awpuRHZSQlN6YXdrN1ho
Zmxvcm84TjJMeHY2Z1doaE12SFVZSXR5TzZLTWJBWnVaMk0xSTEvT0ZIRy9mLy83b1BNCjBRcE5iaWhmK0dxRS9kV1J6OVpEeit4bFNGbGk2QVIvM2xkcTdONmdrQ3NFRmRpM2o2WkRmMHFMc1pwYXpQVUkKd2lDQy9hQVlMa1JEdFRKVjFHNkVzV2lqbU9UTk5sQ0VGUy9YRExRM04yRXYvMXNnQU8wQWxCTWRYcVNucVVJMQoxaC9lU0tDaUdta3dGV2xDZi80SG5KVlA3UXBTZVJQTHl3Nzg1RnZ0M3A5dlQrNjRpc1owWks2Y3BjajgKPTBhUUQKLS0tLS1FTkQgUEdQIFBVQkxJQyBLRVkgQkxPQ0stLS0tLQotLS0tLUJFR0lOIFBHUCBQVUJMSUMgS0VZIEJMT0NLLS0tLS0KVmVyc2lvbjogR251UEcgdjEKCm1RRU5CRnRZVlk4QkNBQzNvbGk5M2h1c0cwWlZ0di9MOEk0L2JjVzYwTEZDeUIwRHV3RXpuR2xTYWoxZmpPUXUKQzdRWDl3dkdScThtUlo4bWZaNnNieEdtZ3MwTG5WNVFJQmxlMWw1STNCK0FNR2tzZjZVR0VXZ29OL3ZxODZnKwowSmc2a0pQL0Qwc2pHWHZkbGZ5K2JnQXFqc3gyYldPTGpRR3RIU0l4aGU0Y0U5SFBCZk1pWXNGd0dRdWEzWE4zCnRpR0tjaWZzenZEQTZ1cWRqUzZEdVRFUEN6eUtpU3lVZXZuV3RCaDBvVXRVdC8vWDRsRzJNeDBsVTkxdVVRR2oKS2VaK2ZZWE9McWdabS9GeExWVDV3M2cvVUdLOUNiejVoNGtHQ0pPZmswRXdJWnAwSVJSczFwaE9DNmdWTXdvVgp5V0tDdGRIbWc3T2I4STRBWjhPVzVISm4xVVBIVHByeGNIQm5BQkVCQUFHMExFRjFkRzlpZFdsc1pHVnlJRHhwCmJtWnlZU3RoZFhScFluVnBiR1JsY2tCdGFYSmhiblJwY3k1amIyMCtpUUU0QkJNQkFnQWlCUUpiV0ZXUEFoc0QKQmdzSkNBY0RBZ1lWQ0FJSkNnc0VGZ0lEQVFJZUFRSVhnQUFLQ1JDUlpWcDVURktKNzBjSkIvOUFyV3JTRnlFeApxczdUeW85TTVXQ1BqcXc3eTJGN2pkNEV0M2hxd2M1ang2S2x4R3BnMTdTSHQ0b1djbXRNTDNWQngremlCQWkwCjVSeTRaNHcwUXFGVzZnQXFRZXBlVzc2WXEvT1A1U29xRUk5c1V3ekxmVVk3cmFLL1AxYnV2WEIxZVpoNG1NdzQKVEZmNEhnbzh5VVEzZ2VZTm5VQkJmYVNma21peUJKR3NNWEJmVzJ6aGxwVkl5QjZDeWU1UjgyM0Z4R05KZStsaQpoZ2dOQ1FuS1lxckd0cjU1Uk82eFlJMXY4OWNnR3JPMkVWd1BrRkxBL01VblFFYjQzM0NrK3NqcDFOWkRVZnVKClUzZ2c4UzBoVCtDZjVYaWtuVC94cUloaFRZL0t6bE5teW5adC81MUR6WnpzYk0rUk82SlpGWUpMMkx1QzY5Z0IKK1I1anJtYUd1OWZHCj1zcUluCi0tLS0tRU5EIFBHUCBQVUJMSUMgS0VZIEJMT0NLLS0tLS0=' | base64 -d | apt-key add -] at time 21:45:06.974702 duration_in_ms=92.497
2019-04-30 21:45:06,978 [salt.state       :1780][INFO    ][3479] Running state [deb http://mirror.mirantis.com/nightly//openstack-queens/xenial xenial main] at time 21:45:06.978295
2019-04-30 21:45:06,978 [salt.state       :1813][INFO    ][3479] Executing state pkgrepo.managed for [deb http://mirror.mirantis.com/nightly//openstack-queens/xenial xenial main]
2019-04-30 21:45:07,096 [salt.state       :300 ][INFO    ][3479] {'repo': 'deb http://mirror.mirantis.com/nightly//openstack-queens/xenial xenial main'}
2019-04-30 21:45:07,097 [salt.state       :1951][INFO    ][3479] Completed state [deb http://mirror.mirantis.com/nightly//openstack-queens/xenial xenial main] at time 21:45:07.097033 duration_in_ms=118.738
2019-04-30 21:45:07,097 [salt.state       :1780][INFO    ][3479] Running state [pkg.refresh_db] at time 21:45:07.097411
2019-04-30 21:45:07,097 [salt.state       :1813][INFO    ][3479] Executing state module.run for [pkg.refresh_db]
2019-04-30 21:45:07,098 [salt.utils.decorators:613 ][WARNING ][3479] The function "module.run" is using its deprecated version and will expire in version "Sodium".
2019-04-30 21:45:07,098 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['apt-get', '-q', 'update'] in directory '/root'
2019-04-30 21:45:07,969 [salt.minion      :1308][INFO    ][3390] User sudo_ubuntu Executing command saltutil.find_job with jid 20190430214507958416
2019-04-30 21:45:07,982 [salt.minion      :1432][INFO    ][4177] Starting a new job with PID 4177
2019-04-30 21:45:07,999 [salt.minion      :1711][INFO    ][4177] Returning information for job: 20190430214507958416
2019-04-30 21:45:11,430 [salt.state       :300 ][INFO    ][3479] {'ret': {'http://security.ubuntu.com/ubuntu xenial-security InRelease': None, 'http://mirror.mirantis.com/nightly//openstack-rocky//xenial xenial/main amd64 Packages': True, 'http://archive.ubuntu.com/ubuntu xenial-backports InRelease': None, 'http://archive.ubuntu.com/ubuntu xenial-updates InRelease': None, 'http://mirror.mirantis.com/nightly//openstack-queens/xenial xenial InRelease': True, 'http://mirror.mirantis.com/nightly//openstack-rocky//xenial xenial InRelease': True, 'http://mirror.mirantis.com/nightly//openstack-queens/xenial xenial/main amd64 Packages': True, 'http://repo.saltstack.com/apt/ubuntu/16.04/amd64/2017.7 xenial InRelease': None, 'http://archive.ubuntu.com/ubuntu xenial InRelease': None}}
2019-04-30 21:45:11,431 [salt.state       :1951][INFO    ][3479] Completed state [pkg.refresh_db] at time 21:45:11.431240 duration_in_ms=4333.828
2019-04-30 21:45:11,431 [salt.state       :1780][INFO    ][3479] Running state [linux_extra_packages_removed] at time 21:45:11.431589
2019-04-30 21:45:11,432 [salt.state       :1813][INFO    ][3479] Executing state pkg.removed for [linux_extra_packages_removed]
2019-04-30 21:45:11,448 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', 'remove', 'telnet'] in directory '/root'
2019-04-30 21:45:14,028 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2019-04-30 21:45:14,055 [salt.state       :300 ][INFO    ][3479] Made the following changes:
'telnet-client' changed from '1' to 'absent'
'telnet' changed from '0.17-40' to 'absent'

2019-04-30 21:45:14,074 [salt.state       :915 ][INFO    ][3479] Loading fresh modules for state activity
2019-04-30 21:45:14,097 [salt.state       :1951][INFO    ][3479] Completed state [linux_extra_packages_removed] at time 21:45:14.097474 duration_in_ms=2665.885
2019-04-30 21:45:14,101 [salt.state       :1780][INFO    ][3479] Running state [linux_extra_packages_latest] at time 21:45:14.101289
2019-04-30 21:45:14,101 [salt.state       :1813][INFO    ][3479] Executing state pkg.latest for [linux_extra_packages_latest]
2019-04-30 21:45:14,581 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['apt-cache', '-q', 'policy', 'python-tornado'] in directory '/root'
2019-04-30 21:45:14,620 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['apt-cache', '-q', 'policy', 'smartmontools'] in directory '/root'
2019-04-30 21:45:14,657 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['apt-cache', '-q', 'policy', 'libapache2-mod-wsgi'] in directory '/root'
2019-04-30 21:45:14,701 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2019-04-30 21:45:14,718 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'python-tornado', 'smartmontools', 'libapache2-mod-wsgi'] in directory '/root'
2019-04-30 21:45:38,029 [salt.minion      :1308][INFO    ][3390] User sudo_ubuntu Executing command saltutil.find_job with jid 20190430214538018303
2019-04-30 21:45:38,045 [salt.minion      :1432][INFO    ][4656] Starting a new job with PID 4656
2019-04-30 21:45:38,201 [salt.minion      :1711][INFO    ][4656] Returning information for job: 20190430214538018303
2019-04-30 21:45:57,675 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2019-04-30 21:45:57,702 [salt.state       :300 ][INFO    ][3479] Made the following changes:
'mail-reader' changed from 'absent' to '1'
'smartmontools' changed from 'absent' to '6.4+svn4214-1'
'libapr1' changed from 'absent' to '1.5.2-3'
'libpython2.7' changed from 'absent' to '2.7.12-1ubuntu0~16.04.4'
'libaprutil1-ldap' changed from 'absent' to '1.5.4-1build1'
'libapache2-mod-wsgi' changed from 'absent' to '4.4.15-0.1.1~u16.04+mcp2'
'imap-client' changed from 'absent' to '1'
'mailx' changed from 'absent' to '1'
'apache2-api-20120211' changed from 'absent' to '1'
'libaprutil1' changed from 'absent' to '1.5.4-1build1'
's-nail' changed from 'absent' to '14.8.6-1'
'httpd-wsgi' changed from 'absent' to '1'
'python-singledispatch' changed from 'absent' to '3.4.0.3-2'
'liblua5.1-0' changed from 'absent' to '5.1.5-8ubuntu1'
'libaprutil1-dbd-sqlite3' changed from 'absent' to '1.5.4-1build1'
'python-tornado' changed from '4.2.1-2~ds+1' to '4.5.3-1.0~u16.04+mcp1'
'python-backports-abc' changed from 'absent' to '0.5-2.0~u16.04+mcp1'
'apache2-bin' changed from 'absent' to '2.4.18-2ubuntu3.10'

2019-04-30 21:45:57,719 [salt.state       :915 ][INFO    ][3479] Loading fresh modules for state activity
2019-04-30 21:45:57,751 [salt.state       :1951][INFO    ][3479] Completed state [linux_extra_packages_latest] at time 21:45:57.751241 duration_in_ms=43649.95
2019-04-30 21:45:57,756 [salt.state       :1780][INFO    ][3479] Running state [UTC] at time 21:45:57.756578
2019-04-30 21:45:57,757 [salt.state       :1813][INFO    ][3479] Executing state timezone.system for [UTC]
2019-04-30 21:45:57,761 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['timedatectl'] in directory '/root'
2019-04-30 21:45:57,910 [salt.state       :300 ][INFO    ][3479] Timezone UTC already set, UTC already set to UTC
2019-04-30 21:45:57,910 [salt.state       :1951][INFO    ][3479] Completed state [UTC] at time 21:45:57.910664 duration_in_ms=154.085
2019-04-30 21:45:57,913 [salt.state       :1780][INFO    ][3479] Running state [/etc/default/grub.d] at time 21:45:57.913456
2019-04-30 21:45:57,913 [salt.state       :1813][INFO    ][3479] Executing state file.directory for [/etc/default/grub.d]
2019-04-30 21:45:57,917 [salt.state       :300 ][INFO    ][3479] Directory /etc/default/grub.d is in the correct state
Directory /etc/default/grub.d updated
2019-04-30 21:45:57,917 [salt.state       :1951][INFO    ][3479] Completed state [/etc/default/grub.d] at time 21:45:57.917885 duration_in_ms=4.429
2019-04-30 21:45:57,918 [salt.state       :1780][INFO    ][3479] Running state [update-grub] at time 21:45:57.918640
2019-04-30 21:45:57,919 [salt.state       :1813][INFO    ][3479] Executing state cmd.wait for [update-grub]
2019-04-30 21:45:57,919 [salt.state       :300 ][INFO    ][3479] No changes made for update-grub
2019-04-30 21:45:57,919 [salt.state       :1951][INFO    ][3479] Completed state [update-grub] at time 21:45:57.919709 duration_in_ms=1.068
2019-04-30 21:45:57,921 [salt.state       :1780][INFO    ][3479] Running state [/boot/grub/grub.cfg] at time 21:45:57.921196
2019-04-30 21:45:57,921 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/boot/grub/grub.cfg]
2019-04-30 21:45:58,516 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'test -f /boot/grub/grub.cfg' in directory '/root'
2019-04-30 21:45:58,526 [salt.loaded.int.states.file:2298][WARNING ][3479] State for file: /boot/grub/grub.cfg - Neither 'source' nor 'contents' nor 'contents_pillar' nor 'contents_grains' was defined, yet 'replace' was set to 'True'. As there is no source to replace the file with, 'replace' has been set to 'False' to avoid reading the file unnecessarily.
2019-04-30 21:45:58,527 [salt.state       :300 ][INFO    ][3479] {'mode': '0400'}
2019-04-30 21:45:58,527 [salt.state       :1951][INFO    ][3479] Completed state [/boot/grub/grub.cfg] at time 21:45:58.527433 duration_in_ms=606.237
2019-04-30 21:45:58,527 [salt.state       :1780][INFO    ][3479] Running state [nf_conntrack] at time 21:45:58.527873
2019-04-30 21:45:58,528 [salt.state       :1813][INFO    ][3479] Executing state kmod.present for [nf_conntrack]
2019-04-30 21:45:58,528 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'lsmod' in directory '/root'
2019-04-30 21:45:59,587 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'lsmod' in directory '/root'
2019-04-30 21:45:59,597 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'modprobe nf_conntrack' in directory '/root'
2019-04-30 21:45:59,617 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'lsmod' in directory '/root'
2019-04-30 21:45:59,681 [salt.state       :300 ][INFO    ][3479] {'nf_conntrack': 'loaded'}
2019-04-30 21:45:59,681 [salt.state       :1951][INFO    ][3479] Completed state [nf_conntrack] at time 21:45:59.681823 duration_in_ms=1153.95
2019-04-30 21:45:59,682 [salt.state       :1780][INFO    ][3479] Running state [/etc/modprobe.d] at time 21:45:59.682273
2019-04-30 21:45:59,682 [salt.state       :1813][INFO    ][3479] Executing state file.directory for [/etc/modprobe.d]
2019-04-30 21:45:59,683 [salt.state       :300 ][INFO    ][3479] Directory /etc/modprobe.d is in the correct state
Directory /etc/modprobe.d updated
2019-04-30 21:45:59,684 [salt.state       :1951][INFO    ][3479] Completed state [/etc/modprobe.d] at time 21:45:59.684055 duration_in_ms=1.781
2019-04-30 21:45:59,685 [salt.state       :1780][INFO    ][3479] Running state [/etc/modprobe.d/hfsplus.conf] at time 21:45:59.685635
2019-04-30 21:45:59,686 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/modprobe.d/hfsplus.conf]
2019-04-30 21:45:59,701 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/files/modprobe.conf.jinja'
2019-04-30 21:45:59,780 [salt.state       :300 ][INFO    ][3479] File changed:
New file
2019-04-30 21:45:59,781 [salt.state       :1951][INFO    ][3479] Completed state [/etc/modprobe.d/hfsplus.conf] at time 21:45:59.781389 duration_in_ms=95.754
2019-04-30 21:45:59,782 [salt.state       :1780][INFO    ][3479] Running state [/etc/modprobe.d/rds.conf] at time 21:45:59.782371
2019-04-30 21:45:59,782 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/modprobe.d/rds.conf]
2019-04-30 21:45:59,880 [salt.state       :300 ][INFO    ][3479] File changed:
New file
2019-04-30 21:45:59,881 [salt.state       :1951][INFO    ][3479] Completed state [/etc/modprobe.d/rds.conf] at time 21:45:59.881549 duration_in_ms=99.176
2019-04-30 21:45:59,882 [salt.state       :1780][INFO    ][3479] Running state [/etc/modprobe.d/cramfs.conf] at time 21:45:59.882594
2019-04-30 21:45:59,882 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/modprobe.d/cramfs.conf]
2019-04-30 21:45:59,980 [salt.state       :300 ][INFO    ][3479] File changed:
New file
2019-04-30 21:45:59,980 [salt.state       :1951][INFO    ][3479] Completed state [/etc/modprobe.d/cramfs.conf] at time 21:45:59.980629 duration_in_ms=98.034
2019-04-30 21:45:59,981 [salt.state       :1780][INFO    ][3479] Running state [/etc/modprobe.d/freevxfs.conf] at time 21:45:59.981647
2019-04-30 21:45:59,982 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/modprobe.d/freevxfs.conf]
2019-04-30 21:46:00,067 [salt.state       :300 ][INFO    ][3479] File changed:
New file
2019-04-30 21:46:00,067 [salt.state       :1951][INFO    ][3479] Completed state [/etc/modprobe.d/freevxfs.conf] at time 21:46:00.067749 duration_in_ms=86.1
2019-04-30 21:46:00,068 [salt.state       :1780][INFO    ][3479] Running state [/etc/modprobe.d/hfs.conf] at time 21:46:00.068826
2019-04-30 21:46:00,069 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/modprobe.d/hfs.conf]
2019-04-30 21:46:00,156 [salt.state       :300 ][INFO    ][3479] File changed:
New file
2019-04-30 21:46:00,156 [salt.state       :1951][INFO    ][3479] Completed state [/etc/modprobe.d/hfs.conf] at time 21:46:00.156759 duration_in_ms=87.932
2019-04-30 21:46:00,157 [salt.state       :1780][INFO    ][3479] Running state [/etc/modprobe.d/squashfs.conf] at time 21:46:00.157809
2019-04-30 21:46:00,158 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/modprobe.d/squashfs.conf]
2019-04-30 21:46:00,322 [salt.state       :300 ][INFO    ][3479] File changed:
New file
2019-04-30 21:46:00,322 [salt.state       :1951][INFO    ][3479] Completed state [/etc/modprobe.d/squashfs.conf] at time 21:46:00.322872 duration_in_ms=165.063
2019-04-30 21:46:00,323 [salt.state       :1780][INFO    ][3479] Running state [/etc/modprobe.d/udf.conf] at time 21:46:00.323917
2019-04-30 21:46:00,324 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/modprobe.d/udf.conf]
2019-04-30 21:46:00,432 [salt.state       :300 ][INFO    ][3479] File changed:
New file
2019-04-30 21:46:00,432 [salt.state       :1951][INFO    ][3479] Completed state [/etc/modprobe.d/udf.conf] at time 21:46:00.432579 duration_in_ms=108.662
2019-04-30 21:46:00,433 [salt.state       :1780][INFO    ][3479] Running state [/etc/modprobe.d/vfat.conf] at time 21:46:00.433632
2019-04-30 21:46:00,434 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/modprobe.d/vfat.conf]
2019-04-30 21:46:00,530 [salt.state       :300 ][INFO    ][3479] File changed:
New file
2019-04-30 21:46:00,530 [salt.state       :1951][INFO    ][3479] Completed state [/etc/modprobe.d/vfat.conf] at time 21:46:00.530850 duration_in_ms=97.217
2019-04-30 21:46:00,532 [salt.state       :1780][INFO    ][3479] Running state [/etc/modprobe.d/sctp.conf] at time 21:46:00.532581
2019-04-30 21:46:00,533 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/modprobe.d/sctp.conf]
2019-04-30 21:46:00,638 [salt.state       :300 ][INFO    ][3479] File changed:
New file
2019-04-30 21:46:00,639 [salt.state       :1951][INFO    ][3479] Completed state [/etc/modprobe.d/sctp.conf] at time 21:46:00.639028 duration_in_ms=106.447
2019-04-30 21:46:00,640 [salt.state       :1780][INFO    ][3479] Running state [/etc/modprobe.d/jffs2.conf] at time 21:46:00.640100
2019-04-30 21:46:00,640 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/modprobe.d/jffs2.conf]
2019-04-30 21:46:00,726 [salt.state       :300 ][INFO    ][3479] File changed:
New file
2019-04-30 21:46:00,726 [salt.state       :1951][INFO    ][3479] Completed state [/etc/modprobe.d/jffs2.conf] at time 21:46:00.726537 duration_in_ms=86.436
2019-04-30 21:46:00,727 [salt.state       :1780][INFO    ][3479] Running state [/etc/modprobe.d/tipc.conf] at time 21:46:00.727597
2019-04-30 21:46:00,727 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/modprobe.d/tipc.conf]
2019-04-30 21:46:00,814 [salt.state       :300 ][INFO    ][3479] File changed:
New file
2019-04-30 21:46:00,815 [salt.state       :1951][INFO    ][3479] Completed state [/etc/modprobe.d/tipc.conf] at time 21:46:00.815238 duration_in_ms=87.641
2019-04-30 21:46:00,816 [salt.state       :1780][INFO    ][3479] Running state [/etc/modprobe.d/dccp.conf] at time 21:46:00.816307
2019-04-30 21:46:00,816 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/modprobe.d/dccp.conf]
2019-04-30 21:46:00,924 [salt.state       :300 ][INFO    ][3479] File changed:
New file
2019-04-30 21:46:00,925 [salt.state       :1951][INFO    ][3479] Completed state [/etc/modprobe.d/dccp.conf] at time 21:46:00.925237 duration_in_ms=108.93
2019-04-30 21:46:00,925 [salt.state       :1780][INFO    ][3479] Running state [net.ipv4.tcp_congestion_control] at time 21:46:00.925579
2019-04-30 21:46:00,925 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [net.ipv4.tcp_congestion_control]
2019-04-30 21:46:00,926 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w net.ipv4.tcp_congestion_control="yeah"' in directory '/root'
2019-04-30 21:46:01,378 [salt.state       :300 ][INFO    ][3479] {'net.ipv4.tcp_congestion_control': 'yeah'}
2019-04-30 21:46:01,379 [salt.state       :1951][INFO    ][3479] Completed state [net.ipv4.tcp_congestion_control] at time 21:46:01.379435 duration_in_ms=453.856
2019-04-30 21:46:01,379 [salt.state       :1780][INFO    ][3479] Running state [net.core.netdev_budget] at time 21:46:01.379929
2019-04-30 21:46:01,380 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [net.core.netdev_budget]
2019-04-30 21:46:01,381 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w net.core.netdev_budget="600"' in directory '/root'
2019-04-30 21:46:01,390 [salt.state       :300 ][INFO    ][3479] {'net.core.netdev_budget': 600}
2019-04-30 21:46:01,391 [salt.state       :1951][INFO    ][3479] Completed state [net.core.netdev_budget] at time 21:46:01.391276 duration_in_ms=11.347
2019-04-30 21:46:01,391 [salt.state       :1780][INFO    ][3479] Running state [net.ipv4.tcp_slow_start_after_idle] at time 21:46:01.391740
2019-04-30 21:46:01,392 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [net.ipv4.tcp_slow_start_after_idle]
2019-04-30 21:46:01,411 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w net.ipv4.tcp_slow_start_after_idle="0"' in directory '/root'
2019-04-30 21:46:01,419 [salt.state       :300 ][INFO    ][3479] {'net.ipv4.tcp_slow_start_after_idle': 0}
2019-04-30 21:46:01,419 [salt.state       :1951][INFO    ][3479] Completed state [net.ipv4.tcp_slow_start_after_idle] at time 21:46:01.419752 duration_in_ms=28.012
2019-04-30 21:46:01,420 [salt.state       :1780][INFO    ][3479] Running state [net.ipv4.icmp_echo_ignore_broadcasts] at time 21:46:01.420227
2019-04-30 21:46:01,420 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [net.ipv4.icmp_echo_ignore_broadcasts]
2019-04-30 21:46:01,440 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w net.ipv4.icmp_echo_ignore_broadcasts="1"' in directory '/root'
2019-04-30 21:46:01,451 [salt.state       :300 ][INFO    ][3479] {'net.ipv4.icmp_echo_ignore_broadcasts': 1}
2019-04-30 21:46:01,452 [salt.state       :1951][INFO    ][3479] Completed state [net.ipv4.icmp_echo_ignore_broadcasts] at time 21:46:01.452191 duration_in_ms=31.963
2019-04-30 21:46:01,452 [salt.state       :1780][INFO    ][3479] Running state [net.nf_conntrack_max] at time 21:46:01.452848
2019-04-30 21:46:01,453 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [net.nf_conntrack_max]
2019-04-30 21:46:01,524 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w net.nf_conntrack_max="1048576"' in directory '/root'
2019-04-30 21:46:01,534 [salt.state       :300 ][INFO    ][3479] {'net.nf_conntrack_max': 1048576}
2019-04-30 21:46:01,534 [salt.state       :1951][INFO    ][3479] Completed state [net.nf_conntrack_max] at time 21:46:01.534571 duration_in_ms=81.722
2019-04-30 21:46:01,535 [salt.state       :1780][INFO    ][3479] Running state [net.ipv4.icmp_ignore_bogus_error_responses] at time 21:46:01.535056
2019-04-30 21:46:01,535 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [net.ipv4.icmp_ignore_bogus_error_responses]
2019-04-30 21:46:01,536 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w net.ipv4.icmp_ignore_bogus_error_responses="1"' in directory '/root'
2019-04-30 21:46:01,544 [salt.state       :300 ][INFO    ][3479] {'net.ipv4.icmp_ignore_bogus_error_responses': 1}
2019-04-30 21:46:01,544 [salt.state       :1951][INFO    ][3479] Completed state [net.ipv4.icmp_ignore_bogus_error_responses] at time 21:46:01.544830 duration_in_ms=9.774
2019-04-30 21:46:01,545 [salt.state       :1780][INFO    ][3479] Running state [net.ipv4.conf.default.log_martians] at time 21:46:01.545266
2019-04-30 21:46:01,545 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [net.ipv4.conf.default.log_martians]
2019-04-30 21:46:01,546 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w net.ipv4.conf.default.log_martians="1"' in directory '/root'
2019-04-30 21:46:01,554 [salt.state       :300 ][INFO    ][3479] {'net.ipv4.conf.default.log_martians': 1}
2019-04-30 21:46:01,555 [salt.state       :1951][INFO    ][3479] Completed state [net.ipv4.conf.default.log_martians] at time 21:46:01.554945 duration_in_ms=9.679
2019-04-30 21:46:01,555 [salt.state       :1780][INFO    ][3479] Running state [net.core.somaxconn] at time 21:46:01.555384
2019-04-30 21:46:01,555 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [net.core.somaxconn]
2019-04-30 21:46:01,556 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w net.core.somaxconn="4096"' in directory '/root'
2019-04-30 21:46:01,565 [salt.state       :300 ][INFO    ][3479] {'net.core.somaxconn': 4096}
2019-04-30 21:46:01,566 [salt.state       :1951][INFO    ][3479] Completed state [net.core.somaxconn] at time 21:46:01.566369 duration_in_ms=10.984
2019-04-30 21:46:01,567 [salt.state       :1780][INFO    ][3479] Running state [net.ipv4.conf.all.send_redirects] at time 21:46:01.566992
2019-04-30 21:46:01,567 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [net.ipv4.conf.all.send_redirects]
2019-04-30 21:46:01,638 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w net.ipv4.conf.all.send_redirects="0"' in directory '/root'
2019-04-30 21:46:01,652 [salt.state       :300 ][INFO    ][3479] {'net.ipv4.conf.all.send_redirects': 0}
2019-04-30 21:46:01,653 [salt.state       :1951][INFO    ][3479] Completed state [net.ipv4.conf.all.send_redirects] at time 21:46:01.653044 duration_in_ms=86.051
2019-04-30 21:46:01,653 [salt.state       :1780][INFO    ][3479] Running state [net.ipv4.tcp_tw_reuse] at time 21:46:01.653704
2019-04-30 21:46:01,654 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [net.ipv4.tcp_tw_reuse]
2019-04-30 21:46:01,655 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w net.ipv4.tcp_tw_reuse="1"' in directory '/root'
2019-04-30 21:46:01,666 [salt.state       :300 ][INFO    ][3479] {'net.ipv4.tcp_tw_reuse': 1}
2019-04-30 21:46:01,667 [salt.state       :1951][INFO    ][3479] Completed state [net.ipv4.tcp_tw_reuse] at time 21:46:01.667031 duration_in_ms=13.327
2019-04-30 21:46:01,667 [salt.state       :1780][INFO    ][3479] Running state [net.ipv4.conf.default.rp_filter] at time 21:46:01.667657
2019-04-30 21:46:01,668 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [net.ipv4.conf.default.rp_filter]
2019-04-30 21:46:01,692 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w net.ipv4.conf.default.rp_filter="1"' in directory '/root'
2019-04-30 21:46:01,703 [salt.state       :300 ][INFO    ][3479] {'net.ipv4.conf.default.rp_filter': 1}
2019-04-30 21:46:01,703 [salt.state       :1951][INFO    ][3479] Completed state [net.ipv4.conf.default.rp_filter] at time 21:46:01.703496 duration_in_ms=35.84
2019-04-30 21:46:01,703 [salt.state       :1780][INFO    ][3479] Running state [net.ipv4.tcp_fin_timeout] at time 21:46:01.703940
2019-04-30 21:46:01,704 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [net.ipv4.tcp_fin_timeout]
2019-04-30 21:46:01,715 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w net.ipv4.tcp_fin_timeout="30"' in directory '/root'
2019-04-30 21:46:01,725 [salt.state       :300 ][INFO    ][3479] {'net.ipv4.tcp_fin_timeout': 30}
2019-04-30 21:46:01,828 [salt.state       :1951][INFO    ][3479] Completed state [net.ipv4.tcp_fin_timeout] at time 21:46:01.828113 duration_in_ms=124.171
2019-04-30 21:46:01,828 [salt.state       :1780][INFO    ][3479] Running state [net.core.netdev_budget_usecs] at time 21:46:01.828680
2019-04-30 21:46:01,829 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [net.core.netdev_budget_usecs]
2019-04-30 21:46:01,830 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w net.core.netdev_budget_usecs="5000"' in directory '/root'
2019-04-30 21:46:01,841 [salt.state       :300 ][INFO    ][3479] {'net.core.netdev_budget_usecs': 5000}
2019-04-30 21:46:01,841 [salt.state       :1951][INFO    ][3479] Completed state [net.core.netdev_budget_usecs] at time 21:46:01.841426 duration_in_ms=12.746
2019-04-30 21:46:01,841 [salt.state       :1780][INFO    ][3479] Running state [kernel.panic] at time 21:46:01.841876
2019-04-30 21:46:01,842 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [kernel.panic]
2019-04-30 21:46:01,843 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w kernel.panic="60"' in directory '/root'
2019-04-30 21:46:01,852 [salt.state       :300 ][INFO    ][3479] {'kernel.panic': 60}
2019-04-30 21:46:01,852 [salt.state       :1951][INFO    ][3479] Completed state [kernel.panic] at time 21:46:01.852641 duration_in_ms=10.765
2019-04-30 21:46:01,853 [salt.state       :1780][INFO    ][3479] Running state [net.ipv4.tcp_keepalive_probes] at time 21:46:01.853098
2019-04-30 21:46:01,853 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [net.ipv4.tcp_keepalive_probes]
2019-04-30 21:46:01,854 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w net.ipv4.tcp_keepalive_probes="8"' in directory '/root'
2019-04-30 21:46:01,864 [salt.state       :300 ][INFO    ][3479] {'net.ipv4.tcp_keepalive_probes': 8}
2019-04-30 21:46:01,864 [salt.state       :1951][INFO    ][3479] Completed state [net.ipv4.tcp_keepalive_probes] at time 21:46:01.864851 duration_in_ms=11.753
2019-04-30 21:46:01,865 [salt.state       :1780][INFO    ][3479] Running state [net.ipv4.conf.all.log_martians] at time 21:46:01.865303
2019-04-30 21:46:01,865 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [net.ipv4.conf.all.log_martians]
2019-04-30 21:46:01,866 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w net.ipv4.conf.all.log_martians="1"' in directory '/root'
2019-04-30 21:46:01,876 [salt.state       :300 ][INFO    ][3479] {'net.ipv4.conf.all.log_martians': 1}
2019-04-30 21:46:01,876 [salt.state       :1951][INFO    ][3479] Completed state [net.ipv4.conf.all.log_martians] at time 21:46:01.876681 duration_in_ms=11.378
2019-04-30 21:46:01,877 [salt.state       :1780][INFO    ][3479] Running state [fs.suid_dumpable] at time 21:46:01.877112
2019-04-30 21:46:01,877 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [fs.suid_dumpable]
2019-04-30 21:46:01,878 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w fs.suid_dumpable="0"' in directory '/root'
2019-04-30 21:46:01,887 [salt.state       :300 ][INFO    ][3479] {'fs.suid_dumpable': 0}
2019-04-30 21:46:01,887 [salt.state       :1951][INFO    ][3479] Completed state [fs.suid_dumpable] at time 21:46:01.887572 duration_in_ms=10.461
2019-04-30 21:46:01,888 [salt.state       :1780][INFO    ][3479] Running state [net.ipv4.conf.default.accept_redirects] at time 21:46:01.888010
2019-04-30 21:46:01,888 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [net.ipv4.conf.default.accept_redirects]
2019-04-30 21:46:01,889 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w net.ipv4.conf.default.accept_redirects="0"' in directory '/root'
2019-04-30 21:46:01,898 [salt.state       :300 ][INFO    ][3479] {'net.ipv4.conf.default.accept_redirects': 0}
2019-04-30 21:46:01,899 [salt.state       :1951][INFO    ][3479] Completed state [net.ipv4.conf.default.accept_redirects] at time 21:46:01.899096 duration_in_ms=11.085
2019-04-30 21:46:01,899 [salt.state       :1780][INFO    ][3479] Running state [net.ipv4.conf.default.secure_redirects] at time 21:46:01.899527
2019-04-30 21:46:01,899 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [net.ipv4.conf.default.secure_redirects]
2019-04-30 21:46:01,900 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w net.ipv4.conf.default.secure_redirects="0"' in directory '/root'
2019-04-30 21:46:01,910 [salt.state       :300 ][INFO    ][3479] {'net.ipv4.conf.default.secure_redirects': 0}
2019-04-30 21:46:01,910 [salt.state       :1951][INFO    ][3479] Completed state [net.ipv4.conf.default.secure_redirects] at time 21:46:01.910455 duration_in_ms=10.929
2019-04-30 21:46:01,910 [salt.state       :1780][INFO    ][3479] Running state [net.ipv4.conf.default.accept_source_route] at time 21:46:01.910903
2019-04-30 21:46:01,911 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [net.ipv4.conf.default.accept_source_route]
2019-04-30 21:46:01,912 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w net.ipv4.conf.default.accept_source_route="0"' in directory '/root'
2019-04-30 21:46:01,921 [salt.state       :300 ][INFO    ][3479] {'net.ipv4.conf.default.accept_source_route': 0}
2019-04-30 21:46:01,922 [salt.state       :1951][INFO    ][3479] Completed state [net.ipv4.conf.default.accept_source_route] at time 21:46:01.921950 duration_in_ms=11.047
2019-04-30 21:46:01,922 [salt.state       :1780][INFO    ][3479] Running state [net.ipv4.tcp_keepalive_intvl] at time 21:46:01.922389
2019-04-30 21:46:01,922 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [net.ipv4.tcp_keepalive_intvl]
2019-04-30 21:46:01,974 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w net.ipv4.tcp_keepalive_intvl="3"' in directory '/root'
2019-04-30 21:46:01,987 [salt.state       :300 ][INFO    ][3479] {'net.ipv4.tcp_keepalive_intvl': 3}
2019-04-30 21:46:01,988 [salt.state       :1951][INFO    ][3479] Completed state [net.ipv4.tcp_keepalive_intvl] at time 21:46:01.988495 duration_in_ms=66.105
2019-04-30 21:46:01,989 [salt.state       :1780][INFO    ][3479] Running state [net.ipv4.tcp_keepalive_time] at time 21:46:01.989073
2019-04-30 21:46:01,989 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [net.ipv4.tcp_keepalive_time]
2019-04-30 21:46:02,033 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w net.ipv4.tcp_keepalive_time="30"' in directory '/root'
2019-04-30 21:46:02,045 [salt.state       :300 ][INFO    ][3479] {'net.ipv4.tcp_keepalive_time': 30}
2019-04-30 21:46:02,046 [salt.state       :1951][INFO    ][3479] Completed state [net.ipv4.tcp_keepalive_time] at time 21:46:02.046285 duration_in_ms=57.211
2019-04-30 21:46:02,046 [salt.state       :1780][INFO    ][3479] Running state [kernel.randomize_va_space] at time 21:46:02.046843
2019-04-30 21:46:02,047 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [kernel.randomize_va_space]
2019-04-30 21:46:02,075 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w kernel.randomize_va_space="2"' in directory '/root'
2019-04-30 21:46:02,089 [salt.state       :300 ][INFO    ][3479] {'kernel.randomize_va_space': 2}
2019-04-30 21:46:02,089 [salt.state       :1951][INFO    ][3479] Completed state [kernel.randomize_va_space] at time 21:46:02.089715 duration_in_ms=42.873
2019-04-30 21:46:02,090 [salt.state       :1780][INFO    ][3479] Running state [fs.file-max] at time 21:46:02.090272
2019-04-30 21:46:02,090 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [fs.file-max]
2019-04-30 21:46:02,091 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w fs.file-max="124165"' in directory '/root'
2019-04-30 21:46:02,102 [salt.state       :300 ][INFO    ][3479] {'fs.file-max': 124165}
2019-04-30 21:46:02,103 [salt.state       :1951][INFO    ][3479] Completed state [fs.file-max] at time 21:46:02.103386 duration_in_ms=13.112
2019-04-30 21:46:02,103 [salt.state       :1780][INFO    ][3479] Running state [net.ipv4.tcp_syncookies] at time 21:46:02.103947
2019-04-30 21:46:02,104 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [net.ipv4.tcp_syncookies]
2019-04-30 21:46:02,225 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w net.ipv4.tcp_syncookies="1"' in directory '/root'
2019-04-30 21:46:02,238 [salt.state       :300 ][INFO    ][3479] {'net.ipv4.tcp_syncookies': 1}
2019-04-30 21:46:02,239 [salt.state       :1951][INFO    ][3479] Completed state [net.ipv4.tcp_syncookies] at time 21:46:02.239261 duration_in_ms=135.313
2019-04-30 21:46:02,239 [salt.state       :1780][INFO    ][3479] Running state [net.ipv4.tcp_max_syn_backlog] at time 21:46:02.239737
2019-04-30 21:46:02,240 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [net.ipv4.tcp_max_syn_backlog]
2019-04-30 21:46:02,241 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w net.ipv4.tcp_max_syn_backlog="8192"' in directory '/root'
2019-04-30 21:46:02,250 [salt.state       :300 ][INFO    ][3479] {'net.ipv4.tcp_max_syn_backlog': 8192}
2019-04-30 21:46:02,251 [salt.state       :1951][INFO    ][3479] Completed state [net.ipv4.tcp_max_syn_backlog] at time 21:46:02.251391 duration_in_ms=11.655
2019-04-30 21:46:02,251 [salt.state       :1780][INFO    ][3479] Running state [net.ipv4.conf.all.rp_filter] at time 21:46:02.251850
2019-04-30 21:46:02,252 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [net.ipv4.conf.all.rp_filter]
2019-04-30 21:46:02,405 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w net.ipv4.conf.all.rp_filter="1"' in directory '/root'
2019-04-30 21:46:02,417 [salt.state       :300 ][INFO    ][3479] {'net.ipv4.conf.all.rp_filter': 1}
2019-04-30 21:46:02,417 [salt.state       :1951][INFO    ][3479] Completed state [net.ipv4.conf.all.rp_filter] at time 21:46:02.417683 duration_in_ms=165.833
2019-04-30 21:46:02,418 [salt.state       :1780][INFO    ][3479] Running state [net.ipv4.conf.all.accept_source_route] at time 21:46:02.418208
2019-04-30 21:46:02,418 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [net.ipv4.conf.all.accept_source_route]
2019-04-30 21:46:02,419 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w net.ipv4.conf.all.accept_source_route="0"' in directory '/root'
2019-04-30 21:46:02,428 [salt.state       :300 ][INFO    ][3479] {'net.ipv4.conf.all.accept_source_route': 0}
2019-04-30 21:46:02,428 [salt.state       :1951][INFO    ][3479] Completed state [net.ipv4.conf.all.accept_source_route] at time 21:46:02.428633 duration_in_ms=10.425
2019-04-30 21:46:02,429 [salt.state       :1780][INFO    ][3479] Running state [net.ipv4.tcp_retries2] at time 21:46:02.429096
2019-04-30 21:46:02,429 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [net.ipv4.tcp_retries2]
2019-04-30 21:46:02,531 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w net.ipv4.tcp_retries2="5"' in directory '/root'
2019-04-30 21:46:02,542 [salt.state       :300 ][INFO    ][3479] {'net.ipv4.tcp_retries2': 5}
2019-04-30 21:46:02,542 [salt.state       :1951][INFO    ][3479] Completed state [net.ipv4.tcp_retries2] at time 21:46:02.542898 duration_in_ms=113.802
2019-04-30 21:46:02,543 [salt.state       :1780][INFO    ][3479] Running state [net.core.netdev_max_backlog] at time 21:46:02.543389
2019-04-30 21:46:02,543 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [net.core.netdev_max_backlog]
2019-04-30 21:46:02,545 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w net.core.netdev_max_backlog="261144"' in directory '/root'
2019-04-30 21:46:02,554 [salt.state       :300 ][INFO    ][3479] {'net.core.netdev_max_backlog': 261144}
2019-04-30 21:46:02,555 [salt.state       :1951][INFO    ][3479] Completed state [net.core.netdev_max_backlog] at time 21:46:02.555172 duration_in_ms=11.783
2019-04-30 21:46:02,555 [salt.state       :1780][INFO    ][3479] Running state [vm.swappiness] at time 21:46:02.555662
2019-04-30 21:46:02,556 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [vm.swappiness]
2019-04-30 21:46:02,560 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w vm.swappiness="10"' in directory '/root'
2019-04-30 21:46:02,571 [salt.state       :300 ][INFO    ][3479] {'vm.swappiness': 10}
2019-04-30 21:46:02,572 [salt.state       :1951][INFO    ][3479] Completed state [vm.swappiness] at time 21:46:02.572191 duration_in_ms=16.528
2019-04-30 21:46:02,572 [salt.state       :1780][INFO    ][3479] Running state [net.ipv4.conf.all.secure_redirects] at time 21:46:02.572655
2019-04-30 21:46:02,573 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [net.ipv4.conf.all.secure_redirects]
2019-04-30 21:46:02,574 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w net.ipv4.conf.all.secure_redirects="0"' in directory '/root'
2019-04-30 21:46:02,584 [salt.state       :300 ][INFO    ][3479] {'net.ipv4.conf.all.secure_redirects': 0}
2019-04-30 21:46:02,584 [salt.state       :1951][INFO    ][3479] Completed state [net.ipv4.conf.all.secure_redirects] at time 21:46:02.584628 duration_in_ms=11.972
2019-04-30 21:46:02,585 [salt.state       :1780][INFO    ][3479] Running state [net.ipv4.neigh.default.gc_thresh1] at time 21:46:02.585103
2019-04-30 21:46:02,585 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [net.ipv4.neigh.default.gc_thresh1]
2019-04-30 21:46:02,586 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w net.ipv4.neigh.default.gc_thresh1="4096"' in directory '/root'
2019-04-30 21:46:02,597 [salt.state       :300 ][INFO    ][3479] {'net.ipv4.neigh.default.gc_thresh1': 4096}
2019-04-30 21:46:02,597 [salt.state       :1951][INFO    ][3479] Completed state [net.ipv4.neigh.default.gc_thresh1] at time 21:46:02.597831 duration_in_ms=12.727
2019-04-30 21:46:02,598 [salt.state       :1780][INFO    ][3479] Running state [net.ipv4.neigh.default.gc_thresh2] at time 21:46:02.598318
2019-04-30 21:46:02,598 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [net.ipv4.neigh.default.gc_thresh2]
2019-04-30 21:46:02,686 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w net.ipv4.neigh.default.gc_thresh2="8192"' in directory '/root'
2019-04-30 21:46:02,700 [salt.state       :300 ][INFO    ][3479] {'net.ipv4.neigh.default.gc_thresh2': 8192}
2019-04-30 21:46:02,701 [salt.state       :1951][INFO    ][3479] Completed state [net.ipv4.neigh.default.gc_thresh2] at time 21:46:02.701464 duration_in_ms=103.145
2019-04-30 21:46:02,702 [salt.state       :1780][INFO    ][3479] Running state [net.ipv4.neigh.default.gc_thresh3] at time 21:46:02.701989
2019-04-30 21:46:02,702 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [net.ipv4.neigh.default.gc_thresh3]
2019-04-30 21:46:02,722 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w net.ipv4.neigh.default.gc_thresh3="16384"' in directory '/root'
2019-04-30 21:46:02,732 [salt.state       :300 ][INFO    ][3479] {'net.ipv4.neigh.default.gc_thresh3': 16384}
2019-04-30 21:46:02,733 [salt.state       :1951][INFO    ][3479] Completed state [net.ipv4.neigh.default.gc_thresh3] at time 21:46:02.733169 duration_in_ms=31.18
2019-04-30 21:46:02,733 [salt.state       :1780][INFO    ][3479] Running state [net.ipv4.conf.default.send_redirects] at time 21:46:02.733656
2019-04-30 21:46:02,734 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [net.ipv4.conf.default.send_redirects]
2019-04-30 21:46:02,735 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w net.ipv4.conf.default.send_redirects="0"' in directory '/root'
2019-04-30 21:46:02,744 [salt.state       :300 ][INFO    ][3479] {'net.ipv4.conf.default.send_redirects': 0}
2019-04-30 21:46:02,744 [salt.state       :1951][INFO    ][3479] Completed state [net.ipv4.conf.default.send_redirects] at time 21:46:02.744575 duration_in_ms=10.919
2019-04-30 21:46:02,745 [salt.state       :1780][INFO    ][3479] Running state [net.ipv4.conf.all.accept_redirects] at time 21:46:02.745049
2019-04-30 21:46:02,745 [salt.state       :1813][INFO    ][3479] Executing state sysctl.present for [net.ipv4.conf.all.accept_redirects]
2019-04-30 21:46:02,746 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'sysctl -w net.ipv4.conf.all.accept_redirects="0"' in directory '/root'
2019-04-30 21:46:02,759 [salt.state       :300 ][INFO    ][3479] {'net.ipv4.conf.all.accept_redirects': 0}
2019-04-30 21:46:02,759 [salt.state       :1951][INFO    ][3479] Completed state [net.ipv4.conf.all.accept_redirects] at time 21:46:02.759728 duration_in_ms=14.678
2019-04-30 21:46:02,760 [salt.state       :1780][INFO    ][3479] Running state [linux_sysfs_package] at time 21:46:02.760226
2019-04-30 21:46:02,760 [salt.state       :1813][INFO    ][3479] Executing state pkg.installed for [linux_sysfs_package]
2019-04-30 21:46:02,777 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['apt-cache', '-q', 'policy', 'sysfsutils'] in directory '/root'
2019-04-30 21:46:02,820 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['apt-get', '-q', 'update'] in directory '/root'
2019-04-30 21:46:04,429 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2019-04-30 21:46:04,447 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'sysfsutils'] in directory '/root'
2019-04-30 21:46:08,215 [salt.minion      :1308][INFO    ][3390] User sudo_ubuntu Executing command saltutil.find_job with jid 20190430214608199739
2019-04-30 21:46:08,227 [salt.minion      :1432][INFO    ][6148] Starting a new job with PID 6148
2019-04-30 21:46:08,331 [salt.minion      :1711][INFO    ][6148] Returning information for job: 20190430214608199739
2019-04-30 21:46:12,146 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2019-04-30 21:46:12,173 [salt.state       :300 ][INFO    ][3479] Made the following changes:
'libsysfs2' changed from 'absent' to '2.1.0+repack-4'
'sysfsutils' changed from 'absent' to '2.1.0+repack-4'

2019-04-30 21:46:12,189 [salt.state       :915 ][INFO    ][3479] Loading fresh modules for state activity
2019-04-30 21:46:12,219 [salt.state       :1951][INFO    ][3479] Completed state [linux_sysfs_package] at time 21:46:12.219127 duration_in_ms=9458.899
2019-04-30 21:46:12,223 [salt.state       :1780][INFO    ][3479] Running state [/etc/sysfs.d] at time 21:46:12.222979
2019-04-30 21:46:12,223 [salt.state       :1813][INFO    ][3479] Executing state file.directory for [/etc/sysfs.d]
2019-04-30 21:46:12,226 [salt.state       :300 ][INFO    ][3479] Directory /etc/sysfs.d is in the correct state
Directory /etc/sysfs.d updated
2019-04-30 21:46:12,226 [salt.state       :1951][INFO    ][3479] Completed state [/etc/sysfs.d] at time 21:46:12.226195 duration_in_ms=3.215
2019-04-30 21:46:12,544 [salt.state       :1780][INFO    ][3479] Running state [ondemand] at time 21:46:12.544585
2019-04-30 21:46:12,544 [salt.state       :1813][INFO    ][3479] Executing state service.dead for [ondemand]
2019-04-30 21:46:12,545 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['systemctl', 'status', 'ondemand.service', '-n', '0'] in directory '/root'
2019-04-30 21:46:12,556 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['systemctl', 'is-active', 'ondemand.service'] in directory '/root'
2019-04-30 21:46:12,566 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['systemctl', 'is-enabled', 'ondemand.service'] in directory '/root'
2019-04-30 21:46:12,577 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['systemd-run', '--scope', 'systemctl', 'stop', 'ondemand.service'] in directory '/root'
2019-04-30 21:46:12,614 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['systemctl', 'is-active', 'ondemand.service'] in directory '/root'
2019-04-30 21:46:12,626 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['systemctl', 'is-enabled', 'ondemand.service'] in directory '/root'
2019-04-30 21:46:12,637 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['systemctl', 'is-enabled', 'ondemand.service'] in directory '/root'
2019-04-30 21:46:12,656 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['systemd-run', '--scope', '/usr/sbin/update-rc.d', '-f', 'ondemand', 'remove'] in directory '/root'
2019-04-30 21:46:12,768 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['systemctl', 'is-enabled', 'ondemand.service'] in directory '/root'
2019-04-30 21:46:12,784 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'runlevel' in directory '/root'
2019-04-30 21:46:12,794 [salt.state       :300 ][INFO    ][3479] {'ondemand': True}
2019-04-30 21:46:12,795 [salt.state       :1951][INFO    ][3479] Completed state [ondemand] at time 21:46:12.795289 duration_in_ms=250.704
2019-04-30 21:46:12,796 [salt.state       :1780][INFO    ][3479] Running state [en_US.UTF-8] at time 21:46:12.796348
2019-04-30 21:46:12,796 [salt.state       :1813][INFO    ][3479] Executing state locale.present for [en_US.UTF-8]
2019-04-30 21:46:12,797 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'locale -a' in directory '/root'
2019-04-30 21:46:12,807 [salt.state       :300 ][INFO    ][3479] Locale en_US.UTF-8 is already present
2019-04-30 21:46:12,807 [salt.state       :1951][INFO    ][3479] Completed state [en_US.UTF-8] at time 21:46:12.807749 duration_in_ms=11.401
2019-04-30 21:46:12,809 [salt.state       :1780][INFO    ][3479] Running state [en_US.UTF-8] at time 21:46:12.809781
2019-04-30 21:46:12,810 [salt.state       :1813][INFO    ][3479] Executing state locale.system for [en_US.UTF-8]
2019-04-30 21:46:12,810 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'localectl' in directory '/root'
2019-04-30 21:46:12,923 [salt.state       :300 ][INFO    ][3479] System locale en_US.UTF-8 already set
2019-04-30 21:46:12,924 [salt.state       :1951][INFO    ][3479] Completed state [en_US.UTF-8] at time 21:46:12.924013 duration_in_ms=114.232
2019-04-30 21:46:12,925 [salt.state       :1780][INFO    ][3479] Running state [root] at time 21:46:12.925401
2019-04-30 21:46:12,925 [salt.state       :1813][INFO    ][3479] Executing state group.present for [root]
2019-04-30 21:46:12,926 [salt.state       :300 ][INFO    ][3479] Group root is present and up to date
2019-04-30 21:46:12,927 [salt.state       :1951][INFO    ][3479] Completed state [root] at time 21:46:12.927065 duration_in_ms=1.664
2019-04-30 21:46:12,929 [salt.state       :1780][INFO    ][3479] Running state [root] at time 21:46:12.929950
2019-04-30 21:46:12,930 [salt.state       :1813][INFO    ][3479] Executing state user.present for [root]
2019-04-30 21:46:12,935 [salt.state       :300 ][INFO    ][3479] User root is present and up to date
2019-04-30 21:46:12,936 [salt.state       :1951][INFO    ][3479] Completed state [root] at time 21:46:12.936174 duration_in_ms=6.223
2019-04-30 21:46:12,938 [salt.state       :1780][INFO    ][3479] Running state [/root] at time 21:46:12.938552
2019-04-30 21:46:12,939 [salt.state       :1813][INFO    ][3479] Executing state file.directory for [/root]
2019-04-30 21:46:12,940 [salt.state       :300 ][INFO    ][3479] Directory /root is in the correct state
Directory /root updated
2019-04-30 21:46:12,941 [salt.state       :1951][INFO    ][3479] Completed state [/root] at time 21:46:12.941584 duration_in_ms=3.031
2019-04-30 21:46:12,942 [salt.state       :1780][INFO    ][3479] Running state [/etc/sudoers.d/90-salt-user-root] at time 21:46:12.942171
2019-04-30 21:46:12,942 [salt.state       :1813][INFO    ][3479] Executing state file.absent for [/etc/sudoers.d/90-salt-user-root]
2019-04-30 21:46:12,943 [salt.state       :300 ][INFO    ][3479] File /etc/sudoers.d/90-salt-user-root is not present
2019-04-30 21:46:12,943 [salt.state       :1951][INFO    ][3479] Completed state [/etc/sudoers.d/90-salt-user-root] at time 21:46:12.943359 duration_in_ms=1.188
2019-04-30 21:46:12,943 [salt.state       :1780][INFO    ][3479] Running state [ubuntu] at time 21:46:12.943696
2019-04-30 21:46:12,944 [salt.state       :1813][INFO    ][3479] Executing state group.present for [ubuntu]
2019-04-30 21:46:12,944 [salt.state       :300 ][INFO    ][3479] Group ubuntu is present and up to date
2019-04-30 21:46:12,944 [salt.state       :1951][INFO    ][3479] Completed state [ubuntu] at time 21:46:12.944773 duration_in_ms=1.077
2019-04-30 21:46:12,945 [salt.state       :1780][INFO    ][3479] Running state [ubuntu] at time 21:46:12.945960
2019-04-30 21:46:12,946 [salt.state       :1813][INFO    ][3479] Executing state user.present for [ubuntu]
2019-04-30 21:46:12,949 [salt.state       :300 ][INFO    ][3479] {'passwd': 'XXX-REDACTED-XXX'}
2019-04-30 21:46:12,949 [salt.state       :1951][INFO    ][3479] Completed state [ubuntu] at time 21:46:12.949541 duration_in_ms=3.581
2019-04-30 21:46:12,950 [salt.state       :1780][INFO    ][3479] Running state [/home/ubuntu] at time 21:46:12.950842
2019-04-30 21:46:12,951 [salt.state       :1813][INFO    ][3479] Executing state file.directory for [/home/ubuntu]
2019-04-30 21:46:12,952 [salt.state       :300 ][INFO    ][3479] {'mode': '0700'}
2019-04-30 21:46:12,952 [salt.state       :1951][INFO    ][3479] Completed state [/home/ubuntu] at time 21:46:12.952482 duration_in_ms=1.639
2019-04-30 21:46:12,953 [salt.state       :1780][INFO    ][3479] Running state [/etc/sudoers.d/90-salt-user-ubuntu] at time 21:46:12.953529
2019-04-30 21:46:12,953 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/sudoers.d/90-salt-user-ubuntu]
2019-04-30 21:46:12,971 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/files/sudoer'
2019-04-30 21:46:12,983 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command '/usr/sbin/visudo -c -f /tmp/__salt.tmp.dHZAHx' in directory '/root'
2019-04-30 21:46:13,021 [salt.state       :300 ][INFO    ][3479] File changed:
New file
2019-04-30 21:46:13,022 [salt.state       :1951][INFO    ][3479] Completed state [/etc/sudoers.d/90-salt-user-ubuntu] at time 21:46:13.022039 duration_in_ms=68.51
2019-04-30 21:46:13,022 [salt.state       :1780][INFO    ][3479] Running state [/etc/security/limits.d/90-salt-cis.conf] at time 21:46:13.022589
2019-04-30 21:46:13,023 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/security/limits.d/90-salt-cis.conf]
2019-04-30 21:46:13,046 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/files/limits.conf'
2019-04-30 21:46:13,126 [salt.state       :300 ][INFO    ][3479] File changed:
New file
2019-04-30 21:46:13,127 [salt.state       :1951][INFO    ][3479] Completed state [/etc/security/limits.d/90-salt-cis.conf] at time 21:46:13.127275 duration_in_ms=104.686
2019-04-30 21:46:13,127 [salt.state       :1780][INFO    ][3479] Running state [/etc/security/limits.d/90-salt-default.conf] at time 21:46:13.127623
2019-04-30 21:46:13,127 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/security/limits.d/90-salt-default.conf]
2019-04-30 21:46:13,201 [salt.state       :300 ][INFO    ][3479] File changed:
New file
2019-04-30 21:46:13,202 [salt.state       :1951][INFO    ][3479] Completed state [/etc/security/limits.d/90-salt-default.conf] at time 21:46:13.202021 duration_in_ms=74.398
2019-04-30 21:46:13,202 [salt.state       :1780][INFO    ][3479] Running state [autofs] at time 21:46:13.202360
2019-04-30 21:46:13,202 [salt.state       :1813][INFO    ][3479] Executing state service.disabled for [autofs]
2019-04-30 21:46:13,203 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['systemctl', 'status', 'autofs.service', '-n', '0'] in directory '/root'
2019-04-30 21:46:13,215 [salt.state       :300 ][INFO    ][3479] The named service autofs is not available
2019-04-30 21:46:13,216 [salt.state       :1951][INFO    ][3479] Completed state [autofs] at time 21:46:13.216283 duration_in_ms=13.923
2019-04-30 21:46:13,216 [salt.state       :1780][INFO    ][3479] Running state [/etc/systemd/system.conf.d/90-salt.conf] at time 21:46:13.216737
2019-04-30 21:46:13,217 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/systemd/system.conf.d/90-salt.conf]
2019-04-30 21:46:13,232 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/files/systemd.conf'
2019-04-30 21:46:13,303 [salt.state       :300 ][INFO    ][3479] File changed:
New file
2019-04-30 21:46:13,303 [salt.state       :1951][INFO    ][3479] Completed state [/etc/systemd/system.conf.d/90-salt.conf] at time 21:46:13.303662 duration_in_ms=86.925
2019-04-30 21:46:13,305 [salt.state       :1780][INFO    ][3479] Running state [service.systemctl_reload] at time 21:46:13.305755
2019-04-30 21:46:13,306 [salt.state       :1813][INFO    ][3479] Executing state module.wait for [service.systemctl_reload]
2019-04-30 21:46:13,306 [salt.state       :300 ][INFO    ][3479] No changes made for service.systemctl_reload
2019-04-30 21:46:13,306 [salt.state       :1951][INFO    ][3479] Completed state [service.systemctl_reload] at time 21:46:13.306746 duration_in_ms=0.99
2019-04-30 21:46:13,307 [salt.state       :1780][INFO    ][3479] Running state [service.systemctl_reload] at time 21:46:13.307057
2019-04-30 21:46:13,307 [salt.state       :1813][INFO    ][3479] Executing state module.mod_watch for [service.systemctl_reload]
2019-04-30 21:46:13,307 [salt.utils.decorators:613 ][WARNING ][3479] The function "module.run" is using its deprecated version and will expire in version "Sodium".
2019-04-30 21:46:13,308 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['systemctl', '--system', 'daemon-reload'] in directory '/root'
2019-04-30 21:46:13,426 [salt.state       :300 ][INFO    ][3479] {'ret': True}
2019-04-30 21:46:13,427 [salt.state       :1951][INFO    ][3479] Completed state [service.systemctl_reload] at time 21:46:13.427446 duration_in_ms=120.389
2019-04-30 21:46:13,427 [salt.state       :1780][INFO    ][3479] Running state [/etc/shadow] at time 21:46:13.427788
2019-04-30 21:46:13,428 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/shadow]
2019-04-30 21:46:13,428 [salt.loaded.int.states.file:2298][WARNING ][3479] State for file: /etc/shadow - Neither 'source' nor 'contents' nor 'contents_pillar' nor 'contents_grains' was defined, yet 'replace' was set to 'True'. As there is no source to replace the file with, 'replace' has been set to 'False' to avoid reading the file unnecessarily.
2019-04-30 21:46:13,429 [salt.state       :300 ][INFO    ][3479] File /etc/shadow exists with proper permissions. No changes made.
2019-04-30 21:46:13,429 [salt.state       :1951][INFO    ][3479] Completed state [/etc/shadow] at time 21:46:13.429276 duration_in_ms=1.489
2019-04-30 21:46:13,429 [salt.state       :1780][INFO    ][3479] Running state [/etc/gshadow] at time 21:46:13.429465
2019-04-30 21:46:13,429 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/gshadow]
2019-04-30 21:46:13,429 [salt.loaded.int.states.file:2298][WARNING ][3479] State for file: /etc/gshadow - Neither 'source' nor 'contents' nor 'contents_pillar' nor 'contents_grains' was defined, yet 'replace' was set to 'True'. As there is no source to replace the file with, 'replace' has been set to 'False' to avoid reading the file unnecessarily.
2019-04-30 21:46:13,430 [salt.state       :300 ][INFO    ][3479] File /etc/gshadow exists with proper permissions. No changes made.
2019-04-30 21:46:13,430 [salt.state       :1951][INFO    ][3479] Completed state [/etc/gshadow] at time 21:46:13.430510 duration_in_ms=1.046
2019-04-30 21:46:13,430 [salt.state       :1780][INFO    ][3479] Running state [/etc/group-] at time 21:46:13.430684
2019-04-30 21:46:13,432 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/group-]
2019-04-30 21:46:13,432 [salt.loaded.int.states.file:2298][WARNING ][3479] State for file: /etc/group- - Neither 'source' nor 'contents' nor 'contents_pillar' nor 'contents_grains' was defined, yet 'replace' was set to 'True'. As there is no source to replace the file with, 'replace' has been set to 'False' to avoid reading the file unnecessarily.
2019-04-30 21:46:13,432 [salt.state       :300 ][INFO    ][3479] File /etc/group- exists with proper permissions. No changes made.
2019-04-30 21:46:13,433 [salt.state       :1951][INFO    ][3479] Completed state [/etc/group-] at time 21:46:13.433042 duration_in_ms=2.358
2019-04-30 21:46:13,433 [salt.state       :1780][INFO    ][3479] Running state [/etc/shadow-] at time 21:46:13.433211
2019-04-30 21:46:13,433 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/shadow-]
2019-04-30 21:46:13,433 [salt.loaded.int.states.file:2298][WARNING ][3479] State for file: /etc/shadow- - Neither 'source' nor 'contents' nor 'contents_pillar' nor 'contents_grains' was defined, yet 'replace' was set to 'True'. As there is no source to replace the file with, 'replace' has been set to 'False' to avoid reading the file unnecessarily.
2019-04-30 21:46:13,434 [salt.state       :300 ][INFO    ][3479] File /etc/shadow- exists with proper permissions. No changes made.
2019-04-30 21:46:13,434 [salt.state       :1951][INFO    ][3479] Completed state [/etc/shadow-] at time 21:46:13.434195 duration_in_ms=0.984
2019-04-30 21:46:13,434 [salt.state       :1780][INFO    ][3479] Running state [/etc/passwd-] at time 21:46:13.434354
2019-04-30 21:46:13,434 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/passwd-]
2019-04-30 21:46:13,434 [salt.loaded.int.states.file:2298][WARNING ][3479] State for file: /etc/passwd- - Neither 'source' nor 'contents' nor 'contents_pillar' nor 'contents_grains' was defined, yet 'replace' was set to 'True'. As there is no source to replace the file with, 'replace' has been set to 'False' to avoid reading the file unnecessarily.
2019-04-30 21:46:13,435 [salt.state       :300 ][INFO    ][3479] File /etc/passwd- exists with proper permissions. No changes made.
2019-04-30 21:46:13,435 [salt.state       :1951][INFO    ][3479] Completed state [/etc/passwd-] at time 21:46:13.435347 duration_in_ms=0.993
2019-04-30 21:46:13,435 [salt.state       :1780][INFO    ][3479] Running state [/etc/passwd] at time 21:46:13.435511
2019-04-30 21:46:13,435 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/passwd]
2019-04-30 21:46:13,435 [salt.loaded.int.states.file:2298][WARNING ][3479] State for file: /etc/passwd - Neither 'source' nor 'contents' nor 'contents_pillar' nor 'contents_grains' was defined, yet 'replace' was set to 'True'. As there is no source to replace the file with, 'replace' has been set to 'False' to avoid reading the file unnecessarily.
2019-04-30 21:46:13,436 [salt.state       :300 ][INFO    ][3479] File /etc/passwd exists with proper permissions. No changes made.
2019-04-30 21:46:13,436 [salt.state       :1951][INFO    ][3479] Completed state [/etc/passwd] at time 21:46:13.436469 duration_in_ms=0.957
2019-04-30 21:46:13,436 [salt.state       :1780][INFO    ][3479] Running state [/etc/gshadow-] at time 21:46:13.436628
2019-04-30 21:46:13,436 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/gshadow-]
2019-04-30 21:46:13,437 [salt.loaded.int.states.file:2298][WARNING ][3479] State for file: /etc/gshadow- - Neither 'source' nor 'contents' nor 'contents_pillar' nor 'contents_grains' was defined, yet 'replace' was set to 'True'. As there is no source to replace the file with, 'replace' has been set to 'False' to avoid reading the file unnecessarily.
2019-04-30 21:46:13,437 [salt.state       :300 ][INFO    ][3479] File /etc/gshadow- exists with proper permissions. No changes made.
2019-04-30 21:46:13,437 [salt.state       :1951][INFO    ][3479] Completed state [/etc/gshadow-] at time 21:46:13.437599 duration_in_ms=0.971
2019-04-30 21:46:13,437 [salt.state       :1780][INFO    ][3479] Running state [/etc/group] at time 21:46:13.437759
2019-04-30 21:46:13,437 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/group]
2019-04-30 21:46:13,438 [salt.loaded.int.states.file:2298][WARNING ][3479] State for file: /etc/group - Neither 'source' nor 'contents' nor 'contents_pillar' nor 'contents_grains' was defined, yet 'replace' was set to 'True'. As there is no source to replace the file with, 'replace' has been set to 'False' to avoid reading the file unnecessarily.
2019-04-30 21:46:13,438 [salt.state       :300 ][INFO    ][3479] File /etc/group exists with proper permissions. No changes made.
2019-04-30 21:46:13,438 [salt.state       :1951][INFO    ][3479] Completed state [/etc/group] at time 21:46:13.438771 duration_in_ms=1.012
2019-04-30 21:46:13,439 [salt.state       :1780][INFO    ][3479] Running state [/etc/issue] at time 21:46:13.439003
2019-04-30 21:46:13,439 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/issue]
2019-04-30 21:46:13,446 [salt.state       :300 ][INFO    ][3479] File changed:
--- 
+++ 
@@ -1,2 +1,9 @@
-Ubuntu 16.04.6 LTS \n \l
-
+=================================== WARNING ====================================
+You have accessed a computer managed by OPNFV.
+You are required to have authorization from OPNFV
+before you proceed and you are strictly limited to use set out within that
+authorization. Unauthorized access to or misuse of this system is prohibited
+and constitutes an offence under the Computer Misuse Act 1990.
+If you disclose any information obtained through this system without authority
+OPNFV may take legal action against you.
+================================================================================

2019-04-30 21:46:13,447 [salt.state       :1951][INFO    ][3479] Completed state [/etc/issue] at time 21:46:13.447022 duration_in_ms=8.019
2019-04-30 21:46:13,447 [salt.state       :1780][INFO    ][3479] Running state [/etc/hostname] at time 21:46:13.447227
2019-04-30 21:46:13,447 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/hostname]
2019-04-30 21:46:13,462 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/files/hostname'
2019-04-30 21:46:13,469 [salt.state       :300 ][INFO    ][3479] File changed:
--- 
+++ 
@@ -1 +1 @@
-ubuntu
+prx01

2019-04-30 21:46:13,469 [salt.state       :1951][INFO    ][3479] Completed state [/etc/hostname] at time 21:46:13.469797 duration_in_ms=22.57
2019-04-30 21:46:13,471 [salt.state       :1780][INFO    ][3479] Running state [hostname prx01] at time 21:46:13.471906
2019-04-30 21:46:13,472 [salt.state       :1813][INFO    ][3479] Executing state cmd.run for [hostname prx01]
2019-04-30 21:46:13,472 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'test "$(hostname)" = "prx01"' in directory '/root'
2019-04-30 21:46:13,482 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'hostname prx01' in directory '/root'
2019-04-30 21:46:13,490 [salt.state       :300 ][INFO    ][3479] {'pid': 6518, 'retcode': 0, 'stderr': '', 'stdout': ''}
2019-04-30 21:46:13,491 [salt.state       :1951][INFO    ][3479] Completed state [hostname prx01] at time 21:46:13.491030 duration_in_ms=19.124
2019-04-30 21:46:13,492 [salt.state       :1780][INFO    ][3479] Running state [mdb02] at time 21:46:13.492289
2019-04-30 21:46:13,492 [salt.state       :1813][INFO    ][3479] Executing state host.present for [mdb02]
2019-04-30 21:46:13,493 [salt.state       :300 ][INFO    ][3479] {'host': 'mdb02'}
2019-04-30 21:46:13,493 [salt.state       :1951][INFO    ][3479] Completed state [mdb02] at time 21:46:13.493557 duration_in_ms=1.268
2019-04-30 21:46:13,493 [salt.state       :1780][INFO    ][3479] Running state [mdb02.mcp-ovs-ha.local] at time 21:46:13.493834
2019-04-30 21:46:13,494 [salt.state       :1813][INFO    ][3479] Executing state host.present for [mdb02.mcp-ovs-ha.local]
2019-04-30 21:46:13,494 [salt.state       :300 ][INFO    ][3479] {'host': 'mdb02.mcp-ovs-ha.local'}
2019-04-30 21:46:13,494 [salt.state       :1951][INFO    ][3479] Completed state [mdb02.mcp-ovs-ha.local] at time 21:46:13.494916 duration_in_ms=1.083
2019-04-30 21:46:13,495 [salt.state       :1780][INFO    ][3479] Running state [mdb03] at time 21:46:13.495200
2019-04-30 21:46:13,495 [salt.state       :1813][INFO    ][3479] Executing state host.present for [mdb03]
2019-04-30 21:46:13,500 [salt.state       :300 ][INFO    ][3479] {'host': 'mdb03'}
2019-04-30 21:46:13,500 [salt.state       :1951][INFO    ][3479] Completed state [mdb03] at time 21:46:13.500534 duration_in_ms=5.334
2019-04-30 21:46:13,500 [salt.state       :1780][INFO    ][3479] Running state [mdb03.mcp-ovs-ha.local] at time 21:46:13.500803
2019-04-30 21:46:13,500 [salt.state       :1813][INFO    ][3479] Executing state host.present for [mdb03.mcp-ovs-ha.local]
2019-04-30 21:46:13,506 [salt.state       :300 ][INFO    ][3479] {'host': 'mdb03.mcp-ovs-ha.local'}
2019-04-30 21:46:13,506 [salt.state       :1951][INFO    ][3479] Completed state [mdb03.mcp-ovs-ha.local] at time 21:46:13.506792 duration_in_ms=5.988
2019-04-30 21:46:13,507 [salt.state       :1780][INFO    ][3479] Running state [mdb01] at time 21:46:13.507141
2019-04-30 21:46:13,507 [salt.state       :1813][INFO    ][3479] Executing state host.present for [mdb01]
2019-04-30 21:46:13,512 [salt.state       :300 ][INFO    ][3479] {'host': 'mdb01'}
2019-04-30 21:46:13,512 [salt.state       :1951][INFO    ][3479] Completed state [mdb01] at time 21:46:13.512697 duration_in_ms=5.556
2019-04-30 21:46:13,513 [salt.state       :1780][INFO    ][3479] Running state [mdb01.mcp-ovs-ha.local] at time 21:46:13.513148
2019-04-30 21:46:13,513 [salt.state       :1813][INFO    ][3479] Executing state host.present for [mdb01.mcp-ovs-ha.local]
2019-04-30 21:46:13,518 [salt.state       :300 ][INFO    ][3479] {'host': 'mdb01.mcp-ovs-ha.local'}
2019-04-30 21:46:13,518 [salt.state       :1951][INFO    ][3479] Completed state [mdb01.mcp-ovs-ha.local] at time 21:46:13.518677 duration_in_ms=5.53
2019-04-30 21:46:13,519 [salt.state       :1780][INFO    ][3479] Running state [mdb] at time 21:46:13.519042
2019-04-30 21:46:13,519 [salt.state       :1813][INFO    ][3479] Executing state host.present for [mdb]
2019-04-30 21:46:13,524 [salt.state       :300 ][INFO    ][3479] {'host': 'mdb'}
2019-04-30 21:46:13,524 [salt.state       :1951][INFO    ][3479] Completed state [mdb] at time 21:46:13.524613 duration_in_ms=5.571
2019-04-30 21:46:13,524 [salt.state       :1780][INFO    ][3479] Running state [mdb.mcp-ovs-ha.local] at time 21:46:13.524898
2019-04-30 21:46:13,525 [salt.state       :1813][INFO    ][3479] Executing state host.present for [mdb.mcp-ovs-ha.local]
2019-04-30 21:46:13,530 [salt.state       :300 ][INFO    ][3479] {'host': 'mdb.mcp-ovs-ha.local'}
2019-04-30 21:46:13,530 [salt.state       :1951][INFO    ][3479] Completed state [mdb.mcp-ovs-ha.local] at time 21:46:13.530612 duration_in_ms=5.714
2019-04-30 21:46:13,531 [salt.state       :1780][INFO    ][3479] Running state [cfg01] at time 21:46:13.531027
2019-04-30 21:46:13,531 [salt.state       :1813][INFO    ][3479] Executing state host.present for [cfg01]
2019-04-30 21:46:13,536 [salt.state       :300 ][INFO    ][3479] {'host': 'cfg01'}
2019-04-30 21:46:13,536 [salt.state       :1951][INFO    ][3479] Completed state [cfg01] at time 21:46:13.536649 duration_in_ms=5.622
2019-04-30 21:46:13,536 [salt.state       :1780][INFO    ][3479] Running state [cfg01.mcp-ovs-ha.local] at time 21:46:13.536937
2019-04-30 21:46:13,537 [salt.state       :1813][INFO    ][3479] Executing state host.present for [cfg01.mcp-ovs-ha.local]
2019-04-30 21:46:13,542 [salt.state       :300 ][INFO    ][3479] {'host': 'cfg01.mcp-ovs-ha.local'}
2019-04-30 21:46:13,542 [salt.state       :1951][INFO    ][3479] Completed state [cfg01.mcp-ovs-ha.local] at time 21:46:13.542610 duration_in_ms=5.672
2019-04-30 21:46:13,542 [salt.state       :1780][INFO    ][3479] Running state [prx01] at time 21:46:13.542910
2019-04-30 21:46:13,543 [salt.state       :1813][INFO    ][3479] Executing state host.present for [prx01]
2019-04-30 21:46:13,590 [salt.state       :300 ][INFO    ][3479] {'host': 'prx01'}
2019-04-30 21:46:13,591 [salt.state       :1951][INFO    ][3479] Completed state [prx01] at time 21:46:13.590943 duration_in_ms=48.032
2019-04-30 21:46:13,592 [salt.state       :1780][INFO    ][3479] Running state [prx01.mcp-ovs-ha.local] at time 21:46:13.591995
2019-04-30 21:46:13,592 [salt.state       :1813][INFO    ][3479] Executing state host.present for [prx01.mcp-ovs-ha.local]
2019-04-30 21:46:13,601 [salt.state       :300 ][INFO    ][3479] {'host': 'prx01.mcp-ovs-ha.local'}
2019-04-30 21:46:13,602 [salt.state       :1951][INFO    ][3479] Completed state [prx01.mcp-ovs-ha.local] at time 21:46:13.602200 duration_in_ms=10.204
2019-04-30 21:46:13,602 [salt.state       :1780][INFO    ][3479] Running state [kvm01] at time 21:46:13.602724
2019-04-30 21:46:13,603 [salt.state       :1813][INFO    ][3479] Executing state host.present for [kvm01]
2019-04-30 21:46:13,607 [salt.state       :300 ][INFO    ][3479] {'host': 'kvm01'}
2019-04-30 21:46:13,608 [salt.state       :1951][INFO    ][3479] Completed state [kvm01] at time 21:46:13.608168 duration_in_ms=5.444
2019-04-30 21:46:13,608 [salt.state       :1780][INFO    ][3479] Running state [kvm01.mcp-ovs-ha.local] at time 21:46:13.608685
2019-04-30 21:46:13,609 [salt.state       :1813][INFO    ][3479] Executing state host.present for [kvm01.mcp-ovs-ha.local]
2019-04-30 21:46:13,614 [salt.state       :300 ][INFO    ][3479] {'host': 'kvm01.mcp-ovs-ha.local'}
2019-04-30 21:46:13,614 [salt.state       :1951][INFO    ][3479] Completed state [kvm01.mcp-ovs-ha.local] at time 21:46:13.614221 duration_in_ms=5.537
2019-04-30 21:46:13,614 [salt.state       :1780][INFO    ][3479] Running state [kvm03] at time 21:46:13.614741
2019-04-30 21:46:13,615 [salt.state       :1813][INFO    ][3479] Executing state host.present for [kvm03]
2019-04-30 21:46:13,620 [salt.state       :300 ][INFO    ][3479] {'host': 'kvm03'}
2019-04-30 21:46:13,620 [salt.state       :1951][INFO    ][3479] Completed state [kvm03] at time 21:46:13.620203 duration_in_ms=5.462
2019-04-30 21:46:13,620 [salt.state       :1780][INFO    ][3479] Running state [kvm03.mcp-ovs-ha.local] at time 21:46:13.620552
2019-04-30 21:46:13,620 [salt.state       :1813][INFO    ][3479] Executing state host.present for [kvm03.mcp-ovs-ha.local]
2019-04-30 21:46:13,625 [salt.state       :300 ][INFO    ][3479] {'host': 'kvm03.mcp-ovs-ha.local'}
2019-04-30 21:46:13,626 [salt.state       :1951][INFO    ][3479] Completed state [kvm03.mcp-ovs-ha.local] at time 21:46:13.626119 duration_in_ms=5.567
2019-04-30 21:46:13,626 [salt.state       :1780][INFO    ][3479] Running state [kvm02] at time 21:46:13.626558
2019-04-30 21:46:13,626 [salt.state       :1813][INFO    ][3479] Executing state host.present for [kvm02]
2019-04-30 21:46:13,631 [salt.state       :300 ][INFO    ][3479] {'host': 'kvm02'}
2019-04-30 21:46:13,632 [salt.state       :1951][INFO    ][3479] Completed state [kvm02] at time 21:46:13.632054 duration_in_ms=5.496
2019-04-30 21:46:13,632 [salt.state       :1780][INFO    ][3479] Running state [kvm02.mcp-ovs-ha.local] at time 21:46:13.632335
2019-04-30 21:46:13,632 [salt.state       :1813][INFO    ][3479] Executing state host.present for [kvm02.mcp-ovs-ha.local]
2019-04-30 21:46:13,637 [salt.state       :300 ][INFO    ][3479] {'host': 'kvm02.mcp-ovs-ha.local'}
2019-04-30 21:46:13,638 [salt.state       :1951][INFO    ][3479] Completed state [kvm02.mcp-ovs-ha.local] at time 21:46:13.638090 duration_in_ms=5.755
2019-04-30 21:46:13,638 [salt.state       :1780][INFO    ][3479] Running state [dbs] at time 21:46:13.638377
2019-04-30 21:46:13,638 [salt.state       :1813][INFO    ][3479] Executing state host.present for [dbs]
2019-04-30 21:46:13,643 [salt.state       :300 ][INFO    ][3479] {'host': 'dbs'}
2019-04-30 21:46:13,644 [salt.state       :1951][INFO    ][3479] Completed state [dbs] at time 21:46:13.644099 duration_in_ms=5.722
2019-04-30 21:46:13,644 [salt.state       :1780][INFO    ][3479] Running state [dbs.mcp-ovs-ha.local] at time 21:46:13.644397
2019-04-30 21:46:13,644 [salt.state       :1813][INFO    ][3479] Executing state host.present for [dbs.mcp-ovs-ha.local]
2019-04-30 21:46:13,650 [salt.state       :300 ][INFO    ][3479] {'host': 'dbs.mcp-ovs-ha.local'}
2019-04-30 21:46:13,650 [salt.state       :1951][INFO    ][3479] Completed state [dbs.mcp-ovs-ha.local] at time 21:46:13.650183 duration_in_ms=5.786
2019-04-30 21:46:13,650 [salt.state       :1780][INFO    ][3479] Running state [prx] at time 21:46:13.650508
2019-04-30 21:46:13,650 [salt.state       :1813][INFO    ][3479] Executing state host.present for [prx]
2019-04-30 21:46:13,656 [salt.state       :300 ][INFO    ][3479] {'host': 'prx'}
2019-04-30 21:46:13,656 [salt.state       :1951][INFO    ][3479] Completed state [prx] at time 21:46:13.656248 duration_in_ms=5.74
2019-04-30 21:46:13,656 [salt.state       :1780][INFO    ][3479] Running state [prx.mcp-ovs-ha.local] at time 21:46:13.656584
2019-04-30 21:46:13,656 [salt.state       :1813][INFO    ][3479] Executing state host.present for [prx.mcp-ovs-ha.local]
2019-04-30 21:46:13,662 [salt.state       :300 ][INFO    ][3479] {'host': 'prx.mcp-ovs-ha.local'}
2019-04-30 21:46:13,662 [salt.state       :1951][INFO    ][3479] Completed state [prx.mcp-ovs-ha.local] at time 21:46:13.662224 duration_in_ms=5.64
2019-04-30 21:46:13,662 [salt.state       :1780][INFO    ][3479] Running state [prx02] at time 21:46:13.662548
2019-04-30 21:46:13,662 [salt.state       :1813][INFO    ][3479] Executing state host.present for [prx02]
2019-04-30 21:46:13,667 [salt.state       :300 ][INFO    ][3479] {'host': 'prx02'}
2019-04-30 21:46:13,668 [salt.state       :1951][INFO    ][3479] Completed state [prx02] at time 21:46:13.668121 duration_in_ms=5.573
2019-04-30 21:46:13,668 [salt.state       :1780][INFO    ][3479] Running state [prx02.mcp-ovs-ha.local] at time 21:46:13.668418
2019-04-30 21:46:13,668 [salt.state       :1813][INFO    ][3479] Executing state host.present for [prx02.mcp-ovs-ha.local]
2019-04-30 21:46:13,673 [salt.state       :300 ][INFO    ][3479] {'host': 'prx02.mcp-ovs-ha.local'}
2019-04-30 21:46:13,674 [salt.state       :1951][INFO    ][3479] Completed state [prx02.mcp-ovs-ha.local] at time 21:46:13.674128 duration_in_ms=5.709
2019-04-30 21:46:13,674 [salt.state       :1780][INFO    ][3479] Running state [msg02] at time 21:46:13.674419
2019-04-30 21:46:13,674 [salt.state       :1813][INFO    ][3479] Executing state host.present for [msg02]
2019-04-30 21:46:13,679 [salt.state       :300 ][INFO    ][3479] {'host': 'msg02'}
2019-04-30 21:46:13,680 [salt.state       :1951][INFO    ][3479] Completed state [msg02] at time 21:46:13.680102 duration_in_ms=5.683
2019-04-30 21:46:13,680 [salt.state       :1780][INFO    ][3479] Running state [msg02.mcp-ovs-ha.local] at time 21:46:13.680380
2019-04-30 21:46:13,680 [salt.state       :1813][INFO    ][3479] Executing state host.present for [msg02.mcp-ovs-ha.local]
2019-04-30 21:46:13,686 [salt.state       :300 ][INFO    ][3479] {'host': 'msg02.mcp-ovs-ha.local'}
2019-04-30 21:46:13,686 [salt.state       :1951][INFO    ][3479] Completed state [msg02.mcp-ovs-ha.local] at time 21:46:13.686140 duration_in_ms=5.759
2019-04-30 21:46:13,686 [salt.state       :1780][INFO    ][3479] Running state [msg03] at time 21:46:13.686437
2019-04-30 21:46:13,686 [salt.state       :1813][INFO    ][3479] Executing state host.present for [msg03]
2019-04-30 21:46:13,732 [salt.state       :300 ][INFO    ][3479] {'host': 'msg03'}
2019-04-30 21:46:13,732 [salt.state       :1951][INFO    ][3479] Completed state [msg03] at time 21:46:13.732435 duration_in_ms=45.998
2019-04-30 21:46:13,732 [salt.state       :1780][INFO    ][3479] Running state [msg03.mcp-ovs-ha.local] at time 21:46:13.732803
2019-04-30 21:46:13,733 [salt.state       :1813][INFO    ][3479] Executing state host.present for [msg03.mcp-ovs-ha.local]
2019-04-30 21:46:13,740 [salt.state       :300 ][INFO    ][3479] {'host': 'msg03.mcp-ovs-ha.local'}
2019-04-30 21:46:13,740 [salt.state       :1951][INFO    ][3479] Completed state [msg03.mcp-ovs-ha.local] at time 21:46:13.740336 duration_in_ms=7.532
2019-04-30 21:46:13,740 [salt.state       :1780][INFO    ][3479] Running state [msg01] at time 21:46:13.740647
2019-04-30 21:46:13,740 [salt.state       :1813][INFO    ][3479] Executing state host.present for [msg01]
2019-04-30 21:46:13,746 [salt.state       :300 ][INFO    ][3479] {'host': 'msg01'}
2019-04-30 21:46:13,746 [salt.state       :1951][INFO    ][3479] Completed state [msg01] at time 21:46:13.746279 duration_in_ms=5.632
2019-04-30 21:46:13,746 [salt.state       :1780][INFO    ][3479] Running state [msg01.mcp-ovs-ha.local] at time 21:46:13.746571
2019-04-30 21:46:13,746 [salt.state       :1813][INFO    ][3479] Executing state host.present for [msg01.mcp-ovs-ha.local]
2019-04-30 21:46:13,752 [salt.state       :300 ][INFO    ][3479] {'host': 'msg01.mcp-ovs-ha.local'}
2019-04-30 21:46:13,752 [salt.state       :1951][INFO    ][3479] Completed state [msg01.mcp-ovs-ha.local] at time 21:46:13.752264 duration_in_ms=5.693
2019-04-30 21:46:13,752 [salt.state       :1780][INFO    ][3479] Running state [msg] at time 21:46:13.752563
2019-04-30 21:46:13,752 [salt.state       :1813][INFO    ][3479] Executing state host.present for [msg]
2019-04-30 21:46:13,758 [salt.state       :300 ][INFO    ][3479] {'host': 'msg'}
2019-04-30 21:46:13,758 [salt.state       :1951][INFO    ][3479] Completed state [msg] at time 21:46:13.758279 duration_in_ms=5.716
2019-04-30 21:46:13,758 [salt.state       :1780][INFO    ][3479] Running state [msg.mcp-ovs-ha.local] at time 21:46:13.758571
2019-04-30 21:46:13,758 [salt.state       :1813][INFO    ][3479] Executing state host.present for [msg.mcp-ovs-ha.local]
2019-04-30 21:46:13,764 [salt.state       :300 ][INFO    ][3479] {'host': 'msg.mcp-ovs-ha.local'}
2019-04-30 21:46:13,764 [salt.state       :1951][INFO    ][3479] Completed state [msg.mcp-ovs-ha.local] at time 21:46:13.764316 duration_in_ms=5.745
2019-04-30 21:46:13,764 [salt.state       :1780][INFO    ][3479] Running state [cfg01] at time 21:46:13.764610
2019-04-30 21:46:13,764 [salt.state       :1813][INFO    ][3479] Executing state host.present for [cfg01]
2019-04-30 21:46:13,765 [salt.state       :300 ][INFO    ][3479] Host cfg01 (10.167.4.11) already present
2019-04-30 21:46:13,765 [salt.state       :1951][INFO    ][3479] Completed state [cfg01] at time 21:46:13.765261 duration_in_ms=0.651
2019-04-30 21:46:13,765 [salt.state       :1780][INFO    ][3479] Running state [cfg01.mcp-ovs-ha.local] at time 21:46:13.765536
2019-04-30 21:46:13,765 [salt.state       :1813][INFO    ][3479] Executing state host.present for [cfg01.mcp-ovs-ha.local]
2019-04-30 21:46:13,766 [salt.state       :300 ][INFO    ][3479] Host cfg01.mcp-ovs-ha.local (10.167.4.11) already present
2019-04-30 21:46:13,766 [salt.state       :1951][INFO    ][3479] Completed state [cfg01.mcp-ovs-ha.local] at time 21:46:13.766200 duration_in_ms=0.663
2019-04-30 21:46:13,766 [salt.state       :1780][INFO    ][3479] Running state [cmp002] at time 21:46:13.766480
2019-04-30 21:46:13,766 [salt.state       :1813][INFO    ][3479] Executing state host.present for [cmp002]
2019-04-30 21:46:13,770 [salt.state       :300 ][INFO    ][3479] {'host': 'cmp002'}
2019-04-30 21:46:13,770 [salt.state       :1951][INFO    ][3479] Completed state [cmp002] at time 21:46:13.770321 duration_in_ms=3.842
2019-04-30 21:46:13,770 [salt.state       :1780][INFO    ][3479] Running state [cmp002.mcp-ovs-ha.local] at time 21:46:13.770599
2019-04-30 21:46:13,770 [salt.state       :1813][INFO    ][3479] Executing state host.present for [cmp002.mcp-ovs-ha.local]
2019-04-30 21:46:13,776 [salt.state       :300 ][INFO    ][3479] {'host': 'cmp002.mcp-ovs-ha.local'}
2019-04-30 21:46:13,776 [salt.state       :1951][INFO    ][3479] Completed state [cmp002.mcp-ovs-ha.local] at time 21:46:13.776303 duration_in_ms=5.703
2019-04-30 21:46:13,776 [salt.state       :1780][INFO    ][3479] Running state [cmp001] at time 21:46:13.776583
2019-04-30 21:46:13,776 [salt.state       :1813][INFO    ][3479] Executing state host.present for [cmp001]
2019-04-30 21:46:13,782 [salt.state       :300 ][INFO    ][3479] {'host': 'cmp001'}
2019-04-30 21:46:13,782 [salt.state       :1951][INFO    ][3479] Completed state [cmp001] at time 21:46:13.782318 duration_in_ms=5.736
2019-04-30 21:46:13,782 [salt.state       :1780][INFO    ][3479] Running state [cmp001.mcp-ovs-ha.local] at time 21:46:13.782610
2019-04-30 21:46:13,782 [salt.state       :1813][INFO    ][3479] Executing state host.present for [cmp001.mcp-ovs-ha.local]
2019-04-30 21:46:13,788 [salt.state       :300 ][INFO    ][3479] {'host': 'cmp001.mcp-ovs-ha.local'}
2019-04-30 21:46:13,788 [salt.state       :1951][INFO    ][3479] Completed state [cmp001.mcp-ovs-ha.local] at time 21:46:13.788333 duration_in_ms=5.723
2019-04-30 21:46:13,788 [salt.state       :1780][INFO    ][3479] Running state [dbs01] at time 21:46:13.788622
2019-04-30 21:46:13,788 [salt.state       :1813][INFO    ][3479] Executing state host.present for [dbs01]
2019-04-30 21:46:13,794 [salt.state       :300 ][INFO    ][3479] {'host': 'dbs01'}
2019-04-30 21:46:13,794 [salt.state       :1951][INFO    ][3479] Completed state [dbs01] at time 21:46:13.794314 duration_in_ms=5.692
2019-04-30 21:46:13,794 [salt.state       :1780][INFO    ][3479] Running state [dbs01.mcp-ovs-ha.local] at time 21:46:13.794614
2019-04-30 21:46:13,794 [salt.state       :1813][INFO    ][3479] Executing state host.present for [dbs01.mcp-ovs-ha.local]
2019-04-30 21:46:13,800 [salt.state       :300 ][INFO    ][3479] {'host': 'dbs01.mcp-ovs-ha.local'}
2019-04-30 21:46:13,800 [salt.state       :1951][INFO    ][3479] Completed state [dbs01.mcp-ovs-ha.local] at time 21:46:13.800361 duration_in_ms=5.747
2019-04-30 21:46:13,800 [salt.state       :1780][INFO    ][3479] Running state [dbs02] at time 21:46:13.800672
2019-04-30 21:46:13,800 [salt.state       :1813][INFO    ][3479] Executing state host.present for [dbs02]
2019-04-30 21:46:13,806 [salt.state       :300 ][INFO    ][3479] {'host': 'dbs02'}
2019-04-30 21:46:13,806 [salt.state       :1951][INFO    ][3479] Completed state [dbs02] at time 21:46:13.806414 duration_in_ms=5.742
2019-04-30 21:46:13,806 [salt.state       :1780][INFO    ][3479] Running state [dbs02.mcp-ovs-ha.local] at time 21:46:13.806948
2019-04-30 21:46:13,807 [salt.state       :1813][INFO    ][3479] Executing state host.present for [dbs02.mcp-ovs-ha.local]
2019-04-30 21:46:13,812 [salt.state       :300 ][INFO    ][3479] {'host': 'dbs02.mcp-ovs-ha.local'}
2019-04-30 21:46:13,812 [salt.state       :1951][INFO    ][3479] Completed state [dbs02.mcp-ovs-ha.local] at time 21:46:13.812384 duration_in_ms=5.436
2019-04-30 21:46:13,812 [salt.state       :1780][INFO    ][3479] Running state [dbs03] at time 21:46:13.812691
2019-04-30 21:46:13,812 [salt.state       :1813][INFO    ][3479] Executing state host.present for [dbs03]
2019-04-30 21:46:13,818 [salt.state       :300 ][INFO    ][3479] {'host': 'dbs03'}
2019-04-30 21:46:13,818 [salt.state       :1951][INFO    ][3479] Completed state [dbs03] at time 21:46:13.818383 duration_in_ms=5.692
2019-04-30 21:46:13,818 [salt.state       :1780][INFO    ][3479] Running state [dbs03.mcp-ovs-ha.local] at time 21:46:13.818688
2019-04-30 21:46:13,818 [salt.state       :1813][INFO    ][3479] Executing state host.present for [dbs03.mcp-ovs-ha.local]
2019-04-30 21:46:13,836 [salt.state       :300 ][INFO    ][3479] {'host': 'dbs03.mcp-ovs-ha.local'}
2019-04-30 21:46:13,836 [salt.state       :1951][INFO    ][3479] Completed state [dbs03.mcp-ovs-ha.local] at time 21:46:13.836529 duration_in_ms=17.841
2019-04-30 21:46:13,836 [salt.state       :1780][INFO    ][3479] Running state [mas01] at time 21:46:13.836906
2019-04-30 21:46:13,837 [salt.state       :1813][INFO    ][3479] Executing state host.present for [mas01]
2019-04-30 21:46:13,869 [salt.state       :300 ][INFO    ][3479] {'host': 'mas01'}
2019-04-30 21:46:13,870 [salt.state       :1951][INFO    ][3479] Completed state [mas01] at time 21:46:13.870132 duration_in_ms=33.226
2019-04-30 21:46:13,870 [salt.state       :1780][INFO    ][3479] Running state [mas01.mcp-ovs-ha.local] at time 21:46:13.870524
2019-04-30 21:46:13,870 [salt.state       :1813][INFO    ][3479] Executing state host.present for [mas01.mcp-ovs-ha.local]
2019-04-30 21:46:13,877 [salt.state       :300 ][INFO    ][3479] {'host': 'mas01.mcp-ovs-ha.local'}
2019-04-30 21:46:13,877 [salt.state       :1951][INFO    ][3479] Completed state [mas01.mcp-ovs-ha.local] at time 21:46:13.877948 duration_in_ms=7.424
2019-04-30 21:46:13,878 [salt.state       :1780][INFO    ][3479] Running state [ctl02] at time 21:46:13.878459
2019-04-30 21:46:13,878 [salt.state       :1813][INFO    ][3479] Executing state host.present for [ctl02]
2019-04-30 21:46:13,883 [salt.state       :300 ][INFO    ][3479] {'host': 'ctl02'}
2019-04-30 21:46:13,883 [salt.state       :1951][INFO    ][3479] Completed state [ctl02] at time 21:46:13.883834 duration_in_ms=5.375
2019-04-30 21:46:13,884 [salt.state       :1780][INFO    ][3479] Running state [ctl02.mcp-ovs-ha.local] at time 21:46:13.884147
2019-04-30 21:46:13,884 [salt.state       :1813][INFO    ][3479] Executing state host.present for [ctl02.mcp-ovs-ha.local]
2019-04-30 21:46:13,889 [salt.state       :300 ][INFO    ][3479] {'host': 'ctl02.mcp-ovs-ha.local'}
2019-04-30 21:46:13,890 [salt.state       :1951][INFO    ][3479] Completed state [ctl02.mcp-ovs-ha.local] at time 21:46:13.889963 duration_in_ms=5.815
2019-04-30 21:46:13,890 [salt.state       :1780][INFO    ][3479] Running state [ctl03] at time 21:46:13.890527
2019-04-30 21:46:13,890 [salt.state       :1813][INFO    ][3479] Executing state host.present for [ctl03]
2019-04-30 21:46:13,895 [salt.state       :300 ][INFO    ][3479] {'host': 'ctl03'}
2019-04-30 21:46:13,895 [salt.state       :1951][INFO    ][3479] Completed state [ctl03] at time 21:46:13.895906 duration_in_ms=5.379
2019-04-30 21:46:13,896 [salt.state       :1780][INFO    ][3479] Running state [ctl03.mcp-ovs-ha.local] at time 21:46:13.896456
2019-04-30 21:46:13,896 [salt.state       :1813][INFO    ][3479] Executing state host.present for [ctl03.mcp-ovs-ha.local]
2019-04-30 21:46:13,901 [salt.state       :300 ][INFO    ][3479] {'host': 'ctl03.mcp-ovs-ha.local'}
2019-04-30 21:46:13,901 [salt.state       :1951][INFO    ][3479] Completed state [ctl03.mcp-ovs-ha.local] at time 21:46:13.901914 duration_in_ms=5.458
2019-04-30 21:46:13,902 [salt.state       :1780][INFO    ][3479] Running state [ctl01] at time 21:46:13.902268
2019-04-30 21:46:13,902 [salt.state       :1813][INFO    ][3479] Executing state host.present for [ctl01]
2019-04-30 21:46:13,907 [salt.state       :300 ][INFO    ][3479] {'host': 'ctl01'}
2019-04-30 21:46:13,907 [salt.state       :1951][INFO    ][3479] Completed state [ctl01] at time 21:46:13.907922 duration_in_ms=5.654
2019-04-30 21:46:13,908 [salt.state       :1780][INFO    ][3479] Running state [ctl01.mcp-ovs-ha.local] at time 21:46:13.908378
2019-04-30 21:46:13,908 [salt.state       :1813][INFO    ][3479] Executing state host.present for [ctl01.mcp-ovs-ha.local]
2019-04-30 21:46:13,913 [salt.state       :300 ][INFO    ][3479] {'host': 'ctl01.mcp-ovs-ha.local'}
2019-04-30 21:46:13,913 [salt.state       :1951][INFO    ][3479] Completed state [ctl01.mcp-ovs-ha.local] at time 21:46:13.913926 duration_in_ms=5.548
2019-04-30 21:46:13,914 [salt.state       :1780][INFO    ][3479] Running state [ctl] at time 21:46:13.914286
2019-04-30 21:46:13,914 [salt.state       :1813][INFO    ][3479] Executing state host.present for [ctl]
2019-04-30 21:46:13,919 [salt.state       :300 ][INFO    ][3479] {'host': 'ctl'}
2019-04-30 21:46:13,919 [salt.state       :1951][INFO    ][3479] Completed state [ctl] at time 21:46:13.919848 duration_in_ms=5.561
2019-04-30 21:46:13,920 [salt.state       :1780][INFO    ][3479] Running state [ctl.mcp-ovs-ha.local] at time 21:46:13.920148
2019-04-30 21:46:13,920 [salt.state       :1813][INFO    ][3479] Executing state host.present for [ctl.mcp-ovs-ha.local]
2019-04-30 21:46:13,925 [salt.state       :300 ][INFO    ][3479] {'host': 'ctl.mcp-ovs-ha.local'}
2019-04-30 21:46:13,925 [salt.state       :1951][INFO    ][3479] Completed state [ctl.mcp-ovs-ha.local] at time 21:46:13.925848 duration_in_ms=5.701
2019-04-30 21:46:13,926 [salt.state       :1780][INFO    ][3479] Running state [/etc/network/interfaces.d/50-cloud-init.cfg] at time 21:46:13.926046
2019-04-30 21:46:13,926 [salt.state       :1813][INFO    ][3479] Executing state file.absent for [/etc/network/interfaces.d/50-cloud-init.cfg]
2019-04-30 21:46:13,926 [salt.state       :300 ][INFO    ][3479] {'removed': '/etc/network/interfaces.d/50-cloud-init.cfg'}
2019-04-30 21:46:13,926 [salt.state       :1951][INFO    ][3479] Completed state [/etc/network/interfaces.d/50-cloud-init.cfg] at time 21:46:13.926699 duration_in_ms=0.653
2019-04-30 21:46:13,927 [salt.state       :1780][INFO    ][3479] Running state [ens3] at time 21:46:13.927305
2019-04-30 21:46:13,927 [salt.state       :1813][INFO    ][3479] Executing state network.managed for [ens3]
2019-04-30 21:46:14,037 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['ifup', 'ens3'] in directory '/root'
2019-04-30 21:46:14,055 [salt.loaded.int.module.cmdmod:730 ][ERROR   ][3479] Command '['ifup', 'ens3']' failed with return code: 1
2019-04-30 21:46:14,056 [salt.loaded.int.module.cmdmod:732 ][ERROR   ][3479] stdout: RTNETLINK answers: File exists
Failed to bring up ens3.
2019-04-30 21:46:14,056 [salt.loaded.int.module.cmdmod:736 ][ERROR   ][3479] retcode: 1
2019-04-30 21:46:14,458 [salt.state       :300 ][INFO    ][3479] {'interface': 'Added network interface.', 'status': 'Interface ens3 is up'}
2019-04-30 21:46:14,458 [salt.state       :1951][INFO    ][3479] Completed state [ens3] at time 21:46:14.458502 duration_in_ms=531.196
2019-04-30 21:46:14,458 [salt.state       :1780][INFO    ][3479] Running state [linux_system_network] at time 21:46:14.458849
2019-04-30 21:46:14,459 [salt.state       :1813][INFO    ][3479] Executing state network.system for [linux_system_network]
2019-04-30 21:46:14,459 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['systemctl', 'status', 'networking.service', '-n', '0'] in directory '/root'
2019-04-30 21:46:14,476 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['systemctl', 'is-active', 'networking.service'] in directory '/root'
2019-04-30 21:46:14,493 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['systemctl', 'status', 'NetworkManager.service', '-n', '0'] in directory '/root'
2019-04-30 21:46:14,507 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['systemd-run', '--scope', 'systemctl', 'enable', 'networking.service'] in directory '/root'
2019-04-30 21:46:14,883 [salt.loaded.int.module.debian_ip:1964][WARNING ][3479] The network state sls is requiring a reboot of the system to properly apply network configuration.
2019-04-30 21:46:14,883 [salt.state       :300 ][INFO    ][3479] {'network_settings': u'--- \n+++ \n@@ -1,2 +1,4 @@\n NETWORKING=yes\n\n HOSTNAME=prx01\n\n+DOMAIN=mcp-ovs-ha.local\n\n+SEARCH=maas\n'}
2019-04-30 21:46:14,884 [salt.state       :1951][INFO    ][3479] Completed state [linux_system_network] at time 21:46:14.884132 duration_in_ms=425.283
2019-04-30 21:46:14,884 [salt.state       :1780][INFO    ][3479] Running state [ens2] at time 21:46:14.884424
2019-04-30 21:46:14,884 [salt.state       :1813][INFO    ][3479] Executing state network.managed for [ens2]
2019-04-30 21:46:14,903 [salt.state       :300 ][INFO    ][3479] {'interface': 'Added network interface.'}
2019-04-30 21:46:14,904 [salt.state       :1951][INFO    ][3479] Completed state [ens2] at time 21:46:14.904273 duration_in_ms=19.849
2019-04-30 21:46:14,904 [salt.state       :1780][INFO    ][3479] Running state [ens4] at time 21:46:14.904550
2019-04-30 21:46:14,904 [salt.state       :1813][INFO    ][3479] Executing state network.managed for [ens4]
2019-04-30 21:46:15,002 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['ifup', 'ens4'] in directory '/root'
2019-04-30 21:46:15,575 [salt.state       :300 ][INFO    ][3479] {'interface': 'Added network interface.', 'status': 'Interface ens4 is up'}
2019-04-30 21:46:15,575 [salt.state       :1951][INFO    ][3479] Completed state [ens4] at time 21:46:15.575887 duration_in_ms=671.336
2019-04-30 21:46:15,576 [salt.state       :1780][INFO    ][3479] Running state [/etc/udev/rules.d/60-net-txqueue.rules] at time 21:46:15.576308
2019-04-30 21:46:15,576 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/udev/rules.d/60-net-txqueue.rules]
2019-04-30 21:46:15,593 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'linux/files/60-net-txqueue.rules'
2019-04-30 21:46:15,602 [salt.state       :300 ][INFO    ][3479] File changed:
New file
2019-04-30 21:46:15,602 [salt.state       :1951][INFO    ][3479] Completed state [/etc/udev/rules.d/60-net-txqueue.rules] at time 21:46:15.602781 duration_in_ms=26.473
2019-04-30 21:46:15,604 [salt.state       :1780][INFO    ][3479] Running state [/bin/udevadm control --reload-rules] at time 21:46:15.604638
2019-04-30 21:46:15,605 [salt.state       :1813][INFO    ][3479] Executing state cmd.run for [/bin/udevadm control --reload-rules]
2019-04-30 21:46:15,605 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command '/bin/udevadm control --reload-rules' in directory '/root'
2019-04-30 21:46:15,621 [salt.state       :300 ][INFO    ][3479] {'pid': 6805, 'retcode': 0, 'stderr': '', 'stdout': ''}
2019-04-30 21:46:15,622 [salt.state       :1951][INFO    ][3479] Completed state [/bin/udevadm control --reload-rules] at time 21:46:15.622036 duration_in_ms=17.398
2019-04-30 21:46:15,624 [salt.state       :1780][INFO    ][3479] Running state [/bin/udevadm trigger --attr-match=subsystem=net] at time 21:46:15.624112
2019-04-30 21:46:15,624 [salt.state       :1813][INFO    ][3479] Executing state cmd.run for [/bin/udevadm trigger --attr-match=subsystem=net]
2019-04-30 21:46:15,625 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command '/bin/udevadm trigger --attr-match=subsystem=net' in directory '/root'
2019-04-30 21:46:15,651 [salt.state       :300 ][INFO    ][3479] {'pid': 6810, 'retcode': 0, 'stderr': '', 'stdout': ''}
2019-04-30 21:46:15,652 [salt.state       :1951][INFO    ][3479] Completed state [/bin/udevadm trigger --attr-match=subsystem=net] at time 21:46:15.651984 duration_in_ms=27.872
2019-04-30 21:46:15,652 [salt.state       :1780][INFO    ][3479] Running state [/etc/profile.d/proxy.sh] at time 21:46:15.652477
2019-04-30 21:46:15,652 [salt.state       :1813][INFO    ][3479] Executing state file.absent for [/etc/profile.d/proxy.sh]
2019-04-30 21:46:15,653 [salt.state       :300 ][INFO    ][3479] File /etc/profile.d/proxy.sh is not present
2019-04-30 21:46:15,653 [salt.state       :1951][INFO    ][3479] Completed state [/etc/profile.d/proxy.sh] at time 21:46:15.653760 duration_in_ms=1.283
2019-04-30 21:46:15,654 [salt.state       :1780][INFO    ][3479] Running state [/etc/apt/apt.conf.d/95proxies] at time 21:46:15.654660
2019-04-30 21:46:15,655 [salt.state       :1813][INFO    ][3479] Executing state file.absent for [/etc/apt/apt.conf.d/95proxies]
2019-04-30 21:46:15,655 [salt.state       :300 ][INFO    ][3479] File /etc/apt/apt.conf.d/95proxies is not present
2019-04-30 21:46:15,655 [salt.state       :1951][INFO    ][3479] Completed state [/etc/apt/apt.conf.d/95proxies] at time 21:46:15.655719 duration_in_ms=1.058
2019-04-30 21:46:15,657 [salt.state       :1780][INFO    ][3479] Running state [/dev/shm] at time 21:46:15.657084
2019-04-30 21:46:15,657 [salt.state       :1813][INFO    ][3479] Executing state mount.mounted for [/dev/shm]
2019-04-30 21:46:15,658 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'mount -l' in directory '/root'
2019-04-30 21:46:15,671 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'blkid' in directory '/root'
2019-04-30 21:46:15,693 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'mount -l' in directory '/root'
2019-04-30 21:46:15,706 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'mount -o rw,nosuid,nodev,noexec,relatime,remount -t tmpfs shm /dev/shm' in directory '/root'
2019-04-30 21:46:15,717 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'mount -l' in directory '/root'
2019-04-30 21:46:15,729 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'mount -o rw,nosuid,nodev,noexec,relatime,remount -t tmpfs shm /dev/shm' in directory '/root'
2019-04-30 21:46:15,743 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'mount -l' in directory '/root'
2019-04-30 21:46:15,755 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'umount /dev/shm' in directory '/root'
2019-04-30 21:46:15,765 [salt.loaded.int.module.cmdmod:730 ][ERROR   ][3479] Command '['umount', '/dev/shm']' failed with return code: 32
2019-04-30 21:46:15,766 [salt.loaded.int.module.cmdmod:734 ][ERROR   ][3479] stderr: umount: /dev/shm: target is busy
        (In some cases useful info about processes that
         use the device is found by lsof(8) or fuser(1).)
2019-04-30 21:46:15,767 [salt.loaded.int.module.cmdmod:736 ][ERROR   ][3479] retcode: 32
2019-04-30 21:46:15,767 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'mount -l' in directory '/root'
2019-04-30 21:46:15,779 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command 'blkid' in directory '/root'
2019-04-30 21:46:15,789 [salt.state       :300 ][INFO    ][3479] {'umount': "Forced unmount because devices don't match. Wanted: shm, current: tmpfs, /tmpfs"}
2019-04-30 21:46:15,789 [salt.state       :1951][INFO    ][3479] Completed state [/dev/shm] at time 21:46:15.789489 duration_in_ms=132.404
2019-04-30 21:46:15,791 [salt.state       :1780][INFO    ][3479] Running state [ntp] at time 21:46:15.791306
2019-04-30 21:46:15,791 [salt.state       :1813][INFO    ][3479] Executing state pkg.installed for [ntp]
2019-04-30 21:46:15,935 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2019-04-30 21:46:15,956 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'ntp'] in directory '/root'
2019-04-30 21:46:22,561 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2019-04-30 21:46:22,587 [salt.state       :300 ][INFO    ][3479] Made the following changes:
'ntp' changed from 'absent' to '1:4.2.8p4+dfsg-3ubuntu5.9'
'libopts25' changed from 'absent' to '1:5.18.7-3'

2019-04-30 21:46:22,603 [salt.state       :915 ][INFO    ][3479] Loading fresh modules for state activity
2019-04-30 21:46:22,631 [salt.state       :1951][INFO    ][3479] Completed state [ntp] at time 21:46:22.631115 duration_in_ms=6839.807
2019-04-30 21:46:22,635 [salt.state       :1780][INFO    ][3479] Running state [/etc/ntp.conf] at time 21:46:22.635234
2019-04-30 21:46:22,635 [salt.state       :1813][INFO    ][3479] Executing state file.managed for [/etc/ntp.conf]
2019-04-30 21:46:22,654 [salt.fileclient  :1219][INFO    ][3479] Fetching file from saltenv 'base', ** done ** 'ntp/files/ntp.conf'
2019-04-30 21:46:22,701 [salt.state       :300 ][INFO    ][3479] File changed:
--- 
+++ 
@@ -1,66 +1,25 @@
-# /etc/ntp.conf, configuration for ntpd; see ntp.conf(5) for help
 
-driftfile /var/lib/ntp/ntp.drift
 
-# Enable this if you want statistics to be logged.
-#statsdir /var/log/ntpstats/
+# ntpd will only synchronize your clock.
 
-statistics loopstats peerstats clockstats
-filegen loopstats file loopstats type day enable
-filegen peerstats file peerstats type day enable
-filegen clockstats file clockstats type day enable
+# For details, see:
+# - the ntp.conf man page
+# - http://support.ntp.org/bin/view/Support/GettingStarted
+# - https://wiki.archlinux.org/index.php/Network_Time_Protocol_daemon
 
-# Specify one or more NTP servers.
+# Associate to cloud NTP pool servers
+logfile /var/log/ntp.log
+server 1.pool.ntp.org iburst
 
-# Use servers from the NTP Pool Project. Approved by Ubuntu Technical Board
-# on 2011-02-08 (LP: #104525). See http://www.pool.ntp.org/join.html for
-# more information.
-pool 0.ubuntu.pool.ntp.org iburst
-pool 1.ubuntu.pool.ntp.org iburst
-pool 2.ubuntu.pool.ntp.org iburst
-pool 3.ubuntu.pool.ntp.org iburst
+# Exchange time with everybody, but don't allow configuration.
+restrict -4 default kod nomodify notrap nopeer noquery
+restrict -6 default kod nomodify notrap nopeer noquery
 
-# Use Ubuntu's ntp server as a fallback.
-pool ntp.ubuntu.com
-
-# Access control configuration; see /usr/share/doc/ntp-doc/html/accopt.html for
-# details.  The web page <http://support.ntp.org/bin/view/Support/AccessRestrictions>
-# might also be helpful.
-#
-# Note that "restrict" applies to both servers and clients, so a configuration
-# that might be intended to block requests from certain clients could also end
-# up blocking replies from your own upstream servers.
-
-# By default, exchange time with everybody, but don't allow configuration.
-restrict -4 default kod notrap nomodify nopeer noquery limited
-restrict -6 default kod notrap nomodify nopeer noquery limited
-
-# Local users may interrogate the ntp server more closely.
+# Only allow read-only access from localhost
 restrict 127.0.0.1
 restrict ::1
 
-# Needed for adding pool entries
-restrict source notrap nomodify noquery
+# mode7 is required for collectd monitoring
 
-# Clients from this (example!) subnet have unlimited access, but only if
-# cryptographically authenticated.
-#restrict 192.168.123.0 mask 255.255.255.0 notrust
-
-
-# If you want to provide time to your local subnet, change the next line.
-# (Again, the address is an example only.)
-#broadcast 192.168.123.255
-
-# If you want to listen to time broadcasts on your local subnet, de-comment the
-# next lines.  Please do this only if you trust everybody on the network!
-#disable auth
-#broadcastclient
-
-#Changes recquired to use pps synchonisation as explained in documentation:
-#http://www.ntp.org/ntpfaq/NTP-s-config-adv.htm#AEN3918
-
-#server 127.127.8.1 mode 135 prefer    # Meinberg GPS167 with PPS
-#fudge 127.127.8.1 time1 0.0042        # relative to PPS for my hardware
-
-#server 127.127.22.1                   # ATOM(PPS)
-#fudge 127.127.22.1 flag3 1            # enable PPS API
+# Location of drift file
+driftfile /var/lib/ntp/ntp.drift

2019-04-30 21:46:22,709 [salt.state       :1951][INFO    ][3479] Completed state [/etc/ntp.conf] at time 21:46:22.708945 duration_in_ms=73.71
2019-04-30 21:46:23,113 [salt.state       :1780][INFO    ][3479] Running state [ntp] at time 21:46:23.113031
2019-04-30 21:46:23,113 [salt.state       :1813][INFO    ][3479] Executing state service.running for [ntp]
2019-04-30 21:46:23,114 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['systemctl', 'status', 'ntp.service', '-n', '0'] in directory '/root'
2019-04-30 21:46:23,128 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['systemctl', 'is-active', 'ntp.service'] in directory '/root'
2019-04-30 21:46:23,139 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['systemctl', 'is-enabled', 'ntp.service'] in directory '/root'
2019-04-30 21:46:23,151 [salt.state       :300 ][INFO    ][3479] The service ntp is already running
2019-04-30 21:46:23,152 [salt.state       :1951][INFO    ][3479] Completed state [ntp] at time 21:46:23.152331 duration_in_ms=39.299
2019-04-30 21:46:23,152 [salt.state       :1780][INFO    ][3479] Running state [ntp] at time 21:46:23.152675
2019-04-30 21:46:23,153 [salt.state       :1813][INFO    ][3479] Executing state service.mod_watch for [ntp]
2019-04-30 21:46:23,154 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['systemctl', 'is-active', 'ntp.service'] in directory '/root'
2019-04-30 21:46:23,165 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3479] Executing command ['systemd-run', '--scope', 'systemctl', 'restart', 'ntp.service'] in directory '/root'
2019-04-30 21:46:23,225 [salt.state       :300 ][INFO    ][3479] {'ntp': True}
2019-04-30 21:46:23,225 [salt.state       :1951][INFO    ][3479] Completed state [ntp] at time 21:46:23.225636 duration_in_ms=72.961
2019-04-30 21:46:23,235 [salt.minion      :1711][INFO    ][3479] Returning information for job: 20190430214452901770
2019-04-30 21:46:26,176 [salt.minion      :1308][INFO    ][3390] User sudo_ubuntu Executing command ssh.set_auth_key with jid 20190430214626163802
2019-04-30 21:46:26,194 [salt.minion      :1432][INFO    ][7782] Starting a new job with PID 7782
2019-04-30 21:46:26,211 [salt.minion      :1711][INFO    ][7782] Returning information for job: 20190430214626163802
2019-04-30 21:46:26,864 [salt.minion      :1308][INFO    ][3390] User sudo_ubuntu Executing command cmd.run with jid 20190430214626851130
2019-04-30 21:46:26,905 [salt.minion      :1432][INFO    ][7787] Starting a new job with PID 7787
2019-04-30 21:46:26,911 [salt.loader.192.168.11.2.int.module.cmdmod:395 ][INFO    ][7787] Executing command 'reboot' in directory '/root'
2019-04-30 21:46:27,196 [salt.utils.parsers:1051][WARNING ][3390] Minion received a SIGTERM. Exiting.
2019-04-30 21:46:27,196 [salt.cli.daemons :82  ][INFO    ][3390] Shutting down the Salt Minion
2019-04-30 21:46:29,967 [salt.minion      :1711][INFO    ][7787] Returning information for job: 20190430214626851130
2019-04-30 21:47:25,264 [salt.cli.daemons :293 ][INFO    ][1820] Setting up the Salt Minion "prx01.mcp-ovs-ha.local"
2019-04-30 21:47:28,731 [salt.cli.daemons :82  ][INFO    ][1820] Starting up the Salt Minion
2019-04-30 21:47:28,741 [salt.utils.event :1017][INFO    ][1820] Starting pull socket on /var/run/salt/minion/minion_event_ff902ec8d4_pull.ipc
2019-04-30 21:47:32,002 [salt.minion      :976 ][INFO    ][1820] Creating minion process manager
2019-04-30 21:47:34,357 [salt.loader.192.168.11.2.int.module.cmdmod:395 ][INFO    ][1820] Executing command ['date', '+%z'] in directory '/root'
2019-04-30 21:47:34,426 [salt.utils.schedule:568 ][INFO    ][1820] Updating job settings for scheduled job: __mine_interval
2019-04-30 21:47:34,427 [salt.minion      :1108][INFO    ][1820] Added mine.update to scheduler
2019-04-30 21:47:34,462 [salt.minion      :1975][INFO    ][1820] Minion is starting as user 'root'
2019-04-30 21:47:34,473 [salt.minion      :2336][INFO    ][1820] Minion is ready to receive requests!
2019-04-30 21:48:08,267 [salt.minion      :1308][INFO    ][1820] User sudo_ubuntu Executing command test.ping with jid 20190430214808259937
2019-04-30 21:48:08,283 [salt.minion      :1432][INFO    ][1905] Starting a new job with PID 1905
2019-04-30 21:48:08,530 [salt.minion      :1711][INFO    ][1905] Returning information for job: 20190430214808259937
2019-04-30 21:48:09,212 [salt.minion      :1308][INFO    ][1820] User sudo_ubuntu Executing command pkg.upgrade with jid 20190430214809204670
2019-04-30 21:48:09,233 [salt.minion      :1432][INFO    ][1910] Starting a new job with PID 1910
2019-04-30 21:48:09,674 [salt.loader.192.168.11.2.int.module.cmdmod:395 ][INFO    ][1910] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2019-04-30 21:48:11,766 [salt.loader.192.168.11.2.int.module.cmdmod:395 ][INFO    ][1910] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'dist-upgrade'] in directory '/root'
2019-04-30 21:48:24,270 [salt.minion      :1308][INFO    ][1820] User sudo_ubuntu Executing command saltutil.find_job with jid 20190430214824262928
2019-04-30 21:48:24,283 [salt.minion      :1432][INFO    ][2013] Starting a new job with PID 2013
2019-04-30 21:48:24,535 [salt.minion      :1711][INFO    ][2013] Returning information for job: 20190430214824262928
2019-04-30 21:48:54,497 [salt.minion      :1308][INFO    ][1820] User sudo_ubuntu Executing command saltutil.find_job with jid 20190430214854487319
2019-04-30 21:48:54,510 [salt.minion      :1432][INFO    ][2226] Starting a new job with PID 2226
2019-04-30 21:48:54,583 [salt.minion      :1711][INFO    ][2226] Returning information for job: 20190430214854487319
2019-04-30 21:49:24,591 [salt.minion      :1308][INFO    ][1820] User sudo_ubuntu Executing command saltutil.find_job with jid 20190430214924585554
2019-04-30 21:49:24,607 [salt.minion      :1432][INFO    ][2404] Starting a new job with PID 2404
2019-04-30 21:49:24,621 [salt.minion      :1711][INFO    ][2404] Returning information for job: 20190430214924585554
2019-04-30 21:49:54,784 [salt.minion      :1308][INFO    ][1820] User sudo_ubuntu Executing command saltutil.find_job with jid 20190430214954779556
2019-04-30 21:49:54,798 [salt.minion      :1432][INFO    ][2895] Starting a new job with PID 2895
2019-04-30 21:49:54,817 [salt.minion      :1711][INFO    ][2895] Returning information for job: 20190430214954779556
2019-04-30 21:50:25,002 [salt.minion      :1308][INFO    ][1820] User sudo_ubuntu Executing command saltutil.find_job with jid 20190430215024997446
2019-04-30 21:50:25,016 [salt.minion      :1432][INFO    ][3291] Starting a new job with PID 3291
2019-04-30 21:50:25,029 [salt.minion      :1711][INFO    ][3291] Returning information for job: 20190430215024997446
2019-04-30 21:50:34,884 [salt.loader.192.168.11.2.int.module.cmdmod:395 ][INFO    ][1910] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2019-04-30 21:50:34,907 [salt.minion      :1711][INFO    ][1910] Returning information for job: 20190430214809204670
2019-04-30 21:51:11,670 [salt.minion      :1308][INFO    ][1820] User sudo_ubuntu Executing command test.ping with jid 20190430215111669999
2019-04-30 21:51:11,686 [salt.minion      :1432][INFO    ][3442] Starting a new job with PID 3442
2019-04-30 21:51:11,704 [salt.minion      :1711][INFO    ][3442] Returning information for job: 20190430215111669999
2019-04-30 21:54:25,377 [salt.minion      :1308][INFO    ][1820] User sudo_ubuntu Executing command state.sls with jid 20190430215425366503
2019-04-30 21:54:25,390 [salt.minion      :1432][INFO    ][3447] Starting a new job with PID 3447
2019-04-30 21:54:28,217 [salt.state       :915 ][INFO    ][3447] Loading fresh modules for state activity
2019-04-30 21:54:28,282 [salt.fileclient  :1219][INFO    ][3447] Fetching file from saltenv 'base', ** done ** 'keepalived/init.sls'
2019-04-30 21:54:28,307 [salt.fileclient  :1219][INFO    ][3447] Fetching file from saltenv 'base', ** done ** 'keepalived/cluster.sls'
2019-04-30 21:54:29,932 [salt.state       :1780][INFO    ][3447] Running state [keepalived] at time 21:54:29.932887
2019-04-30 21:54:29,933 [salt.state       :1813][INFO    ][3447] Executing state pkg.installed for [keepalived]
2019-04-30 21:54:29,933 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3447] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2019-04-30 21:54:30,219 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3447] Executing command ['apt-cache', '-q', 'policy', 'keepalived'] in directory '/root'
2019-04-30 21:54:30,271 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3447] Executing command ['apt-get', '-q', 'update'] in directory '/root'
2019-04-30 21:54:31,945 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3447] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2019-04-30 21:54:31,961 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3447] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'keepalived'] in directory '/root'
2019-04-30 21:54:39,300 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3447] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2019-04-30 21:54:39,325 [salt.state       :300 ][INFO    ][3447] Made the following changes:
'libsnmp30' changed from 'absent' to '5.7.3+dfsg-1ubuntu4.2'
'libsensors4' changed from 'absent' to '1:3.4.0-2'
'libsnmp-base' changed from 'absent' to '5.7.3+dfsg-1ubuntu4.2'
'keepalived' changed from 'absent' to '1:1.2.24-1ubuntu0.16.04.1'
'ipvsadm' changed from 'absent' to '1:1.28-3'
'libnl-route-3-200' changed from 'absent' to '3.2.27-1ubuntu0.16.04.1'

2019-04-30 21:54:39,352 [salt.state       :915 ][INFO    ][3447] Loading fresh modules for state activity
2019-04-30 21:54:39,374 [salt.state       :1951][INFO    ][3447] Completed state [keepalived] at time 21:54:39.374581 duration_in_ms=9441.694
2019-04-30 21:54:39,378 [salt.state       :1780][INFO    ][3447] Running state [lsof] at time 21:54:39.378129
2019-04-30 21:54:39,378 [salt.state       :1813][INFO    ][3447] Executing state pkg.installed for [lsof]
2019-04-30 21:54:39,795 [salt.state       :300 ][INFO    ][3447] All specified packages are already installed
2019-04-30 21:54:39,796 [salt.state       :1951][INFO    ][3447] Completed state [lsof] at time 21:54:39.796117 duration_in_ms=417.988
2019-04-30 21:54:39,802 [salt.state       :1780][INFO    ][3447] Running state [/etc/keepalived/keepalived.conf] at time 21:54:39.802724
2019-04-30 21:54:39,802 [salt.state       :1813][INFO    ][3447] Executing state file.managed for [/etc/keepalived/keepalived.conf]
2019-04-30 21:54:39,824 [salt.fileclient  :1219][INFO    ][3447] Fetching file from saltenv 'base', ** done ** 'keepalived/files/keepalived.conf'
2019-04-30 21:54:39,857 [salt.state       :300 ][INFO    ][3447] File changed:
New file
2019-04-30 21:54:39,858 [salt.state       :1951][INFO    ][3447] Completed state [/etc/keepalived/keepalived.conf] at time 21:54:39.858004 duration_in_ms=55.279
2019-04-30 21:54:39,858 [salt.state       :1780][INFO    ][3447] Running state [/usr/local/bin/vrrp_script_check_pidof.sh] at time 21:54:39.858183
2019-04-30 21:54:39,858 [salt.state       :1813][INFO    ][3447] Executing state file.managed for [/usr/local/bin/vrrp_script_check_pidof.sh]
2019-04-30 21:54:39,941 [salt.fileclient  :1219][INFO    ][3447] Fetching file from saltenv 'base', ** done ** 'keepalived/files/vrrp_script_check_pidof.sh'
2019-04-30 21:54:39,946 [salt.state       :300 ][INFO    ][3447] File changed:
New file
2019-04-30 21:54:39,946 [salt.state       :1951][INFO    ][3447] Completed state [/usr/local/bin/vrrp_script_check_pidof.sh] at time 21:54:39.946301 duration_in_ms=88.118
2019-04-30 21:54:39,947 [salt.state       :1780][INFO    ][3447] Running state [keepalived] at time 21:54:39.947484
2019-04-30 21:54:39,947 [salt.state       :1813][INFO    ][3447] Executing state service.running for [keepalived]
2019-04-30 21:54:39,948 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3447] Executing command ['systemctl', 'status', 'keepalived.service', '-n', '0'] in directory '/root'
2019-04-30 21:54:39,959 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3447] Executing command ['systemctl', 'is-active', 'keepalived.service'] in directory '/root'
2019-04-30 21:54:39,968 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3447] Executing command ['systemctl', 'is-enabled', 'keepalived.service'] in directory '/root'
2019-04-30 21:54:39,977 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3447] Executing command ['systemd-run', '--scope', 'systemctl', 'start', 'keepalived.service'] in directory '/root'
2019-04-30 21:54:40,092 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3447] Executing command ['systemctl', 'is-active', 'keepalived.service'] in directory '/root'
2019-04-30 21:54:40,107 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3447] Executing command ['systemctl', 'is-enabled', 'keepalived.service'] in directory '/root'
2019-04-30 21:54:40,117 [salt.loaded.int.module.cmdmod:395 ][INFO    ][3447] Executing command ['systemctl', 'is-enabled', 'keepalived.service'] in directory '/root'
2019-04-30 21:54:40,130 [salt.state       :300 ][INFO    ][3447] {'keepalived': True}
2019-04-30 21:54:40,130 [salt.state       :1951][INFO    ][3447] Completed state [keepalived] at time 21:54:40.130588 duration_in_ms=183.101
2019-04-30 21:54:40,132 [salt.minion      :1711][INFO    ][3447] Returning information for job: 20190430215425366503
2019-04-30 21:55:05,906 [salt.minion      :1308][INFO    ][1820] User sudo_ubuntu Executing command pillar.get with jid 20190430215505899701
2019-04-30 21:55:05,925 [salt.minion      :1432][INFO    ][4828] Starting a new job with PID 4828
2019-04-30 21:55:05,930 [salt.minion      :1711][INFO    ][4828] Returning information for job: 20190430215505899701
2019-04-30 22:06:23,950 [salt.minion      :1308][INFO    ][1820] User sudo_ubuntu Executing command state.sls with jid 20190430220623943219
2019-04-30 22:06:23,965 [salt.minion      :1432][INFO    ][5124] Starting a new job with PID 5124
2019-04-30 22:06:27,637 [salt.state       :915 ][INFO    ][5124] Loading fresh modules for state activity
2019-04-30 22:06:27,674 [salt.fileclient  :1219][INFO    ][5124] Fetching file from saltenv 'base', ** done ** 'memcached/init.sls'
2019-04-30 22:06:28,040 [salt.fileclient  :1219][INFO    ][5124] Fetching file from saltenv 'base', ** done ** 'memcached/server.sls'
2019-04-30 22:06:28,059 [salt.fileclient  :1219][INFO    ][5124] Fetching file from saltenv 'base', ** done ** 'memcached/map.jinja'
2019-04-30 22:06:28,543 [salt.state       :1780][INFO    ][5124] Running state [memcached] at time 22:06:28.543451
2019-04-30 22:06:28,544 [salt.state       :1813][INFO    ][5124] Executing state pkg.installed for [memcached]
2019-04-30 22:06:28,544 [salt.loaded.int.module.cmdmod:395 ][INFO    ][5124] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2019-04-30 22:06:28,850 [salt.loaded.int.module.cmdmod:395 ][INFO    ][5124] Executing command ['apt-cache', '-q', 'policy', 'memcached'] in directory '/root'
2019-04-30 22:06:28,907 [salt.loaded.int.module.cmdmod:395 ][INFO    ][5124] Executing command ['apt-get', '-q', 'update'] in directory '/root'
2019-04-30 22:06:30,465 [salt.loaded.int.module.cmdmod:395 ][INFO    ][5124] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2019-04-30 22:06:30,483 [salt.loaded.int.module.cmdmod:395 ][INFO    ][5124] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'memcached'] in directory '/root'
2019-04-30 22:06:35,259 [salt.loaded.int.module.cmdmod:395 ][INFO    ][5124] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2019-04-30 22:06:35,283 [salt.state       :300 ][INFO    ][5124] Made the following changes:
'memcached' changed from 'absent' to '1.4.25-2ubuntu1.4'

2019-04-30 22:06:35,298 [salt.state       :915 ][INFO    ][5124] Loading fresh modules for state activity
2019-04-30 22:06:35,326 [salt.state       :1951][INFO    ][5124] Completed state [memcached] at time 22:06:35.326470 duration_in_ms=6783.018
2019-04-30 22:06:35,330 [salt.state       :1780][INFO    ][5124] Running state [python-memcache] at time 22:06:35.330518
2019-04-30 22:06:35,330 [salt.state       :1813][INFO    ][5124] Executing state pkg.installed for [python-memcache]
2019-04-30 22:06:35,774 [salt.loaded.int.module.cmdmod:395 ][INFO    ][5124] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2019-04-30 22:06:35,794 [salt.loaded.int.module.cmdmod:395 ][INFO    ][5124] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'python-memcache'] in directory '/root'
2019-04-30 22:06:37,924 [salt.loaded.int.module.cmdmod:395 ][INFO    ][5124] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2019-04-30 22:06:37,948 [salt.state       :300 ][INFO    ][5124] Made the following changes:
'python-memcache' changed from 'absent' to '1.57+fixed-1~u16.04+mcp1'

2019-04-30 22:06:37,962 [salt.state       :915 ][INFO    ][5124] Loading fresh modules for state activity
2019-04-30 22:06:37,988 [salt.state       :1951][INFO    ][5124] Completed state [python-memcache] at time 22:06:37.988499 duration_in_ms=2657.981
2019-04-30 22:06:37,992 [salt.state       :1780][INFO    ][5124] Running state [/etc/memcached.conf] at time 22:06:37.992702
2019-04-30 22:06:37,993 [salt.state       :1813][INFO    ][5124] Executing state file.managed for [/etc/memcached.conf]
2019-04-30 22:06:38,014 [salt.fileclient  :1219][INFO    ][5124] Fetching file from saltenv 'base', ** done ** 'memcached/files/memcached.conf'
2019-04-30 22:06:38,039 [salt.state       :300 ][INFO    ][5124] File changed:
--- 
+++ 
@@ -1,11 +1,10 @@
+
 # memcached default config file
 # 2003 - Jay Bonci <jaybonci@debian.org>
-# This configuration file is read by the start-memcached script provided as
-# part of the Debian GNU/Linux distribution.
+# This configuration file is read by the start-memcached script provided as part of the Debian GNU/Linux distribution. 
 
 # Run memcached as a daemon. This command is implied, and is not needed for the
-# daemon to run. See the README.Debian that comes with this package for more
-# information.
+# daemon to run. See the README.Debian that comes with this package for more information.
 -d
 
 # Log memcached's output to /var/log/memcached
@@ -18,13 +17,11 @@
 # -vv
 
 # Start with a cap of 64 megs of memory. It's reasonable, and the daemon default
-# Note that the daemon will grow to this size, but does not start out holding this much
-# memory
+# Note that the daemon will grow to this size, but does not start out holding this much memory
 -m 64
 
 # Default connection port is 11211
 -p 11211
-
 # Run the daemon as root. The start-memcached will default to running as root if no
 # -u command is present in this config file
 -u memcache
@@ -32,10 +29,12 @@
 # Specify which IP address to listen on. The default is to listen on all IP addresses
 # This parameter is one of the only security measures that memcached has, so make sure
 # it's listening on a firewalled interface.
--l 127.0.0.1
+-l 0.0.0.0
 
 # Limit the number of simultaneous incoming connections. The daemon default is 1024
 # -c 1024
+# Mirantis
+-c 8192
 
 # Lock down all paged memory. Consult with the README and homepage before you do this
 # -k
@@ -45,3 +44,9 @@
 
 # Maximize core file limit
 # -r
+
+# Number of threads to use to process incoming requests.
+-t 1
+
+# Set size of each slab page. Default value for this parameter is 1m, minimum is 1k, max is 128m.
+-I 1m

2019-04-30 22:06:38,040 [salt.state       :1951][INFO    ][5124] Completed state [/etc/memcached.conf] at time 22:06:38.040303 duration_in_ms=47.601
2019-04-30 22:06:38,356 [salt.state       :1780][INFO    ][5124] Running state [memcached] at time 22:06:38.356161
2019-04-30 22:06:38,356 [salt.state       :1813][INFO    ][5124] Executing state service.running for [memcached]
2019-04-30 22:06:38,357 [salt.loaded.int.module.cmdmod:395 ][INFO    ][5124] Executing command ['systemctl', 'status', 'memcached.service', '-n', '0'] in directory '/root'
2019-04-30 22:06:38,367 [salt.loaded.int.module.cmdmod:395 ][INFO    ][5124] Executing command ['systemctl', 'is-active', 'memcached.service'] in directory '/root'
2019-04-30 22:06:38,376 [salt.loaded.int.module.cmdmod:395 ][INFO    ][5124] Executing command ['systemctl', 'is-enabled', 'memcached.service'] in directory '/root'
2019-04-30 22:06:38,385 [salt.state       :300 ][INFO    ][5124] The service memcached is already running
2019-04-30 22:06:38,385 [salt.state       :1951][INFO    ][5124] Completed state [memcached] at time 22:06:38.385487 duration_in_ms=29.324
2019-04-30 22:06:38,385 [salt.state       :1780][INFO    ][5124] Running state [memcached] at time 22:06:38.385817
2019-04-30 22:06:38,386 [salt.state       :1813][INFO    ][5124] Executing state service.mod_watch for [memcached]
2019-04-30 22:06:38,386 [salt.loaded.int.module.cmdmod:395 ][INFO    ][5124] Executing command ['systemctl', 'is-active', 'memcached.service'] in directory '/root'
2019-04-30 22:06:38,395 [salt.loaded.int.module.cmdmod:395 ][INFO    ][5124] Executing command ['systemd-run', '--scope', 'systemctl', 'restart', 'memcached.service'] in directory '/root'
2019-04-30 22:06:38,413 [salt.state       :300 ][INFO    ][5124] {'memcached': True}
2019-04-30 22:06:38,414 [salt.state       :1951][INFO    ][5124] Completed state [memcached] at time 22:06:38.414481 duration_in_ms=28.664
2019-04-30 22:06:38,416 [salt.minion      :1711][INFO    ][5124] Returning information for job: 20190430220623943219
2019-04-30 22:45:16,223 [salt.minion      :1308][INFO    ][1820] User sudo_ubuntu Executing command state.sls with jid 20190430224516212732
2019-04-30 22:45:16,233 [salt.minion      :1432][INFO    ][7053] Starting a new job with PID 7053
2019-04-30 22:45:19,776 [salt.state       :915 ][INFO    ][7053] Loading fresh modules for state activity
2019-04-30 22:45:19,814 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'apache/init.sls'
2019-04-30 22:45:19,844 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'apache/server/init.sls'
2019-04-30 22:45:19,867 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'apache/server/service/init.sls'
2019-04-30 22:45:20,001 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'apache/server/service/modules.sls'
2019-04-30 22:45:20,057 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'apache/server/service/mpm.sls'
2019-04-30 22:45:20,108 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'apache/server/site.sls'
2019-04-30 22:45:20,203 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'apache/server/users.sls'
2019-04-30 22:45:20,250 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'apache/server/robots.sls'
2019-04-30 22:45:20,295 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'horizon/init.sls'
2019-04-30 22:45:20,317 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'horizon/server/init.sls'
2019-04-30 22:45:20,335 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'horizon/server/service.sls'
2019-04-30 22:45:20,403 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'horizon/server/plugin.sls'
2019-04-30 22:45:20,924 [salt.state       :1780][INFO    ][7053] Running state [apache2] at time 22:45:20.924450
2019-04-30 22:45:20,924 [salt.state       :1813][INFO    ][7053] Executing state pkg.installed for [apache2]
2019-04-30 22:45:20,925 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7053] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2019-04-30 22:45:21,212 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7053] Executing command ['apt-cache', '-q', 'policy', 'apache2'] in directory '/root'
2019-04-30 22:45:21,269 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7053] Executing command ['apt-get', '-q', 'update'] in directory '/root'
2019-04-30 22:45:22,936 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7053] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2019-04-30 22:45:22,952 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7053] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'apache2'] in directory '/root'
2019-04-30 22:45:31,347 [salt.minion      :1308][INFO    ][1820] User sudo_ubuntu Executing command saltutil.find_job with jid 20190430224531334876
2019-04-30 22:45:31,358 [salt.minion      :1432][INFO    ][8004] Starting a new job with PID 8004
2019-04-30 22:45:31,376 [salt.minion      :1711][INFO    ][8004] Returning information for job: 20190430224531334876
2019-04-30 22:45:35,325 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7053] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2019-04-30 22:45:35,351 [salt.state       :300 ][INFO    ][7053] Made the following changes:
'apache2-data' changed from 'absent' to '2.4.18-2ubuntu3.10'
'httpd-cgi' changed from 'absent' to '1'
'apache2-utils' changed from 'absent' to '2.4.18-2ubuntu3.10'
'httpd' changed from 'absent' to '1'
'ssl-cert' changed from 'absent' to '1.0.37'
'apache2' changed from 'absent' to '2.4.18-2ubuntu3.10'

2019-04-30 22:45:35,364 [salt.state       :915 ][INFO    ][7053] Loading fresh modules for state activity
2019-04-30 22:45:35,387 [salt.state       :1951][INFO    ][7053] Completed state [apache2] at time 22:45:35.387323 duration_in_ms=14462.874
2019-04-30 22:45:35,391 [salt.state       :1780][INFO    ][7053] Running state [openssl] at time 22:45:35.391116
2019-04-30 22:45:35,391 [salt.state       :1813][INFO    ][7053] Executing state pkg.installed for [openssl]
2019-04-30 22:45:35,822 [salt.state       :300 ][INFO    ][7053] All specified packages are already installed
2019-04-30 22:45:35,822 [salt.state       :1951][INFO    ][7053] Completed state [openssl] at time 22:45:35.822334 duration_in_ms=431.218
2019-04-30 22:45:35,823 [salt.state       :1780][INFO    ][7053] Running state [a2enmod ssl] at time 22:45:35.823300
2019-04-30 22:45:35,823 [salt.state       :1813][INFO    ][7053] Executing state cmd.run for [a2enmod ssl]
2019-04-30 22:45:35,823 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7053] Executing command 'a2enmod ssl' in directory '/root'
2019-04-30 22:45:35,863 [salt.state       :300 ][INFO    ][7053] {'pid': 8520, 'retcode': 0, 'stderr': '', 'stdout': 'Considering dependency setenvif for ssl:\nModule setenvif already enabled\nConsidering dependency mime for ssl:\nModule mime already enabled\nConsidering dependency socache_shmcb for ssl:\nEnabling module socache_shmcb.\nEnabling module ssl.\nSee /usr/share/doc/apache2/README.Debian.gz on how to configure SSL and create self-signed certificates.\nTo activate the new configuration, you need to run:\n  service apache2 restart'}
2019-04-30 22:45:35,863 [salt.state       :1951][INFO    ][7053] Completed state [a2enmod ssl] at time 22:45:35.863836 duration_in_ms=40.535
2019-04-30 22:45:35,864 [salt.state       :1780][INFO    ][7053] Running state [a2enmod rewrite] at time 22:45:35.864416
2019-04-30 22:45:35,864 [salt.state       :1813][INFO    ][7053] Executing state cmd.run for [a2enmod rewrite]
2019-04-30 22:45:35,865 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7053] Executing command 'a2enmod rewrite' in directory '/root'
2019-04-30 22:45:35,900 [salt.state       :300 ][INFO    ][7053] {'pid': 8533, 'retcode': 0, 'stderr': '', 'stdout': 'Enabling module rewrite.\nTo activate the new configuration, you need to run:\n  service apache2 restart'}
2019-04-30 22:45:35,900 [salt.state       :1951][INFO    ][7053] Completed state [a2enmod rewrite] at time 22:45:35.900359 duration_in_ms=35.941
2019-04-30 22:45:35,900 [salt.state       :1780][INFO    ][7053] Running state [libapache2-mod-wsgi] at time 22:45:35.900926
2019-04-30 22:45:35,901 [salt.state       :1813][INFO    ][7053] Executing state pkg.installed for [libapache2-mod-wsgi]
2019-04-30 22:45:35,964 [salt.state       :300 ][INFO    ][7053] All specified packages are already installed
2019-04-30 22:45:35,964 [salt.state       :1951][INFO    ][7053] Completed state [libapache2-mod-wsgi] at time 22:45:35.964505 duration_in_ms=63.578
2019-04-30 22:45:35,964 [salt.state       :1780][INFO    ][7053] Running state [a2enmod wsgi] at time 22:45:35.964865
2019-04-30 22:45:35,965 [salt.state       :1813][INFO    ][7053] Executing state cmd.run for [a2enmod wsgi]
2019-04-30 22:45:35,965 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7053] Executing command 'a2enmod wsgi' in directory '/root'
2019-04-30 22:45:36,000 [salt.state       :300 ][INFO    ][7053] {'pid': 8542, 'retcode': 0, 'stderr': '', 'stdout': 'Enabling module wsgi.\nTo activate the new configuration, you need to run:\n  service apache2 restart'}
2019-04-30 22:45:36,001 [salt.state       :1951][INFO    ][7053] Completed state [a2enmod wsgi] at time 22:45:36.001274 duration_in_ms=36.408
2019-04-30 22:45:36,001 [salt.state       :1780][INFO    ][7053] Running state [a2enmod status -q] at time 22:45:36.001879
2019-04-30 22:45:36,002 [salt.state       :1813][INFO    ][7053] Executing state cmd.run for [a2enmod status -q]
2019-04-30 22:45:36,002 [salt.state       :300 ][INFO    ][7053] /etc/apache2/mods-enabled/status.load exists
2019-04-30 22:45:36,002 [salt.state       :1951][INFO    ][7053] Completed state [a2enmod status -q] at time 22:45:36.002721 duration_in_ms=0.842
2019-04-30 22:45:36,005 [salt.state       :1780][INFO    ][7053] Running state [/etc/apache2/mods-enabled/mpm_prefork.load] at time 22:45:36.005601
2019-04-30 22:45:36,005 [salt.state       :1813][INFO    ][7053] Executing state file.absent for [/etc/apache2/mods-enabled/mpm_prefork.load]
2019-04-30 22:45:36,006 [salt.state       :300 ][INFO    ][7053] File /etc/apache2/mods-enabled/mpm_prefork.load is not present
2019-04-30 22:45:36,006 [salt.state       :1951][INFO    ][7053] Completed state [/etc/apache2/mods-enabled/mpm_prefork.load] at time 22:45:36.006331 duration_in_ms=0.731
2019-04-30 22:45:36,006 [salt.state       :1780][INFO    ][7053] Running state [/etc/apache2/mods-enabled/mpm_prefork.conf] at time 22:45:36.006520
2019-04-30 22:45:36,006 [salt.state       :1813][INFO    ][7053] Executing state file.absent for [/etc/apache2/mods-enabled/mpm_prefork.conf]
2019-04-30 22:45:36,006 [salt.state       :300 ][INFO    ][7053] File /etc/apache2/mods-enabled/mpm_prefork.conf is not present
2019-04-30 22:45:36,007 [salt.state       :1951][INFO    ][7053] Completed state [/etc/apache2/mods-enabled/mpm_prefork.conf] at time 22:45:36.007101 duration_in_ms=0.582
2019-04-30 22:45:36,007 [salt.state       :1780][INFO    ][7053] Running state [/etc/apache2/mods-enabled/mpm_worker.load] at time 22:45:36.007270
2019-04-30 22:45:36,007 [salt.state       :1813][INFO    ][7053] Executing state file.absent for [/etc/apache2/mods-enabled/mpm_worker.load]
2019-04-30 22:45:36,007 [salt.state       :300 ][INFO    ][7053] File /etc/apache2/mods-enabled/mpm_worker.load is not present
2019-04-30 22:45:36,007 [salt.state       :1951][INFO    ][7053] Completed state [/etc/apache2/mods-enabled/mpm_worker.load] at time 22:45:36.007821 duration_in_ms=0.55
2019-04-30 22:45:36,008 [salt.state       :1780][INFO    ][7053] Running state [/etc/apache2/mods-enabled/mpm_worker.conf] at time 22:45:36.007995
2019-04-30 22:45:36,008 [salt.state       :1813][INFO    ][7053] Executing state file.absent for [/etc/apache2/mods-enabled/mpm_worker.conf]
2019-04-30 22:45:36,008 [salt.state       :300 ][INFO    ][7053] File /etc/apache2/mods-enabled/mpm_worker.conf is not present
2019-04-30 22:45:36,008 [salt.state       :1951][INFO    ][7053] Completed state [/etc/apache2/mods-enabled/mpm_worker.conf] at time 22:45:36.008538 duration_in_ms=0.542
2019-04-30 22:45:36,010 [salt.state       :1780][INFO    ][7053] Running state [/etc/apache2/mods-available/mpm_event.conf] at time 22:45:36.010229
2019-04-30 22:45:36,010 [salt.state       :1813][INFO    ][7053] Executing state file.managed for [/etc/apache2/mods-available/mpm_event.conf]
2019-04-30 22:45:36,032 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'apache/files/mpm/mpm_event.conf'
2019-04-30 22:45:36,067 [salt.state       :300 ][INFO    ][7053] File changed:
--- 
+++ 
@@ -5,14 +5,15 @@
 # ThreadsPerChild: constant number of worker threads in each server process
 # MaxRequestWorkers: maximum number of worker threads
 # MaxConnectionsPerChild: maximum number of requests a server process serves
+
 <IfModule mpm_event_module>
-	StartServers			 2
-	MinSpareThreads		 25
-	MaxSpareThreads		 75
-	ThreadLimit			 64
-	ThreadsPerChild		 25
-	MaxRequestWorkers	  150
-	MaxConnectionsPerChild   0
+    StartServers            5
+    MinSpareThreads         25
+    MaxSpareThreads         75
+    ThreadLimit             64
+    ThreadsPerChild         25
+    MaxRequestWorkers       150
+    MaxConnectionsPerChild  0
 </IfModule>
 
-# vim: syntax=apache ts=4 sw=4 sts=4 sr noet
+# vim: syntax=apache ts=4 sw=4 sts=4 sr et

2019-04-30 22:45:36,067 [salt.state       :1951][INFO    ][7053] Completed state [/etc/apache2/mods-available/mpm_event.conf] at time 22:45:36.067851 duration_in_ms=57.622
2019-04-30 22:45:36,069 [salt.state       :1780][INFO    ][7053] Running state [a2enmod mpm_event] at time 22:45:36.069096
2019-04-30 22:45:36,069 [salt.state       :1813][INFO    ][7053] Executing state cmd.run for [a2enmod mpm_event]
2019-04-30 22:45:36,069 [salt.state       :300 ][INFO    ][7053] /etc/apache2/mods-enabled/mpm_event.load exists
2019-04-30 22:45:36,069 [salt.state       :1951][INFO    ][7053] Completed state [a2enmod mpm_event] at time 22:45:36.069686 duration_in_ms=0.59
2019-04-30 22:45:36,070 [salt.state       :1780][INFO    ][7053] Running state [apache_server_service_task] at time 22:45:36.070540
2019-04-30 22:45:36,070 [salt.state       :1813][INFO    ][7053] Executing state test.show_notification for [apache_server_service_task]
2019-04-30 22:45:36,070 [salt.state       :300 ][INFO    ][7053] Running apache.server.service
2019-04-30 22:45:36,071 [salt.state       :1951][INFO    ][7053] Completed state [apache_server_service_task] at time 22:45:36.071020 duration_in_ms=0.48
2019-04-30 22:45:36,071 [salt.state       :1780][INFO    ][7053] Running state [/etc/apache2/ports.conf] at time 22:45:36.071338
2019-04-30 22:45:36,071 [salt.state       :1813][INFO    ][7053] Executing state file.managed for [/etc/apache2/ports.conf]
2019-04-30 22:45:36,087 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'apache/files/ports.conf'
2019-04-30 22:45:36,121 [salt.state       :300 ][INFO    ][7053] File changed:
--- 
+++ 
@@ -2,14 +2,4 @@
 # have to change the VirtualHost statement in
 # /etc/apache2/sites-enabled/000-default.conf
 
-Listen 80
-
-<IfModule ssl_module>
-	Listen 443
-</IfModule>
-
-<IfModule mod_gnutls.c>
-	Listen 443
-</IfModule>
-
 # vim: syntax=apache ts=4 sw=4 sts=4 sr noet

2019-04-30 22:45:36,121 [salt.state       :1951][INFO    ][7053] Completed state [/etc/apache2/ports.conf] at time 22:45:36.121299 duration_in_ms=49.961
2019-04-30 22:45:36,121 [salt.state       :1780][INFO    ][7053] Running state [/etc/apache2/conf-available/security.conf] at time 22:45:36.121628
2019-04-30 22:45:36,121 [salt.state       :1813][INFO    ][7053] Executing state file.managed for [/etc/apache2/conf-available/security.conf]
2019-04-30 22:45:36,136 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'apache/files/security.conf'
2019-04-30 22:45:36,168 [salt.state       :300 ][INFO    ][7053] File changed:
--- 
+++ 
@@ -1,73 +1,14 @@
-#
-# Disable access to the entire file system except for the directories that
-# are explicitly allowed later.
-#
-# This currently breaks the configurations that come with some web application
-# Debian packages.
-#
-#<Directory />
-#   AllowOverride None
-#   Require all denied
-#</Directory>
+ServerSignature Off
+TraceEnable Off
+ServerTokens Prod
+<DirectoryMatch "/\.svn">
+    Require all denied
+</DirectoryMatch>
 
+<DirectoryMatch "/\.git">
+    Require all denied
+</DirectoryMatch>
 
-# Changing the following options will not really affect the security of the
-# server, but might make attacks slightly more difficult in some cases.
-
-#
-# ServerTokens
-# This directive configures what you return as the Server HTTP response
-# Header. The default is 'Full' which sends information about the OS-Type
-# and compiled in modules.
-# Set to one of:  Full | OS | Minimal | Minor | Major | Prod
-# where Full conveys the most information, and Prod the least.
-#ServerTokens Minimal
-ServerTokens OS
-#ServerTokens Full
-
-#
-# Optionally add a line containing the server version and virtual host
-# name to server-generated pages (internal error documents, FTP directory
-# listings, mod_status and mod_info output etc., but not CGI generated
-# documents or custom error documents).
-# Set to "EMail" to also include a mailto: link to the ServerAdmin.
-# Set to one of:  On | Off | EMail
-#ServerSignature Off
-ServerSignature On
-
-#
-# Allow TRACE method
-#
-# Set to "extended" to also reflect the request body (only for testing and
-# diagnostic purposes).
-#
-# Set to one of:  On | Off | extended
-TraceEnable Off
-#TraceEnable On
-
-#
-# Forbid access to version control directories
-#
-# If you use version control systems in your document root, you should
-# probably deny access to their directories. For example, for subversion:
-#
-#<DirectoryMatch "/\.svn">
-#   Require all denied
-#</DirectoryMatch>
-
-#
-# Setting this header will prevent MSIE from interpreting files as something
-# else than declared by the content type in the HTTP headers.
-# Requires mod_headers to be enabled.
-#
-#Header set X-Content-Type-Options: "nosniff"
-
-#
-# Setting this header will prevent other sites from embedding pages from this
-# site as frames. This defends against clickjacking attacks.
-# Requires mod_headers to be enabled.
-#
-#Header set X-Frame-Options: "sameorigin"
-
-
-# vim: syntax=apache ts=4 sw=4 sts=4 sr noet
+<DirectoryMatch "/\.hg">
+    Require all denied
+</DirectoryMatch>

2019-04-30 22:45:36,169 [salt.state       :1951][INFO    ][7053] Completed state [/etc/apache2/conf-available/security.conf] at time 22:45:36.169028 duration_in_ms=47.4
2019-04-30 22:45:36,174 [salt.state       :1780][INFO    ][7053] Running state [/etc/apache2/sites-enabled/000-default.conf] at time 22:45:36.174712
2019-04-30 22:45:36,174 [salt.state       :1813][INFO    ][7053] Executing state file.absent for [/etc/apache2/sites-enabled/000-default.conf]
2019-04-30 22:45:36,175 [salt.state       :300 ][INFO    ][7053] {'removed': '/etc/apache2/sites-enabled/000-default.conf'}
2019-04-30 22:45:36,175 [salt.state       :1951][INFO    ][7053] Completed state [/etc/apache2/sites-enabled/000-default.conf] at time 22:45:36.175297 duration_in_ms=0.585
2019-04-30 22:45:36,175 [salt.state       :1780][INFO    ][7053] Running state [/etc/apache2/sites-available/wsgi_openstack_web.conf] at time 22:45:36.175631
2019-04-30 22:45:36,175 [salt.state       :1813][INFO    ][7053] Executing state file.managed for [/etc/apache2/sites-available/wsgi_openstack_web.conf]
2019-04-30 22:45:36,189 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'apache/files/wsgi.conf'
2019-04-30 22:45:36,208 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'apache/files/_name.conf'
2019-04-30 22:45:36,246 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'apache/files/_wsgi.conf'
2019-04-30 22:45:36,269 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'apache/files/_ssl.conf'
2019-04-30 22:45:36,323 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'apache/files/_core.conf'
2019-04-30 22:45:36,338 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'apache/files/_log.conf'
2019-04-30 22:45:36,355 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'apache/files/_limits.conf'
2019-04-30 22:45:36,361 [salt.state       :300 ][INFO    ][7053] File changed:
New file
2019-04-30 22:45:36,361 [salt.state       :1951][INFO    ][7053] Completed state [/etc/apache2/sites-available/wsgi_openstack_web.conf] at time 22:45:36.361791 duration_in_ms=186.158
2019-04-30 22:45:36,362 [salt.state       :1780][INFO    ][7053] Running state [openstack-dashboard] at time 22:45:36.362718
2019-04-30 22:45:36,362 [salt.state       :1813][INFO    ][7053] Executing state pkg.installed for [openstack-dashboard]
2019-04-30 22:45:36,377 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7053] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2019-04-30 22:45:36,397 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7053] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'openstack-dashboard'] in directory '/root'
2019-04-30 22:46:01,369 [salt.minion      :1308][INFO    ][1820] User sudo_ubuntu Executing command saltutil.find_job with jid 20190430224601357520
2019-04-30 22:46:01,391 [salt.minion      :1432][INFO    ][8800] Starting a new job with PID 8800
2019-04-30 22:46:01,422 [salt.minion      :1711][INFO    ][8800] Returning information for job: 20190430224601357520
2019-04-30 22:46:31,396 [salt.minion      :1308][INFO    ][1820] User sudo_ubuntu Executing command saltutil.find_job with jid 20190430224631383444
2019-04-30 22:46:31,408 [salt.minion      :1432][INFO    ][9501] Starting a new job with PID 9501
2019-04-30 22:46:31,438 [salt.minion      :1711][INFO    ][9501] Returning information for job: 20190430224631383444
2019-04-30 22:47:01,433 [salt.minion      :1308][INFO    ][1820] User sudo_ubuntu Executing command saltutil.find_job with jid 20190430224701421910
2019-04-30 22:47:01,457 [salt.minion      :1432][INFO    ][10671] Starting a new job with PID 10671
2019-04-30 22:47:01,594 [salt.minion      :1711][INFO    ][10671] Returning information for job: 20190430224701421910
2019-04-30 22:47:31,592 [salt.minion      :1308][INFO    ][1820] User sudo_ubuntu Executing command saltutil.find_job with jid 20190430224731579795
2019-04-30 22:47:31,603 [salt.minion      :1432][INFO    ][11403] Starting a new job with PID 11403
2019-04-30 22:47:31,618 [salt.minion      :1711][INFO    ][11403] Returning information for job: 20190430224731579795
2019-04-30 22:47:35,476 [salt.utils.schedule:1377][INFO    ][1820] Running scheduled job: __mine_interval
2019-04-30 22:48:01,619 [salt.minion      :1308][INFO    ][1820] User sudo_ubuntu Executing command saltutil.find_job with jid 20190430224801607650
2019-04-30 22:48:01,641 [salt.minion      :1432][INFO    ][11723] Starting a new job with PID 11723
2019-04-30 22:48:01,668 [salt.minion      :1711][INFO    ][11723] Returning information for job: 20190430224801607650
2019-04-30 22:48:10,887 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7053] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2019-04-30 22:48:10,920 [salt.state       :300 ][INFO    ][7053] Made the following changes:
'python-os-service-types' changed from 'absent' to '1.3.0-1~u16.04+mcp'
'twitter-bootstrap' changed from 'absent' to '1'
'python-oslo.concurrency' changed from 'absent' to '3.27.0-2~u16.04+mcp7'
'python-xstatic-angular-fileupload' changed from 'absent' to '12.0.4.0+dfsg1-1.1~u16.04+mcp2'
'python-sqlparse' changed from 'absent' to '0.2.2-1~u16.04+mcp1'
'python-pint' changed from 'absent' to '0.6-1ubuntu1'
'python-monotonic' changed from 'absent' to '0.6-2'
'python2.7-pymongo' changed from 'absent' to '1'
'python-openstacksdk' changed from 'absent' to '0.17.2-2~u16.04+mcp13'
'python-deprecation' changed from 'absent' to '1.0.1-1~u16.04+mcp2'
'python2.7-bson' changed from 'absent' to '1'
'libtiff5' changed from 'absent' to '4.0.6-1ubuntu0.6'
'python-secretstorage' changed from 'absent' to '2.1.3-1'
'libjs-jsencrypt' changed from 'absent' to '2.3.0+dfsg2-1~u16.04+mcp2'
'python-glanceclient' changed from 'absent' to '1:2.13.1-3~u16.04+mcp4'
'python-xstatic-bootstrap-datepicker' changed from 'absent' to '1.3.1.1-1~u16.04+mcp1'
'python-xstatic-angular-schema-form' changed from 'absent' to '0.8.13.0-1.1~u16.04+mcp2'
'libjs-term.js' changed from 'absent' to '0.0.7-1~u16.04+mcp2'
'python-xstatic-jasmine' changed from 'absent' to '2.4.1.1+fixed1-1~u16.04+mcp1'
'python-semantic-version' changed from 'absent' to '2.3.1-1'
'python-blinker' changed from 'absent' to '1.3.dfsg2-1build1'
'python-roman' changed from 'absent' to '2.0.0-2'
'python-prettytable' changed from 'absent' to '0.7.2-3'
'python-bs4' changed from 'absent' to '4.6.0-1~u16.04+mcp1'
'python2.7-pymongo-ext' changed from 'absent' to '1'
'python-unittest2' changed from 'absent' to '1.1.0-6.1'
'python2.7-django-appconf' changed from 'absent' to '1'
'python-stevedore' changed from 'absent' to '1:1.29.0-1~u16.04+mcp4'
'docutils-doc' changed from 'absent' to '0.12+dfsg-1'
'python-dbus' changed from 'absent' to '1.2.0-3'
'python-gridfs' changed from 'absent' to '3.2-1build1'
'python-fixtures' changed from 'absent' to '3.0.0-1.1~u16.04+mcp2'
'python-xstatic-jquery.tablesorter' changed from 'absent' to '2.14.5.1-2.0~u16.04+mcp1'
'libjs-twitter-bootstrap' changed from 'absent' to '2.0.2+dfsg-9'
'python-testtools' changed from 'absent' to '2.3.0-1.0~u16.04+mcp1'
'libjs-jquery-cookie' changed from 'absent' to '10-2ubuntu2'
'libjs-angularjs-smart-table' changed from 'absent' to '1.4.13-1~u16.04+mcp2'
'python-xstatic-hogan' changed from 'absent' to '2.0.0.2-1'
'python-dogpile.cache' changed from 'absent' to '0.6.2-1.1~u16.04+mcp2'
'python-compressor' changed from 'absent' to '1'
'libjs-spin.js' changed from 'absent' to '1.2.8+dfsg2-1'
'fonts-roboto-fontface' changed from 'absent' to '0.5.0-2~u16.04+mcp2'
'python-pil' changed from 'absent' to '3.1.2-0ubuntu1.1'
'docutils-common' changed from 'absent' to '0.12+dfsg-1'
'python2.7-lxml' changed from 'absent' to '1'
'python-fasteners' changed from 'absent' to '0.12.0-2ubuntu1'
'python-babel' changed from 'absent' to '2.6.0+dfsg.1-1~u16.04+mcp'
'python-osc-lib' changed from 'absent' to '1.11.1-2~u16.04+mcp3'
'liblcms2-2' changed from 'absent' to '2.6-3ubuntu2.1'
'python2.7-simplejson' changed from 'absent' to '1'
'python-extras' changed from 'absent' to '1.0.0-2.0~u16.04+mcp1'
'python-xstatic-bootstrap-scss' changed from 'absent' to '3.3.7.1-2~u16.04+mcp3'
'python-xstatic-term.js' changed from 'absent' to '0.0.7.0-2~u16.04+mcp2'
'python-bson-ext' changed from 'absent' to '3.2-1build1'
'python-jwt' changed from 'absent' to '1.3.0-1ubuntu0.1'
'python-posix-ipc' changed from 'absent' to '0.9.8-2build2'
'python-xstatic-angular-bootstrap' changed from 'absent' to '2.2.0.0-1.1~u16.04+mcp2'
'python2.7-testtools' changed from 'absent' to '1'
'docutils' changed from 'absent' to '1'
'python-django-pyscss' changed from 'absent' to '2.0.2-4'
'python2.7-dbus' changed from 'absent' to '1'
'fonts-materialdesignicons-webfont' changed from 'absent' to '1.4.57-1.1~u16.04+mcp2'
'python-xstatic-angular' changed from 'absent' to '1.5.8.0-1.1~u16.04+mcp2'
'python-pillow' changed from 'absent' to '1'
'python2.7-cinderclient' changed from 'absent' to '1'
'python2.7-netifaces' changed from 'absent' to '1'
'python-xstatic-mdi' changed from 'absent' to '1.4.57.0-1.1~u16.04+mcp2'
'python-xstatic-jquery' changed from 'absent' to '1.10.2.1-2~u16.04+mcp2'
'python-oslo.context' changed from 'absent' to '1:2.21.0-1~u16.04+mcp4'
'python-neutronclient' changed from 'absent' to '1:6.9.1-1~u16.04+mcp6'
'python-pymongo-ext' changed from 'absent' to '3.2-1build1'
'python2.7-pyinotify' changed from 'absent' to '1'
'python-xstatic-jquery-ui' changed from 'absent' to '1.12.0.1+debian+dfsg3-2~u16.04+mcp2'
'python-pyparsing' changed from 'absent' to '2.2.0+dfsg1-2~u16.04+mcp1'
'python-babel-localedata' changed from 'absent' to '2.6.0+dfsg.1-1~u16.04+mcp'
'python-mimeparse' changed from 'absent' to '0.1.4-1build1'
'python-appconf' changed from 'absent' to '1'
'python-cmd2' changed from 'absent' to '0.6.8-1'
'libjs-magic-search' changed from 'absent' to '0.2.5-1'
'python-oslo-utils' changed from 'absent' to '1'
'python-xstatic-tv4' changed from 'absent' to '1.2.7.0-1.1~u16.04+mcp2'
'python-oslo-log' changed from 'absent' to '1'
'python-keystoneclient' changed from 'absent' to '1:3.17.0-1~u16.04+mcp6'
'python-xstatic-font-awesome' changed from 'absent' to '4.7.0.0-3~u16.04+mcp2'
'python-rjsmin' changed from 'absent' to '1.0.12+dfsg1-2ubuntu1'
'python-pygments' changed from 'absent' to '2.2.0+dfsg-1~u16.04+mcp2'
'python-pathlib' changed from 'absent' to '1.0.1-2'
'python-iso8601' changed from 'absent' to '0.1.11-1'
'python-xstatic-jsencrypt' changed from 'absent' to '2.3.1.1-2~u16.04+mcp2'
'python-jsonpatch' changed from 'absent' to '1.21-1~u16.04+mcp1'
'python-xstatic-d3' changed from 'absent' to '3.5.17.0-2~u16.04+mcp2'
'libwebpmux1' changed from 'absent' to '0.4.4-1'
'python-xstatic-roboto-fontface' changed from 'absent' to '0.5.0.0-2~u16.04+mcp2'
'python-oslo.policy' changed from 'absent' to '1.38.1-1~u16.04+mcp'
'python-xstatic' changed from 'absent' to '1.0.0-4'
'libjs-jquery-tablesorter' changed from 'absent' to '10-2ubuntu2'
'python-lxml' changed from 'absent' to '3.5.0-1ubuntu0.1'
'python-oslo.config' changed from 'absent' to '1:6.4.0-1~u16.04+mcp'
'python-futurist' changed from 'absent' to '1.6.0-1.0~u16.04+mcp7'
'libpaper1' changed from 'absent' to '1.1.24+nmu4ubuntu1'
'python-webob' changed from 'absent' to '1:1.8.2-1~u16.04+mcp'
'python2.7-gi' changed from 'absent' to '1'
'python-linecache2' changed from 'absent' to '1.0.0-2'
'python-xstatic-objectpath' changed from 'absent' to '1.2.1.0-2.1~u16.04+mcp2'
'python-oauthlib' changed from 'absent' to '1.0.3-1'
'python2.7-django-compressor' changed from 'absent' to '1'
'python-gi' changed from 'absent' to '3.20.0-0ubuntu1'
'python-xstatic-angular-lrdragndrop' changed from 'absent' to '1.0.2.2-1'
'python-contextlib2' changed from 'absent' to '0.5.1-1'
'python-xstatic-bootswatch' changed from 'absent' to '3.3.7.0-2~u16.04+mcp2'
'python-xstatic-jquery-migrate' changed from 'absent' to '1.2.1.1+dfsg1-1'
'python-xstatic-jquery.quicksearch' changed from 'absent' to '2.0.4.1-1'
'python-novaclient' changed from 'absent' to '2:11.0.0-2~u16.04+mcp20'
'python-oslo.utils' changed from 'absent' to '3.36.4-1~u16.04+mcp'
'libjs-bootswatch' changed from 'absent' to '3.3.7+dfsg2-1~u16.04+mcp2'
'python-django' changed from 'absent' to '1:1.11.7-1~u16.04+mcp2'
'libjs-twitter-bootstrap-datepicker' changed from 'absent' to '1.3.1+dfsg1-1'
'python-keyrings.alt' changed from 'absent' to '1.1.1-1'
'python2.7-iso8601' changed from 'absent' to '1'
'python-bson' changed from 'absent' to '3.2-1build1'
'python-simplejson' changed from 'absent' to '3.8.1-1ubuntu2'
'fonts-font-awesome' changed from 'absent' to '4.7.0~dfsg-3~u16.04+mcp2'
'python-docutils' changed from 'absent' to '0.12+dfsg-1'
'python-xstatic-spin' changed from 'absent' to '1.2.8.0+dfsg1-1'
'python2.7-cmd2' changed from 'absent' to '1'
'libjs-jquery-ui' changed from 'absent' to '1.12.1+dfsg-5~u16.04+mcp2'
'python-tz' changed from 'absent' to '2014.10~dfsg1-0ubuntu2'
'libpaper-utils' changed from 'absent' to '1.1.24+nmu4ubuntu1'
'python-cliff' changed from 'absent' to '2.11.1-1~u16.04+mcp6'
'python-oslo.i18n' changed from 'absent' to '3.21.0-1~u16.04+mcp6'
'python-munch' changed from 'absent' to '2.2.0-1.0~u16.04+mcp1'
'python-xstatic-magic-search' changed from 'absent' to '0.2.5.1-1'
'python-appdirs' changed from 'absent' to '1.4.0-2'
'python2.7-pathlib' changed from 'absent' to '1'
'libpaperg' changed from 'absent' to '1'
'libjs-d3' changed from 'absent' to '3.5.17-2~u16.04+mcp2'
'python-django-appconf' changed from 'absent' to '1.0.1-4'
'libjs-jquery.quicksearch' changed from 'absent' to '2.0.4-1'
'python-xstatic-smart-table' changed from 'absent' to '1.4.13.2-2~u16.04+mcp1'
'python-oslo.serialization' changed from 'absent' to '2.27.0-1~u16.04+mcp5'
'python-unicodecsv' changed from 'absent' to '0.14.1-1'
'python-wrapt' changed from 'absent' to '1.8.0-5build2'
'python-rfc3986' changed from 'absent' to '0.3.1-2.1~u16.04+mcp2'
'python-django-horizon' changed from 'absent' to '3:14.0.2-1~u16.04+mcp50'
'python2.7-pyparsing' changed from 'absent' to '1'
'python-oslo.log' changed from 'absent' to '3.39.2-1~u16.04+mcp2'
'python-pyscss' changed from 'absent' to '1.3.4-5'
'python-pyinotify' changed from 'absent' to '0.9.6-1.1~u16.04+mcp2'
'libjpeg-turbo8' changed from 'absent' to '1.4.2-0ubuntu3.1'
'libjs-angularjs' changed from 'absent' to '1.5.10-1.1~u16.04+mcp2'
'libjpeg8' changed from 'absent' to '8c-2ubuntu8'
'python-os-client-config' changed from 'absent' to '1.29.0-1.0~u16.04+mcp7'
'libjs-angular-file-upload' changed from 'absent' to '12.0.4+dfsg1-2.1~u16.04+mcp2'
'libwebp5' changed from 'absent' to '0.4.4-1'
'python-django-compressor' changed from 'absent' to '2.1-1~u16.04+mcp2'
'python-netifaces' changed from 'absent' to '0.10.4-0.1build2'
'python-decorator' changed from 'absent' to '4.3.0-1~u16.04+mcp'
'python-osprofiler' changed from 'absent' to '2.3.0-1~u16.04+mcp'
'python-warlock' changed from 'absent' to '1.2.0-2.0~u16.04+mcp1'
'python-django-common' changed from 'absent' to '1:1.11.7-1~u16.04+mcp2'
'python-debtcollector' changed from 'absent' to '1.20.0-2~u16.04+mcp'
'openstack-dashboard' changed from 'absent' to '3:14.0.2-1~u16.04+mcp50'
'python-json-pointer' changed from 'absent' to '1.9-3'
'libjs-lrdragndrop' changed from 'absent' to '1.0.2-2'
'python-html5lib' changed from 'absent' to '0.999-4'
'python-swiftclient' changed from 'absent' to '1:3.6.0-2~u16.04+mcp6'
'python2.7-pil' changed from 'absent' to '1'
'python2.7-gridfs' changed from 'absent' to '1'
'python-django-babel' changed from 'absent' to '0.6.2-1~u16.04+mcp1'
'python-rcssmin' changed from 'absent' to '1.0.6-1ubuntu1'
'python-keyring' changed from 'absent' to '8.5.1-1.1~u16.04+mcp2'
'python-csscompressor' changed from 'absent' to '0.9.4-2'
'python-traceback2' changed from 'absent' to '1.4.0-3'
'python-jmespath' changed from 'absent' to '0.9.0-2'
'python-keystoneauth1' changed from 'absent' to '3.10.0-1~u16.04+mcp10'
'libjs-angular-gettext' changed from 'absent' to '2.3.8-2~u16.04+mcp2'
'python-pymongo' changed from 'absent' to '3.2-1build1'
'libjs-jquery-metadata' changed from 'absent' to '10-2ubuntu2'
'libjs-rickshaw' changed from 'absent' to '1.5.1.dfsg-1'
'python-xstatic-rickshaw' changed from 'absent' to '1.5.0.2-2'
'python-cinderclient' changed from 'absent' to '1:4.0.1-1~u16.04+mcp9'
'python-requestsexceptions' changed from 'absent' to '1.3.0-3~u16.04+mcp2'
'python-oslo-context' changed from 'absent' to '1'
'python2.7-bson-ext' changed from 'absent' to '1'
'python-xstatic-angular-gettext' changed from 'absent' to '2.3.8.0-2~u16.04+mcp2'
'libjbig0' changed from 'absent' to '2.1-3.1'

2019-04-30 22:48:10,939 [salt.state       :915 ][INFO    ][7053] Loading fresh modules for state activity
2019-04-30 22:48:10,963 [salt.state       :1951][INFO    ][7053] Completed state [openstack-dashboard] at time 22:48:10.963936 duration_in_ms=154601.217
2019-04-30 22:48:10,968 [salt.state       :1780][INFO    ][7053] Running state [python-lesscpy] at time 22:48:10.968133
2019-04-30 22:48:10,968 [salt.state       :1813][INFO    ][7053] Executing state pkg.installed for [python-lesscpy]
2019-04-30 22:48:12,146 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7053] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2019-04-30 22:48:12,164 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7053] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'python-lesscpy'] in directory '/root'
2019-04-30 22:48:15,303 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7053] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2019-04-30 22:48:15,334 [salt.state       :300 ][INFO    ][7053] Made the following changes:
'python-lesscpy' changed from 'absent' to '0.10-1'

2019-04-30 22:48:15,357 [salt.state       :915 ][INFO    ][7053] Loading fresh modules for state activity
2019-04-30 22:48:15,481 [salt.state       :1951][INFO    ][7053] Completed state [python-lesscpy] at time 22:48:15.481918 duration_in_ms=4513.784
2019-04-30 22:48:15,486 [salt.state       :1780][INFO    ][7053] Running state [gettext-base] at time 22:48:15.486013
2019-04-30 22:48:15,486 [salt.state       :1813][INFO    ][7053] Executing state pkg.installed for [gettext-base]
2019-04-30 22:48:15,961 [salt.state       :300 ][INFO    ][7053] All specified packages are already installed
2019-04-30 22:48:15,961 [salt.state       :1951][INFO    ][7053] Completed state [gettext-base] at time 22:48:15.961748 duration_in_ms=475.734
2019-04-30 22:48:15,962 [salt.state       :1780][INFO    ][7053] Running state [python-pylibmc] at time 22:48:15.961994
2019-04-30 22:48:15,962 [salt.state       :1813][INFO    ][7053] Executing state pkg.installed for [python-pylibmc]
2019-04-30 22:48:15,979 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7053] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2019-04-30 22:48:15,998 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7053] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'python-pylibmc'] in directory '/root'
2019-04-30 22:48:18,428 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7053] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2019-04-30 22:48:18,459 [salt.state       :300 ][INFO    ][7053] Made the following changes:
'python-pylibmc' changed from 'absent' to '1.5.0-4build1'
'libmemcached11' changed from 'absent' to '1.0.18-4.1ubuntu2'

2019-04-30 22:48:18,481 [salt.state       :915 ][INFO    ][7053] Loading fresh modules for state activity
2019-04-30 22:48:18,510 [salt.state       :1951][INFO    ][7053] Completed state [python-pylibmc] at time 22:48:18.510483 duration_in_ms=2548.489
2019-04-30 22:48:18,523 [salt.state       :1780][INFO    ][7053] Running state [openstack-dashboard] at time 22:48:18.523144
2019-04-30 22:48:18,523 [salt.state       :1813][INFO    ][7053] Executing state apache_conf.disabled for [openstack-dashboard]
2019-04-30 22:48:18,524 [salt.state       :300 ][INFO    ][7053] openstack-dashboard already disabled.
2019-04-30 22:48:18,524 [salt.state       :1951][INFO    ][7053] Completed state [openstack-dashboard] at time 22:48:18.524452 duration_in_ms=1.308
2019-04-30 22:48:18,524 [salt.state       :1780][INFO    ][7053] Running state [/etc/apache2/conf-available/openstack-dashboard.conf] at time 22:48:18.524952
2019-04-30 22:48:18,525 [salt.state       :1813][INFO    ][7053] Executing state file.absent for [/etc/apache2/conf-available/openstack-dashboard.conf]
2019-04-30 22:48:18,525 [salt.state       :300 ][INFO    ][7053] File /etc/apache2/conf-available/openstack-dashboard.conf is not present
2019-04-30 22:48:18,526 [salt.state       :1951][INFO    ][7053] Completed state [/etc/apache2/conf-available/openstack-dashboard.conf] at time 22:48:18.526001 duration_in_ms=1.049
2019-04-30 22:48:18,526 [salt.state       :1780][INFO    ][7053] Running state [/etc/apache2/sites-available/wsgi_openstack_web.conf] at time 22:48:18.526628
2019-04-30 22:48:18,526 [salt.state       :1813][INFO    ][7053] Executing state file.exists for [/etc/apache2/sites-available/wsgi_openstack_web.conf]
2019-04-30 22:48:18,527 [salt.state       :300 ][INFO    ][7053] Path /etc/apache2/sites-available/wsgi_openstack_web.conf exists
2019-04-30 22:48:18,527 [salt.state       :1951][INFO    ][7053] Completed state [/etc/apache2/sites-available/wsgi_openstack_web.conf] at time 22:48:18.527625 duration_in_ms=0.997
2019-04-30 22:48:18,906 [salt.state       :1780][INFO    ][7053] Running state [apache2] at time 22:48:18.906828
2019-04-30 22:48:18,907 [salt.state       :1813][INFO    ][7053] Executing state service.running for [apache2]
2019-04-30 22:48:18,908 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7053] Executing command ['systemctl', 'status', 'apache2.service', '-n', '0'] in directory '/root'
2019-04-30 22:48:18,919 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7053] Executing command ['systemctl', 'is-active', 'apache2.service'] in directory '/root'
2019-04-30 22:48:18,928 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7053] Executing command ['systemctl', 'is-enabled', 'apache2.service'] in directory '/root'
2019-04-30 22:48:18,939 [salt.state       :300 ][INFO    ][7053] The service apache2 is already running
2019-04-30 22:48:18,940 [salt.state       :1951][INFO    ][7053] Completed state [apache2] at time 22:48:18.940112 duration_in_ms=33.283
2019-04-30 22:48:18,940 [salt.state       :1780][INFO    ][7053] Running state [apache2] at time 22:48:18.940490
2019-04-30 22:48:18,940 [salt.state       :1813][INFO    ][7053] Executing state service.mod_watch for [apache2]
2019-04-30 22:48:18,941 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7053] Executing command ['systemctl', 'is-active', 'apache2.service'] in directory '/root'
2019-04-30 22:48:18,952 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7053] Executing command ['systemd-run', '--scope', 'systemctl', 'reload', 'apache2.service'] in directory '/root'
2019-04-30 22:48:19,135 [salt.state       :300 ][INFO    ][7053] {'apache2': True}
2019-04-30 22:48:19,135 [salt.state       :1951][INFO    ][7053] Completed state [apache2] at time 22:48:19.135546 duration_in_ms=195.056
2019-04-30 22:48:19,136 [salt.state       :1780][INFO    ][7053] Running state [/etc/apache2/conf-enabled/security.conf] at time 22:48:19.136484
2019-04-30 22:48:19,136 [salt.state       :1813][INFO    ][7053] Executing state file.symlink for [/etc/apache2/conf-enabled/security.conf]
2019-04-30 22:48:19,140 [salt.state       :300 ][INFO    ][7053] {'new': '/etc/apache2/conf-enabled/security.conf'}
2019-04-30 22:48:19,140 [salt.state       :1951][INFO    ][7053] Completed state [/etc/apache2/conf-enabled/security.conf] at time 22:48:19.140853 duration_in_ms=4.369
2019-04-30 22:48:19,147 [salt.state       :1780][INFO    ][7053] Running state [openstack-dashboard-apache] at time 22:48:19.147263
2019-04-30 22:48:19,147 [salt.state       :1813][INFO    ][7053] Executing state pkg.purged for [openstack-dashboard-apache]
2019-04-30 22:48:19,285 [salt.state       :300 ][INFO    ][7053] All specified packages are already absent
2019-04-30 22:48:19,285 [salt.state       :1951][INFO    ][7053] Completed state [openstack-dashboard-apache] at time 22:48:19.285822 duration_in_ms=138.559
2019-04-30 22:48:19,286 [salt.state       :1780][INFO    ][7053] Running state [/etc/openstack-dashboard/local_settings.py] at time 22:48:19.286644
2019-04-30 22:48:19,287 [salt.state       :1813][INFO    ][7053] Executing state file.managed for [/etc/openstack-dashboard/local_settings.py]
2019-04-30 22:48:19,305 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'horizon/files/local_settings/rocky_settings.py'
2019-04-30 22:48:19,353 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_local_settings.py'
2019-04-30 22:48:19,395 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_horizon_settings.py'
2019-04-30 22:48:19,418 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_keystone_settings.py'
2019-04-30 22:48:19,442 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_nova_settings.py'
2019-04-30 22:48:19,456 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_glance_settings.py'
2019-04-30 22:48:19,513 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_neutron_settings.py'
2019-04-30 22:48:19,529 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_heat_settings.py'
2019-04-30 22:48:19,544 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_cinder_settings.py'
2019-04-30 22:48:19,559 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_websso_settings.py'
2019-04-30 22:48:19,580 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_ssl_settings.py'
2019-04-30 22:48:19,594 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'horizon/files/horizon_settings/_django_settings.py'
2019-04-30 22:48:19,601 [salt.state       :300 ][INFO    ][7053] File changed:
--- 
+++ 
@@ -1,169 +1,83 @@
-# -*- coding: utf-8 -*-
-
 import os
 
+from django.utils.translation import pgettext_lazy
 from django.utils.translation import ugettext_lazy as _
-
-from horizon.utils import secret_key
-
-from openstack_dashboard.settings import HORIZON_CONFIG
-
-DEBUG = True
-
-# This setting controls whether or not compression is enabled. Disabling
-# compression makes Horizon considerably slower, but makes it much easier
-# to debug JS and CSS changes
-#COMPRESS_ENABLED = not DEBUG
-
-# This setting controls whether compression happens on the fly, or offline
-# with `python manage.py compress`
-# See https://django-compressor.readthedocs.io/en/latest/usage/#offline-compression
-# for more information
-#COMPRESS_OFFLINE = not DEBUG
-
-# WEBROOT is the location relative to Webserver root
-# should end with a slash.
-WEBROOT = '/'
-#LOGIN_URL = WEBROOT + 'auth/login/'
-#LOGOUT_URL = WEBROOT + 'auth/logout/'
-#
-# LOGIN_REDIRECT_URL can be used as an alternative for
-# HORIZON_CONFIG.user_home, if user_home is not set.
-# Do not set it to '/home/', as this will cause circular redirect loop
-#LOGIN_REDIRECT_URL = WEBROOT
-
-# If horizon is running in production (DEBUG is False), set this
-# with the list of host/domain names that the application can serve.
-# For more information see:
-# https://docs.djangoproject.com/en/dev/ref/settings/#allowed-hosts
-ALLOWED_HOSTS = [ 'prx01', 'localhost', ]
-
-# Set SSL proxy settings:
-# Pass this header from the proxy after terminating the SSL,
-# and don't forget to strip it from the client's request.
-# For more information see:
-# https://docs.djangoproject.com/en/dev/ref/settings/#secure-proxy-ssl-header
-#SECURE_PROXY_SSL_HEADER = ('HTTP_X_FORWARDED_PROTO', 'https')
-
-# If Horizon is being served through SSL, then uncomment the following two
-# settings to better secure the cookies from security exploits
-#CSRF_COOKIE_SECURE = True
-#SESSION_COOKIE_SECURE = True
-
-# The absolute path to the directory where message files are collected.
-# The message file must have a .json file extension. When the user logins to
-# horizon, the message files collected are processed and displayed to the user.
-#MESSAGES_PATH=None
-
-# Overrides for OpenStack API versions. Use this setting to force the
-# OpenStack dashboard to use a specific API version for a given service API.
-# Versions specified here should be integers or floats, not strings.
-# NOTE: The version should be formatted as it appears in the URL for the
-# service API. For example, The identity service APIs have inconsistent
-# use of the decimal point, so valid options would be 2.0 or 3.
-# Minimum compute version to get the instance locked status is 2.9.
-#OPENSTACK_API_VERSIONS = {
-#    "data-processing": 1.1,
-#    "identity": 3,
-#    "image": 2,
-#    "volume": 2,
-#    "compute": 2,
-#}
-
-# Set this to True if running on a multi-domain model. When this is enabled, it
-# will require the user to enter the Domain name in addition to the username
-# for login.
-#OPENSTACK_KEYSTONE_MULTIDOMAIN_SUPPORT = False
-
-# Set this to True if you want available domains displayed as a dropdown menu
-# on the login screen. It is strongly advised NOT to enable this for public
-# clouds, as advertising enabled domains to unauthenticated customers
-# irresponsibly exposes private information. This should only be used for
-# private clouds where the dashboard sits behind a corporate firewall.
-#OPENSTACK_KEYSTONE_DOMAIN_DROPDOWN = False
-
-# If OPENSTACK_KEYSTONE_DOMAIN_DROPDOWN is enabled, this option can be used to
-# set the available domains to choose from. This is a list of pairs whose first
-# value is the domain name and the second is the display name.
-#OPENSTACK_KEYSTONE_DOMAIN_CHOICES = (
-#  ('Default', 'Default'),
-#)
-
-# Overrides the default domain used when running on single-domain model
-# with Keystone V3. All entities will be created in the default domain.
-# NOTE: This value must be the name of the default domain, NOT the ID.
-# Also, you will most likely have a value in the keystone policy file like this
-#    "cloud_admin": "rule:admin_required and domain_id:<your domain id>"
-# This value must be the name of the domain whose ID is specified there.
-#OPENSTACK_KEYSTONE_DEFAULT_DOMAIN = 'Default'
-
-# Set this to True to enable panels that provide the ability for users to
-# manage Identity Providers (IdPs) and establish a set of rules to map
-# federation protocol attributes to Identity API attributes.
-# This extension requires v3.0+ of the Identity API.
-#OPENSTACK_KEYSTONE_FEDERATION_MANAGEMENT = False
-
-# Set Console type:
-# valid options are "AUTO"(default), "VNC", "SPICE", "RDP", "SERIAL", "MKS"
-# or None. Set to None explicitly if you want to deactivate the console.
-#CONSOLE_TYPE = "AUTO"
-
-# Toggle showing the openrc file for Keystone V2.
-# If set to false the link will be removed from the user dropdown menu
-# and the API Access page
-#SHOW_KEYSTONE_V2_RC = True
-
-# If provided, a "Report Bug" link will be displayed in the site header
-# which links to the value of this setting (ideally a URL containing
-# information on how to report issues).
-#HORIZON_CONFIG["bug_url"] = "http://bug-report.example.com"
-
-# Show backdrop element outside the modal, do not close the modal
-# after clicking on backdrop.
-#HORIZON_CONFIG["modal_backdrop"] = "static"
-
-# Specify a regular expression to validate user passwords.
-#HORIZON_CONFIG["password_validator"] = {
-#    "regex": '.*',
-#    "help_text": _("Your password does not meet the requirements."),
-#}
-
-# Turn off browser autocompletion for forms including the login form and
-# the database creation workflow if so desired.
-#HORIZON_CONFIG["password_autocomplete"] = "off"
-
-# Setting this to True will disable the reveal button for password fields,
-# including on the login form.
-#HORIZON_CONFIG["disable_password_reveal"] = False
+from openstack_dashboard import exceptions
+
+HORIZON_CONFIG = {
+    'user_home': 'openstack_dashboard.views.get_user_home',
+    'ajax_queue_limit': 10,
+    'auto_fade_alerts': {
+        'delay': 3000,
+        'fade_duration': 1500,
+        'types': ['alert-success', 'alert-info']
+    },
+    'help_url': "http://docs.openstack.org",
+    'exceptions': {'recoverable': exceptions.RECOVERABLE,
+                   'not_found': exceptions.NOT_FOUND,
+                   'unauthorized': exceptions.UNAUTHORIZED},
+    'modal_backdrop': 'static',
+    'angular_modules': [],
+    'js_files': [],
+    'js_spec_files': [],
+    'disable_password_reveal': True,
+    'password_autocomplete': 'off'
+}
+# 'key', 'label', 'path'
+AVAILABLE_THEMES = [
+    (
+        "default",
+        pgettext_lazy("Default style theme", "Default"),
+        "themes/default"
+    ),
+    (
+        "material",
+        pgettext_lazy("Google's Material Design style theme", "Material"),
+        "themes/material"
+    ),
+]
+
+# The default theme if no cookie is present
+DEFAULT_THEME = 'default'
+
+# Theme Static Directory
+THEME_COLLECTION_DIR = 'themes'
+
+# Theme Cookie Name
+THEME_COOKIE_NAME = 'theme'
+
+INSTALLED_APPS = (
+    'openstack_dashboard',
+    'django.contrib.contenttypes',
+    'django.contrib.auth',
+    'django.contrib.sessions',
+    'django.contrib.messages',
+    'django.contrib.staticfiles',
+    'django.contrib.humanize',
+    'compressor',
+    'horizon',
+    'openstack_auth',
+)
+
+
+
+DEBUG = False
+
+TEMPLATE_DEBUG = DEBUG
+
+ALLOWED_HOSTS = ['*']
+
+AUTHENTICATION_URLS = ['openstack_auth.urls']
 
 LOCAL_PATH = os.path.dirname(os.path.abspath(__file__))
 
-# Set custom secret key:
-# You can either set it to a specific value or you can let horizon generate a
-# default secret key that is unique on this machine, e.i. regardless of the
-# amount of Python WSGI workers (if used behind Apache+mod_wsgi): However,
-# there may be situations where you would want to set this explicitly, e.g.
-# when multiple dashboard instances are distributed on different machines
-# (usually behind a load-balancer). Either you have to make sure that a session
-# gets all requests routed to the same dashboard instance or you set the same
-# SECRET_KEY for all of them.
-SECRET_KEY = secret_key.generate_or_read_from_file(
-    os.path.join("/","var","lib","openstack-dashboard","secret-key", '.secret_key_store'))
-
-# We recommend you use memcached for development; otherwise after every reload
-# of the django development server, you will have to login again. To use
-# memcached set CACHES to something like
-#CACHES = {
-#    'default': {
-#        'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
-#        'LOCATION': '127.0.0.1:11211',
-#    },
-#}
+SECRET_KEY = 'opaesee8Que2yahJoh9fo0eefo1Aeyo6ahyei8zeiboh3aeth5loth7ieNa5xi5e'
 
 CACHES = {
     'default': {
-        'BACKEND': 'django.core.cache.backends.locmem.LocMemCache',
-    },
+        'BACKEND': "django.core.cache.backends.memcached.PyLibMCCache",
+        'LOCATION': "172.30.10.102:11211"
+    }
 }
 
 # Send email to the console by default
@@ -172,86 +86,249 @@
 #EMAIL_BACKEND = 'django.core.mail.backends.dummy.EmailBackend'
 
 # Configure these for your outgoing email host
-#EMAIL_HOST = 'smtp.my-company.com'
-#EMAIL_PORT = 25
-#EMAIL_HOST_USER = 'djangomail'
-#EMAIL_HOST_PASSWORD = 'top-secret!'
+# EMAIL_HOST = 'smtp.my-company.com'
+# EMAIL_PORT = 25
+# EMAIL_HOST_USER = 'djangomail'
+# EMAIL_HOST_PASSWORD = 'top-secret!'
+
+# The number of objects (Swift containers/objects or images) to display
+# on a single page before providing a paging element (a "more" link)
+# to paginate results.
+API_RESULT_LIMIT = 1000
+API_RESULT_PAGE_SIZE = 20
+
+# The timezone of the server. This should correspond with the timezone
+# of your entire OpenStack installation, and hopefully be in UTC.
+TIME_ZONE = "UTC"
+
+COMPRESS_OFFLINE = True
+
+# Trove user and database extension support. By default support for
+# creating users and databases on database instances is turned on.
+# To disable these extensions set the permission here to something
+# unusable such as ["!"].
+# TROVE_ADD_USER_PERMS = []
+# TROVE_ADD_DATABASE_PERMS = []
+
+SITE_BRANDING = 'OpenStack Dashboard'
+SESSION_COOKIE_HTTPONLY = True
+BOOT_ONLY_FROM_VOLUME = True
+
+REST_API_REQUIRED_SETTINGS = ['OPENSTACK_HYPERVISOR_FEATURES',
+                             'LAUNCH_INSTANCE_DEFAULTS',
+                             'OPENSTACK_IMAGE_FORMATS']
+
+
+# Specify a regular expression to validate user passwords.
+# HORIZON_CONFIG["password_validator"] = {
+#     "regex": '.*',
+#     "help_text": _("Your password does not meet the requirements.")
+# }
+
+# Turn off browser autocompletion for the login form if so desired.
+# HORIZON_CONFIG["password_autocomplete"] = "off"
+
+# The Horizon Policy Enforcement engine uses these values to load per service
+# policy rule files. The content of these files should match the files the
+# OpenStack services are using to determine role based access control in the
+# target installation.
+
+SESSION_TIMEOUT = 43200
+SESSION_ENGINE = "django.contrib.sessions.backends.cache"
+DROPDOWN_MAX_ITEMS = 30
+# A dictionary of settings which can be used to provide the default values for
+# properties found in the Launch Instance modal.
+
+# Path to directory containing policy.json files
+POLICY_FILES_PATH = "/usr/share/openstack-dashboard/openstack_dashboard/conf"
+# Map of local copy of service policy files
+POLICY_FILES = {
+    "compute": "nova_policy.json",
+    "network": "neutron_policy.json",
+    "image": "glance_policy.json",
+    "telemetry": "ceilometer_policy.json",
+    "volume": "cinder_policy.json",
+    "orchestration": "heat_policy.json",
+    "identity": "keystone_policy.json",
+}
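The two settings above combine as a simple path lookup when the policy engine loads per-service rules; `policy_file_for` below is a hypothetical helper used only to illustrate that, not a Horizon API:

```python
import os

# Values mirror the config above (trimmed to two services).
POLICY_FILES_PATH = "/usr/share/openstack-dashboard/openstack_dashboard/conf"
POLICY_FILES = {
    "compute": "nova_policy.json",
    "identity": "keystone_policy.json",
}

def policy_file_for(service):
    # Services without a local policy copy simply have no per-service rules.
    filename = POLICY_FILES.get(service)
    return os.path.join(POLICY_FILES_PATH, filename) if filename else None
```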
+
+LOGGING = {
+    'version': 1,
+    # When set to True this will disable all logging except
+    # for loggers specified in this configuration dictionary. Note that
+    # if nothing is specified here and disable_existing_loggers is True,
+    # django.db.backends will still log unless it is disabled explicitly.
+
+    'disable_existing_loggers': False,
+    'handlers': {
+        'null': {
+            'level': 'DEBUG',
+            'class': 'logging.NullHandler',
+        },
+        'console': {
+            # Set the level to "DEBUG" for verbose output logging.
+            'level': 'INFO',
+            'class': 'logging.StreamHandler',
+        },
+        'file': {
+            'level': 'DEBUG',
+            'class': 'logging.FileHandler',
+            'filename': '/var/log/horizon/horizon.log',
+        },
+    },
+    'loggers': {
+        # Logging from django.db.backends is VERY verbose, send to null
+        # by default.
+        'django.db.backends': {
+            'handlers': ['null'],
+            'propagate': False,
+        },
+        # Starting with Pike, DEBUG level for django.template produces some
+        # false positive traces, so set it to INFO by default (bug PROD-17558).
+        'django.template': {
+            'handlers': ['file'],
+            'level': 'INFO',
+            'propagate': True,
+        },
+        'requests': {
+            'handlers': ['null'],
+            'propagate': False,
+        },
+        'horizon': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'openstack_dashboard': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'novaclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'cinderclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'keystoneclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'glanceclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'neutronclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'heatclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'ceilometerclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'troveclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'mistralclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'swiftclient': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'openstack_auth': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'scss.expression': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'nose.plugins.manager': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'django': {
+            'handlers': ['file'],
+            'level': 'DEBUG',
+            'propagate': False,
+        },
+        'iso8601': {
+            'handlers': ['null'],
+            'propagate': False,
+        },
+    }
+}
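Django feeds a dict like the one above to the standard-library `logging.config.dictConfig`. A trimmed, runnable sketch of the same shape (the FileHandler is swapped for a NullHandler here so the snippet works without /var/log/horizon existing):

```python
import logging
import logging.config

logging.config.dictConfig({
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        'null': {'level': 'DEBUG', 'class': 'logging.NullHandler'},
    },
    'loggers': {
        # Same shape as the 'horizon' and 'django.db.backends' entries above.
        'horizon': {'handlers': ['null'], 'level': 'DEBUG', 'propagate': False},
        'django.db.backends': {'handlers': ['null'], 'propagate': False},
    },
})

horizon_logger = logging.getLogger('horizon')
```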
+
+
+# Overrides for OpenStack API versions. Use this setting to force the
+# OpenStack dashboard to use a specific API version for a given service API.
+# NOTE: The version should be formatted as it appears in the URL for the
+# service API. For example, the identity service APIs have inconsistent
+# use of the decimal point, so valid options would be "2.0" or "3".
+OPENSTACK_API_VERSIONS = {
+    "identity": 3
+}
+# Set this to True if running on multi-domain model. When this is enabled, it
+# will require user to enter the Domain name in addition to username for login.
+# OPENSTACK_KEYSTONE_MULTIDOMAIN_SUPPORT = False
+
+# Overrides the default domain used when running on single-domain model
+# with Keystone V3. All entities will be created in the default domain.
+# OPENSTACK_KEYSTONE_DEFAULT_DOMAIN = 'Default'
 
 # For multiple regions uncomment this configuration, and add (endpoint, title).
-#AVAILABLE_REGIONS = [
-#    ('http://cluster1.example.com:5000/v3', 'cluster1'),
-#    ('http://cluster2.example.com:5000/v3', 'cluster2'),
-#]
-
-OPENSTACK_HOST = "127.0.0.1"
+# AVAILABLE_REGIONS = [
+#     ('http://cluster1.example.com:5000/v2.0', 'cluster1'),
+#     ('http://cluster2.example.com:5000/v2.0', 'cluster2'),
+# ]
+
+
+OPENSTACK_HOST = "10.167.4.35"
 OPENSTACK_KEYSTONE_URL = "http://%s:5000/v3" % OPENSTACK_HOST
-OPENSTACK_KEYSTONE_DEFAULT_ROLE = "_member_"
-
-# For setting the default service region on a per-endpoint basis. Note that the
-# default value for this setting is {}, and below is just an example of how it
-# should be specified.
-# A key of '*' is an optional global default if no other key matches.
-#DEFAULT_SERVICE_REGIONS = {
-#    '*': 'RegionOne'
-#    OPENSTACK_KEYSTONE_URL: 'RegionTwo'
-#}
-
-# Enables keystone web single-sign-on if set to True.
-#WEBSSO_ENABLED = False
-
-# Authentication mechanism to be selected as default.
-# The value must be a key from WEBSSO_CHOICES.
-#WEBSSO_INITIAL_CHOICE = "credentials"
-
-# The list of authentication mechanisms which include keystone
-# federation protocols and identity provider/federation protocol
-# mapping keys (WEBSSO_IDP_MAPPING). Current supported protocol
-# IDs are 'saml2' and 'oidc'  which represent SAML 2.0, OpenID
-# Connect respectively.
-# Do not remove the mandatory credentials mechanism.
-# Note: The last two tuples are sample mapping keys to a identity provider
-# and federation protocol combination (WEBSSO_IDP_MAPPING).
-#WEBSSO_CHOICES = (
-#    ("credentials", _("Keystone Credentials")),
-#    ("oidc", _("OpenID Connect")),
-#    ("saml2", _("Security Assertion Markup Language")),
-#    ("acme_oidc", "ACME - OpenID Connect"),
-#    ("acme_saml2", "ACME - SAML2"),
-#)
-
-# A dictionary of specific identity provider and federation protocol
-# combinations. From the selected authentication mechanism, the value
-# will be looked up as keys in the dictionary. If a match is found,
-# it will redirect the user to a identity provider and federation protocol
-# specific WebSSO endpoint in keystone, otherwise it will use the value
-# as the protocol_id when redirecting to the WebSSO by protocol endpoint.
-# NOTE: The value is expected to be a tuple formatted as: (<idp_id>, <protocol_id>).
-#WEBSSO_IDP_MAPPING = {
-#    "acme_oidc": ("acme", "oidc"),
-#    "acme_saml2": ("acme", "saml2"),
-#}
-
-# If set this URL will be used for web single-sign-on authentication
-# instead of OPENSTACK_KEYSTONE_URL. This is needed in the deployment
-# scenarios where network segmentation is used per security requirement.
-# In this case, the controllers are not reachable from public network.
-# Therefore, user's browser will not be able to access OPENSTACK_KEYSTONE_URL
-# if it is set to the internal endpoint.
-#WEBSSO_KEYSTONE_URL = "http://keystone-public.example.com/v3"
-
-# The Keystone Provider drop down uses Keystone to Keystone federation
-# to switch between Keystone service providers.
-# Set display name for Identity Provider (dropdown display name)
-#KEYSTONE_PROVIDER_IDP_NAME = "Local Keystone"
-# This id is used for only for comparison with the service provider IDs. This ID
-# should not match any service provider IDs.
-#KEYSTONE_PROVIDER_IDP_ID = "localkeystone"
+
+OPENSTACK_KEYSTONE_MULTIDOMAIN_SUPPORT = False
+OPENSTACK_KEYSTONE_DEFAULT_DOMAIN = "default"
+
+OPENSTACK_KEYSTONE_DEFAULT_ROLE = "Member"
 
 # Disable SSL certificate checks (useful for self-signed certificates):
-#OPENSTACK_SSL_NO_VERIFY = True
 
 # The CA certificate to use to verify SSL connections
-#OPENSTACK_SSL_CACERT = '/path/to/cacert.pem'
+# OPENSTACK_SSL_CACERT = '/path/to/cacert.pem'
+
+# OPENSTACK_ENDPOINT_TYPE specifies the endpoint type to use for the endpoints
+# in the Keystone service catalog. Use this setting when Horizon is running
+# external to the OpenStack environment. The default is 'publicURL'.
+OPENSTACK_ENDPOINT_TYPE = "internalURL"
+
+# SECONDARY_ENDPOINT_TYPE specifies the fallback endpoint type to use in the
+# case that OPENSTACK_ENDPOINT_TYPE is not present in the endpoints
+# in the Keystone service catalog. Use this setting when Horizon is running
+# external to the OpenStack environment. The default is None. This
+# value should differ from OPENSTACK_ENDPOINT_TYPE if used.
+# SECONDARY_ENDPOINT_TYPE = "publicURL"
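The primary/secondary pair above describes a simple fallback when reading the Keystone service catalog; a hypothetical sketch of that selection logic, with the endpoint URLs invented for illustration:

```python
# Hypothetical fallback: prefer OPENSTACK_ENDPOINT_TYPE, then fall back to
# SECONDARY_ENDPOINT_TYPE if the primary key is absent from the catalog entry.
def pick_endpoint(endpoints, primary="internalURL", secondary="publicURL"):
    return endpoints.get(primary) or endpoints.get(secondary)

public_only = {"publicURL": "https://keystone.example.com:5000"}
both = {"internalURL": "http://10.0.0.1:5000",
        "publicURL": "https://keystone.example.com:5000"}
```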
 
 # The OPENSTACK_KEYSTONE_BACKEND settings can be used to identify the
 # capabilities of the auth backend for Keystone.
@@ -265,38 +342,13 @@
     'can_edit_group': True,
     'can_edit_project': True,
     'can_edit_domain': True,
-    'can_edit_role': True,
-}
-
-# Setting this to True, will add a new "Retrieve Password" action on instance,
-# allowing Admin session password retrieval/decryption.
-#OPENSTACK_ENABLE_PASSWORD_RETRIEVE = False
-
-# The Launch Instance user experience has been significantly enhanced.
-# You can choose whether to enable the new launch instance experience,
-# the legacy experience, or both. The legacy experience will be removed
-# in a future release, but is available as a temporary backup setting to ensure
-# compatibility with existing deployments. Further development will not be
-# done on the legacy experience. Please report any problems with the new
-# experience via the Launchpad tracking system.
-#
-# Toggle LAUNCH_INSTANCE_LEGACY_ENABLED and LAUNCH_INSTANCE_NG_ENABLED to
-# determine the experience to enable.  Set them both to true to enable
-# both.
-#LAUNCH_INSTANCE_LEGACY_ENABLED = True
-#LAUNCH_INSTANCE_NG_ENABLED = False
-
-# A dictionary of settings which can be used to provide the default values for
-# properties found in the Launch Instance modal.
-#LAUNCH_INSTANCE_DEFAULTS = {
-#    'config_drive': False,
-#    'enable_scheduler_hints': True,
-#    'disable_image': False,
-#    'disable_instance_snapshot': False,
-#    'disable_volume': False,
-#    'disable_volume_snapshot': False,
-#    'create_volume': True,
-#}
+    'can_edit_role': True
+}
+
+
+# Set Console type:
+# valid options would be "AUTO", "VNC" or "SPICE"
+# CONSOLE_TYPE = "AUTO"
 
 # The Xen Hypervisor has the ability to set the mount point for volumes
 # attached to instances (other Hypervisors currently do not). Setting
@@ -305,102 +357,52 @@
 OPENSTACK_HYPERVISOR_FEATURES = {
     'can_set_mount_point': False,
     'can_set_password': False,
-    'requires_keypair': False,
-    'enable_quotas': True
-}
-
-# This settings controls whether IP addresses of servers are retrieved from
-# neutron in the project instance table. Setting this to ``False`` may mitigate
-# a performance issue in the project instance table in large deployments.
-#OPENSTACK_INSTANCE_RETRIEVE_IP_ADDRESSES = True
-
-# The OPENSTACK_CINDER_FEATURES settings can be used to enable optional
-# services provided by cinder that is not exposed by its extension API.
-OPENSTACK_CINDER_FEATURES = {
-    'enable_backup': False,
-}
-
-# The OPENSTACK_NEUTRON_NETWORK settings can be used to enable optional
-# services provided by neutron. Options currently available are load
-# balancer service, security groups, quotas, VPN service.
-OPENSTACK_NEUTRON_NETWORK = {
-    'enable_router': True,
-    'enable_quotas': True,
-    'enable_ipv6': True,
-    'enable_distributed_router': False,
-    'enable_ha_router': False,
-    'enable_fip_topology_check': True,
-
-    # Default dns servers you would like to use when a subnet is
-    # created.  This is only a default, users can still choose a different
-    # list of dns servers when creating a new subnet.
-    # The entries below are examples only, and are not appropriate for
-    # real deployments
-    # 'default_dns_nameservers': ["8.8.8.8", "8.8.4.4", "208.67.222.222"],
-
-    # Set which provider network types are supported. Only the network types
-    # in this list will be available to choose from when creating a network.
-    # Network types include local, flat, vlan, gre, vxlan and geneve.
-    # 'supported_provider_types': ['*'],
-
-    # You can configure available segmentation ID range per network type
-    # in your deployment.
-    # 'segmentation_id_range': {
-    #     'vlan': [1024, 2048],
-    #     'vxlan': [4094, 65536],
-    # },
-
-    # You can define additional provider network types here.
-    # 'extra_provider_types': {
-    #     'awesome_type': {
-    #         'display_name': 'Awesome New Type',
-    #         'require_physical_network': False,
-    #         'require_segmentation_id': True,
-    #     }
-    # },
-
-    # Set which VNIC types are supported for port binding. Only the VNIC
-    # types in this list will be available to choose from when creating a
-    # port.
-    # VNIC types include 'normal', 'direct', 'direct-physical', 'macvtap',
-    # 'baremetal' and 'virtio-forwarder'
-    # Set to empty list or None to disable VNIC type selection.
-    'supported_vnic_types': ['*'],
-
-    # Set list of available physical networks to be selected in the physical
-    # network field on the admin create network modal. If it's set to an empty
-    # list, the field will be a regular input field.
-    # e.g. ['default', 'test']
-    'physical_networks': [],
-
-}
-
-# The OPENSTACK_HEAT_STACK settings can be used to disable password
-# field required while launching the stack.
-OPENSTACK_HEAT_STACK = {
-    'enable_user_pass': True,
-}
+}
+
+# When set, enables the instance action "Retrieve password"
+# allowing password retrieval
+OPENSTACK_ENABLE_PASSWORD_RETRIEVE = False
+
+# When launching an instance, the menu of available flavors is
+# sorted by RAM usage, ascending.  Provide a callback method here
+# (and/or a flag for reverse sort) for the sorted() method if you'd
+# like a different behaviour.  For more info, see
+# http://docs.python.org/2/library/functions.html#sorted
+# CREATE_INSTANCE_FLAVOR_SORT = {
+#     'key': my_awesome_callback_method,
+#     'reverse': False,
+# }
+
+FLAVOR_EXTRA_KEYS = {
+    'flavor_keys': [
+        ('quota:read_bytes_sec', _('Quota: Read bytes')),
+        ('quota:write_bytes_sec', _('Quota: Write bytes')),
+        ('quota:cpu_quota', _('Quota: CPU')),
+        ('quota:cpu_period', _('Quota: CPU period')),
+        ('quota:inbound_average', _('Quota: Inbound average')),
+        ('quota:outbound_average', _('Quota: Outbound average')),
+    ]
+}
+
 
 # The OPENSTACK_IMAGE_BACKEND settings can be used to customize features
 # in the OpenStack Dashboard related to the Image service, such as the list
 # of supported image formats.
-#OPENSTACK_IMAGE_BACKEND = {
-#    'image_formats': [
-#        ('', _('Select format')),
-#        ('aki', _('AKI - Amazon Kernel Image')),
-#        ('ami', _('AMI - Amazon Machine Image')),
-#        ('ari', _('ARI - Amazon Ramdisk Image')),
-#        ('docker', _('Docker')),
-#        ('iso', _('ISO - Optical Disk Image')),
-#        ('ova', _('OVA - Open Virtual Appliance')),
-#        ('qcow2', _('QCOW2 - QEMU Emulator')),
-#        ('raw', _('Raw')),
-#        ('vdi', _('VDI - Virtual Disk Image')),
-#        ('vhd', _('VHD - Virtual Hard Disk')),
-#        ('vhdx', _('VHDX - Large Virtual Hard Disk')),
-#        ('vmdk', _('VMDK - Virtual Machine Disk')),
-#    ],
-#}
+OPENSTACK_IMAGE_BACKEND = {
+    'image_formats': [
+        ('', ''),
+        ('aki', _('AKI - Amazon Kernel Image')),
+        ('ami', _('AMI - Amazon Machine Image')),
+        ('ari', _('ARI - Amazon Ramdisk Image')),
+        ('iso', _('ISO - Optical Disk Image')),
+        ('qcow2', _('QCOW2 - QEMU Emulator')),
+        ('raw', _('Raw')),
+        ('vdi', _('VDI')),
+        ('vhd', _('VHD')),
+        ('vmdk', _('VMDK')),
+        ('docker', _('Docker Container'))
+    ]
+}
 
 # The IMAGE_CUSTOM_PROPERTY_TITLES settings is used to customize the titles for
 # image custom property attributes that appear on image detail pages.
@@ -410,270 +412,54 @@
     "ramdisk_id": _("Ramdisk ID"),
     "image_state": _("Euca2ools state"),
     "project_id": _("Project ID"),
-    "image_type": _("Image Type"),
-}
-
-# The IMAGE_RESERVED_CUSTOM_PROPERTIES setting is used to specify which image
-# custom properties should not be displayed in the Image Custom Properties
-# table.
-IMAGE_RESERVED_CUSTOM_PROPERTIES = []
-
-# Set to 'legacy' or 'direct' to allow users to upload images to glance via
-# Horizon server. When enabled, a file form field will appear on the create
-# image form. If set to 'off', there will be no file form field on the create
-# image form. See documentation for deployment considerations.
-#HORIZON_IMAGES_UPLOAD_MODE = 'legacy'
-
-# Allow a location to be set when creating or updating Glance images.
-# If using Glance V2, this value should be False unless the Glance
-# configuration and policies allow setting locations.
-#IMAGES_ALLOW_LOCATION = False
-
-# A dictionary of default settings for create image modal.
-#CREATE_IMAGE_DEFAULTS = {
-#    'image_visibility': "public",
-#}
-
-# OPENSTACK_ENDPOINT_TYPE specifies the endpoint type to use for the endpoints
-# in the Keystone service catalog. Use this setting when Horizon is running
-# external to the OpenStack environment. The default is 'publicURL'.
-#OPENSTACK_ENDPOINT_TYPE = "publicURL"
-
-# SECONDARY_ENDPOINT_TYPE specifies the fallback endpoint type to use in the
-# case that OPENSTACK_ENDPOINT_TYPE is not present in the endpoints
-# in the Keystone service catalog. Use this setting when Horizon is running
-# external to the OpenStack environment. The default is None. This
-# value should differ from OPENSTACK_ENDPOINT_TYPE if used.
-#SECONDARY_ENDPOINT_TYPE = None
-
-# The number of objects (Swift containers/objects or images) to display
-# on a single page before providing a paging element (a "more" link)
-# to paginate results.
-API_RESULT_LIMIT = 1000
-API_RESULT_PAGE_SIZE = 20
-
-# The size of chunk in bytes for downloading objects from Swift
-SWIFT_FILE_TRANSFER_CHUNK_SIZE = 512 * 1024
-
-# The default number of lines displayed for instance console log.
-INSTANCE_LOG_LENGTH = 35
-
-# Specify a maximum number of items to display in a dropdown.
-DROPDOWN_MAX_ITEMS = 30
-
-# The timezone of the server. This should correspond with the timezone
-# of your entire OpenStack installation, and hopefully be in UTC.
-TIME_ZONE = "UTC"
-
-# When launching an instance, the menu of available flavors is
-# sorted by RAM usage, ascending. If you would like a different sort order,
-# you can provide another flavor attribute as sorting key. Alternatively, you
-# can provide a custom callback method to use for sorting. You can also provide
-# a flag for reverse sort. For more info, see
-# http://docs.python.org/2/library/functions.html#sorted
-#CREATE_INSTANCE_FLAVOR_SORT = {
-#    'key': 'name',
-#     # or
-#    'key': my_awesome_callback_method,
-#    'reverse': False,
-#}
-
-# Set this to True to display an 'Admin Password' field on the Change Password
-# form to verify that it is indeed the admin logged-in who wants to change
-# the password.
-#ENFORCE_PASSWORD_CHECK = False
-
-# Modules that provide /auth routes that can be used to handle different types
-# of user authentication. Add auth plugins that require extra route handling to
-# this list.
-#AUTHENTICATION_URLS = [
-#    'openstack_auth.urls',
-#]
-
-# The Horizon Policy Enforcement engine uses these values to load per service
-# policy rule files. The content of these files should match the files the
-# OpenStack services are using to determine role based access control in the
-# target installation.
-
-# Path to directory containing policy.json files
-#POLICY_FILES_PATH = os.path.join(ROOT_PATH, "conf")
-
-# Map of local copy of service policy files.
-# Please insure that your identity policy file matches the one being used on
-# your keystone servers. There is an alternate policy file that may be used
-# in the Keystone v3 multi-domain case, policy.v3cloudsample.json.
-# This file is not included in the Horizon repository by default but can be
-# found at
-# http://git.openstack.org/cgit/openstack/keystone/tree/etc/ \
-# policy.v3cloudsample.json
-# Having matching policy files on the Horizon and Keystone servers is essential
-# for normal operation. This holds true for all services and their policy files.
-#POLICY_FILES = {
-#    'identity': 'keystone_policy.json',
-#    'compute': 'nova_policy.json',
-#    'volume': 'cinder_policy.json',
-#    'image': 'glance_policy.json',
-#    'network': 'neutron_policy.json',
-#}
-
-# Change this patch to the appropriate list of tuples containing
-# a key, label and static directory containing two files:
-# _variables.scss and _styles.scss
-#AVAILABLE_THEMES = [
-#    ('default', 'Default', 'themes/default'),
-#    ('material', 'Material', 'themes/material'),
-#]
-
-LOGGING = {
-    'version': 1,
-    # When set to True this will disable all logging except
-    # for loggers specified in this configuration dictionary. Note that
-    # if nothing is specified here and disable_existing_loggers is True,
-    # django.db.backends will still log unless it is disabled explicitly.
-    'disable_existing_loggers': False,
-    # If apache2 mod_wsgi is used to deploy OpenStack dashboard
-    # timestamp is output by mod_wsgi. If WSGI framework you use does not
-    # output timestamp for logging, add %(asctime)s in the following
-    # format definitions.
-    'formatters': {
-        'console': {
-            'format': '%(levelname)s %(name)s %(message)s'
-        },
-        'operation': {
-            # The format of "%(message)s" is defined by
-            # OPERATION_LOG_OPTIONS['format']
-            'format': '%(message)s'
-        },
-    },
-    'handlers': {
-        'null': {
-            'level': 'DEBUG',
-            'class': 'logging.NullHandler',
-        },
-        'console': {
-            # Set the level to "DEBUG" for verbose output logging.
-            'level': 'INFO',
-            'class': 'logging.StreamHandler',
-            'formatter': 'console',
-        },
-        'operation': {
-            'level': 'INFO',
-            'class': 'logging.StreamHandler',
-            'formatter': 'operation',
-        },
-    },
-    'loggers': {
-        'horizon': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'horizon.operation_log': {
-            'handlers': ['operation'],
-            'level': 'INFO',
-            'propagate': False,
-        },
-        'openstack_dashboard': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'novaclient': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'cinderclient': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'keystoneauth': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'keystoneclient': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'glanceclient': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'neutronclient': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'swiftclient': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'oslo_policy': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'openstack_auth': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        'django': {
-            'handlers': ['console'],
-            'level': 'DEBUG',
-            'propagate': False,
-        },
-        # Logging from django.db.backends is VERY verbose, send to null
-        # by default.
-        'django.db.backends': {
-            'handlers': ['null'],
-            'propagate': False,
-        },
-        'requests': {
-            'handlers': ['null'],
-            'propagate': False,
-        },
-        'urllib3': {
-            'handlers': ['null'],
-            'propagate': False,
-        },
-        'chardet.charsetprober': {
-            'handlers': ['null'],
-            'propagate': False,
-        },
-        'iso8601': {
-            'handlers': ['null'],
-            'propagate': False,
-        },
-        'scss': {
-            'handlers': ['null'],
-            'propagate': False,
-        },
-    },
+    "image_type": _("Image Type")
+}
+# Allow a location to be set when creating or updating Glance images.
+IMAGES_ALLOW_LOCATION = True
+
+HORIZON_IMAGES_UPLOAD_MODE = "direct"
+
+
+# Disable simplified floating IP address management for deployments with
+# multiple floating IP pools or complex network requirements.
+# HORIZON_CONFIG["simple_ip_management"] = False
+
+# The OPENSTACK_NEUTRON_NETWORK settings can be used to enable optional
+# services provided by neutron. Options currently available are load
+# balancer service, security groups, quotas, VPN service.
+
+OPENSTACK_NEUTRON_NETWORK = {
+    'enable_lb': True,
+    'enable_firewall': False,
+    'enable_quotas': True,
+    'enable_security_group': True,
+    'enable_vpn': False,
+    # The profile_support option is used to detect if an external router can
+    # be configured via the dashboard. When using specific plugins the
+    # profile_support can be turned on if needed.
+    'profile_support': None,
+    'enable_fip_topology_check': True,
+
+    # 'profile_support': 'cisco',
 }
 
 # 'direction' should not be specified for all_tcp/udp/icmp.
 # It is specified in the form.
 SECURITY_GROUP_RULES = {
     'all_tcp': {
-        'name': _('All TCP'),
+        'name': 'ALL TCP',
         'ip_protocol': 'tcp',
         'from_port': '1',
         'to_port': '65535',
     },
     'all_udp': {
-        'name': _('All UDP'),
+        'name': 'ALL UDP',
         'ip_protocol': 'udp',
         'from_port': '1',
         'to_port': '65535',
     },
     'all_icmp': {
-        'name': _('All ICMP'),
+        'name': 'ALL ICMP',
         'ip_protocol': 'icmp',
         'from_port': '-1',
         'to_port': '-1',
@@ -764,136 +550,18 @@
     },
 }
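The entries in SECURITY_GROUP_RULES store ports as strings, with "-1" meaning "no port" for protocols like ICMP; a hypothetical conversion helper illustrating that convention (the rule dicts below are trimmed copies of the config values):

```python
# Hypothetical helper: turn a rule's string ports into an int range, or
# None for protocols such as ICMP where from/to are both "-1".
def port_range(rule):
    lo, hi = int(rule['from_port']), int(rule['to_port'])
    return None if lo == -1 else (lo, hi)

all_tcp = {'ip_protocol': 'tcp', 'from_port': '1', 'to_port': '65535'}
all_icmp = {'ip_protocol': 'icmp', 'from_port': '-1', 'to_port': '-1'}
```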
 
-# Deprecation Notice:
-#
-# The setting FLAVOR_EXTRA_KEYS has been deprecated.
-# Please load extra spec metadata into the Glance Metadata Definition Catalog.
-#
-# The sample quota definitions can be found in:
-# <glance_source>/etc/metadefs/compute-quota.json
-#
-# The metadata definition catalog supports CLI and API:
-#  $glance --os-image-api-version 2 help md-namespace-import
-#  $glance-manage db_load_metadefs <directory_with_definition_files>
-#
-# See Metadata Definitions on:
-# https://docs.openstack.org/glance/latest/user/glancemetadefcatalogapi.html
-
-# The hash algorithm to use for authentication tokens. This must
-# match the hash algorithm that the identity server and the
-# auth_token middleware are using. Allowed values are the
-# algorithms supported by Python's hashlib library.
-#OPENSTACK_TOKEN_HASH_ALGORITHM = 'md5'
-
-# AngularJS requires some settings to be made available to
-# the client side. Some settings are required by in-tree / built-in horizon
-# features. These settings must be added to REST_API_REQUIRED_SETTINGS in the
-# form of ['SETTING_1','SETTING_2'], etc.
-#
-# You may remove settings from this list for security purposes, but do so at
-# the risk of breaking a built-in horizon feature. These settings are required
-# for horizon to function properly. Only remove them if you know what you
-# are doing. These settings may in the future be moved to be defined within
-# the enabled panel configuration.
-# You should not add settings to this list for out of tree extensions.
-# See: https://wiki.openstack.org/wiki/Horizon/RESTAPI
-REST_API_REQUIRED_SETTINGS = ['OPENSTACK_HYPERVISOR_FEATURES',
-                              'LAUNCH_INSTANCE_DEFAULTS',
-                              'OPENSTACK_IMAGE_FORMATS',
-                              'OPENSTACK_KEYSTONE_BACKEND',
-                              'OPENSTACK_KEYSTONE_DEFAULT_DOMAIN',
-                              'CREATE_IMAGE_DEFAULTS',
-                              'ENFORCE_PASSWORD_CHECK']
-
-# Additional settings can be made available to the client side for
-# extensibility by specifying them in REST_API_ADDITIONAL_SETTINGS
-# !! Please use extreme caution as the settings are transferred via HTTP/S
-# and are not encrypted on the browser. This is an experimental API and
-# may be deprecated in the future without notice.
-#REST_API_ADDITIONAL_SETTINGS = []
-
-# DISALLOW_IFRAME_EMBED can be used to prevent Horizon from being embedded
-# within an iframe. Legacy browsers are still vulnerable to a Cross-Frame
-# Scripting (XFS) vulnerability, so this option allows extra security hardening
-# where iframes are not used in deployment. Default setting is True.
-# For more information see:
-# http://tinyurl.com/anticlickjack
-#DISALLOW_IFRAME_EMBED = True
-
-# Help URL can be made available for the client. To provide a help URL, edit the
-# following attribute to the URL of your choice.
-#HORIZON_CONFIG["help_url"] = "http://openstack.mycompany.org"
-
-# Settings for OperationLogMiddleware
-# OPERATION_LOG_ENABLED is flag to use the function to log an operation on
-# Horizon.
-# mask_targets is arrangement for appointing a target to mask.
-# method_targets is arrangement of HTTP method to output log.
-# format is the log contents.
-#OPERATION_LOG_ENABLED = False
-#OPERATION_LOG_OPTIONS = {
-#    'mask_fields': ['password'],
-#    'target_methods': ['POST'],
-#    'ignored_urls': ['/js/', '/static/', '^/api/'],
-#    'format': ("[%(client_ip)s] [%(domain_name)s]"
-#        " [%(domain_id)s] [%(project_name)s]"
-#        " [%(project_id)s] [%(user_name)s] [%(user_id)s] [%(request_scheme)s]"
-#        " [%(referer_url)s] [%(request_url)s] [%(message)s] [%(method)s]"
-#        " [%(http_status)s] [%(param)s]"),
-#}
-
-# The default date range in the Overview panel meters - either <today> minus N
-# days (if the value is integer N), or from the beginning of the current month
-# until today (if set to None). This setting should be used to limit the amount
-# of data fetched by default when rendering the Overview panel.
-#OVERVIEW_DAYS_RANGE = 1
-
-# To allow operators to require users provide a search criteria first
-# before loading any data into the views, set the following dict
-# attributes to True in each one of the panels you want to enable this feature.
-# Follow the convention <dashboard>.<view>
-#FILTER_DATA_FIRST = {
-#    'admin.instances': False,
-#    'admin.images': False,
-#    'admin.networks': False,
-#    'admin.routers': False,
-#    'admin.volumes': False,
-#    'identity.users': False,
-#    'identity.projects': False,
-#    'identity.groups': False,
-#    'identity.roles': False
-#}
-
-# Dict used to restrict user private subnet cidr range.
-# An empty list means that user input will not be restricted
-# for a corresponding IP version. By default, there is
-# no restriction for IPv4 or IPv6. To restrict
-# user private subnet cidr range set ALLOWED_PRIVATE_SUBNET_CIDR
-# to something like
-#ALLOWED_PRIVATE_SUBNET_CIDR = {
-#    'ipv4': ['10.0.0.0/8', '192.168.0.0/16'],
-#    'ipv6': ['fc00::/7']
-#}
-ALLOWED_PRIVATE_SUBNET_CIDR = {'ipv4': [], 'ipv6': []}
-
-# Projects and users can have extra attributes as defined by keystone v3.
-# Horizon has the ability to display these extra attributes via this setting.
-# If you'd like to display extra data in the project or user tables, set the
-# corresponding dict key to the attribute name, followed by the display name.
-# For more information, see horizon's customization
-# (https://docs.openstack.org/horizon/latest/configuration/customizing.html#horizon-customization-module-overrides)
-#PROJECT_TABLE_EXTRA_INFO = {
-#   'phone_num': _('Phone Number'),
-#}
-#USER_TABLE_EXTRA_INFO = {
-#   'phone_num': _('Phone Number'),
-#}
-
-# Password will have an expiration date when using keystone v3 and enabling the
-# feature.
-# This setting allows you to set the number of days that the user will be alerted
-# prior to the password expiration.
-# Once the password expires keystone will deny the access and users must
-# contact an admin to change their password.
-#PASSWORD_EXPIRES_WARNING_THRESHOLD_DAYS = 0
-COMPRESS_OFFLINE=True
+
+
+
+
+
+
+USE_SSL = True
+SECURE_PROXY_SSL_HEADER = ('HTTP_X_FORWARDED_PROTOCOL', 'https')
+CSRF_COOKIE_SECURE = True
+SESSION_COOKIE_SECURE = True
+
+
+
+
+FILE_UPLOAD_TEMP_DIR = '/var/tmp/'

2019-04-30 22:48:19,641 [salt.state       :915 ][INFO    ][7053] Loading fresh modules for state activity
2019-04-30 22:48:19,687 [salt.state       :1951][INFO    ][7053] Completed state [/etc/openstack-dashboard/local_settings.py] at time 22:48:19.687265 duration_in_ms=400.62
2019-04-30 22:48:19,704 [salt.state       :1780][INFO    ][7053] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/nova_policy.json] at time 22:48:19.704569
2019-04-30 22:48:19,705 [salt.state       :1813][INFO    ][7053] Executing state file.managed for [/usr/share/openstack-dashboard/openstack_dashboard/conf/nova_policy.json]
2019-04-30 22:48:19,736 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/rocky/nova_policy.json'
2019-04-30 22:48:19,740 [salt.state       :300 ][INFO    ][7053] File changed:
--- 
+++ 
@@ -1,174 +1,500 @@
 {
-    "context_is_admin": "role:admin",
-    "admin_or_owner": "is_admin:True or project_id:%(project_id)s",
+    "context_is_admin":  "role:admin",
+    "admin_or_owner":  "is_admin:True or project_id:%(project_id)s",
+    "default": "rule:admin_or_owner",
+
+    "cells_scheduler_filter:TargetCellFilter": "is_admin:True",
+
+    "compute:create": "rule:admin_or_owner",
+    "compute:create:attach_network": "rule:admin_or_owner",
+    "compute:create:attach_volume": "rule:admin_or_owner",
+    "compute:create:forced_host": "is_admin:True",
+
+    "compute:get": "rule:admin_or_owner",
+    "compute:get_all": "rule:admin_or_owner",
+    "compute:get_all_tenants": "is_admin:True",
+
+    "compute:update": "rule:admin_or_owner",
+
+    "compute:get_instance_metadata": "rule:admin_or_owner",
+    "compute:get_all_instance_metadata": "rule:admin_or_owner",
+    "compute:get_all_instance_system_metadata": "rule:admin_or_owner",
+    "compute:update_instance_metadata": "rule:admin_or_owner",
+    "compute:delete_instance_metadata": "rule:admin_or_owner",
+
+    "compute:get_diagnostics": "rule:admin_or_owner",
+    "compute:get_instance_diagnostics": "rule:admin_or_owner",
+
+    "compute:start": "rule:admin_or_owner",
+    "compute:stop": "rule:admin_or_owner",
+
+    "compute:lock": "rule:admin_or_owner",
+    "compute:unlock": "rule:admin_or_owner",
+    "compute:unlock_override": "rule:admin_api",
+
+    "compute:get_vnc_console": "rule:admin_or_owner",
+    "compute:get_spice_console": "rule:admin_or_owner",
+    "compute:get_rdp_console": "rule:admin_or_owner",
+    "compute:get_serial_console": "rule:admin_or_owner",
+    "compute:get_mks_console": "rule:admin_or_owner",
+    "compute:get_console_output": "rule:admin_or_owner",
+
+    "compute:reset_network": "rule:admin_or_owner",
+    "compute:inject_network_info": "rule:admin_or_owner",
+    "compute:add_fixed_ip": "rule:admin_or_owner",
+    "compute:remove_fixed_ip": "rule:admin_or_owner",
+
+    "compute:attach_volume": "rule:admin_or_owner",
+    "compute:detach_volume": "rule:admin_or_owner",
+    "compute:swap_volume": "rule:admin_api",
+
+    "compute:attach_interface": "rule:admin_or_owner",
+    "compute:detach_interface": "rule:admin_or_owner",
+
+    "compute:set_admin_password": "rule:admin_or_owner",
+
+    "compute:rescue": "rule:admin_or_owner",
+    "compute:unrescue": "rule:admin_or_owner",
+
+    "compute:suspend": "rule:admin_or_owner",
+    "compute:resume": "rule:admin_or_owner",
+
+    "compute:pause": "rule:admin_or_owner",
+    "compute:unpause": "rule:admin_or_owner",
+
+    "compute:shelve": "rule:admin_or_owner",
+    "compute:shelve_offload": "rule:admin_or_owner",
+    "compute:unshelve": "rule:admin_or_owner",
+
+    "compute:snapshot": "rule:admin_or_owner",
+    "compute:snapshot_volume_backed": "rule:admin_or_owner",
+    "compute:backup": "rule:admin_or_owner",
+
+    "compute:resize": "rule:admin_or_owner",
+    "compute:confirm_resize": "rule:admin_or_owner",
+    "compute:revert_resize": "rule:admin_or_owner",
+
+    "compute:rebuild": "rule:admin_or_owner",
+    "compute:reboot": "rule:admin_or_owner",
+    "compute:delete": "rule:admin_or_owner",
+    "compute:soft_delete": "rule:admin_or_owner",
+    "compute:force_delete": "rule:admin_or_owner",
+
+    "compute:security_groups:add_to_instance": "rule:admin_or_owner",
+    "compute:security_groups:remove_from_instance": "rule:admin_or_owner",
+
+    "compute:restore": "rule:admin_or_owner",
+
+    "compute:volume_snapshot_create": "rule:admin_or_owner",
+    "compute:volume_snapshot_delete": "rule:admin_or_owner",
+
     "admin_api": "is_admin:True",
-    "os_compute_api:os-admin-actions:reset_state": "rule:admin_api",
-    "os_compute_api:os-admin-actions:inject_network_info": "rule:admin_api",
-    "os_compute_api:os-admin-actions:reset_network": "rule:admin_api",
-    "os_compute_api:os-admin-password": "rule:admin_or_owner",
-    "os_compute_api:os-agents": "rule:admin_api",
-    "os_compute_api:os-aggregates:set_metadata": "rule:admin_api",
-    "os_compute_api:os-aggregates:add_host": "rule:admin_api",
-    "os_compute_api:os-aggregates:create": "rule:admin_api",
-    "os_compute_api:os-aggregates:remove_host": "rule:admin_api",
-    "os_compute_api:os-aggregates:update": "rule:admin_api",
-    "os_compute_api:os-aggregates:index": "rule:admin_api",
-    "os_compute_api:os-aggregates:delete": "rule:admin_api",
-    "os_compute_api:os-aggregates:show": "rule:admin_api",
-    "os_compute_api:os-assisted-volume-snapshots:create": "rule:admin_api",
-    "os_compute_api:os-assisted-volume-snapshots:delete": "rule:admin_api",
-    "os_compute_api:os-attach-interfaces": "rule:admin_or_owner",
-    "os_compute_api:os-attach-interfaces:create": "rule:admin_or_owner",
-    "os_compute_api:os-attach-interfaces:delete": "rule:admin_or_owner",
-    "os_compute_api:os-availability-zone:list": "rule:admin_or_owner",
-    "os_compute_api:os-availability-zone:detail": "rule:admin_api",
-    "os_compute_api:os-baremetal-nodes": "rule:admin_api",
-    "os_compute_api:os-cells:update": "rule:admin_api",
-    "os_compute_api:os-cells:create": "rule:admin_api",
-    "os_compute_api:os-cells": "rule:admin_api",
-    "os_compute_api:os-cells:sync_instances": "rule:admin_api",
-    "os_compute_api:os-cells:delete": "rule:admin_api",
-    "cells_scheduler_filter:DifferentCellFilter": "is_admin:True",
-    "cells_scheduler_filter:TargetCellFilter": "is_admin:True",
-    "os_compute_api:os-config-drive": "rule:admin_or_owner",
-    "os_compute_api:os-console-auth-tokens": "rule:admin_api",
-    "os_compute_api:os-console-output": "rule:admin_or_owner",
-    "os_compute_api:os-consoles:create": "rule:admin_or_owner",
-    "os_compute_api:os-consoles:show": "rule:admin_or_owner",
-    "os_compute_api:os-consoles:delete": "rule:admin_or_owner",
-    "os_compute_api:os-consoles:index": "rule:admin_or_owner",
-    "os_compute_api:os-create-backup": "rule:admin_or_owner",
-    "os_compute_api:os-deferred-delete": "rule:admin_or_owner",
-    "os_compute_api:os-evacuate": "rule:admin_api",
-    "os_compute_api:os-extended-availability-zone": "rule:admin_or_owner",
-    "os_compute_api:os-extended-server-attributes": "rule:admin_api",
-    "os_compute_api:os-extended-status": "rule:admin_or_owner",
-    "os_compute_api:os-extended-volumes": "rule:admin_or_owner",
-    "os_compute_api:extensions": "rule:admin_or_owner",
-    "os_compute_api:os-fixed-ips": "rule:admin_api",
-    "os_compute_api:os-flavor-access:add_tenant_access": "rule:admin_api",
-    "os_compute_api:os-flavor-access:remove_tenant_access": "rule:admin_api",
-    "os_compute_api:os-flavor-access": "rule:admin_or_owner",
-    "os_compute_api:os-flavor-extra-specs:show": "rule:admin_or_owner",
-    "os_compute_api:os-flavor-extra-specs:create": "rule:admin_api",
-    "os_compute_api:os-flavor-extra-specs:update": "rule:admin_api",
-    "os_compute_api:os-flavor-extra-specs:delete": "rule:admin_api",
-    "os_compute_api:os-flavor-extra-specs:index": "rule:admin_or_owner",
-    "os_compute_api:os-flavor-manage": "rule:admin_api",
-    "os_compute_api:os-flavor-manage:create": "rule:os_compute_api:os-flavor-manage",
-    "os_compute_api:os-flavor-manage:update": "rule:admin_api",
-    "os_compute_api:os-flavor-manage:delete": "rule:os_compute_api:os-flavor-manage",
-    "os_compute_api:os-flavor-rxtx": "rule:admin_or_owner",
-    "os_compute_api:flavors": "rule:admin_or_owner",
-    "os_compute_api:os-floating-ip-dns": "rule:admin_or_owner",
-    "os_compute_api:os-floating-ip-dns:domain:update": "rule:admin_api",
-    "os_compute_api:os-floating-ip-dns:domain:delete": "rule:admin_api",
-    "os_compute_api:os-floating-ip-pools": "rule:admin_or_owner",
-    "os_compute_api:os-floating-ips": "rule:admin_or_owner",
-    "os_compute_api:os-floating-ips-bulk": "rule:admin_api",
-    "os_compute_api:os-fping:all_tenants": "rule:admin_api",
-    "os_compute_api:os-fping": "rule:admin_or_owner",
-    "os_compute_api:os-hide-server-addresses": "is_admin:False",
-    "os_compute_api:os-hosts": "rule:admin_api",
-    "os_compute_api:os-hypervisors": "rule:admin_api",
-    "os_compute_api:image-size": "rule:admin_or_owner",
-    "os_compute_api:os-instance-actions:events": "rule:admin_api",
-    "os_compute_api:os-instance-actions": "rule:admin_or_owner",
-    "os_compute_api:os-instance-usage-audit-log": "rule:admin_api",
-    "os_compute_api:ips:show": "rule:admin_or_owner",
-    "os_compute_api:ips:index": "rule:admin_or_owner",
-    "os_compute_api:os-keypairs:index": "rule:admin_api or user_id:%(user_id)s",
-    "os_compute_api:os-keypairs:create": "rule:admin_api or user_id:%(user_id)s",
-    "os_compute_api:os-keypairs:delete": "rule:admin_api or user_id:%(user_id)s",
-    "os_compute_api:os-keypairs:show": "rule:admin_api or user_id:%(user_id)s",
-    "os_compute_api:os-keypairs": "rule:admin_or_owner",
-    "os_compute_api:limits": "rule:admin_or_owner",
-    "os_compute_api:os-lock-server:lock": "rule:admin_or_owner",
-    "os_compute_api:os-lock-server:unlock": "rule:admin_or_owner",
-    "os_compute_api:os-lock-server:unlock:unlock_override": "rule:admin_api",
-    "os_compute_api:os-migrate-server:migrate": "rule:admin_api",
-    "os_compute_api:os-migrate-server:migrate_live": "rule:admin_api",
-    "os_compute_api:os-migrations:index": "rule:admin_api",
-    "os_compute_api:os-multinic": "rule:admin_or_owner",
-    "os_compute_api:os-networks": "rule:admin_api",
-    "os_compute_api:os-networks:view": "rule:admin_or_owner",
-    "os_compute_api:os-networks-associate": "rule:admin_api",
-    "os_compute_api:os-pause-server:pause": "rule:admin_or_owner",
-    "os_compute_api:os-pause-server:unpause": "rule:admin_or_owner",
-    "os_compute_api:os-quota-class-sets:show": "is_admin:True or quota_class:%(quota_class)s",
-    "os_compute_api:os-quota-class-sets:update": "rule:admin_api",
-    "os_compute_api:os-quota-sets:update": "rule:admin_api",
-    "os_compute_api:os-quota-sets:defaults": "@",
-    "os_compute_api:os-quota-sets:show": "rule:admin_or_owner",
-    "os_compute_api:os-quota-sets:delete": "rule:admin_api",
-    "os_compute_api:os-quota-sets:detail": "rule:admin_or_owner",
-    "os_compute_api:os-remote-consoles": "rule:admin_or_owner",
-    "os_compute_api:os-rescue": "rule:admin_or_owner",
-    "os_compute_api:os-security-group-default-rules": "rule:admin_api",
-    "os_compute_api:os-security-groups": "rule:admin_or_owner",
-    "os_compute_api:os-server-diagnostics": "rule:admin_api",
-    "os_compute_api:os-server-external-events:create": "rule:admin_api",
-    "os_compute_api:os-server-groups": "rule:admin_or_owner",
-    "os_compute_api:os-server-groups:create": "rule:os_compute_api:os-server-groups",
-    "os_compute_api:os-server-groups:delete": "rule:os_compute_api:os-server-groups",
-    "os_compute_api:os-server-groups:index": "rule:os_compute_api:os-server-groups",
-    "os_compute_api:os-server-groups:show": "rule:os_compute_api:os-server-groups",
-    "os_compute_api:server-metadata:index": "rule:admin_or_owner",
-    "os_compute_api:server-metadata:show": "rule:admin_or_owner",
-    "os_compute_api:server-metadata:create": "rule:admin_or_owner",
-    "os_compute_api:server-metadata:update_all": "rule:admin_or_owner",
-    "os_compute_api:server-metadata:update": "rule:admin_or_owner",
-    "os_compute_api:server-metadata:delete": "rule:admin_or_owner",
-    "os_compute_api:os-server-password": "rule:admin_or_owner",
-    "os_compute_api:os-server-tags:delete_all": "rule:admin_or_owner",
-    "os_compute_api:os-server-tags:index": "rule:admin_or_owner",
-    "os_compute_api:os-server-tags:update_all": "rule:admin_or_owner",
-    "os_compute_api:os-server-tags:delete": "rule:admin_or_owner",
-    "os_compute_api:os-server-tags:update": "rule:admin_or_owner",
-    "os_compute_api:os-server-tags:show": "rule:admin_or_owner",
-    "os_compute_api:os-server-usage": "rule:admin_or_owner",
+    "compute_extension:accounts": "rule:admin_api",
+    "compute_extension:admin_actions": "rule:admin_api",
+    "compute_extension:admin_actions:pause": "rule:admin_or_owner",
+    "compute_extension:admin_actions:unpause": "rule:admin_or_owner",
+    "compute_extension:admin_actions:suspend": "rule:admin_or_owner",
+    "compute_extension:admin_actions:resume": "rule:admin_or_owner",
+    "compute_extension:admin_actions:lock": "rule:admin_or_owner",
+    "compute_extension:admin_actions:unlock": "rule:admin_or_owner",
+    "compute_extension:admin_actions:resetNetwork": "rule:admin_api",
+    "compute_extension:admin_actions:injectNetworkInfo": "rule:admin_api",
+    "compute_extension:admin_actions:createBackup": "rule:admin_or_owner",
+    "compute_extension:admin_actions:migrateLive": "rule:admin_api",
+    "compute_extension:admin_actions:resetState": "rule:admin_api",
+    "compute_extension:admin_actions:migrate": "rule:admin_api",
+    "compute_extension:aggregates": "rule:admin_api",
+    "compute_extension:agents": "rule:admin_api",
+    "compute_extension:attach_interfaces": "rule:admin_or_owner",
+    "compute_extension:baremetal_nodes": "rule:admin_api",
+    "compute_extension:cells": "rule:admin_api",
+    "compute_extension:cells:create": "rule:admin_api",
+    "compute_extension:cells:delete": "rule:admin_api",
+    "compute_extension:cells:update": "rule:admin_api",
+    "compute_extension:cells:sync_instances": "rule:admin_api",
+    "compute_extension:certificates": "rule:admin_or_owner",
+    "compute_extension:cloudpipe": "rule:admin_api",
+    "compute_extension:cloudpipe_update": "rule:admin_api",
+    "compute_extension:config_drive": "rule:admin_or_owner",
+    "compute_extension:console_output": "rule:admin_or_owner",
+    "compute_extension:consoles": "rule:admin_or_owner",
+    "compute_extension:createserverext": "rule:admin_or_owner",
+    "compute_extension:deferred_delete": "rule:admin_or_owner",
+    "compute_extension:disk_config": "rule:admin_or_owner",
+    "compute_extension:evacuate": "rule:admin_api",
+    "compute_extension:extended_server_attributes": "rule:admin_api",
+    "compute_extension:extended_status": "rule:admin_or_owner",
+    "compute_extension:extended_availability_zone": "rule:admin_or_owner",
+    "compute_extension:extended_ips": "rule:admin_or_owner",
+    "compute_extension:extended_ips_mac": "rule:admin_or_owner",
+    "compute_extension:extended_vif_net": "rule:admin_or_owner",
+    "compute_extension:extended_volumes": "rule:admin_or_owner",
+    "compute_extension:fixed_ips": "rule:admin_api",
+    "compute_extension:flavor_access": "rule:admin_or_owner",
+    "compute_extension:flavor_access:addTenantAccess": "rule:admin_api",
+    "compute_extension:flavor_access:removeTenantAccess": "rule:admin_api",
+    "compute_extension:flavor_disabled": "rule:admin_or_owner",
+    "compute_extension:flavor_rxtx": "rule:admin_or_owner",
+    "compute_extension:flavor_swap": "rule:admin_or_owner",
+    "compute_extension:flavorextradata": "rule:admin_or_owner",
+    "compute_extension:flavorextraspecs:index": "rule:admin_or_owner",
+    "compute_extension:flavorextraspecs:show": "rule:admin_or_owner",
+    "compute_extension:flavorextraspecs:create": "rule:admin_api",
+    "compute_extension:flavorextraspecs:update": "rule:admin_api",
+    "compute_extension:flavorextraspecs:delete": "rule:admin_api",
+    "compute_extension:flavormanage": "rule:admin_api",
+    "compute_extension:floating_ip_dns": "rule:admin_or_owner",
+    "compute_extension:floating_ip_pools": "rule:admin_or_owner",
+    "compute_extension:floating_ips": "rule:admin_or_owner",
+    "compute_extension:floating_ips_bulk": "rule:admin_api",
+    "compute_extension:fping": "rule:admin_or_owner",
+    "compute_extension:fping:all_tenants": "rule:admin_api",
+    "compute_extension:hide_server_addresses": "is_admin:False",
+    "compute_extension:hosts": "rule:admin_api",
+    "compute_extension:hypervisors": "rule:admin_api",
+    "compute_extension:image_size": "rule:admin_or_owner",
+    "compute_extension:instance_actions": "rule:admin_or_owner",
+    "compute_extension:instance_actions:events": "rule:admin_api",
+    "compute_extension:instance_usage_audit_log": "rule:admin_api",
+    "compute_extension:keypairs": "rule:admin_or_owner",
+    "compute_extension:keypairs:index": "rule:admin_or_owner",
+    "compute_extension:keypairs:show": "rule:admin_or_owner",
+    "compute_extension:keypairs:create": "rule:admin_or_owner",
+    "compute_extension:keypairs:delete": "rule:admin_or_owner",
+    "compute_extension:multinic": "rule:admin_or_owner",
+    "compute_extension:networks": "rule:admin_api",
+    "compute_extension:networks:view": "rule:admin_or_owner",
+    "compute_extension:networks_associate": "rule:admin_api",
+    "compute_extension:os-tenant-networks": "rule:admin_or_owner",
+    "compute_extension:quotas:show": "rule:admin_or_owner",
+    "compute_extension:quotas:update": "rule:admin_api",
+    "compute_extension:quotas:delete": "rule:admin_api",
+    "compute_extension:quota_classes": "rule:admin_or_owner",
+    "compute_extension:rescue": "rule:admin_or_owner",
+    "compute_extension:security_group_default_rules": "rule:admin_api",
+    "compute_extension:security_groups": "rule:admin_or_owner",
+    "compute_extension:server_diagnostics": "rule:admin_api",
+    "compute_extension:server_groups": "rule:admin_or_owner",
+    "compute_extension:server_password": "rule:admin_or_owner",
+    "compute_extension:server_usage": "rule:admin_or_owner",
+    "compute_extension:services": "rule:admin_api",
+    "compute_extension:shelve": "rule:admin_or_owner",
+    "compute_extension:shelveOffload": "rule:admin_api",
+    "compute_extension:simple_tenant_usage:show": "rule:admin_or_owner",
+    "compute_extension:simple_tenant_usage:list": "rule:admin_api",
+    "compute_extension:unshelve": "rule:admin_or_owner",
+    "compute_extension:users": "rule:admin_api",
+    "compute_extension:virtual_interfaces": "rule:admin_or_owner",
+    "compute_extension:virtual_storage_arrays": "rule:admin_or_owner",
+    "compute_extension:volumes": "rule:admin_or_owner",
+    "compute_extension:volume_attachments:index": "rule:admin_or_owner",
+    "compute_extension:volume_attachments:show": "rule:admin_or_owner",
+    "compute_extension:volume_attachments:create": "rule:admin_or_owner",
+    "compute_extension:volume_attachments:update": "rule:admin_api",
+    "compute_extension:volume_attachments:delete": "rule:admin_or_owner",
+    "compute_extension:volumetypes": "rule:admin_or_owner",
+    "compute_extension:availability_zone:list": "rule:admin_or_owner",
+    "compute_extension:availability_zone:detail": "rule:admin_api",
+    "compute_extension:used_limits_for_admin": "rule:admin_api",
+    "compute_extension:migrations:index": "rule:admin_api",
+    "compute_extension:os-assisted-volume-snapshots:create": "rule:admin_api",
+    "compute_extension:os-assisted-volume-snapshots:delete": "rule:admin_api",
+    "compute_extension:console_auth_tokens": "rule:admin_api",
+    "compute_extension:os-server-external-events:create": "rule:admin_api",
+
+    "network:get_all": "rule:admin_or_owner",
+    "network:get": "rule:admin_or_owner",
+    "network:create": "rule:admin_or_owner",
+    "network:delete": "rule:admin_or_owner",
+    "network:associate": "rule:admin_or_owner",
+    "network:disassociate": "rule:admin_or_owner",
+    "network:get_vifs_by_instance": "rule:admin_or_owner",
+    "network:allocate_for_instance": "rule:admin_or_owner",
+    "network:deallocate_for_instance": "rule:admin_or_owner",
+    "network:validate_networks": "rule:admin_or_owner",
+    "network:get_instance_uuids_by_ip_filter": "rule:admin_or_owner",
+    "network:get_instance_id_by_floating_address": "rule:admin_or_owner",
+    "network:setup_networks_on_host": "rule:admin_or_owner",
+    "network:get_backdoor_port": "rule:admin_or_owner",
+
+    "network:get_floating_ip": "rule:admin_or_owner",
+    "network:get_floating_ip_pools": "rule:admin_or_owner",
+    "network:get_floating_ip_by_address": "rule:admin_or_owner",
+    "network:get_floating_ips_by_project": "rule:admin_or_owner",
+    "network:get_floating_ips_by_fixed_address": "rule:admin_or_owner",
+    "network:allocate_floating_ip": "rule:admin_or_owner",
+    "network:associate_floating_ip": "rule:admin_or_owner",
+    "network:disassociate_floating_ip": "rule:admin_or_owner",
+    "network:release_floating_ip": "rule:admin_or_owner",
+    "network:migrate_instance_start": "rule:admin_or_owner",
+    "network:migrate_instance_finish": "rule:admin_or_owner",
+
+    "network:get_fixed_ip": "rule:admin_or_owner",
+    "network:get_fixed_ip_by_address": "rule:admin_or_owner",
+    "network:add_fixed_ip_to_instance": "rule:admin_or_owner",
+    "network:remove_fixed_ip_from_instance": "rule:admin_or_owner",
+    "network:add_network_to_project": "rule:admin_or_owner",
+    "network:get_instance_nw_info": "rule:admin_or_owner",
+
+    "network:get_dns_domains": "rule:admin_or_owner",
+    "network:add_dns_entry": "rule:admin_or_owner",
+    "network:modify_dns_entry": "rule:admin_or_owner",
+    "network:delete_dns_entry": "rule:admin_or_owner",
+    "network:get_dns_entries_by_address": "rule:admin_or_owner",
+    "network:get_dns_entries_by_name": "rule:admin_or_owner",
+    "network:create_private_dns_domain": "rule:admin_or_owner",
+    "network:create_public_dns_domain": "rule:admin_or_owner",
+    "network:delete_dns_domain": "rule:admin_or_owner",
+    "network:attach_external_network": "rule:admin_api",
+    "network:get_vif_by_mac_address": "rule:admin_or_owner",
+
+    "os_compute_api:servers:detail:get_all_tenants": "is_admin:True",
+    "os_compute_api:servers:index:get_all_tenants": "is_admin:True",
+    "os_compute_api:servers:confirm_resize": "rule:admin_or_owner",
+    "os_compute_api:servers:create": "rule:admin_or_owner",
+    "os_compute_api:servers:create:attach_network": "rule:admin_or_owner",
+    "os_compute_api:servers:create:attach_volume": "rule:admin_or_owner",
+    "os_compute_api:servers:create:forced_host": "rule:admin_api",
+    "os_compute_api:servers:delete": "rule:admin_or_owner",
+    "os_compute_api:servers:update": "rule:admin_or_owner",
+    "os_compute_api:servers:detail": "rule:admin_or_owner",
     "os_compute_api:servers:index": "rule:admin_or_owner",
-    "os_compute_api:servers:detail": "rule:admin_or_owner",
-    "os_compute_api:servers:index:get_all_tenants": "rule:admin_api",
-    "os_compute_api:servers:detail:get_all_tenants": "rule:admin_api",
+    "os_compute_api:servers:reboot": "rule:admin_or_owner",
+    "os_compute_api:servers:rebuild": "rule:admin_or_owner",
+    "os_compute_api:servers:resize": "rule:admin_or_owner",
+    "os_compute_api:servers:revert_resize": "rule:admin_or_owner",
     "os_compute_api:servers:show": "rule:admin_or_owner",
     "os_compute_api:servers:show:host_status": "rule:admin_api",
-    "os_compute_api:servers:create": "rule:admin_or_owner",
-    "os_compute_api:servers:create:forced_host": "rule:admin_api",
-    "os_compute_api:servers:create:attach_volume": "rule:admin_or_owner",
-    "os_compute_api:servers:create:attach_network": "rule:admin_or_owner",
-    "network:attach_external_network": "is_admin:True",
-    "os_compute_api:servers:delete": "rule:admin_or_owner",
-    "os_compute_api:servers:update": "rule:admin_or_owner",
-    "os_compute_api:servers:confirm_resize": "rule:admin_or_owner",
-    "os_compute_api:servers:revert_resize": "rule:admin_or_owner",
-    "os_compute_api:servers:reboot": "rule:admin_or_owner",
-    "os_compute_api:servers:resize": "rule:admin_or_owner",
-    "os_compute_api:servers:rebuild": "rule:admin_or_owner",
     "os_compute_api:servers:create_image": "rule:admin_or_owner",
     "os_compute_api:servers:create_image:allow_volume_backed": "rule:admin_or_owner",
     "os_compute_api:servers:start": "rule:admin_or_owner",
     "os_compute_api:servers:stop": "rule:admin_or_owner",
     "os_compute_api:servers:trigger_crash_dump": "rule:admin_or_owner",
-    "os_compute_api:servers:migrations:show": "rule:admin_api",
     "os_compute_api:servers:migrations:force_complete": "rule:admin_api",
     "os_compute_api:servers:migrations:delete": "rule:admin_api",
+    "os_compute_api:servers:discoverable": "@",
     "os_compute_api:servers:migrations:index": "rule:admin_api",
+    "os_compute_api:servers:migrations:show": "rule:admin_api",
+    "os_compute_api:os-access-ips:discoverable": "@",
+    "os_compute_api:os-access-ips": "rule:admin_or_owner",
+    "os_compute_api:os-admin-actions": "rule:admin_api",
+    "os_compute_api:os-admin-actions:discoverable": "@",
+    "os_compute_api:os-admin-actions:reset_network": "rule:admin_api",
+    "os_compute_api:os-admin-actions:inject_network_info": "rule:admin_api",
+    "os_compute_api:os-admin-actions:reset_state": "rule:admin_api",
+    "os_compute_api:os-admin-password": "rule:admin_or_owner",
+    "os_compute_api:os-admin-password:discoverable": "@",
+    "os_compute_api:os-aggregates:discoverable": "@",
+    "os_compute_api:os-aggregates:index": "rule:admin_api",
+    "os_compute_api:os-aggregates:create": "rule:admin_api",
+    "os_compute_api:os-aggregates:show": "rule:admin_api",
+    "os_compute_api:os-aggregates:update": "rule:admin_api",
+    "os_compute_api:os-aggregates:delete": "rule:admin_api",
+    "os_compute_api:os-aggregates:add_host": "rule:admin_api",
+    "os_compute_api:os-aggregates:remove_host": "rule:admin_api",
+    "os_compute_api:os-aggregates:set_metadata": "rule:admin_api",
+    "os_compute_api:os-agents": "rule:admin_api",
+    "os_compute_api:os-agents:discoverable": "@",
+    "os_compute_api:os-attach-interfaces": "rule:admin_or_owner",
+    "os_compute_api:os-attach-interfaces:discoverable": "@",
+    "os_compute_api:os-baremetal-nodes": "rule:admin_api",
+    "os_compute_api:os-baremetal-nodes:discoverable": "@",
+    "os_compute_api:os-block-device-mapping-v1:discoverable": "@",
+    "os_compute_api:os-cells": "rule:admin_api",
+    "os_compute_api:os-cells:create": "rule:admin_api",
+    "os_compute_api:os-cells:delete": "rule:admin_api",
+    "os_compute_api:os-cells:update": "rule:admin_api",
+    "os_compute_api:os-cells:sync_instances": "rule:admin_api",
+    "os_compute_api:os-cells:discoverable": "@",
+    "os_compute_api:os-certificates:create": "rule:admin_or_owner",
+    "os_compute_api:os-certificates:show": "rule:admin_or_owner",
+    "os_compute_api:os-certificates:discoverable": "@",
+    "os_compute_api:os-cloudpipe": "rule:admin_api",
+    "os_compute_api:os-cloudpipe:discoverable": "@",
+    "os_compute_api:os-config-drive": "rule:admin_or_owner",
+    "os_compute_api:os-config-drive:discoverable": "@",
+    "os_compute_api:os-consoles:discoverable": "@",
+    "os_compute_api:os-consoles:create": "rule:admin_or_owner",
+    "os_compute_api:os-consoles:delete": "rule:admin_or_owner",
+    "os_compute_api:os-consoles:index": "rule:admin_or_owner",
+    "os_compute_api:os-consoles:show": "rule:admin_or_owner",
+    "os_compute_api:os-console-output:discoverable": "@",
+    "os_compute_api:os-console-output": "rule:admin_or_owner",
+    "os_compute_api:os-remote-consoles": "rule:admin_or_owner",
+    "os_compute_api:os-remote-consoles:discoverable": "@",
+    "os_compute_api:os-create-backup:discoverable": "@",
+    "os_compute_api:os-create-backup": "rule:admin_or_owner",
+    "os_compute_api:os-deferred-delete": "rule:admin_or_owner",
+    "os_compute_api:os-deferred-delete:discoverable": "@",
+    "os_compute_api:os-disk-config": "rule:admin_or_owner",
+    "os_compute_api:os-disk-config:discoverable": "@",
+    "os_compute_api:os-evacuate": "rule:admin_api",
+    "os_compute_api:os-evacuate:discoverable": "@",
+    "os_compute_api:os-extended-server-attributes": "rule:admin_api",
+    "os_compute_api:os-extended-server-attributes:discoverable": "@",
+    "os_compute_api:os-extended-status": "rule:admin_or_owner",
+    "os_compute_api:os-extended-status:discoverable": "@",
+    "os_compute_api:os-extended-availability-zone": "rule:admin_or_owner",
+    "os_compute_api:os-extended-availability-zone:discoverable": "@",
+    "os_compute_api:extensions": "rule:admin_or_owner",
+    "os_compute_api:extensions:discoverable": "@",
+    "os_compute_api:extension_info:discoverable": "@",
+    "os_compute_api:os-extended-volumes": "rule:admin_or_owner",
+    "os_compute_api:os-extended-volumes:discoverable": "@",
+    "os_compute_api:os-fixed-ips": "rule:admin_api",
+    "os_compute_api:os-fixed-ips:discoverable": "@",
+    "os_compute_api:os-flavor-access": "rule:admin_or_owner",
+    "os_compute_api:os-flavor-access:discoverable": "@",
+    "os_compute_api:os-flavor-access:remove_tenant_access": "rule:admin_api",
+    "os_compute_api:os-flavor-access:add_tenant_access": "rule:admin_api",
+    "os_compute_api:os-flavor-rxtx": "rule:admin_or_owner",
+    "os_compute_api:os-flavor-rxtx:discoverable": "@",
+    "os_compute_api:flavors": "rule:admin_or_owner",
+    "os_compute_api:flavors:discoverable": "@",
+    "os_compute_api:os-flavor-extra-specs:discoverable": "@",
+    "os_compute_api:os-flavor-extra-specs:index": "rule:admin_or_owner",
+    "os_compute_api:os-flavor-extra-specs:show": "rule:admin_or_owner",
+    "os_compute_api:os-flavor-extra-specs:create": "rule:admin_api",
+    "os_compute_api:os-flavor-extra-specs:update": "rule:admin_api",
+    "os_compute_api:os-flavor-extra-specs:delete": "rule:admin_api",
+    "os_compute_api:os-flavor-manage:discoverable": "@",
+    "os_compute_api:os-flavor-manage": "rule:admin_api",
+    "os_compute_api:os-floating-ip-dns": "rule:admin_or_owner",
+    "os_compute_api:os-floating-ip-dns:discoverable": "@",
+    "os_compute_api:os-floating-ip-dns:domain:update": "rule:admin_api",
+    "os_compute_api:os-floating-ip-dns:domain:delete": "rule:admin_api",
+    "os_compute_api:os-floating-ip-pools": "rule:admin_or_owner",
+    "os_compute_api:os-floating-ip-pools:discoverable": "@",
+    "os_compute_api:os-floating-ips": "rule:admin_or_owner",
+    "os_compute_api:os-floating-ips:discoverable": "@",
+    "os_compute_api:os-floating-ips-bulk": "rule:admin_api",
+    "os_compute_api:os-floating-ips-bulk:discoverable": "@",
+    "os_compute_api:os-fping": "rule:admin_or_owner",
+    "os_compute_api:os-fping:discoverable": "@",
+    "os_compute_api:os-fping:all_tenants": "rule:admin_api",
+    "os_compute_api:os-hide-server-addresses": "is_admin:False",
+    "os_compute_api:os-hide-server-addresses:discoverable": "@",
+    "os_compute_api:os-hosts": "rule:admin_api",
+    "os_compute_api:os-hosts:discoverable": "@",
+    "os_compute_api:os-hypervisors": "rule:admin_api",
+    "os_compute_api:os-hypervisors:discoverable": "@",
+    "os_compute_api:images:discoverable": "@",
+    "os_compute_api:image-size": "rule:admin_or_owner",
+    "os_compute_api:image-size:discoverable": "@",
+    "os_compute_api:os-instance-actions": "rule:admin_or_owner",
+    "os_compute_api:os-instance-actions:discoverable": "@",
+    "os_compute_api:os-instance-actions:events": "rule:admin_api",
+    "os_compute_api:os-instance-usage-audit-log": "rule:admin_api",
+    "os_compute_api:os-instance-usage-audit-log:discoverable": "@",
+    "os_compute_api:ips:discoverable": "@",
+    "os_compute_api:ips:index": "rule:admin_or_owner",
+    "os_compute_api:ips:show": "rule:admin_or_owner",
+    "os_compute_api:os-keypairs:discoverable": "@",
+    "os_compute_api:os-keypairs": "rule:admin_or_owner",
+    "os_compute_api:os-keypairs:index": "rule:admin_api or user_id:%(user_id)s",
+    "os_compute_api:os-keypairs:show": "rule:admin_api or user_id:%(user_id)s",
+    "os_compute_api:os-keypairs:create": "rule:admin_api or user_id:%(user_id)s",
+    "os_compute_api:os-keypairs:delete": "rule:admin_api or user_id:%(user_id)s",
+    "os_compute_api:limits:discoverable": "@",
+    "os_compute_api:limits": "rule:admin_or_owner",
+    "os_compute_api:os-lock-server:discoverable": "@",
+    "os_compute_api:os-lock-server:lock": "rule:admin_or_owner",
+    "os_compute_api:os-lock-server:unlock": "rule:admin_or_owner",
+    "os_compute_api:os-lock-server:unlock:unlock_override": "rule:admin_api",
+    "os_compute_api:os-migrate-server:discoverable": "@",
+    "os_compute_api:os-migrate-server:migrate": "rule:admin_api",
+    "os_compute_api:os-migrate-server:migrate_live": "rule:admin_api",
+    "os_compute_api:os-multinic": "rule:admin_or_owner",
+    "os_compute_api:os-multinic:discoverable": "@",
+    "os_compute_api:os-networks": "rule:admin_api",
+    "os_compute_api:os-networks:view": "rule:admin_or_owner",
+    "os_compute_api:os-networks:discoverable": "@",
+    "os_compute_api:os-networks-associate": "rule:admin_api",
+    "os_compute_api:os-networks-associate:discoverable": "@",
+    "os_compute_api:os-pause-server:discoverable": "@",
+    "os_compute_api:os-pause-server:pause": "rule:admin_or_owner",
+    "os_compute_api:os-pause-server:unpause": "rule:admin_or_owner",
+    "os_compute_api:os-pci:pci_servers": "rule:admin_or_owner",
+    "os_compute_api:os-pci:discoverable": "@",
+    "os_compute_api:os-pci:index": "rule:admin_api",
+    "os_compute_api:os-pci:detail": "rule:admin_api",
+    "os_compute_api:os-pci:show": "rule:admin_api",
+    "os_compute_api:os-personality:discoverable": "@",
+    "os_compute_api:os-preserve-ephemeral-rebuild:discoverable": "@",
+    "os_compute_api:os-quota-sets:discoverable": "@",
+    "os_compute_api:os-quota-sets:show": "rule:admin_or_owner",
+    "os_compute_api:os-quota-sets:defaults": "@",
+    "os_compute_api:os-quota-sets:update": "rule:admin_api",
+    "os_compute_api:os-quota-sets:delete": "rule:admin_api",
+    "os_compute_api:os-quota-sets:detail": "rule:admin_api",
+    "os_compute_api:os-quota-class-sets:update": "rule:admin_api",
+    "os_compute_api:os-quota-class-sets:show": "is_admin:True or quota_class:%(quota_class)s",
+    "os_compute_api:os-quota-class-sets:discoverable": "@",
+    "os_compute_api:os-rescue": "rule:admin_or_owner",
+    "os_compute_api:os-rescue:discoverable": "@",
+    "os_compute_api:os-scheduler-hints:discoverable": "@",
+    "os_compute_api:os-security-group-default-rules:discoverable": "@",
+    "os_compute_api:os-security-group-default-rules": "rule:admin_api",
+    "os_compute_api:os-security-groups": "rule:admin_or_owner",
+    "os_compute_api:os-security-groups:discoverable": "@",
+    "os_compute_api:os-server-diagnostics": "rule:admin_api",
+    "os_compute_api:os-server-diagnostics:discoverable": "@",
+    "os_compute_api:os-server-password": "rule:admin_or_owner",
+    "os_compute_api:os-server-password:discoverable": "@",
+    "os_compute_api:os-server-usage": "rule:admin_or_owner",
+    "os_compute_api:os-server-usage:discoverable": "@",
+    "os_compute_api:os-server-groups": "rule:admin_or_owner",
+    "os_compute_api:os-server-groups:discoverable": "@",
+    "os_compute_api:os-server-tags:index": "@",
+    "os_compute_api:os-server-tags:show": "@",
+    "os_compute_api:os-server-tags:update": "@",
+    "os_compute_api:os-server-tags:update_all": "@",
+    "os_compute_api:os-server-tags:delete": "@",
+    "os_compute_api:os-server-tags:delete_all": "@",
     "os_compute_api:os-services": "rule:admin_api",
+    "os_compute_api:os-services:discoverable": "@",
+    "os_compute_api:server-metadata:discoverable": "@",
+    "os_compute_api:server-metadata:index": "rule:admin_or_owner",
+    "os_compute_api:server-metadata:show": "rule:admin_or_owner",
+    "os_compute_api:server-metadata:delete": "rule:admin_or_owner",
+    "os_compute_api:server-metadata:create": "rule:admin_or_owner",
+    "os_compute_api:server-metadata:update": "rule:admin_or_owner",
+    "os_compute_api:server-metadata:update_all": "rule:admin_or_owner",
     "os_compute_api:os-shelve:shelve": "rule:admin_or_owner",
-    "os_compute_api:os-shelve:unshelve": "rule:admin_or_owner",
+    "os_compute_api:os-shelve:shelve:discoverable": "@",
     "os_compute_api:os-shelve:shelve_offload": "rule:admin_api",
+    "os_compute_api:os-simple-tenant-usage:discoverable": "@",
     "os_compute_api:os-simple-tenant-usage:show": "rule:admin_or_owner",
     "os_compute_api:os-simple-tenant-usage:list": "rule:admin_api",
+    "os_compute_api:os-suspend-server:discoverable": "@",
+    "os_compute_api:os-suspend-server:suspend": "rule:admin_or_owner",
     "os_compute_api:os-suspend-server:resume": "rule:admin_or_owner",
-    "os_compute_api:os-suspend-server:suspend": "rule:admin_or_owner",
     "os_compute_api:os-tenant-networks": "rule:admin_or_owner",
+    "os_compute_api:os-tenant-networks:discoverable": "@",
+    "os_compute_api:os-shelve:unshelve": "rule:admin_or_owner",
+    "os_compute_api:os-user-data:discoverable": "@",
+    "os_compute_api:os-virtual-interfaces": "rule:admin_or_owner",
+    "os_compute_api:os-virtual-interfaces:discoverable": "@",
+    "os_compute_api:os-volumes": "rule:admin_or_owner",
+    "os_compute_api:os-volumes:discoverable": "@",
+    "os_compute_api:os-volumes-attachments:index": "rule:admin_or_owner",
+    "os_compute_api:os-volumes-attachments:show": "rule:admin_or_owner",
+    "os_compute_api:os-volumes-attachments:create": "rule:admin_or_owner",
+    "os_compute_api:os-volumes-attachments:update": "rule:admin_api",
+    "os_compute_api:os-volumes-attachments:delete": "rule:admin_or_owner",
+    "os_compute_api:os-volumes-attachments:discoverable": "@",
+    "os_compute_api:os-availability-zone:list": "rule:admin_or_owner",
+    "os_compute_api:os-availability-zone:discoverable": "@",
+    "os_compute_api:os-availability-zone:detail": "rule:admin_api",
     "os_compute_api:os-used-limits": "rule:admin_api",
-    "os_compute_api:os-virtual-interfaces": "rule:admin_or_owner",
-    "os_compute_api:os-volumes": "rule:admin_or_owner",
-    "os_compute_api:os-volumes-attachments:index": "rule:admin_or_owner",
-    "os_compute_api:os-volumes-attachments:create": "rule:admin_or_owner",
-    "os_compute_api:os-volumes-attachments:show": "rule:admin_or_owner",
-    "os_compute_api:os-volumes-attachments:update": "rule:admin_api",
-    "os_compute_api:os-volumes-attachments:delete": "rule:admin_or_owner"
+    "os_compute_api:os-used-limits:discoverable": "@",
+    "os_compute_api:os-migrations:index": "rule:admin_api",
+    "os_compute_api:os-migrations:discoverable": "@",
+    "os_compute_api:os-assisted-volume-snapshots:create": "rule:admin_api",
+    "os_compute_api:os-assisted-volume-snapshots:delete": "rule:admin_api",
+    "os_compute_api:os-assisted-volume-snapshots:discoverable": "@",
+    "os_compute_api:os-console-auth-tokens": "rule:admin_api",
+    "os_compute_api:os-console-auth-tokens:discoverable": "@",
+    "os_compute_api:os-server-external-events:create": "rule:admin_api",
+    "os_compute_api:os-server-external-events:discoverable": "@"
 }

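For context on the diff above: each value in these policy files is an oslo.policy check string, where `"@"` means always allowed, `rule:<name>` is an alias for another entry, and checks like `is_admin:True` or `user_id:%(user_id)s` are evaluated against the request credentials. A minimal sketch of how the `rule:` aliases chain together (a toy resolver over a trimmed-down excerpt, not the real oslo.policy engine):

```python
import json

# Illustrative subset of the nova_policy.json entries shown in the diff above.
SAMPLE = json.loads("""
{
    "admin_api": "is_admin:True",
    "admin_or_owner": "is_admin:True or project_id:%(project_id)s",
    "os_compute_api:os-evacuate": "rule:admin_api",
    "os_compute_api:os-consoles:show": "rule:admin_or_owner",
    "os_compute_api:os-server-tags:index": "@"
}
""")

def resolve(policy, name, seen=()):
    """Follow rule:<alias> chains until a concrete check string is reached."""
    check = policy[name]
    if check == "@":  # "@" is oslo.policy shorthand for: always allowed
        return "always-allowed"
    if check.startswith("rule:") and " " not in check:
        alias = check[len("rule:"):]
        if alias in seen:  # guard against alias cycles
            raise ValueError("cycle at " + alias)
        return resolve(policy, alias, seen + (alias,))
    return check

print(resolve(SAMPLE, "os_compute_api:os-evacuate"))  # → is_admin:True
```

Compound checks (`or`, `and`, `not`) are parsed further by oslo.policy itself; the sketch only follows single-alias values, which covers most of the entries in the file above.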
2019-04-30 22:48:19,745 [salt.state       :1951][INFO    ][7053] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/nova_policy.json] at time 22:48:19.745313 duration_in_ms=40.744
2019-04-30 22:48:19,745 [salt.state       :1780][INFO    ][7053] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/neutron_policy.json] at time 22:48:19.745874
2019-04-30 22:48:19,748 [salt.state       :1813][INFO    ][7053] Executing state file.managed for [/usr/share/openstack-dashboard/openstack_dashboard/conf/neutron_policy.json]
2019-04-30 22:48:19,779 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/rocky/neutron_policy.json'
2019-04-30 22:48:19,781 [salt.state       :300 ][INFO    ][7053] File changed:
--- 
+++ 
@@ -7,8 +7,9 @@
     "admin_owner_or_network_owner": "rule:owner or rule:admin_or_network_owner",
     "admin_only": "rule:context_is_admin",
     "regular_user": "",
-    "admin_or_data_plane_int": "rule:context_is_admin or role:data_plane_integrator",
     "shared": "field:networks:shared=True",
+    "shared_firewalls": "field:firewalls:shared=True",
+    "shared_firewall_policies": "field:firewall_policies:shared=True",
     "shared_subnetpools": "field:subnetpools:shared=True",
     "shared_address_scopes": "field:address_scopes:shared=True",
     "external": "field:networks:router:external=True",
@@ -16,11 +17,9 @@
 
     "create_subnet": "rule:admin_or_network_owner",
     "create_subnet:segment_id": "rule:admin_only",
-    "create_subnet:service_types": "rule:admin_only",
     "get_subnet": "rule:admin_or_owner or rule:shared",
     "get_subnet:segment_id": "rule:admin_only",
     "update_subnet": "rule:admin_or_network_owner",
-    "update_subnet:service_types": "rule:admin_only",
     "delete_subnet": "rule:admin_or_network_owner",
 
     "create_subnetpool": "",
@@ -94,7 +93,6 @@
     "update_port:binding:profile": "rule:admin_only",
     "update_port:mac_learning_enabled": "rule:context_is_advsvc or rule:admin_or_network_owner",
     "update_port:allowed_address_pairs": "rule:admin_or_network_owner",
-    "update_port:data_plane_status": "rule:admin_or_data_plane_int",
     "delete_port": "rule:context_is_advsvc or rule:admin_owner_or_network_owner",
 
     "get_router:ha": "rule:admin_only",
@@ -104,9 +102,6 @@
     "create_router:ha": "rule:admin_only",
     "get_router": "rule:admin_or_owner",
     "get_router:distributed": "rule:admin_only",
-    "update_router": "rule:admin_or_owner",
-    "update_router:external_gateway_info": "rule:admin_or_owner",
-    "update_router:external_gateway_info:network_id": "rule:admin_or_owner",
     "update_router:external_gateway_info:enable_snat": "rule:admin_only",
     "update_router:distributed": "rule:admin_only",
     "update_router:ha": "rule:admin_only",
@@ -117,6 +112,28 @@
 
     "create_router:external_gateway_info:external_fixed_ips": "rule:admin_only",
     "update_router:external_gateway_info:external_fixed_ips": "rule:admin_only",
+
+    "create_firewall": "",
+    "get_firewall": "rule:admin_or_owner",
+    "create_firewall:shared": "rule:admin_only",
+    "get_firewall:shared": "rule:admin_only",
+    "update_firewall": "rule:admin_or_owner",
+    "update_firewall:shared": "rule:admin_only",
+    "delete_firewall": "rule:admin_or_owner",
+
+    "create_firewall_policy": "",
+    "get_firewall_policy": "rule:admin_or_owner or rule:shared_firewall_policies",
+    "create_firewall_policy:shared": "rule:admin_or_owner",
+    "update_firewall_policy": "rule:admin_or_owner",
+    "delete_firewall_policy": "rule:admin_or_owner",
+
+    "insert_rule": "rule:admin_or_owner",
+    "remove_rule": "rule:admin_or_owner",
+
+    "create_firewall_rule": "",
+    "get_firewall_rule": "rule:admin_or_owner or rule:shared_firewalls",
+    "update_firewall_rule": "rule:admin_or_owner",
+    "delete_firewall_rule": "rule:admin_or_owner",
 
     "create_qos_queue": "rule:admin_only",
     "get_qos_queue": "rule:admin_only",
@@ -189,10 +206,6 @@
     "delete_policy_dscp_marking_rule": "rule:admin_only",
     "update_policy_dscp_marking_rule": "rule:admin_only",
     "get_rule_type": "rule:regular_user",
-    "get_policy_minimum_bandwidth_rule": "rule:regular_user",
-    "create_policy_minimum_bandwidth_rule": "rule:admin_only",
-    "delete_policy_minimum_bandwidth_rule": "rule:admin_only",
-    "update_policy_minimum_bandwidth_rule": "rule:admin_only",
 
     "restrict_wildcard": "(not field:rbac_policy:target_tenant=*) or rule:admin_only",
     "create_rbac_policy": "",
@@ -205,29 +218,5 @@
     "create_flavor_service_profile": "rule:admin_only",
     "delete_flavor_service_profile": "rule:admin_only",
     "get_flavor_service_profile": "rule:regular_user",
-    "get_auto_allocated_topology": "rule:admin_or_owner",
-
-    "create_trunk": "rule:regular_user",
-    "get_trunk": "rule:admin_or_owner",
-    "delete_trunk": "rule:admin_or_owner",
-    "get_subports": "",
-    "add_subports": "rule:admin_or_owner",
-    "remove_subports": "rule:admin_or_owner",
-
-    "get_security_groups": "rule:admin_or_owner",
-    "get_security_group": "rule:admin_or_owner",
-    "create_security_group": "rule:admin_or_owner",
-    "update_security_group": "rule:admin_or_owner",
-    "delete_security_group": "rule:admin_or_owner",
-    "get_security_group_rules": "rule:admin_or_owner",
-    "get_security_group_rule": "rule:admin_or_owner",
-    "create_security_group_rule": "rule:admin_or_owner",
-    "delete_security_group_rule": "rule:admin_or_owner",
-
-    "get_loggable_resources": "rule:admin_only",
-    "create_log": "rule:admin_only",
-    "update_log": "rule:admin_only",
-    "delete_log": "rule:admin_only",
-    "get_logs": "rule:admin_only",
-    "get_log": "rule:admin_only"
+    "get_auto_allocated_topology": "rule:admin_or_owner"
 }

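One thing worth checking after a policy swap like the neutron diff above: removing a rule definition (here `admin_or_data_plane_int`) is only safe if every check string that referenced it is removed in the same change, otherwise a `rule:` reference dangles and evaluates as undefined. A small stdlib-only sketch that scans a policy dict for such dangling references (the file contents are hypothetical, modeled on the removed lines above):

```python
import json
import re

# Deliberately broken example: the alias is gone but a consumer remains.
POLICY = json.loads("""
{
    "admin_only": "rule:context_is_admin",
    "context_is_admin": "role:admin",
    "update_port:data_plane_status": "rule:admin_or_data_plane_int"
}
""")

def dangling_rules(policy):
    """Return every rule:<name> reference that has no matching definition."""
    missing = set()
    for check in policy.values():
        for name in re.findall(r"rule:([\w-]+)", check):
            if name not in policy:
                missing.add(name)
    return sorted(missing)

print(dangling_rules(POLICY))  # → ['admin_or_data_plane_int']
```

In the actual diff above both the definition and its only consumer (`update_port:data_plane_status`) were removed together, so the rendered file stays consistent.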
2019-04-30 22:48:19,781 [salt.state       :1951][INFO    ][7053] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/neutron_policy.json] at time 22:48:19.781535 duration_in_ms=35.66
2019-04-30 22:48:19,781 [salt.state       :1780][INFO    ][7053] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/glance_policy.json] at time 22:48:19.781946
2019-04-30 22:48:19,782 [salt.state       :1813][INFO    ][7053] Executing state file.managed for [/usr/share/openstack-dashboard/openstack_dashboard/conf/glance_policy.json]
2019-04-30 22:48:19,797 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/rocky/glance_policy.json'
2019-04-30 22:48:19,798 [salt.state       :300 ][INFO    ][7053] File changed:
--- 
+++ 
@@ -8,7 +8,6 @@
     "get_images": "",
     "modify_image": "",
     "publicize_image": "role:admin",
-    "communitize_image": "",
     "copy_from": "",
 
     "download_image": "",
@@ -26,11 +25,10 @@
 
     "manage_image_cache": "role:admin",
 
-    "get_task": "",
-    "get_tasks": "",
-    "add_task": "",
-    "modify_task": "",
-    "tasks_api_access": "role:admin",
+    "get_task": "role:admin",
+    "get_tasks": "role:admin",
+    "add_task": "role:admin",
+    "modify_task": "role:admin",
 
     "deactivate": "",
     "reactivate": "",

2019-04-30 22:48:19,799 [salt.state       :1951][INFO    ][7053] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/glance_policy.json] at time 22:48:19.799035 duration_in_ms=17.09
2019-04-30 22:48:19,799 [salt.state       :1780][INFO    ][7053] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/ceilometer_policy.json] at time 22:48:19.799448
2019-04-30 22:48:19,799 [salt.state       :1813][INFO    ][7053] Executing state file.managed for [/usr/share/openstack-dashboard/openstack_dashboard/conf/ceilometer_policy.json]
2019-04-30 22:48:19,813 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/rocky/ceilometer_policy.json'
2019-04-30 22:48:19,815 [salt.state       :300 ][INFO    ][7053] File changed:
New file
2019-04-30 22:48:19,815 [salt.state       :1951][INFO    ][7053] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/ceilometer_policy.json] at time 22:48:19.815212 duration_in_ms=15.764
2019-04-30 22:48:19,815 [salt.state       :1780][INFO    ][7053] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/cinder_policy.json] at time 22:48:19.815640
2019-04-30 22:48:19,815 [salt.state       :1813][INFO    ][7053] Executing state file.managed for [/usr/share/openstack-dashboard/openstack_dashboard/conf/cinder_policy.json]
2019-04-30 22:48:19,831 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/rocky/cinder_policy.json'
2019-04-30 22:48:19,833 [salt.state       :300 ][INFO    ][7053] File changed:
--- 
+++ 
@@ -1,136 +1,113 @@
 {
     "context_is_admin": "role:admin",
-    "admin_or_owner": "is_admin:True or (role:admin and is_admin_project:True) or  project_id:%(project_id)s",
-    "admin_api": "is_admin:True or (role:admin and is_admin_project:True)",
-    "volume:attachment_create": "",
-    "volume:attachment_update": "rule:admin_or_owner",
-    "volume:attachment_delete": "rule:admin_or_owner",
-    "message:get_all": "rule:admin_or_owner",
-    "message:get": "rule:admin_or_owner",
-    "message:delete": "rule:admin_or_owner",
-    "clusters:get_all": "rule:admin_api",
-    "clusters:get": "rule:admin_api",
-    "clusters:update": "rule:admin_api",
-    "workers:cleanup": "rule:admin_api",
+    "admin_or_owner":  "is_admin:True or project_id:%(project_id)s",
+    "default": "rule:admin_or_owner",
+
+    "admin_api": "is_admin:True",
+
+    "volume:create": "",
+    "volume:delete": "rule:admin_or_owner",
+    "volume:get": "rule:admin_or_owner",
+    "volume:get_all": "rule:admin_or_owner",
+    "volume:get_volume_metadata": "rule:admin_or_owner",
+    "volume:delete_volume_metadata": "rule:admin_or_owner",
+    "volume:update_volume_metadata": "rule:admin_or_owner",
+    "volume:get_volume_admin_metadata": "rule:admin_api",
+    "volume:update_volume_admin_metadata": "rule:admin_api",
+    "volume:get_snapshot": "rule:admin_or_owner",
+    "volume:get_all_snapshots": "rule:admin_or_owner",
+    "volume:create_snapshot": "rule:admin_or_owner",
+    "volume:delete_snapshot": "rule:admin_or_owner",
+    "volume:update_snapshot": "rule:admin_or_owner",
     "volume:get_snapshot_metadata": "rule:admin_or_owner",
+    "volume:delete_snapshot_metadata": "rule:admin_or_owner",
     "volume:update_snapshot_metadata": "rule:admin_or_owner",
-    "volume:delete_snapshot_metadata": "rule:admin_or_owner",
-    "volume:get_all_snapshots": "rule:admin_or_owner",
-    "volume_extension:extended_snapshot_attributes": "rule:admin_or_owner",
-    "volume:create_snapshot": "rule:admin_or_owner",
-    "volume:get_snapshot": "rule:admin_or_owner",
-    "volume:update_snapshot": "rule:admin_or_owner",
-    "volume:delete_snapshot": "rule:admin_or_owner",
-    "volume_extension:snapshot_admin_actions:reset_status": "rule:admin_api",
-    "snapshot_extension:snapshot_actions:update_snapshot_status": "",
-    "volume_extension:snapshot_admin_actions:force_delete": "rule:admin_api",
-    "snapshot_extension:list_manageable": "rule:admin_api",
-    "snapshot_extension:snapshot_manage": "rule:admin_api",
-    "snapshot_extension:snapshot_unmanage": "rule:admin_api",
-    "backup:get_all": "rule:admin_or_owner",
-    "backup:backup_project_attribute": "rule:admin_api",
-    "backup:create": "",
-    "backup:get": "rule:admin_or_owner",
-    "backup:update": "rule:admin_or_owner",
-    "backup:delete": "rule:admin_or_owner",
-    "backup:restore": "rule:admin_or_owner",
-    "backup:backup-import": "rule:admin_api",
-    "backup:export-import": "rule:admin_api",
-    "volume_extension:backup_admin_actions:reset_status": "rule:admin_api",
-    "volume_extension:backup_admin_actions:force_delete": "rule:admin_api",
-    "group:get_all": "rule:admin_or_owner",
-    "group:create": "",
-    "group:get": "rule:admin_or_owner",
-    "group:update": "rule:admin_or_owner",
-    "group:group_types_manage": "rule:admin_api",
-    "group:access_group_types_specs": "rule:admin_api",
-    "group:group_types_specs": "rule:admin_api",
-    "group:get_all_group_snapshots": "rule:admin_or_owner",
-    "group:create_group_snapshot": "",
-    "group:get_group_snapshot": "rule:admin_or_owner",
-    "group:delete_group_snapshot": "rule:admin_or_owner",
-    "group:update_group_snapshot": "rule:admin_or_owner",
-    "group:reset_group_snapshot_status": "rule:admin_or_owner",
-    "group:delete": "rule:admin_or_owner",
-    "group:reset_status": "rule:admin_api",
-    "group:enable_replication": "rule:admin_or_owner",
-    "group:disable_replication": "rule:admin_or_owner",
-    "group:failover_replication": "rule:admin_or_owner",
-    "group:list_replication_targets": "rule:admin_or_owner",
-    "volume_extension:qos_specs_manage:get_all": "rule:admin_api",
-    "volume_extension:qos_specs_manage:get": "rule:admin_api",
-    "volume_extension:qos_specs_manage:create": "rule:admin_api",
-    "volume_extension:qos_specs_manage:update": "rule:admin_api",
-    "volume_extension:qos_specs_manage:delete": "rule:admin_api",
-    "volume_extension:quota_classes": "rule:admin_api",
-    "volume_extension:quotas:show": "rule:admin_or_owner",
-    "volume_extension:quotas:update": "rule:admin_api",
-    "volume_extension:quotas:delete": "rule:admin_api",
-    "volume_extension:quota_classes:validate_setup_for_nested_quota_use": "rule:admin_api",
-    "volume_extension:capabilities": "rule:admin_api",
-    "volume_extension:services:index": "rule:admin_api",
-    "volume_extension:services:update": "rule:admin_api",
-    "volume:freeze_host": "rule:admin_api",
-    "volume:thaw_host": "rule:admin_api",
-    "volume:failover_host": "rule:admin_api",
-    "scheduler_extension:scheduler_stats:get_pools": "rule:admin_api",
-    "volume_extension:hosts": "rule:admin_api",
-    "limits_extension:used_limits": "rule:admin_or_owner",
-    "volume_extension:list_manageable": "rule:admin_api",
-    "volume_extension:volume_manage": "rule:admin_api",
-    "volume_extension:volume_unmanage": "rule:admin_api",
+    "volume:extend": "rule:admin_or_owner",
+    "volume:update_readonly_flag": "rule:admin_or_owner",
+    "volume:retype": "rule:admin_or_owner",
+    "volume:update": "rule:admin_or_owner",
+
     "volume_extension:types_manage": "rule:admin_api",
-    "volume_extension:volume_type_encryption": "rule:admin_api",
+    "volume_extension:types_extra_specs": "rule:admin_api",
+    "volume_extension:access_types_qos_specs_id": "rule:admin_api",
     "volume_extension:access_types_extra_specs": "rule:admin_api",
-    "volume_extension:access_types_qos_specs_id": "rule:admin_api",
     "volume_extension:volume_type_access": "rule:admin_or_owner",
     "volume_extension:volume_type_access:addProjectAccess": "rule:admin_api",
     "volume_extension:volume_type_access:removeProjectAccess": "rule:admin_api",
-    "volume:extend": "rule:admin_or_owner",
-    "volume:extend_attached_volume": "rule:admin_or_owner",
-    "volume:revert_to_snapshot": "rule:admin_or_owner",
+    "volume_extension:volume_type_encryption": "rule:admin_api",
+    "volume_extension:volume_encryption_metadata": "rule:admin_or_owner",
+    "volume_extension:extended_snapshot_attributes": "rule:admin_or_owner",
+    "volume_extension:volume_image_metadata": "rule:admin_or_owner",
+
+    "volume_extension:quotas:show": "",
+    "volume_extension:quotas:update": "rule:admin_api",
+    "volume_extension:quotas:delete": "rule:admin_api",
+    "volume_extension:quota_classes": "rule:admin_api",
+    "volume_extension:quota_classes:validate_setup_for_nested_quota_use": "rule:admin_api",
+
     "volume_extension:volume_admin_actions:reset_status": "rule:admin_api",
-    "volume:retype": "rule:admin_or_owner",
-    "volume:update_readonly_flag": "rule:admin_or_owner",
+    "volume_extension:snapshot_admin_actions:reset_status": "rule:admin_api",
+    "volume_extension:backup_admin_actions:reset_status": "rule:admin_api",
     "volume_extension:volume_admin_actions:force_delete": "rule:admin_api",
+    "volume_extension:volume_admin_actions:force_detach": "rule:admin_api",
+    "volume_extension:snapshot_admin_actions:force_delete": "rule:admin_api",
+    "volume_extension:backup_admin_actions:force_delete": "rule:admin_api",
+    "volume_extension:volume_admin_actions:migrate_volume": "rule:admin_api",
+    "volume_extension:volume_admin_actions:migrate_volume_completion": "rule:admin_api",
+
     "volume_extension:volume_actions:upload_public": "rule:admin_api",
     "volume_extension:volume_actions:upload_image": "rule:admin_or_owner",
-    "volume_extension:volume_admin_actions:force_detach": "rule:admin_api",
-    "volume_extension:volume_admin_actions:migrate_volume": "rule:admin_api",
-    "volume_extension:volume_admin_actions:migrate_volume_completion": "rule:admin_api",
-    "volume_extension:volume_actions:initialize_connection": "rule:admin_or_owner",
-    "volume_extension:volume_actions:terminate_connection": "rule:admin_or_owner",
-    "volume_extension:volume_actions:roll_detaching": "rule:admin_or_owner",
-    "volume_extension:volume_actions:reserve": "rule:admin_or_owner",
-    "volume_extension:volume_actions:unreserve": "rule:admin_or_owner",
-    "volume_extension:volume_actions:begin_detaching": "rule:admin_or_owner",
-    "volume_extension:volume_actions:attach": "rule:admin_or_owner",
-    "volume_extension:volume_actions:detach": "rule:admin_or_owner",
-    "volume:get_all_transfers": "rule:admin_or_owner",
-    "volume:create_transfer": "rule:admin_or_owner",
-    "volume:get_transfer": "rule:admin_or_owner",
-    "volume:accept_transfer": "",
-    "volume:delete_transfer": "rule:admin_or_owner",
-    "volume:get_volume_metadata": "rule:admin_or_owner",
-    "volume:create_volume_metadata": "rule:admin_or_owner",
-    "volume:update_volume_metadata": "rule:admin_or_owner",
-    "volume:delete_volume_metadata": "rule:admin_or_owner",
-    "volume_extension:volume_image_metadata": "rule:admin_or_owner",
-    "volume:update_volume_admin_metadata": "rule:admin_api",
-    "volume_extension:types_extra_specs:index": "rule:admin_api",
-    "volume_extension:types_extra_specs:create": "rule:admin_api",
-    "volume_extension:types_extra_specs:show": "rule:admin_api",
-    "volume_extension:types_extra_specs:update": "rule:admin_api",
-    "volume_extension:types_extra_specs:delete": "rule:admin_api",
-    "volume:create": "",
-    "volume:create_from_image": "",
-    "volume:get": "rule:admin_or_owner",
-    "volume:get_all": "rule:admin_or_owner",
-    "volume:update": "rule:admin_or_owner",
-    "volume:delete": "rule:admin_or_owner",
-    "volume:force_delete": "rule:admin_api",
+
     "volume_extension:volume_host_attribute": "rule:admin_api",
     "volume_extension:volume_tenant_attribute": "rule:admin_or_owner",
     "volume_extension:volume_mig_status_attribute": "rule:admin_api",
-    "volume_extension:volume_encryption_metadata": "rule:admin_or_owner"
+    "volume_extension:hosts": "rule:admin_api",
+    "volume_extension:services:index": "rule:admin_api",
+    "volume_extension:services:update" : "rule:admin_api",
+
+    "volume_extension:volume_manage": "rule:admin_api",
+    "volume_extension:volume_unmanage": "rule:admin_api",
+
+    "volume_extension:capabilities": "rule:admin_api",
+
+    "volume:create_transfer": "rule:admin_or_owner",
+    "volume:accept_transfer": "",
+    "volume:delete_transfer": "rule:admin_or_owner",
+    "volume:get_transfer": "rule:admin_or_owner",
+    "volume:get_all_transfers": "rule:admin_or_owner",
+
+    "volume_extension:replication:promote": "rule:admin_api",
+    "volume_extension:replication:reenable": "rule:admin_api",
+
+    "volume:failover_host": "rule:admin_api",
+    "volume:freeze_host": "rule:admin_api",
+    "volume:thaw_host": "rule:admin_api",
+
+    "backup:create" : "",
+    "backup:delete": "rule:admin_or_owner",
+    "backup:get": "rule:admin_or_owner",
+    "backup:get_all": "rule:admin_or_owner",
+    "backup:restore": "rule:admin_or_owner",
+    "backup:backup-import": "rule:admin_api",
+    "backup:backup-export": "rule:admin_api",
+
+    "snapshot_extension:snapshot_actions:update_snapshot_status": "",
+    "snapshot_extension:snapshot_manage": "rule:admin_api",
+    "snapshot_extension:snapshot_unmanage": "rule:admin_api",
+
+    "consistencygroup:create" : "group:nobody",
+    "consistencygroup:delete": "group:nobody",
+    "consistencygroup:update": "group:nobody",
+    "consistencygroup:get": "group:nobody",
+    "consistencygroup:get_all": "group:nobody",
+
+    "consistencygroup:create_cgsnapshot" : "group:nobody",
+    "consistencygroup:delete_cgsnapshot": "group:nobody",
+    "consistencygroup:get_cgsnapshot": "group:nobody",
+    "consistencygroup:get_all_cgsnapshots": "group:nobody",
+
+    "scheduler_extension:scheduler_stats:get_pools" : "rule:admin_api",
+    "message:delete": "rule:admin_or_owner",
+    "message:get": "rule:admin_or_owner",
+    "message:get_all": "rule:admin_or_owner"
 }

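A reordering this large (the cinder diff rewrites nearly the whole file) is also where duplicate keys tend to creep in, and `json.loads` will silently keep only the last occurrence. A hedged sketch of a pre-flight check using the stdlib `object_pairs_hook` (the sample string is an invented example, not taken from the file above):

```python
import json

def find_duplicate_keys(text):
    """Surface duplicate JSON object keys that json.loads would silently collapse."""
    dupes = []
    def hook(pairs):
        seen = set()
        for key, _ in pairs:
            if key in seen:
                dupes.append(key)
            seen.add(key)
        return dict(pairs)
    json.loads(text, object_pairs_hook=hook)
    return dupes

sample = ('{"volume:create": "", "admin_api": "is_admin:True", '
          '"volume:create": "rule:admin_api"}')
print(find_duplicate_keys(sample))  # → ['volume:create']
```

Running such a check against the rendered `cinder_policy.json` before Horizon reloads it would catch a duplicated entry whose earlier (intended) value got shadowed.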
2019-04-30 22:48:19,838 [salt.state       :1951][INFO    ][7053] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/cinder_policy.json] at time 22:48:19.838074 duration_in_ms=22.434
2019-04-30 22:48:19,838 [salt.state       :1780][INFO    ][7053] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/heat_policy.json] at time 22:48:19.838514
2019-04-30 22:48:19,838 [salt.state       :1813][INFO    ][7053] Executing state file.managed for [/usr/share/openstack-dashboard/openstack_dashboard/conf/heat_policy.json]
2019-04-30 22:48:19,852 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/rocky/heat_policy.json'
2019-04-30 22:48:19,853 [salt.state       :300 ][INFO    ][7053] File changed:
New file
2019-04-30 22:48:19,854 [salt.state       :1951][INFO    ][7053] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/heat_policy.json] at time 22:48:19.854111 duration_in_ms=15.598
2019-04-30 22:48:19,854 [salt.state       :1780][INFO    ][7053] Running state [/usr/share/openstack-dashboard/openstack_dashboard/conf/keystone_policy.json] at time 22:48:19.854570
2019-04-30 22:48:19,854 [salt.state       :1813][INFO    ][7053] Executing state file.managed for [/usr/share/openstack-dashboard/openstack_dashboard/conf/keystone_policy.json]
2019-04-30 22:48:19,868 [salt.fileclient  :1219][INFO    ][7053] Fetching file from saltenv 'base', ** done ** 'horizon/files/policy/rocky/keystone_policy.json'
2019-04-30 22:48:19,870 [salt.state       :300 ][INFO    ][7053] File changed:
--- 
+++ 
@@ -2,50 +2,137 @@
     "admin_required": "role:admin or is_admin:1",
     "service_role": "role:service",
     "service_or_admin": "rule:admin_required or rule:service_role",
-    "owner": "user_id:%(user_id)s",
+    "owner" : "user_id:%(user_id)s",
     "admin_or_owner": "rule:admin_required or rule:owner",
     "token_subject": "user_id:%(target.token.user_id)s",
     "admin_or_token_subject": "rule:admin_required or rule:token_subject",
     "service_admin_or_token_subject": "rule:service_or_admin or rule:token_subject",
-    "identity:authorize_request_token": "rule:admin_required",
-    "identity:get_access_token": "rule:admin_required",
-    "identity:get_access_token_role": "rule:admin_required",
-    "identity:list_access_tokens": "rule:admin_required",
-    "identity:list_access_token_roles": "rule:admin_required",
-    "identity:delete_access_token": "rule:admin_required",
-    "identity:get_auth_catalog": "",
-    "identity:get_auth_projects": "",
-    "identity:get_auth_domains": "",
-    "identity:get_consumer": "rule:admin_required",
-    "identity:list_consumers": "rule:admin_required",
-    "identity:create_consumer": "rule:admin_required",
-    "identity:update_consumer": "rule:admin_required",
-    "identity:delete_consumer": "rule:admin_required",
+
+    "default": "rule:admin_required",
+
+    "identity:get_region": "",
+    "identity:list_regions": "",
+    "identity:create_region": "rule:admin_required",
+    "identity:update_region": "rule:admin_required",
+    "identity:delete_region": "rule:admin_required",
+
+    "identity:get_service": "rule:admin_required",
+    "identity:list_services": "rule:admin_required",
+    "identity:create_service": "rule:admin_required",
+    "identity:update_service": "rule:admin_required",
+    "identity:delete_service": "rule:admin_required",
+
+    "identity:get_endpoint": "rule:admin_required",
+    "identity:list_endpoints": "rule:admin_required",
+    "identity:create_endpoint": "rule:admin_required",
+    "identity:update_endpoint": "rule:admin_required",
+    "identity:delete_endpoint": "rule:admin_required",
+
+    "identity:get_domain": "rule:admin_required",
+    "identity:list_domains": "rule:admin_required",
+    "identity:create_domain": "rule:admin_required",
+    "identity:update_domain": "rule:admin_required",
+    "identity:delete_domain": "rule:admin_required",
+
+    "identity:get_project": "rule:admin_required or project_id:%(target.project.id)s",
+    "identity:list_projects": "rule:admin_required",
+    "identity:list_user_projects": "rule:admin_or_owner",
+    "identity:create_project": "rule:admin_required",
+    "identity:update_project": "rule:admin_required",
+    "identity:delete_project": "rule:admin_required",
+
+    "identity:get_user": "rule:admin_required",
+    "identity:list_users": "rule:admin_required",
+    "identity:create_user": "rule:admin_required",
+    "identity:update_user": "rule:admin_required",
+    "identity:delete_user": "rule:admin_required",
+    "identity:change_password": "rule:admin_or_owner",
+
+    "identity:get_group": "rule:admin_required",
+    "identity:list_groups": "rule:admin_required",
+    "identity:list_groups_for_user": "rule:admin_or_owner",
+    "identity:create_group": "rule:admin_required",
+    "identity:update_group": "rule:admin_required",
+    "identity:delete_group": "rule:admin_required",
+    "identity:list_users_in_group": "rule:admin_required",
+    "identity:remove_user_from_group": "rule:admin_required",
+    "identity:check_user_in_group": "rule:admin_required",
+    "identity:add_user_to_group": "rule:admin_required",
+
     "identity:get_credential": "rule:admin_required",
     "identity:list_credentials": "rule:admin_required",
     "identity:create_credential": "rule:admin_required",
     "identity:update_credential": "rule:admin_required",
     "identity:delete_credential": "rule:admin_required",
-    "identity:get_domain": "rule:admin_required or token.project.domain.id:%(target.domain.id)s",
-    "identity:list_domains": "rule:admin_required",
-    "identity:create_domain": "rule:admin_required",
-    "identity:update_domain": "rule:admin_required",
-    "identity:delete_domain": "rule:admin_required",
-    "identity:create_domain_config": "rule:admin_required",
-    "identity:get_domain_config": "rule:admin_required",
-    "identity:get_security_compliance_domain_config": "",
-    "identity:update_domain_config": "rule:admin_required",
-    "identity:delete_domain_config": "rule:admin_required",
-    "identity:get_domain_config_default": "rule:admin_required",
+
     "identity:ec2_get_credential": "rule:admin_required or (rule:owner and user_id:%(target.credential.user_id)s)",
     "identity:ec2_list_credentials": "rule:admin_or_owner",
     "identity:ec2_create_credential": "rule:admin_or_owner",
     "identity:ec2_delete_credential": "rule:admin_required or (rule:owner and user_id:%(target.credential.user_id)s)",
-    "identity:get_endpoint": "rule:admin_required",
-    "identity:list_endpoints": "rule:admin_required",
-    "identity:create_endpoint": "rule:admin_required",
-    "identity:update_endpoint": "rule:admin_required",
-    "identity:delete_endpoint": "rule:admin_required",
+
+    "identity:get_role": "rule:admin_required",
+    "identity:list_roles": "rule:admin_required",
+    "identity:create_role": "rule:admin_required",
+    "identity:update_role": "rule:admin_required",
+    "identity:delete_role": "rule:admin_required",
+    "identity:get_domain_role": "rule:admin_required",
+    "identity:list_domain_roles": "rule:admin_required",
+    "identity:create_domain_role": "rule:admin_required",
+    "identity:update_domain_role": "rule:admin_required",
+    "identity:delete_domain_role": "rule:admin_required",
+
+    "identity:get_implied_role": "rule:admin_required ",
+    "identity:list_implied_roles": "rule:admin_required",
+    "identity:create_implied_role": "rule:admin_required",
+    "identity:delete_implied_role": "rule:admin_required",
+    "identity:list_role_inference_rules": "rule:admin_required",
+    "identity:check_implied_role": "rule:admin_required",
+
+    "identity:check_grant": "rule:admin_required",
+    "identity:list_grants": "rule:admin_required",
+    "identity:create_grant": "rule:admin_required",
+    "identity:revoke_grant": "rule:admin_required",
+
+    "identity:list_role_assignments": "rule:admin_required",
+    "identity:list_role_assignments_for_tree": "rule:admin_required",
+
+    "identity:get_policy": "rule:admin_required",
+    "identity:list_policies": "rule:admin_required",
+    "identity:create_policy": "rule:admin_required",
+    "identity:update_policy": "rule:admin_required",
+    "identity:delete_policy": "rule:admin_required",
+
+    "identity:check_token": "rule:admin_or_token_subject",
+    "identity:validate_token": "rule:service_admin_or_token_subject",
+    "identity:validate_token_head": "rule:service_or_admin",
+    "identity:revocation_list": "rule:service_or_admin",
+    "identity:revoke_token": "rule:admin_or_token_subject",
+
+    "identity:create_trust": "user_id:%(trust.trustor_user_id)s",
+    "identity:list_trusts": "",
+    "identity:list_roles_for_trust": "",
+    "identity:get_role_for_trust": "",
+    "identity:delete_trust": "",
+
+    "identity:create_consumer": "rule:admin_required",
+    "identity:get_consumer": "rule:admin_required",
+    "identity:list_consumers": "rule:admin_required",
+    "identity:delete_consumer": "rule:admin_required",
+    "identity:update_consumer": "rule:admin_required",
+
+    "identity:authorize_request_token": "rule:admin_required",
+    "identity:list_access_token_roles": "rule:admin_required",
+    "identity:get_access_token_role": "rule:admin_required",
+    "identity:list_access_tokens": "rule:admin_required",
+    "identity:get_access_token": "rule:admin_required",
+    "identity:delete_access_token": "rule:admin_required",
+
+    "identity:list_projects_for_endpoint": "rule:admin_required",
+    "identity:add_endpoint_to_project": "rule:admin_required",
+    "identity:check_endpoint_in_project": "rule:admin_required",
+    "identity:list_endpoints_for_project": "rule:admin_required",
+    "identity:remove_endpoint_from_project": "rule:admin_required",
+
     "identity:create_endpoint_group": "rule:admin_required",
     "identity:list_endpoint_groups": "rule:admin_required",
     "identity:get_endpoint_group": "rule:admin_required",
@@ -57,41 +144,40 @@
     "identity:list_endpoint_groups_for_project": "rule:admin_required",
     "identity:add_endpoint_group_to_project": "rule:admin_required",
     "identity:remove_endpoint_group_from_project": "rule:admin_required",
-    "identity:check_grant": "rule:admin_required",
-    "identity:list_grants": "rule:admin_required",
-    "identity:create_grant": "rule:admin_required",
-    "identity:revoke_grant": "rule:admin_required",
-    "identity:get_group": "rule:admin_required",
-    "identity:list_groups": "rule:admin_required",
-    "identity:list_groups_for_user": "rule:admin_or_owner",
-    "identity:create_group": "rule:admin_required",
-    "identity:update_group": "rule:admin_required",
-    "identity:delete_group": "rule:admin_required",
-    "identity:list_users_in_group": "rule:admin_required",
-    "identity:remove_user_from_group": "rule:admin_required",
-    "identity:check_user_in_group": "rule:admin_required",
-    "identity:add_user_to_group": "rule:admin_required",
+
     "identity:create_identity_provider": "rule:admin_required",
     "identity:list_identity_providers": "rule:admin_required",
-    "identity:get_identity_provider": "rule:admin_required",
+    "identity:get_identity_providers": "rule:admin_required",
     "identity:update_identity_provider": "rule:admin_required",
     "identity:delete_identity_provider": "rule:admin_required",
-    "identity:get_implied_role": "rule:admin_required",
-    "identity:list_implied_roles": "rule:admin_required",
-    "identity:create_implied_role": "rule:admin_required",
-    "identity:delete_implied_role": "rule:admin_required",
-    "identity:list_role_inference_rules": "rule:admin_required",
-    "identity:check_implied_role": "rule:admin_required",
+
+    "identity:create_protocol": "rule:admin_required",
+    "identity:update_protocol": "rule:admin_required",
+    "identity:get_protocol": "rule:admin_required",
+    "identity:list_protocols": "rule:admin_required",
+    "identity:delete_protocol": "rule:admin_required",
+
     "identity:create_mapping": "rule:admin_required",
     "identity:get_mapping": "rule:admin_required",
     "identity:list_mappings": "rule:admin_required",
     "identity:delete_mapping": "rule:admin_required",
     "identity:update_mapping": "rule:admin_required",
-    "identity:get_policy": "rule:admin_required",
-    "identity:list_policies": "rule:admin_required",
-    "identity:create_policy": "rule:admin_required",
-    "identity:update_policy": "rule:admin_required",
-    "identity:delete_policy": "rule:admin_required",
+
+    "identity:create_service_provider": "rule:admin_required",
+    "identity:list_service_providers": "rule:admin_required",
+    "identity:get_service_provider": "rule:admin_required",
+    "identity:update_service_provider": "rule:admin_required",
+    "identity:delete_service_provider": "rule:admin_required",
+
+    "identity:get_auth_catalog": "",
+    "identity:get_auth_projects": "",
+    "identity:get_auth_domains": "",
+
+    "identity:list_projects_for_groups": "",
+    "identity:list_domains_for_groups": "",
+
+    "identity:list_revoke_events": "",
+
     "identity:create_policy_association_for_endpoint": "rule:admin_required",
     "identity:check_policy_association_for_endpoint": "rule:admin_required",
     "identity:delete_policy_association_for_endpoint": "rule:admin_required",
@@ -103,72 +189,10 @@
     "identity:delete_policy_association_for_region_and_service": "rule:admin_required",
     "identity:get_policy_for_endpoint": "rule:admin_required",
     "identity:list_endpoints_for_policy": "rule:admin_required",
-    "identity:get_project": "rule:admin_required or project_id:%(target.project.id)s",
-    "identity:list_projects": "rule:admin_required",
-    "identity:list_user_projects": "rule:admin_or_owner",
-    "identity:create_project": "rule:admin_required",
-    "identity:update_project": "rule:admin_required",
-    "identity:delete_project": "rule:admin_required",
-    "identity:list_project_tags": "rule:admin_required or project_id:%(target.project.id)s",
-    "identity:get_project_tag": "rule:admin_required or project_id:%(target.project.id)s",
-    "identity:update_project_tags": "rule:admin_required",
-    "identity:create_project_tag": "rule:admin_required",
-    "identity:delete_project_tags": "rule:admin_required",
-    "identity:delete_project_tag": "rule:admin_required",
-    "identity:list_projects_for_endpoint": "rule:admin_required",
-    "identity:add_endpoint_to_project": "rule:admin_required",
-    "identity:check_endpoint_in_project": "rule:admin_required",
-    "identity:list_endpoints_for_project": "rule:admin_required",
-    "identity:remove_endpoint_from_project": "rule:admin_required",
-    "identity:create_protocol": "rule:admin_required",
-    "identity:update_protocol": "rule:admin_required",
-    "identity:get_protocol": "rule:admin_required",
-    "identity:list_protocols": "rule:admin_required",
-    "identity:delete_protocol": "rule:admin_required",
-    "identity:get_region": "",
-    "identity:list_regions": "",
-    "identity:create_region": "rule:admin_required",
-    "identity:update_region": "rule:admin_required",
-    "identity:delete_region": "rule:admin_required",
-    "identity:list_revoke_events": "rule:service_or_admin",
-    "identity:get_role": "rule:admin_required",
-    "identity:list_roles": "rule:admin_required",
-    "identity:create_role": "rule:admin_required",
-    "identity:update_role": "rule:admin_required",
-    "identity:delete_role": "rule:admin_required",
-    "identity:get_domain_role": "rule:admin_required",
-    "identity:list_domain_roles": "rule:admin_required",
-    "identity:create_domain_role": "rule:admin_required",
-    "identity:update_domain_role": "rule:admin_required",
-    "identity:delete_domain_role": "rule:admin_required",
-    "identity:list_role_assignments": "rule:admin_required",
-    "identity:list_role_assignments_for_tree": "rule:admin_required",
-    "identity:get_service": "rule:admin_required",
-    "identity:list_services": "rule:admin_required",
-    "identity:create_service": "rule:admin_required",
-    "identity:update_service": "rule:admin_required",
-    "identity:delete_service": "rule:admin_required",
-    "identity:create_service_provider": "rule:admin_required",
-    "identity:list_service_providers": "rule:admin_required",
-    "identity:get_service_provider": "rule:admin_required",
-    "identity:update_service_provider": "rule:admin_required",
-    "identity:delete_service_provider": "rule:admin_required",
-    "identity:revocation_list": "rule:service_or_admin",
-    "identity:check_token": "rule:admin_or_token_subject",
-    "identity:validate_token": "rule:service_admin_or_token_subject",
-    "identity:validate_token_head": "rule:service_or_admin",
-    "identity:revoke_token": "rule:admin_or_token_subject",
-    "identity:create_trust": "user_id:%(trust.trustor_user_id)s",
-    "identity:list_trusts": "",
-    "identity:list_roles_for_trust": "",
-    "identity:get_role_for_trust": "",
-    "identity:delete_trust": "",
-    "identity:get_trust": "",
-    "identity:get_user": "rule:admin_or_owner",
-    "identity:list_users": "rule:admin_required",
-    "identity:list_projects_for_user": "",
-    "identity:list_domains_for_user": "",
-    "identity:create_user": "rule:admin_required",
-    "identity:update_user": "rule:admin_required",
-    "identity:delete_user": "rule:admin_required"
+
+    "identity:create_domain_config": "rule:admin_required",
+    "identity:get_domain_config": "rule:admin_required",
+    "identity:update_domain_config": "rule:admin_required",
+    "identity:delete_domain_config": "rule:admin_required",
+    "identity:get_domain_config_default": "rule:admin_required"
 }

2019-04-30 22:48:19,876 [salt.state       :1951][INFO    ][7053] Completed state [/usr/share/openstack-dashboard/openstack_dashboard/conf/keystone_policy.json] at time 22:48:19.876546 duration_in_ms=21.976
2019-04-30 22:48:19,880 [salt.state       :1780][INFO    ][7053] Running state [wsgi_openstack_web] at time 22:48:19.880285
2019-04-30 22:48:19,880 [salt.state       :1813][INFO    ][7053] Executing state apache_site.enabled for [wsgi_openstack_web]
2019-04-30 22:48:19,882 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7053] Executing command ['a2ensite', 'wsgi_openstack_web'] in directory '/root'
2019-04-30 22:48:19,917 [salt.state       :300 ][INFO    ][7053] {'new': 'wsgi_openstack_web', 'old': None}
2019-04-30 22:48:19,917 [salt.state       :1951][INFO    ][7053] Completed state [wsgi_openstack_web] at time 22:48:19.917442 duration_in_ms=37.154
2019-04-30 22:48:20,259 [salt.state       :1780][INFO    ][7053] Running state [/var/log/horizon] at time 22:48:20.259059
2019-04-30 22:48:20,259 [salt.state       :1813][INFO    ][7053] Executing state file.directory for [/var/log/horizon]
2019-04-30 22:48:20,260 [salt.state       :300 ][INFO    ][7053] {'/var/log/horizon': 'New Dir'}
2019-04-30 22:48:20,260 [salt.state       :1951][INFO    ][7053] Completed state [/var/log/horizon] at time 22:48:20.260820 duration_in_ms=1.761
2019-04-30 22:48:20,261 [salt.state       :1780][INFO    ][7053] Running state [/var/log/horizon/horizon.log] at time 22:48:20.261165
2019-04-30 22:48:20,261 [salt.state       :1813][INFO    ][7053] Executing state file.managed for [/var/log/horizon/horizon.log]
2019-04-30 22:48:20,261 [salt.loaded.int.states.file:2298][WARNING ][7053] State for file: /var/log/horizon/horizon.log - Neither 'source' nor 'contents' nor 'contents_pillar' nor 'contents_grains' was defined, yet 'replace' was set to 'True'. As there is no source to replace the file with, 'replace' has been set to 'False' to avoid reading the file unnecessarily.
2019-04-30 22:48:20,262 [salt.state       :300 ][INFO    ][7053] {'new': 'file /var/log/horizon/horizon.log created', 'group': 'adm', 'user': 'horizon', 'mode': '0640'}
2019-04-30 22:48:20,262 [salt.state       :1951][INFO    ][7053] Completed state [/var/log/horizon/horizon.log] at time 22:48:20.262684 duration_in_ms=1.52
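The WARNING at 22:48:20,261 fires because `file.managed` was given no `source`, `contents`, `contents_pillar`, or `contents_grains`, yet `replace` defaults to `True`; Salt downgrades it to `False` and logs the warning. Declaring `replace: False` explicitly silences it. A sketch, with the ownership taken from the logged changes and the state ID from the path (the rest of the actual SLS is assumed):

```yaml
# Hypothetical reconstruction of the horizon.log state; replace: False
# is the fix for the WARNING, the remaining options mirror the logged result.
/var/log/horizon/horizon.log:
  file.managed:
    - user: horizon
    - group: adm
    - mode: 640
    - replace: False
```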
2019-04-30 22:48:20,263 [salt.state       :1780][INFO    ][7053] Running state [apache2] at time 22:48:20.263359
2019-04-30 22:48:20,263 [salt.state       :1813][INFO    ][7053] Executing state service.running for [apache2]
2019-04-30 22:48:20,264 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7053] Executing command ['systemctl', 'is-active', 'apache2.service'] in directory '/root'
2019-04-30 22:48:20,276 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7053] Executing command ['systemctl', 'is-enabled', 'apache2.service'] in directory '/root'
2019-04-30 22:48:20,289 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7053] Executing command ['systemd-run', '--scope', 'systemctl', 'start', 'apache2.service'] in directory '/root'
2019-04-30 22:48:21,404 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7053] Executing command ['systemctl', 'is-active', 'apache2.service'] in directory '/root'
2019-04-30 22:48:21,420 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7053] Executing command ['systemctl', 'is-enabled', 'apache2.service'] in directory '/root'
2019-04-30 22:48:21,438 [salt.loaded.int.module.cmdmod:395 ][INFO    ][7053] Executing command ['systemctl', 'is-enabled', 'apache2.service'] in directory '/root'
2019-04-30 22:48:21,452 [salt.state       :300 ][INFO    ][7053] {'apache2': True}
2019-04-30 22:48:21,453 [salt.state       :1951][INFO    ][7053] Completed state [apache2] at time 22:48:21.453334 duration_in_ms=1189.972
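The `apache_site.enabled` state (which runs `a2ensite wsgi_openstack_web`) and the `apache2` `service.running` state above are typically paired so that enabling the site triggers a service reload. A sketch of that pairing, assuming the state IDs seen in the log and a `watch` requisite that this log does not confirm:

```yaml
# Hypothetical wiring of the two apache states seen above.
wsgi_openstack_web:
  apache_site.enabled: []

apache2:
  service.running:
    - enable: True
    - watch:
      - apache_site: wsgi_openstack_web
```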
2019-04-30 22:48:21,457 [salt.minion      :1711][INFO    ][7053] Returning information for job: 20190430224516212732
2019-04-30 22:48:22,107 [salt.minion      :1308][INFO    ][1820] User sudo_ubuntu Executing command state.sls with jid 20190430224822096887
2019-04-30 22:48:22,118 [salt.minion      :1432][INFO    ][12445] Starting a new job with PID 12445
2019-04-30 22:48:25,727 [salt.state       :915 ][INFO    ][12445] Loading fresh modules for state activity
2019-04-30 22:48:25,766 [salt.fileclient  :1219][INFO    ][12445] Fetching file from saltenv 'base', ** done ** 'nginx/init.sls'
2019-04-30 22:48:25,792 [salt.fileclient  :1219][INFO    ][12445] Fetching file from saltenv 'base', ** done ** 'nginx/server.sls'
2019-04-30 22:48:25,845 [salt.fileclient  :1219][INFO    ][12445] Fetching file from saltenv 'base', ** done ** 'nginx/server/users.sls'
2019-04-30 22:48:25,875 [salt.fileclient  :1219][INFO    ][12445] Fetching file from saltenv 'base', ** done ** 'nginx/server/sites.sls'
2019-04-30 22:48:25,925 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12445] Executing command 'cat /etc/ssl/certs/172.30.10.101-with-chain.crt' in directory '/root'
2019-04-30 22:48:25,942 [salt.loaded.int.module.cmdmod:730 ][ERROR   ][12445] Command 'cat /etc/ssl/certs/172.30.10.101-with-chain.crt' failed with return code: 1
2019-04-30 22:48:25,943 [salt.loaded.int.module.cmdmod:732 ][ERROR   ][12445] stdout: cat: /etc/ssl/certs/172.30.10.101-with-chain.crt: No such file or directory
2019-04-30 22:48:25,943 [salt.loaded.int.module.cmdmod:736 ][ERROR   ][12445] retcode: 1
2019-04-30 22:48:25,944 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12445] Executing command 'cat /etc/ssl/certs/172.30.10.101.crt /etc/ssl/certs/ca-salt_master_ca.crt' in directory '/root'
2019-04-30 22:48:25,999 [salt.state       :1780][INFO    ][12445] Running state [cat /etc/ssl/certs/172.30.10.101.crt /etc/ssl/certs/ca-salt_master_ca.crt > /etc/ssl/certs/172.30.10.101-with-chain.crt] at time 22:48:25.999510
2019-04-30 22:48:26,000 [salt.state       :1813][INFO    ][12445] Executing state cmd.run for [cat /etc/ssl/certs/172.30.10.101.crt /etc/ssl/certs/ca-salt_master_ca.crt > /etc/ssl/certs/172.30.10.101-with-chain.crt]
2019-04-30 22:48:26,000 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12445] Executing command '/bin/true' in directory '/root'
2019-04-30 22:48:26,013 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12445] Executing command 'cat /etc/ssl/certs/172.30.10.101.crt /etc/ssl/certs/ca-salt_master_ca.crt > /etc/ssl/certs/172.30.10.101-with-chain.crt' in directory '/root'
2019-04-30 22:48:26,021 [salt.state       :300 ][INFO    ][12445] {'pid': 12464, 'retcode': 0, 'stderr': '', 'stdout': ''}
2019-04-30 22:48:26,021 [salt.state       :1951][INFO    ][12445] Completed state [cat /etc/ssl/certs/172.30.10.101.crt /etc/ssl/certs/ca-salt_master_ca.crt > /etc/ssl/certs/172.30.10.101-with-chain.crt] at time 22:48:26.021634 duration_in_ms=22.126
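The failed `cat /etc/ssl/certs/172.30.10.101-with-chain.crt` at 22:48:25,942 (retcode 1, "No such file or directory") is expected: it reads like an `unless` idempotency check, so the chain file is only rebuilt when it does not already exist. A sketch under that assumption, with the command taken verbatim from the log and the state ID invented:

```yaml
# Hypothetical cmd.run state for the cert-chain concatenation above.
# `unless` succeeds once the chained cert exists, skipping the rebuild.
create_cert_chain_172_30_10_101:
  cmd.run:
    - name: cat /etc/ssl/certs/172.30.10.101.crt /etc/ssl/certs/ca-salt_master_ca.crt > /etc/ssl/certs/172.30.10.101-with-chain.crt
    - unless: cat /etc/ssl/certs/172.30.10.101-with-chain.crt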
2019-04-30 22:48:27,171 [salt.state       :1780][INFO    ][12445] Running state [nginx] at time 22:48:27.171790
2019-04-30 22:48:27,172 [salt.state       :1813][INFO    ][12445] Executing state pkg.installed for [nginx]
2019-04-30 22:48:27,173 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12445] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2019-04-30 22:48:27,478 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12445] Executing command ['apt-cache', '-q', 'policy', 'nginx'] in directory '/root'
2019-04-30 22:48:27,530 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12445] Executing command ['apt-get', '-q', 'update'] in directory '/root'
2019-04-30 22:48:29,270 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12445] Executing command ['dpkg', '--get-selections', '*'] in directory '/root'
2019-04-30 22:48:29,290 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12445] Executing command ['systemd-run', '--scope', 'apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'nginx'] in directory '/root'
2019-04-30 22:48:37,137 [salt.minion      :1308][INFO    ][1820] User sudo_ubuntu Executing command saltutil.find_job with jid 20190430224837123929
2019-04-30 22:48:37,148 [salt.minion      :1432][INFO    ][13228] Starting a new job with PID 13228
2019-04-30 22:48:37,164 [salt.minion      :1711][INFO    ][13228] Returning information for job: 20190430224837123929
2019-04-30 22:48:41,631 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12445] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}', '-W'] in directory '/root'
2019-04-30 22:48:41,662 [salt.state       :300 ][INFO    ][12445] Made the following changes:
'libgd3' changed from 'absent' to '2.1.1-4ubuntu0.16.04.11'
'nginx-core' changed from 'absent' to '1.10.3-0ubuntu0.16.04.3'
'libxpm4' changed from 'absent' to '1:3.5.11-1ubuntu0.16.04.1'
'nginx' changed from 'absent' to '1.10.3-0ubuntu0.16.04.3'
'nginx-common' changed from 'absent' to '1.10.3-0ubuntu0.16.04.3'
'libfontconfig' changed from 'absent' to '1'
'fonts-dejavu-core' changed from 'absent' to '2.35-1'
'fontconfig-config' changed from 'absent' to '2.11.94-0ubuntu1.1'
'libvpx3' changed from 'absent' to '1.5.0-2ubuntu1'
'libfontconfig1' changed from 'absent' to '2.11.94-0ubuntu1.1'

2019-04-30 22:48:41,685 [salt.state       :915 ][INFO    ][12445] Loading fresh modules for state activity
2019-04-30 22:48:41,805 [salt.state       :1951][INFO    ][12445] Completed state [nginx] at time 22:48:41.805143 duration_in_ms=14633.353
2019-04-30 22:48:41,810 [salt.state       :1780][INFO    ][12445] Running state [apache2-utils] at time 22:48:41.810060
2019-04-30 22:48:41,810 [salt.state       :1813][INFO    ][12445] Executing state pkg.installed for [apache2-utils]
2019-04-30 22:48:42,282 [salt.state       :300 ][INFO    ][12445] All specified packages are already installed
2019-04-30 22:48:42,283 [salt.state       :1951][INFO    ][12445] Completed state [apache2-utils] at time 22:48:42.283013 duration_in_ms=472.953
2019-04-30 22:48:42,283 [salt.state       :1780][INFO    ][12445] Running state [openssl] at time 22:48:42.283263
2019-04-30 22:48:42,283 [salt.state       :1813][INFO    ][12445] Executing state pkg.installed for [openssl]
2019-04-30 22:48:42,288 [salt.state       :300 ][INFO    ][12445] All specified packages are already installed
2019-04-30 22:48:42,288 [salt.state       :1951][INFO    ][12445] Completed state [openssl] at time 22:48:42.288843 duration_in_ms=5.579
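The three package states above (`nginx` installed fresh, `apache2-utils` and `openssl` already present) could equally be expressed as a single `pkg.installed` with a `pkgs` list, which Salt resolves in one apt transaction. A sketch, with the state ID invented:

```yaml
# Hypothetical consolidated form of the three pkg.installed states above.
nginx_proxy_packages:
  pkg.installed:
    - pkgs:
      - nginx
      - apache2-utils
      - openssl
```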
2019-04-30 22:48:42,290 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_barbican.conf] at time 22:48:42.290636
2019-04-30 22:48:42,290 [salt.state       :1813][INFO    ][12445] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_openstack_api_barbican.conf]
2019-04-30 22:48:42,312 [salt.fileclient  :1219][INFO    ][12445] Fetching file from saltenv 'base', ** done ** 'nginx/files/proxy.conf'
2019-04-30 22:48:42,363 [salt.fileclient  :1219][INFO    ][12445] Fetching file from saltenv 'base', ** done ** 'nginx/files/_limit.conf'
2019-04-30 22:48:42,386 [salt.fileclient  :1219][INFO    ][12445] Fetching file from saltenv 'base', ** done ** 'nginx/files/headers/_strict_transport_security.conf'
2019-04-30 22:48:42,403 [salt.fileclient  :1219][INFO    ][12445] Fetching file from saltenv 'base', ** done ** 'nginx/files/_name.conf'
2019-04-30 22:48:42,419 [salt.fileclient  :1219][INFO    ][12445] Fetching file from saltenv 'base', ** done ** 'nginx/files/_ssl.conf'
2019-04-30 22:48:42,464 [salt.fileclient  :1219][INFO    ][12445] Fetching file from saltenv 'base', ** done ** 'nginx/files/_ssl_secure.conf'
2019-04-30 22:48:42,478 [salt.fileclient  :1219][INFO    ][12445] Fetching file from saltenv 'base', ** done ** 'nginx/files/_auth.conf'
2019-04-30 22:48:42,493 [salt.fileclient  :1219][INFO    ][12445] Fetching file from saltenv 'base', ** done ** 'nginx/files/_access_policy.conf'
2019-04-30 22:48:42,497 [salt.state       :300 ][INFO    ][12445] File changed:
New file
2019-04-30 22:48:42,498 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_barbican.conf] at time 22:48:42.498017 duration_in_ms=207.381
2019-04-30 22:48:42,498 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_barbican.conf] at time 22:48:42.498223
2019-04-30 22:48:42,498 [salt.state       :1813][INFO    ][12445] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_barbican.conf]
2019-04-30 22:48:42,499 [salt.state       :300 ][INFO    ][12445] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_barbican.conf'}
2019-04-30 22:48:42,499 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_barbican.conf] at time 22:48:42.499626 duration_in_ms=1.403
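Each proxy vhost below repeats the same Debian-style pattern: render the config into `sites-available` from the `nginx/files/proxy.conf` template, then symlink it into `sites-enabled` (with `file.absent` removing retired vhosts such as `heat_cloudwatch`). A sketch of one iteration, assuming Jinja templating and omitting the per-service pillar data this log does not show:

```yaml
# Hypothetical reconstruction of the per-vhost pattern seen in the log.
/etc/nginx/sites-available/nginx_proxy_openstack_api_barbican.conf:
  file.managed:
    - source: salt://nginx/files/proxy.conf
    - template: jinja

/etc/nginx/sites-enabled/nginx_proxy_openstack_api_barbican.conf:
  file.symlink:
    - target: /etc/nginx/sites-available/nginx_proxy_openstack_api_barbican.conf
```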
2019-04-30 22:48:42,500 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_keystone_private.conf] at time 22:48:42.500027
2019-04-30 22:48:42,500 [salt.state       :1813][INFO    ][12445] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_openstack_api_keystone_private.conf]
2019-04-30 22:48:42,641 [salt.state       :300 ][INFO    ][12445] File changed:
New file
2019-04-30 22:48:42,641 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_keystone_private.conf] at time 22:48:42.641671 duration_in_ms=141.643
2019-04-30 22:48:42,641 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone_private.conf] at time 22:48:42.641851
2019-04-30 22:48:42,642 [salt.state       :1813][INFO    ][12445] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone_private.conf]
2019-04-30 22:48:42,643 [salt.state       :300 ][INFO    ][12445] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone_private.conf'}
2019-04-30 22:48:42,643 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone_private.conf] at time 22:48:42.643158 duration_in_ms=1.306
2019-04-30 22:48:42,643 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat_cloudwatch.conf] at time 22:48:42.643321
2019-04-30 22:48:42,643 [salt.state       :1813][INFO    ][12445] Executing state file.absent for [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat_cloudwatch.conf]
2019-04-30 22:48:42,643 [salt.state       :300 ][INFO    ][12445] File /etc/nginx/sites-available/nginx_proxy_openstack_api_heat_cloudwatch.conf is not present
2019-04-30 22:48:42,643 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat_cloudwatch.conf] at time 22:48:42.643897 duration_in_ms=0.576
2019-04-30 22:48:42,644 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cloudwatch.conf] at time 22:48:42.644072
2019-04-30 22:48:42,644 [salt.state       :1813][INFO    ][12445] Executing state file.absent for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cloudwatch.conf]
2019-04-30 22:48:42,644 [salt.state       :300 ][INFO    ][12445] File /etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cloudwatch.conf is not present
2019-04-30 22:48:42,644 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cloudwatch.conf] at time 22:48:42.644591 duration_in_ms=0.518
2019-04-30 22:48:42,645 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_aodh.conf] at time 22:48:42.644974
2019-04-30 22:48:42,645 [salt.state       :1813][INFO    ][12445] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_openstack_api_aodh.conf]
2019-04-30 22:48:42,783 [salt.state       :300 ][INFO    ][12445] File changed:
New file
2019-04-30 22:48:42,783 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_aodh.conf] at time 22:48:42.783442 duration_in_ms=138.468
2019-04-30 22:48:42,783 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_aodh.conf] at time 22:48:42.783642
2019-04-30 22:48:42,783 [salt.state       :1813][INFO    ][12445] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_aodh.conf]
2019-04-30 22:48:42,784 [salt.state       :300 ][INFO    ][12445] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_aodh.conf'}
2019-04-30 22:48:42,785 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_aodh.conf] at time 22:48:42.784975 duration_in_ms=1.333
2019-04-30 22:48:42,785 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_neutron.conf] at time 22:48:42.785358
2019-04-30 22:48:42,785 [salt.state       :1813][INFO    ][12445] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_openstack_api_neutron.conf]
2019-04-30 22:48:42,922 [salt.state       :300 ][INFO    ][12445] File changed:
New file
2019-04-30 22:48:42,923 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_neutron.conf] at time 22:48:42.923056 duration_in_ms=137.698
2019-04-30 22:48:42,923 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_neutron.conf] at time 22:48:42.923229
2019-04-30 22:48:42,923 [salt.state       :1813][INFO    ][12445] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_neutron.conf]
2019-04-30 22:48:42,924 [salt.state       :300 ][INFO    ][12445] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_neutron.conf'}
2019-04-30 22:48:42,924 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_neutron.conf] at time 22:48:42.924459 duration_in_ms=1.229
2019-04-30 22:48:42,924 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_web.conf] at time 22:48:42.924623
2019-04-30 22:48:42,924 [salt.state       :1813][INFO    ][12445] Executing state file.absent for [/etc/nginx/sites-available/nginx_proxy_openstack_web.conf]
2019-04-30 22:48:42,925 [salt.state       :300 ][INFO    ][12445] File /etc/nginx/sites-available/nginx_proxy_openstack_web.conf is not present
2019-04-30 22:48:42,925 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_web.conf] at time 22:48:42.925152 duration_in_ms=0.529
2019-04-30 22:48:42,925 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_web.conf] at time 22:48:42.925316
2019-04-30 22:48:42,925 [salt.state       :1813][INFO    ][12445] Executing state file.absent for [/etc/nginx/sites-enabled/nginx_proxy_openstack_web.conf]
2019-04-30 22:48:42,925 [salt.state       :300 ][INFO    ][12445] File /etc/nginx/sites-enabled/nginx_proxy_openstack_web.conf is not present
2019-04-30 22:48:42,925 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_web.conf] at time 22:48:42.925841 duration_in_ms=0.526
2019-04-30 22:48:42,926 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_glance.conf] at time 22:48:42.926265
2019-04-30 22:48:42,926 [salt.state       :1813][INFO    ][12445] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_openstack_api_glance.conf]
2019-04-30 22:48:43,067 [salt.state       :300 ][INFO    ][12445] File changed:
New file
2019-04-30 22:48:43,067 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_glance.conf] at time 22:48:43.067831 duration_in_ms=141.566
2019-04-30 22:48:43,068 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_glance.conf] at time 22:48:43.068013
2019-04-30 22:48:43,068 [salt.state       :1813][INFO    ][12445] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_glance.conf]
2019-04-30 22:48:43,069 [salt.state       :300 ][INFO    ][12445] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_glance.conf'}
2019-04-30 22:48:43,069 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_glance.conf] at time 22:48:43.069335 duration_in_ms=1.321
2019-04-30 22:48:43,069 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-available/nginx_proxy_novnc.conf] at time 22:48:43.069705
2019-04-30 22:48:43,069 [salt.state       :1813][INFO    ][12445] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_novnc.conf]
2019-04-30 22:48:43,209 [salt.state       :300 ][INFO    ][12445] File changed:
New file
2019-04-30 22:48:43,209 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-available/nginx_proxy_novnc.conf] at time 22:48:43.209929 duration_in_ms=140.224
2019-04-30 22:48:43,210 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-enabled/nginx_proxy_novnc.conf] at time 22:48:43.210110
2019-04-30 22:48:43,210 [salt.state       :1813][INFO    ][12445] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_novnc.conf]
2019-04-30 22:48:43,211 [salt.state       :300 ][INFO    ][12445] {'new': '/etc/nginx/sites-enabled/nginx_proxy_novnc.conf'}
2019-04-30 22:48:43,211 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-enabled/nginx_proxy_novnc.conf] at time 22:48:43.211413 duration_in_ms=1.303
2019-04-30 22:48:43,211 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_keystone.conf] at time 22:48:43.211865
2019-04-30 22:48:43,212 [salt.state       :1813][INFO    ][12445] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_openstack_api_keystone.conf]
2019-04-30 22:48:43,350 [salt.state       :300 ][INFO    ][12445] File changed:
New file
2019-04-30 22:48:43,350 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_keystone.conf] at time 22:48:43.350227 duration_in_ms=138.362
2019-04-30 22:48:43,350 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone.conf] at time 22:48:43.350404
2019-04-30 22:48:43,350 [salt.state       :1813][INFO    ][12445] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone.conf]
2019-04-30 22:48:43,351 [salt.state       :300 ][INFO    ][12445] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone.conf'}
2019-04-30 22:48:43,351 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_keystone.conf] at time 22:48:43.351723 duration_in_ms=1.318
2019-04-30 22:48:43,352 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_cinder.conf] at time 22:48:43.352119
2019-04-30 22:48:43,352 [salt.state       :1813][INFO    ][12445] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_openstack_api_cinder.conf]
2019-04-30 22:48:43,491 [salt.state       :300 ][INFO    ][12445] File changed:
New file
2019-04-30 22:48:43,491 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_cinder.conf] at time 22:48:43.491804 duration_in_ms=139.685
2019-04-30 22:48:43,492 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_cinder.conf] at time 22:48:43.491992
2019-04-30 22:48:43,492 [salt.state       :1813][INFO    ][12445] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_cinder.conf]
2019-04-30 22:48:43,493 [salt.state       :300 ][INFO    ][12445] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_cinder.conf'}
2019-04-30 22:48:43,493 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_cinder.conf] at time 22:48:43.493246 duration_in_ms=1.254
2019-04-30 22:48:43,493 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat_cfn.conf] at time 22:48:43.493605
2019-04-30 22:48:43,493 [salt.state       :1813][INFO    ][12445] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat_cfn.conf]
2019-04-30 22:48:43,631 [salt.state       :300 ][INFO    ][12445] File changed:
New file
2019-04-30 22:48:43,632 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat_cfn.conf] at time 22:48:43.632118 duration_in_ms=138.512
2019-04-30 22:48:43,632 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cfn.conf] at time 22:48:43.632307
2019-04-30 22:48:43,632 [salt.state       :1813][INFO    ][12445] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cfn.conf]
2019-04-30 22:48:43,633 [salt.state       :300 ][INFO    ][12445] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cfn.conf'}
2019-04-30 22:48:43,633 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat_cfn.conf] at time 22:48:43.633623 duration_in_ms=1.316
2019-04-30 22:48:43,633 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-available/nginx_redirect_openstack_web_redirect.conf] at time 22:48:43.633795
2019-04-30 22:48:43,633 [salt.state       :1813][INFO    ][12445] Executing state file.absent for [/etc/nginx/sites-available/nginx_redirect_openstack_web_redirect.conf]
2019-04-30 22:48:43,634 [salt.state       :300 ][INFO    ][12445] File /etc/nginx/sites-available/nginx_redirect_openstack_web_redirect.conf is not present
2019-04-30 22:48:43,634 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-available/nginx_redirect_openstack_web_redirect.conf] at time 22:48:43.634345 duration_in_ms=0.55
2019-04-30 22:48:43,634 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-enabled/nginx_redirect_openstack_web_redirect.conf] at time 22:48:43.634508
2019-04-30 22:48:43,634 [salt.state       :1813][INFO    ][12445] Executing state file.absent for [/etc/nginx/sites-enabled/nginx_redirect_openstack_web_redirect.conf]
2019-04-30 22:48:43,634 [salt.state       :300 ][INFO    ][12445] File /etc/nginx/sites-enabled/nginx_redirect_openstack_web_redirect.conf is not present
2019-04-30 22:48:43,635 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-enabled/nginx_redirect_openstack_web_redirect.conf] at time 22:48:43.635055 duration_in_ms=0.548
2019-04-30 22:48:43,635 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-available/nginx_static_reclass_doc.conf] at time 22:48:43.635426
2019-04-30 22:48:43,635 [salt.state       :1813][INFO    ][12445] Executing state file.managed for [/etc/nginx/sites-available/nginx_static_reclass_doc.conf]
2019-04-30 22:48:43,650 [salt.fileclient  :1219][INFO    ][12445] Fetching file from saltenv 'base', ** done ** 'nginx/files/static.conf'
2019-04-30 22:48:43,682 [salt.fileclient  :1219][INFO    ][12445] Fetching file from saltenv 'base', ** done ** 'nginx/files/_log.conf'
2019-04-30 22:48:43,759 [salt.state       :300 ][INFO    ][12445] File changed:
New file
2019-04-30 22:48:43,759 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-available/nginx_static_reclass_doc.conf] at time 22:48:43.759505 duration_in_ms=124.079
2019-04-30 22:48:43,759 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-enabled/nginx_static_reclass_doc.conf] at time 22:48:43.759688
2019-04-30 22:48:43,759 [salt.state       :1813][INFO    ][12445] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_static_reclass_doc.conf]
2019-04-30 22:48:43,760 [salt.state       :300 ][INFO    ][12445] {'new': '/etc/nginx/sites-enabled/nginx_static_reclass_doc.conf'}
2019-04-30 22:48:43,760 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-enabled/nginx_static_reclass_doc.conf] at time 22:48:43.760908 duration_in_ms=1.22
2019-04-30 22:48:43,761 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat.conf] at time 22:48:43.761286
2019-04-30 22:48:43,761 [salt.state       :1813][INFO    ][12445] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat.conf]
2019-04-30 22:48:43,902 [salt.state       :300 ][INFO    ][12445] File changed:
New file
2019-04-30 22:48:43,902 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_heat.conf] at time 22:48:43.902346 duration_in_ms=141.06
2019-04-30 22:48:43,902 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat.conf] at time 22:48:43.902594
2019-04-30 22:48:43,902 [salt.state       :1813][INFO    ][12445] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat.conf]
2019-04-30 22:48:43,904 [salt.state       :300 ][INFO    ][12445] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat.conf'}
2019-04-30 22:48:43,904 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_heat.conf] at time 22:48:43.904184 duration_in_ms=1.59
2019-04-30 22:48:43,904 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-available/nginx_static_stats.conf] at time 22:48:43.904701
2019-04-30 22:48:43,904 [salt.state       :1813][INFO    ][12445] Executing state file.managed for [/etc/nginx/sites-available/nginx_static_stats.conf]
2019-04-30 22:48:44,015 [salt.state       :300 ][INFO    ][12445] File changed:
New file
2019-04-30 22:48:44,016 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-available/nginx_static_stats.conf] at time 22:48:44.016065 duration_in_ms=111.364
2019-04-30 22:48:44,016 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-enabled/nginx_static_stats.conf] at time 22:48:44.016240
2019-04-30 22:48:44,016 [salt.state       :1813][INFO    ][12445] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_static_stats.conf]
2019-04-30 22:48:44,017 [salt.state       :300 ][INFO    ][12445] {'new': '/etc/nginx/sites-enabled/nginx_static_stats.conf'}
2019-04-30 22:48:44,017 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-enabled/nginx_static_stats.conf] at time 22:48:44.017494 duration_in_ms=1.254
2019-04-30 22:48:44,017 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_nova.conf] at time 22:48:44.017885
2019-04-30 22:48:44,018 [salt.state       :1813][INFO    ][12445] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_openstack_api_nova.conf]
2019-04-30 22:48:44,162 [salt.state       :300 ][INFO    ][12445] File changed:
New file
2019-04-30 22:48:44,162 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_nova.conf] at time 22:48:44.162326 duration_in_ms=144.441
2019-04-30 22:48:44,162 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_nova.conf] at time 22:48:44.162512
2019-04-30 22:48:44,162 [salt.state       :1813][INFO    ][12445] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_nova.conf]
2019-04-30 22:48:44,163 [salt.state       :300 ][INFO    ][12445] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_nova.conf'}
2019-04-30 22:48:44,163 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_nova.conf] at time 22:48:44.163954 duration_in_ms=1.442
2019-04-30 22:48:44,164 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-available/nginx_proxy_openstack_api_placement.conf] at time 22:48:44.164383
2019-04-30 22:48:44,164 [salt.state       :1813][INFO    ][12445] Executing state file.managed for [/etc/nginx/sites-available/nginx_proxy_openstack_api_placement.conf]
2019-04-30 22:48:44,308 [salt.state       :300 ][INFO    ][12445] File changed:
New file
2019-04-30 22:48:44,308 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-available/nginx_proxy_openstack_api_placement.conf] at time 22:48:44.308347 duration_in_ms=143.963
2019-04-30 22:48:44,308 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_placement.conf] at time 22:48:44.308531
2019-04-30 22:48:44,308 [salt.state       :1813][INFO    ][12445] Executing state file.symlink for [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_placement.conf]
2019-04-30 22:48:44,309 [salt.state       :300 ][INFO    ][12445] {'new': '/etc/nginx/sites-enabled/nginx_proxy_openstack_api_placement.conf'}
2019-04-30 22:48:44,309 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-enabled/nginx_proxy_openstack_api_placement.conf] at time 22:48:44.309837 duration_in_ms=1.305
2019-04-30 22:48:44,310 [salt.state       :1780][INFO    ][12445] Running state [/usr/sbin/policy-rc.d] at time 22:48:44.310007
2019-04-30 22:48:44,310 [salt.state       :1813][INFO    ][12445] Executing state file.managed for [/usr/sbin/policy-rc.d]
2019-04-30 22:48:44,315 [salt.state       :300 ][INFO    ][12445] File changed:
New file
2019-04-30 22:48:44,315 [salt.state       :1951][INFO    ][12445] Completed state [/usr/sbin/policy-rc.d] at time 22:48:44.315547 duration_in_ms=5.539
2019-04-30 22:48:44,315 [salt.state       :1780][INFO    ][12445] Running state [/usr/sbin/policy-rc.d] at time 22:48:44.315945
2019-04-30 22:48:44,316 [salt.state       :1813][INFO    ][12445] Executing state file.absent for [/usr/sbin/policy-rc.d]
2019-04-30 22:48:44,316 [salt.state       :300 ][INFO    ][12445] {'removed': '/usr/sbin/policy-rc.d'}
2019-04-30 22:48:44,316 [salt.state       :1951][INFO    ][12445] Completed state [/usr/sbin/policy-rc.d] at time 22:48:44.316557 duration_in_ms=0.612
2019-04-30 22:48:44,316 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/conf.d/default.conf] at time 22:48:44.316931
2019-04-30 22:48:44,317 [salt.state       :1813][INFO    ][12445] Executing state file.absent for [/etc/nginx/conf.d/default.conf]
2019-04-30 22:48:44,317 [salt.state       :300 ][INFO    ][12445] File /etc/nginx/conf.d/default.conf is not present
2019-04-30 22:48:44,317 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/conf.d/default.conf] at time 22:48:44.317504 duration_in_ms=0.572
2019-04-30 22:48:44,317 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-enabled/default] at time 22:48:44.317901
2019-04-30 22:48:44,318 [salt.state       :1813][INFO    ][12445] Executing state file.absent for [/etc/nginx/sites-enabled/default]
2019-04-30 22:48:44,318 [salt.state       :300 ][INFO    ][12445] {'removed': '/etc/nginx/sites-enabled/default'}
2019-04-30 22:48:44,318 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-enabled/default] at time 22:48:44.318510 duration_in_ms=0.61
2019-04-30 22:48:44,318 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/sites-available/default] at time 22:48:44.318893
2019-04-30 22:48:44,319 [salt.state       :1813][INFO    ][12445] Executing state file.absent for [/etc/nginx/sites-available/default]
2019-04-30 22:48:44,319 [salt.state       :300 ][INFO    ][12445] {'removed': '/etc/nginx/sites-available/default'}
2019-04-30 22:48:44,319 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/sites-available/default] at time 22:48:44.319486 duration_in_ms=0.593
2019-04-30 22:48:44,319 [salt.state       :1780][INFO    ][12445] Running state [/etc/nginx/nginx.conf] at time 22:48:44.319863
2019-04-30 22:48:44,320 [salt.state       :1813][INFO    ][12445] Executing state file.managed for [/etc/nginx/nginx.conf]
2019-04-30 22:48:44,334 [salt.fileclient  :1219][INFO    ][12445] Fetching file from saltenv 'base', ** done ** 'nginx/files/nginx.conf'
2019-04-30 22:48:44,354 [salt.state       :300 ][INFO    ][12445] File changed:
--- 
+++ 
@@ -1,85 +1,102 @@
 user www-data;
 worker_processes auto;
+worker_rlimit_nofile 20000;
 pid /run/nginx.pid;
 
+
 events {
-	worker_connections 768;
-	# multi_accept on;
+        worker_connections 1024;
+        # multi_accept on;
 }
 
 http {
 
-	##
-	# Basic Settings
-	##
+        ##
+        # Basic Settings
+        ##
 
-	sendfile on;
-	tcp_nopush on;
-	tcp_nodelay on;
-	keepalive_timeout 65;
-	types_hash_max_size 2048;
-	# server_tokens off;
+        sendfile on;
+        tcp_nopush on;
+        tcp_nodelay on;
+        keepalive_timeout 65;
+        types_hash_max_size 2048;
+        server_tokens off;
 
-	# server_names_hash_bucket_size 64;
-	# server_name_in_redirect off;
+        server_names_hash_bucket_size 128;
+        # server_name_in_redirect off;
 
-	include /etc/nginx/mime.types;
-	default_type application/octet-stream;
+        variables_hash_bucket_size 128;
 
-	##
-	# SSL Settings
-	##
+        include /etc/nginx/mime.types;
+        default_type application/octet-stream;
 
-	ssl_protocols TLSv1 TLSv1.1 TLSv1.2; # Dropping SSLv3, ref: POODLE
-	ssl_prefer_server_ciphers on;
+        ##
+        # Logging Settings
+        ##
 
-	##
-	# Logging Settings
-	##
+        access_log /var/log/nginx/access.log;
+        error_log /var/log/nginx/error.log;
 
-	access_log /var/log/nginx/access.log;
-	error_log /var/log/nginx/error.log;
+        ##
+        # Gzip Settings
+        ##
 
-	##
-	# Gzip Settings
-	##
+        gzip on;
+        gzip_disable "msie6";
 
-	gzip on;
-	gzip_disable "msie6";
+        # gzip_vary on;
+        # gzip_proxied any;
+        # gzip_comp_level 6;
+        # gzip_buffers 16 8k;
+        # gzip_http_version 1.1;
+        # gzip_types text/plain text/css application/json application/x-javascript text/xml application/xml application/xml+rss text/javascript;
 
-	# gzip_vary on;
-	# gzip_proxied any;
-	# gzip_comp_level 6;
-	# gzip_buffers 16 8k;
-	# gzip_http_version 1.1;
-	# gzip_types text/plain text/css application/json application/javascript text/xml application/xml application/xml+rss text/javascript;
+        ##
+        # nginx-naxsi config
+        ##
+        # Uncomment it if you installed nginx-naxsi
+        ##
 
-	##
-	# Virtual Host Configs
-	##
+        #include /etc/nginx/naxsi_core.rules;
 
-	include /etc/nginx/conf.d/*.conf;
-	include /etc/nginx/sites-enabled/*;
+        ##
+        # nginx-passenger config
+        ##
+        # Uncomment it if you installed nginx-passenger
+        ##
+
+        #passenger_root /usr;
+        #passenger_ruby /usr/bin/ruby;
+
+
+
+        ##
+        # Virtual Host Configs
+        ##
+
+        include /etc/nginx/conf.d/*.conf;
+        include /etc/nginx/sites-enabled/*.conf;
 }
 
 
+
 #mail {
-#	# See sample authentication script at:
-#	# http://wiki.nginx.org/ImapAuthenticateWithApachePhpScript
-# 
-#	# auth_http localhost/auth.php;
-#	# pop3_capabilities "TOP" "USER";
-#	# imap_capabilities "IMAP4rev1" "UIDPLUS";
-# 
-#	server {
-#		listen     localhost:110;
-#		protocol   pop3;
-#		proxy      on;
-#	}
-# 
-#	server {
-#		listen     localhost:143;
-#		protocol   imap;
-#		proxy      on;
-#	}
+#       # See sample authentication script at:
+#       # http://wiki.nginx.org/ImapAuthenticateWithApachePhpScript
+#
+#       # auth_http localhost/auth.php;
+#       # pop3_capabilities "TOP" "USER";
+#       # imap_capabilities "IMAP4rev1" "UIDPLUS";
+#
+#       server {
+#               listen     localhost:110;
+#               protocol   pop3;
+#               proxy      on;
+#       }
+#
+#       server {
+#               listen     localhost:143;
+#               protocol   imap;
+#               proxy      on;
+#       }
 #}

2019-04-30 22:48:44,354 [salt.state       :1951][INFO    ][12445] Completed state [/etc/nginx/nginx.conf] at time 22:48:44.354308 duration_in_ms=34.444
2019-04-30 22:48:44,354 [salt.state       :1780][INFO    ][12445] Running state [/etc/ssl/private] at time 22:48:44.354703
2019-04-30 22:48:44,354 [salt.state       :1813][INFO    ][12445] Executing state file.directory for [/etc/ssl/private]
2019-04-30 22:48:44,355 [salt.state       :300 ][INFO    ][12445] Directory /etc/ssl/private is in the correct state
Directory /etc/ssl/private updated
2019-04-30 22:48:44,355 [salt.state       :1951][INFO    ][12445] Completed state [/etc/ssl/private] at time 22:48:44.355501 duration_in_ms=0.797
2019-04-30 22:48:44,363 [salt.state       :1780][INFO    ][12445] Running state [openssl dhparam -out /etc/ssl/dhparams.pem 2048] at time 22:48:44.363718
2019-04-30 22:48:44,363 [salt.state       :1813][INFO    ][12445] Executing state cmd.run for [openssl dhparam -out /etc/ssl/dhparams.pem 2048]
2019-04-30 22:48:44,364 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12445] Executing command 'openssl dhparam -out /etc/ssl/dhparams.pem 2048' in directory '/root'
2019-04-30 22:49:07,157 [salt.minion      :1308][INFO    ][1820] User sudo_ubuntu Executing command saltutil.find_job with jid 20190430224907145249
2019-04-30 22:49:07,172 [salt.minion      :1432][INFO    ][13534] Starting a new job with PID 13534
2019-04-30 22:49:07,183 [salt.minion      :1711][INFO    ][13534] Returning information for job: 20190430224907145249
2019-04-30 22:49:36,484 [salt.state       :300 ][INFO    ][12445] {'pid': 13522, 'retcode': 0, 'stderr': "Generating DH parameters, 2048 bit long safe prime, generator 2\nThis is going to take a long time\n[dhparam progress dots elided]++*
++*\nunable to write 'random state'", 'stdout': ''}
2019-04-30 22:49:36,485 [salt.state       :1951][INFO    ][12445] Completed state [openssl dhparam -out /etc/ssl/dhparams.pem 2048] at time 22:49:36.485301 duration_in_ms=52121.583
2019-04-30 22:49:36,488 [salt.state       :1780][INFO    ][12445] Running state [nginx] at time 22:49:36.488460
2019-04-30 22:49:36,488 [salt.state       :1813][INFO    ][12445] Executing state service.running for [nginx]
2019-04-30 22:49:36,489 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12445] Executing command ['systemctl', 'status', 'nginx.service', '-n', '0'] in directory '/root'
2019-04-30 22:49:36,502 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12445] Executing command ['systemctl', 'is-active', 'nginx.service'] in directory '/root'
2019-04-30 22:49:36,512 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12445] Executing command ['systemctl', 'is-enabled', 'nginx.service'] in directory '/root'
2019-04-30 22:49:36,522 [salt.state       :300 ][INFO    ][12445] The service nginx is already running
2019-04-30 22:49:36,523 [salt.state       :1951][INFO    ][12445] Completed state [nginx] at time 22:49:36.523490 duration_in_ms=35.029
2019-04-30 22:49:36,523 [salt.state       :1780][INFO    ][12445] Running state [nginx] at time 22:49:36.523887
2019-04-30 22:49:36,524 [salt.state       :1813][INFO    ][12445] Executing state service.mod_watch for [nginx]
2019-04-30 22:49:36,525 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12445] Executing command ['systemctl', 'is-active', 'nginx.service'] in directory '/root'
2019-04-30 22:49:36,535 [salt.loaded.int.module.cmdmod:395 ][INFO    ][12445] Executing command ['systemd-run', '--scope', 'systemctl', 'restart', 'nginx.service'] in directory '/root'
2019-04-30 22:49:36,643 [salt.state       :300 ][INFO    ][12445] {'nginx': True}
2019-04-30 22:49:36,643 [salt.state       :1951][INFO    ][12445] Completed state [nginx] at time 22:49:36.643667 duration_in_ms=119.777
2019-04-30 22:49:36,646 [salt.minion      :1711][INFO    ][12445] Returning information for job: 20190430224822096887
2019-04-30 22:49:37,347 [salt.minion      :1308][INFO    ][1820] User sudo_ubuntu Executing command pillar.get with jid 20190430224937336117
2019-04-30 22:49:37,359 [salt.minion      :1432][INFO    ][13612] Starting a new job with PID 13612
2019-04-30 22:49:37,364 [salt.minion      :1711][INFO    ][13612] Returning information for job: 20190430224937336117
2019-04-30 22:49:37,974 [salt.minion      :1308][INFO    ][1820] User sudo_ubuntu Executing command cp.push with jid 20190430224937963119
2019-04-30 22:49:37,986 [salt.minion      :1432][INFO    ][13617] Starting a new job with PID 13617
2019-04-30 22:49:38,003 [salt.minion      :1711][INFO    ][13617] Returning information for job: 20190430224937963119
2019-04-30 22:50:27,001 [salt.minion      :1308][INFO    ][1820] User sudo_ubuntu Executing command cp.push_dir with jid 20190430225026989463
2019-04-30 22:50:27,016 [salt.minion      :1432][INFO    ][13644] Starting a new job with PID 13644
