2018-05-27 23:12:25,752 - xtesting.ci.run_tests - INFO - Deployment description:
+--------------------------------------+----------------------------------------------------------+
| ENV VAR                              | VALUE                                                    |
+--------------------------------------+----------------------------------------------------------+
| BUILD_TAG                            | jenkins-functest-fuel-baremetal-daily-master-231         |
| ENERGY_RECORDER_API_URL              | http://energy.opnfv.fr/resources                         |
| ENERGY_RECORDER_API_PASSWORD         |                                                          |
| CI_LOOP                              | daily                                                    |
| TEST_DB_URL                          | http://testresults.opnfv.org/test/api/v1/results         |
| INSTALLER_TYPE                       | fuel                                                     |
| DEPLOY_SCENARIO                      | os-nosdn-nofeature-ha                                    |
| ENERGY_RECORDER_API_USER             |                                                          |
| NODE_NAME                            | lf-pod2                                                  |
+--------------------------------------+----------------------------------------------------------+
2018-05-27 23:12:25,755 - xtesting.ci.run_tests - INFO - Sourcing env file /var/lib/xtesting/conf/env_file
export OS_IDENTITY_API_VERSION=3
export OS_PROJECT_DOMAIN_NAME=Default
export OS_USER_DOMAIN_NAME=Default
export OS_PROJECT_NAME=admin
export OS_TENANT_NAME=admin
export OS_USERNAME=admin
export OS_PASSWORD=opnfv_secret
export OS_REGION_NAME=RegionOne
export OS_INTERFACE=internal
export OS_ENDPOINT_TYPE="internal"
export OS_CACERT="/etc/ssl/certs/mcp_os_cacert"
export VOLUME_DEVICE_NAME=vdc
export OS_AUTH_URL=http://10.167.4.35:35357/v3
2018-05-27 23:12:25,755 - xtesting.ci.run_tests - DEBUG - Test args: all
2018-05-27 23:12:25,756 - xtesting.ci.run_tests - INFO - TESTS TO BE EXECUTED:
+---------------------+---------------+-------------------------------------------+-------------------------------------------------+------------------------------------+
| TIERS               | ORDER         | CI LOOP                                   | DESCRIPTION                                     | TESTCASES                          |
+---------------------+---------------+-------------------------------------------+-------------------------------------------------+------------------------------------+
| healthcheck         | 0             | (merge)|(verify)|(daily)|(weekly)         | First tier to be executed to verify the         | connection_check api_check         |
|                     |               |                                           | basic operations in the VIM.                    | snaps_health_check                 |
+---------------------+---------------+-------------------------------------------+-------------------------------------------------+------------------------------------+
2018-05-27 23:12:25,757 - xtesting.ci.run_tests - INFO - Running tier 'healthcheck'
2018-05-27 23:12:25,757 - xtesting.ci.run_tests - INFO - Running test case 'connection_check'...
2018-05-27 23:12:28,960 - functest.opnfv_tests.openstack.snaps.snaps_test_runner - INFO - Using flavor metadata 'None'
2018-05-27 23:12:42,141 - xtesting.core.unit - DEBUG - test_glance_connect_fail (snaps.openstack.utils.tests.glance_utils_tests.GlanceSmokeTests) ... ok
test_glance_connect_success (snaps.openstack.utils.tests.glance_utils_tests.GlanceSmokeTests) ... ok
test_keystone_connect_fail (snaps.openstack.utils.tests.keystone_utils_tests.KeystoneSmokeTests) ... ok
test_keystone_connect_success (snaps.openstack.utils.tests.keystone_utils_tests.KeystoneSmokeTests) ... ok
test_neutron_connect_fail (snaps.openstack.utils.tests.neutron_utils_tests.NeutronSmokeTests) ... ok
test_neutron_connect_success (snaps.openstack.utils.tests.neutron_utils_tests.NeutronSmokeTests) ... ok
test_retrieve_ext_network_name (snaps.openstack.utils.tests.neutron_utils_tests.NeutronSmokeTests) ... ok
test_nova_connect_fail (snaps.openstack.utils.tests.nova_utils_tests.NovaSmokeTests) ... ok
test_nova_connect_success (snaps.openstack.utils.tests.nova_utils_tests.NovaSmokeTests) ... ok
test_nova_get_hypervisor_hosts (snaps.openstack.utils.tests.nova_utils_tests.NovaSmokeTests) ... ok
test_heat_connect_fail (snaps.openstack.utils.tests.heat_utils_tests.HeatSmokeTests) ... ok
test_heat_connect_success (snaps.openstack.utils.tests.heat_utils_tests.HeatSmokeTests) ... ok
test_cinder_connect_fail (snaps.openstack.utils.tests.cinder_utils_tests.CinderSmokeTests) ... ok
test_cinder_connect_success (snaps.openstack.utils.tests.cinder_utils_tests.CinderSmokeTests) ... ok

----------------------------------------------------------------------
Ran 14 tests in 13.170s

OK
2018-05-27 23:12:42,257 - xtesting.core.testcase - INFO - The results were successfully pushed to DB http://testresults.opnfv.org/test/api/v1/results
2018-05-27 23:12:42,257 - xtesting.ci.run_tests - INFO - Test result:
+--------------------------+------------------+------------------+----------------+
| TEST CASE                | PROJECT          | DURATION         | RESULT         |
+--------------------------+------------------+------------------+----------------+
| connection_check         | functest         | 00:13            | PASS           |
+--------------------------+------------------+------------------+----------------+
2018-05-27 23:12:42,261 - xtesting.ci.run_tests - INFO - Running test case 'api_check'...
2018-05-27 23:12:43,587 - functest.opnfv_tests.openstack.snaps.snaps_test_runner - INFO - Using flavor metadata 'None'
2018-05-27 23:28:45,189 - xtesting.core.unit - DEBUG - test_create_project_minimal (snaps.openstack.utils.tests.keystone_utils_tests.KeystoneUtilsTests) ... ok
test_create_user_minimal (snaps.openstack.utils.tests.keystone_utils_tests.KeystoneUtilsTests) ... ok
test_get_endpoint_fail_without_proper_credentials (snaps.openstack.utils.tests.keystone_utils_tests.KeystoneUtilsTests) ... ok
test_get_endpoint_fail_without_proper_service (snaps.openstack.utils.tests.keystone_utils_tests.KeystoneUtilsTests) ... ok
test_get_endpoint_success (snaps.openstack.utils.tests.keystone_utils_tests.KeystoneUtilsTests) ... ok
test_get_endpoint_with_each_interface (snaps.openstack.utils.tests.keystone_utils_tests.KeystoneUtilsTests) ... ok
test_grant_user_role_to_project (snaps.openstack.utils.tests.keystone_utils_tests.KeystoneUtilsTests) ... ok
test_create_admin_user (snaps.openstack.tests.create_user_tests.CreateUserSuccessTests) ... ok
test_create_delete_user (snaps.openstack.tests.create_user_tests.CreateUserSuccessTests) ... ok
test_create_user (snaps.openstack.tests.create_user_tests.CreateUserSuccessTests) ... ok
test_create_user_2x (snaps.openstack.tests.create_user_tests.CreateUserSuccessTests) ... ok
test_create_delete_project (snaps.openstack.tests.create_project_tests.CreateProjectSuccessTests) ... ok
test_create_project (snaps.openstack.tests.create_project_tests.CreateProjectSuccessTests) ... ok
test_create_project_2x (snaps.openstack.tests.create_project_tests.CreateProjectSuccessTests) ... ok
test_create_project_bad_domain (snaps.openstack.tests.create_project_tests.CreateProjectSuccessTests) ... ok
test_create_project_quota_override (snaps.openstack.tests.create_project_tests.CreateProjectSuccessTests) ... ok
test_update_quotas (snaps.openstack.tests.create_project_tests.CreateProjectSuccessTests) ... ok
test_create_project_sec_grp_one_user (snaps.openstack.tests.create_project_tests.CreateProjectUserTests) ... ok
test_create_project_sec_grp_two_users (snaps.openstack.tests.create_project_tests.CreateProjectUserTests) ... ok
test_create_image_minimal_file (snaps.openstack.utils.tests.glance_utils_tests.GlanceUtilsTests) ... ok
test_create_image_minimal_url (snaps.openstack.utils.tests.glance_utils_tests.GlanceUtilsTests) ... ok
test_create_network (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsNetworkTests) ... ok
test_create_network_empty_name (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsNetworkTests) ... ok
test_create_network_null_name (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsNetworkTests) ... ok
test_create_subnet (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsSubnetTests) ... ok
test_create_subnet_empty_cidr (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsSubnetTests) ... ok
test_create_subnet_empty_name (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsSubnetTests) ... ok
test_create_subnet_null_cidr (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsSubnetTests) ... ok
test_create_subnet_null_name (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsSubnetTests) ... ok
test_add_interface_router (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsRouterTests) ... ok
test_add_interface_router_missing_subnet (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsRouterTests) ... ok
test_add_interface_router_null_router (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsRouterTests) ... ok
test_add_interface_router_null_subnet (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsRouterTests) ... ok
test_create_port (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsRouterTests) ... ok
test_create_port_empty_name (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsRouterTests) ... ok
test_create_port_invalid_ip (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsRouterTests) ... ok
test_create_port_invalid_ip_to_subnet (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsRouterTests) ... ok
test_create_port_null_ip (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsRouterTests) ... ok
test_create_port_null_name (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsRouterTests) ... ok
test_create_port_null_network_object (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsRouterTests) ... ok
test_create_router_simple (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsRouterTests) ... ok
test_create_router_with_public_interface (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsRouterTests) ... ok
test_create_delete_simple_sec_grp (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsSecurityGroupTests) ... ok
test_create_list_sec_grp_no_rules (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsSecurityGroupTests) ... ok
test_create_sec_grp_no_name (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsSecurityGroupTests) ... ok
test_create_sec_grp_no_rules (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsSecurityGroupTests) ... ok
test_create_sec_grp_one_rule (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsSecurityGroupTests) ... ok
test_get_sec_grp_by_id (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsSecurityGroupTests) ... ok
test_floating_ips (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsFloatingIpTests) ... ok
test_create_delete_keypair (snaps.openstack.utils.tests.nova_utils_tests.NovaUtilsKeypairTests) ... ok
test_create_key_from_file (snaps.openstack.utils.tests.nova_utils_tests.NovaUtilsKeypairTests) ... ok
test_create_keypair (snaps.openstack.utils.tests.nova_utils_tests.NovaUtilsKeypairTests) ... ok
test_create_delete_flavor (snaps.openstack.utils.tests.nova_utils_tests.NovaUtilsFlavorTests) ... ok
test_create_flavor (snaps.openstack.utils.tests.nova_utils_tests.NovaUtilsFlavorTests) ... ok
test_create_instance (snaps.openstack.utils.tests.nova_utils_tests.NovaUtilsInstanceTests) ... ok
test_add_remove_volume (snaps.openstack.utils.tests.nova_utils_tests.NovaUtilsInstanceVolumeTests) ... ok
test_attach_volume_nowait (snaps.openstack.utils.tests.nova_utils_tests.NovaUtilsInstanceVolumeTests) ... ok
test_detach_volume_nowait (snaps.openstack.utils.tests.nova_utils_tests.NovaUtilsInstanceVolumeTests) ... ok
test_create_clean_flavor (snaps.openstack.tests.create_flavor_tests.CreateFlavorTests) ... ok
test_create_delete_flavor (snaps.openstack.tests.create_flavor_tests.CreateFlavorTests) ... ok
test_create_flavor (snaps.openstack.tests.create_flavor_tests.CreateFlavorTests) ... ok
test_create_flavor_all_settings (snaps.openstack.tests.create_flavor_tests.CreateFlavorTests) ... ok
test_create_flavor_existing (snaps.openstack.tests.create_flavor_tests.CreateFlavorTests) ... ok
test_create_stack (snaps.openstack.utils.tests.heat_utils_tests.HeatUtilsCreateSimpleStackTests) ... ok
test_create_stack_x2 (snaps.openstack.utils.tests.heat_utils_tests.HeatUtilsCreateSimpleStackTests) ... ok
test_get_settings_from_stack (snaps.openstack.utils.tests.heat_utils_tests.HeatUtilsCreateComplexStackTests) ... ok
test_create_flavor_with_stack (snaps.openstack.utils.tests.heat_utils_tests.HeatUtilsFlavorTests) ... ok
test_create_keypair_with_stack (snaps.openstack.utils.tests.heat_utils_tests.HeatUtilsKeypairTests) ... ok
test_create_security_group_with_stack (snaps.openstack.utils.tests.heat_utils_tests.HeatUtilsSecurityGroupTests) ... ok
test_create_delete_qos (snaps.openstack.utils.tests.cinder_utils_tests.CinderUtilsQoSTests) ... ok
test_create_qos_back (snaps.openstack.utils.tests.cinder_utils_tests.CinderUtilsQoSTests) ... ok
test_create_qos_both (snaps.openstack.utils.tests.cinder_utils_tests.CinderUtilsQoSTests) ... ok
test_create_qos_front (snaps.openstack.utils.tests.cinder_utils_tests.CinderUtilsQoSTests) ... ok
test_create_delete_volume (snaps.openstack.utils.tests.cinder_utils_tests.CinderUtilsVolumeTests) ... ok
test_create_simple_volume (snaps.openstack.utils.tests.cinder_utils_tests.CinderUtilsVolumeTests) ... ok
test_create_delete_volume_type (snaps.openstack.utils.tests.cinder_utils_tests.CinderUtilsSimpleVolumeTypeTests) ... ok
test_create_simple_volume_type (snaps.openstack.utils.tests.cinder_utils_tests.CinderUtilsSimpleVolumeTypeTests) ... ok
test_create_bad_key_size (snaps.openstack.utils.tests.cinder_utils_tests.CinderUtilsAddEncryptionTests) ... ok
test_create_delete_encryption (snaps.openstack.utils.tests.cinder_utils_tests.CinderUtilsAddEncryptionTests) ... ok
test_create_simple_encryption (snaps.openstack.utils.tests.cinder_utils_tests.CinderUtilsAddEncryptionTests) ... ok
test_create_with_all_attrs (snaps.openstack.utils.tests.cinder_utils_tests.CinderUtilsAddEncryptionTests) ... ok
test_create_with_encryption (snaps.openstack.utils.tests.cinder_utils_tests.CinderUtilsVolumeTypeCompleteTests) ... ok
test_create_with_invalid_qos (snaps.openstack.utils.tests.cinder_utils_tests.CinderUtilsVolumeTypeCompleteTests) ... ok
test_create_with_qos (snaps.openstack.utils.tests.cinder_utils_tests.CinderUtilsVolumeTypeCompleteTests) ... ok
test_create_with_qos_and_encryption (snaps.openstack.utils.tests.cinder_utils_tests.CinderUtilsVolumeTypeCompleteTests) ... ok

----------------------------------------------------------------------
Ran 85 tests in 961.547s

OK
2018-05-27 23:28:45,312 - xtesting.core.testcase - INFO - The results were successfully pushed to DB http://testresults.opnfv.org/test/api/v1/results
2018-05-27 23:28:45,314 - xtesting.ci.run_tests - INFO - Test result:
+-------------------+------------------+------------------+----------------+
| TEST CASE         | PROJECT          | DURATION         | RESULT         |
+-------------------+------------------+------------------+----------------+
| api_check         | functest         | 16:02            | PASS           |
+-------------------+------------------+------------------+----------------+
2018-05-27 23:28:45,318 - xtesting.ci.run_tests - INFO - Running test case 'snaps_health_check'...
2018-05-27 23:28:46,867 - functest.opnfv_tests.openstack.snaps.snaps_test_runner - INFO - Using flavor metadata 'None'
2018-05-27 23:29:48,423 - xtesting.core.unit - DEBUG - test_check_vm_ip_dhcp (snaps.openstack.tests.create_instance_tests.SimpleHealthCheck) ... ok

----------------------------------------------------------------------
Ran 1 test in 61.554s

OK
2018-05-27 23:29:48,539 - xtesting.core.testcase - INFO - The results were successfully pushed to DB http://testresults.opnfv.org/test/api/v1/results
2018-05-27 23:29:48,539 - xtesting.ci.run_tests - INFO - Test result:
+----------------------------+------------------+------------------+----------------+
| TEST CASE                  | PROJECT          | DURATION         | RESULT         |
+----------------------------+------------------+------------------+----------------+
| snaps_health_check         | functest         | 01:02            | PASS           |
+----------------------------+------------------+------------------+----------------+
2018-05-27 23:29:48,543 - xtesting.ci.run_tests - INFO - Xtesting report:
+----------------------------+------------------+---------------------+------------------+----------------+
| TEST CASE                  | PROJECT          | TIER                | DURATION         | RESULT         |
+----------------------------+------------------+---------------------+------------------+----------------+
| connection_check           | functest         | healthcheck         | 00:13            | PASS           |
| api_check                  | functest         | healthcheck         | 16:02            | PASS           |
| snaps_health_check         | functest         | healthcheck         | 01:02            | PASS           |
+----------------------------+------------------+---------------------+------------------+----------------+
2018-05-27 23:29:48,546 - xtesting.ci.run_tests - INFO - Execution exit value: Result.EX_OK
2018-05-27 23:30:10,497 - xtesting.ci.run_tests - INFO - Deployment description:
+--------------------------------------+----------------------------------------------------------+
| ENV VAR                              | VALUE                                                    |
+--------------------------------------+----------------------------------------------------------+
| BUILD_TAG                            | jenkins-functest-fuel-baremetal-daily-master-231         |
| ENERGY_RECORDER_API_URL              | http://energy.opnfv.fr/resources                         |
| ENERGY_RECORDER_API_PASSWORD         |                                                          |
| CI_LOOP                              | daily                                                    |
| TEST_DB_URL                          | http://testresults.opnfv.org/test/api/v1/results         |
| INSTALLER_TYPE                       | fuel                                                     |
| DEPLOY_SCENARIO                      | os-nosdn-nofeature-ha                                    |
| ENERGY_RECORDER_API_USER             |                                                          |
| NODE_NAME                            | lf-pod2                                                  |
+--------------------------------------+----------------------------------------------------------+
2018-05-27 23:30:10,501 - xtesting.ci.run_tests - INFO - Sourcing env file /var/lib/xtesting/conf/env_file
export OS_IDENTITY_API_VERSION=3
export OS_PROJECT_DOMAIN_NAME=Default
export OS_USER_DOMAIN_NAME=Default
export OS_PROJECT_NAME=admin
export OS_TENANT_NAME=admin
export OS_USERNAME=admin
export OS_PASSWORD=opnfv_secret
export OS_REGION_NAME=RegionOne
export OS_INTERFACE=internal
export OS_ENDPOINT_TYPE="internal"
export OS_CACERT="/etc/ssl/certs/mcp_os_cacert"
export VOLUME_DEVICE_NAME=vdc
export OS_AUTH_URL=http://10.167.4.35:35357/v3
2018-05-27 23:30:10,501 - xtesting.ci.run_tests - DEBUG - Test args: all
2018-05-27 23:30:10,502 - xtesting.ci.run_tests - INFO - TESTS TO BE EXECUTED:
+---------------+---------------+--------------------------+------------------------------------------+----------------------------------------------+
| TIERS         | ORDER         | CI LOOP                  | DESCRIPTION                              | TESTCASES                                    |
+---------------+---------------+--------------------------+------------------------------------------+----------------------------------------------+
| smoke         | 1             | (daily)|(weekly)         | Set of basic Functional tests to         | vping_ssh vping_userdata cinder_test         |
|               |               |                          | validate the OPNFV scenarios.            | tempest_smoke_serial rally_sanity            |
|               |               |                          |                                          | patrole snaps_smoke neutron_trunk            |
+---------------+---------------+--------------------------+------------------------------------------+----------------------------------------------+
2018-05-27 23:30:10,504 - xtesting.ci.run_tests - INFO - Running tier 'smoke'
2018-05-27 23:30:10,504 - xtesting.ci.run_tests - INFO - Running test case 'vping_ssh'...
2018-05-27 23:30:12,948 - functest.opnfv_tests.openstack.vping.vping_base - DEBUG - ext_net: Munch({u'status': u'ACTIVE', u'subnets': [u'7b531737-77a7-41af-93de-a06ade38f99f'], u'description': u'', u'provider:physical_network': u'physnet1', u'tags': [], u'ipv6_address_scope': None, u'updated_at': u'2018-05-27T23:10:42Z', u'is_default': True, u'revision_number': 4, u'port_security_enabled': True, u'mtu': 1500, u'id': u'11c92fd4-326a-487a-a640-1b09c88fcb5b', u'provider:segmentation_id': None, u'router:external': True, u'availability_zone_hints': [], u'availability_zones': [u'nova'], u'name': u'floating_net', u'admin_state_up': True, u'tenant_id': u'17e0c72255804297b05647b8b64ec56a', u'created_at': u'2018-05-27T23:10:32Z', u'provider:network_type': u'flat', u'ipv4_address_scope': None, u'shared': False, u'project_id': u'17e0c72255804297b05647b8b64ec56a'})
2018-05-27 23:30:13,321 - xtesting.energy.energy - INFO - API recorder available at : http://energy.opnfv.fr/resources/recorders/environment/lf-pod2
2018-05-27 23:30:13,321 - xtesting.energy.energy - DEBUG - Getting current scenario
2018-05-27 23:30:13,759 - xtesting.energy.energy - DEBUG - Starting recording
2018-05-27 23:30:13,759 - xtesting.energy.energy - DEBUG - Submitting scenario (vping_ssh/running)
2018-05-27 23:30:14,147 - functest.opnfv_tests.openstack.vping.vping_ssh - INFO - Begin virtual environment setup
2018-05-27 23:30:14,147 - functest.opnfv_tests.openstack.vping.vping_ssh - INFO - vPing Start Time:'2018-05-27 23:30:14'
2018-05-27 23:30:14,148 - functest.opnfv_tests.openstack.vping.vping_ssh - INFO - Creating image with name: 'functest-vping--597c6a75-8748-4a69-b031-f5ba6b127fc1'
2018-05-27 23:30:14,148 - functest.opnfv_tests.openstack.vping.vping_ssh - INFO - Image metadata: None
2018-05-27 23:30:15,700 - functest.opnfv_tests.openstack.vping.vping_ssh - DEBUG - image: Munch({u'status': u'queued', u'owner_specified.shade.object': u'images/functest-vping--597c6a75-8748-4a69-b031-f5ba6b127fc1', u'tags': [], u'container_format': u'bare', u'min_ram': 0, u'updated_at': u'2018-05-27T23:30:15Z', u'owner_specified.shade.sha256': u'a8dd75ecffd4cdd96072d60c2237b448e0c8b2bc94d57f10fdbc8c481d9005b8', u'locations': [], u'min_disk': 0, u'visibility': u'shared', u'file': u'/v2/images/f0d12337-f28a-42db-a696-5587b57bab0b/file', u'owner':
u'17e0c72255804297b05647b8b64ec56a', u'virtual_size': None, u'owner_specified.shade.md5': u'443b7623e27ecf03dc9e01ee93f67afe', u'id': u'f0d12337-f28a-42db-a696-5587b57bab0b', u'size': None, u'name': u'functest-vping--597c6a75-8748-4a69-b031-f5ba6b127fc1', u'checksum': None, u'self': u'/v2/images/f0d12337-f28a-42db-a696-5587b57bab0b', u'disk_format': u'qcow2', u'protected': False, u'created_at': u'2018-05-27T23:30:15Z', u'schema': u'/v2/schemas/image'}) 2018-05-27 23:30:15,701 - functest.opnfv_tests.openstack.vping.vping_ssh - INFO - Creating network with name: 'vping-net-597c6a75-8748-4a69-b031-f5ba6b127fc1' 2018-05-27 23:30:16,389 - functest.opnfv_tests.openstack.vping.vping_ssh - DEBUG - network: Munch({u'status': u'ACTIVE', u'subnets': [], u'description': u'', u'provider:physical_network': None, u'tags': [], u'ipv6_address_scope': None, u'updated_at': u'2018-05-27T23:30:16Z', u'is_default': False, u'revision_number': 2, u'port_security_enabled': True, u'provider:network_type': u'vxlan', u'id': u'0dac3d04-be25-4103-a913-e61041a37f14', u'provider:segmentation_id': 33, u'router:external': False, u'availability_zone_hints': [], u'availability_zones': [], u'name': u'vping-net-597c6a75-8748-4a69-b031-f5ba6b127fc1', u'admin_state_up': True, u'tenant_id': u'17e0c72255804297b05647b8b64ec56a', u'created_at': u'2018-05-27T23:30:16Z', u'mtu': 1450, u'ipv4_address_scope': None, u'shared': False, u'project_id': u'17e0c72255804297b05647b8b64ec56a'}) 2018-05-27 23:30:17,829 - functest.opnfv_tests.openstack.vping.vping_ssh - DEBUG - subnet: Munch({u'description': u'', u'tags': [], u'updated_at': u'2018-05-27T23:30:17Z', u'ipv6_ra_mode': None, u'allocation_pools': [{u'start': u'192.168.130.2', u'end': u'192.168.130.254'}], u'host_routes': [], u'revision_number': 0, u'ipv6_address_mode': None, u'cidr': u'192.168.130.0/24', u'id': u'901b9e09-e2c5-4e68-9473-1f24082cddb5', u'subnetpool_id': None, u'service_types': [], u'name': u'vping-subnet-597c6a75-8748-4a69-b031-f5ba6b127fc1', u'enable_dhcp': True, u'network_id': u'0dac3d04-be25-4103-a913-e61041a37f14', u'tenant_id': u'17e0c72255804297b05647b8b64ec56a', u'created_at': u'2018-05-27T23:30:17Z', u'dns_nameservers': [u'8.8.8.8'], u'gateway_ip': u'192.168.130.1', u'ip_version': 4, u'project_id': u'17e0c72255804297b05647b8b64ec56a'}) 2018-05-27 23:30:17,829 - functest.opnfv_tests.openstack.vping.vping_ssh - INFO - Creating router with name: 'vping-router-597c6a75-8748-4a69-b031-f5ba6b127fc1' 2018-05-27 23:30:21,254 - functest.opnfv_tests.openstack.vping.vping_ssh - DEBUG - router: Munch({u'status': u'ACTIVE', u'description': u'', u'tags': [], u'updated_at': u'2018-05-27T23:30:20Z', u'revision_number': 2, u'ha': False, u'id': u'914ac656-b88e-4bde-97a1-559d9bf2e459', u'external_gateway_info': {u'network_id': u'11c92fd4-326a-487a-a640-1b09c88fcb5b', u'enable_snat': True, u'external_fixed_ips': [{u'subnet_id': u'7b531737-77a7-41af-93de-a06ade38f99f', u'ip_address': u'172.30.10.119'}]}, u'availability_zone_hints': [], u'availability_zones': [], u'name': u'vping-router-597c6a75-8748-4a69-b031-f5ba6b127fc1', u'admin_state_up': True, u'tenant_id': u'17e0c72255804297b05647b8b64ec56a', u'created_at': u'2018-05-27T23:30:18Z', u'distributed': False, u'flavor_id': None, u'routes': [], u'project_id': u'17e0c72255804297b05647b8b64ec56a'}) 2018-05-27 23:30:25,111 - functest.opnfv_tests.openstack.vping.vping_ssh - INFO - Creating flavor with name: 'vping-flavor-597c6a75-8748-4a69-b031-f5ba6b127fc1' 2018-05-27 23:30:25,342 - functest.opnfv_tests.openstack.vping.vping_ssh - DEBUG 
- flavor: Munch({'name': u'vping-flavor-597c6a75-8748-4a69-b031-f5ba6b127fc1', 'ephemeral': 0, 'ram': 512, 'is_disabled': False, 'properties': Munch({u'OS-FLV-DISABLED:disabled': False, u'OS-FLV-EXT-DATA:ephemeral': 0, u'os-flavor-access:is_public': True}), u'OS-FLV-DISABLED:disabled': False, 'vcpus': 1, 'extra_specs': Munch({}), 'location': Munch({'project': Munch({'domain_id': None, 'id': u'17e0c72255804297b05647b8b64ec56a', 'name': 'admin', 'domain_name': 'Default'}), 'zone': None, 'region_name': 'RegionOne', 'cloud': 'envvars'}), u'os-flavor-access:is_public': True, 'rxtx_factor': 1.0, 'is_public': True, u'OS-FLV-EXT-DATA:ephemeral': 0, 'disk': 1, 'id': u'bfe22903-ffec-4064-a678-7218ce0187bf', 'swap': 0}) 2018-05-27 23:30:26,703 - functest.opnfv_tests.openstack.vping.vping_ssh - INFO - Creating VM 1 instance with name: 'opnfv-vping-1-597c6a75-8748-4a69-b031-f5ba6b127fc1' 2018-05-27 23:30:41,355 - functest.opnfv_tests.openstack.vping.vping_ssh - DEBUG - vm1: Munch({'vm_state': u'active', u'OS-EXT-STS:task_state': None, 'addresses': Munch({u'vping-net-597c6a75-8748-4a69-b031-f5ba6b127fc1': [Munch({u'OS-EXT-IPS-MAC:mac_addr': u'fa:16:3e:ac:93:e7', u'version': 4, u'addr': u'192.168.130.12', u'OS-EXT-IPS:type': u'fixed'})]}), 'terminated_at': None, 'image': Munch({u'id': u'f0d12337-f28a-42db-a696-5587b57bab0b'}), u'OS-EXT-AZ:availability_zone': u'nova', u'OS-EXT-STS:vm_state': u'active', u'OS-EXT-SRV-ATTR:instance_name': u'instance-00000020', u'OS-SRV-USG:launched_at': u'2018-05-27T23:30:39.000000', 'flavor': Munch({u'id': u'bfe22903-ffec-4064-a678-7218ce0187bf'}), 'az': u'nova', 'id': u'878ca4dc-c879-4726-ad54-4e5b8687b6e1', 'security_groups': [Munch({u'name': u'vping-sg-597c6a75-8748-4a69-b031-f5ba6b127fc1'})], u'os-extended-volumes:volumes_attached': [], 'user_id': u'37b0f068b13143159c3fbcc2c9fc04f8', 'disk_config': u'MANUAL', u'OS-DCF:diskConfig': u'MANUAL', 'networks': {}, 'accessIPv4': '', 'accessIPv6': '', 'location': Munch({'project': Munch({'domain_id': None, 'id': u'17e0c72255804297b05647b8b64ec56a', 'name': 'admin', 'domain_name': 'Default'}), 'zone': u'nova', 'region_name': 'RegionOne', 'cloud': 'envvars'}), 'power_state': 1, 'public_v4': '', 'progress': 0, u'OS-EXT-STS:power_state': 1, 'interface_ip': '', 'launched_at': u'2018-05-27T23:30:39.000000', 'metadata': Munch({}), 'status': u'ACTIVE', 'updated': u'2018-05-27T23:30:40Z', 'hostId': u'dd2bee3dd1b165d48bb7d88e23cbc9260b01f25e031acdb6c31db7fb', u'OS-EXT-SRV-ATTR:host': u'cmp002', u'OS-SRV-USG:terminated_at': None, 'key_name': None, 'public_v6': '', 'private_v4': u'192.168.130.12', 'cloud': 'envvars', 'host_id': u'dd2bee3dd1b165d48bb7d88e23cbc9260b01f25e031acdb6c31db7fb', 'task_state': None, 'properties': Munch({u'OS-EXT-STS:task_state': None, u'OS-EXT-SRV-ATTR:host': u'cmp002', u'OS-SRV-USG:terminated_at': None, u'OS-DCF:diskConfig': u'MANUAL', u'os-extended-volumes:volumes_attached': [], u'OS-EXT-STS:vm_state': u'active', u'OS-EXT-SRV-ATTR:instance_name': u'instance-00000020', u'OS-SRV-USG:launched_at': u'2018-05-27T23:30:39.000000', u'OS-EXT-SRV-ATTR:hypervisor_hostname': u'cmp002.mcp-pike-ovs-ha.local', u'OS-EXT-STS:power_state': 1, u'OS-EXT-AZ:availability_zone': u'nova'}), 'project_id': u'17e0c72255804297b05647b8b64ec56a', u'OS-EXT-SRV-ATTR:hypervisor_hostname': u'cmp002.mcp-pike-ovs-ha.local', 'name': u'opnfv-vping-1-597c6a75-8748-4a69-b031-f5ba6b127fc1', 'adminPass': u'UMDRhwy8WW34', 'tenant_id': u'17e0c72255804297b05647b8b64ec56a', 'created_at': u'2018-05-27T23:30:30Z', 'created': u'2018-05-27T23:30:30Z', 
'has_config_drive': True, 'volumes': [], 'config_drive': u'True', 'region': 'RegionOne'}) 2018-05-27 23:30:44,240 - functest.opnfv_tests.openstack.vping.vping_ssh - DEBUG - vm1 console: [ 0.000000] Initializing cgroup subsys cpuset [ 0.000000] Initializing cgroup subsys cpu [ 0.000000] Initializing cgroup subsys cpuacct [ 0.000000] Linux version 4.4.0-28-generic (buildd@lcy01-13) (gcc version 5.3.1 20160413 (Ubuntu 5.3.1-14ubuntu2.1) ) #47-Ubuntu SMP Fri Jun 24 10:09:13 UTC 2016 (Ubuntu 4.4.0-28.47-generic 4.4.13) [ 0.000000] Command line: LABEL=cirros-rootfs ro console=tty1 console=ttyS0 [ 0.000000] KERNEL supported cpus: [ 0.000000] Intel GenuineIntel [ 0.000000] AMD AuthenticAMD [ 0.000000] Centaur CentaurHauls [ 0.000000] x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 [ 0.000000] x86/fpu: Supporting XSAVE feature 0x01: 'x87 floating point registers' [ 0.000000] x86/fpu: Supporting XSAVE feature 0x02: 'SSE registers' [ 0.000000] x86/fpu: Supporting XSAVE feature 0x04: 'AVX registers' [ 0.000000] x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format. [ 0.000000] x86/fpu: Using 'eager' FPU context switches. [ 0.000000] e820: BIOS-provided physical RAM map: [ 0.000000] BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable [ 0.000000] BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved [ 0.000000] BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved [ 0.000000] BIOS-e820: [mem 0x0000000000100000-0x000000001ffd9fff] usable [ 0.000000] BIOS-e820: [mem 0x000000001ffda000-0x000000001fffffff] reserved [ 0.000000] BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved [ 0.000000] BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved [ 0.000000] NX (Execute Disable) protection: active [ 0.000000] SMBIOS 2.8 present. 
[ 0.000000] Hypervisor detected: KVM [ 0.000000] e820: last_pfn = 0x1ffda max_arch_pfn = 0x400000000 [ 0.000000] x86/PAT: Configuration [0-7]: WB WC UC- UC WB WC UC- WT [ 0.000000] found SMP MP-table at [mem 0x000f6a40-0x000f6a4f] mapped at [ffff8800000f6a40] [ 0.000000] Scanning 1 areas for low memory corruption [ 0.000000] Using GB pages for direct mapping [ 0.000000] RAMDISK: [mem 0x1fb14000-0x1ffc9fff] [ 0.000000] ACPI: Early table checksum verification disabled [ 0.000000] ACPI: RSDP 0x00000000000F6830 000014 (v00 BOCHS ) [ 0.000000] ACPI: RSDT 0x000000001FFE1591 00002C (v01 BOCHS BXPCRSDT 00000001 BXPC 00000001) [ 0.000000] ACPI: FACP 0x000000001FFE1425 000074 (v01 BOCHS BXPCFACP 00000001 BXPC 00000001) [ 0.000000] ACPI: DSDT 0x000000001FFE0040 0013E5 (v01 BOCHS BXPCDSDT 00000001 BXPC 00000001) [ 0.000000] ACPI: FACS 0x000000001FFE0000 000040 [ 0.000000] ACPI: APIC 0x000000001FFE1519 000078 (v01 BOCHS BXPCAPIC 00000001 BXPC 00000001) [ 0.000000] No NUMA configuration found [ 0.000000] Faking a node at [mem 0x0000000000000000-0x000000001ffd9fff] [ 0.000000] NODE_DATA(0) allocated [mem 0x1ffd5000-0x1ffd9fff] [ 0.000000] kvm-clock: Using msrs 4b564d01 and 4b564d00 [ 0.000000] kvm-clock: cpu 0, msr 0:1ffd1001, primary cpu clock [ 0.000000] kvm-clock: using sched offset of 972157832 cycles [ 0.000000] clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns [ 0.000000] Zone ranges: [ 0.000000] DMA [mem 0x0000000000001000-0x0000000000ffffff] [ 0.000000] DMA32 [mem 0x0000000001000000-0x000000001ffd9fff] [ 0.000000] Normal empty [ 0.000000] Device empty [ 0.000000] Movable zone start for each node [ 0.000000] Early memory node ranges [ 0.000000] node 0: [mem 0x0000000000001000-0x000000000009efff] [ 0.000000] node 0: [mem 0x0000000000100000-0x000000001ffd9fff] [ 0.000000] Initmem setup node 0 [mem 0x0000000000001000-0x000000001ffd9fff] [ 0.000000] ACPI: PM-Timer IO Port: 0x608 [ 0.000000] ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) [ 0.000000] IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) [ 0.000000] Using ACPI (MADT) for SMP configuration information [ 0.000000] smpboot: Allowing 1 CPUs, 0 hotplug CPUs [ 0.000000] PM: Registered nosave memory: [mem 0x00000000-0x00000fff] [ 0.000000] PM: Registered nosave memory: [mem 0x0009f000-0x0009ffff] [ 0.000000] PM: Registered nosave memory: [mem 0x000a0000-0x000effff] [ 0.000000] PM: Registered nosave memory: [mem 0x000f0000-0x000fffff] [ 0.000000] e820: [mem 0x20000000-0xfeffbfff] available for PCI devices [ 0.000000] Booting paravirtualized kernel on KVM [ 0.000000] clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 7645519600211568 ns [ 0.000000] setup_percpu: NR_CPUS:256 nr_cpumask_bits:256 nr_cpu_ids:1 nr_node_ids:1 [ 0.000000] PERCPU: Embedded 33 pages/cpu @ffff88001f800000 s98008 r8192 d28968 u2097152 [ 0.000000] KVM setup async PF for cpu 0 [ 0.000000] kvm-stealtime: cpu 0, msr 1f80d940 [ 0.000000] PV qspinlock hash table entries: 256 (order: 0, 4096 bytes) [ 0.000000] Built 1 zonelists in Node order, mobility grouping on. 
Total pages: 128867 [ 0.000000] Policy zone: DMA32 [ 0.000000] Kernel command line: LABEL=cirros-rootfs ro console=tty1 console=ttyS0 [ 0.000000] PID hash table entries: 2048 (order: 2, 16384 bytes) [ 0.000000] Memory: 491788K/523744K available (8368K kernel code, 1280K rwdata, 3928K rodata, 1480K init, 1292K bss, 31956K reserved, 0K cma-reserved) [ 0.000000] SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=1, Nodes=1 [ 0.000000] Hierarchical RCU implementation. [ 0.000000] Build-time adjustment of leaf fanout to 64. [ 0.000000] RCU restricting CPUs from NR_CPUS=256 to nr_cpu_ids=1. [ 0.000000] RCU: Adjusting geometry for rcu_fanout_leaf=64, nr_cpu_ids=1 [ 0.000000] NR_IRQS:16640 nr_irqs:256 16 [ 0.000000] Console: colour VGA+ 80x25 [ 0.000000] console [tty1] enabled [ 0.000000] console [ttyS0] enabled [ 0.000000] tsc: Detected 3491.912 MHz processor [ 0.280587] Calibrating delay loop (skipped) preset value.. 6983.82 BogoMIPS (lpj=13967648) [ 0.283757] pid_max: default: 32768 minimum: 301 [ 0.285645] ACPI: Core revision 20150930 [ 0.289768] ACPI: 1 ACPI AML tables successfully acquired and loaded [ 0.292189] Security Framework initialized [ 0.293758] Yama: becoming mindful. [ 0.295128] AppArmor: AppArmor initialized [ 0.296564] Dentry cache hash table entries: 65536 (order: 7, 524288 bytes) [ 0.299373] Inode-cache hash table entries: 32768 (order: 6, 262144 bytes) [ 0.301894] Mount-cache hash table entries: 1024 (order: 1, 8192 bytes) [ 0.304261] Mountpoint-cache hash table entries: 1024 (order: 1, 8192 bytes) [ 0.307012] Initializing cgroup subsys io [ 0.308634] Initializing cgroup subsys memory [ 0.310262] Initializing cgroup subsys devices [ 0.312015] Initializing cgroup subsys freezer [ 0.313696] Initializing cgroup subsys net_cls [ 0.315385] Initializing cgroup subsys perf_event [ 0.317158] Initializing cgroup subsys net_prio [ 0.318855] Initializing cgroup subsys hugetlb [ 0.320543] Initializing cgroup subsys pids [ 0.322199] CPU: Physical Processor ID: 0 [ 0.324441] mce: CPU supports 10 MCE banks [ 0.326063] Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 [ 0.328015] Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 [ 0.348109] Freeing SMP alternatives memory: 28K (ffffffff820b4000 - ffffffff820bb000) [ 0.359219] ftrace: allocating 31920 entries in 125 pages [ 0.412616] smpboot: Max logical packages: 1 [ 0.414222] smpboot: APIC(0) Converting physical 0 to logical package 0 [ 0.416841] x2apic enabled [ 0.418314] Switched APIC routing to physical x2apic. [ 0.421476] ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 [ 0.423798] smpboot: CPU0: Intel(R) Xeon(R) CPU E5-2637 v3 @ 3.50GHz (family: 0x6, model: 0x3f, stepping: 0x2) [ 0.427824] Performance Events: 16-deep LBR, Haswell events, Intel PMU driver. [ 0.430999] ... version: 2 [ 0.432661] ... bit width: 48 [ 0.434176] ... generic registers: 4 [ 0.435411] ... value mask: 0000ffffffffffff [ 0.436945] ... max period: 000000007fffffff [ 0.438469] ... fixed-purpose events: 3 [ 0.439707] ... 
event mask: 000000070000000f [ 0.441283] KVM setup paravirtual spinlock [ 0.443222] x86: Booted up 1 node, 1 CPUs [ 0.444470] smpboot: Total of 1 processors activated (6983.82 BogoMIPS) [ 0.446649] devtmpfs: initialized [ 0.458783] evm: security.selinux [ 0.459874] evm: security.SMACK64 [ 0.460970] evm: security.SMACK64EXEC [ 0.462126] evm: security.SMACK64TRANSMUTE [ 0.463379] evm: security.SMACK64MMAP [ 0.464548] evm: security.ima [ 0.465549] evm: security.capability [ 0.466878] clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 7645041785100000 ns [ 0.469750] pinctrl core: initialized pinctrl subsystem [ 0.471464] RTC time: 23:30:40, date: 05/27/18 [ 0.472957] NET: Registered protocol family 16 [ 0.474565] cpuidle: using governor ladder [ 0.475830] cpuidle: using governor menu [ 0.477060] PCCT header not found. [ 0.478274] ACPI: bus type PCI registered [ 0.479547] acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 [ 0.481550] PCI: Using configuration type 1 for base access [ 0.483244] core: PMU erratum BJ122, BV98, HSD29 workaround disabled, HT off [ 0.487516] ACPI: Added _OSI(Module Device) [ 0.488820] ACPI: Added _OSI(Processor Device) [ 0.490155] ACPI: Added _OSI(3.0 _SCP Extensions) [ 0.491550] ACPI: Added _OSI(Processor Aggregator Device) [ 0.495901] ACPI: Interpreter enabled [ 0.497090] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S1_] (20150930/hwxface-580) [ 0.499952] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S2_] (20150930/hwxface-580) [ 0.502858] ACPI: (supports S0 S3 S4 S5) [ 0.504082] ACPI: Using IOAPIC for interrupt routing [ 0.505556] PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug [ 0.513436] ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) [ 0.515181] acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI] [ 0.517071] acpi PNP0A03:00: _OSC failed (AE_NOT_FOUND); disabling ASPM [ 0.518905] acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended PCI configuration space under this bridge. 
[ 0.522655] acpiphp: Slot [3] registered [ 0.523914] acpiphp: Slot [4] registered [ 0.525168] acpiphp: Slot [5] registered [ 0.526426] acpiphp: Slot [6] registered [ 0.527684] acpiphp: Slot [7] registered [ 0.528942] acpiphp: Slot [8] registered [ 0.530194] acpiphp: Slot [9] registered [ 0.531441] acpiphp: Slot [10] registered [ 0.532716] acpiphp: Slot [11] registered [ 0.550987] acpiphp: Slot [12] registered [ 0.552262] acpiphp: Slot [13] registered [ 0.553849] acpiphp: Slot [14] registered [ 0.555124] acpiphp: Slot [15] registered [ 0.556397] acpiphp: Slot [16] registered [ 0.557663] acpiphp: Slot [17] registered [ 0.558939] acpiphp: Slot [18] registered [ 0.560205] acpiphp: Slot [19] registered [ 0.561472] acpiphp: Slot [20] registered [ 0.562747] acpiphp: Slot [21] registered [ 0.564018] acpiphp: Slot [22] registered [ 0.565287] acpiphp: Slot [23] registered [ 0.566553] acpiphp: Slot [24] registered [ 0.567823] acpiphp: Slot [25] registered [ 0.569101] acpiphp: Slot [26] registered [ 0.570369] acpiphp: Slot [27] registered [ 0.571636] acpiphp: Slot [28] registered [ 0.572910] acpiphp: Slot [29] registered [ 0.574179] acpiphp: Slot [30] registered [ 0.575441] acpiphp: Slot [31] registered [ 0.576715] PCI host bridge to bus 0000:00 [ 0.577975] pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] [ 0.579835] pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] [ 0.581702] pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] [ 0.583956] pci_bus 0000:00: root bus resource [mem 0x20000000-0xfebfffff window] [ 0.586222] pci_bus 0000:00: root bus resource [mem 0x100000000-0x17fffffff window] [ 0.588530] pci_bus 0000:00: root bus resource [bus 00-ff] [ 0.594345] pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] [ 0.596314] pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6] [ 0.598124] pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] [ 0.600084] pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376] [ 0.605405] pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI [ 0.607606] pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB [ 0.647686] ACPI: PCI Interrupt Link [LNKA] (IRQs 5 *10 11) [ 0.650031] ACPI: PCI Interrupt Link [LNKB] (IRQs 5 *10 11) [ 0.652258] ACPI: PCI Interrupt Link [LNKC] (IRQs 5 10 *11) [ 0.654488] ACPI: PCI Interrupt Link [LNKD] (IRQs 5 10 *11) [ 0.656719] ACPI: PCI Interrupt Link [LNKS] (IRQs *9) [ 0.658673] ACPI: Enabled 2 GPEs in block 00 to 0F [ 0.660461] vgaarb: setting as boot device: PCI:0000:00:02.0 [ 0.662028] vgaarb: device added: PCI:0000:00:02.0,decodes=io+mem,owns=io+mem,locks=none [ 0.664371] vgaarb: loaded [ 0.665299] vgaarb: bridge control possible 0000:00:02.0 [ 0.667215] SCSI subsystem initialized [ 0.668494] ACPI: bus type USB registered [ 0.669736] usbcore: registered new interface driver usbfs [ 0.671285] usbcore: registered new interface driver hub [ 0.672833] usbcore: registered new device driver usb [ 0.674468] PCI: Using ACPI for IRQ routing [ 0.676105] NetLabel: Initializing [ 0.677235] NetLabel: domain hash size = 128 [ 0.678514] NetLabel: protocols = UNLABELED CIPSOv4 [ 0.679940] NetLabel: unlabeled traffic allowed by default [ 0.681651] clocksource: Switched to clocksource kvm-clock [ 0.695987] AppArmor: AppArmor Filesystem Enabled [ 0.697578] pnp: PnP ACPI init [ 0.699229] pnp: PnP ACPI: found 5 devices [ 0.707629] clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns [ 0.710412] NET: Registered protocol family 2 [ 
0.711885] TCP established hash table entries: 4096 (order: 3, 32768 bytes) [ 0.713790] TCP bind hash table entries: 4096 (order: 4, 65536 bytes) [ 0.715545] TCP: Hash tables configured (established 4096 bind 4096) [ 0.717300] UDP hash table entries: 256 (order: 1, 8192 bytes) [ 0.718907] UDP-Lite hash table entries: 256 (order: 1, 8192 bytes) [ 0.720672] NET: Registered protocol family 1 [ 0.721974] pci 0000:00:00.0: Limiting direct PCI/PCI transfers [ 0.723610] pci 0000:00:01.0: PIIX3: Enabling Passive Release [ 0.725226] pci 0000:00:01.0: Activating ISA DMA hang workarounds [ 0.782836] ACPI: PCI Interrupt Link [LNKD] enabled at IRQ 11 [ 0.840804] Trying to unpack rootfs image as initramfs... [ 0.958620] Freeing initrd memory: 4824K (ffff88001fb14000 - ffff88001ffca000) [ 0.961131] Scanning for low memory corruption every 60 seconds [ 0.963183] futex hash table entries: 256 (order: 2, 16384 bytes) [ 0.964884] audit: initializing netlink subsys (disabled) [ 0.966421] audit: type=2000 audit(1527463841.213:1): initialized [ 0.968544] Initialise system trusted keyring [ 0.969965] HugeTLB registered 1 GB page size, pre-allocated 0 pages [ 0.971685] HugeTLB registered 2 MB page size, pre-allocated 0 pages [ 0.976288] zbud: loaded [ 0.977488] VFS: Disk quotas dquot_6.6.0 [ 0.978736] VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) [ 0.981382] fuse init (API version 7.23) [ 0.982797] Key type big_key registered [ 0.984009] Allocating IMA MOK and blacklist keyrings. [ 0.985788] Key type asymmetric registered [ 0.987021] Asymmetric key parser 'x509' registered [ 0.988477] Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) [ 0.991077] io scheduler noop registered [ 0.992281] io scheduler deadline registered (default) [ 0.993829] io scheduler cfq registered [ 0.995141] pci_hotplug: PCI Hot Plug PCI Core version: 0.5 [ 0.996713] pciehp: PCI Express Hot Plug Controller Driver version: 0.4 [ 0.998655] input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0 [ 1.000879] ACPI: Power Button [PWRF] [ 1.002185] GHES: HEST is not enabled! [ 1.059294] ACPI: PCI Interrupt Link [LNKC] enabled at IRQ 10 [ 1.175156] ACPI: PCI Interrupt Link [LNKA] enabled at IRQ 10 [ 1.233699] ACPI: PCI Interrupt Link [LNKB] enabled at IRQ 11 [ 1.236868] Serial: 8250/16550 driver, 32 ports, IRQ sharing enabled [ 1.262345] 00:04: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A [ 1.266589] Linux agpgart interface v0.103 [ 1.270868] brd: module loaded [ 1.273234] loop: module loaded [ 1.281879] GPT:Primary header thinks Alt. header is not at the end of the disk. [ 1.284338] GPT:90111 != 2097151 [ 1.285381] GPT:Alternate GPT header not at the end of the disk. [ 1.287040] GPT:90111 != 2097151 [ 1.288110] GPT: Use GNU Parted to correct GPT errors. 
[ 1.289600] vda: vda1 vda15 [ 1.292632] vdb: [ 1.294481] scsi host0: ata_piix [ 1.295625] scsi host1: ata_piix [ 1.296748] ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc0e0 irq 14 [ 1.298601] ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc0e8 irq 15 [ 1.301052] libphy: Fixed MDIO Bus: probed [ 1.302293] tun: Universal TUN/TAP device driver, 1.6 [ 1.303748] tun: (C) 1999-2004 Max Krasnyansky [ 1.306902] PPP generic driver version 2.4.2 [ 1.308432] ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver [ 1.310254] ehci-pci: EHCI PCI platform driver [ 1.311611] ehci-platform: EHCI generic platform driver [ 1.313184] ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver [ 1.314914] ohci-pci: OHCI PCI platform driver [ 1.316276] ohci-platform: OHCI generic platform driver [ 1.317810] uhci_hcd: USB Universal Host Controller Interface driver [ 1.376143] uhci_hcd 0000:00:01.2: UHCI Host Controller [ 1.377650] uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1 [ 1.379858] uhci_hcd 0000:00:01.2: detected 2 ports [ 1.381321] uhci_hcd 0000:00:01.2: irq 11, io base 0x0000c080 [ 1.383009] usb usb1: New USB device found, idVendor=1d6b, idProduct=0001 [ 1.384875] usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1 [ 1.387059] usb usb1: Product: UHCI Host Controller [ 1.388476] usb usb1: Manufacturer: Linux 4.4.0-28-generic uhci_hcd [ 1.390192] usb usb1: SerialNumber: 0000:00:01.2 [ 1.391685] hub 1-0:1.0: USB hub found [ 1.392873] hub 1-0:1.0: 2 ports detected [ 1.394344] i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 [ 1.397581] serio: i8042 KBD port at 0x60,0x64 irq 1 [ 1.399029] serio: i8042 AUX port at 0x60,0x64 irq 12 [ 1.400643] mousedev: PS/2 mouse device common for all mice [ 1.402603] input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1 [ 1.405417] rtc_cmos 00:00: RTC can wake from S4 [ 1.407495] rtc_cmos 00:00: rtc core: registered rtc_cmos as rtc0 [ 1.409536] rtc_cmos 00:00: alarms up to one day, y3k, 114 bytes nvram [ 1.411339] i2c /dev entries driver [ 1.412514] device-mapper: uevent: version 1.0.3 [ 1.413991] device-mapper: ioctl: 4.34.0-ioctl (2015-10-28) initialised: dm-devel@redhat.com [ 1.416483] ledtrig-cpu: registered to indicate activity on CPUs [ 1.418627] NET: Registered protocol family 10 [ 1.420206] NET: Registered protocol family 17 [ 1.423517] Key type dns_resolver registered [ 1.425022] microcode: CPU0 sig=0x306f2, pf=0x1, revision=0x1 [ 1.426683] microcode: Microcode Update Driver: v2.01 , Peter Oruba [ 1.429427] registered taskstats version 1 [ 1.430686] Loading compiled-in X.509 certificates [ 1.432888] Loaded X.509 cert 'Build time autogenerated kernel key: 6ea974e07bd0b30541f4d838a3b7a8a80d5ca9af' [ 1.435695] zswap: loaded using pool lzo/zbud [ 1.438179] Key type trusted registered [ 1.441383] Key type encrypted registered [ 1.442651] AppArmor: AppArmor sha1 policy hashing enabled [ 1.444228] ima: No TPM chip found, activating TPM-bypass! [ 1.445814] evm: HMAC attrs: 0x1 [ 1.464601] Magic number: 2:941:555 [ 1.465896] rtc_cmos 00:00: setting system clock to 2018-05-27 23:30:41 UTC (1527463841) [ 1.468356] BIOS EDD facility v0.16 2004-Jun-25, 0 devices found [ 1.470027] EDD information not available. 
[ 1.475424] Freeing unused kernel memory: 1480K (ffffffff81f42000 - ffffffff820b4000) [ 1.477761] Write protecting the kernel read-only data: 14336k [ 1.480315] Freeing unused kernel memory: 1860K (ffff88000182f000 - ffff880001a00000) [ 1.483228] Freeing unused kernel memory: 168K (ffff880001dd6000 - ffff880001e00000) info: initramfs: up at 1.13 modprobe: module virtio_pci not found in modules.dep modprobe: module virtio_blk not found in modules.dep modprobe: module virtio_net not found in modules.dep modprobe: module vfat not found in modules.dep modprobe: module nls_cp437 not found in modules.dep info: copying initramfs to /dev/vda1 info: initramfs loading root from /dev/vda1 info: /etc/init.d/rc.sysinit: up at 2.39 info: container: none Starting logging: OK modprobe: module virtio_pci not found in modules.dep modprobe: module virtio_blk not found in modules.dep modprobe: module virtio_net not found in modules.dep modprobe: module vfat not found in modules.dep modprobe: module nls_cp437 not found in modules.dep WARN: /etc/rc3.d/S10-load-modules failed Initializing random number generator... [ 2.807974] random: dd urandom read with 6 bits of entropy available done. Starting acpid: OK Starting network... udhcpc (v1.23.2) started Sending discover... Sending select for 192.168.130.12... Lease of 192.168.130.12 obtained, lease time 600 route: SIOCADDRT: File exists WARN: failed: route add -net "0.0.0.0/0" gw "192.168.130.1" Top of dropbear init script Starting dropbear sshd: 2018-05-27 23:30:44,240 - functest.opnfv_tests.openstack.vping.vping_ssh - INFO - Creating keypair with name: 'vping-keypair-597c6a75-8748-4a69-b031-f5ba6b127fc1' 2018-05-27 23:30:44,684 - functest.opnfv_tests.openstack.vping.vping_ssh - DEBUG - keypair: Munch({'public_key': u'ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDDkgKbIRPOMby5tjAlqJ6MQFwo4sz3jAuRzIEBttBbEGH6mnLDC6dIYJeBgSlvjhujP4CKzOxQ4PAn1OiP54ibvS75tkRoMu7CD2u0OHJ44iGOzll3AKhvbXrzIOI4TaPiacBPfsxOT4YrkvF3pq0j0AQtaCe/DYhRsxKV2HAfeUEgCPLQaNlrde7zmnrCj2N8+6eVwFdYqd4SWmjOUIYGOAxTS+JDroFXaCOctPlyuz/DsxJO1bUrlhyFIvqn8QJ1Pbvg4P7ea8+ZdptIuyGsWUWq8nsJUrzPDJmwDNeMuAR0rHUp9LCsaJCWGCL9jIadvpbUGx6P9Wu6UiC0w8Ad Generated-by-Nova', 'private_key': u'-----BEGIN RSA PRIVATE 
KEY-----\nMIIEpAIBAAKCAQEAw5ICmyETzjG8ubYwJaiejEBcKOLM94wLkcyBAbbQWxBh+ppy\nwwunSGCXgYEpb44boz+AiszsUODwJ9Toj+eIm70u+bZEaDLuwg9rtDhyeOIhjs5Z\ndwCob2168yDiOE2j4mnAT37MTk+GK5Lxd6atI9AELWgnvw2IUbMSldhwH3lBIAjy\n0GjZa3Xu85p6wo9jfPunlcBXWKneElpozlCGBjgMU0viQ66BV2gjnLT5crs/w7MS\nTtW1K5YchSL6p/ECdT274OD+3mvPmXabSLshrFlFqvJ7CVK8zwyZsAzXjLgEdKx1\nKfSwrGiQlhgi/YyGnb6W1Bsej/VrulIgtMPAHQIDAQABAoIBAQCv8ZwGlCuNRZHU\nQePu/VQmOYCwB9r+mi+Oa71kHWQ1iPeczXaRotpMcxnamKj+g4q5w1eRh+rSmIt2\nSkUvsc1kzz6DyUaht7C1RcpPyLizqD0ojNxQA3eFR6llMiBTJwZZm2o4GosTqNe8\nO+ahDoKVxX78msenSjVpywDdbmrw5ELdXCdfa0l4gJn6RtHbTLeSfP2pWFyfpVCW\ncMOW74VJVqdDB320vAwia9+c/tSh0PwMBcrvkKmkbsP9R6k838h3OpOYZ8GBPxHI\naEuM5xiqDXx3U2x8aDm/hQivwb8vjTyyOZfvxvUZdbCEjYMUfs995mQor33npZWP\nuchZucbBAoGBAOL/COcym6LkdE7AqoOiv2xPEBz808yAsZeyZW/OKhs40ZSCWpYn\nuJ88T+a3oV3s9vya10irndJtO5bKeNTVsZoLJD/KvJ1MqiGD9YzideEzueJr80i1\nHY5RM57Bx4WY/80zEIT1GuxEMVfu7MCHUo/qSYfDvQODMoa2nxiRbcBPAoGBANyP\nC9W57P32L3c7wbbsjJD9wcntX9aINXpJuGuGSOQhIVeKKD7bT21myHn9m0bnfNsJ\nK/8JkWTFVWoY0QXqG0KFvFjISRaSPYiJi86aSuydalpnAzWYRMkrZBg6BKW6a/QC\nx7rze8CD0XkCC74EAheBaagct7HFQruoAC3GpxHTAoGAepFRAl2OMh9/OcLIj3mC\nOP5b6fsOdf8LZai7IurES5ybdcAJH8jk5H5Reneu1yOLnYwSMLgR6Lx4j4xWQD3+\nvvnDIfrba2go+R3iqabiFa6zcTHu1FSPV/g6kj9594ZMoUUwZ0pdtjOAHUXyambn\nSrQr7fXgCpR95MWZFE/6XfsCgYBObW8/6IUlKU1nkJApg6PYSuOF8iqKFpUtjtlN\nIsr4k+9POYlmEIYF2O6gslVsuRPkrzY90iEpPCJLP9fTypM27Gc1CsMyi33l90MH\njEXoLXGMA+VYQXT0M8G7+6V7aPbKJdLv33S52CW8acXTI4m64gG4Db4kMIiyQeMO\nY/00kQKBgQC47SiGikwxNJNmrfRsfV2h+TSsOYzx/moQZ09hd1ZvuLiiiL3qaQxt\n9xOsMlAYZgrE+MGo/46Xja1Wt2tZMjXAHOWBkIjcdU4VeCgW7PMO1YjF0RVh20FV\ndwSv4IoxfYAH1WY6aPoSq7dmPiBK4b7bx2koNQ72kayu1eDwBoqvYA==\n-----END RSA PRIVATE KEY-----\n', 'user_id': u'37b0f068b13143159c3fbcc2c9fc04f8', 'name': u'vping-keypair-597c6a75-8748-4a69-b031-f5ba6b127fc1', 'created_at': '2018-05-27T23:30:44.684860', 'properties': Munch({}), 'location': Munch({'project': Munch({'domain_id': None, 'id': u'17e0c72255804297b05647b8b64ec56a', 'name': 'admin', 'domain_name': 'Default'}), 'zone': None, 'region_name': 'RegionOne', 'cloud': 'envvars'}), 'fingerprint': u'2f:6a:32:d2:0c:2f:58:48:92:e4:3f:43:0e:24:66:f5', 'type': 'ssh', 'id': u'vping-keypair-597c6a75-8748-4a69-b031-f5ba6b127fc1'}) 2018-05-27 23:30:44,685 - functest.opnfv_tests.openstack.vping.vping_ssh - DEBUG - private_key: -----BEGIN RSA PRIVATE KEY----- MIIEpAIBAAKCAQEAw5ICmyETzjG8ubYwJaiejEBcKOLM94wLkcyBAbbQWxBh+ppy wwunSGCXgYEpb44boz+AiszsUODwJ9Toj+eIm70u+bZEaDLuwg9rtDhyeOIhjs5Z dwCob2168yDiOE2j4mnAT37MTk+GK5Lxd6atI9AELWgnvw2IUbMSldhwH3lBIAjy 0GjZa3Xu85p6wo9jfPunlcBXWKneElpozlCGBjgMU0viQ66BV2gjnLT5crs/w7MS TtW1K5YchSL6p/ECdT274OD+3mvPmXabSLshrFlFqvJ7CVK8zwyZsAzXjLgEdKx1 KfSwrGiQlhgi/YyGnb6W1Bsej/VrulIgtMPAHQIDAQABAoIBAQCv8ZwGlCuNRZHU QePu/VQmOYCwB9r+mi+Oa71kHWQ1iPeczXaRotpMcxnamKj+g4q5w1eRh+rSmIt2 SkUvsc1kzz6DyUaht7C1RcpPyLizqD0ojNxQA3eFR6llMiBTJwZZm2o4GosTqNe8 O+ahDoKVxX78msenSjVpywDdbmrw5ELdXCdfa0l4gJn6RtHbTLeSfP2pWFyfpVCW cMOW74VJVqdDB320vAwia9+c/tSh0PwMBcrvkKmkbsP9R6k838h3OpOYZ8GBPxHI aEuM5xiqDXx3U2x8aDm/hQivwb8vjTyyOZfvxvUZdbCEjYMUfs995mQor33npZWP uchZucbBAoGBAOL/COcym6LkdE7AqoOiv2xPEBz808yAsZeyZW/OKhs40ZSCWpYn uJ88T+a3oV3s9vya10irndJtO5bKeNTVsZoLJD/KvJ1MqiGD9YzideEzueJr80i1 HY5RM57Bx4WY/80zEIT1GuxEMVfu7MCHUo/qSYfDvQODMoa2nxiRbcBPAoGBANyP C9W57P32L3c7wbbsjJD9wcntX9aINXpJuGuGSOQhIVeKKD7bT21myHn9m0bnfNsJ K/8JkWTFVWoY0QXqG0KFvFjISRaSPYiJi86aSuydalpnAzWYRMkrZBg6BKW6a/QC x7rze8CD0XkCC74EAheBaagct7HFQruoAC3GpxHTAoGAepFRAl2OMh9/OcLIj3mC OP5b6fsOdf8LZai7IurES5ybdcAJH8jk5H5Reneu1yOLnYwSMLgR6Lx4j4xWQD3+ 
vvnDIfrba2go+R3iqabiFa6zcTHu1FSPV/g6kj9594ZMoUUwZ0pdtjOAHUXyambn SrQr7fXgCpR95MWZFE/6XfsCgYBObW8/6IUlKU1nkJApg6PYSuOF8iqKFpUtjtlN Isr4k+9POYlmEIYF2O6gslVsuRPkrzY90iEpPCJLP9fTypM27Gc1CsMyi33l90MH jEXoLXGMA+VYQXT0M8G7+6V7aPbKJdLv33S52CW8acXTI4m64gG4Db4kMIiyQeMO Y/00kQKBgQC47SiGikwxNJNmrfRsfV2h+TSsOYzx/moQZ09hd1ZvuLiiiL3qaQxt 9xOsMlAYZgrE+MGo/46Xja1Wt2tZMjXAHOWBkIjcdU4VeCgW7PMO1YjF0RVh20FV dwSv4IoxfYAH1WY6aPoSq7dmPiBK4b7bx2koNQ72kayu1eDwBoqvYA== -----END RSA PRIVATE KEY----- 2018-05-27 23:30:46,405 - functest.opnfv_tests.openstack.vping.vping_ssh - INFO - Creating VM 2 instance with name: 'opnfv-vping-2-ssh--597c6a75-8748-4a69-b031-f5ba6b127fc1' 2018-05-27 23:31:03,141 - functest.opnfv_tests.openstack.vping.vping_ssh - DEBUG - vm2: Munch({'vm_state': u'active', u'OS-EXT-STS:task_state': None, 'addresses': Munch({u'vping-net-597c6a75-8748-4a69-b031-f5ba6b127fc1': [Munch({u'OS-EXT-IPS-MAC:mac_addr': u'fa:16:3e:af:d7:3d', u'version': 4, u'addr': u'192.168.130.10', u'OS-EXT-IPS:type': u'fixed'})]}), 'terminated_at': None, 'image': Munch({u'id': u'f0d12337-f28a-42db-a696-5587b57bab0b'}), u'OS-EXT-AZ:availability_zone': u'nova', u'OS-EXT-STS:vm_state': u'active', u'OS-EXT-SRV-ATTR:instance_name': u'instance-00000023', u'OS-SRV-USG:launched_at': u'2018-05-27T23:31:01.000000', 'flavor': Munch({u'id': u'bfe22903-ffec-4064-a678-7218ce0187bf'}), 'az': u'nova', 'id': u'9a0f121d-7171-40f2-9db9-c4806981de00', 'security_groups': [Munch({u'name': u'vping-sg-597c6a75-8748-4a69-b031-f5ba6b127fc1'})], u'os-extended-volumes:volumes_attached': [], 'user_id': u'37b0f068b13143159c3fbcc2c9fc04f8', 'disk_config': u'MANUAL', u'OS-DCF:diskConfig': u'MANUAL', 'networks': {}, 'accessIPv4': '', 'accessIPv6': '', 'location': Munch({'project': Munch({'domain_id': None, 'id': u'17e0c72255804297b05647b8b64ec56a', 'name': 'admin', 'domain_name': 'Default'}), 'zone': u'nova', 'region_name': 'RegionOne', 'cloud': 'envvars'}), 'power_state': 1, 'public_v4': '', 'progress': 0, u'OS-EXT-STS:power_state': 1, 'interface_ip': '', 'launched_at': u'2018-05-27T23:31:01.000000', 'metadata': Munch({}), 'status': u'ACTIVE', 'updated': u'2018-05-27T23:31:02Z', 'hostId': u'aef28b520aac3b35ae2ae4ee558f0576807f611be54858527d27fcd6', u'OS-EXT-SRV-ATTR:host': u'cmp001', u'OS-SRV-USG:terminated_at': None, 'key_name': u'vping-keypair-597c6a75-8748-4a69-b031-f5ba6b127fc1', 'public_v6': '', 'private_v4': u'192.168.130.10', 'cloud': 'envvars', 'host_id': u'aef28b520aac3b35ae2ae4ee558f0576807f611be54858527d27fcd6', 'task_state': None, 'properties': Munch({u'OS-EXT-STS:task_state': None, u'OS-EXT-SRV-ATTR:host': u'cmp001', u'OS-SRV-USG:terminated_at': None, u'OS-DCF:diskConfig': u'MANUAL', u'os-extended-volumes:volumes_attached': [], u'OS-EXT-STS:vm_state': u'active', u'OS-EXT-SRV-ATTR:instance_name': u'instance-00000023', u'OS-SRV-USG:launched_at': u'2018-05-27T23:31:01.000000', u'OS-EXT-SRV-ATTR:hypervisor_hostname': u'cmp001.mcp-pike-ovs-ha.local', u'OS-EXT-STS:power_state': 1, u'OS-EXT-AZ:availability_zone': u'nova'}), 'project_id': u'17e0c72255804297b05647b8b64ec56a', u'OS-EXT-SRV-ATTR:hypervisor_hostname': u'cmp001.mcp-pike-ovs-ha.local', 'name': u'opnfv-vping-2-ssh--597c6a75-8748-4a69-b031-f5ba6b127fc1', 'adminPass': u'gwhLD6RcWuh9', 'tenant_id': u'17e0c72255804297b05647b8b64ec56a', 'created_at': u'2018-05-27T23:30:49Z', 'created': u'2018-05-27T23:30:49Z', 'has_config_drive': True, 'volumes': [], 'config_drive': u'True', 'region': 'RegionOne'}) 2018-05-27 23:31:08,063 - functest.opnfv_tests.openstack.vping.vping_ssh - DEBUG - floating_ip2: 
Munch({'status': u'DOWN', 'router_id': u'914ac656-b88e-4bde-97a1-559d9bf2e459', 'properties': Munch({u'tags': []}), 'description': u'', u'tags': [], 'tenant_id': u'17e0c72255804297b05647b8b64ec56a', 'created_at': u'2018-05-27T23:31:05Z', 'attached': True, 'updated_at': u'2018-05-27T23:31:05Z', 'id': u'83478aa6-1a79-4001-b53d-5e372991444f', 'floating_network_id': u'11c92fd4-326a-487a-a640-1b09c88fcb5b', 'fixed_ip_address': u'192.168.130.10', 'floating_ip_address': u'172.30.10.123', 'location': Munch({'project': Munch({'domain_id': None, 'id': u'17e0c72255804297b05647b8b64ec56a', 'name': 'admin', 'domain_name': 'Default'}), 'zone': None, 'region_name': 'RegionOne', 'cloud': 'envvars'}), 'revision_number': 0, 'router': u'914ac656-b88e-4bde-97a1-559d9bf2e459', 'project_id': u'17e0c72255804297b05647b8b64ec56a', 'port_id': u'f4ba9e8e-33de-4652-8c1a-fc21bd48dedc', 'port': u'f4ba9e8e-33de-4652-8c1a-fc21bd48dedc', 'network': u'11c92fd4-326a-487a-a640-1b09c88fcb5b'}) 2018-05-27 23:31:11,327 - functest.opnfv_tests.openstack.vping.vping_ssh - DEBUG - vm2 console: [ 0.000000] Initializing cgroup subsys cpuset [ 0.000000] Initializing cgroup subsys cpu [ 0.000000] Initializing cgroup subsys cpuacct [ 0.000000] Linux version 4.4.0-28-generic (buildd@lcy01-13) (gcc version 5.3.1 20160413 (Ubuntu 5.3.1-14ubuntu2.1) ) #47-Ubuntu SMP Fri Jun 24 10:09:13 UTC 2016 (Ubuntu 4.4.0-28.47-generic 4.4.13) [ 0.000000] Command line: LABEL=cirros-rootfs ro console=tty1 console=ttyS0 [ 0.000000] KERNEL supported cpus: [ 0.000000] Intel GenuineIntel [ 0.000000] AMD AuthenticAMD [ 0.000000] Centaur CentaurHauls [ 0.000000] x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 [ 0.000000] x86/fpu: Supporting XSAVE feature 0x01: 'x87 floating point registers' [ 0.000000] x86/fpu: Supporting XSAVE feature 0x02: 'SSE registers' [ 0.000000] x86/fpu: Supporting XSAVE feature 0x04: 'AVX registers' [ 0.000000] x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format. [ 0.000000] x86/fpu: Using 'eager' FPU context switches. [ 0.000000] e820: BIOS-provided physical RAM map: [ 0.000000] BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable [ 0.000000] BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved [ 0.000000] BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved [ 0.000000] BIOS-e820: [mem 0x0000000000100000-0x000000001ffd9fff] usable [ 0.000000] BIOS-e820: [mem 0x000000001ffda000-0x000000001fffffff] reserved [ 0.000000] BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved [ 0.000000] BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved [ 0.000000] NX (Execute Disable) protection: active [ 0.000000] SMBIOS 2.8 present. 
[ 0.000000] Hypervisor detected: KVM [ 0.000000] e820: last_pfn = 0x1ffda max_arch_pfn = 0x400000000 [ 0.000000] x86/PAT: Configuration [0-7]: WB WC UC- UC WB WC UC- WT [ 0.000000] found SMP MP-table at [mem 0x000f6a40-0x000f6a4f] mapped at [ffff8800000f6a40] [ 0.000000] Scanning 1 areas for low memory corruption [ 0.000000] Using GB pages for direct mapping [ 0.000000] RAMDISK: [mem 0x1fb14000-0x1ffc9fff] [ 0.000000] ACPI: Early table checksum verification disabled [ 0.000000] ACPI: RSDP 0x00000000000F6830 000014 (v00 BOCHS ) [ 0.000000] ACPI: RSDT 0x000000001FFE1591 00002C (v01 BOCHS BXPCRSDT 00000001 BXPC 00000001) [ 0.000000] ACPI: FACP 0x000000001FFE1425 000074 (v01 BOCHS BXPCFACP 00000001 BXPC 00000001) [ 0.000000] ACPI: DSDT 0x000000001FFE0040 0013E5 (v01 BOCHS BXPCDSDT 00000001 BXPC 00000001) [ 0.000000] ACPI: FACS 0x000000001FFE0000 000040 [ 0.000000] ACPI: APIC 0x000000001FFE1519 000078 (v01 BOCHS BXPCAPIC 00000001 BXPC 00000001) [ 0.000000] No NUMA configuration found [ 0.000000] Faking a node at [mem 0x0000000000000000-0x000000001ffd9fff] [ 0.000000] NODE_DATA(0) allocated [mem 0x1ffd5000-0x1ffd9fff] [ 0.000000] kvm-clock: Using msrs 4b564d01 and 4b564d00 [ 0.000000] kvm-clock: cpu 0, msr 0:1ffd1001, primary cpu clock [ 0.000000] kvm-clock: using sched offset of 960785285 cycles [ 0.000000] clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns [ 0.000000] Zone ranges: [ 0.000000] DMA [mem 0x0000000000001000-0x0000000000ffffff] [ 0.000000] DMA32 [mem 0x0000000001000000-0x000000001ffd9fff] [ 0.000000] Normal empty [ 0.000000] Device empty [ 0.000000] Movable zone start for each node [ 0.000000] Early memory node ranges [ 0.000000] node 0: [mem 0x0000000000001000-0x000000000009efff] [ 0.000000] node 0: [mem 0x0000000000100000-0x000000001ffd9fff] [ 0.000000] Initmem setup node 0 [mem 0x0000000000001000-0x000000001ffd9fff] [ 0.000000] ACPI: PM-Timer IO Port: 0x608 [ 0.000000] ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) [ 0.000000] IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) [ 0.000000] Using ACPI (MADT) for SMP configuration information [ 0.000000] smpboot: Allowing 1 CPUs, 0 hotplug CPUs [ 0.000000] PM: Registered nosave memory: [mem 0x00000000-0x00000fff] [ 0.000000] PM: Registered nosave memory: [mem 0x0009f000-0x0009ffff] [ 0.000000] PM: Registered nosave memory: [mem 0x000a0000-0x000effff] [ 0.000000] PM: Registered nosave memory: [mem 0x000f0000-0x000fffff] [ 0.000000] e820: [mem 0x20000000-0xfeffbfff] available for PCI devices [ 0.000000] Booting paravirtualized kernel on KVM [ 0.000000] clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 7645519600211568 ns [ 0.000000] setup_percpu: NR_CPUS:256 nr_cpumask_bits:256 nr_cpu_ids:1 nr_node_ids:1 [ 0.000000] PERCPU: Embedded 33 pages/cpu @ffff88001f800000 s98008 r8192 d28968 u2097152 [ 0.000000] KVM setup async PF for cpu 0 [ 0.000000] kvm-stealtime: cpu 0, msr 1f80d940 [ 0.000000] PV qspinlock hash table entries: 256 (order: 0, 4096 bytes) [ 0.000000] Built 1 zonelists in Node order, mobility grouping on. 
Total pages: 128867 [ 0.000000] Policy zone: DMA32 [ 0.000000] Kernel command line: LABEL=cirros-rootfs ro console=tty1 console=ttyS0 [ 0.000000] PID hash table entries: 2048 (order: 2, 16384 bytes) [ 0.000000] Memory: 491788K/523744K available (8368K kernel code, 1280K rwdata, 3928K rodata, 1480K init, 1292K bss, 31956K reserved, 0K cma-reserved) [ 0.000000] SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=1, Nodes=1 [ 0.000000] Hierarchical RCU implementation. [ 0.000000] Build-time adjustment of leaf fanout to 64. [ 0.000000] RCU restricting CPUs from NR_CPUS=256 to nr_cpu_ids=1. [ 0.000000] RCU: Adjusting geometry for rcu_fanout_leaf=64, nr_cpu_ids=1 [ 0.000000] NR_IRQS:16640 nr_irqs:256 16 [ 0.000000] Console: colour VGA+ 80x25 [ 0.000000] console [tty1] enabled [ 0.000000] console [ttyS0] enabled [ 0.000000] tsc: Detected 3491.914 MHz processor [ 0.268986] Calibrating delay loop (skipped) preset value.. 6983.82 BogoMIPS (lpj=13967656) [ 0.278661] pid_max: default: 32768 minimum: 301 [ 0.280426] ACPI: Core revision 20150930 [ 0.284669] ACPI: 1 ACPI AML tables successfully acquired and loaded [ 0.287245] Security Framework initialized [ 0.288962] Yama: becoming mindful. [ 0.290410] AppArmor: AppArmor initialized [ 0.292277] Dentry cache hash table entries: 65536 (order: 7, 524288 bytes) [ 0.295183] Inode-cache hash table entries: 32768 (order: 6, 262144 bytes) [ 0.297875] Mount-cache hash table entries: 1024 (order: 1, 8192 bytes) [ 0.300372] Mountpoint-cache hash table entries: 1024 (order: 1, 8192 bytes) [ 0.303281] Initializing cgroup subsys io [ 0.304954] Initializing cgroup subsys memory [ 0.306770] Initializing cgroup subsys devices [ 0.308628] Initializing cgroup subsys freezer [ 0.310474] Initializing cgroup subsys net_cls [ 0.312325] Initializing cgroup subsys perf_event [ 0.314261] Initializing cgroup subsys net_prio [ 0.316126] Initializing cgroup subsys hugetlb [ 0.318066] Initializing cgroup subsys pids [ 0.320069] CPU: Physical Processor ID: 0 [ 0.322651] mce: CPU supports 10 MCE banks [ 0.324588] Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 [ 0.326959] Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 [ 0.347933] Freeing SMP alternatives memory: 28K (ffffffff820b4000 - ffffffff820bb000) [ 0.359704] ftrace: allocating 31920 entries in 125 pages [ 0.413486] smpboot: Max logical packages: 1 [ 0.415287] smpboot: APIC(0) Converting physical 0 to logical package 0 [ 0.418153] x2apic enabled [ 0.419735] Switched APIC routing to physical x2apic. [ 0.423116] ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 [ 0.425423] smpboot: CPU0: Intel(R) Xeon(R) CPU E5-2637 v3 @ 3.50GHz (family: 0x6, model: 0x3f, stepping: 0x2) [ 0.429544] Performance Events: 16-deep LBR, Haswell events, Intel PMU driver. [ 0.432786] ... version: 2 [ 0.434416] ... bit width: 48 [ 0.436067] ... generic registers: 4 [ 0.437705] ... value mask: 0000ffffffffffff [ 0.439739] ... max period: 000000007fffffff [ 0.441787] ... fixed-purpose events: 3 [ 0.443407] ... 
event mask: 000000070000000f [ 0.445490] KVM setup paravirtual spinlock [ 0.447822] x86: Booted up 1 node, 1 CPUs [ 0.449440] smpboot: Total of 1 processors activated (6983.82 BogoMIPS) [ 0.452254] devtmpfs: initialized [ 0.464974] evm: security.selinux [ 0.466426] evm: security.SMACK64 [ 0.467878] evm: security.SMACK64EXEC [ 0.469472] evm: security.SMACK64TRANSMUTE [ 0.471179] evm: security.SMACK64MMAP [ 0.472737] evm: security.ima [ 0.474010] evm: security.capability [ 0.475673] clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 7645041785100000 ns [ 0.479449] pinctrl core: initialized pinctrl subsystem [ 0.481743] RTC time: 23:31:02, date: 05/27/18 [ 0.483747] NET: Registered protocol family 16 [ 0.485874] cpuidle: using governor ladder [ 0.487615] cpuidle: using governor menu [ 0.489283] PCCT header not found. [ 0.490882] ACPI: bus type PCI registered [ 0.492585] acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 [ 0.495287] PCI: Using configuration type 1 for base access [ 0.497622] core: PMU erratum BJ122, BV98, HSD29 workaround disabled, HT off [ 0.502675] ACPI: Added _OSI(Module Device) [ 0.504444] ACPI: Added _OSI(Processor Device) [ 0.506288] ACPI: Added _OSI(3.0 _SCP Extensions) [ 0.508200] ACPI: Added _OSI(Processor Aggregator Device) [ 0.512816] ACPI: Interpreter enabled [ 0.514417] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S1_] (20150930/hwxface-580) [ 0.518370] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S2_] (20150930/hwxface-580) [ 0.522280] ACPI: (supports S0 S3 S4 S5) [ 0.523957] ACPI: Using IOAPIC for interrupt routing [ 0.525995] PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug [ 0.534841] ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) [ 0.537307] acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI] [ 0.539962] acpi PNP0A03:00: _OSC failed (AE_NOT_FOUND); disabling ASPM [ 0.542545] acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended PCI configuration space under this bridge. 
[ 0.547612] acpiphp: Slot [3] registered [ 0.549329] acpiphp: Slot [4] registered [ 0.551038] acpiphp: Slot [5] registered [ 0.552637] acpiphp: Slot [6] registered [ 0.553915] acpiphp: Slot [7] registered [ 0.555175] acpiphp: Slot [8] registered [ 0.556432] acpiphp: Slot [9] registered [ 0.557698] acpiphp: Slot [10] registered [ 0.558971] acpiphp: Slot [11] registered [ 0.577308] acpiphp: Slot [12] registered [ 0.578585] acpiphp: Slot [13] registered [ 0.579859] acpiphp: Slot [14] registered [ 0.581142] acpiphp: Slot [15] registered [ 0.582413] acpiphp: Slot [16] registered [ 0.583686] acpiphp: Slot [17] registered [ 0.584969] acpiphp: Slot [18] registered [ 0.586241] acpiphp: Slot [19] registered [ 0.587507] acpiphp: Slot [20] registered [ 0.588782] acpiphp: Slot [21] registered [ 0.590068] acpiphp: Slot [22] registered [ 0.591341] acpiphp: Slot [23] registered [ 0.592608] acpiphp: Slot [24] registered [ 0.593891] acpiphp: Slot [25] registered [ 0.595165] acpiphp: Slot [26] registered [ 0.596438] acpiphp: Slot [27] registered [ 0.597715] acpiphp: Slot [28] registered [ 0.598993] acpiphp: Slot [29] registered [ 0.600269] acpiphp: Slot [30] registered [ 0.601551] acpiphp: Slot [31] registered [ 0.602806] PCI host bridge to bus 0000:00 [ 0.604072] pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] [ 0.605945] pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] [ 0.607803] pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] [ 0.610069] pci_bus 0000:00: root bus resource [mem 0x20000000-0xfebfffff window] [ 0.612330] pci_bus 0000:00: root bus resource [mem 0x100000000-0x17fffffff window] [ 0.614635] pci_bus 0000:00: root bus resource [bus 00-ff] [ 0.620249] pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] [ 0.622211] pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6] [ 0.624024] pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] [ 0.625982] pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376] [ 0.630973] pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI [ 0.633193] pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB [ 0.670456] ACPI: PCI Interrupt Link [LNKA] (IRQs 5 *10 11) [ 0.673412] ACPI: PCI Interrupt Link [LNKB] (IRQs 5 *10 11) [ 0.675697] ACPI: PCI Interrupt Link [LNKC] (IRQs 5 10 *11) [ 0.677977] ACPI: PCI Interrupt Link [LNKD] (IRQs 5 10 *11) [ 0.680187] ACPI: PCI Interrupt Link [LNKS] (IRQs *9) [ 0.682210] ACPI: Enabled 2 GPEs in block 00 to 0F [ 0.684013] vgaarb: setting as boot device: PCI:0000:00:02.0 [ 0.685639] vgaarb: device added: PCI:0000:00:02.0,decodes=io+mem,owns=io+mem,locks=none [ 0.688044] vgaarb: loaded [ 0.689002] vgaarb: bridge control possible 0000:00:02.0 [ 0.690917] SCSI subsystem initialized [ 0.692195] ACPI: bus type USB registered [ 0.693478] usbcore: registered new interface driver usbfs [ 0.695071] usbcore: registered new interface driver hub [ 0.696624] usbcore: registered new device driver usb [ 0.698300] PCI: Using ACPI for IRQ routing [ 0.699957] NetLabel: Initializing [ 0.701075] NetLabel: domain hash size = 128 [ 0.702394] NetLabel: protocols = UNLABELED CIPSOv4 [ 0.703866] NetLabel: unlabeled traffic allowed by default [ 0.705596] clocksource: Switched to clocksource kvm-clock [ 0.720314] AppArmor: AppArmor Filesystem Enabled [ 0.721816] pnp: PnP ACPI init [ 0.723517] pnp: PnP ACPI: found 5 devices [ 0.732022] clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns [ 0.734699] NET: Registered protocol family 2 [ 
0.736217] TCP established hash table entries: 4096 (order: 3, 32768 bytes) [ 0.738178] TCP bind hash table entries: 4096 (order: 4, 65536 bytes) [ 0.739984] TCP: Hash tables configured (established 4096 bind 4096) [ 0.741785] UDP hash table entries: 256 (order: 1, 8192 bytes) [ 0.743446] UDP-Lite hash table entries: 256 (order: 1, 8192 bytes) [ 0.745249] NET: Registered protocol family 1 [ 0.746580] pci 0000:00:00.0: Limiting direct PCI/PCI transfers [ 0.748264] pci 0000:00:01.0: PIIX3: Enabling Passive Release [ 0.749935] pci 0000:00:01.0: Activating ISA DMA hang workarounds [ 0.809052] ACPI: PCI Interrupt Link [LNKD] enabled at IRQ 11 [ 0.868300] Trying to unpack rootfs image as initramfs... [ 0.989066] Freeing initrd memory: 4824K (ffff88001fb14000 - ffff88001ffca000) [ 0.991662] Scanning for low memory corruption every 60 seconds [ 0.993794] futex hash table entries: 256 (order: 2, 16384 bytes) [ 0.995541] audit: initializing netlink subsys (disabled) [ 0.997114] audit: type=2000 audit(1527463863.178:1): initialized [ 0.999297] Initialise system trusted keyring [ 1.000718] HugeTLB registered 1 GB page size, pre-allocated 0 pages [ 1.002512] HugeTLB registered 2 MB page size, pre-allocated 0 pages [ 1.007212] zbud: loaded [ 1.008449] VFS: Disk quotas dquot_6.6.0 [ 1.009753] VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) [ 1.012441] fuse init (API version 7.23) [ 1.013897] Key type big_key registered [ 1.015154] Allocating IMA MOK and blacklist keyrings. [ 1.016991] Key type asymmetric registered [ 1.018264] Asymmetric key parser 'x509' registered [ 1.019763] Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) [ 1.022420] io scheduler noop registered [ 1.023664] io scheduler deadline registered (default) [ 1.025218] io scheduler cfq registered [ 1.026588] pci_hotplug: PCI Hot Plug PCI Core version: 0.5 [ 1.028200] pciehp: PCI Express Hot Plug Controller Driver version: 0.4 [ 1.030207] input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0 [ 1.032460] ACPI: Power Button [PWRF] [ 1.033829] GHES: HEST is not enabled! [ 1.091516] ACPI: PCI Interrupt Link [LNKC] enabled at IRQ 10 [ 1.209372] ACPI: PCI Interrupt Link [LNKA] enabled at IRQ 10 [ 1.268491] ACPI: PCI Interrupt Link [LNKB] enabled at IRQ 11 [ 1.271750] Serial: 8250/16550 driver, 32 ports, IRQ sharing enabled [ 1.296603] 00:04: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A [ 1.300901] Linux agpgart interface v0.103 [ 1.305284] brd: module loaded [ 1.307667] loop: module loaded [ 1.314370] GPT:Primary header thinks Alt. header is not at the end of the disk. [ 1.316640] GPT:90111 != 2097151 [ 1.317721] GPT:Alternate GPT header not at the end of the disk. [ 1.319413] GPT:90111 != 2097151 [ 1.320480] GPT: Use GNU Parted to correct GPT errors. 
[ 1.322001] vda: vda1 vda15 [ 1.324920] vdb: [ 1.326751] scsi host0: ata_piix [ 1.327943] scsi host1: ata_piix [ 1.329096] ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc0e0 irq 14 [ 1.331058] ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc0e8 irq 15 [ 1.333581] libphy: Fixed MDIO Bus: probed [ 1.334853] tun: Universal TUN/TAP device driver, 1.6 [ 1.336333] tun: (C) 1999-2004 Max Krasnyansky [ 1.339562] PPP generic driver version 2.4.2 [ 1.340963] ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver [ 1.342790] ehci-pci: EHCI PCI platform driver [ 1.344161] ehci-platform: EHCI generic platform driver [ 1.345720] ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver [ 1.347462] ohci-pci: OHCI PCI platform driver [ 1.348820] ohci-platform: OHCI generic platform driver [ 1.350366] uhci_hcd: USB Universal Host Controller Interface driver [ 1.409558] uhci_hcd 0000:00:01.2: UHCI Host Controller [ 1.411093] uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1 [ 1.413385] uhci_hcd 0000:00:01.2: detected 2 ports [ 1.414903] uhci_hcd 0000:00:01.2: irq 11, io base 0x0000c080 [ 1.416642] usb usb1: New USB device found, idVendor=1d6b, idProduct=0001 [ 1.418557] usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1 [ 1.420801] usb usb1: Product: UHCI Host Controller [ 1.422284] usb usb1: Manufacturer: Linux 4.4.0-28-generic uhci_hcd [ 1.424062] usb usb1: SerialNumber: 0000:00:01.2 [ 1.425629] hub 1-0:1.0: USB hub found [ 1.426842] hub 1-0:1.0: 2 ports detected [ 1.428322] i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 [ 1.431621] serio: i8042 KBD port at 0x60,0x64 irq 1 [ 1.435108] serio: i8042 AUX port at 0x60,0x64 irq 12 [ 1.436848] mousedev: PS/2 mouse device common for all mice [ 1.438855] input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1 [ 1.441717] rtc_cmos 00:00: RTC can wake from S4 [ 1.443448] rtc_cmos 00:00: rtc core: registered rtc_cmos as rtc0 [ 1.445300] rtc_cmos 00:00: alarms up to one day, y3k, 114 bytes nvram [ 1.447142] i2c /dev entries driver [ 1.448359] device-mapper: uevent: version 1.0.3 [ 1.449861] device-mapper: ioctl: 4.34.0-ioctl (2015-10-28) initialised: dm-devel@redhat.com [ 1.452397] ledtrig-cpu: registered to indicate activity on CPUs [ 1.454994] NET: Registered protocol family 10 [ 1.456605] NET: Registered protocol family 17 [ 1.458020] Key type dns_resolver registered [ 1.459541] microcode: CPU0 sig=0x306f2, pf=0x1, revision=0x1 [ 1.461323] microcode: Microcode Update Driver: v2.01 , Peter Oruba [ 1.464107] registered taskstats version 1 [ 1.465409] Loading compiled-in X.509 certificates [ 1.467672] Loaded X.509 cert 'Build time autogenerated kernel key: 6ea974e07bd0b30541f4d838a3b7a8a80d5ca9af' [ 1.470558] zswap: loaded using pool lzo/zbud [ 1.473069] Key type trusted registered [ 1.476365] Key type encrypted registered [ 1.477672] AppArmor: AppArmor sha1 policy hashing enabled [ 1.479269] ima: No TPM chip found, activating TPM-bypass! [ 1.480883] evm: HMAC attrs: 0x1 [ 1.500189] Magic number: 2:941:555 [ 1.501543] rtc_cmos 00:00: setting system clock to 2018-05-27 23:31:03 UTC (1527463863) [ 1.504058] BIOS EDD facility v0.16 2004-Jun-25, 0 devices found [ 1.505787] EDD information not available. 
[ 1.511328] Freeing unused kernel memory: 1480K (ffffffff81f42000 - ffffffff820b4000) [ 1.513744] Write protecting the kernel read-only data: 14336k [ 1.516370] Freeing unused kernel memory: 1860K (ffff88000182f000 - ffff880001a00000) [ 1.519400] Freeing unused kernel memory: 168K (ffff880001dd6000 - ffff880001e00000) info: initramfs: up at 1.18 modprobe: module virtio_pci not found in modules.dep modprobe: module virtio_blk not found in modules.dep modprobe: module virtio_net not found in modules.dep modprobe: module vfat not found in modules.dep modprobe: module nls_cp437 not found in modules.dep info: copying initramfs to /dev/vda1 info: initramfs loading root from /dev/vda1 info: /etc/init.d/rc.sysinit: up at 2.48 info: container: none Starting logging: OK modprobe: module virtio_pci not found in modules.dep modprobe: module virtio_blk not found in modules.dep modprobe: module virtio_net not found in modules.dep modprobe: module vfat not found in modules.dep modprobe: module nls_cp437 not found in modules.dep WARN: /etc/rc3.d/S10-load-modules failed Initializing random number generator... [ 2.890341] random: dd urandom read with 6 bits of entropy available done. Starting acpid: OK Starting network... udhcpc (v1.23.2) started Sending discover... Sending select for 192.168.130.10... Lease of 192.168.130.10 obtained, lease time 600 route: SIOCADDRT: File exists WARN: failed: route add -net "0.0.0.0/0" gw "192.168.130.1" Top of dropbear init script Starting dropbear sshd: OK GROWROOT: CHANGED: partition=1 start=18432 old: size=71647 end=90079 new: size=2078687,end=2097119 no userdata for datasource === system information === Platform: OpenStack Foundation OpenStack Nova Container: none Arch: x86_64 CPU(s): 1 @ 3491.914 MHz Cores/Sockets/Threads: 1/1/1 Virt-type: VT-x RAM Size: 488MB Disks: NAME MAJ:MIN SIZE LABEL MOUNTPOINT vda 253:0 1073741824 vda1 253:1 1064287744 cirros-rootfs / vda15 253:15 8388608 vdb 253:16 67108864 config-2 === sshd host keys === -----BEGIN SSH HOST KEY KEYS----- ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQCSUxoCMUMDJXHub6gwjx9mFdwZhGUNoPeP57NVEDfTwV1PMf6ZaGRyqHeYqZyAh43V9VQF5uScjTNRfG6NuAxo9AyBHktsJHRSRmkn+IXcZ+tlc8/ywDawZd3uNVB7DjC4nNQv6L2Sxa7ARf+8YwonB84SdnPSQbCDOIWYIy47iCS5IAqu+5jGBnW3IWoP06ZnqjaTyw1h20kttBWyxOlFEM2RwaLxO9g4imn2Gu8OKMzgOfeRvUf0qug49AGz9eftxjLi1LTo0WjSP7LAWCMVNtfTMTVq5sSWZFJzZnqq7E9efK90ma4BcRBgZVaAK94+rxr4MvDCd2recJWZaGyr root@opnfv-vping-2-ssh--597c6a75-8748-4a69-b031-f5ba6b127fc1 ssh-dss AAAAB3NzaC1kc3MAAACBAPgZSvlBR7rv4hzcpBosV0Ty+35bKI0IhCjIOAZmGtC10b1gBpyM35SwhDxWEo5B1Rm7fnjyPtfcMlwm0MeDVRIKUJJLoRJtFY4fhQfB8VyQVISimr2A4moEz5S8WOFlRYn6tQ8uuLLWcopAqqsYCuNk3I2ADynGj1o8CE4j0FkTAAAAFQC3w8B6tKIvkaKBlZeXy0J0mj0BhwAAAIAuZPj+KHYu2HYfNdhDf1isNK/42X4KOi3abLdH2ZYmVQ0jUUmSsTjol7pMsez3rwGgjTCW84fG6MzbugndZHOa8Ye1LcXI7M24cOENrbnbXxLBoFJ5l9tLmAU2/PIQaXdBH3/cphVeUstlcUD1lnM0rR1KOKZQ3m2qI3vE1cC8swAAAIBCNOCZl9MZrfu8HUBl9HuBUTc4ZKzi+K4Q2qqvo3NmRyhfOo37d23pU3EA8acJeMs7/39erNvtrnTr7S1/O7gCy1s59cYjPVj4HhIFSjTcF4RZpJ0opWNU7ebO29vLir8ca+1eIJjBCWuRpdx5XvQmuUAVmHr6lqTqx2fx3d30GA== root@opnfv-vping-2-ssh--597c6a75-8748-4a69-b031-f5ba6b127fc1 -----END SSH HOST KEY KEYS----- === network info === if-info: lo,up,127.0.0.1,8,, if-info: eth0,up,192.168.130.10,24,fe80::f816:3eff:feaf:d73d/64, ip-route:default via 192.168.130.1 dev eth0 ip-route:169.254.169.254 via 192.168.130.1 dev eth0 ip-route:192.168.130.0/24 dev eth0 src 192.168.130.10 ip-route6:fe80::/64 dev eth0 metric 256 ip-route6:unreachable default dev lo metric -1 error -101 ip-route6:ff00::/8 dev eth0 metric 256 
ip-route6:unreachable default dev lo metric -1 error -101 === datasource: configdrive local === instance-id: 9a0f121d-7171-40f2-9db9-c4806981de00 name: opnfv-vping-2-ssh--597c6a75-8748-4a69-b031-f5ba6b127fc1 availability-zone: nova local-hostname: opnfv-vping-2-ssh--597c6a75-8748-4a69-b031-f5ba6b127fc1.novalocal launch-index: 0 === cirros: current=0.4.0 latest=0.4.0 uptime=4.27 === ____ ____ ____ / __/ __ ____ ____ / __ \/ __/ / /__ / // __// __// /_/ /\ \ \___//_//_/ /_/ \____/___/ http://cirros-cloud.net login as 'cirros' user. default password: 'gocubsgo'. use 'sudo' for root. opnfv-vping-2-ssh--597c6a75-8748-4a69-b031-f5ba6b127fc1 login: 2018-05-27 23:31:11,328 - functest.opnfv_tests.openstack.vping.vping_ssh - INFO - Begin test execution 2018-05-27 23:31:11,408 - functest.opnfv_tests.openstack.vping.vping_ssh - DEBUG - ssh: 2018-05-27 23:31:11,412 - functest.opnfv_tests.openstack.vping.vping_ssh - DEBUG - ping output: >> 2018-05-27 23:31:11,415 - xtesting.energy.energy - DEBUG - Restoring previous scenario (cloudify_ims/running) 2018-05-27 23:31:11,415 - xtesting.energy.energy - DEBUG - Submitting scenario (cloudify_ims/running) 2018-05-27 23:31:12,148 - xtesting.core.testcase - INFO - The results were successfully pushed to DB http://testresults.opnfv.org/test/api/v1/results 2018-05-27 23:31:12,149 - xtesting.ci.run_tests - INFO - Test result: +-------------------+------------------+------------------+----------------+ | TEST CASE | PROJECT | DURATION | RESULT | +-------------------+------------------+------------------+----------------+ | vping_ssh | functest | 00:57 | PASS | +-------------------+------------------+------------------+----------------+ 2018-05-27 23:31:40,187 - xtesting.ci.run_tests - INFO - Running test case 'vping_userdata'... 
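The vping_ssh case recorded above generates an RSA keypair, boots 'opnfv-vping-2-ssh--…' with fixed address 192.168.130.10, attaches floating IP 172.30.10.123, and then drives the ping check over SSH before the PASS verdict. The snippet below is a minimal standalone sketch of that last step using paramiko; only the floating IP, the keypair and the 'cirros' login banner come from the log above, while the peer address to ping, the timeout and the '64 bytes from' success check are illustrative assumptions, not the functest implementation.

```python
"""Minimal sketch of an SSH-driven ping check in the spirit of vping_ssh.

Only the floating IP and the 'cirros' user are taken from the log above;
the peer address, timeout and success heuristic are assumptions.
"""
import io

import paramiko

FLOATING_IP = "172.30.10.123"   # floating IP attached to VM 2 (from the log)
PEER_IP = "192.168.130.4"       # fixed address of the VM to ping (placeholder)
PRIVATE_KEY_PEM = """-----BEGIN RSA PRIVATE KEY-----
... key material elided; use the generated vping keypair ...
-----END RSA PRIVATE KEY-----
"""


def ssh_ping(floating_ip, peer_ip, private_key_pem, username="cirros"):
    """SSH into the guest through its floating IP and ping the peer once."""
    pkey = paramiko.RSAKey.from_private_key(io.StringIO(private_key_pem))
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(hostname=floating_ip, username=username, pkey=pkey,
                   timeout=30)
    try:
        _, stdout, _ = client.exec_command("ping -c 1 -w 5 %s" % peer_ip)
        output = stdout.read().decode()
    finally:
        client.close()
    # Treat a reply line as success (heuristic, not the functest criterion).
    return "64 bytes from" in output, output


if __name__ == "__main__":
    passed, ping_output = ssh_ping(FLOATING_IP, PEER_IP, PRIVATE_KEY_PEM)
    print("PASS" if passed else "FAIL")
    print(ping_output)
```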
2018-05-27 23:31:41,162 - functest.opnfv_tests.openstack.vping.vping_base - DEBUG - ext_net: Munch({u'status': u'ACTIVE', u'subnets': [u'7b531737-77a7-41af-93de-a06ade38f99f'], u'description': u'', u'provider:physical_network': u'physnet1', u'tags': [], u'ipv6_address_scope': None, u'updated_at': u'2018-05-27T23:10:42Z', u'is_default': True, u'revision_number': 4, u'port_security_enabled': True, u'mtu': 1500, u'id': u'11c92fd4-326a-487a-a640-1b09c88fcb5b', u'provider:segmentation_id': None, u'router:external': True, u'availability_zone_hints': [], u'availability_zones': [u'nova'], u'name': u'floating_net', u'admin_state_up': True, u'tenant_id': u'17e0c72255804297b05647b8b64ec56a', u'created_at': u'2018-05-27T23:10:32Z', u'provider:network_type': u'flat', u'ipv4_address_scope': None, u'shared': False, u'project_id': u'17e0c72255804297b05647b8b64ec56a'}) 2018-05-27 23:31:41,163 - functest.opnfv_tests.openstack.vping.vping_userdata - INFO - Begin virtual environment setup 2018-05-27 23:31:41,163 - functest.opnfv_tests.openstack.vping.vping_userdata - INFO - vPing Start Time:'2018-05-27 23:31:41' 2018-05-27 23:31:41,163 - functest.opnfv_tests.openstack.vping.vping_userdata - INFO - Creating image with name: 'functest-vping--3313f109-1a04-49ad-b5fd-b41c930f2667' 2018-05-27 23:31:41,163 - functest.opnfv_tests.openstack.vping.vping_userdata - INFO - Image metadata: None 2018-05-27 23:31:42,760 - functest.opnfv_tests.openstack.vping.vping_userdata - DEBUG - image: Munch({u'status': u'queued', u'owner_specified.shade.object': u'images/functest-vping--3313f109-1a04-49ad-b5fd-b41c930f2667', u'tags': [], u'container_format': u'bare', u'min_ram': 0, u'updated_at': u'2018-05-27T23:31:42Z', u'owner_specified.shade.sha256': u'a8dd75ecffd4cdd96072d60c2237b448e0c8b2bc94d57f10fdbc8c481d9005b8', u'locations': [], u'min_disk': 0, u'visibility': u'shared', u'file': u'/v2/images/e4724bfd-a7c4-4466-8dff-ead367688e6c/file', u'owner': u'17e0c72255804297b05647b8b64ec56a', u'virtual_size': None, u'owner_specified.shade.md5': u'443b7623e27ecf03dc9e01ee93f67afe', u'id': u'e4724bfd-a7c4-4466-8dff-ead367688e6c', u'size': None, u'name': u'functest-vping--3313f109-1a04-49ad-b5fd-b41c930f2667', u'checksum': None, u'self': u'/v2/images/e4724bfd-a7c4-4466-8dff-ead367688e6c', u'disk_format': u'qcow2', u'protected': False, u'created_at': u'2018-05-27T23:31:42Z', u'schema': u'/v2/schemas/image'}) 2018-05-27 23:31:42,760 - functest.opnfv_tests.openstack.vping.vping_userdata - INFO - Creating network with name: 'vping-net-3313f109-1a04-49ad-b5fd-b41c930f2667' 2018-05-27 23:31:43,511 - functest.opnfv_tests.openstack.vping.vping_userdata - DEBUG - network: Munch({u'status': u'ACTIVE', u'subnets': [], u'description': u'', u'provider:physical_network': None, u'tags': [], u'ipv6_address_scope': None, u'updated_at': u'2018-05-27T23:31:43Z', u'is_default': False, u'revision_number': 2, u'port_security_enabled': True, u'provider:network_type': u'vxlan', u'id': u'6a9975d5-3be7-4a95-b30e-2b2c47a35df3', u'provider:segmentation_id': 26, u'router:external': False, u'availability_zone_hints': [], u'availability_zones': [], u'name': u'vping-net-3313f109-1a04-49ad-b5fd-b41c930f2667', u'admin_state_up': True, u'tenant_id': u'17e0c72255804297b05647b8b64ec56a', u'created_at': u'2018-05-27T23:31:43Z', u'mtu': 1450, u'ipv4_address_scope': None, u'shared': False, u'project_id': u'17e0c72255804297b05647b8b64ec56a'}) 2018-05-27 23:31:44,882 - functest.opnfv_tests.openstack.vping.vping_userdata - DEBUG - subnet: Munch({u'description': u'', u'tags': [], 
u'updated_at': u'2018-05-27T23:31:44Z', u'ipv6_ra_mode': None, u'allocation_pools': [{u'start': u'192.168.130.2', u'end': u'192.168.130.254'}], u'host_routes': [], u'revision_number': 0, u'ipv6_address_mode': None, u'cidr': u'192.168.130.0/24', u'id': u'6a31794e-39ec-440f-acc9-ec4a4170067a', u'subnetpool_id': None, u'service_types': [], u'name': u'vping-subnet-3313f109-1a04-49ad-b5fd-b41c930f2667', u'enable_dhcp': True, u'network_id': u'6a9975d5-3be7-4a95-b30e-2b2c47a35df3', u'tenant_id': u'17e0c72255804297b05647b8b64ec56a', u'created_at': u'2018-05-27T23:31:44Z', u'dns_nameservers': [u'8.8.8.8'], u'gateway_ip': u'192.168.130.1', u'ip_version': 4, u'project_id': u'17e0c72255804297b05647b8b64ec56a'}) 2018-05-27 23:31:44,882 - functest.opnfv_tests.openstack.vping.vping_userdata - INFO - Creating router with name: 'vping-router-3313f109-1a04-49ad-b5fd-b41c930f2667' 2018-05-27 23:31:47,528 - functest.opnfv_tests.openstack.vping.vping_userdata - DEBUG - router: Munch({u'status': u'ACTIVE', u'description': u'', u'tags': [], u'updated_at': u'2018-05-27T23:31:47Z', u'revision_number': 3, u'ha': False, u'id': u'4bf6ac22-856c-4e07-b99b-70f63ca500dc', u'external_gateway_info': {u'network_id': u'11c92fd4-326a-487a-a640-1b09c88fcb5b', u'enable_snat': True, u'external_fixed_ips': [{u'subnet_id': u'7b531737-77a7-41af-93de-a06ade38f99f', u'ip_address': u'172.30.10.119'}]}, u'availability_zone_hints': [], u'availability_zones': [], u'name': u'vping-router-3313f109-1a04-49ad-b5fd-b41c930f2667', u'admin_state_up': True, u'tenant_id': u'17e0c72255804297b05647b8b64ec56a', u'created_at': u'2018-05-27T23:31:45Z', u'distributed': False, u'flavor_id': None, u'routes': [], u'project_id': u'17e0c72255804297b05647b8b64ec56a'}) 2018-05-27 23:31:51,259 - functest.opnfv_tests.openstack.vping.vping_userdata - INFO - Creating flavor with name: 'vping-flavor-3313f109-1a04-49ad-b5fd-b41c930f2667' 2018-05-27 23:31:51,552 - functest.opnfv_tests.openstack.vping.vping_userdata - DEBUG - flavor: Munch({'name': u'vping-flavor-3313f109-1a04-49ad-b5fd-b41c930f2667', 'ephemeral': 0, 'ram': 512, 'is_disabled': False, 'properties': Munch({u'OS-FLV-DISABLED:disabled': False, u'OS-FLV-EXT-DATA:ephemeral': 0, u'os-flavor-access:is_public': True}), u'OS-FLV-DISABLED:disabled': False, 'vcpus': 1, 'extra_specs': Munch({}), 'location': Munch({'project': Munch({'domain_id': None, 'id': u'17e0c72255804297b05647b8b64ec56a', 'name': 'admin', 'domain_name': 'Default'}), 'zone': None, 'region_name': 'RegionOne', 'cloud': 'envvars'}), u'os-flavor-access:is_public': True, 'rxtx_factor': 1.0, 'is_public': True, u'OS-FLV-EXT-DATA:ephemeral': 0, 'disk': 1, 'id': u'abd57bc4-92bc-4c44-90d2-1ebe133b6890', 'swap': 0}) 2018-05-27 23:31:52,855 - functest.opnfv_tests.openstack.vping.vping_userdata - INFO - Creating VM 1 instance with name: 'opnfv-vping-1-3313f109-1a04-49ad-b5fd-b41c930f2667' 2018-05-27 23:32:06,725 - functest.opnfv_tests.openstack.vping.vping_userdata - DEBUG - vm1: Munch({'vm_state': u'active', u'OS-EXT-STS:task_state': None, 'addresses': Munch({u'vping-net-3313f109-1a04-49ad-b5fd-b41c930f2667': [Munch({u'OS-EXT-IPS-MAC:mac_addr': u'fa:16:3e:63:23:2b', u'version': 4, u'addr': u'192.168.130.7', u'OS-EXT-IPS:type': u'fixed'})]}), 'terminated_at': None, 'image': Munch({u'id': u'e4724bfd-a7c4-4466-8dff-ead367688e6c'}), u'OS-EXT-AZ:availability_zone': u'nova', u'OS-EXT-STS:vm_state': u'active', u'OS-EXT-SRV-ATTR:instance_name': u'instance-00000026', u'OS-SRV-USG:launched_at': u'2018-05-27T23:32:04.000000', 'flavor': Munch({u'id': 
u'abd57bc4-92bc-4c44-90d2-1ebe133b6890'}), 'az': u'nova', 'id': u'f944531a-5bbe-4a41-89eb-f5ed8754fd03', 'security_groups': [Munch({u'name': u'vping-sg-3313f109-1a04-49ad-b5fd-b41c930f2667'})], u'os-extended-volumes:volumes_attached': [], 'user_id': u'37b0f068b13143159c3fbcc2c9fc04f8', 'disk_config': u'MANUAL', u'OS-DCF:diskConfig': u'MANUAL', 'networks': {}, 'accessIPv4': '', 'accessIPv6': '', 'location': Munch({'project': Munch({'domain_id': None, 'id': u'17e0c72255804297b05647b8b64ec56a', 'name': 'admin', 'domain_name': 'Default'}), 'zone': u'nova', 'region_name': 'RegionOne', 'cloud': 'envvars'}), 'power_state': 1, 'public_v4': '', 'progress': 0, u'OS-EXT-STS:power_state': 1, 'interface_ip': '', 'launched_at': u'2018-05-27T23:32:04.000000', 'metadata': Munch({}), 'status': u'ACTIVE', 'updated': u'2018-05-27T23:32:05Z', 'hostId': u'aef28b520aac3b35ae2ae4ee558f0576807f611be54858527d27fcd6', u'OS-EXT-SRV-ATTR:host': u'cmp001', u'OS-SRV-USG:terminated_at': None, 'key_name': None, 'public_v6': '', 'private_v4': u'192.168.130.7', 'cloud': 'envvars', 'host_id': u'aef28b520aac3b35ae2ae4ee558f0576807f611be54858527d27fcd6', 'task_state': None, 'properties': Munch({u'OS-EXT-STS:task_state': None, u'OS-EXT-SRV-ATTR:host': u'cmp001', u'OS-SRV-USG:terminated_at': None, u'OS-DCF:diskConfig': u'MANUAL', u'os-extended-volumes:volumes_attached': [], u'OS-EXT-STS:vm_state': u'active', u'OS-EXT-SRV-ATTR:instance_name': u'instance-00000026', u'OS-SRV-USG:launched_at': u'2018-05-27T23:32:04.000000', u'OS-EXT-SRV-ATTR:hypervisor_hostname': u'cmp001.mcp-pike-ovs-ha.local', u'OS-EXT-STS:power_state': 1, u'OS-EXT-AZ:availability_zone': u'nova'}), 'project_id': u'17e0c72255804297b05647b8b64ec56a', u'OS-EXT-SRV-ATTR:hypervisor_hostname': u'cmp001.mcp-pike-ovs-ha.local', 'name': u'opnfv-vping-1-3313f109-1a04-49ad-b5fd-b41c930f2667', 'adminPass': u'9xBWEpP8KhxJ', 'tenant_id': u'17e0c72255804297b05647b8b64ec56a', 'created_at': u'2018-05-27T23:31:55Z', 'created': u'2018-05-27T23:31:55Z', 'has_config_drive': True, 'volumes': [], 'config_drive': u'True', 'region': 'RegionOne'}) 2018-05-27 23:32:09,728 - functest.opnfv_tests.openstack.vping.vping_userdata - DEBUG - vm1 console: [ 0.000000] Initializing cgroup subsys cpuset [ 0.000000] Initializing cgroup subsys cpu [ 0.000000] Initializing cgroup subsys cpuacct [ 0.000000] Linux version 4.4.0-28-generic (buildd@lcy01-13) (gcc version 5.3.1 20160413 (Ubuntu 5.3.1-14ubuntu2.1) ) #47-Ubuntu SMP Fri Jun 24 10:09:13 UTC 2016 (Ubuntu 4.4.0-28.47-generic 4.4.13) [ 0.000000] Command line: LABEL=cirros-rootfs ro console=tty1 console=ttyS0 [ 0.000000] KERNEL supported cpus: [ 0.000000] Intel GenuineIntel [ 0.000000] AMD AuthenticAMD [ 0.000000] Centaur CentaurHauls [ 0.000000] x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 [ 0.000000] x86/fpu: Supporting XSAVE feature 0x01: 'x87 floating point registers' [ 0.000000] x86/fpu: Supporting XSAVE feature 0x02: 'SSE registers' [ 0.000000] x86/fpu: Supporting XSAVE feature 0x04: 'AVX registers' [ 0.000000] x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format. [ 0.000000] x86/fpu: Using 'eager' FPU context switches. 
[ 0.000000] e820: BIOS-provided physical RAM map: [ 0.000000] BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable [ 0.000000] BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved [ 0.000000] BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved [ 0.000000] BIOS-e820: [mem 0x0000000000100000-0x000000001ffd9fff] usable [ 0.000000] BIOS-e820: [mem 0x000000001ffda000-0x000000001fffffff] reserved [ 0.000000] BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved [ 0.000000] BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved [ 0.000000] NX (Execute Disable) protection: active [ 0.000000] SMBIOS 2.8 present. [ 0.000000] Hypervisor detected: KVM [ 0.000000] e820: last_pfn = 0x1ffda max_arch_pfn = 0x400000000 [ 0.000000] x86/PAT: Configuration [0-7]: WB WC UC- UC WB WC UC- WT [ 0.000000] found SMP MP-table at [mem 0x000f6a40-0x000f6a4f] mapped at [ffff8800000f6a40] [ 0.000000] Scanning 1 areas for low memory corruption [ 0.000000] Using GB pages for direct mapping [ 0.000000] RAMDISK: [mem 0x1fb14000-0x1ffc9fff] [ 0.000000] ACPI: Early table checksum verification disabled [ 0.000000] ACPI: RSDP 0x00000000000F6830 000014 (v00 BOCHS ) [ 0.000000] ACPI: RSDT 0x000000001FFE1591 00002C (v01 BOCHS BXPCRSDT 00000001 BXPC 00000001) [ 0.000000] ACPI: FACP 0x000000001FFE1425 000074 (v01 BOCHS BXPCFACP 00000001 BXPC 00000001) [ 0.000000] ACPI: DSDT 0x000000001FFE0040 0013E5 (v01 BOCHS BXPCDSDT 00000001 BXPC 00000001) [ 0.000000] ACPI: FACS 0x000000001FFE0000 000040 [ 0.000000] ACPI: APIC 0x000000001FFE1519 000078 (v01 BOCHS BXPCAPIC 00000001 BXPC 00000001) [ 0.000000] No NUMA configuration found [ 0.000000] Faking a node at [mem 0x0000000000000000-0x000000001ffd9fff] [ 0.000000] NODE_DATA(0) allocated [mem 0x1ffd5000-0x1ffd9fff] [ 0.000000] kvm-clock: Using msrs 4b564d01 and 4b564d00 [ 0.000000] kvm-clock: cpu 0, msr 0:1ffd1001, primary cpu clock [ 0.000000] kvm-clock: using sched offset of 937622797 cycles [ 0.000000] clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns [ 0.000000] Zone ranges: [ 0.000000] DMA [mem 0x0000000000001000-0x0000000000ffffff] [ 0.000000] DMA32 [mem 0x0000000001000000-0x000000001ffd9fff] [ 0.000000] Normal empty [ 0.000000] Device empty [ 0.000000] Movable zone start for each node [ 0.000000] Early memory node ranges [ 0.000000] node 0: [mem 0x0000000000001000-0x000000000009efff] [ 0.000000] node 0: [mem 0x0000000000100000-0x000000001ffd9fff] [ 0.000000] Initmem setup node 0 [mem 0x0000000000001000-0x000000001ffd9fff] [ 0.000000] ACPI: PM-Timer IO Port: 0x608 [ 0.000000] ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) [ 0.000000] IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) [ 0.000000] Using ACPI (MADT) for SMP configuration information [ 0.000000] smpboot: Allowing 1 CPUs, 0 hotplug CPUs [ 0.000000] PM: Registered nosave memory: [mem 0x00000000-0x00000fff] [ 0.000000] PM: Registered nosave memory: [mem 0x0009f000-0x0009ffff] [ 0.000000] PM: Registered nosave memory: [mem 0x000a0000-0x000effff] [ 0.000000] PM: Registered nosave memory: [mem 0x000f0000-0x000fffff] [ 0.000000] e820: [mem 
0x20000000-0xfeffbfff] available for PCI devices [ 0.000000] Booting paravirtualized kernel on KVM [ 0.000000] clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 7645519600211568 ns [ 0.000000] setup_percpu: NR_CPUS:256 nr_cpumask_bits:256 nr_cpu_ids:1 nr_node_ids:1 [ 0.000000] PERCPU: Embedded 33 pages/cpu @ffff88001f800000 s98008 r8192 d28968 u2097152 [ 0.000000] KVM setup async PF for cpu 0 [ 0.000000] kvm-stealtime: cpu 0, msr 1f80d940 [ 0.000000] PV qspinlock hash table entries: 256 (order: 0, 4096 bytes) [ 0.000000] Built 1 zonelists in Node order, mobility grouping on. Total pages: 128867 [ 0.000000] Policy zone: DMA32 [ 0.000000] Kernel command line: LABEL=cirros-rootfs ro console=tty1 console=ttyS0 [ 0.000000] PID hash table entries: 2048 (order: 2, 16384 bytes) [ 0.000000] Memory: 491788K/523744K available (8368K kernel code, 1280K rwdata, 3928K rodata, 1480K init, 1292K bss, 31956K reserved, 0K cma-reserved) [ 0.000000] SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=1, Nodes=1 [ 0.000000] Hierarchical RCU implementation. [ 0.000000] Build-time adjustment of leaf fanout to 64. [ 0.000000] RCU restricting CPUs from NR_CPUS=256 to nr_cpu_ids=1. [ 0.000000] RCU: Adjusting geometry for rcu_fanout_leaf=64, nr_cpu_ids=1 [ 0.000000] NR_IRQS:16640 nr_irqs:256 16 [ 0.000000] Console: colour VGA+ 80x25 [ 0.000000] console [tty1] enabled [ 0.000000] console [ttyS0] enabled [ 0.000000] tsc: Detected 3491.914 MHz processor [ 0.281544] Calibrating delay loop (skipped) preset value.. 6983.82 BogoMIPS (lpj=13967656) [ 0.291119] pid_max: default: 32768 minimum: 301 [ 0.292938] ACPI: Core revision 20150930 [ 0.297220] ACPI: 1 ACPI AML tables successfully acquired and loaded [ 0.299845] Security Framework initialized [ 0.301487] Yama: becoming mindful. [ 0.302931] AppArmor: AppArmor initialized [ 0.304709] Dentry cache hash table entries: 65536 (order: 7, 524288 bytes) [ 0.307646] Inode-cache hash table entries: 32768 (order: 6, 262144 bytes) [ 0.310310] Mount-cache hash table entries: 1024 (order: 1, 8192 bytes) [ 0.312795] Mountpoint-cache hash table entries: 1024 (order: 1, 8192 bytes) [ 0.315692] Initializing cgroup subsys io [ 0.317378] Initializing cgroup subsys memory [ 0.319238] Initializing cgroup subsys devices [ 0.320985] Initializing cgroup subsys freezer [ 0.322879] Initializing cgroup subsys net_cls [ 0.324680] Initializing cgroup subsys perf_event [ 0.326650] Initializing cgroup subsys net_prio [ 0.328628] Initializing cgroup subsys hugetlb [ 0.330472] Initializing cgroup subsys pids [ 0.331960] CPU: Physical Processor ID: 0 [ 0.333899] mce: CPU supports 10 MCE banks [ 0.335214] Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 [ 0.336778] Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 [ 0.356437] Freeing SMP alternatives memory: 28K (ffffffff820b4000 - ffffffff820bb000) [ 0.366967] ftrace: allocating 31920 entries in 125 pages [ 0.419890] smpboot: Max logical packages: 1 [ 0.421197] smpboot: APIC(0) Converting physical 0 to logical package 0 [ 0.423302] x2apic enabled [ 0.424623] Switched APIC routing to physical x2apic. [ 0.427463] ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 [ 0.429173] smpboot: CPU0: Intel(R) Xeon(R) CPU E5-2637 v3 @ 3.50GHz (family: 0x6, model: 0x3f, stepping: 0x2) [ 0.432308] Performance Events: 16-deep LBR, Haswell events, Intel PMU driver. [ 0.434833] ... version: 2 [ 0.436068] ... bit width: 48 [ 0.437316] ... generic registers: 4 [ 0.438543] ... value mask: 0000ffffffffffff [ 0.440093] ... 
max period: 000000007fffffff [ 0.441634] ... fixed-purpose events: 3 [ 0.442880] ... event mask: 000000070000000f [ 0.444465] KVM setup paravirtual spinlock [ 0.446410] x86: Booted up 1 node, 1 CPUs [ 0.447680] smpboot: Total of 1 processors activated (6983.82 BogoMIPS) [ 0.449894] devtmpfs: initialized [ 0.462049] evm: security.selinux [ 0.463156] evm: security.SMACK64 [ 0.464253] evm: security.SMACK64EXEC [ 0.465419] evm: security.SMACK64TRANSMUTE [ 0.466696] evm: security.SMACK64MMAP [ 0.467876] evm: security.ima [ 0.468894] evm: security.capability [ 0.470240] clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 7645041785100000 ns [ 0.473156] pinctrl core: initialized pinctrl subsystem [ 0.474889] RTC time: 23:32:05, date: 05/27/18 [ 0.476391] NET: Registered protocol family 16 [ 0.478021] cpuidle: using governor ladder [ 0.479317] cpuidle: using governor menu [ 0.480663] PCCT header not found. [ 0.481892] ACPI: bus type PCI registered [ 0.483147] acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 [ 0.485152] PCI: Using configuration type 1 for base access [ 0.486854] core: PMU erratum BJ122, BV98, HSD29 workaround disabled, HT off [ 0.491133] ACPI: Added _OSI(Module Device) [ 0.492450] ACPI: Added _OSI(Processor Device) [ 0.493811] ACPI: Added _OSI(3.0 _SCP Extensions) [ 0.495222] ACPI: Added _OSI(Processor Aggregator Device) [ 0.499503] ACPI: Interpreter enabled [ 0.500701] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S1_] (20150930/hwxface-580) [ 0.503622] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S2_] (20150930/hwxface-580) [ 0.506547] ACPI: (supports S0 S3 S4 S5) [ 0.507802] ACPI: Using IOAPIC for interrupt routing [ 0.509311] PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug [ 0.517197] ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) [ 0.518984] acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI] [ 0.520918] acpi PNP0A03:00: _OSC failed (AE_NOT_FOUND); disabling ASPM [ 0.522793] acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended PCI configuration space under this bridge. 
[ 0.526616] acpiphp: Slot [3] registered [ 0.527915] acpiphp: Slot [4] registered [ 0.529194] acpiphp: Slot [5] registered [ 0.530471] acpiphp: Slot [6] registered [ 0.531746] acpiphp: Slot [7] registered [ 0.533021] acpiphp: Slot [8] registered [ 0.534295] acpiphp: Slot [9] registered [ 0.535577] acpiphp: Slot [10] registered [ 0.536866] acpiphp: Slot [11] registered [ 0.555084] acpiphp: Slot [12] registered [ 0.556378] acpiphp: Slot [13] registered [ 0.557671] acpiphp: Slot [14] registered [ 0.558968] acpiphp: Slot [15] registered [ 0.560265] acpiphp: Slot [16] registered [ 0.561546] acpiphp: Slot [17] registered [ 0.562830] acpiphp: Slot [18] registered [ 0.564114] acpiphp: Slot [19] registered [ 0.565403] acpiphp: Slot [20] registered [ 0.566689] acpiphp: Slot [21] registered [ 0.567984] acpiphp: Slot [22] registered [ 0.569276] acpiphp: Slot [23] registered [ 0.570562] acpiphp: Slot [24] registered [ 0.571860] acpiphp: Slot [25] registered [ 0.573142] acpiphp: Slot [26] registered [ 0.574420] acpiphp: Slot [27] registered [ 0.575715] acpiphp: Slot [28] registered [ 0.577005] acpiphp: Slot [29] registered [ 0.578286] acpiphp: Slot [30] registered [ 0.579568] acpiphp: Slot [31] registered [ 0.580832] PCI host bridge to bus 0000:00 [ 0.582101] pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] [ 0.583985] pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] [ 0.585876] pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] [ 0.588164] pci_bus 0000:00: root bus resource [mem 0x20000000-0xfebfffff window] [ 0.590433] pci_bus 0000:00: root bus resource [mem 0x100000000-0x17fffffff window] [ 0.592754] pci_bus 0000:00: root bus resource [bus 00-ff] [ 0.598439] pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] [ 0.600405] pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6] [ 0.602241] pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] [ 0.604216] pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376] [ 0.609226] pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI [ 0.611467] pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB [ 0.649692] ACPI: PCI Interrupt Link [LNKA] (IRQs 5 *10 11) [ 0.652083] ACPI: PCI Interrupt Link [LNKB] (IRQs 5 *10 11) [ 0.654403] ACPI: PCI Interrupt Link [LNKC] (IRQs 5 10 *11) [ 0.656736] ACPI: PCI Interrupt Link [LNKD] (IRQs 5 10 *11) [ 0.658968] ACPI: PCI Interrupt Link [LNKS] (IRQs *9) [ 0.661007] ACPI: Enabled 2 GPEs in block 00 to 0F [ 0.662819] vgaarb: setting as boot device: PCI:0000:00:02.0 [ 0.664461] vgaarb: device added: PCI:0000:00:02.0,decodes=io+mem,owns=io+mem,locks=none [ 0.666892] vgaarb: loaded [ 0.667871] vgaarb: bridge control possible 0000:00:02.0 [ 0.669807] SCSI subsystem initialized [ 0.671102] ACPI: bus type USB registered [ 0.672387] usbcore: registered new interface driver usbfs [ 0.674002] usbcore: registered new interface driver hub [ 0.675576] usbcore: registered new device driver usb [ 0.677268] PCI: Using ACPI for IRQ routing [ 0.678941] NetLabel: Initializing [ 0.680076] NetLabel: domain hash size = 128 [ 0.681409] NetLabel: protocols = UNLABELED CIPSOv4 [ 0.682899] NetLabel: unlabeled traffic allowed by default [ 0.684795] clocksource: Switched to clocksource kvm-clock [ 0.699612] AppArmor: AppArmor Filesystem Enabled [ 0.701108] pnp: PnP ACPI init [ 0.702813] pnp: PnP ACPI: found 5 devices [ 0.711320] clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns [ 0.714001] NET: Registered protocol family 2 [ 
0.715541] TCP established hash table entries: 4096 (order: 3, 32768 bytes) [ 0.717524] TCP bind hash table entries: 4096 (order: 4, 65536 bytes) [ 0.719368] TCP: Hash tables configured (established 4096 bind 4096) [ 0.721178] UDP hash table entries: 256 (order: 1, 8192 bytes) [ 0.722862] UDP-Lite hash table entries: 256 (order: 1, 8192 bytes) [ 0.724695] NET: Registered protocol family 1 [ 0.726053] pci 0000:00:00.0: Limiting direct PCI/PCI transfers [ 0.727755] pci 0000:00:01.0: PIIX3: Enabling Passive Release [ 0.729420] pci 0000:00:01.0: Activating ISA DMA hang workarounds [ 0.788510] ACPI: PCI Interrupt Link [LNKD] enabled at IRQ 11 [ 0.847810] Trying to unpack rootfs image as initramfs... [ 0.968627] Freeing initrd memory: 4824K (ffff88001fb14000 - ffff88001ffca000) [ 0.971204] Scanning for low memory corruption every 60 seconds [ 0.973354] futex hash table entries: 256 (order: 2, 16384 bytes) [ 0.975112] audit: initializing netlink subsys (disabled) [ 0.976705] audit: type=2000 audit(1527463926.229:1): initialized [ 0.978943] Initialise system trusted keyring [ 0.980384] HugeTLB registered 1 GB page size, pre-allocated 0 pages [ 0.982193] HugeTLB registered 2 MB page size, pre-allocated 0 pages [ 0.986896] zbud: loaded [ 0.988163] VFS: Disk quotas dquot_6.6.0 [ 0.989471] VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) [ 0.992212] fuse init (API version 7.23) [ 0.993678] Key type big_key registered [ 0.994925] Allocating IMA MOK and blacklist keyrings. [ 0.996750] Key type asymmetric registered [ 0.998058] Asymmetric key parser 'x509' registered [ 0.999591] Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) [ 1.002206] io scheduler noop registered [ 1.003433] io scheduler deadline registered (default) [ 1.005000] io scheduler cfq registered [ 1.006335] pci_hotplug: PCI Hot Plug PCI Core version: 0.5 [ 1.007927] pciehp: PCI Express Hot Plug Controller Driver version: 0.4 [ 1.009893] input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0 [ 1.012128] ACPI: Power Button [PWRF] [ 1.013460] GHES: HEST is not enabled! [ 1.072018] ACPI: PCI Interrupt Link [LNKC] enabled at IRQ 10 [ 1.190544] ACPI: PCI Interrupt Link [LNKA] enabled at IRQ 10 [ 1.249540] ACPI: PCI Interrupt Link [LNKB] enabled at IRQ 11 [ 1.252770] Serial: 8250/16550 driver, 32 ports, IRQ sharing enabled [ 1.278567] 00:04: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A [ 1.282824] Linux agpgart interface v0.103 [ 1.287198] brd: module loaded [ 1.289642] loop: module loaded [ 1.296882] GPT:Primary header thinks Alt. header is not at the end of the disk. [ 1.299122] GPT:90111 != 2097151 [ 1.300188] GPT:Alternate GPT header not at the end of the disk. [ 1.301863] GPT:90111 != 2097151 [ 1.302921] GPT: Use GNU Parted to correct GPT errors. 
[ 1.304434] vda: vda1 vda15 [ 1.307436] vdb: [ 1.309238] scsi host0: ata_piix [ 1.310413] scsi host1: ata_piix [ 1.311545] ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc0e0 irq 14 [ 1.313403] ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc0e8 irq 15 [ 1.315917] libphy: Fixed MDIO Bus: probed [ 1.317190] tun: Universal TUN/TAP device driver, 1.6 [ 1.318658] tun: (C) 1999-2004 Max Krasnyansky [ 1.321802] PPP generic driver version 2.4.2 [ 1.323191] ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver [ 1.325018] ehci-pci: EHCI PCI platform driver [ 1.326371] ehci-platform: EHCI generic platform driver [ 1.328264] ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver [ 1.330487] ohci-pci: OHCI PCI platform driver [ 1.332190] ohci-platform: OHCI generic platform driver [ 1.334123] uhci_hcd: USB Universal Host Controller Interface driver [ 1.393830] uhci_hcd 0000:00:01.2: UHCI Host Controller [ 1.395736] uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1 [ 1.398553] uhci_hcd 0000:00:01.2: detected 2 ports [ 1.400431] uhci_hcd 0000:00:01.2: irq 11, io base 0x0000c080 [ 1.402612] usb usb1: New USB device found, idVendor=1d6b, idProduct=0001 [ 1.405023] usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1 [ 1.407820] usb usb1: Product: UHCI Host Controller [ 1.409660] usb usb1: Manufacturer: Linux 4.4.0-28-generic uhci_hcd [ 1.411935] usb usb1: SerialNumber: 0000:00:01.2 [ 1.413833] hub 1-0:1.0: USB hub found [ 1.415325] hub 1-0:1.0: 2 ports detected [ 1.417162] i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 [ 1.421110] serio: i8042 KBD port at 0x60,0x64 irq 1 [ 1.422953] serio: i8042 AUX port at 0x60,0x64 irq 12 [ 1.424981] mousedev: PS/2 mouse device common for all mice [ 1.427389] input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1 [ 1.430722] rtc_cmos 00:00: RTC can wake from S4 [ 1.435020] rtc_cmos 00:00: rtc core: registered rtc_cmos as rtc0 [ 1.437366] rtc_cmos 00:00: alarms up to one day, y3k, 114 bytes nvram [ 1.439733] i2c /dev entries driver [ 1.441235] device-mapper: uevent: version 1.0.3 [ 1.443113] device-mapper: ioctl: 4.34.0-ioctl (2015-10-28) initialised: dm-devel@redhat.com [ 1.446401] ledtrig-cpu: registered to indicate activity on CPUs [ 1.449439] NET: Registered protocol family 10 [ 1.451410] NET: Registered protocol family 17 [ 1.453139] Key type dns_resolver registered [ 1.455027] microcode: CPU0 sig=0x306f2, pf=0x1, revision=0x1 [ 1.457178] microcode: Microcode Update Driver: v2.01 , Peter Oruba [ 1.460649] registered taskstats version 1 [ 1.462283] Loading compiled-in X.509 certificates [ 1.464929] Loaded X.509 cert 'Build time autogenerated kernel key: 6ea974e07bd0b30541f4d838a3b7a8a80d5ca9af' [ 1.468586] zswap: loaded using pool lzo/zbud [ 1.471788] Key type trusted registered [ 1.475367] Key type encrypted registered [ 1.476975] AppArmor: AppArmor sha1 policy hashing enabled [ 1.479004] ima: No TPM chip found, activating TPM-bypass! [ 1.481088] evm: HMAC attrs: 0x1 [ 1.500227] Magic number: 2:941:555 [ 1.501806] rtc_cmos 00:00: setting system clock to 2018-05-27 23:32:06 UTC (1527463926) [ 1.504941] BIOS EDD facility v0.16 2004-Jun-25, 0 devices found [ 1.507092] EDD information not available. 
[ 1.512981] Freeing unused kernel memory: 1480K (ffffffff81f42000 - ffffffff820b4000) [ 1.515999] Write protecting the kernel read-only data: 14336k [ 1.519168] Freeing unused kernel memory: 1860K (ffff88000182f000 - ffff880001a00000) [ 1.522808] Freeing unused kernel memory: 168K (ffff880001dd6000 - ffff880001e00000) info: initramfs: up at 1.17 modprobe: module virtio_pci not found in modules.dep modprobe: module virtio_blk not found in modules.dep modprobe: module virtio_net not found in modules.dep modprobe: module vfat not found in modules.dep modprobe: module nls_cp437 not found in modules.dep info: copying initramfs to /dev/vda1 info: initramfs loading root from /dev/vda1 info: /etc/init.d/rc.sysinit: up at 2.38 info: container: none Starting logging: OK modprobe: module virtio_pci not found in modules.dep modprobe: module virtio_blk not found in modules.dep modprobe: module virtio_net not found in modules.dep modprobe: module vfat not found in modules.dep modprobe: module nls_cp437 not found in modules.dep WARN: /etc/rc3.d/S10-load-modules failed Initializing random number generator... [ 2.807202] random: dd urandom read with 6 bits of entropy available done. Starting acpid: OK Starting network... udhcpc (v1.23.2) started Sending discover... Sending select for 192.168.130.7... Lease of 192.168.130.7 obtained, lease time 600 route: SIOCADDRT: File exists WARN: failed: route add -net "0.0.0.0/0" gw "192.168.130.1" Top of dropbear init script Starting dropbear sshd: OK GROWROOT: CHANGED: partition=1 start=18432 old: size=71647 end=90079 new: size=2078687,end=2097119 no userdata for datasource === system information === Platform: OpenStack Foundation OpenStack Nova Container: none Arch: x86_64 CPU(s): 1 @ 3491.914 MHz Cores/Sockets/Threads: 1/1/1 Virt-type: VT-x RAM Size: 488MB Disks: 2018-05-27 23:32:09,729 - functest.opnfv_tests.openstack.vping.vping_userdata - INFO - Creating VM 2 instance with name: 'opnfv-vping-2-userdata--3313f109-1a04-49ad-b5fd-b41c930f2667' 2018-05-27 23:32:25,575 - functest.opnfv_tests.openstack.vping.vping_userdata - DEBUG - vm2: Munch({'vm_state': u'active', u'OS-EXT-STS:task_state': None, 'addresses': Munch({u'vping-net-3313f109-1a04-49ad-b5fd-b41c930f2667': [Munch({u'OS-EXT-IPS-MAC:mac_addr': u'fa:16:3e:db:43:b8', u'version': 4, u'addr': u'192.168.130.5', u'OS-EXT-IPS:type': u'fixed'})]}), 'terminated_at': None, 'image': Munch({u'id': u'e4724bfd-a7c4-4466-8dff-ead367688e6c'}), u'OS-EXT-AZ:availability_zone': u'nova', u'OS-EXT-STS:vm_state': u'active', u'OS-EXT-SRV-ATTR:instance_name': u'instance-00000029', u'OS-SRV-USG:launched_at': u'2018-05-27T23:32:23.000000', 'flavor': Munch({u'id': u'abd57bc4-92bc-4c44-90d2-1ebe133b6890'}), 'az': u'nova', 'id': u'2a2a7795-f062-4e27-b5c6-58040af84005', 'security_groups': [Munch({u'name': u'default'})], u'os-extended-volumes:volumes_attached': [], 'user_id': u'37b0f068b13143159c3fbcc2c9fc04f8', 'disk_config': u'MANUAL', u'OS-DCF:diskConfig': u'MANUAL', 'networks': {}, 'accessIPv4': '', 'accessIPv6': '', 'location': Munch({'project': Munch({'domain_id': None, 'id': u'17e0c72255804297b05647b8b64ec56a', 'name': 'admin', 'domain_name': 'Default'}), 'zone': u'nova', 'region_name': 'RegionOne', 'cloud': 'envvars'}), 'power_state': 1, 'public_v4': '', 'progress': 0, u'OS-EXT-STS:power_state': 1, 'interface_ip': '', 'launched_at': u'2018-05-27T23:32:23.000000', 'metadata': Munch({}), 'status': u'ACTIVE', 'updated': u'2018-05-27T23:32:24Z', 'hostId': u'aef28b520aac3b35ae2ae4ee558f0576807f611be54858527d27fcd6', 
u'OS-EXT-SRV-ATTR:host': u'cmp001', u'OS-SRV-USG:terminated_at': None, 'key_name': None, 'public_v6': '', 'private_v4': u'192.168.130.5', 'cloud': 'envvars', 'host_id': u'aef28b520aac3b35ae2ae4ee558f0576807f611be54858527d27fcd6', 'task_state': None, 'properties': Munch({u'OS-EXT-STS:task_state': None, u'OS-EXT-SRV-ATTR:host': u'cmp001', u'OS-SRV-USG:terminated_at': None, u'OS-DCF:diskConfig': u'MANUAL', u'os-extended-volumes:volumes_attached': [], u'OS-EXT-STS:vm_state': u'active', u'OS-EXT-SRV-ATTR:instance_name': u'instance-00000029', u'OS-SRV-USG:launched_at': u'2018-05-27T23:32:23.000000', u'OS-EXT-SRV-ATTR:hypervisor_hostname': u'cmp001.mcp-pike-ovs-ha.local', u'OS-EXT-STS:power_state': 1, u'OS-EXT-AZ:availability_zone': u'nova'}), 'project_id': u'17e0c72255804297b05647b8b64ec56a', u'OS-EXT-SRV-ATTR:hypervisor_hostname': u'cmp001.mcp-pike-ovs-ha.local', 'name': u'opnfv-vping-2-userdata--3313f109-1a04-49ad-b5fd-b41c930f2667', 'adminPass': u'xsV3BxA8Rbsq', 'tenant_id': u'17e0c72255804297b05647b8b64ec56a', 'created_at': u'2018-05-27T23:32:12Z', 'created': u'2018-05-27T23:32:12Z', 'has_config_drive': True, 'volumes': [], 'config_drive': u'True', 'region': 'RegionOne'}) 2018-05-27 23:32:29,031 - functest.opnfv_tests.openstack.vping.vping_userdata - DEBUG - vm2 console: [ 0.000000] Initializing cgroup subsys cpuset [ 0.000000] Initializing cgroup subsys cpu [ 0.000000] Initializing cgroup subsys cpuacct [ 0.000000] Linux version 4.4.0-28-generic (buildd@lcy01-13) (gcc version 5.3.1 20160413 (Ubuntu 5.3.1-14ubuntu2.1) ) #47-Ubuntu SMP Fri Jun 24 10:09:13 UTC 2016 (Ubuntu 4.4.0-28.47-generic 4.4.13) [ 0.000000] Command line: LABEL=cirros-rootfs ro console=tty1 console=ttyS0 [ 0.000000] KERNEL supported cpus: [ 0.000000] Intel GenuineIntel [ 0.000000] AMD AuthenticAMD [ 0.000000] Centaur CentaurHauls [ 0.000000] x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 [ 0.000000] x86/fpu: Supporting XSAVE feature 0x01: 'x87 floating point registers' [ 0.000000] x86/fpu: Supporting XSAVE feature 0x02: 'SSE registers' [ 0.000000] x86/fpu: Supporting XSAVE feature 0x04: 'AVX registers' [ 0.000000] x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format. [ 0.000000] x86/fpu: Using 'eager' FPU context switches. [ 0.000000] e820: BIOS-provided physical RAM map: [ 0.000000] BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable [ 0.000000] BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved [ 0.000000] BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved [ 0.000000] BIOS-e820: [mem 0x0000000000100000-0x000000001ffd9fff] usable [ 0.000000] BIOS-e820: [mem 0x000000001ffda000-0x000000001fffffff] reserved [ 0.000000] BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved [ 0.000000] BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved [ 0.000000] NX (Execute Disable) protection: active [ 0.000000] SMBIOS 2.8 present. 
[ 0.000000] Hypervisor detected: KVM [ 0.000000] e820: last_pfn = 0x1ffda max_arch_pfn = 0x400000000 [ 0.000000] x86/PAT: Configuration [0-7]: WB WC UC- UC WB WC UC- WT [ 0.000000] found SMP MP-table at [mem 0x000f6a40-0x000f6a4f] mapped at [ffff8800000f6a40] [ 0.000000] Scanning 1 areas for low memory corruption [ 0.000000] Using GB pages for direct mapping [ 0.000000] RAMDISK: [mem 0x1fb14000-0x1ffc9fff] [ 0.000000] ACPI: Early table checksum verification disabled [ 0.000000] ACPI: RSDP 0x00000000000F6830 000014 (v00 BOCHS ) [ 0.000000] ACPI: RSDT 0x000000001FFE1591 00002C (v01 BOCHS BXPCRSDT 00000001 BXPC 00000001) [ 0.000000] ACPI: FACP 0x000000001FFE1425 000074 (v01 BOCHS BXPCFACP 00000001 BXPC 00000001) [ 0.000000] ACPI: DSDT 0x000000001FFE0040 0013E5 (v01 BOCHS BXPCDSDT 00000001 BXPC 00000001) [ 0.000000] ACPI: FACS 0x000000001FFE0000 000040 [ 0.000000] ACPI: APIC 0x000000001FFE1519 000078 (v01 BOCHS BXPCAPIC 00000001 BXPC 00000001) [ 0.000000] No NUMA configuration found [ 0.000000] Faking a node at [mem 0x0000000000000000-0x000000001ffd9fff] [ 0.000000] NODE_DATA(0) allocated [mem 0x1ffd5000-0x1ffd9fff] [ 0.000000] kvm-clock: Using msrs 4b564d01 and 4b564d00 [ 0.000000] kvm-clock: cpu 0, msr 0:1ffd1001, primary cpu clock [ 0.000000] kvm-clock: using sched offset of 937622797 cycles [ 0.000000] clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns [ 0.000000] Zone ranges: [ 0.000000] DMA [mem 0x0000000000001000-0x0000000000ffffff] [ 0.000000] DMA32 [mem 0x0000000001000000-0x000000001ffd9fff] [ 0.000000] Normal empty [ 0.000000] Device empty [ 0.000000] Movable zone start for each node [ 0.000000] Early memory node ranges [ 0.000000] node 0: [mem 0x0000000000001000-0x000000000009efff] [ 0.000000] node 0: [mem 0x0000000000100000-0x000000001ffd9fff] [ 0.000000] Initmem setup node 0 [mem 0x0000000000001000-0x000000001ffd9fff] [ 0.000000] ACPI: PM-Timer IO Port: 0x608 [ 0.000000] ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) [ 0.000000] IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) [ 0.000000] Using ACPI (MADT) for SMP configuration information [ 0.000000] smpboot: Allowing 1 CPUs, 0 hotplug CPUs [ 0.000000] PM: Registered nosave memory: [mem 0x00000000-0x00000fff] [ 0.000000] PM: Registered nosave memory: [mem 0x0009f000-0x0009ffff] [ 0.000000] PM: Registered nosave memory: [mem 0x000a0000-0x000effff] [ 0.000000] PM: Registered nosave memory: [mem 0x000f0000-0x000fffff] [ 0.000000] e820: [mem 0x20000000-0xfeffbfff] available for PCI devices [ 0.000000] Booting paravirtualized kernel on KVM [ 0.000000] clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 7645519600211568 ns [ 0.000000] setup_percpu: NR_CPUS:256 nr_cpumask_bits:256 nr_cpu_ids:1 nr_node_ids:1 [ 0.000000] PERCPU: Embedded 33 pages/cpu @ffff88001f800000 s98008 r8192 d28968 u2097152 [ 0.000000] KVM setup async PF for cpu 0 [ 0.000000] kvm-stealtime: cpu 0, msr 1f80d940 [ 0.000000] PV qspinlock hash table entries: 256 (order: 0, 4096 bytes) [ 0.000000] Built 1 zonelists in Node order, mobility grouping on. 
Total pages: 128867 [ 0.000000] Policy zone: DMA32 [ 0.000000] Kernel command line: LABEL=cirros-rootfs ro console=tty1 console=ttyS0 [ 0.000000] PID hash table entries: 2048 (order: 2, 16384 bytes) [ 0.000000] Memory: 491788K/523744K available (8368K kernel code, 1280K rwdata, 3928K rodata, 1480K init, 1292K bss, 31956K reserved, 0K cma-reserved) [ 0.000000] SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=1, Nodes=1 [ 0.000000] Hierarchical RCU implementation. [ 0.000000] Build-time adjustment of leaf fanout to 64. [ 0.000000] RCU restricting CPUs from NR_CPUS=256 to nr_cpu_ids=1. [ 0.000000] RCU: Adjusting geometry for rcu_fanout_leaf=64, nr_cpu_ids=1 [ 0.000000] NR_IRQS:16640 nr_irqs:256 16 [ 0.000000] Console: colour VGA+ 80x25 [ 0.000000] console [tty1] enabled [ 0.000000] console [ttyS0] enabled [ 0.000000] tsc: Detected 3491.914 MHz processor [ 0.281544] Calibrating delay loop (skipped) preset value.. 6983.82 BogoMIPS (lpj=13967656) [ 0.291119] pid_max: default: 32768 minimum: 301 [ 0.292938] ACPI: Core revision 20150930 [ 0.297220] ACPI: 1 ACPI AML tables successfully acquired and loaded [ 0.299845] Security Framework initialized [ 0.301487] Yama: becoming mindful. [ 0.302931] AppArmor: AppArmor initialized [ 0.304709] Dentry cache hash table entries: 65536 (order: 7, 524288 bytes) [ 0.307646] Inode-cache hash table entries: 32768 (order: 6, 262144 bytes) [ 0.310310] Mount-cache hash table entries: 1024 (order: 1, 8192 bytes) [ 0.312795] Mountpoint-cache hash table entries: 1024 (order: 1, 8192 bytes) [ 0.315692] Initializing cgroup subsys io [ 0.317378] Initializing cgroup subsys memory [ 0.319238] Initializing cgroup subsys devices [ 0.320985] Initializing cgroup subsys freezer [ 0.322879] Initializing cgroup subsys net_cls [ 0.324680] Initializing cgroup subsys perf_event [ 0.326650] Initializing cgroup subsys net_prio [ 0.328628] Initializing cgroup subsys hugetlb [ 0.330472] Initializing cgroup subsys pids [ 0.331960] CPU: Physical Processor ID: 0 [ 0.333899] mce: CPU supports 10 MCE banks [ 0.335214] Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 [ 0.336778] Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 [ 0.356437] Freeing SMP alternatives memory: 28K (ffffffff820b4000 - ffffffff820bb000) [ 0.366967] ftrace: allocating 31920 entries in 125 pages [ 0.419890] smpboot: Max logical packages: 1 [ 0.421197] smpboot: APIC(0) Converting physical 0 to logical package 0 [ 0.423302] x2apic enabled [ 0.424623] Switched APIC routing to physical x2apic. [ 0.427463] ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 [ 0.429173] smpboot: CPU0: Intel(R) Xeon(R) CPU E5-2637 v3 @ 3.50GHz (family: 0x6, model: 0x3f, stepping: 0x2) [ 0.432308] Performance Events: 16-deep LBR, Haswell events, Intel PMU driver. [ 0.434833] ... version: 2 [ 0.436068] ... bit width: 48 [ 0.437316] ... generic registers: 4 [ 0.438543] ... value mask: 0000ffffffffffff [ 0.440093] ... max period: 000000007fffffff [ 0.441634] ... fixed-purpose events: 3 [ 0.442880] ... 
event mask: 000000070000000f [ 0.444465] KVM setup paravirtual spinlock [ 0.446410] x86: Booted up 1 node, 1 CPUs [ 0.447680] smpboot: Total of 1 processors activated (6983.82 BogoMIPS) [ 0.449894] devtmpfs: initialized [ 0.462049] evm: security.selinux [ 0.463156] evm: security.SMACK64 [ 0.464253] evm: security.SMACK64EXEC [ 0.465419] evm: security.SMACK64TRANSMUTE [ 0.466696] evm: security.SMACK64MMAP [ 0.467876] evm: security.ima [ 0.468894] evm: security.capability [ 0.470240] clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 7645041785100000 ns [ 0.473156] pinctrl core: initialized pinctrl subsystem [ 0.474889] RTC time: 23:32:05, date: 05/27/18 [ 0.476391] NET: Registered protocol family 16 [ 0.478021] cpuidle: using governor ladder [ 0.479317] cpuidle: using governor menu [ 0.480663] PCCT header not found. [ 0.481892] ACPI: bus type PCI registered [ 0.483147] acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 [ 0.485152] PCI: Using configuration type 1 for base access [ 0.486854] core: PMU erratum BJ122, BV98, HSD29 workaround disabled, HT off [ 0.491133] ACPI: Added _OSI(Module Device) [ 0.492450] ACPI: Added _OSI(Processor Device) [ 0.493811] ACPI: Added _OSI(3.0 _SCP Extensions) [ 0.495222] ACPI: Added _OSI(Processor Aggregator Device) [ 0.499503] ACPI: Interpreter enabled [ 0.500701] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S1_] (20150930/hwxface-580) [ 0.503622] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S2_] (20150930/hwxface-580) [ 0.506547] ACPI: (supports S0 S3 S4 S5) [ 0.507802] ACPI: Using IOAPIC for interrupt routing [ 0.509311] PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug [ 0.517197] ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) [ 0.518984] acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI] [ 0.520918] acpi PNP0A03:00: _OSC failed (AE_NOT_FOUND); disabling ASPM [ 0.522793] acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended PCI configuration space under this bridge. 
[ 0.526616] acpiphp: Slot [3] registered [ 0.527915] acpiphp: Slot [4] registered [ 0.529194] acpiphp: Slot [5] registered [ 0.530471] acpiphp: Slot [6] registered [ 0.531746] acpiphp: Slot [7] registered [ 0.533021] acpiphp: Slot [8] registered [ 0.534295] acpiphp: Slot [9] registered [ 0.535577] acpiphp: Slot [10] registered [ 0.536866] acpiphp: Slot [11] registered [ 0.555084] acpiphp: Slot [12] registered [ 0.556378] acpiphp: Slot [13] registered [ 0.557671] acpiphp: Slot [14] registered [ 0.558968] acpiphp: Slot [15] registered [ 0.560265] acpiphp: Slot [16] registered [ 0.561546] acpiphp: Slot [17] registered [ 0.562830] acpiphp: Slot [18] registered [ 0.564114] acpiphp: Slot [19] registered [ 0.565403] acpiphp: Slot [20] registered [ 0.566689] acpiphp: Slot [21] registered [ 0.567984] acpiphp: Slot [22] registered [ 0.569276] acpiphp: Slot [23] registered [ 0.570562] acpiphp: Slot [24] registered [ 0.571860] acpiphp: Slot [25] registered [ 0.573142] acpiphp: Slot [26] registered [ 0.574420] acpiphp: Slot [27] registered [ 0.575715] acpiphp: Slot [28] registered [ 0.577005] acpiphp: Slot [29] registered [ 0.578286] acpiphp: Slot [30] registered [ 0.579568] acpiphp: Slot [31] registered [ 0.580832] PCI host bridge to bus 0000:00 [ 0.582101] pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] [ 0.583985] pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] [ 0.585876] pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] [ 0.588164] pci_bus 0000:00: root bus resource [mem 0x20000000-0xfebfffff window] [ 0.590433] pci_bus 0000:00: root bus resource [mem 0x100000000-0x17fffffff window] [ 0.592754] pci_bus 0000:00: root bus resource [bus 00-ff] [ 0.598439] pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] [ 0.600405] pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6] [ 0.602241] pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] [ 0.604216] pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376] [ 0.609226] pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI [ 0.611467] pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB [ 0.649692] ACPI: PCI Interrupt Link [LNKA] (IRQs 5 *10 11) [ 0.652083] ACPI: PCI Interrupt Link [LNKB] (IRQs 5 *10 11) [ 0.654403] ACPI: PCI Interrupt Link [LNKC] (IRQs 5 10 *11) [ 0.656736] ACPI: PCI Interrupt Link [LNKD] (IRQs 5 10 *11) [ 0.658968] ACPI: PCI Interrupt Link [LNKS] (IRQs *9) [ 0.661007] ACPI: Enabled 2 GPEs in block 00 to 0F [ 0.662819] vgaarb: setting as boot device: PCI:0000:00:02.0 [ 0.664461] vgaarb: device added: PCI:0000:00:02.0,decodes=io+mem,owns=io+mem,locks=none [ 0.666892] vgaarb: loaded [ 0.667871] vgaarb: bridge control possible 0000:00:02.0 [ 0.669807] SCSI subsystem initialized [ 0.671102] ACPI: bus type USB registered [ 0.672387] usbcore: registered new interface driver usbfs [ 0.674002] usbcore: registered new interface driver hub [ 0.675576] usbcore: registered new device driver usb [ 0.677268] PCI: Using ACPI for IRQ routing [ 0.678941] NetLabel: Initializing [ 0.680076] NetLabel: domain hash size = 128 [ 0.681409] NetLabel: protocols = UNLABELED CIPSOv4 [ 0.682899] NetLabel: unlabeled traffic allowed by default [ 0.684795] clocksource: Switched to clocksource kvm-clock [ 0.699612] AppArmor: AppArmor Filesystem Enabled [ 0.701108] pnp: PnP ACPI init [ 0.702813] pnp: PnP ACPI: found 5 devices [ 0.711320] clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns [ 0.714001] NET: Registered protocol family 2 [ 
0.715541] TCP established hash table entries: 4096 (order: 3, 32768 bytes) [ 0.717524] TCP bind hash table entries: 4096 (order: 4, 65536 bytes) [ 0.719368] TCP: Hash tables configured (established 4096 bind 4096) [ 0.721178] UDP hash table entries: 256 (order: 1, 8192 bytes) [ 0.722862] UDP-Lite hash table entries: 256 (order: 1, 8192 bytes) [ 0.724695] NET: Registered protocol family 1 [ 0.726053] pci 0000:00:00.0: Limiting direct PCI/PCI transfers [ 0.727755] pci 0000:00:01.0: PIIX3: Enabling Passive Release [ 0.729420] pci 0000:00:01.0: Activating ISA DMA hang workarounds [ 0.788510] ACPI: PCI Interrupt Link [LNKD] enabled at IRQ 11 [ 0.847810] Trying to unpack rootfs image as initramfs... [ 0.968627] Freeing initrd memory: 4824K (ffff88001fb14000 - ffff88001ffca000) [ 0.971204] Scanning for low memory corruption every 60 seconds [ 0.973354] futex hash table entries: 256 (order: 2, 16384 bytes) [ 0.975112] audit: initializing netlink subsys (disabled) [ 0.976705] audit: type=2000 audit(1527463926.229:1): initialized [ 0.978943] Initialise system trusted keyring [ 0.980384] HugeTLB registered 1 GB page size, pre-allocated 0 pages [ 0.982193] HugeTLB registered 2 MB page size, pre-allocated 0 pages [ 0.986896] zbud: loaded [ 0.988163] VFS: Disk quotas dquot_6.6.0 [ 0.989471] VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) [ 0.992212] fuse init (API version 7.23) [ 0.993678] Key type big_key registered [ 0.994925] Allocating IMA MOK and blacklist keyrings. [ 0.996750] Key type asymmetric registered [ 0.998058] Asymmetric key parser 'x509' registered [ 0.999591] Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) [ 1.002206] io scheduler noop registered [ 1.003433] io scheduler deadline registered (default) [ 1.005000] io scheduler cfq registered [ 1.006335] pci_hotplug: PCI Hot Plug PCI Core version: 0.5 [ 1.007927] pciehp: PCI Express Hot Plug Controller Driver version: 0.4 [ 1.009893] input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0 [ 1.012128] ACPI: Power Button [PWRF] [ 1.013460] GHES: HEST is not enabled! [ 1.072018] ACPI: PCI Interrupt Link [LNKC] enabled at IRQ 10 [ 1.190544] ACPI: PCI Interrupt Link [LNKA] enabled at IRQ 10 [ 1.249540] ACPI: PCI Interrupt Link [LNKB] enabled at IRQ 11 [ 1.252770] Serial: 8250/16550 driver, 32 ports, IRQ sharing enabled [ 1.278567] 00:04: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A [ 1.282824] Linux agpgart interface v0.103 [ 1.287198] brd: module loaded [ 1.289642] loop: module loaded [ 1.296882] GPT:Primary header thinks Alt. header is not at the end of the disk. [ 1.299122] GPT:90111 != 2097151 [ 1.300188] GPT:Alternate GPT header not at the end of the disk. [ 1.301863] GPT:90111 != 2097151 [ 1.302921] GPT: Use GNU Parted to correct GPT errors. 
[ 1.304434] vda: vda1 vda15 [ 1.307436] vdb: [ 1.309238] scsi host0: ata_piix [ 1.310413] scsi host1: ata_piix [ 1.311545] ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc0e0 irq 14 [ 1.313403] ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc0e8 irq 15 [ 1.315917] libphy: Fixed MDIO Bus: probed [ 1.317190] tun: Universal TUN/TAP device driver, 1.6 [ 1.318658] tun: (C) 1999-2004 Max Krasnyansky [ 1.321802] PPP generic driver version 2.4.2 [ 1.323191] ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver [ 1.325018] ehci-pci: EHCI PCI platform driver [ 1.326371] ehci-platform: EHCI generic platform driver [ 1.328264] ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver [ 1.330487] ohci-pci: OHCI PCI platform driver [ 1.332190] ohci-platform: OHCI generic platform driver [ 1.334123] uhci_hcd: USB Universal Host Controller Interface driver [ 1.393830] uhci_hcd 0000:00:01.2: UHCI Host Controller [ 1.395736] uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1 [ 1.398553] uhci_hcd 0000:00:01.2: detected 2 ports [ 1.400431] uhci_hcd 0000:00:01.2: irq 11, io base 0x0000c080 [ 1.402612] usb usb1: New USB device found, idVendor=1d6b, idProduct=0001 [ 1.405023] usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1 [ 1.407820] usb usb1: Product: UHCI Host Controller [ 1.409660] usb usb1: Manufacturer: Linux 4.4.0-28-generic uhci_hcd [ 1.411935] usb usb1: SerialNumber: 0000:00:01.2 [ 1.413833] hub 1-0:1.0: USB hub found [ 1.415325] hub 1-0:1.0: 2 ports detected [ 1.417162] i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 [ 1.421110] serio: i8042 KBD port at 0x60,0x64 irq 1 [ 1.422953] serio: i8042 AUX port at 0x60,0x64 irq 12 [ 1.424981] mousedev: PS/2 mouse device common for all mice [ 1.427389] input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1 [ 1.430722] rtc_cmos 00:00: RTC can wake from S4 [ 1.435020] rtc_cmos 00:00: rtc core: registered rtc_cmos as rtc0 [ 1.437366] rtc_cmos 00:00: alarms up to one day, y3k, 114 bytes nvram [ 1.439733] i2c /dev entries driver [ 1.441235] device-mapper: uevent: version 1.0.3 [ 1.443113] device-mapper: ioctl: 4.34.0-ioctl (2015-10-28) initialised: dm-devel@redhat.com [ 1.446401] ledtrig-cpu: registered to indicate activity on CPUs [ 1.449439] NET: Registered protocol family 10 [ 1.451410] NET: Registered protocol family 17 [ 1.453139] Key type dns_resolver registered [ 1.455027] microcode: CPU0 sig=0x306f2, pf=0x1, revision=0x1 [ 1.457178] microcode: Microcode Update Driver: v2.01 , Peter Oruba [ 1.460649] registered taskstats version 1 [ 1.462283] Loading compiled-in X.509 certificates [ 1.464929] Loaded X.509 cert 'Build time autogenerated kernel key: 6ea974e07bd0b30541f4d838a3b7a8a80d5ca9af' [ 1.468586] zswap: loaded using pool lzo/zbud [ 1.471788] Key type trusted registered [ 1.475367] Key type encrypted registered [ 1.476975] AppArmor: AppArmor sha1 policy hashing enabled [ 1.479004] ima: No TPM chip found, activating TPM-bypass! [ 1.481088] evm: HMAC attrs: 0x1 [ 1.500227] Magic number: 2:941:555 [ 1.501806] rtc_cmos 00:00: setting system clock to 2018-05-27 23:32:06 UTC (1527463926) [ 1.504941] BIOS EDD facility v0.16 2004-Jun-25, 0 devices found [ 1.507092] EDD information not available. 
[ 1.512981] Freeing unused kernel memory: 1480K (ffffffff81f42000 - ffffffff820b4000) [ 1.515999] Write protecting the kernel read-only data: 14336k [ 1.519168] Freeing unused kernel memory: 1860K (ffff88000182f000 - ffff880001a00000) [ 1.522808] Freeing unused kernel memory: 168K (ffff880001dd6000 - ffff880001e00000) info: initramfs: up at 1.17 modprobe: module virtio_pci not found in modules.dep modprobe: module virtio_blk not found in modules.dep modprobe: module virtio_net not found in modules.dep modprobe: module vfat not found in modules.dep modprobe: module nls_cp437 not found in modules.dep info: copying initramfs to /dev/vda1 info: initramfs loading root from /dev/vda1 info: /etc/init.d/rc.sysinit: up at 2.38 info: container: none Starting logging: OK modprobe: module virtio_pci not found in modules.dep modprobe: module virtio_blk not found in modules.dep modprobe: module virtio_net not found in modules.dep modprobe: module vfat not found in modules.dep modprobe: module nls_cp437 not found in modules.dep WARN: /etc/rc3.d/S10-load-modules failed Initializing random number generator... [ 2.807202] random: dd urandom read with 6 bits of entropy available done. Starting acpid: OK Starting network... udhcpc (v1.23.2) started Sending discover... Sending select for 192.168.130.7... Lease of 192.168.130.7 obtained, lease time 600 route: SIOCADDRT: File exists WARN: failed: route add -net "0.0.0.0/0" gw "192.168.130.1" Top of dropbear init script Starting dropbear sshd: OK GROWROOT: CHANGED: partition=1 start=18432 old: size=71647 end=90079 new: size=2078687,end=2097119 no userdata for datasource === system information === Platform: OpenStack Foundation OpenStack Nova Container: none Arch: x86_64 CPU(s): 1 @ 3491.914 MHz Cores/Sockets/Threads: 1/1/1 Virt-type: VT-x RAM Size: 488MB Disks: NAME MAJ:MIN SIZE LABEL MOUNTPOINT vda 253:0 1073741824 vda1 253:1 1064287744 cirros-rootfs / vda15 253:15 8388608 vdb 253:16 67108864 config-2 === sshd host keys === -----BEGIN SSH HOST KEY KEYS----- ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQCIYWNEXDHyHHE+D0JPE90fQ98Sa7G+j6t+Jae12a1CGqQwXpIZ7P5kxjiq5QcMdoHAyIcGac0GxNJheaLzKkep/obhBw+s3HEYeynu0CH+1h3fSoHiB6kg2rlKh29ZmJd1OkRu8w9xNkQBIC+IEXMcWmdyMW9IjUR98dz/T9F2CqB0rDBFkpQ/5ueiybBfM+pr8/NHiPmxT1owb8S0JBhATk1JJ5U6ko98CSNS5EQD3rX6eDyy+0iiio2ZA08MfHrufkzMQowaJHvF1Iz4A8K5ytb6BQTDOeYqvLzqa50CGN8Q03LH2eM5GpCdpq/ytZhk7F50rvFgeOeFC7EFVBaz root@opnfv-vping-1-3313f109-1a04-49ad-b5fd-b41c930f2667 ssh-dss AAAAB3NzaC1kc3MAAACBAL+k2Fdd30vBf7LrLLddJimkL4wwL5d2scwfgcKsH3ELmVHmCjll8hxkOnPNfSMF9VIxS6oopRde6cz7zz83RePi70eX1bpaj1P1ZOZUPvMdFNE3gOpgGF27P0dOGiRDWBrFp6CUQpm7JP8K48imBULqfn4wMupNnEc49ns7UwI/AAAAFQCx4/zXLSaIS7c5s5kb04Ln9B3mJQAAAIEAojlk7KrojPeugk5zhMR0NB8mkqjiSsM3cc15PUsNwU79qeUKPGftZJfDbOia0zgdocJMEJSHUbb2cncgC+q6cM26QqhmmbqMWo7V/XTffmm9gf+JbwaQwadXkjn9U4AHPm/gxf+o+RxUg3V+ebgoUNclT3at5wnMUuRfAh+OZbUAAACAZ1kjEcjIzGVBOlCp8bUBB7MLw+72/60nOrOXUbOGWdFlUZIcBD12DMpJJZtU+8hAXNjBg+lbGv6rpssEErGXWFa1DTy24c7tAPjaxSqPLx+84tIvHmBx56fGC9u5swPWUVMljTSGmC1fpUhN2HzyMWV/PDFQN2ck2xMZY1H2Ysk= root@opnfv-vping-1-3313f109-1a04-49ad-b5fd-b41c930f2667 -----END SSH HOST KEY KEYS----- === network info === if-info: lo,up,127.0.0.1,8,, if-info: eth0,up,192.168.130.7,24,fe80::f816:3eff:fe63:232b/64, ip-route:default via 192.168.130.1 dev eth0 ip-route:169.254.169.254 via 192.168.130.1 dev eth0 ip-route:192.168.130.0/24 dev eth0 src 192.168.130.7 ip-route6:fe80::/64 dev eth0 metric 256 ip-route6:unreachable default dev lo metric -1 error -101 ip-route6:ff00::/8 dev eth0 metric 256 
ip-route6:unreachable default dev lo metric -1 error -101 === datasource: configdrive local === instance-id: f944531a-5bbe-4a41-89eb-f5ed8754fd03 name: opnfv-vping-1-3313f109-1a04-49ad-b5fd-b41c930f2667 availability-zone: nova local-hostname: opnfv-vping-1-3313f109-1a04-49ad-b5fd-b41c930f2667.novalocal launch-index: 0 === cirros: current=0.4.0 latest=0.4.0 uptime=3.88 === ____ ____ ____ / __/ __ ____ ____ / __ \/ __/ / /__ / // __// __// /_/ /\ \ \___//_//_/ /_/ \____/___/ http://cirros-cloud.net login as 'cirros' user. default password: 'gocubsgo'. use 'sudo' for root. opnfv-vping-1-3313f109-1a04-49ad-b5fd-b41c930f2667 login: /dev/root resized successfully [took 8.66s] 2018-05-27 23:32:29,032 - functest.opnfv_tests.openstack.vping.vping_userdata - INFO - Begin test execution 2018-05-27 23:32:29,033 - functest.opnfv_tests.openstack.vping.vping_userdata - INFO - Waiting for ping... 2018-05-27 23:32:31,898 - functest.opnfv_tests.openstack.vping.vping_userdata - DEBUG - console: [ 0.000000] Initializing cgroup subsys cpuset [ 0.000000] Initializing cgroup subsys cpu [ 0.000000] Initializing cgroup subsys cpuacct [ 0.000000] Linux version 4.4.0-28-generic (buildd@lcy01-13) (gcc version 5.3.1 20160413 (Ubuntu 5.3.1-14ubuntu2.1) ) #47-Ubuntu SMP Fri Jun 24 10:09:13 UTC 2016 (Ubuntu 4.4.0-28.47-generic 4.4.13) [ 0.000000] Command line: LABEL=cirros-rootfs ro console=tty1 console=ttyS0 [ 0.000000] KERNEL supported cpus: [ 0.000000] Intel GenuineIntel [ 0.000000] AMD AuthenticAMD [ 0.000000] Centaur CentaurHauls [ 0.000000] x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 [ 0.000000] x86/fpu: Supporting XSAVE feature 0x01: 'x87 floating point registers' [ 0.000000] x86/fpu: Supporting XSAVE feature 0x02: 'SSE registers' [ 0.000000] x86/fpu: Supporting XSAVE feature 0x04: 'AVX registers' [ 0.000000] x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format. [ 0.000000] x86/fpu: Using 'eager' FPU context switches. [ 0.000000] e820: BIOS-provided physical RAM map: [ 0.000000] BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable [ 0.000000] BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved [ 0.000000] BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved [ 0.000000] BIOS-e820: [mem 0x0000000000100000-0x000000001ffd9fff] usable [ 0.000000] BIOS-e820: [mem 0x000000001ffda000-0x000000001fffffff] reserved [ 0.000000] BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved [ 0.000000] BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved [ 0.000000] NX (Execute Disable) protection: active [ 0.000000] SMBIOS 2.8 present. 
[ 0.000000] Hypervisor detected: KVM [ 0.000000] e820: last_pfn = 0x1ffda max_arch_pfn = 0x400000000 [ 0.000000] x86/PAT: Configuration [0-7]: WB WC UC- UC WB WC UC- WT [ 0.000000] found SMP MP-table at [mem 0x000f6a40-0x000f6a4f] mapped at [ffff8800000f6a40] [ 0.000000] Scanning 1 areas for low memory corruption [ 0.000000] Using GB pages for direct mapping [ 0.000000] RAMDISK: [mem 0x1fb14000-0x1ffc9fff] [ 0.000000] ACPI: Early table checksum verification disabled [ 0.000000] ACPI: RSDP 0x00000000000F6830 000014 (v00 BOCHS ) [ 0.000000] ACPI: RSDT 0x000000001FFE1591 00002C (v01 BOCHS BXPCRSDT 00000001 BXPC 00000001) [ 0.000000] ACPI: FACP 0x000000001FFE1425 000074 (v01 BOCHS BXPCFACP 00000001 BXPC 00000001) [ 0.000000] ACPI: DSDT 0x000000001FFE0040 0013E5 (v01 BOCHS BXPCDSDT 00000001 BXPC 00000001) [ 0.000000] ACPI: FACS 0x000000001FFE0000 000040 [ 0.000000] ACPI: APIC 0x000000001FFE1519 000078 (v01 BOCHS BXPCAPIC 00000001 BXPC 00000001) [ 0.000000] No NUMA configuration found [ 0.000000] Faking a node at [mem 0x0000000000000000-0x000000001ffd9fff] [ 0.000000] NODE_DATA(0) allocated [mem 0x1ffd5000-0x1ffd9fff] [ 0.000000] kvm-clock: Using msrs 4b564d01 and 4b564d00 [ 0.000000] kvm-clock: cpu 0, msr 0:1ffd1001, primary cpu clock [ 0.000000] kvm-clock: using sched offset of 908358206 cycles [ 0.000000] clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns [ 0.000000] Zone ranges: [ 0.000000] DMA [mem 0x0000000000001000-0x0000000000ffffff] [ 0.000000] DMA32 [mem 0x0000000001000000-0x000000001ffd9fff] [ 0.000000] Normal empty [ 0.000000] Device empty [ 0.000000] Movable zone start for each node [ 0.000000] Early memory node ranges [ 0.000000] node 0: [mem 0x0000000000001000-0x000000000009efff] [ 0.000000] node 0: [mem 0x0000000000100000-0x000000001ffd9fff] [ 0.000000] Initmem setup node 0 [mem 0x0000000000001000-0x000000001ffd9fff] [ 0.000000] ACPI: PM-Timer IO Port: 0x608 [ 0.000000] ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) [ 0.000000] IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) [ 0.000000] Using ACPI (MADT) for SMP configuration information [ 0.000000] smpboot: Allowing 1 CPUs, 0 hotplug CPUs [ 0.000000] PM: Registered nosave memory: [mem 0x00000000-0x00000fff] [ 0.000000] PM: Registered nosave memory: [mem 0x0009f000-0x0009ffff] [ 0.000000] PM: Registered nosave memory: [mem 0x000a0000-0x000effff] [ 0.000000] PM: Registered nosave memory: [mem 0x000f0000-0x000fffff] [ 0.000000] e820: [mem 0x20000000-0xfeffbfff] available for PCI devices [ 0.000000] Booting paravirtualized kernel on KVM [ 0.000000] clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 7645519600211568 ns [ 0.000000] setup_percpu: NR_CPUS:256 nr_cpumask_bits:256 nr_cpu_ids:1 nr_node_ids:1 [ 0.000000] PERCPU: Embedded 33 pages/cpu @ffff88001f800000 s98008 r8192 d28968 u2097152 [ 0.000000] KVM setup async PF for cpu 0 [ 0.000000] kvm-stealtime: cpu 0, msr 1f80d940 [ 0.000000] PV qspinlock hash table entries: 256 (order: 0, 4096 bytes) [ 0.000000] Built 1 zonelists in Node order, mobility grouping on. 
Total pages: 128867 [ 0.000000] Policy zone: DMA32 [ 0.000000] Kernel command line: LABEL=cirros-rootfs ro console=tty1 console=ttyS0 [ 0.000000] PID hash table entries: 2048 (order: 2, 16384 bytes) [ 0.000000] Memory: 491788K/523744K available (8368K kernel code, 1280K rwdata, 3928K rodata, 1480K init, 1292K bss, 31956K reserved, 0K cma-reserved) [ 0.000000] SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=1, Nodes=1 [ 0.000000] Hierarchical RCU implementation. [ 0.000000] Build-time adjustment of leaf fanout to 64. [ 0.000000] RCU restricting CPUs from NR_CPUS=256 to nr_cpu_ids=1. [ 0.000000] RCU: Adjusting geometry for rcu_fanout_leaf=64, nr_cpu_ids=1 [ 0.000000] NR_IRQS:16640 nr_irqs:256 16 [ 0.000000] Console: colour VGA+ 80x25 [ 0.000000] console [tty1] enabled [ 0.000000] console [ttyS0] enabled [ 0.000000] tsc: Detected 3491.914 MHz processor [ 0.281193] Calibrating delay loop (skipped) preset value.. 6983.82 BogoMIPS (lpj=13967656) [ 0.284776] pid_max: default: 32768 minimum: 301 [ 0.286744] ACPI: Core revision 20150930 [ 0.291104] ACPI: 1 ACPI AML tables successfully acquired and loaded [ 0.293868] Security Framework initialized [ 0.295586] Yama: becoming mindful. [ 0.303179] AppArmor: AppArmor initialized [ 0.305062] Dentry cache hash table entries: 65536 (order: 7, 524288 bytes) [ 0.308049] Inode-cache hash table entries: 32768 (order: 6, 262144 bytes) [ 0.310793] Mount-cache hash table entries: 1024 (order: 1, 8192 bytes) [ 0.313283] Mountpoint-cache hash table entries: 1024 (order: 1, 8192 bytes) [ 0.316332] Initializing cgroup subsys io [ 0.317994] Initializing cgroup subsys memory [ 0.319804] Initializing cgroup subsys devices [ 0.321574] Initializing cgroup subsys freezer [ 0.323355] Initializing cgroup subsys net_cls [ 0.325173] Initializing cgroup subsys perf_event [ 0.327069] Initializing cgroup subsys net_prio [ 0.328989] Initializing cgroup subsys hugetlb [ 0.330811] Initializing cgroup subsys pids [ 0.332550] CPU: Physical Processor ID: 0 [ 0.334899] mce: CPU supports 10 MCE banks [ 0.336755] Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 [ 0.338849] Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 [ 0.358956] Freeing SMP alternatives memory: 28K (ffffffff820b4000 - ffffffff820bb000) [ 0.370371] ftrace: allocating 31920 entries in 125 pages [ 0.422591] smpboot: Max logical packages: 1 [ 0.424402] smpboot: APIC(0) Converting physical 0 to logical package 0 [ 0.427354] x2apic enabled [ 0.428946] Switched APIC routing to physical x2apic. [ 0.432415] ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 [ 0.434763] smpboot: CPU0: Intel(R) Xeon(R) CPU E5-2637 v3 @ 3.50GHz (family: 0x6, model: 0x3f, stepping: 0x2) [ 0.439144] Performance Events: 16-deep LBR, Haswell events, Intel PMU driver. [ 0.442431] ... version: 2 [ 0.444100] ... bit width: 48 [ 0.445818] ... generic registers: 4 [ 0.447517] ... value mask: 0000ffffffffffff [ 0.449564] ... max period: 000000007fffffff [ 0.451616] ... fixed-purpose events: 3 [ 0.453268] ... 
event mask: 000000070000000f [ 0.455325] KVM setup paravirtual spinlock [ 0.457636] x86: Booted up 1 node, 1 CPUs [ 0.459260] smpboot: Total of 1 processors activated (6983.82 BogoMIPS) [ 0.462092] devtmpfs: initialized [ 0.474302] evm: security.selinux [ 0.475768] evm: security.SMACK64 [ 0.477245] evm: security.SMACK64EXEC [ 0.478825] evm: security.SMACK64TRANSMUTE [ 0.480550] evm: security.SMACK64MMAP [ 0.482117] evm: security.ima [ 0.483445] evm: security.capability [ 0.485203] clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 7645041785100000 ns [ 0.489216] pinctrl core: initialized pinctrl subsystem [ 0.491555] RTC time: 23:32:24, date: 05/27/18 [ 0.493519] NET: Registered protocol family 16 [ 0.495574] cpuidle: using governor ladder [ 0.497251] cpuidle: using governor menu [ 0.498849] PCCT header not found. [ 0.500364] ACPI: bus type PCI registered [ 0.502053] acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 [ 0.504621] PCI: Using configuration type 1 for base access [ 0.506938] core: PMU erratum BJ122, BV98, HSD29 workaround disabled, HT off [ 0.511773] ACPI: Added _OSI(Module Device) [ 0.513410] ACPI: Added _OSI(Processor Device) [ 0.515118] ACPI: Added _OSI(3.0 _SCP Extensions) [ 0.516854] ACPI: Added _OSI(Processor Aggregator Device) [ 0.521715] ACPI: Interpreter enabled [ 0.523236] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S1_] (20150930/hwxface-580) [ 0.526957] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S2_] (20150930/hwxface-580) [ 0.530700] ACPI: (supports S0 S3 S4 S5) [ 0.532332] ACPI: Using IOAPIC for interrupt routing [ 0.534366] PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug [ 0.542992] ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) [ 0.545403] acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI] [ 0.548018] acpi PNP0A03:00: _OSC failed (AE_NOT_FOUND); disabling ASPM [ 0.550509] acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended PCI configuration space under this bridge. 
[ 0.555408] acpiphp: Slot [3] registered [ 0.557096] acpiphp: Slot [4] registered [ 0.558777] acpiphp: Slot [5] registered [ 0.560421] acpiphp: Slot [6] registered [ 0.562077] acpiphp: Slot [7] registered [ 0.563742] acpiphp: Slot [8] registered [ 0.565471] acpiphp: Slot [9] registered [ 0.567127] acpiphp: Slot [10] registered [ 0.568873] acpiphp: Slot [11] registered [ 0.587152] acpiphp: Slot [12] registered [ 0.588924] acpiphp: Slot [13] registered [ 0.590670] acpiphp: Slot [14] registered [ 0.592412] acpiphp: Slot [15] registered [ 0.594029] acpiphp: Slot [16] registered [ 0.595671] acpiphp: Slot [17] registered [ 0.597417] acpiphp: Slot [18] registered [ 0.599150] acpiphp: Slot [19] registered [ 0.600879] acpiphp: Slot [20] registered [ 0.602614] acpiphp: Slot [21] registered [ 0.604326] acpiphp: Slot [22] registered [ 0.605979] acpiphp: Slot [23] registered [ 0.607576] acpiphp: Slot [24] registered [ 0.609214] acpiphp: Slot [25] registered [ 0.610800] acpiphp: Slot [26] registered [ 0.612441] acpiphp: Slot [27] registered [ 0.614101] acpiphp: Slot [28] registered [ 0.615740] acpiphp: Slot [29] registered [ 0.617318] acpiphp: Slot [30] registered [ 0.618903] acpiphp: Slot [31] registered [ 0.620435] PCI host bridge to bus 0000:00 [ 0.621995] pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] [ 0.624355] pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] [ 0.626687] pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] [ 0.629557] pci_bus 0000:00: root bus resource [mem 0x20000000-0xfebfffff window] [ 0.632413] pci_bus 0000:00: root bus resource [mem 0x100000000-0x17fffffff window] [ 0.635230] pci_bus 0000:00: root bus resource [bus 00-ff] [ 0.641417] pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] [ 0.644083] pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6] [ 0.646521] pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] [ 0.649157] pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376] [ 0.654534] pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI [ 0.657373] pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB [ 0.697181] ACPI: PCI Interrupt Link [LNKA] (IRQs 5 *10 11) [ 0.700033] ACPI: PCI Interrupt Link [LNKB] (IRQs 5 *10 11) [ 0.702839] ACPI: PCI Interrupt Link [LNKC] (IRQs 5 10 *11) [ 0.706004] ACPI: PCI Interrupt Link [LNKD] (IRQs 5 10 *11) [ 0.708762] ACPI: PCI Interrupt Link [LNKS] (IRQs *9) [ 0.711275] ACPI: Enabled 2 GPEs in block 00 to 0F [ 0.713560] vgaarb: setting as boot device: PCI:0000:00:02.0 [ 0.715699] vgaarb: device added: PCI:0000:00:02.0,decodes=io+mem,owns=io+mem,locks=none [ 0.718922] vgaarb: loaded [ 0.720115] vgaarb: bridge control possible 0000:00:02.0 [ 0.722507] SCSI subsystem initialized [ 0.724106] ACPI: bus type USB registered [ 0.725789] usbcore: registered new interface driver usbfs [ 0.727952] usbcore: registered new interface driver hub [ 0.729954] usbcore: registered new device driver usb [ 0.732065] PCI: Using ACPI for IRQ routing [ 0.734129] NetLabel: Initializing [ 0.735536] NetLabel: domain hash size = 128 [ 0.737305] NetLabel: protocols = UNLABELED CIPSOv4 [ 0.739216] NetLabel: unlabeled traffic allowed by default [ 0.741420] clocksource: Switched to clocksource kvm-clock [ 0.756233] AppArmor: AppArmor Filesystem Enabled [ 0.758254] pnp: PnP ACPI init [ 0.760164] pnp: PnP ACPI: found 5 devices [ 0.768961] clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns [ 0.772345] NET: Registered protocol family 2 [ 
0.774260] TCP established hash table entries: 4096 (order: 3, 32768 bytes) [ 0.776846] TCP bind hash table entries: 4096 (order: 4, 65536 bytes) [ 0.779228] TCP: Hash tables configured (established 4096 bind 4096) [ 0.781565] UDP hash table entries: 256 (order: 1, 8192 bytes) [ 0.783747] UDP-Lite hash table entries: 256 (order: 1, 8192 bytes) [ 0.786147] NET: Registered protocol family 1 [ 0.787941] pci 0000:00:00.0: Limiting direct PCI/PCI transfers [ 0.790231] pci 0000:00:01.0: PIIX3: Enabling Passive Release [ 0.792447] pci 0000:00:01.0: Activating ISA DMA hang workarounds [ 0.850684] ACPI: PCI Interrupt Link [LNKD] enabled at IRQ 11 [ 0.908798] Trying to unpack rootfs image as initramfs... [ 1.026801] Freeing initrd memory: 4824K (ffff88001fb14000 - ffff88001ffca000) [ 1.030080] Scanning for low memory corruption every 60 seconds [ 1.032750] futex hash table entries: 256 (order: 2, 16384 bytes) [ 1.035078] audit: initializing netlink subsys (disabled) [ 1.037236] audit: type=2000 audit(1527463944.861:1): initialized [ 1.039989] Initialise system trusted keyring [ 1.041807] HugeTLB registered 1 GB page size, pre-allocated 0 pages [ 1.044287] HugeTLB registered 2 MB page size, pre-allocated 0 pages [ 1.049523] zbud: loaded [ 1.050939] VFS: Disk quotas dquot_6.6.0 [ 1.052586] VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) [ 1.055944] fuse init (API version 7.23) [ 1.057747] Key type big_key registered [ 1.059322] Allocating IMA MOK and blacklist keyrings. [ 1.061592] Key type asymmetric registered [ 1.063295] Asymmetric key parser 'x509' registered [ 1.065354] Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) [ 1.068648] io scheduler noop registered [ 1.070234] io scheduler deadline registered (default) [ 1.072306] io scheduler cfq registered [ 1.074004] pci_hotplug: PCI Hot Plug PCI Core version: 0.5 [ 1.076132] pciehp: PCI Express Hot Plug Controller Driver version: 0.4 [ 1.078769] input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0 [ 1.081278] ACPI: Power Button [PWRF] [ 1.082997] GHES: HEST is not enabled! [ 1.140353] ACPI: PCI Interrupt Link [LNKC] enabled at IRQ 10 [ 1.256261] ACPI: PCI Interrupt Link [LNKA] enabled at IRQ 10 [ 1.315333] ACPI: PCI Interrupt Link [LNKB] enabled at IRQ 11 [ 1.319059] Serial: 8250/16550 driver, 32 ports, IRQ sharing enabled [ 1.344459] 00:04: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A [ 1.349302] Linux agpgart interface v0.103 [ 1.354009] brd: module loaded [ 1.356658] loop: module loaded [ 1.364173] GPT:Primary header thinks Alt. header is not at the end of the disk. [ 1.367099] GPT:90111 != 2097151 [ 1.368514] GPT:Alternate GPT header not at the end of the disk. [ 1.370802] GPT:90111 != 2097151 [ 1.372175] GPT: Use GNU Parted to correct GPT errors. 
[ 1.374080] vda: vda1 vda15 [ 1.377324] vdb: [ 1.379375] scsi host0: ata_piix [ 1.380945] scsi host1: ata_piix [ 1.382550] ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc0e0 irq 14 [ 1.385295] ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc0e8 irq 15 [ 1.388740] libphy: Fixed MDIO Bus: probed [ 1.390538] tun: Universal TUN/TAP device driver, 1.6 [ 1.392575] tun: (C) 1999-2004 Max Krasnyansky [ 1.396204] PPP generic driver version 2.4.2 [ 1.397622] ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver [ 1.399439] ehci-pci: EHCI PCI platform driver [ 1.400814] ehci-platform: EHCI generic platform driver [ 1.402341] ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver [ 1.404074] ohci-pci: OHCI PCI platform driver [ 1.405460] ohci-platform: OHCI generic platform driver [ 1.407007] uhci_hcd: USB Universal Host Controller Interface driver [ 1.466226] uhci_hcd 0000:00:01.2: UHCI Host Controller [ 1.467757] uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1 [ 1.470027] uhci_hcd 0000:00:01.2: detected 2 ports [ 1.471551] uhci_hcd 0000:00:01.2: irq 11, io base 0x0000c080 [ 1.473288] usb usb1: New USB device found, idVendor=1d6b, idProduct=0001 [ 1.475164] usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1 [ 1.477403] usb usb1: Product: UHCI Host Controller [ 1.478861] usb usb1: Manufacturer: Linux 4.4.0-28-generic uhci_hcd [ 1.480634] usb usb1: SerialNumber: 0000:00:01.2 [ 1.482138] hub 1-0:1.0: USB hub found [ 1.483315] hub 1-0:1.0: 2 ports detected [ 1.484753] i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 [ 1.487962] serio: i8042 KBD port at 0x60,0x64 irq 1 [ 1.489468] serio: i8042 AUX port at 0x60,0x64 irq 12 [ 1.491080] mousedev: PS/2 mouse device common for all mice [ 1.493014] input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1 [ 1.495813] rtc_cmos 00:00: RTC can wake from S4 [ 1.500310] rtc_cmos 00:00: rtc core: registered rtc_cmos as rtc0 [ 1.502799] rtc_cmos 00:00: alarms up to one day, y3k, 114 bytes nvram [ 1.505303] i2c /dev entries driver [ 1.506802] device-mapper: uevent: version 1.0.3 [ 1.508740] device-mapper: ioctl: 4.34.0-ioctl (2015-10-28) initialised: dm-devel@redhat.com [ 1.512024] ledtrig-cpu: registered to indicate activity on CPUs [ 1.515255] NET: Registered protocol family 10 [ 1.517642] NET: Registered protocol family 17 [ 1.519463] Key type dns_resolver registered [ 1.521519] microcode: CPU0 sig=0x306f2, pf=0x1, revision=0x1 [ 1.523901] microcode: Microcode Update Driver: v2.01 , Peter Oruba [ 1.527691] registered taskstats version 1 [ 1.529383] Loading compiled-in X.509 certificates [ 1.532100] Loaded X.509 cert 'Build time autogenerated kernel key: 6ea974e07bd0b30541f4d838a3b7a8a80d5ca9af' [ 1.535987] zswap: loaded using pool lzo/zbud [ 1.538916] Key type trusted registered [ 1.542916] Key type encrypted registered [ 1.544595] AppArmor: AppArmor sha1 policy hashing enabled [ 1.546746] ima: No TPM chip found, activating TPM-bypass! [ 1.548888] evm: HMAC attrs: 0x1 [ 1.567624] Magic number: 2:941:555 [ 1.569299] rtc_cmos 00:00: setting system clock to 2018-05-27 23:32:25 UTC (1527463945) [ 1.572612] BIOS EDD facility v0.16 2004-Jun-25, 0 devices found [ 1.574967] EDD information not available. 
[ 1.580837] Freeing unused kernel memory: 1480K (ffffffff81f42000 - ffffffff820b4000) [ 1.583891] Write protecting the kernel read-only data: 14336k [ 1.587379] Freeing unused kernel memory: 1860K (ffff88000182f000 - ffff880001a00000) [ 1.591236] Freeing unused kernel memory: 168K (ffff880001dd6000 - ffff880001e00000) info: initramfs: up at 1.22 modprobe: module virtio_pci not found in modules.dep modprobe: module virtio_blk not found in modules.dep modprobe: module virtio_net not found in modules.dep modprobe: module vfat not found in modules.dep modprobe: module nls_cp437 not found in modules.dep info: copying initramfs to /dev/vda1 info: initramfs loading root from /dev/vda1 info: /etc/init.d/rc.sysinit: up at 2.48 info: container: none Starting logging: OK modprobe: module virtio_pci not found in modules.dep modprobe: module virtio_blk not found in modules.dep modprobe: module virtio_net not found in modules.dep modprobe: module vfat not found in modules.dep modprobe: module nls_cp437 not found in modules.dep WARN: /etc/rc3.d/S10-load-modules failed Initializing random number generator... [ 2.918800] random: dd urandom read with 6 bits of entropy available done. Starting acpid: OK Starting network... udhcpc (v1.23.2) started Sending discover... Sending select for 192.168.130.5... Lease of 192.168.130.5 obtained, lease time 600 route: SIOCADDRT: File exists WARN: failed: route add -net "0.0.0.0/0" gw "192.168.130.1" Top of dropbear init script Starting dropbear sshd: OK GROWROOT: CHANGED: partition=1 start=18432 old: size=71647 end=90079 new: size=2078687,end=2097119 vPing OK /run/cirros/datasource/data/user-data returned 0 === system information === Platform: OpenStack Foundation OpenStack Nova Container: none Arch: x86_64 CPU(s): 1 @ 3491.914 MHz Cores/Sockets/Threads: 1/1/1 Virt-type: VT-x RAM Size: 488MB Disks: NAME MAJ:MIN SIZE LABEL MOUNTPOINT vda 253:0 1073741824 vda1 253:1 1064287744 cirros-rootfs / vda15 253:15 8388608 vdb 253:16 67108864 config-2 === sshd host keys === -----BEGIN SSH HOST KEY KEYS----- ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQCOnOI/rSr+kBUWUYpiD/8EjDTdCXPChtAiH7yKcrCwGKWeY91qsxY/exCw9CZNzFVm49nOVakRjW79inCy6y8fkqpXV02rTtYaGR0Jrqu9CTE4i1MauQVPU1X5naHGVfF7TLOGvbkB90ElaOvL+OoXRhmb6ytGRfvGbGtmyrOLsEtoV2hF4842rUToo1F4IC674mqRyKkeDEn1o3Wpxu/xLoq9XMNQrbn62zviy0Yud3Cg6aPRKjxqWuEXjM4lHEArNBHHvdHyoHzRyHH6YQPCE6cKkTn2xskGQqhPdbKQCgU4RI22sNp1u5e/M3MqrYoL6HlrjBZMWLKK4TOMX9K/ root@opnfv-vping-2-userdata--3313f109-1a04-49ad-b5fd-b41c930f2667 ssh-dss AAAAB3NzaC1kc3MAAACBANpCDNrbJ/xyEYIxIqRY5VjALr8XYaqojuJmNjA+39fqq1WzXwvjyd697Qpxx6FNLrxV1XQhSR45TaZdFiliYQaLKcHefSgzXV6h/ibghHSpkABYIo4jyZOv8Bm1Rksb4jJcCooqswcYlTaAkPSpPwDt3DC8m+r5e47dFsfLH0HRAAAAFQCO/rvrfl/7+pp4nscE6qLDf8erHQAAAIAKjOUVWahPUlBdUqU6ijQfeufC5cKnIEgyWIjD0D2iU4z0qp5kQxRcYlzFvdh1glB4jBY+1jgyC4Ah/BvoFc3AxmqjAK8SxSkZKCsI1OEGtfGaK56iDSN+D/GCw7Awport4a+tmJ49O4HoXap/wsYzFaz4QRwg0S/mln+LJEWcuAAAAIANZUYVWJZjA6ah/YGTbfTVQaqnSbwRsC5P6PjRm0oI8u8GoMy+J/sOOstWPq1yztym3riAwij7RE1JZBQDUfRxoYCU/pS8XLDcSklnyFkIeeiOsgznKoz5pMhhWcr75dNRwzW5r2O+BrAiryu1gcD2lxHmshRALjt5zd4PrXXLHw== root@opnfv-vping-2-userdata--3313f109-1a04-49ad-b5fd-b41c930f2667 -----END SSH HOST KEY KEYS----- === network info === if-info: lo,up,127.0.0.1,8,, if-info: eth0,up,192.168.130.5,24,fe80::f816:3eff:fedb:43b8/64, ip-route:default via 192.168.130.1 dev eth0 ip-route:169.254.169.254 via 192.168.130.1 dev eth0 ip-route:192.168.130.0/24 dev eth0 src 192.168.130.5 ip-route6:fe80::/64 dev eth0 metric 256 ip-route6:unreachable default dev lo metric -1 error -101 
ip-route6:ff00::/8 dev eth0 metric 256 ip-route6:unreachable default dev lo metric -1 error -101 === datasource: configdrive local === instance-id: 2a2a7795-f062-4e27-b5c6-58040af84005 name: opnfv-vping-2-userdata--3313f109-1a04-49ad-b5fd-b41c930f2667 availability-zone: nova local-hostname: opnfv-vping-2-userdata--3313f109-1a04-49ad-b5fd-b41c930f2667.novalocal launch-index: 0 === cirros: current=0.4.0 latest=0.4.0 uptime=4.45 === ____ ____ ____ / __/ __ ____ ____ / __ \/ __/ / /__ / // __// __// /_/ /\ \ \___//_//_/ /_/ \____/___/ http://cirros-cloud.net login as 'cirros' user. default password: 'gocubsgo'. use 'sudo' for root. opnfv-vping-2-userdata--3313f109-1a04-49ad-b5fd-b41c930f2667 login: 2018-05-27 23:32:31,899 - functest.opnfv_tests.openstack.vping.vping_userdata - INFO - vPing detected! 2018-05-27 23:32:32,010 - xtesting.core.testcase - INFO - The results were successfully pushed to DB http://testresults.opnfv.org/test/api/v1/results 2018-05-27 23:32:32,010 - xtesting.ci.run_tests - INFO - Test result: +------------------------+------------------+------------------+----------------+ | TEST CASE | PROJECT | DURATION | RESULT | +------------------------+------------------+------------------+----------------+ | vping_userdata | functest | 00:51 | PASS | +------------------------+------------------+------------------+----------------+ 2018-05-27 23:32:56,796 - xtesting.ci.run_tests - INFO - Running test case 'cinder_test'... 2018-05-27 23:32:57,907 - functest.opnfv_tests.openstack.cinder.cinder_base - DEBUG - ext_net: Munch({u'status': u'ACTIVE', u'subnets': [u'7b531737-77a7-41af-93de-a06ade38f99f'], u'description': u'', u'provider:physical_network': u'physnet1', u'tags': [], u'ipv6_address_scope': None, u'updated_at': u'2018-05-27T23:10:42Z', u'is_default': True, u'revision_number': 4, u'port_security_enabled': True, u'mtu': 1500, u'id': u'11c92fd4-326a-487a-a640-1b09c88fcb5b', u'provider:segmentation_id': None, u'router:external': True, u'availability_zone_hints': [], u'availability_zones': [u'nova'], u'name': u'floating_net', u'admin_state_up': True, u'tenant_id': u'17e0c72255804297b05647b8b64ec56a', u'created_at': u'2018-05-27T23:10:32Z', u'provider:network_type': u'flat', u'ipv4_address_scope': None, u'shared': False, u'project_id': u'17e0c72255804297b05647b8b64ec56a'}) 2018-05-27 23:32:57,908 - xtesting.energy.energy - DEBUG - Getting current scenario 2018-05-27 23:32:58,365 - xtesting.energy.energy - DEBUG - Starting recording 2018-05-27 23:32:58,366 - xtesting.energy.energy - DEBUG - Submitting scenario (cinder_test/running) 2018-05-27 23:32:58,758 - functest.opnfv_tests.openstack.cinder.cinder_test - INFO - Begin virtual environment setup 2018-05-27 23:32:58,758 - functest.opnfv_tests.openstack.cinder.cinder_test - INFO - CinderCheck Start Time:'2018-05-27 23:32:58' 2018-05-27 23:32:58,758 - functest.opnfv_tests.openstack.cinder.cinder_test - INFO - Creating image with name: 'functest-cinder--24f9faf0-1e0d-4927-b44c-b6d51ff171af' 2018-05-27 23:32:58,758 - functest.opnfv_tests.openstack.cinder.cinder_test - INFO - Image metadata: None 2018-05-27 23:33:00,431 - functest.opnfv_tests.openstack.cinder.cinder_test - DEBUG - image: Munch({u'status': u'queued', u'owner_specified.shade.object': u'images/functest-cinder--24f9faf0-1e0d-4927-b44c-b6d51ff171af', u'tags': [], u'container_format': u'bare', u'min_ram': 0, u'updated_at': u'2018-05-27T23:32:59Z', u'owner_specified.shade.sha256': u'a8dd75ecffd4cdd96072d60c2237b448e0c8b2bc94d57f10fdbc8c481d9005b8', u'locations': [], u'min_disk': 
0, u'visibility': u'shared', u'file': u'/v2/images/f9bee430-45a5-4d5e-883c-c02194b0ad4a/file', u'owner': u'17e0c72255804297b05647b8b64ec56a', u'virtual_size': None, u'owner_specified.shade.md5': u'443b7623e27ecf03dc9e01ee93f67afe', u'id': u'f9bee430-45a5-4d5e-883c-c02194b0ad4a', u'size': None, u'name': u'functest-cinder--24f9faf0-1e0d-4927-b44c-b6d51ff171af', u'checksum': None, u'self': u'/v2/images/f9bee430-45a5-4d5e-883c-c02194b0ad4a', u'disk_format': u'qcow2', u'protected': False, u'created_at': u'2018-05-27T23:32:59Z', u'schema': u'/v2/schemas/image'}) 2018-05-27 23:33:00,431 - functest.opnfv_tests.openstack.cinder.cinder_test - INFO - Creating network with name: 'cinder-net-24f9faf0-1e0d-4927-b44c-b6d51ff171af' 2018-05-27 23:33:01,133 - functest.opnfv_tests.openstack.cinder.cinder_test - DEBUG - network: Munch({u'status': u'ACTIVE', u'subnets': [], u'description': u'', u'provider:physical_network': None, u'tags': [], u'ipv6_address_scope': None, u'updated_at': u'2018-05-27T23:33:00Z', u'is_default': False, u'revision_number': 2, u'port_security_enabled': True, u'provider:network_type': u'vxlan', u'id': u'1c3e9192-dd03-487e-a02a-e4b87ed46eaa', u'provider:segmentation_id': 78, u'router:external': False, u'availability_zone_hints': [], u'availability_zones': [], u'name': u'cinder-net-24f9faf0-1e0d-4927-b44c-b6d51ff171af', u'admin_state_up': True, u'tenant_id': u'17e0c72255804297b05647b8b64ec56a', u'created_at': u'2018-05-27T23:33:00Z', u'mtu': 1450, u'ipv4_address_scope': None, u'shared': False, u'project_id': u'17e0c72255804297b05647b8b64ec56a'}) 2018-05-27 23:33:02,540 - functest.opnfv_tests.openstack.cinder.cinder_test - DEBUG - subnet: Munch({u'description': u'', u'tags': [], u'updated_at': u'2018-05-27T23:33:02Z', u'ipv6_ra_mode': None, u'allocation_pools': [{u'start': u'192.168.130.2', u'end': u'192.168.130.254'}], u'host_routes': [], u'revision_number': 0, u'ipv6_address_mode': None, u'cidr': u'192.168.130.0/24', u'id': u'de603113-e638-4f85-beda-0ee3653cd2bf', u'subnetpool_id': None, u'service_types': [], u'name': u'cinder-subnet-24f9faf0-1e0d-4927-b44c-b6d51ff171af', u'enable_dhcp': True, u'network_id': u'1c3e9192-dd03-487e-a02a-e4b87ed46eaa', u'tenant_id': u'17e0c72255804297b05647b8b64ec56a', u'created_at': u'2018-05-27T23:33:02Z', u'dns_nameservers': [u'8.8.8.8'], u'gateway_ip': u'192.168.130.1', u'ip_version': 4, u'project_id': u'17e0c72255804297b05647b8b64ec56a'}) 2018-05-27 23:33:02,540 - functest.opnfv_tests.openstack.cinder.cinder_test - INFO - Creating router with name: 'cinder-router-24f9faf0-1e0d-4927-b44c-b6d51ff171af' 2018-05-27 23:33:05,125 - functest.opnfv_tests.openstack.cinder.cinder_test - DEBUG - router: Munch({u'status': u'ACTIVE', u'description': u'', u'tags': [], u'updated_at': u'2018-05-27T23:33:04Z', u'revision_number': 3, u'ha': False, u'id': u'11b0208e-7fb4-4547-bdeb-29700eee40fd', u'external_gateway_info': {u'network_id': u'11c92fd4-326a-487a-a640-1b09c88fcb5b', u'enable_snat': True, u'external_fixed_ips': [{u'subnet_id': u'7b531737-77a7-41af-93de-a06ade38f99f', u'ip_address': u'172.30.10.121'}]}, u'availability_zone_hints': [], u'availability_zones': [], u'name': u'cinder-router-24f9faf0-1e0d-4927-b44c-b6d51ff171af', u'admin_state_up': True, u'tenant_id': u'17e0c72255804297b05647b8b64ec56a', u'created_at': u'2018-05-27T23:33:02Z', u'distributed': False, u'flavor_id': None, u'routes': [], u'project_id': u'17e0c72255804297b05647b8b64ec56a'}) 2018-05-27 23:33:09,571 - functest.opnfv_tests.openstack.cinder.cinder_test - INFO - Creating flavor with name: 
'cinder-flavor-24f9faf0-1e0d-4927-b44c-b6d51ff171af' 2018-05-27 23:33:09,828 - functest.opnfv_tests.openstack.cinder.cinder_test - DEBUG - flavor: Munch({'name': u'cinder-flavor-24f9faf0-1e0d-4927-b44c-b6d51ff171af', 'ephemeral': 0, 'ram': 512, 'is_disabled': False, 'properties': Munch({u'OS-FLV-DISABLED:disabled': False, u'OS-FLV-EXT-DATA:ephemeral': 0, u'os-flavor-access:is_public': True}), u'OS-FLV-DISABLED:disabled': False, 'vcpus': 1, 'extra_specs': Munch({}), 'location': Munch({'project': Munch({'domain_id': None, 'id': u'17e0c72255804297b05647b8b64ec56a', 'name': 'admin', 'domain_name': 'Default'}), 'zone': None, 'region_name': 'RegionOne', 'cloud': 'envvars'}), u'os-flavor-access:is_public': True, 'rxtx_factor': 1.0, 'is_public': True, u'OS-FLV-EXT-DATA:ephemeral': 0, 'disk': 1, 'id': u'763ae727-90bf-4da3-8250-b981ee6ec0ad', 'swap': 0}) 2018-05-27 23:33:09,873 - functest.opnfv_tests.openstack.cinder.cinder_test - INFO - Creating volume with name: cinder-volume-24f9faf0-1e0d-4927-b44c-b6d51ff171af 2018-05-27 23:33:13,222 - functest.opnfv_tests.openstack.cinder.cinder_test - INFO - volume: Munch({'status': u'available', 'migration_status': None, 'attachments': [], 'multiattach': False, 'source_volume_id': None, 'encrypted': False, 'display_description': None, 'is_bootable': False, 'updated_at': u'2018-05-27T23:33:11.000000', u'source_volid': None, 'host': u'cmp002@lvm-driver#lvm-driver', 'consistencygroup_id': None, 'replication_status': None, 'snapshot_id': None, 'replication_extended_status': None, 'replication_driver': None, 'id': u'26c90e06-4301-465a-85dc-cba2206b5f6b', u'os-vol-mig-status-attr:name_id': None, 'size': 2, 'display_name': u'cinder-volume-24f9faf0-1e0d-4927-b44c-b6d51ff171af', 'name': u'cinder-volume-24f9faf0-1e0d-4927-b44c-b6d51ff171af', u'user_id': u'37b0f068b13143159c3fbcc2c9fc04f8', 'bootable': False, 'description': None, 'availability_zone': u'nova', 'is_encrypted': False, 'volume_type': u'lvm-driver', 'properties': Munch({u'user_id': u'37b0f068b13143159c3fbcc2c9fc04f8', u'os-vol-host-attr:host': u'cmp002@lvm-driver#lvm-driver', u'source_volid': None, u'os-vol-mig-status-attr:migstat': None, u'os-vol-tenant-attr:tenant_id': u'17e0c72255804297b05647b8b64ec56a', u'os-vol-mig-status-attr:name_id': None}), u'os-vol-host-attr:host': u'cmp002@lvm-driver#lvm-driver', 'location': Munch({'project': Munch({'domain_id': None, 'id': u'17e0c72255804297b05647b8b64ec56a', 'name': 'admin', 'domain_name': 'Default'}), 'zone': u'nova', 'region_name': 'RegionOne', 'cloud': 'envvars'}), u'os-vol-tenant-attr:tenant_id': u'17e0c72255804297b05647b8b64ec56a', 'can_multiattach': False, 'created_at': u'2018-05-27T23:33:10.000000', u'os-vol-mig-status-attr:migstat': None, 'metadata': Munch({})}) 2018-05-27 23:33:13,223 - functest.opnfv_tests.openstack.cinder.cinder_test - INFO - Creating keypair with name: 'cinder-keypair_1-24f9faf0-1e0d-4927-b44c-b6d51ff171af' 2018-05-27 23:33:13,518 - functest.opnfv_tests.openstack.cinder.cinder_test - DEBUG - keypair: Munch({'public_key': u'ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDAG92WFiIB5aqAT1hEa3IMiafncudxiqZLg+AlXJToBJNakRWITm1jb84Usn9YzlvqcGwWId7ekmwcv2BRYcBQ22SZwgmQKmrlSx5H4tbdD3KmXhTmAuFaV48sHdEeIycwkkpRj5neTb6CXxO9h1gw+RIOvM/aB7SVAnUpWrdzIyj3wTrYAjJGhsYRBOx3+SHvBsFUzuTmClPeH0yj8z6QwHtHLVZl9B17Yj6UVkT+EzFIvzNj7UEH59N0+VWwB2MwqWF8S1qUYOYv8Y9OzvSvtHEAfWDt3f4nsoOhgo+RcdjjeyBBoC5YfNkvSm9vvryOoxVm+qm6dgDTBBu8bihn Generated-by-Nova', 'private_key': u'-----BEGIN RSA PRIVATE 
KEY-----\nMIIEpQIBAAKCAQEAwBvdlhYiAeWqgE9YRGtyDImn53LncYqmS4PgJVyU6ASTWpEV\niE5tY2/OFLJ/WM5b6nBsFiHe3pJsHL9gUWHAUNtkmcIJkCpq5UseR+LW3Q9ypl4U\n5gLhWlePLB3RHiMnMJJKUY+Z3k2+gl8TvYdYMPkSDrzP2ge0lQJ1KVq3cyMo98E6\n2AIyRobGEQTsd/kh7wbBVM7k5gpT3h9Mo/M+kMB7Ry1WZfQde2I+lFZE/hMxSL8z\nY+1BB+fTdPlVsAdjMKlhfEtalGDmL/GPTs70r7RxAH1g7d3+J7KDoYKPkXHY43sg\nQaAuWHzZL0pvb768jqMVZvqpunYA0wQbvG4oZwIDAQABAoIBAC21M5anM3YyF8wp\nlL3KtrsG8Z0e+LRKIeDrwqQBECjYFdtBk9b3mg1Dp8vC4IvXpYT1NLJbYrSgL/rz\nCc6zLFfQq3Ht+eqEPSTLJ6xA65BzQXsV/XDwlH+zP4xml7/M62EiHW6kdR6wbriE\nOKUqk7ArQsGg6O9eT/f4OroZLHbHKCt0ofmXdP+5oqQPVqaMpV3iQFnXxmTp9oJ7\nTQN8IkMbCajHTmezQK78wpT9HVEo5mPNmnR8OkJsS1WkrwhfNFgF5J2WFKeJQ5Ax\nUFMqqJLR9kg9JP/WqvdS+ZTxPd4Sr/EhYjZyQPwMbEEiQXeuGmN9bSIwv8c7+F5E\n3bWXrwECgYEA6N0jf/YKsazFtInyNyvIH/xStjY4MV17GNeEQvVvpc/Zofw05DwO\n3yrlVVo/Cpwqv4/VuOtgA9oZfXYhaCrhMflq+wYCzz+5oKnq/E7Kg0SXRGQgbne1\n+4zPk2pqbbo7nldcIClTBeamFSBz2XMPLnD+EpMsaJARUjZ0sDcIhoECgYEA0zIh\nIS1dqNZTJX8YLhV7qPB+QWUEqaOC6Q/NdvHKgjEYST9lib0/0ColP30+aXKArQLf\nn6OSM4f2Zh5dcPVs6FnEILIZBxfPn9y6qsPQb6/L58S16QnBlXX21ooWl5D312u1\nFaukVBgcJEn3VESzVH8irX6uprFEqGbatiobyucCgYEAse2m3nOFoFU/i7+L0BQo\nCiimmou7TBz9nfGvIeqfsLasuFUZpPsu8d649QrL+LNzBoRE7dMLvmW77F2DETES\neTj3QW9KXkl28QhkgP1DSCH041EOZtoZt2fUg/Nf9w/B1i8yOXDZ+zxN0v7FBnYg\nN4Am4tKJsCvE0GjDqJGkdgECgYEApsNtn4n368qYywHpcPIrvAFyYGbI4L6pLyj4\nP5S7KERl/eieX73c9p1g3hoiPxdX/cVIGQEK/7+7U3VkqT5SKcI/+OVIl+44Vu3v\nQ8ns+1Pf+Xm9fm7iugb0ywEsx6+D1ElLKvpmfrSu/ASf1J2O0Qi2tDJU7HELWsIk\nrzTezz8CgYEA0p5m1qo34Ym9cOoLai2TtdIEcStCtZS3JJJn24dJhW4NEFa7kiXp\n5ftsaqHbOiylVLMFRnAyRXJj4Ob+xZwDbGLsSZ9Wxp+/muOqCQat299CbgkDxzYO\nVsHjvg9qOlTACxRl0d8AXQOkB6M1TjUiE+Rt/hzYLllWn8qzFCaK7EI=\n-----END RSA PRIVATE KEY-----\n', 'user_id': u'37b0f068b13143159c3fbcc2c9fc04f8', 'name': u'cinder-keypair_1-24f9faf0-1e0d-4927-b44c-b6d51ff171af', 'created_at': '2018-05-27T23:33:13.517984', 'properties': Munch({}), 'location': Munch({'project': Munch({'domain_id': None, 'id': u'17e0c72255804297b05647b8b64ec56a', 'name': 'admin', 'domain_name': 'Default'}), 'zone': None, 'region_name': 'RegionOne', 'cloud': 'envvars'}), 'fingerprint': u'46:81:d0:20:7d:d1:a1:ae:49:fd:7a:6a:d5:5d:e5:18', 'type': 'ssh', 'id': u'cinder-keypair_1-24f9faf0-1e0d-4927-b44c-b6d51ff171af'}) 2018-05-27 23:33:13,518 - functest.opnfv_tests.openstack.cinder.cinder_test - DEBUG - private_key: -----BEGIN RSA PRIVATE KEY----- MIIEpQIBAAKCAQEAwBvdlhYiAeWqgE9YRGtyDImn53LncYqmS4PgJVyU6ASTWpEV iE5tY2/OFLJ/WM5b6nBsFiHe3pJsHL9gUWHAUNtkmcIJkCpq5UseR+LW3Q9ypl4U 5gLhWlePLB3RHiMnMJJKUY+Z3k2+gl8TvYdYMPkSDrzP2ge0lQJ1KVq3cyMo98E6 2AIyRobGEQTsd/kh7wbBVM7k5gpT3h9Mo/M+kMB7Ry1WZfQde2I+lFZE/hMxSL8z Y+1BB+fTdPlVsAdjMKlhfEtalGDmL/GPTs70r7RxAH1g7d3+J7KDoYKPkXHY43sg QaAuWHzZL0pvb768jqMVZvqpunYA0wQbvG4oZwIDAQABAoIBAC21M5anM3YyF8wp lL3KtrsG8Z0e+LRKIeDrwqQBECjYFdtBk9b3mg1Dp8vC4IvXpYT1NLJbYrSgL/rz Cc6zLFfQq3Ht+eqEPSTLJ6xA65BzQXsV/XDwlH+zP4xml7/M62EiHW6kdR6wbriE OKUqk7ArQsGg6O9eT/f4OroZLHbHKCt0ofmXdP+5oqQPVqaMpV3iQFnXxmTp9oJ7 TQN8IkMbCajHTmezQK78wpT9HVEo5mPNmnR8OkJsS1WkrwhfNFgF5J2WFKeJQ5Ax UFMqqJLR9kg9JP/WqvdS+ZTxPd4Sr/EhYjZyQPwMbEEiQXeuGmN9bSIwv8c7+F5E 3bWXrwECgYEA6N0jf/YKsazFtInyNyvIH/xStjY4MV17GNeEQvVvpc/Zofw05DwO 3yrlVVo/Cpwqv4/VuOtgA9oZfXYhaCrhMflq+wYCzz+5oKnq/E7Kg0SXRGQgbne1 +4zPk2pqbbo7nldcIClTBeamFSBz2XMPLnD+EpMsaJARUjZ0sDcIhoECgYEA0zIh IS1dqNZTJX8YLhV7qPB+QWUEqaOC6Q/NdvHKgjEYST9lib0/0ColP30+aXKArQLf n6OSM4f2Zh5dcPVs6FnEILIZBxfPn9y6qsPQb6/L58S16QnBlXX21ooWl5D312u1 FaukVBgcJEn3VESzVH8irX6uprFEqGbatiobyucCgYEAse2m3nOFoFU/i7+L0BQo Ciimmou7TBz9nfGvIeqfsLasuFUZpPsu8d649QrL+LNzBoRE7dMLvmW77F2DETES 
eTj3QW9KXkl28QhkgP1DSCH041EOZtoZt2fUg/Nf9w/B1i8yOXDZ+zxN0v7FBnYg N4Am4tKJsCvE0GjDqJGkdgECgYEApsNtn4n368qYywHpcPIrvAFyYGbI4L6pLyj4 P5S7KERl/eieX73c9p1g3hoiPxdX/cVIGQEK/7+7U3VkqT5SKcI/+OVIl+44Vu3v Q8ns+1Pf+Xm9fm7iugb0ywEsx6+D1ElLKvpmfrSu/ASf1J2O0Qi2tDJU7HELWsIk rzTezz8CgYEA0p5m1qo34Ym9cOoLai2TtdIEcStCtZS3JJJn24dJhW4NEFa7kiXp 5ftsaqHbOiylVLMFRnAyRXJj4Ob+xZwDbGLsSZ9Wxp+/muOqCQat299CbgkDxzYO VsHjvg9qOlTACxRl0d8AXQOkB6M1TjUiE+Rt/hzYLllWn8qzFCaK7EI= -----END RSA PRIVATE KEY----- 2018-05-27 23:33:13,518 - functest.opnfv_tests.openstack.cinder.cinder_test - INFO - Creating keypair with name: 'cinder-keypair_2-24f9faf0-1e0d-4927-b44c-b6d51ff171af' 2018-05-27 23:33:14,114 - functest.opnfv_tests.openstack.cinder.cinder_test - DEBUG - keypair: Munch({'public_key': u'ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQCsESOLOqpEYTdCzELT/kfTwtW3Ei+3GlDV6CSYOSrz+hUdNvZ6CRetzL//aWwt1WyBHMkdb3qAPrRq0JpPjMisDNzZxy/qn0BLu0alttM6L13BCIMwMxQ4R1CrbNjuGYk+BHJSMlwMlYzokM+hiYlxlNO5hs3us6dq15Cn/KQ6hpuhlI1IG9dj3OiMGfkYNLpz+wB5i1yL231MAGq5EjZZrT/H5oRbtiqZAfDDyCtsLdgjkZk5qQvuheRYE4coSUEU9U5/9EwuMRHxcKEH6LKyWi+CU1Rf6j00+Ci7xhe1u0FYnJQXlDTh6Gofn5M6WxsquKfDHrEUFEYOTEfT9q5T Generated-by-Nova', 'private_key': u'-----BEGIN RSA PRIVATE KEY-----\nMIIEowIBAAKCAQEArBEjizqqRGE3QsxC0/5H08LVtxIvtxpQ1egkmDkq8/oVHTb2\negkXrcy//2lsLdVsgRzJHW96gD60atCaT4zIrAzc2ccv6p9AS7tGpbbTOi9dwQiD\nMDMUOEdQq2zY7hmJPgRyUjJcDJWM6JDPoYmJcZTTuYbN7rOnateQp/ykOoaboZSN\nSBvXY9zojBn5GDS6c/sAeYtci9t9TABquRI2Wa0/x+aEW7YqmQHww8grbC3YI5GZ\nOakL7oXkWBOHKElBFPVOf/RMLjER8XChB+iyslovglNUX+o9NPgou8YXtbtBWJyU\nF5Q04ehqH5+TOlsbKrinwx6xFBRGDkxH0/auUwIDAQABAoIBAHMgvJGJ3ScHjPwK\nw6QofVK5CFoHtxliaDfXrbSUe6Lm732uqtgYgVluqABzP0ijhogVBJPHZUWtrvXR\nQY58ekB0Esk8N1la2KBTW//BBi/mRWrFEVIMuzi0rsskdzlIqMVwIDXTMDfTUIbD\nbcl9xxLSK5/1DJDsb9ZTCZz8s73b/Zrg+LeUIC+lkSVReJCcOp5Gx9FLSBPFI3jb\n/JLz61kqwpPkGq0jzozZZiubfwqdpH/SL3O1IdjVLthZE+4lz00vWR4TV80UK7A7\nD5pHBwknc2VewHzsX+WiqXymDlhYxZsj/cCNsiQXi42mGWLE/iONXdzrCZB0XjB1\nxlQmkoECgYEA31kvTkKWa2PIA2yVaigW7FkME4W9DL3WY/S67qYhhmCqCnQc5mKQ\naG5r+XshJKBuGr3BdsMFy+A8UA/KoYoeut/x6fscVyJOEaedIa0cnABi+2G+yo4C\nr2uqXKl+h6gFRGWh9YgXGua8ZNAjmTHP58NRGOiw9l9vUCJRHCmJ5PMCgYEAxTi+\n/deMFCNzPtlsEvi6kudx5YmdzQmGuHa38BI8mqVLNH6WorfZ/l1mmflBpneDWxlj\ndQO6puBY7HCvS5odsgKgdW3rqtSKtpANzypHcJDKchaAuYfZZcWlX6COa2lQyn82\nTYBZEJVDocyYsavMOFb6omu7uSrXtpLhaIi16SECgYA5RIDgrCotdvQ+DIVzJrxZ\n9asnBk+nCLYEAyg7MNW/wuFWtNcEK5mjbUy6N1wULB6PNMB6Vx8RW3mbfbETInsm\np2079Wsa3GzwEe40SFLhnSfEFRf6j9cYa57PC5ap6ecP1o9kiXSDLU54+vVlvmP0\npRwSKvfU9DtybJongm8dCQKBgAFUgqU/oOg53PsdiEcQemGnQfNkXDYXzFrOIity\nVApO9xThja6HQuceRiTfs3ul7rLclvkhD0800CS4FdaSsYST0/U2ypIaYN0eV5mA\nFX9C2rquQCwAKI0xKg5dDNjYmvziosEfDnq4Jv6eXKFGIVh37bTOuTNkgJPwpaiI\n04FBAoGBAKXG6RMFbBYL46oCV71mDCHTGXoh76ya6zalJil/Dc4rbS4GSfdnnFrW\nWs/VoMR5bBM8xDffeOHZ6owtVY+c88llFuIBTni+sMNczkwtEdfK2dfLTGmWYRLz\np7+wfe2Ly/w4PpRS474c1E91m8hfn7r9j/xG7AKlvlB0nGt+ntlj\n-----END RSA PRIVATE KEY-----\n', 'user_id': u'37b0f068b13143159c3fbcc2c9fc04f8', 'name': u'cinder-keypair_2-24f9faf0-1e0d-4927-b44c-b6d51ff171af', 'created_at': '2018-05-27T23:33:14.114262', 'properties': Munch({}), 'location': Munch({'project': Munch({'domain_id': None, 'id': u'17e0c72255804297b05647b8b64ec56a', 'name': 'admin', 'domain_name': 'Default'}), 'zone': None, 'region_name': 'RegionOne', 'cloud': 'envvars'}), 'fingerprint': u'ff:af:90:ff:e4:31:02:f4:99:00:25:47:b2:05:8a:d4', 'type': 'ssh', 'id': u'cinder-keypair_2-24f9faf0-1e0d-4927-b44c-b6d51ff171af'}) 2018-05-27 23:33:14,114 - functest.opnfv_tests.openstack.cinder.cinder_test - DEBUG - private_key: -----BEGIN RSA PRIVATE 
KEY----- MIIEowIBAAKCAQEArBEjizqqRGE3QsxC0/5H08LVtxIvtxpQ1egkmDkq8/oVHTb2 egkXrcy//2lsLdVsgRzJHW96gD60atCaT4zIrAzc2ccv6p9AS7tGpbbTOi9dwQiD MDMUOEdQq2zY7hmJPgRyUjJcDJWM6JDPoYmJcZTTuYbN7rOnateQp/ykOoaboZSN SBvXY9zojBn5GDS6c/sAeYtci9t9TABquRI2Wa0/x+aEW7YqmQHww8grbC3YI5GZ OakL7oXkWBOHKElBFPVOf/RMLjER8XChB+iyslovglNUX+o9NPgou8YXtbtBWJyU F5Q04ehqH5+TOlsbKrinwx6xFBRGDkxH0/auUwIDAQABAoIBAHMgvJGJ3ScHjPwK w6QofVK5CFoHtxliaDfXrbSUe6Lm732uqtgYgVluqABzP0ijhogVBJPHZUWtrvXR QY58ekB0Esk8N1la2KBTW//BBi/mRWrFEVIMuzi0rsskdzlIqMVwIDXTMDfTUIbD bcl9xxLSK5/1DJDsb9ZTCZz8s73b/Zrg+LeUIC+lkSVReJCcOp5Gx9FLSBPFI3jb /JLz61kqwpPkGq0jzozZZiubfwqdpH/SL3O1IdjVLthZE+4lz00vWR4TV80UK7A7 D5pHBwknc2VewHzsX+WiqXymDlhYxZsj/cCNsiQXi42mGWLE/iONXdzrCZB0XjB1 xlQmkoECgYEA31kvTkKWa2PIA2yVaigW7FkME4W9DL3WY/S67qYhhmCqCnQc5mKQ aG5r+XshJKBuGr3BdsMFy+A8UA/KoYoeut/x6fscVyJOEaedIa0cnABi+2G+yo4C r2uqXKl+h6gFRGWh9YgXGua8ZNAjmTHP58NRGOiw9l9vUCJRHCmJ5PMCgYEAxTi+ /deMFCNzPtlsEvi6kudx5YmdzQmGuHa38BI8mqVLNH6WorfZ/l1mmflBpneDWxlj dQO6puBY7HCvS5odsgKgdW3rqtSKtpANzypHcJDKchaAuYfZZcWlX6COa2lQyn82 TYBZEJVDocyYsavMOFb6omu7uSrXtpLhaIi16SECgYA5RIDgrCotdvQ+DIVzJrxZ 9asnBk+nCLYEAyg7MNW/wuFWtNcEK5mjbUy6N1wULB6PNMB6Vx8RW3mbfbETInsm p2079Wsa3GzwEe40SFLhnSfEFRf6j9cYa57PC5ap6ecP1o9kiXSDLU54+vVlvmP0 pRwSKvfU9DtybJongm8dCQKBgAFUgqU/oOg53PsdiEcQemGnQfNkXDYXzFrOIity VApO9xThja6HQuceRiTfs3ul7rLclvkhD0800CS4FdaSsYST0/U2ypIaYN0eV5mA FX9C2rquQCwAKI0xKg5dDNjYmvziosEfDnq4Jv6eXKFGIVh37bTOuTNkgJPwpaiI 04FBAoGBAKXG6RMFbBYL46oCV71mDCHTGXoh76ya6zalJil/Dc4rbS4GSfdnnFrW Ws/VoMR5bBM8xDffeOHZ6owtVY+c88llFuIBTni+sMNczkwtEdfK2dfLTGmWYRLz p7+wfe2Ly/w4PpRS474c1E91m8hfn7r9j/xG7AKlvlB0nGt+ntlj -----END RSA PRIVATE KEY----- 2018-05-27 23:33:16,268 - functest.opnfv_tests.openstack.cinder.cinder_test - INFO - Creating VM 1 instance with name: 'opnfv-cinder-1-ssh--24f9faf0-1e0d-4927-b44c-b6d51ff171af' 2018-05-27 23:33:34,221 - functest.opnfv_tests.openstack.cinder.cinder_test - DEBUG - vm1: None 2018-05-27 23:33:38,543 - functest.opnfv_tests.openstack.cinder.cinder_test - DEBUG - floating_ip1: Munch({'status': u'DOWN', 'router_id': u'11b0208e-7fb4-4547-bdeb-29700eee40fd', 'properties': Munch({u'tags': []}), 'description': u'', u'tags': [], 'tenant_id': u'17e0c72255804297b05647b8b64ec56a', 'created_at': u'2018-05-27T23:33:36Z', 'attached': True, 'updated_at': u'2018-05-27T23:33:36Z', 'id': u'96d89f17-f394-4ac2-8ebb-50ec4ff419b9', 'floating_network_id': u'11c92fd4-326a-487a-a640-1b09c88fcb5b', 'fixed_ip_address': u'192.168.130.13', 'floating_ip_address': u'172.30.10.114', 'location': Munch({'project': Munch({'domain_id': None, 'id': u'17e0c72255804297b05647b8b64ec56a', 'name': 'admin', 'domain_name': 'Default'}), 'zone': None, 'region_name': 'RegionOne', 'cloud': 'envvars'}), 'revision_number': 0, 'router': u'11b0208e-7fb4-4547-bdeb-29700eee40fd', 'project_id': u'17e0c72255804297b05647b8b64ec56a', 'port_id': u'd429d743-343f-4cc3-ac3c-3cc5d2256b78', 'port': u'd429d743-343f-4cc3-ac3c-3cc5d2256b78', 'network': u'11c92fd4-326a-487a-a640-1b09c88fcb5b'}) 2018-05-27 23:33:48,659 - functest.opnfv_tests.openstack.cinder.cinder_test - INFO - Creating VM 2 instance with name: 'opnfv-cinder-2-ssh--24f9faf0-1e0d-4927-b44c-b6d51ff171af' 2018-05-27 23:34:05,194 - functest.opnfv_tests.openstack.cinder.cinder_test - DEBUG - vm2: Munch({'vm_state': u'active', u'OS-EXT-STS:task_state': None, 'addresses': Munch({u'cinder-net-24f9faf0-1e0d-4927-b44c-b6d51ff171af': [Munch({u'OS-EXT-IPS-MAC:mac_addr': u'fa:16:3e:ac:b3:cf', u'version': 4, u'addr': u'192.168.130.9', u'OS-EXT-IPS:type': u'fixed'})]}), 'terminated_at': 
None, 'image': Munch({u'id': u'f9bee430-45a5-4d5e-883c-c02194b0ad4a'}), u'OS-EXT-AZ:availability_zone': u'nova', u'OS-EXT-STS:vm_state': u'active', u'OS-EXT-SRV-ATTR:instance_name': u'instance-0000002f', u'OS-SRV-USG:launched_at': u'2018-05-27T23:34:01.000000', 'flavor': Munch({u'id': u'763ae727-90bf-4da3-8250-b981ee6ec0ad'}), 'az': u'nova', 'id': u'64616080-e73b-42e8-9aad-6ae1136a059e', 'security_groups': [Munch({u'name': u'cinder-sg-24f9faf0-1e0d-4927-b44c-b6d51ff171af'})], u'os-extended-volumes:volumes_attached': [], 'user_id': u'37b0f068b13143159c3fbcc2c9fc04f8', 'disk_config': u'MANUAL', u'OS-DCF:diskConfig': u'MANUAL', 'networks': {}, 'accessIPv4': '', 'accessIPv6': '', 'location': Munch({'project': Munch({'domain_id': None, 'id': u'17e0c72255804297b05647b8b64ec56a', 'name': 'admin', 'domain_name': 'Default'}), 'zone': u'nova', 'region_name': 'RegionOne', 'cloud': 'envvars'}), 'power_state': 1, 'public_v4': '', 'progress': 0, u'OS-EXT-STS:power_state': 1, 'interface_ip': '', 'launched_at': u'2018-05-27T23:34:01.000000', 'metadata': Munch({}), 'status': u'ACTIVE', 'updated': u'2018-05-27T23:34:02Z', 'hostId': u'aef28b520aac3b35ae2ae4ee558f0576807f611be54858527d27fcd6', u'OS-EXT-SRV-ATTR:host': u'cmp001', u'OS-SRV-USG:terminated_at': None, 'key_name': u'cinder-keypair_2-24f9faf0-1e0d-4927-b44c-b6d51ff171af', 'public_v6': '', 'private_v4': u'192.168.130.9', 'cloud': 'envvars', 'host_id': u'aef28b520aac3b35ae2ae4ee558f0576807f611be54858527d27fcd6', 'task_state': None, 'properties': Munch({u'OS-EXT-STS:task_state': None, u'OS-EXT-SRV-ATTR:host': u'cmp001', u'OS-SRV-USG:terminated_at': None, u'OS-DCF:diskConfig': u'MANUAL', u'os-extended-volumes:volumes_attached': [], u'OS-EXT-STS:vm_state': u'active', u'OS-EXT-SRV-ATTR:instance_name': u'instance-0000002f', u'OS-SRV-USG:launched_at': u'2018-05-27T23:34:01.000000', u'OS-EXT-SRV-ATTR:hypervisor_hostname': u'cmp001.mcp-pike-ovs-ha.local', u'OS-EXT-STS:power_state': 1, u'OS-EXT-AZ:availability_zone': u'nova'}), 'project_id': u'17e0c72255804297b05647b8b64ec56a', u'OS-EXT-SRV-ATTR:hypervisor_hostname': u'cmp001.mcp-pike-ovs-ha.local', 'name': u'opnfv-cinder-2-ssh--24f9faf0-1e0d-4927-b44c-b6d51ff171af', 'adminPass': u'Ap7G9SpRtPzc', 'tenant_id': u'17e0c72255804297b05647b8b64ec56a', 'created_at': u'2018-05-27T23:33:51Z', 'created': u'2018-05-27T23:33:51Z', 'has_config_drive': True, 'volumes': [], 'config_drive': u'True', 'region': 'RegionOne'}) 2018-05-27 23:34:10,035 - functest.opnfv_tests.openstack.cinder.cinder_test - DEBUG - floating_ip2: Munch({'status': u'DOWN', 'router_id': u'11b0208e-7fb4-4547-bdeb-29700eee40fd', 'properties': Munch({u'tags': []}), 'description': u'', u'tags': [], 'tenant_id': u'17e0c72255804297b05647b8b64ec56a', 'created_at': u'2018-05-27T23:34:07Z', 'attached': True, 'updated_at': u'2018-05-27T23:34:07Z', 'id': u'46adb171-543b-4d5d-8ad5-f4b2a4351eee', 'floating_network_id': u'11c92fd4-326a-487a-a640-1b09c88fcb5b', 'fixed_ip_address': u'192.168.130.9', 'floating_ip_address': u'172.30.10.116', 'location': Munch({'project': Munch({'domain_id': None, 'id': u'17e0c72255804297b05647b8b64ec56a', 'name': 'admin', 'domain_name': 'Default'}), 'zone': None, 'region_name': 'RegionOne', 'cloud': 'envvars'}), 'revision_number': 0, 'router': u'11b0208e-7fb4-4547-bdeb-29700eee40fd', 'project_id': u'17e0c72255804297b05647b8b64ec56a', 'port_id': u'124c7705-5d7e-41fd-9039-35b4b29694eb', 'port': u'124c7705-5d7e-41fd-9039-35b4b29694eb', 'network': u'11c92fd4-326a-487a-a640-1b09c88fcb5b'}) 2018-05-27 23:34:11,363 - 
functest.opnfv_tests.openstack.cinder.cinder_test - INFO - Begin test execution 2018-05-27 23:34:21,515 - functest.opnfv_tests.openstack.cinder.cinder_test - DEBUG - ssh: 2018-05-27 23:34:21,545 - functest.opnfv_tests.openstack.cinder.cinder_test - DEBUG - volume_write output: New data added to the volume! 2018-05-27 23:34:21,546 - functest.opnfv_tests.openstack.cinder.cinder_test - INFO - Detach volume from VM 1 2018-05-27 23:34:25,999 - functest.opnfv_tests.openstack.cinder.cinder_test - INFO - Attach volume to VM 2 2018-05-27 23:34:42,441 - functest.opnfv_tests.openstack.cinder.cinder_test - DEBUG - ssh: 2018-05-27 23:34:42,517 - functest.opnfv_tests.openstack.cinder.cinder_test - DEBUG - read volume output: 2018-05-27 23:34:42,518 - xtesting.energy.energy - DEBUG - Restoring previous scenario (cloudify_ims/running) 2018-05-27 23:34:42,518 - xtesting.energy.energy - DEBUG - Submitting scenario (cloudify_ims/running) 2018-05-27 23:34:43,084 - xtesting.core.testcase - ERROR - The HTTP request raises issues Traceback (most recent call last): File "/usr/lib/python2.7/site-packages/xtesting/core/testcase.py", line 207, in push_to_db req.raise_for_status() File "/usr/lib/python2.7/site-packages/requests/models.py", line 935, in raise_for_status raise HTTPError(http_error_msg, response=self) HTTPError: 403 Client Error: Could Not Found testcases [{'project_name': u'functest', 'name': u'cinder_test'}] for url: http://testresults.opnfv.org/test/api/v1/results 2018-05-27 23:34:43,085 - xtesting.ci.run_tests - INFO - Test result: +---------------------+------------------+------------------+----------------+ | TEST CASE | PROJECT | DURATION | RESULT | +---------------------+------------------+------------------+----------------+ | cinder_test | functest | 01:13 | PASS | +---------------------+------------------+------------------+----------------+ 2018-05-27 23:35:38,775 - xtesting.ci.run_tests - INFO - Running test case 'tempest_smoke_serial'... 2018-05-27 23:35:38,880 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - cloud: 2018-05-27 23:35:38,881 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - domain: Default 2018-05-27 23:35:38,881 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - Creating Rally environment... 
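The 403 above is raised by xtesting's push_to_db step: the results API at http://testresults.opnfv.org/test/api/v1/results refuses the record because no 'cinder_test' case is registered for the 'functest' project, while the local verdict (PASS, 01:13) is unaffected. A minimal sketch of the kind of POST that step performs, assuming a simplified payload (the field names below are illustrative, not the exact xtesting schema):

    # Hedged sketch of a result push to the OPNFV test results API.
    # Only the URL and the raise_for_status() behaviour come from the log;
    # the payload keys are simplified assumptions, not the xtesting schema.
    import requests

    TEST_DB_URL = "http://testresults.opnfv.org/test/api/v1/results"

    def push_result(project, case, criteria, duration):
        payload = {
            "project_name": project,      # e.g. "functest"
            "case_name": case,            # e.g. "cinder_test"
            "criteria": criteria,         # e.g. "PASS"
            "details": {"duration": duration},
        }
        req = requests.post(TEST_DB_URL, json=payload, timeout=10)
        # This is the call that turned the "403 Could Not Found testcases"
        # reply into the HTTPError shown in the traceback above.
        req.raise_for_status()
        return req.json()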
2018-05-27 23:35:45,574 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally deployment create --fromenv --name opnfv-rally +--------------------------------------+----------------------------+-------------+------------------+--------+ | uuid | created_at | name | status | active | +--------------------------------------+----------------------------+-------------+------------------+--------+ | 78f29472-ed6f-495e-8c84-b428c7a3bcff | 2018-05-27T23:35:44.739521 | opnfv-rally | deploy->finished | | +--------------------------------------+----------------------------+-------------+------------------+--------+ Using deployment: 78f29472-ed6f-495e-8c84-b428c7a3bcff ~/.rally/openrc was updated HINTS: * To use standard OpenStack clients, set up your env by running: source ~/.rally/openrc OpenStack clients are now configured, e.g run: openstack image list 2018-05-27 23:35:49,719 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally deployment check -------------------------------------------------------------------------------- Platform openstack: -------------------------------------------------------------------------------- Available services: +-------------+----------------+-----------+ | Service | Service Type | Status | +-------------+----------------+-----------+ | __unknown__ | alarming | Available | | __unknown__ | compute_legacy | Available | | __unknown__ | event | Available | | __unknown__ | placement | Available | | __unknown__ | volumev2 | Available | | __unknown__ | volumev3 | Available | | ceilometer | metering | Available | | cinder | volume | Available | | cloud | cloudformation | Available | | designate | dns | Available | | glance | image | Available | | gnocchi | metric | Available | | heat | orchestration | Available | | keystone | identity | Available | | neutron | network | Available | | nova | compute | Available | +-------------+----------------+-----------+ 2018-05-27 23:35:49,719 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - Create verifier from existing repo... 2018-05-27 23:35:57,166 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally verify create-verifier --source /src/tempest --name opnfv-tempest --type tempest --system-wide Using verifier 'opnfv-tempest' (UUID=74a0fb4a-680b-40b9-9e27-b6f705784167) as the default verifier for the future CLI operations. 
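The Rally preparation logged above boils down to three CLI calls: create a deployment from the exported OS_* variables, check which services it can reach, and register the Tempest verifier from the pre-cloned /src/tempest repository. A sketch that replays them with the exact flags from the log:

    # Replay of the Rally preparation steps, using the same commands and
    # flags that appear in the log above.
    import subprocess

    def run(cmd):
        print("running command: %s" % cmd)
        subprocess.check_call(cmd)

    run(["rally", "deployment", "create", "--fromenv", "--name", "opnfv-rally"])
    run(["rally", "deployment", "check"])
    run(["rally", "verify", "create-verifier",
         "--source", "/src/tempest",
         "--name", "opnfv-tempest",
         "--type", "tempest",
         "--system-wide"])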
2018-05-27 23:36:00,481 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Creating network with name: 'tempest-net-635280d3-b53a-48e6-add9-e3743185e995' 2018-05-27 23:36:01,852 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - network: Munch({u'status': u'ACTIVE', u'subnets': [], u'description': u'', u'provider:physical_network': None, u'tags': [], u'ipv6_address_scope': None, u'updated_at': u'2018-05-27T23:36:01Z', u'is_default': False, u'revision_number': 2, u'port_security_enabled': True, u'provider:network_type': u'vxlan', u'id': u'd8bbbf63-c956-498d-a35d-e4ff42cb1c7f', u'provider:segmentation_id': 12, u'router:external': False, u'availability_zone_hints': [], u'availability_zones': [], u'name': u'tempest-net-635280d3-b53a-48e6-add9-e3743185e995', u'admin_state_up': True, u'tenant_id': u'17e0c72255804297b05647b8b64ec56a', u'created_at': u'2018-05-27T23:36:01Z', u'mtu': 1450, u'ipv4_address_scope': None, u'shared': False, u'project_id': u'17e0c72255804297b05647b8b64ec56a'}) 2018-05-27 23:36:03,484 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - subnet: Munch({u'description': u'', u'tags': [], u'updated_at': u'2018-05-27T23:36:02Z', u'ipv6_ra_mode': None, u'allocation_pools': [{u'start': u'192.168.150.2', u'end': u'192.168.150.254'}], u'host_routes': [], u'revision_number': 0, u'ipv6_address_mode': None, u'cidr': u'192.168.150.0/24', u'id': u'f1085e60-4f58-4c8f-a62a-e6d9d4583d7c', u'subnetpool_id': None, u'service_types': [], u'name': u'tempest-subnet-635280d3-b53a-48e6-add9-e3743185e995', u'enable_dhcp': True, u'network_id': u'd8bbbf63-c956-498d-a35d-e4ff42cb1c7f', u'tenant_id': u'17e0c72255804297b05647b8b64ec56a', u'created_at': u'2018-05-27T23:36:02Z', u'dns_nameservers': [u'8.8.8.8'], u'gateway_ip': u'192.168.150.1', u'ip_version': 4, u'project_id': u'17e0c72255804297b05647b8b64ec56a'}) 2018-05-27 23:36:03,484 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - Creating two images for Tempest suite 2018-05-27 23:36:03,484 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Creating image with name: 'Cirros-0.4.0-635280d3-b53a-48e6-add9-e3743185e995' 2018-05-27 23:36:04,564 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - image: Munch({u'status': u'queued', u'owner_specified.shade.object': u'images/Cirros-0.4.0-635280d3-b53a-48e6-add9-e3743185e995', u'tags': [], u'container_format': u'bare', u'min_ram': 0, u'updated_at': u'2018-05-27T23:36:04Z', u'owner_specified.shade.sha256': u'a8dd75ecffd4cdd96072d60c2237b448e0c8b2bc94d57f10fdbc8c481d9005b8', u'locations': [], u'min_disk': 0, u'visibility': u'public', u'file': u'/v2/images/2a1d0b8b-e076-44c5-b70c-170aaf8d8a49/file', u'owner': u'17e0c72255804297b05647b8b64ec56a', u'virtual_size': None, u'owner_specified.shade.md5': u'443b7623e27ecf03dc9e01ee93f67afe', u'id': u'2a1d0b8b-e076-44c5-b70c-170aaf8d8a49', u'size': None, u'name': u'Cirros-0.4.0-635280d3-b53a-48e6-add9-e3743185e995', u'checksum': None, u'self': u'/v2/images/2a1d0b8b-e076-44c5-b70c-170aaf8d8a49', u'disk_format': u'qcow2', u'protected': False, u'created_at': u'2018-05-27T23:36:04Z', u'schema': u'/v2/schemas/image'}) 2018-05-27 23:36:04,565 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Creating image with name: 'Cirros-0.4.0-1-635280d3-b53a-48e6-add9-e3743185e995' 2018-05-27 23:36:05,670 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - image: Munch({u'status': u'queued', u'owner_specified.shade.object': u'images/Cirros-0.4.0-1-635280d3-b53a-48e6-add9-e3743185e995', u'tags': [], u'container_format': 
u'bare', u'min_ram': 0, u'updated_at': u'2018-05-27T23:36:05Z', u'owner_specified.shade.sha256': u'a8dd75ecffd4cdd96072d60c2237b448e0c8b2bc94d57f10fdbc8c481d9005b8', u'locations': [], u'min_disk': 0, u'visibility': u'public', u'file': u'/v2/images/0079c486-5fa9-47d9-bc12-bbf2eba720c0/file', u'owner': u'17e0c72255804297b05647b8b64ec56a', u'virtual_size': None, u'owner_specified.shade.md5': u'443b7623e27ecf03dc9e01ee93f67afe', u'id': u'0079c486-5fa9-47d9-bc12-bbf2eba720c0', u'size': None, u'name': u'Cirros-0.4.0-1-635280d3-b53a-48e6-add9-e3743185e995', u'checksum': None, u'self': u'/v2/images/0079c486-5fa9-47d9-bc12-bbf2eba720c0', u'disk_format': u'qcow2', u'protected': False, u'created_at': u'2018-05-27T23:36:05Z', u'schema': u'/v2/schemas/image'}) 2018-05-27 23:36:05,670 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Creating two flavors for Tempest suite 2018-05-27 23:36:05,977 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - flavor: Munch({'name': u'opnfv_flavor-635280d3-b53a-48e6-add9-e3743185e995', 'ephemeral': 0, 'ram': 512, 'is_disabled': False, 'properties': Munch({u'OS-FLV-DISABLED:disabled': False, u'OS-FLV-EXT-DATA:ephemeral': 0, u'os-flavor-access:is_public': True}), u'OS-FLV-DISABLED:disabled': False, 'vcpus': 1, 'extra_specs': Munch({}), 'location': Munch({'project': Munch({'domain_id': None, 'id': u'17e0c72255804297b05647b8b64ec56a', 'name': 'admin', 'domain_name': 'Default'}), 'zone': None, 'region_name': 'RegionOne', 'cloud': 'envvars'}), u'os-flavor-access:is_public': True, 'rxtx_factor': 1.0, 'is_public': True, u'OS-FLV-EXT-DATA:ephemeral': 0, 'disk': 1, 'id': u'4c8c86d6-cb3b-48f6-826e-8867b052dc3d', 'swap': 0}) 2018-05-27 23:36:06,145 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - flavor: Munch({'name': u'opnfv_flavor_1-635280d3-b53a-48e6-add9-e3743185e995', 'ephemeral': 0, 'ram': 512, 'is_disabled': False, 'properties': Munch({u'OS-FLV-DISABLED:disabled': False, u'OS-FLV-EXT-DATA:ephemeral': 0, u'os-flavor-access:is_public': True}), u'OS-FLV-DISABLED:disabled': False, 'vcpus': 1, 'extra_specs': Munch({}), 'location': Munch({'project': Munch({'domain_id': None, 'id': u'17e0c72255804297b05647b8b64ec56a', 'name': 'admin', 'domain_name': 'Default'}), 'zone': None, 'region_name': 'RegionOne', 'cloud': 'envvars'}), u'os-flavor-access:is_public': True, 'rxtx_factor': 1.0, 'is_public': True, u'OS-FLV-EXT-DATA:ephemeral': 0, 'disk': 1, 'id': u'614a6045-0fb9-4d08-bac7-42a9c5482ba6', 'swap': 0}) 2018-05-27 23:36:10,382 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally verify configure-verifier --reconfigure --id opnfv-tempest 2018-05-27 23:36:10,382 - functest.opnfv_tests.openstack.tempest.conf_utils - DEBUG - Looking for tempest.conf file... 2018-05-27 23:36:10,383 - functest.opnfv_tests.openstack.tempest.conf_utils - DEBUG - Updating selected tempest.conf parameters... 2018-05-27 23:36:10,385 - functest.opnfv_tests.openstack.tempest.conf_utils - DEBUG - Add/Update required params defined in tempest_conf.yaml into tempest.conf file 2018-05-27 23:36:10,390 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - Generating test case list... 
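The "Generating test case list..." step works by listing every Tempest test with stestr and keeping only the api/scenario tests tagged smoke, using the regex shown in the next log entry. The same filtering step in Python, on made-up test IDs for illustration:

    # Filter a Tempest listing down to smoke-tagged api/scenario tests,
    # mirroring the regex used by the stestr command in the log below.
    import re

    SMOKE_RE = re.compile(r'^tempest\.(api|scenario).*\[.*\bsmoke\b.*\]$')

    def smoke_tests(all_tests):
        return [test for test in all_tests if SMOKE_RE.match(test)]

    # Illustrative IDs only; the real input comes from `stestr list`.
    sample = [
        "tempest.api.compute.servers.test_list_servers.ListServersTest.test_list[id-123,smoke]",
        "tempest.api.volume.test_volumes_get.VolumesGetTest.test_get[id-456]",
    ]
    print(smoke_tests(sample))   # keeps only the smoke-tagged test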
2018-05-27 23:36:13,593 - functest.opnfv_tests.openstack.tempest.tempest - INFO - (cd /root/.rally/verification/verifier-74a0fb4a-680b-40b9-9e27-b6f705784167/repo; stestr list '^tempest\.(api|scenario).*\[.*\bsmoke\b.*\]$' >/home/opnfv/functest/results/tempest/test_list.txt 2>/dev/null) 2018-05-27 23:36:13,594 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - Applying tempest blacklist... 2018-05-27 23:36:13,597 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - Tempest blacklist file does not exist. 2018-05-27 23:36:13,598 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Starting Tempest test suite: '['rally', 'verify', 'start', '--load-list', '/home/opnfv/functest/results/tempest/test_list.txt', '--concurrency', '1']'. 2018-05-28 00:00:50,620 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Verification UUID: ee8d91ac-5ff2-427e-ad77-f5047a64df12 2018-05-28 00:00:50,911 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Showing result for a verification: '['rally', 'verify', 'show', '--uuid', 'ee8d91ac-5ff2-427e-ad77-f5047a64df12']'. 2018-05-28 00:00:51,934 - functest.opnfv_tests.openstack.tempest.tempest - INFO - +-----------------------------------------------------------------------------------------+ 2018-05-28 00:00:51,934 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Verification | 2018-05-28 00:00:51,934 - functest.opnfv_tests.openstack.tempest.tempest - INFO - +---------------------+-------------------------------------------------------------------+ 2018-05-28 00:00:51,934 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | UUID | ee8d91ac-5ff2-427e-ad77-f5047a64df12 | 2018-05-28 00:00:51,935 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Status | finished | 2018-05-28 00:00:51,935 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Started at | 2018-05-27 23:36:16 | 2018-05-28 00:00:51,935 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Finished at | 2018-05-28 00:00:50 | 2018-05-28 00:00:51,935 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Duration | 0:24:34 | 2018-05-28 00:00:51,935 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Run arguments | concurrency: 1 | 2018-05-28 00:00:51,935 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | | load_list: (value is too long, use 'detailed' flag to display it) | 2018-05-28 00:00:51,935 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Tags | - | 2018-05-28 00:00:51,935 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Verifier name | opnfv-tempest (UUID: 74a0fb4a-680b-40b9-9e27-b6f705784167) | 2018-05-28 00:00:51,935 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Verifier type | tempest (platform: openstack) | 2018-05-28 00:00:51,936 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Deployment name | opnfv-rally (UUID: 78f29472-ed6f-495e-8c84-b428c7a3bcff) | 2018-05-28 00:00:51,936 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Tests count | 109 | 2018-05-28 00:00:51,936 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Tests duration, sec | 1436.298 | 2018-05-28 00:00:51,936 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Success | 90 | 2018-05-28 00:00:51,936 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Skipped | 19 | 2018-05-28 00:00:51,936 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Expected failures | 0 | 2018-05-28 00:00:51,936 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | 
Unexpected success | 0 | 2018-05-28 00:00:51,936 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Failures | 0 | 2018-05-28 00:00:51,936 - functest.opnfv_tests.openstack.tempest.tempest - INFO - +---------------------+-------------------------------------------------------------------+ 2018-05-28 00:00:51,936 - functest.opnfv_tests.openstack.tempest.tempest - INFO - +-----------------------------------------------------------------------------------------------------------------------------------------------------+ 2018-05-28 00:00:51,940 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Tempest tempest_smoke_serial success_rate is 100.0% 2018-05-28 00:00:59,606 - xtesting.core.testcase - INFO - The results were successfully pushed to DB http://testresults.opnfv.org/test/api/v1/results 2018-05-28 00:00:59,606 - xtesting.ci.run_tests - INFO - Test result: +------------------------------+------------------+------------------+----------------+ | TEST CASE | PROJECT | DURATION | RESULT | +------------------------------+------------------+------------------+----------------+ | tempest_smoke_serial | functest | 24:59 | PASS | +------------------------------+------------------+------------------+----------------+ 2018-05-28 00:00:59,611 - xtesting.ci.run_tests - INFO - Running test case 'rally_sanity'... 2018-05-28 00:00:59,714 - xtesting.energy.energy - DEBUG - Getting current scenario 2018-05-28 00:01:00,159 - xtesting.energy.energy - DEBUG - Starting recording 2018-05-28 00:01:00,160 - xtesting.energy.energy - DEBUG - Submitting scenario (rally_sanity/running) 2018-05-28 00:01:00,573 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - Creating Rally environment... 2018-05-28 00:01:04,063 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally deployment destroy --deployment opnfv-rally 2018-05-28 00:01:07,310 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally deployment create --fromenv --name opnfv-rally +--------------------------------------+----------------------------+-------------+------------------+--------+ | uuid | created_at | name | status | active | +--------------------------------------+----------------------------+-------------+------------------+--------+ | ab5978d4-6b09-42f2-b591-d6a789dcbb31 | 2018-05-28T00:01:06.594910 | opnfv-rally | deploy->finished | | +--------------------------------------+----------------------------+-------------+------------------+--------+ Using deployment: ab5978d4-6b09-42f2-b591-d6a789dcbb31 ~/.rally/openrc was updated HINTS: * To use standard OpenStack clients, set up your env by running: source ~/.rally/openrc OpenStack clients are now configured, e.g run: openstack image list 2018-05-28 00:01:11,463 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally deployment check -------------------------------------------------------------------------------- Platform openstack: -------------------------------------------------------------------------------- Available services: +-------------+----------------+-----------+ | Service | Service Type | Status | +-------------+----------------+-----------+ | __unknown__ | alarming | Available | | __unknown__ | compute_legacy | Available | | __unknown__ | event | Available | | __unknown__ | placement | Available | | __unknown__ | volumev2 | Available | | __unknown__ | volumev3 | Available | | ceilometer | metering | Available | | cinder | volume | Available | | cloud | cloudformation | Available | | designate | dns | Available | | glance | 
image | Available | | gnocchi | metric | Available | | heat | orchestration | Available | | keystone | identity | Available | | neutron | network | Available | | nova | compute | Available | +-------------+----------------+-----------+ 2018-05-28 00:01:11,464 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Validating the test name... 2018-05-28 00:01:12,846 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Creating image 'Cirros-0.4.0-3e1f9ebd-fad7-43a1-b71f-33566334d587'... 2018-05-28 00:01:13,888 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Creating network 'rally-net-3e1f9ebd-fad7-43a1-b71f-33566334d587'... 2018-05-28 00:01:15,938 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Creating router 'rally-router-3e1f9ebd-fad7-43a1-b71f-33566334d587'... 2018-05-28 00:01:22,340 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Creating flavor 'rally-tiny-3e1f9ebd-fad7-43a1-b71f-33566334d587'... 2018-05-28 00:01:22,595 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Creating flavor 'rally-mini-3e1f9ebd-fad7-43a1-b71f-33566334d587'... 2018-05-28 00:01:22,786 - functest.opnfv_tests.openstack.rally.rally - INFO - Starting test scenario "authenticate" ... 2018-05-28 00:01:22,787 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Scenario fetched from : /usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/opnfv-authenticate.yaml 2018-05-28 00:01:22,787 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Applying blacklist... 2018-05-28 00:01:22,806 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Blacklisted tests: [u'Quotas.nova_update_and_delete'] 2018-05-28 00:01:22,807 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'start', '--abort-on-sla-failure', '--task', '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/task.yaml', '--task-args', "{'smoke': True, 'tmpl_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/templates', 'floating_network': 'floating_net', 'service_list': ['authenticate'], 'concurrency': 4, 'netid': '187f0eb3-1c97-45c4-afc0-1161d450165b', 'tenants_amount': 3, 'image_name': 'Cirros-0.4.0-3e1f9ebd-fad7-43a1-b71f-33566334d587', 'flavor_name': 'rally-tiny-3e1f9ebd-fad7-43a1-b71f-33566334d587', 'glance_image_location': '/home/opnfv/functest/images/cirros-0.4.0-x86_64-disk.img', 'use_existing_users': False, 'flavor_alt_name': 'rally-mini-3e1f9ebd-fad7-43a1-b71f-33566334d587', 'iterations': 10, 'users_amount': 2, 'sup_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/support', 'glance_image_format': 'qcow2'}"] 2018-05-28 00:02:52,224 - functest.opnfv_tests.openstack.rally.rally - DEBUG - task_id : b4576a8b-1620-4020-983b-4b5dc8e69f21 2018-05-28 00:02:52,225 - functest.opnfv_tests.openstack.rally.rally - DEBUG - /home/opnfv/functest/results/rally does not exist, we create it. 
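Every Rally scenario in this run follows the cycle visible above and below: start the rendered task with --abort-on-sla-failure and the per-service task args, read the task UUID back, print the detailed view, then save JSON and HTML reports under /home/opnfv/functest/results/rally. A condensed sketch of that cycle (the UUID extraction is an approximation of what functest does; the task-args dictionary is abbreviated to the fields shown in the log):

    # Condensed per-scenario cycle: start the task, then dump the
    # detailed, JSON and HTML reports, as in the log above and below.
    import re
    import subprocess

    TASK_FILE = ("/usr/lib/python2.7/site-packages/functest/opnfv_tests/"
                 "openstack/rally/task.yaml")
    RESULTS_DIR = "/home/opnfv/functest/results/rally"
    UUID_RE = re.compile(r"[0-9a-f]{8}(?:-[0-9a-f]{4}){3}-[0-9a-f]{12}")

    def run_scenario(service, task_args):
        out = subprocess.check_output(
            ["rally", "task", "start", "--abort-on-sla-failure",
             "--task", TASK_FILE, "--task-args", str(task_args)]).decode()
        # functest reads the task UUID back from Rally's output; a plain
        # UUID regex over stdout is used here as an approximation.
        task_id = UUID_RE.search(out).group(0)
        subprocess.check_call(["rally", "task", "detailed", "--uuid", task_id])
        subprocess.check_call(["rally", "task", "report", "--json",
                               "--uuid", task_id])
        subprocess.check_call(["rally", "task", "report", "--html",
                               "--uuid", task_id,
                               "--out", "%s/opnfv-%s.html" % (RESULTS_DIR, service)])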
2018-05-28 00:02:52,226 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'detailed', '--uuid', 'b4576a8b-1620-4020-983b-4b5dc8e69f21'] 2018-05-28 00:02:53,320 - functest.opnfv_tests.openstack.rally.rally - INFO - -------------------------------------------------------------------------------- Task b4576a8b-1620-4020-983b-4b5dc8e69f21: finished -------------------------------------------------------------------------------- test scenario Authenticate.keystone args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": {}, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task b4576a8b-1620-4020-983b-4b5dc8e69f21 has 0 error(s) -------------------------------------------------------------------------------- +--------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | authenticate.keystone | 0.807 | 0.807 | 0.807 | 0.807 | 0.807 | 0.807 | 100.0% | 1 | | total | 0.807 | 0.807 | 0.807 | 0.807 | 0.807 | 0.807 | 100.0% | 1 | | -> duration | 0.807 | 0.807 | 0.807 | 0.807 | 0.807 | 0.807 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.806818 Full duration: 10.134331 -------------------------------------------------------------------------------- test scenario Authenticate.validate_cinder args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": { "repetitions": 2 }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task b4576a8b-1620-4020-983b-4b5dc8e69f21 has 0 error(s) -------------------------------------------------------------------------------- +---------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | authenticate.validate_cinder | 0.622 | 0.622 | 0.622 | 0.622 | 0.622 | 0.622 | 100.0% | 1 | | total | 1.468 | 1.468 | 1.468 | 1.468 | 1.468 | 1.468 | 100.0% | 1 | | -> duration | 1.468 | 1.468 | 1.468 | 1.468 | 1.468 | 1.468 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.467613 Full duration: 10.811465 
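The Authenticate.validate_* scenarios in this task (cinder above; glance, heat, neutron and nova below) each obtain a token and then exercise the corresponding service API for the requested number of repetitions (2 here), with an SLA of zero tolerated failures. A rough stand-in using the openstacksdk cloud layer (the same shade-style layer whose Munch objects appear throughout this log); the scenario-to-call mapping is an approximation for illustration, not Rally's actual implementation:

    # Rough stand-in for the Authenticate.validate_* scenarios: connect
    # once (token validation), then call each service listing API
    # `repetitions` times. The mapping below is illustrative only.
    import openstack

    def validate_services(repetitions=2):
        conn = openstack.connect()   # reads the exported OS_* variables
        checks = {
            "cinder": conn.list_volumes,
            "glance": conn.list_images,
            "heat": conn.list_stacks,
            "neutron": conn.list_networks,
            "nova": conn.list_servers,
        }
        failures = 0
        for name, call in checks.items():
            for _ in range(repetitions):
                try:
                    call()
                except Exception:
                    failures += 1
        # Mirrors the task's SLA of failure_rate max 0.
        assert failures == 0, "%d validation call(s) failed" % failures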
-------------------------------------------------------------------------------- test scenario Authenticate.validate_glance args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": { "repetitions": 2 }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task b4576a8b-1620-4020-983b-4b5dc8e69f21 has 0 error(s) -------------------------------------------------------------------------------- +---------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | authenticate.validate_glance | 1.051 | 1.051 | 1.051 | 1.051 | 1.051 | 1.051 | 100.0% | 1 | | total | 1.88 | 1.88 | 1.88 | 1.88 | 1.88 | 1.88 | 100.0% | 1 | | -> duration | 1.88 | 1.88 | 1.88 | 1.88 | 1.88 | 1.88 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.879837 Full duration: 11.077979 -------------------------------------------------------------------------------- test scenario Authenticate.validate_heat args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": { "repetitions": 2 }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task b4576a8b-1620-4020-983b-4b5dc8e69f21 has 0 error(s) -------------------------------------------------------------------------------- +-------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | authenticate.validate_heat | 0.879 | 0.879 | 0.879 | 0.879 | 0.879 | 0.879 | 100.0% | 1 | | total | 1.758 | 1.758 | 1.758 | 1.758 | 1.758 | 1.758 | 100.0% | 1 | | -> duration | 1.758 | 1.758 | 1.758 | 1.758 | 1.758 | 1.758 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.757832 Full duration: 11.008266 -------------------------------------------------------------------------------- test scenario Authenticate.validate_neutron args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": { "repetitions": 2 }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } 
-------------------------------------------------------------------------------- Task b4576a8b-1620-4020-983b-4b5dc8e69f21 has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | authenticate.validate_neutron | 1.257 | 1.257 | 1.257 | 1.257 | 1.257 | 1.257 | 100.0% | 1 | | total | 1.966 | 1.966 | 1.966 | 1.966 | 1.966 | 1.966 | 100.0% | 1 | | -> duration | 1.966 | 1.966 | 1.966 | 1.966 | 1.966 | 1.966 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.965522 Full duration: 11.413531 -------------------------------------------------------------------------------- test scenario Authenticate.validate_nova args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": { "repetitions": 2 }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task b4576a8b-1620-4020-983b-4b5dc8e69f21 has 0 error(s) -------------------------------------------------------------------------------- +-------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | authenticate.validate_nova | 0.53 | 0.53 | 0.53 | 0.53 | 0.53 | 0.53 | 100.0% | 1 | | total | 1.346 | 1.346 | 1.346 | 1.346 | 1.346 | 1.346 | 100.0% | 1 | | -> duration | 1.346 | 1.346 | 1.346 | 1.346 | 1.346 | 1.346 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.346078 Full duration: 10.711694 HINTS: * To plot HTML graphics with this data, run: rally task report b4576a8b-1620-4020-983b-4b5dc8e69f21 --out output.html * To generate a JUnit report, run: rally task export b4576a8b-1620-4020-983b-4b5dc8e69f21 --type junit --to output.xml * To get raw JSON output of task results, run: rally task report b4576a8b-1620-4020-983b-4b5dc8e69f21 --json --out output.json 2018-05-28 00:02:53,320 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'report', '--json', '--uuid', 'b4576a8b-1620-4020-983b-4b5dc8e69f21'] 2018-05-28 00:02:55,980 - functest.opnfv_tests.openstack.rally.rally - DEBUG - saving json file 2018-05-28 00:02:55,981 - functest.opnfv_tests.openstack.rally.rally - DEBUG - 
running command: ['rally', 'task', 'report', '--html', '--uuid', 'b4576a8b-1620-4020-983b-4b5dc8e69f21', '--out', '/home/opnfv/functest/results/rally/opnfv-authenticate.html'] 2018-05-28 00:02:56,006 - functest.opnfv_tests.openstack.rally.rally - INFO - Test scenario: "authenticate" OK. 2018-05-28 00:02:56,007 - functest.opnfv_tests.openstack.rally.rally - INFO - Starting test scenario "glance" ... 2018-05-28 00:02:56,007 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Scenario fetched from : /usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/sanity/opnfv-glance.yaml 2018-05-28 00:02:56,008 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Applying blacklist... 2018-05-28 00:02:56,030 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Blacklisted tests: [u'Quotas.nova_update_and_delete'] 2018-05-28 00:02:56,030 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'start', '--abort-on-sla-failure', '--task', '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/task.yaml', '--task-args', "{'smoke': True, 'tmpl_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/templates', 'floating_network': 'floating_net', 'service_list': ['glance'], 'concurrency': 4, 'netid': '187f0eb3-1c97-45c4-afc0-1161d450165b', 'tenants_amount': 3, 'image_name': 'Cirros-0.4.0-3e1f9ebd-fad7-43a1-b71f-33566334d587', 'flavor_name': 'rally-tiny-3e1f9ebd-fad7-43a1-b71f-33566334d587', 'glance_image_location': '/home/opnfv/functest/images/cirros-0.4.0-x86_64-disk.img', 'use_existing_users': False, 'flavor_alt_name': 'rally-mini-3e1f9ebd-fad7-43a1-b71f-33566334d587', 'iterations': 10, 'users_amount': 2, 'sup_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/support', 'glance_image_format': 'qcow2'}"] 2018-05-28 00:05:03,289 - functest.opnfv_tests.openstack.rally.rally - DEBUG - task_id : 0480bfb7-6234-475a-9134-d92280dfcbc4 2018-05-28 00:05:03,290 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'detailed', '--uuid', '0480bfb7-6234-475a-9134-d92280dfcbc4'] 2018-05-28 00:05:04,306 - functest.opnfv_tests.openstack.rally.rally - INFO - -------------------------------------------------------------------------------- Task 0480bfb7-6234-475a-9134-d92280dfcbc4: finished -------------------------------------------------------------------------------- test scenario GlanceImages.create_and_delete_image args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": { "container_format": "bare", "disk_format": "qcow2", "image_location": "/home/opnfv/functest/images/cirros-0.4.0-x86_64-disk.img" }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 0480bfb7-6234-475a-9134-d92280dfcbc4 has 0 error(s) -------------------------------------------------------------------------------- +---------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | 
+------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | glance_v2.create_image | 4.014 | 4.014 | 4.014 | 4.014 | 4.014 | 4.014 | 100.0% | 1 | | -> glance_v2.get_image (x2) | 0.634 | 0.634 | 0.634 | 0.634 | 0.634 | 0.634 | 100.0% | 1 | | -> glance_v2.upload_data | 0.625 | 0.625 | 0.625 | 0.625 | 0.625 | 0.625 | 100.0% | 1 | | glance_v2.delete_image | 0.396 | 0.396 | 0.396 | 0.396 | 0.396 | 0.396 | 100.0% | 1 | | total | 4.41 | 4.41 | 4.41 | 4.41 | 4.41 | 4.41 | 100.0% | 1 | | -> duration | 4.41 | 4.41 | 4.41 | 4.41 | 4.41 | 4.41 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 4.409781 Full duration: 16.333858 -------------------------------------------------------------------------------- test scenario GlanceImages.create_and_list_image args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": { "container_format": "bare", "disk_format": "qcow2", "image_location": "/home/opnfv/functest/images/cirros-0.4.0-x86_64-disk.img" }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 0480bfb7-6234-475a-9134-d92280dfcbc4 has 0 error(s) -------------------------------------------------------------------------------- +---------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | glance_v2.create_image | 4.227 | 4.227 | 4.227 | 4.227 | 4.227 | 4.227 | 100.0% | 1 | | -> glance_v2.get_image (x2) | 0.774 | 0.774 | 0.774 | 0.774 | 0.774 | 0.774 | 100.0% | 1 | | -> glance_v2.upload_data | 0.597 | 0.597 | 0.597 | 0.597 | 0.597 | 0.597 | 100.0% | 1 | | glance_v2.list_images | 0.293 | 0.293 | 0.293 | 0.293 | 0.293 | 0.293 | 100.0% | 1 | | total | 4.52 | 4.52 | 4.52 | 4.52 | 4.52 | 4.52 | 100.0% | 1 | | -> duration | 4.52 | 4.52 | 4.52 | 4.52 | 4.52 | 4.52 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 4.519875 Full duration: 19.459449 -------------------------------------------------------------------------------- test scenario GlanceImages.list_images args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": {}, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 0480bfb7-6234-475a-9134-d92280dfcbc4 has 0 error(s) -------------------------------------------------------------------------------- +--------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | 
+-----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | glance_v2.list_images | 0.606 | 0.606 | 0.606 | 0.606 | 0.606 | 0.606 | 100.0% | 1 | | total | 0.606 | 0.606 | 0.606 | 0.606 | 0.606 | 0.606 | 100.0% | 1 | | -> duration | 0.606 | 0.606 | 0.606 | 0.606 | 0.606 | 0.606 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.60646 Full duration: 11.95498 -------------------------------------------------------------------------------- test scenario GlanceImages.create_image_and_boot_instances args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 }, "quotas": { "nova": { "ram": -1, "floating_ips": -1, "security_group_rules": -1, "instances": -1, "cores": -1, "security_groups": -1 } } }, "args": { "image_location": "/home/opnfv/functest/images/cirros-0.4.0-x86_64-disk.img", "container_format": "bare", "disk_format": "qcow2", "number_instances": 2, "nics": [ { "net-id": "187f0eb3-1c97-45c4-afc0-1161d450165b" } ], "flavor": { "name": "rally-tiny-3e1f9ebd-fad7-43a1-b71f-33566334d587" } }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 0480bfb7-6234-475a-9134-d92280dfcbc4 has 0 error(s) -------------------------------------------------------------------------------- +---------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | glance_v2.create_image | 4.392 | 4.392 | 4.392 | 4.392 | 4.392 | 4.392 | 100.0% | 1 | | -> glance_v2.get_image (x2) | 0.848 | 0.848 | 0.848 | 0.848 | 0.848 | 0.848 | 100.0% | 1 | | -> glance_v2.upload_data | 0.797 | 0.797 | 0.797 | 0.797 | 0.797 | 0.797 | 100.0% | 1 | | nova.boot_servers | 23.708 | 23.708 | 23.708 | 23.708 | 23.708 | 23.708 | 100.0% | 1 | | total | 28.101 | 28.101 | 28.101 | 28.101 | 28.101 | 28.101 | 100.0% | 1 | | -> duration | 27.101 | 27.101 | 27.101 | 27.101 | 27.101 | 27.101 | 100.0% | 1 | | -> idle_duration | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 100.0% | 1 | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 27.101191 Full duration: 54.480913 HINTS: * To plot HTML graphics with this data, run: rally task report 0480bfb7-6234-475a-9134-d92280dfcbc4 --out output.html * To generate a JUnit report, run: rally task export 0480bfb7-6234-475a-9134-d92280dfcbc4 --type junit --to output.xml * To get raw JSON output of task results, run: rally task report 0480bfb7-6234-475a-9134-d92280dfcbc4 --json --out output.json 2018-05-28 00:05:04,306 - 
functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'report', '--json', '--uuid', '0480bfb7-6234-475a-9134-d92280dfcbc4'] 2018-05-28 00:05:07,141 - functest.opnfv_tests.openstack.rally.rally - DEBUG - saving json file 2018-05-28 00:05:07,141 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'report', '--html', '--uuid', '0480bfb7-6234-475a-9134-d92280dfcbc4', '--out', '/home/opnfv/functest/results/rally/opnfv-glance.html'] 2018-05-28 00:05:07,165 - functest.opnfv_tests.openstack.rally.rally - INFO - Test scenario: "glance" OK. 2018-05-28 00:05:07,166 - functest.opnfv_tests.openstack.rally.rally - INFO - Starting test scenario "cinder" ... 2018-05-28 00:05:07,167 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Scenario fetched from : /usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/sanity/opnfv-cinder.yaml 2018-05-28 00:05:07,167 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Applying blacklist... 2018-05-28 00:05:07,189 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Blacklisted tests: [u'Quotas.nova_update_and_delete'] 2018-05-28 00:05:07,190 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'start', '--abort-on-sla-failure', '--task', '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/task.yaml', '--task-args', "{'smoke': True, 'tmpl_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/templates', 'floating_network': 'floating_net', 'service_list': ['cinder'], 'concurrency': 4, 'netid': '187f0eb3-1c97-45c4-afc0-1161d450165b', 'tenants_amount': 3, 'image_name': 'Cirros-0.4.0-3e1f9ebd-fad7-43a1-b71f-33566334d587', 'flavor_name': 'rally-tiny-3e1f9ebd-fad7-43a1-b71f-33566334d587', 'glance_image_location': '/home/opnfv/functest/images/cirros-0.4.0-x86_64-disk.img', 'use_existing_users': False, 'flavor_alt_name': 'rally-mini-3e1f9ebd-fad7-43a1-b71f-33566334d587', 'iterations': 10, 'users_amount': 2, 'sup_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/support', 'glance_image_format': 'qcow2'}"] 2018-05-28 00:11:20,890 - functest.opnfv_tests.openstack.rally.rally - DEBUG - task_id : 5c94c663-182b-466e-ab9f-39bec324528d 2018-05-28 00:11:20,891 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'detailed', '--uuid', '5c94c663-182b-466e-ab9f-39bec324528d'] 2018-05-28 00:11:21,989 - functest.opnfv_tests.openstack.rally.rally - INFO - -------------------------------------------------------------------------------- Task 5c94c663-182b-466e-ab9f-39bec324528d: finished -------------------------------------------------------------------------------- test scenario CinderVolumes.create_and_delete_snapshot args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 }, "volumes": { "volumes_per_tenant": 1, "size": 1 }, "quotas": { "cinder": { "gigabytes": -1, "volumes": -1, "snapshots": -1 } } }, "args": { "force": false }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 5c94c663-182b-466e-ab9f-39bec324528d has 0 error(s) -------------------------------------------------------------------------------- +------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) 
| +---------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +---------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | cinder_v2.create_snapshot | 3.854 | 3.854 | 3.854 | 3.854 | 3.854 | 3.854 | 100.0% | 1 | | cinder_v2.delete_snapshot | 11.655 | 11.655 | 11.655 | 11.655 | 11.655 | 11.655 | 100.0% | 1 | | total | 15.51 | 15.51 | 15.51 | 15.51 | 15.51 | 15.51 | 100.0% | 1 | | -> duration | 15.51 | 15.51 | 15.51 | 15.51 | 15.51 | 15.51 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +---------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 15.50951 Full duration: 48.700267 -------------------------------------------------------------------------------- test scenario CinderVolumes.create_and_delete_volume args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 }, "quotas": { "cinder": { "gigabytes": -1, "volumes": -1, "snapshots": -1 } } }, "args": { "size": { "max": 1, "min": 1 } }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 5c94c663-182b-466e-ab9f-39bec324528d has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | cinder_v2.create_volume | 4.507 | 4.507 | 4.507 | 4.507 | 4.507 | 4.507 | 100.0% | 1 | | cinder_v2.delete_volume | 12.198 | 12.198 | 12.198 | 12.198 | 12.198 | 12.198 | 100.0% | 1 | | total | 16.706 | 16.706 | 16.706 | 16.706 | 16.706 | 16.706 | 100.0% | 1 | | -> duration | 16.706 | 16.706 | 16.706 | 16.706 | 16.706 | 16.706 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 16.705748 Full duration: 31.006544 -------------------------------------------------------------------------------- test scenario CinderVolumes.create_and_delete_volume args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 }, "quotas": { "cinder": { "gigabytes": -1, "volumes": -1, "snapshots": -1 } } }, "args": { "image": { "name": "Cirros-0.4.0-3e1f9ebd-fad7-43a1-b71f-33566334d587" }, "size": 1 }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 5c94c663-182b-466e-ab9f-39bec324528d has 0 error(s) -------------------------------------------------------------------------------- 
+----------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | cinder_v2.create_volume | 7.392 | 7.392 | 7.392 | 7.392 | 7.392 | 7.392 | 100.0% | 1 | | cinder_v2.delete_volume | 10.298 | 10.298 | 10.298 | 10.298 | 10.298 | 10.298 | 100.0% | 1 | | total | 17.69 | 17.69 | 17.69 | 17.69 | 17.69 | 17.69 | 100.0% | 1 | | -> duration | 17.69 | 17.69 | 17.69 | 17.69 | 17.69 | 17.69 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 17.689963 Full duration: 33.549636 -------------------------------------------------------------------------------- test scenario CinderVolumes.create_and_delete_volume args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 }, "quotas": { "cinder": { "gigabytes": -1, "volumes": -1, "snapshots": -1 } } }, "args": { "size": 1 }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 5c94c663-182b-466e-ab9f-39bec324528d has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | cinder_v2.create_volume | 4.87 | 4.87 | 4.87 | 4.87 | 4.87 | 4.87 | 100.0% | 1 | | cinder_v2.delete_volume | 12.319 | 12.319 | 12.319 | 12.319 | 12.319 | 12.319 | 100.0% | 1 | | total | 17.189 | 17.189 | 17.189 | 17.189 | 17.189 | 17.189 | 100.0% | 1 | | -> duration | 17.189 | 17.189 | 17.189 | 17.189 | 17.189 | 17.189 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 17.189452 Full duration: 32.841593 -------------------------------------------------------------------------------- test scenario CinderVolumes.create_and_extend_volume args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 }, "quotas": { "cinder": { "gigabytes": -1, "volumes": -1, "snapshots": -1 } } }, "args": { "new_size": 2, "size": 1 }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 5c94c663-182b-466e-ab9f-39bec324528d has 0 error(s) -------------------------------------------------------------------------------- 
+----------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | cinder_v2.create_volume | 4.648 | 4.648 | 4.648 | 4.648 | 4.648 | 4.648 | 100.0% | 1 | | cinder_v2.extend_volume | 3.012 | 3.012 | 3.012 | 3.012 | 3.012 | 3.012 | 100.0% | 1 | | cinder_v2.delete_volume | 21.427 | 21.427 | 21.427 | 21.427 | 21.427 | 21.427 | 100.0% | 1 | | total | 29.088 | 29.088 | 29.088 | 29.088 | 29.088 | 29.088 | 100.0% | 1 | | -> duration | 29.088 | 29.088 | 29.088 | 29.088 | 29.088 | 29.088 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 29.088114 Full duration: 44.014919 -------------------------------------------------------------------------------- test scenario CinderVolumes.create_from_volume_and_delete_volume args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 }, "volumes": { "volumes_per_tenant": 1, "size": 1 }, "quotas": { "cinder": { "gigabytes": -1, "volumes": -1, "snapshots": -1 } } }, "args": { "size": 1 }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 5c94c663-182b-466e-ab9f-39bec324528d has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | cinder_v2.create_volume | 28.48 | 28.48 | 28.48 | 28.48 | 28.48 | 28.48 | 100.0% | 1 | | cinder_v2.delete_volume | 12.648 | 12.648 | 12.648 | 12.648 | 12.648 | 12.648 | 100.0% | 1 | | total | 41.129 | 41.129 | 41.129 | 41.129 | 41.129 | 41.129 | 100.0% | 1 | | -> duration | 41.129 | 41.129 | 41.129 | 41.129 | 41.129 | 41.129 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 41.1292 Full duration: 75.139025 -------------------------------------------------------------------------------- test scenario CinderQos.create_and_list_qos args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": { "write_iops_sec": "10", "consumer": "both", "read_iops_sec": "1000" }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 5c94c663-182b-466e-ab9f-39bec324528d has 0 error(s) 
-------------------------------------------------------------------------------- +-------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | cinder_v2.create_qos | 1.081 | 1.081 | 1.081 | 1.081 | 1.081 | 1.081 | 100.0% | 1 | | cinder_v2.list_qos | 0.124 | 0.124 | 0.124 | 0.124 | 0.124 | 0.124 | 100.0% | 1 | | total | 1.206 | 1.206 | 1.206 | 1.206 | 1.206 | 1.206 | 100.0% | 1 | | -> duration | 1.206 | 1.206 | 1.206 | 1.206 | 1.206 | 1.206 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.20577 Full duration: 17.828616 -------------------------------------------------------------------------------- test scenario CinderQos.create_and_set_qos args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": { "set_read_iops_sec": "1001", "set_consumer": "both", "read_iops_sec": "1000", "set_write_iops_sec": "11", "write_iops_sec": "10", "consumer": "back-end" }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 5c94c663-182b-466e-ab9f-39bec324528d has 0 error(s) -------------------------------------------------------------------------------- +-------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | cinder_v2.create_qos | 1.294 | 1.294 | 1.294 | 1.294 | 1.294 | 1.294 | 100.0% | 1 | | cinder_v2.set_qos | 0.159 | 0.159 | 0.159 | 0.159 | 0.159 | 0.159 | 100.0% | 1 | | total | 1.454 | 1.454 | 1.454 | 1.454 | 1.454 | 1.454 | 100.0% | 1 | | -> duration | 1.454 | 1.454 | 1.454 | 1.454 | 1.454 | 1.454 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.453878 Full duration: 18.174783 -------------------------------------------------------------------------------- test scenario CinderVolumeTypes.create_and_list_volume_types args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": { "description": "rally tests creating types" }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 5c94c663-182b-466e-ab9f-39bec324528d has 0 error(s) -------------------------------------------------------------------------------- 
+---------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | cinder_v2.create_volume_type | 1.272 | 1.272 | 1.272 | 1.272 | 1.272 | 1.272 | 100.0% | 1 | | cinder_v2.list_types | 0.123 | 0.123 | 0.123 | 0.123 | 0.123 | 0.123 | 100.0% | 1 | | total | 1.396 | 1.396 | 1.396 | 1.396 | 1.396 | 1.396 | 100.0% | 1 | | -> duration | 1.396 | 1.396 | 1.396 | 1.396 | 1.396 | 1.396 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.396047 Full duration: 17.393517 -------------------------------------------------------------------------------- test scenario CinderVolumeTypes.create_volume_type_and_encryption_type args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": { "control_location": "front-end", "cipher": "aes-xts-plain64", "description": "rally tests creating types", "key_size": 512, "provider": "LuksEncryptor" }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 5c94c663-182b-466e-ab9f-39bec324528d has 0 error(s) -------------------------------------------------------------------------------- +-------------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +----------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +----------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | cinder_v2.create_volume_type | 1.096 | 1.096 | 1.096 | 1.096 | 1.096 | 1.096 | 100.0% | 1 | | cinder_v2.create_encryption_type | 0.273 | 0.273 | 0.273 | 0.273 | 0.273 | 0.273 | 100.0% | 1 | | total | 1.369 | 1.369 | 1.369 | 1.369 | 1.369 | 1.369 | 100.0% | 1 | | -> duration | 1.369 | 1.369 | 1.369 | 1.369 | 1.369 | 1.369 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +----------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.368692 Full duration: 17.449826 HINTS: * To plot HTML graphics with this data, run: rally task report 5c94c663-182b-466e-ab9f-39bec324528d --out output.html * To generate a JUnit report, run: rally task export 5c94c663-182b-466e-ab9f-39bec324528d --type junit --to output.xml * To get raw JSON output of task results, run: rally task report 5c94c663-182b-466e-ab9f-39bec324528d --json --out output.json 2018-05-28 00:11:21,990 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'report', '--json', '--uuid', '5c94c663-182b-466e-ab9f-39bec324528d'] 2018-05-28 
00:11:24,589 - functest.opnfv_tests.openstack.rally.rally - DEBUG - saving json file 2018-05-28 00:11:24,590 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'report', '--html', '--uuid', '5c94c663-182b-466e-ab9f-39bec324528d', '--out', '/home/opnfv/functest/results/rally/opnfv-cinder.html'] 2018-05-28 00:11:24,615 - functest.opnfv_tests.openstack.rally.rally - INFO - Test scenario: "cinder" OK. 2018-05-28 00:11:24,615 - functest.opnfv_tests.openstack.rally.rally - INFO - Starting test scenario "gnocchi" ... 2018-05-28 00:11:24,616 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Scenario fetched from : /usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/sanity/opnfv-gnocchi.yaml 2018-05-28 00:11:24,616 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Applying blacklist... 2018-05-28 00:11:24,631 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Blacklisted tests: [u'Quotas.nova_update_and_delete'] 2018-05-28 00:11:24,632 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'start', '--abort-on-sla-failure', '--task', '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/task.yaml', '--task-args', "{'smoke': True, 'tmpl_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/templates', 'floating_network': 'floating_net', 'service_list': ['gnocchi'], 'concurrency': 4, 'netid': '187f0eb3-1c97-45c4-afc0-1161d450165b', 'tenants_amount': 3, 'image_name': 'Cirros-0.4.0-3e1f9ebd-fad7-43a1-b71f-33566334d587', 'flavor_name': 'rally-tiny-3e1f9ebd-fad7-43a1-b71f-33566334d587', 'glance_image_location': '/home/opnfv/functest/images/cirros-0.4.0-x86_64-disk.img', 'use_existing_users': False, 'flavor_alt_name': 'rally-mini-3e1f9ebd-fad7-43a1-b71f-33566334d587', 'iterations': 10, 'users_amount': 2, 'sup_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/support', 'glance_image_format': 'qcow2'}"] 2018-05-28 00:14:09,940 - functest.opnfv_tests.openstack.rally.rally - DEBUG - task_id : aabee261-9471-4cc5-b7e9-4d29e7e1053a 2018-05-28 00:14:09,941 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'detailed', '--uuid', 'aabee261-9471-4cc5-b7e9-4d29e7e1053a'] 2018-05-28 00:14:11,016 - functest.opnfv_tests.openstack.rally.rally - INFO - -------------------------------------------------------------------------------- Task aabee261-9471-4cc5-b7e9-4d29e7e1053a: finished -------------------------------------------------------------------------------- test scenario Gnocchi.list_capabilities args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": {}, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task aabee261-9471-4cc5-b7e9-4d29e7e1053a has 0 error(s) -------------------------------------------------------------------------------- +------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +---------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | 
+---------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | gnocchi.list_capabilities | 1.226 | 1.226 | 1.226 | 1.226 | 1.226 | 1.226 | 100.0% | 1 | | total | 1.226 | 1.226 | 1.226 | 1.226 | 1.226 | 1.226 | 100.0% | 1 | | -> duration | 1.226 | 1.226 | 1.226 | 1.226 | 1.226 | 1.226 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +---------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.225866 Full duration: 11.172708 -------------------------------------------------------------------------------- test scenario Gnocchi.get_status args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": { "detailed": false }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task aabee261-9471-4cc5-b7e9-4d29e7e1053a has 0 error(s) -------------------------------------------------------------------------------- +-----------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +--------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +--------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | gnocchi.get_status | 1.057 | 1.057 | 1.057 | 1.057 | 1.057 | 1.057 | 100.0% | 1 | | total | 1.058 | 1.058 | 1.058 | 1.058 | 1.058 | 1.058 | 100.0% | 1 | | -> duration | 1.058 | 1.058 | 1.058 | 1.058 | 1.058 | 1.058 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +--------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.057541 Full duration: 10.60066 -------------------------------------------------------------------------------- test scenario GnocchiArchivePolicyRule.list_archive_policy_rule args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": {}, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task aabee261-9471-4cc5-b7e9-4d29e7e1053a has 0 error(s) -------------------------------------------------------------------------------- +-------------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +----------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +----------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | gnocchi.list_archive_policy_rule | 1.268 | 1.268 | 1.268 | 1.268 | 1.268 | 1.268 | 100.0% | 1 | | total | 1.268 | 1.268 | 1.268 | 1.268 | 1.268 | 1.268 | 100.0% | 1 | | -> duration | 1.268 | 1.268 | 1.268 | 1.268 | 1.268 | 1.268 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 
0.0 | 0.0 | 100.0% | 1 | +----------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.268257 Full duration: 11.301614 -------------------------------------------------------------------------------- test scenario GnocchiArchivePolicyRule.create_delete_archive_policy_rule args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": { "metric_pattern": "cpu_*", "archive_policy_name": "low" }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task aabee261-9471-4cc5-b7e9-4d29e7e1053a has 0 error(s) -------------------------------------------------------------------------------- +---------------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +------------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +------------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | gnocchi.create_archive_policy_rule | 1.662 | 1.662 | 1.662 | 1.662 | 1.662 | 1.662 | 100.0% | 1 | | gnocchi.delete_archive_policy_rule | 0.051 | 0.051 | 0.051 | 0.051 | 0.051 | 0.051 | 100.0% | 1 | | total | 1.714 | 1.714 | 1.714 | 1.714 | 1.714 | 1.714 | 100.0% | 1 | | -> duration | 1.714 | 1.714 | 1.714 | 1.714 | 1.714 | 1.714 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +------------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.713582 Full duration: 12.101355 -------------------------------------------------------------------------------- test scenario GnocchiArchivePolicy.list_archive_policy args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": {}, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task aabee261-9471-4cc5-b7e9-4d29e7e1053a has 0 error(s) -------------------------------------------------------------------------------- +--------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | gnocchi.list_archive_policy | 1.076 | 1.076 | 1.076 | 1.076 | 1.076 | 1.076 | 100.0% | 1 | | total | 1.076 | 1.076 | 1.076 | 1.076 | 1.076 | 1.076 | 100.0% | 1 | | -> duration | 1.076 | 1.076 | 1.076 | 1.076 | 1.076 | 1.076 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 
1.075758 Full duration: 10.671415 -------------------------------------------------------------------------------- test scenario GnocchiArchivePolicy.create_delete_archive_policy args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": { "definition": [ { "timespan": "1:00:00", "granularity": "0:00:01" } ] }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task aabee261-9471-4cc5-b7e9-4d29e7e1053a has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | gnocchi.create_archive_policy | 1.395 | 1.395 | 1.395 | 1.395 | 1.395 | 1.395 | 100.0% | 1 | | gnocchi.delete_archive_policy | 0.059 | 0.059 | 0.059 | 0.059 | 0.059 | 0.059 | 100.0% | 1 | | total | 1.455 | 1.455 | 1.455 | 1.455 | 1.455 | 1.455 | 100.0% | 1 | | -> duration | 1.455 | 1.455 | 1.455 | 1.455 | 1.455 | 1.455 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.45459 Full duration: 11.97836 -------------------------------------------------------------------------------- test scenario GnocchiResourceType.list_resource_type args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": {}, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task aabee261-9471-4cc5-b7e9-4d29e7e1053a has 0 error(s) -------------------------------------------------------------------------------- +-------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | gnocchi.list_resource_type | 1.289 | 1.289 | 1.289 | 1.289 | 1.289 | 1.289 | 100.0% | 1 | | total | 1.289 | 1.289 | 1.289 | 1.289 | 1.289 | 1.289 | 100.0% | 1 | | -> duration | 1.289 | 1.289 | 1.289 | 1.289 | 1.289 | 1.289 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.289198 Full duration: 11.029545 -------------------------------------------------------------------------------- test scenario GnocchiResourceType.create_delete_resource_type args position 0 args values: { 
"runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": { "attributes": { "foo": { "required": false, "type": "string" }, "bar": { "required": true, "type": "number" } } }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task aabee261-9471-4cc5-b7e9-4d29e7e1053a has 0 error(s) -------------------------------------------------------------------------------- +---------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | gnocchi.create_resource_type | 1.959 | 1.959 | 1.959 | 1.959 | 1.959 | 1.959 | 100.0% | 1 | | gnocchi.delete_resource_type | 0.369 | 0.369 | 0.369 | 0.369 | 0.369 | 0.369 | 100.0% | 1 | | total | 2.328 | 2.328 | 2.328 | 2.328 | 2.328 | 2.328 | 100.0% | 1 | | -> duration | 2.328 | 2.328 | 2.328 | 2.328 | 2.328 | 2.328 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 2.328388 Full duration: 13.101425 -------------------------------------------------------------------------------- test scenario GnocchiMetric.list_metric args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": { "limit": 10000 }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task aabee261-9471-4cc5-b7e9-4d29e7e1053a has 0 error(s) -------------------------------------------------------------------------------- +------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +---------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +---------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | gnocchi.list_metric | 1.162 | 1.162 | 1.162 | 1.162 | 1.162 | 1.162 | 100.0% | 1 | | total | 1.162 | 1.162 | 1.162 | 1.162 | 1.162 | 1.162 | 100.0% | 1 | | -> duration | 1.162 | 1.162 | 1.162 | 1.162 | 1.162 | 1.162 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +---------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.161825 Full duration: 11.013643 -------------------------------------------------------------------------------- test scenario GnocchiMetric.create_delete_metric args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": { "archive_policy_name": "low" }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } 
-------------------------------------------------------------------------------- Task aabee261-9471-4cc5-b7e9-4d29e7e1053a has 0 error(s) -------------------------------------------------------------------------------- +--------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | gnocchi.create_metric | 1.184 | 1.184 | 1.184 | 1.184 | 1.184 | 1.184 | 100.0% | 1 | | gnocchi.delete_metric | 0.137 | 0.137 | 0.137 | 0.137 | 0.137 | 0.137 | 100.0% | 1 | | total | 1.321 | 1.321 | 1.321 | 1.321 | 1.321 | 1.321 | 100.0% | 1 | | -> duration | 1.321 | 1.321 | 1.321 | 1.321 | 1.321 | 1.321 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.321445 Full duration: 11.384661 -------------------------------------------------------------------------------- test scenario GnocchiResource.create_delete_resource args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": {}, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task aabee261-9471-4cc5-b7e9-4d29e7e1053a has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | gnocchi.create_resource | 1.327 | 1.327 | 1.327 | 1.327 | 1.327 | 1.327 | 100.0% | 1 | | gnocchi.delete_resource | 0.092 | 0.092 | 0.092 | 0.092 | 0.092 | 0.092 | 100.0% | 1 | | total | 1.419 | 1.419 | 1.419 | 1.419 | 1.419 | 1.419 | 100.0% | 1 | | -> duration | 1.419 | 1.419 | 1.419 | 1.419 | 1.419 | 1.419 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.419388 Full duration: 11.471409 HINTS: * To plot HTML graphics with this data, run: rally task report aabee261-9471-4cc5-b7e9-4d29e7e1053a --out output.html * To generate a JUnit report, run: rally task export aabee261-9471-4cc5-b7e9-4d29e7e1053a --type junit --to output.xml * To get raw JSON output of task results, run: rally task report aabee261-9471-4cc5-b7e9-4d29e7e1053a --json --out output.json 2018-05-28 00:14:11,017 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'report', '--json', '--uuid', 'aabee261-9471-4cc5-b7e9-4d29e7e1053a'] 2018-05-28 00:14:13,945 - 
functest.opnfv_tests.openstack.rally.rally - DEBUG - saving json file 2018-05-28 00:14:13,946 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'report', '--html', '--uuid', 'aabee261-9471-4cc5-b7e9-4d29e7e1053a', '--out', '/home/opnfv/functest/results/rally/opnfv-gnocchi.html'] 2018-05-28 00:14:13,968 - functest.opnfv_tests.openstack.rally.rally - INFO - Test scenario: "gnocchi" OK. 2018-05-28 00:14:13,969 - functest.opnfv_tests.openstack.rally.rally - INFO - Starting test scenario "heat" ... 2018-05-28 00:14:13,969 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Scenario fetched from : /usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/sanity/opnfv-heat.yaml 2018-05-28 00:14:13,969 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Applying blacklist... 2018-05-28 00:14:13,985 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Blacklisted tests: [u'Quotas.nova_update_and_delete'] 2018-05-28 00:14:13,986 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'start', '--abort-on-sla-failure', '--task', '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/task.yaml', '--task-args', "{'smoke': True, 'tmpl_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/templates', 'floating_network': 'floating_net', 'service_list': ['heat'], 'concurrency': 4, 'netid': '187f0eb3-1c97-45c4-afc0-1161d450165b', 'tenants_amount': 3, 'image_name': 'Cirros-0.4.0-3e1f9ebd-fad7-43a1-b71f-33566334d587', 'flavor_name': 'rally-tiny-3e1f9ebd-fad7-43a1-b71f-33566334d587', 'glance_image_location': '/home/opnfv/functest/images/cirros-0.4.0-x86_64-disk.img', 'use_existing_users': False, 'flavor_alt_name': 'rally-mini-3e1f9ebd-fad7-43a1-b71f-33566334d587', 'iterations': 10, 'users_amount': 2, 'sup_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/support', 'glance_image_format': 'qcow2'}"] 2018-05-28 00:16:35,146 - functest.opnfv_tests.openstack.rally.rally - DEBUG - task_id : 5117f39d-da07-48fb-b44a-a1f9b6defc95 2018-05-28 00:16:35,147 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'detailed', '--uuid', '5117f39d-da07-48fb-b44a-a1f9b6defc95'] 2018-05-28 00:16:36,132 - functest.opnfv_tests.openstack.rally.rally - INFO - -------------------------------------------------------------------------------- Task 5117f39d-da07-48fb-b44a-a1f9b6defc95: finished -------------------------------------------------------------------------------- test scenario HeatStacks.create_update_delete_stack args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 }, "roles": [ "heat_stack_owner" ] }, "args": { "template_path": "/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/templates/autoscaling_policy.yaml.template", "updated_template_path": "/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/templates/updated_autoscaling_policy_inplace.yaml.template" }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 5117f39d-da07-48fb-b44a-a1f9b6defc95 has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | 
+-------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | heat.create_stack | 6.574 | 6.574 | 6.574 | 6.574 | 6.574 | 6.574 | 100.0% | 1 | | heat.update_stack | 4.026 | 4.026 | 4.026 | 4.026 | 4.026 | 4.026 | 100.0% | 1 | | heat.delete_stack | 3.157 | 3.157 | 3.157 | 3.157 | 3.157 | 3.157 | 100.0% | 1 | | total | 13.757 | 13.757 | 13.757 | 13.757 | 13.757 | 13.757 | 100.0% | 1 | | -> duration | 9.757 | 9.757 | 9.757 | 9.757 | 9.757 | 9.757 | 100.0% | 1 | | -> idle_duration | 4.0 | 4.0 | 4.0 | 4.0 | 4.0 | 4.0 | 100.0% | 1 | +-------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 9.757117 Full duration: 32.581529 -------------------------------------------------------------------------------- test scenario HeatStacks.create_check_delete_stack args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 }, "roles": [ "heat_stack_owner" ] }, "args": { "template_path": "/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/templates/random_strings.yaml.template" }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 5117f39d-da07-48fb-b44a-a1f9b6defc95 has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | heat.create_stack | 6.739 | 6.739 | 6.739 | 6.739 | 6.739 | 6.739 | 100.0% | 1 | | heat.check_stack | 0.833 | 0.833 | 0.833 | 0.833 | 0.833 | 0.833 | 100.0% | 1 | | heat.delete_stack | 2.809 | 2.809 | 2.809 | 2.809 | 2.809 | 2.809 | 100.0% | 1 | | total | 10.382 | 10.382 | 10.382 | 10.382 | 10.382 | 10.382 | 100.0% | 1 | | -> duration | 8.382 | 8.382 | 8.382 | 8.382 | 8.382 | 8.382 | 100.0% | 1 | | -> idle_duration | 2.0 | 2.0 | 2.0 | 2.0 | 2.0 | 2.0 | 100.0% | 1 | +-------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 8.381814 Full duration: 26.204986 -------------------------------------------------------------------------------- test scenario HeatStacks.create_suspend_resume_delete_stack args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 }, "roles": [ "heat_stack_owner" ] }, "args": { "template_path": "/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/templates/random_strings.yaml.template" }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 5117f39d-da07-48fb-b44a-a1f9b6defc95 has 0 error(s) 
-------------------------------------------------------------------------------- +-----------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +--------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +--------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | heat.create_stack | 6.645 | 6.645 | 6.645 | 6.645 | 6.645 | 6.645 | 100.0% | 1 | | heat.suspend_stack | 0.93 | 0.93 | 0.93 | 0.93 | 0.93 | 0.93 | 100.0% | 1 | | heat.resume_stack | 2.094 | 2.094 | 2.094 | 2.094 | 2.094 | 2.094 | 100.0% | 1 | | heat.delete_stack | 2.786 | 2.786 | 2.786 | 2.786 | 2.786 | 2.786 | 100.0% | 1 | | total | 12.455 | 12.455 | 12.455 | 12.455 | 12.455 | 12.455 | 100.0% | 1 | | -> duration | 10.455 | 10.455 | 10.455 | 10.455 | 10.455 | 10.455 | 100.0% | 1 | | -> idle_duration | 2.0 | 2.0 | 2.0 | 2.0 | 2.0 | 2.0 | 100.0% | 1 | +--------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 10.454542 Full duration: 27.925702 -------------------------------------------------------------------------------- test scenario HeatStacks.list_stacks_and_resources args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 }, "roles": [ "heat_stack_owner" ] }, "args": {}, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 5117f39d-da07-48fb-b44a-a1f9b6defc95 has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | heat.list_stacks | 1.35 | 1.35 | 1.35 | 1.35 | 1.35 | 1.35 | 100.0% | 1 | | total | 1.35 | 1.35 | 1.35 | 1.35 | 1.35 | 1.35 | 100.0% | 1 | | -> duration | 1.35 | 1.35 | 1.35 | 1.35 | 1.35 | 1.35 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.350075 Full duration: 15.699267 HINTS: * To plot HTML graphics with this data, run: rally task report 5117f39d-da07-48fb-b44a-a1f9b6defc95 --out output.html * To generate a JUnit report, run: rally task export 5117f39d-da07-48fb-b44a-a1f9b6defc95 --type junit --to output.xml * To get raw JSON output of task results, run: rally task report 5117f39d-da07-48fb-b44a-a1f9b6defc95 --json --out output.json 2018-05-28 00:16:36,133 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'report', '--json', '--uuid', '5117f39d-da07-48fb-b44a-a1f9b6defc95'] 2018-05-28 00:16:38,704 - functest.opnfv_tests.openstack.rally.rally - DEBUG - saving json file 2018-05-28 00:16:38,704 - 
functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'report', '--html', '--uuid', '5117f39d-da07-48fb-b44a-a1f9b6defc95', '--out', '/home/opnfv/functest/results/rally/opnfv-heat.html'] 2018-05-28 00:16:38,723 - functest.opnfv_tests.openstack.rally.rally - INFO - Test scenario: "heat" OK. 2018-05-28 00:16:38,724 - functest.opnfv_tests.openstack.rally.rally - INFO - Starting test scenario "keystone" ... 2018-05-28 00:16:38,724 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Scenario fetched from : /usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/opnfv-keystone.yaml 2018-05-28 00:16:38,724 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Applying blacklist... 2018-05-28 00:16:38,740 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Blacklisted tests: [u'Quotas.nova_update_and_delete'] 2018-05-28 00:16:38,741 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'start', '--abort-on-sla-failure', '--task', '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/task.yaml', '--task-args', "{'smoke': True, 'tmpl_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/templates', 'floating_network': 'floating_net', 'service_list': ['keystone'], 'concurrency': 4, 'netid': '187f0eb3-1c97-45c4-afc0-1161d450165b', 'tenants_amount': 3, 'image_name': 'Cirros-0.4.0-3e1f9ebd-fad7-43a1-b71f-33566334d587', 'flavor_name': 'rally-tiny-3e1f9ebd-fad7-43a1-b71f-33566334d587', 'glance_image_location': '/home/opnfv/functest/images/cirros-0.4.0-x86_64-disk.img', 'use_existing_users': False, 'flavor_alt_name': 'rally-mini-3e1f9ebd-fad7-43a1-b71f-33566334d587', 'iterations': 10, 'users_amount': 2, 'sup_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/support', 'glance_image_format': 'qcow2'}"] 2018-05-28 00:20:51,911 - functest.opnfv_tests.openstack.rally.rally - DEBUG - task_id : 0ce88c51-bb0e-4aae-adc5-d103a0d06b50 2018-05-28 00:20:51,911 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'detailed', '--uuid', '0ce88c51-bb0e-4aae-adc5-d103a0d06b50'] 2018-05-28 00:20:52,984 - functest.opnfv_tests.openstack.rally.rally - INFO - -------------------------------------------------------------------------------- Task 0ce88c51-bb0e-4aae-adc5-d103a0d06b50: finished -------------------------------------------------------------------------------- test scenario KeystoneBasic.add_and_remove_user_role args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": {}, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 0ce88c51-bb0e-4aae-adc5-d103a0d06b50 has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | keystone_v3.create_role | 0.689 | 0.689 | 0.689 | 0.689 | 0.689 
| 0.689 | 100.0% | 1 | | keystone_v3.add_role | 0.131 | 0.131 | 0.131 | 0.131 | 0.131 | 0.131 | 100.0% | 1 | | keystone_v3.revoke_role | 0.128 | 0.128 | 0.128 | 0.128 | 0.128 | 0.128 | 100.0% | 1 | | total | 0.949 | 0.949 | 0.949 | 0.949 | 0.949 | 0.949 | 100.0% | 1 | | -> duration | 0.949 | 0.949 | 0.949 | 0.949 | 0.949 | 0.949 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.948915 Full duration: 18.996082 -------------------------------------------------------------------------------- test scenario KeystoneBasic.create_add_and_list_user_roles args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": {}, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 0ce88c51-bb0e-4aae-adc5-d103a0d06b50 has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | keystone_v3.create_role | 0.664 | 0.664 | 0.664 | 0.664 | 0.664 | 0.664 | 100.0% | 1 | | keystone_v3.add_role | 0.15 | 0.15 | 0.15 | 0.15 | 0.15 | 0.15 | 100.0% | 1 | | keystone_v3.list_roles | 0.099 | 0.099 | 0.099 | 0.099 | 0.099 | 0.099 | 100.0% | 1 | | total | 0.913 | 0.913 | 0.913 | 0.913 | 0.913 | 0.913 | 100.0% | 1 | | -> duration | 0.913 | 0.913 | 0.913 | 0.913 | 0.913 | 0.913 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.913281 Full duration: 18.578302 -------------------------------------------------------------------------------- test scenario KeystoneBasic.create_and_list_tenants args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": {}, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 0ce88c51-bb0e-4aae-adc5-d103a0d06b50 has 0 error(s) -------------------------------------------------------------------------------- +-------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | keystone_v3.create_project | 0.833 | 0.833 | 0.833 | 0.833 | 0.833 | 0.833 | 100.0% | 1 | | keystone_v3.list_projects | 0.09 | 0.09 | 0.09 | 
0.09 | 0.09 | 0.09 | 100.0% | 1 | | total | 0.924 | 0.924 | 0.924 | 0.924 | 0.924 | 0.924 | 100.0% | 1 | | -> duration | 0.924 | 0.924 | 0.924 | 0.924 | 0.924 | 0.924 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.923905 Full duration: 19.955498 -------------------------------------------------------------------------------- test scenario KeystoneBasic.create_and_delete_role args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": {}, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 0ce88c51-bb0e-4aae-adc5-d103a0d06b50 has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | keystone_v3.create_role | 0.66 | 0.66 | 0.66 | 0.66 | 0.66 | 0.66 | 100.0% | 1 | | keystone_v3.delete_role | 0.142 | 0.142 | 0.142 | 0.142 | 0.142 | 0.142 | 100.0% | 1 | | total | 0.802 | 0.802 | 0.802 | 0.802 | 0.802 | 0.802 | 100.0% | 1 | | -> duration | 0.802 | 0.802 | 0.802 | 0.802 | 0.802 | 0.802 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.802068 Full duration: 16.625168 -------------------------------------------------------------------------------- test scenario KeystoneBasic.create_and_delete_service args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": {}, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 0ce88c51-bb0e-4aae-adc5-d103a0d06b50 has 0 error(s) -------------------------------------------------------------------------------- +-------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | keystone_v3.create_service | 0.653 | 0.653 | 0.653 | 0.653 | 0.653 | 0.653 | 100.0% | 1 | | keystone_v3.delete_service | 0.166 | 0.166 | 0.166 | 0.166 | 0.166 | 0.166 | 100.0% | 1 | | total | 0.82 | 0.82 | 0.82 | 0.82 | 0.82 | 0.82 | 100.0% | 1 | | -> duration | 0.82 | 0.82 | 0.82 | 0.82 | 0.82 | 0.82 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | 
+----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.81999 Full duration: 16.775627 -------------------------------------------------------------------------------- test scenario KeystoneBasic.get_entities args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": {}, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 0ce88c51-bb0e-4aae-adc5-d103a0d06b50 has 0 error(s) -------------------------------------------------------------------------------- +-------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | keystone_v3.create_project | 0.835 | 0.835 | 0.835 | 0.835 | 0.835 | 0.835 | 100.0% | 1 | | keystone_v3.create_user | 0.532 | 0.532 | 0.532 | 0.532 | 0.532 | 0.532 | 100.0% | 1 | | -> keystone_v3.list_roles | 0.088 | 0.088 | 0.088 | 0.088 | 0.088 | 0.088 | 100.0% | 1 | | -> keystone_v3.add_role | 0.205 | 0.205 | 0.205 | 0.205 | 0.205 | 0.205 | 100.0% | 1 | | keystone_v3.create_role | 0.091 | 0.091 | 0.091 | 0.091 | 0.091 | 0.091 | 100.0% | 1 | | keystone_v3.get_project | 0.076 | 0.076 | 0.076 | 0.076 | 0.076 | 0.076 | 100.0% | 1 | | keystone_v3.get_user | 0.068 | 0.068 | 0.068 | 0.068 | 0.068 | 0.068 | 100.0% | 1 | | keystone_v3.get_role | 0.067 | 0.067 | 0.067 | 0.067 | 0.067 | 0.067 | 100.0% | 1 | | keystone_v3.list_services | 0.092 | 0.092 | 0.092 | 0.092 | 0.092 | 0.092 | 100.0% | 1 | | keystone_v3.get_services | 0.082 | 0.082 | 0.082 | 0.082 | 0.082 | 0.082 | 100.0% | 1 | | total | 1.911 | 1.911 | 1.911 | 1.911 | 1.911 | 1.911 | 100.0% | 1 | | -> duration | 1.911 | 1.911 | 1.911 | 1.911 | 1.911 | 1.911 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.910661 Full duration: 25.118059 -------------------------------------------------------------------------------- test scenario KeystoneBasic.create_update_and_delete_tenant args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": {}, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 0ce88c51-bb0e-4aae-adc5-d103a0d06b50 has 0 error(s) -------------------------------------------------------------------------------- +-------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | 
+----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | keystone_v3.create_project | 0.799 | 0.799 | 0.799 | 0.799 | 0.799 | 0.799 | 100.0% | 1 | | keystone_v3.update_project | 0.107 | 0.107 | 0.107 | 0.107 | 0.107 | 0.107 | 100.0% | 1 | | keystone_v3.delete_project | 0.308 | 0.308 | 0.308 | 0.308 | 0.308 | 0.308 | 100.0% | 1 | | total | 1.214 | 1.214 | 1.214 | 1.214 | 1.214 | 1.214 | 100.0% | 1 | | -> duration | 1.214 | 1.214 | 1.214 | 1.214 | 1.214 | 1.214 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.214499 Full duration: 17.052368 -------------------------------------------------------------------------------- test scenario KeystoneBasic.create_user args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": {}, "args": {}, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 0ce88c51-bb0e-4aae-adc5-d103a0d06b50 has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | keystone_v3.create_user | 0.703 | 0.703 | 0.703 | 0.703 | 0.703 | 0.703 | 100.0% | 1 | | total | 0.84 | 0.84 | 0.84 | 0.84 | 0.84 | 0.84 | 100.0% | 1 | | -> duration | 0.84 | 0.84 | 0.84 | 0.84 | 0.84 | 0.84 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.840261 Full duration: 18.749027 -------------------------------------------------------------------------------- test scenario KeystoneBasic.create_tenant args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": {}, "args": {}, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 0ce88c51-bb0e-4aae-adc5-d103a0d06b50 has 0 error(s) -------------------------------------------------------------------------------- +-------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | keystone_v3.create_project | 0.879 | 0.879 | 0.879 | 0.879 | 0.879 | 0.879 | 100.0% | 1 | | total | 0.879 | 0.879 | 0.879 | 0.879 | 0.879 | 0.879 | 100.0% | 1 | | -> duration | 0.879 | 0.879 | 0.879 | 0.879 | 0.879 | 0.879 | 100.0% | 1 | | 
-> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.87889 Full duration: 19.547796 -------------------------------------------------------------------------------- test scenario KeystoneBasic.create_and_list_users args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": {}, "args": {}, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 0ce88c51-bb0e-4aae-adc5-d103a0d06b50 has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | keystone_v3.create_user | 0.772 | 0.772 | 0.772 | 0.772 | 0.772 | 0.772 | 100.0% | 1 | | keystone_v3.list_users | 0.17 | 0.17 | 0.17 | 0.17 | 0.17 | 0.17 | 100.0% | 1 | | total | 1.094 | 1.094 | 1.094 | 1.094 | 1.094 | 1.094 | 100.0% | 1 | | -> duration | 1.094 | 1.094 | 1.094 | 1.094 | 1.094 | 1.094 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.093571 Full duration: 19.157629 -------------------------------------------------------------------------------- test scenario KeystoneBasic.create_tenant_with_users args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": {}, "args": { "users_per_tenant": 10 }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 0ce88c51-bb0e-4aae-adc5-d103a0d06b50 has 0 error(s) -------------------------------------------------------------------------------- +--------------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-----------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-----------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | keystone_v3.create_project | 0.778 | 0.778 | 0.778 | 0.778 | 0.778 | 0.778 | 100.0% | 1 | | keystone_v3.create_users | 5.235 | 5.235 | 5.235 | 5.235 | 5.235 | 5.235 | 100.0% | 1 | | -> keystone_v3.create_user (x10) | 5.235 | 5.235 | 5.235 | 5.235 | 5.235 | 5.235 | 100.0% | 1 | | --> keystone_v3.list_roles (x10) | 0.816 | 0.816 | 0.816 | 0.816 | 0.816 | 0.816 | 100.0% | 1 | | --> keystone_v3.add_role (x10) | 1.891 | 1.891 | 1.891 | 1.891 | 1.891 | 1.891 | 100.0% | 1 | | total | 6.799 | 6.799 | 6.799 | 6.799 | 6.799 | 6.799 | 100.0% | 1 | | -> duration | 6.799 | 6.799 | 6.799 | 6.799 | 6.799 | 6.799 | 100.0% | 1 | | -> 
idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-----------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 6.798865 Full duration: 28.542354 HINTS: * To plot HTML graphics with this data, run: rally task report 0ce88c51-bb0e-4aae-adc5-d103a0d06b50 --out output.html * To generate a JUnit report, run: rally task export 0ce88c51-bb0e-4aae-adc5-d103a0d06b50 --type junit --to output.xml * To get raw JSON output of task results, run: rally task report 0ce88c51-bb0e-4aae-adc5-d103a0d06b50 --json --out output.json 2018-05-28 00:20:52,984 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'report', '--json', '--uuid', '0ce88c51-bb0e-4aae-adc5-d103a0d06b50'] 2018-05-28 00:20:55,941 - functest.opnfv_tests.openstack.rally.rally - DEBUG - saving json file 2018-05-28 00:20:55,942 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'report', '--html', '--uuid', '0ce88c51-bb0e-4aae-adc5-d103a0d06b50', '--out', '/home/opnfv/functest/results/rally/opnfv-keystone.html'] 2018-05-28 00:20:55,985 - functest.opnfv_tests.openstack.rally.rally - INFO - Test scenario: "keystone" OK. 2018-05-28 00:20:55,986 - functest.opnfv_tests.openstack.rally.rally - INFO - Starting test scenario "neutron" ... 2018-05-28 00:20:55,987 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Scenario fetched from : /usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/sanity/opnfv-neutron.yaml 2018-05-28 00:20:55,987 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Applying blacklist... 2018-05-28 00:20:56,008 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Blacklisted tests: [u'Quotas.nova_update_and_delete'] 2018-05-28 00:20:56,009 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'start', '--abort-on-sla-failure', '--task', '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/task.yaml', '--task-args', "{'smoke': True, 'tmpl_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/templates', 'floating_network': 'floating_net', 'service_list': ['neutron'], 'concurrency': 4, 'netid': '187f0eb3-1c97-45c4-afc0-1161d450165b', 'tenants_amount': 3, 'image_name': 'Cirros-0.4.0-3e1f9ebd-fad7-43a1-b71f-33566334d587', 'flavor_name': 'rally-tiny-3e1f9ebd-fad7-43a1-b71f-33566334d587', 'glance_image_location': '/home/opnfv/functest/images/cirros-0.4.0-x86_64-disk.img', 'use_existing_users': False, 'flavor_alt_name': 'rally-mini-3e1f9ebd-fad7-43a1-b71f-33566334d587', 'iterations': 10, 'users_amount': 2, 'sup_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/support', 'glance_image_format': 'qcow2'}"] 2018-05-28 00:29:28,908 - functest.opnfv_tests.openstack.rally.rally - DEBUG - task_id : 3be07085-cf96-467b-a6e0-20b6d5210143 2018-05-28 00:29:28,909 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'detailed', '--uuid', '3be07085-cf96-467b-a6e0-20b6d5210143'] 2018-05-28 00:29:30,070 - functest.opnfv_tests.openstack.rally.rally - INFO - -------------------------------------------------------------------------------- Task 3be07085-cf96-467b-a6e0-20b6d5210143: finished -------------------------------------------------------------------------------- test scenario NeutronNetworks.create_and_delete_networks args position 0 args values: { "runner": { "concurrency": 1, 
"times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 }, "quotas": { "neutron": { "network": -1 } } }, "args": { "network_create_args": {} }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 3be07085-cf96-467b-a6e0-20b6d5210143 has 0 error(s) -------------------------------------------------------------------------------- +---------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | neutron.create_network | 1.932 | 1.932 | 1.932 | 1.932 | 1.932 | 1.932 | 100.0% | 1 | | neutron.delete_network | 2.13 | 2.13 | 2.13 | 2.13 | 2.13 | 2.13 | 100.0% | 1 | | total | 4.063 | 4.063 | 4.063 | 4.063 | 4.063 | 4.063 | 100.0% | 1 | | -> duration | 4.063 | 4.063 | 4.063 | 4.063 | 4.063 | 4.063 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 4.062608 Full duration: 22.416645 -------------------------------------------------------------------------------- test scenario NeutronNetworks.create_and_delete_ports args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "quotas": { "neutron": { "network": -1, "port": -1 } }, "users": { "users_per_tenant": 1, "tenants": 1 }, "network": {} }, "args": { "network_create_args": {}, "ports_per_network": 1, "port_create_args": {} }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 3be07085-cf96-467b-a6e0-20b6d5210143 has 0 error(s) -------------------------------------------------------------------------------- +------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +---------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +---------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | neutron.create_port | 3.018 | 3.018 | 3.018 | 3.018 | 3.018 | 3.018 | 100.0% | 1 | | neutron.delete_port | 1.539 | 1.539 | 1.539 | 1.539 | 1.539 | 1.539 | 100.0% | 1 | | total | 4.557 | 4.557 | 4.557 | 4.557 | 4.557 | 4.557 | 100.0% | 1 | | -> duration | 4.557 | 4.557 | 4.557 | 4.557 | 4.557 | 4.557 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +---------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 4.55722 Full duration: 52.079167 -------------------------------------------------------------------------------- test scenario NeutronNetworks.create_and_delete_routers args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "quotas": { "neutron": { "subnet": -1, 
"router": -1, "network": -1, "port": -1 } }, "users": { "users_per_tenant": 1, "tenants": 1 }, "network": {} }, "args": { "network_create_args": {}, "subnet_cidr_start": "1.1.0.0/30", "subnets_per_network": 1, "router_create_args": {}, "subnet_create_args": {} }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 3be07085-cf96-467b-a6e0-20b6d5210143 has 0 error(s) -------------------------------------------------------------------------------- +------------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +---------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +---------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | neutron.create_network | 1.493 | 1.493 | 1.493 | 1.493 | 1.493 | 1.493 | 100.0% | 1 | | neutron.create_subnet | 1.358 | 1.358 | 1.358 | 1.358 | 1.358 | 1.358 | 100.0% | 1 | | neutron.create_router | 0.255 | 0.255 | 0.255 | 0.255 | 0.255 | 0.255 | 100.0% | 1 | | neutron.add_interface_router | 4.01 | 4.01 | 4.01 | 4.01 | 4.01 | 4.01 | 100.0% | 1 | | neutron.remove_interface_router | 5.538 | 5.538 | 5.538 | 5.538 | 5.538 | 5.538 | 100.0% | 1 | | neutron.delete_router | 0.848 | 0.848 | 0.848 | 0.848 | 0.848 | 0.848 | 100.0% | 1 | | total | 13.502 | 13.502 | 13.502 | 13.502 | 13.502 | 13.502 | 100.0% | 1 | | -> duration | 13.502 | 13.502 | 13.502 | 13.502 | 13.502 | 13.502 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +---------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 13.502249 Full duration: 68.472321 -------------------------------------------------------------------------------- test scenario NeutronNetworks.create_and_delete_subnets args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "quotas": { "neutron": { "subnet": -1, "network": -1 } }, "users": { "users_per_tenant": 1, "tenants": 1 }, "network": {} }, "args": { "network_create_args": {}, "subnet_create_args": {}, "subnets_per_network": 1, "subnet_cidr_start": "1.1.0.0/30" }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 3be07085-cf96-467b-a6e0-20b6d5210143 has 0 error(s) -------------------------------------------------------------------------------- +--------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | neutron.create_subnet | 2.191 | 2.191 | 2.191 | 2.191 | 2.191 | 2.191 | 100.0% | 1 | | neutron.delete_subnet | 1.22 | 1.22 | 1.22 | 1.22 | 1.22 | 1.22 | 100.0% | 1 | | total | 3.411 | 3.411 | 3.411 | 3.411 | 3.411 | 3.411 | 100.0% | 1 | | -> duration | 3.411 | 
3.411 | 3.411 | 3.411 | 3.411 | 3.411 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 3.41142 Full duration: 49.761141 -------------------------------------------------------------------------------- test scenario NeutronNetworks.create_and_list_networks args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 }, "quotas": { "neutron": { "network": -1 } } }, "args": { "network_create_args": {} }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 3be07085-cf96-467b-a6e0-20b6d5210143 has 0 error(s) -------------------------------------------------------------------------------- +---------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | neutron.create_network | 1.932 | 1.932 | 1.932 | 1.932 | 1.932 | 1.932 | 100.0% | 1 | | neutron.list_networks | 0.512 | 0.512 | 0.512 | 0.512 | 0.512 | 0.512 | 100.0% | 1 | | total | 2.444 | 2.444 | 2.444 | 2.444 | 2.444 | 2.444 | 100.0% | 1 | | -> duration | 2.444 | 2.444 | 2.444 | 2.444 | 2.444 | 2.444 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 2.44414 Full duration: 23.462888 -------------------------------------------------------------------------------- test scenario NeutronNetworks.create_and_list_ports args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "quotas": { "neutron": { "network": -1, "port": -1 } }, "users": { "users_per_tenant": 1, "tenants": 1 }, "network": {} }, "args": { "network_create_args": {}, "ports_per_network": 1, "port_create_args": {} }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 3be07085-cf96-467b-a6e0-20b6d5210143 has 0 error(s) -------------------------------------------------------------------------------- +------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +---------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +---------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | neutron.create_port | 2.83 | 2.83 | 2.83 | 2.83 | 2.83 | 2.83 | 100.0% | 1 | | neutron.list_ports | 0.54 | 0.54 | 0.54 | 0.54 | 0.54 | 0.54 | 100.0% | 1 | | total | 3.371 | 3.371 | 3.371 | 3.371 | 3.371 | 3.371 | 100.0% | 1 | | -> duration | 3.371 | 3.371 | 3.371 | 3.371 | 3.371 | 3.371 | 100.0% | 1 | | -> idle_duration | 0.0 | 
0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +---------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 3.370612 Full duration: 51.362558 -------------------------------------------------------------------------------- test scenario NeutronNetworks.create_and_list_routers args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "quotas": { "neutron": { "subnet": -1, "network": -1, "router": -1 } }, "users": { "users_per_tenant": 1, "tenants": 1 }, "network": {} }, "args": { "network_create_args": {}, "subnet_cidr_start": "1.1.0.0/30", "subnets_per_network": 1, "router_create_args": {}, "subnet_create_args": {} }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 3be07085-cf96-467b-a6e0-20b6d5210143 has 0 error(s) -------------------------------------------------------------------------------- +---------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | neutron.create_network | 1.643 | 1.643 | 1.643 | 1.643 | 1.643 | 1.643 | 100.0% | 1 | | neutron.create_subnet | 1.336 | 1.336 | 1.336 | 1.336 | 1.336 | 1.336 | 100.0% | 1 | | neutron.create_router | 0.363 | 0.363 | 0.363 | 0.363 | 0.363 | 0.363 | 100.0% | 1 | | neutron.add_interface_router | 4.121 | 4.121 | 4.121 | 4.121 | 4.121 | 4.121 | 100.0% | 1 | | neutron.list_routers | 0.45 | 0.45 | 0.45 | 0.45 | 0.45 | 0.45 | 100.0% | 1 | | total | 7.914 | 7.914 | 7.914 | 7.914 | 7.914 | 7.914 | 100.0% | 1 | | -> duration | 7.914 | 7.914 | 7.914 | 7.914 | 7.914 | 7.914 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 7.913704 Full duration: 73.400573 -------------------------------------------------------------------------------- test scenario NeutronNetworks.create_and_list_subnets args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "quotas": { "neutron": { "subnet": -1, "network": -1 } }, "users": { "users_per_tenant": 1, "tenants": 1 }, "network": {} }, "args": { "network_create_args": {}, "subnet_create_args": {}, "subnets_per_network": 1, "subnet_cidr_start": "1.1.0.0/30" }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 3be07085-cf96-467b-a6e0-20b6d5210143 has 0 error(s) -------------------------------------------------------------------------------- +---------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | 
+------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | neutron.create_network | 1.51 | 1.51 | 1.51 | 1.51 | 1.51 | 1.51 | 100.0% | 1 | | neutron.create_subnet | 1.39 | 1.39 | 1.39 | 1.39 | 1.39 | 1.39 | 100.0% | 1 | | neutron.list_subnets | 0.283 | 0.283 | 0.283 | 0.283 | 0.283 | 0.283 | 100.0% | 1 | | total | 3.183 | 3.183 | 3.183 | 3.183 | 3.183 | 3.183 | 100.0% | 1 | | -> duration | 3.183 | 3.183 | 3.183 | 3.183 | 3.183 | 3.183 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 3.183391 Full duration: 57.747795 -------------------------------------------------------------------------------- test scenario NeutronSecurityGroup.create_and_delete_security_groups args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 }, "quotas": { "neutron": { "security_group": -1 } } }, "args": {}, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 3be07085-cf96-467b-a6e0-20b6d5210143 has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | neutron.create_security_group | 1.76 | 1.76 | 1.76 | 1.76 | 1.76 | 1.76 | 100.0% | 1 | | neutron.delete_security_group | 0.47 | 0.47 | 0.47 | 0.47 | 0.47 | 0.47 | 100.0% | 1 | | total | 2.23 | 2.23 | 2.23 | 2.23 | 2.23 | 2.23 | 100.0% | 1 | | -> duration | 2.23 | 2.23 | 2.23 | 2.23 | 2.23 | 2.23 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 2.230224 Full duration: 19.816366 -------------------------------------------------------------------------------- test scenario NeutronSecurityGroup.create_and_delete_security_group_rule args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 }, "quotas": { "neutron": { "security_group": -1 } } }, "args": {}, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 3be07085-cf96-467b-a6e0-20b6d5210143 has 0 error(s) -------------------------------------------------------------------------------- +---------------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +------------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | 
+------------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | neutron.create_security_group | 1.975 | 1.975 | 1.975 | 1.975 | 1.975 | 1.975 | 100.0% | 1 | | neutron.create_security_group_rule | 0.567 | 0.567 | 0.567 | 0.567 | 0.567 | 0.567 | 100.0% | 1 | | neutron.delete_security_group_rule | 0.367 | 0.367 | 0.367 | 0.367 | 0.367 | 0.367 | 100.0% | 1 | | neutron.delete_security_group | 0.507 | 0.507 | 0.507 | 0.507 | 0.507 | 0.507 | 100.0% | 1 | | total | 3.416 | 3.416 | 3.416 | 3.416 | 3.416 | 3.416 | 100.0% | 1 | | -> duration | 3.416 | 3.416 | 3.416 | 3.416 | 3.416 | 3.416 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +------------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 3.416305 Full duration: 20.891132 -------------------------------------------------------------------------------- test scenario NeutronNetworks.set_and_clear_router_gateway args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 }, "roles": [ "admin" ], "quotas": { "neutron": { "router": -1, "network": -1 } } }, "args": { "network_create_args": { "router:external": true } }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task 3be07085-cf96-467b-a6e0-20b6d5210143 has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | neutron.create_network | 2.227 | 2.227 | 2.227 | 2.227 | 2.227 | 2.227 | 100.0% | 1 | | neutron.create_router | 0.364 | 0.364 | 0.364 | 0.364 | 0.364 | 0.364 | 100.0% | 1 | | neutron.add_gateway_router | 4.41 | 4.41 | 4.41 | 4.41 | 4.41 | 4.41 | 100.0% | 1 | | neutron.remove_gateway_router | 2.094 | 2.094 | 2.094 | 2.094 | 2.094 | 2.094 | 100.0% | 1 | | total | 9.095 | 9.095 | 9.095 | 9.095 | 9.095 | 9.095 | 100.0% | 1 | | -> duration | 9.095 | 9.095 | 9.095 | 9.095 | 9.095 | 9.095 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 9.094821 Full duration: 37.685832 HINTS: * To plot HTML graphics with this data, run: rally task report 3be07085-cf96-467b-a6e0-20b6d5210143 --out output.html * To generate a JUnit report, run: rally task export 3be07085-cf96-467b-a6e0-20b6d5210143 --type junit --to output.xml * To get raw JSON output of task results, run: rally task report 3be07085-cf96-467b-a6e0-20b6d5210143 --json --out output.json 2018-05-28 00:29:30,071 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'report', '--json', '--uuid', '3be07085-cf96-467b-a6e0-20b6d5210143'] 2018-05-28 00:29:32,764 - functest.opnfv_tests.openstack.rally.rally - DEBUG 
- saving json file 2018-05-28 00:29:32,765 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'report', '--html', '--uuid', '3be07085-cf96-467b-a6e0-20b6d5210143', '--out', '/home/opnfv/functest/results/rally/opnfv-neutron.html'] 2018-05-28 00:29:32,805 - functest.opnfv_tests.openstack.rally.rally - INFO - Test scenario: "neutron" OK. 2018-05-28 00:29:32,806 - functest.opnfv_tests.openstack.rally.rally - INFO - Starting test scenario "nova" ... 2018-05-28 00:29:32,806 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Scenario fetched from : /usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/sanity/opnfv-nova.yaml 2018-05-28 00:29:32,806 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Applying blacklist... 2018-05-28 00:29:32,822 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Blacklisted tests: [u'Quotas.nova_update_and_delete'] 2018-05-28 00:29:32,823 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'start', '--abort-on-sla-failure', '--task', '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/task.yaml', '--task-args', "{'smoke': True, 'tmpl_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/templates', 'floating_network': 'floating_net', 'service_list': ['nova'], 'concurrency': 4, 'netid': '187f0eb3-1c97-45c4-afc0-1161d450165b', 'tenants_amount': 3, 'image_name': 'Cirros-0.4.0-3e1f9ebd-fad7-43a1-b71f-33566334d587', 'flavor_name': 'rally-tiny-3e1f9ebd-fad7-43a1-b71f-33566334d587', 'glance_image_location': '/home/opnfv/functest/images/cirros-0.4.0-x86_64-disk.img', 'use_existing_users': False, 'flavor_alt_name': 'rally-mini-3e1f9ebd-fad7-43a1-b71f-33566334d587', 'iterations': 10, 'users_amount': 2, 'sup_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/support', 'glance_image_format': 'qcow2'}"] 2018-05-28 00:42:14,507 - functest.opnfv_tests.openstack.rally.rally - DEBUG - task_id : b73a82fd-1541-46b5-a509-26eb362ef12f 2018-05-28 00:42:14,508 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'detailed', '--uuid', 'b73a82fd-1541-46b5-a509-26eb362ef12f'] 2018-05-28 00:42:15,579 - functest.opnfv_tests.openstack.rally.rally - INFO - -------------------------------------------------------------------------------- Task b73a82fd-1541-46b5-a509-26eb362ef12f: finished -------------------------------------------------------------------------------- test scenario NovaServers.boot_and_live_migrate_server args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": { "block_migration": false, "flavor": { "name": "rally-tiny-3e1f9ebd-fad7-43a1-b71f-33566334d587" }, "image": { "name": "Cirros-0.4.0-3e1f9ebd-fad7-43a1-b71f-33566334d587" }, "nics": [ { "net-id": "187f0eb3-1c97-45c4-afc0-1161d450165b" } ] }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task b73a82fd-1541-46b5-a509-26eb362ef12f has 0 error(s) -------------------------------------------------------------------------------- +-----------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +--------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | 
Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +--------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | nova.boot_server | 23.366 | 23.366 | 23.366 | 23.366 | 23.366 | 23.366 | 100.0% | 1 | | nova.live_migrate | 21.204 | 21.204 | 21.204 | 21.204 | 21.204 | 21.204 | 100.0% | 1 | | nova.delete_server | 6.051 | 6.051 | 6.051 | 6.051 | 6.051 | 6.051 | 100.0% | 1 | | total | 50.621 | 50.621 | 50.621 | 50.621 | 50.621 | 50.621 | 100.0% | 1 | | -> duration | 49.621 | 49.621 | 49.621 | 49.621 | 49.621 | 49.621 | 100.0% | 1 | | -> idle_duration | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 100.0% | 1 | +--------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 49.621136 Full duration: 65.699303 -------------------------------------------------------------------------------- test scenario NovaServers.boot_server_attach_created_volume_and_live_migrate args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": { "block_migration": false, "flavor": { "name": "rally-tiny-3e1f9ebd-fad7-43a1-b71f-33566334d587" }, "boot_server_kwargs": { "nics": [ { "net-id": "187f0eb3-1c97-45c4-afc0-1161d450165b" } ] }, "image": { "name": "Cirros-0.4.0-3e1f9ebd-fad7-43a1-b71f-33566334d587" }, "size": 10 }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task b73a82fd-1541-46b5-a509-26eb362ef12f has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | nova.boot_server | 19.026 | 19.026 | 19.026 | 19.026 | 19.026 | 19.026 | 100.0% | 1 | | cinder_v2.create_volume | 3.783 | 3.783 | 3.783 | 3.783 | 3.783 | 3.783 | 100.0% | 1 | | nova.attach_volume | 9.659 | 9.659 | 9.659 | 9.659 | 9.659 | 9.659 | 100.0% | 1 | | nova.live_migrate | 24.41 | 24.41 | 24.41 | 24.41 | 24.41 | 24.41 | 100.0% | 1 | | nova.detach_volume | 5.794 | 5.794 | 5.794 | 5.794 | 5.794 | 5.794 | 100.0% | 1 | | cinder_v2.delete_volume | 93.388 | 93.388 | 93.388 | 93.388 | 93.388 | 93.388 | 100.0% | 1 | | nova.delete_server | 3.237 | 3.237 | 3.237 | 3.237 | 3.237 | 3.237 | 100.0% | 1 | | total | 159.299 | 159.299 | 159.299 | 159.299 | 159.299 | 159.299 | 100.0% | 1 | | -> duration | 158.299 | 158.299 | 158.299 | 158.299 | 158.299 | 158.299 | 100.0% | 1 | | -> idle_duration | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 100.0% | 1 | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 158.29883 Full duration: 177.253844 -------------------------------------------------------------------------------- test scenario NovaServers.boot_server_from_volume_and_live_migrate args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, 
"tenants": 1 } }, "args": { "volume_size": 10, "nics": [ { "net-id": "187f0eb3-1c97-45c4-afc0-1161d450165b" } ], "image": { "name": "Cirros-0.4.0-3e1f9ebd-fad7-43a1-b71f-33566334d587" }, "block_migration": false, "force_delete": false, "flavor": { "name": "rally-tiny-3e1f9ebd-fad7-43a1-b71f-33566334d587" } }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task b73a82fd-1541-46b5-a509-26eb362ef12f has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | cinder_v2.create_volume | 7.973 | 7.973 | 7.973 | 7.973 | 7.973 | 7.973 | 100.0% | 1 | | nova.boot_server | 21.718 | 21.718 | 21.718 | 21.718 | 21.718 | 21.718 | 100.0% | 1 | | nova.live_migrate | 23.556 | 23.556 | 23.556 | 23.556 | 23.556 | 23.556 | 100.0% | 1 | | nova.delete_server | 8.357 | 8.357 | 8.357 | 8.357 | 8.357 | 8.357 | 100.0% | 1 | | total | 61.604 | 61.604 | 61.604 | 61.604 | 61.604 | 61.604 | 100.0% | 1 | | -> duration | 60.604 | 60.604 | 60.604 | 60.604 | 60.604 | 60.604 | 100.0% | 1 | | -> idle_duration | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 100.0% | 1 | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 60.603945 Full duration: 80.972941 -------------------------------------------------------------------------------- test scenario NovaKeypair.boot_and_delete_server_with_keypair args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "quotas": { "nova": { "ram": -1, "floating_ips": -1, "security_group_rules": -1, "instances": -1, "key_pairs": -1, "cores": -1, "security_groups": -1 }, "neutron": { "subnet": -1, "network": -1, "port": -1 } }, "users": { "users_per_tenant": 1, "tenants": 1 }, "network": { "networks_per_tenant": 1, "start_cidr": "100.1.0.0/25" } }, "args": { "server_kwargs": { "nics": [ { "net-id": "187f0eb3-1c97-45c4-afc0-1161d450165b" } ] }, "flavor": { "name": "rally-tiny-3e1f9ebd-fad7-43a1-b71f-33566334d587" }, "image": { "name": "Cirros-0.4.0-3e1f9ebd-fad7-43a1-b71f-33566334d587" } }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task b73a82fd-1541-46b5-a509-26eb362ef12f has 0 error(s) -------------------------------------------------------------------------------- +------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +---------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +---------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | nova.create_keypair | 1.446 | 1.446 | 1.446 | 1.446 | 1.446 | 1.446 | 100.0% | 1 | | nova.boot_server | 
19.365 | 19.365 | 19.365 | 19.365 | 19.365 | 19.365 | 100.0% | 1 | | nova.delete_server | 5.915 | 5.915 | 5.915 | 5.915 | 5.915 | 5.915 | 100.0% | 1 | | nova.delete_keypair | 0.066 | 0.066 | 0.066 | 0.066 | 0.066 | 0.066 | 100.0% | 1 | | total | 26.794 | 26.794 | 26.794 | 26.794 | 26.794 | 26.794 | 100.0% | 1 | | -> duration | 25.794 | 25.794 | 25.794 | 25.794 | 25.794 | 25.794 | 100.0% | 1 | | -> idle_duration | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 100.0% | 1 | +---------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 25.794461 Full duration: 68.776481 -------------------------------------------------------------------------------- test scenario NovaServers.boot_server_from_volume_and_delete args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "quotas": { "cinder": { "gigabytes": -1, "volumes": -1, "snapshots": -1 }, "nova": { "ram": -1, "floating_ips": -1, "security_group_rules": -1, "instances": -1, "cores": -1, "security_groups": -1 }, "neutron": { "subnet": -1, "network": -1, "port": -1 } }, "users": { "users_per_tenant": 1, "tenants": 1 }, "network": { "networks_per_tenant": 1, "start_cidr": "100.1.0.0/25" } }, "args": { "volume_size": 5, "flavor": { "name": "rally-tiny-3e1f9ebd-fad7-43a1-b71f-33566334d587" }, "image": { "name": "Cirros-0.4.0-3e1f9ebd-fad7-43a1-b71f-33566334d587" }, "nics": [ { "net-id": "187f0eb3-1c97-45c4-afc0-1161d450165b" } ] }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task b73a82fd-1541-46b5-a509-26eb362ef12f has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | cinder_v2.create_volume | 6.839 | 6.839 | 6.839 | 6.839 | 6.839 | 6.839 | 100.0% | 1 | | nova.boot_server | 21.198 | 21.198 | 21.198 | 21.198 | 21.198 | 21.198 | 100.0% | 1 | | nova.delete_server | 9.077 | 9.077 | 9.077 | 9.077 | 9.077 | 9.077 | 100.0% | 1 | | total | 37.115 | 37.115 | 37.115 | 37.115 | 37.115 | 37.115 | 100.0% | 1 | | -> duration | 36.115 | 36.115 | 36.115 | 36.115 | 36.115 | 36.115 | 100.0% | 1 | | -> idle_duration | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 100.0% | 1 | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 36.11498 Full duration: 86.191801 -------------------------------------------------------------------------------- test scenario NovaServers.pause_and_unpause_server args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "quotas": { "nova": { "ram": -1, "floating_ips": -1, "security_group_rules": -1, "instances": -1, "cores": -1, "security_groups": -1 }, "neutron": { "subnet": -1, "network": -1, "port": -1 } }, "users": { "users_per_tenant": 1, "tenants": 1 }, "network": { "networks_per_tenant": 1, "start_cidr": "100.1.0.0/25" } }, "args": { "force_delete": false, "flavor": { "name": 
"rally-tiny-3e1f9ebd-fad7-43a1-b71f-33566334d587" }, "image": { "name": "Cirros-0.4.0-3e1f9ebd-fad7-43a1-b71f-33566334d587" }, "nics": [ { "net-id": "187f0eb3-1c97-45c4-afc0-1161d450165b" } ] }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task b73a82fd-1541-46b5-a509-26eb362ef12f has 0 error(s) -------------------------------------------------------------------------------- +------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +---------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +---------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | nova.boot_server | 16.395 | 16.395 | 16.395 | 16.395 | 16.395 | 16.395 | 100.0% | 1 | | nova.pause_server | 3.397 | 3.397 | 3.397 | 3.397 | 3.397 | 3.397 | 100.0% | 1 | | nova.unpause_server | 3.402 | 3.402 | 3.402 | 3.402 | 3.402 | 3.402 | 100.0% | 1 | | nova.delete_server | 6.164 | 6.164 | 6.164 | 6.164 | 6.164 | 6.164 | 100.0% | 1 | | total | 29.358 | 29.358 | 29.358 | 29.358 | 29.358 | 29.358 | 100.0% | 1 | | -> duration | 24.358 | 24.358 | 24.358 | 24.358 | 24.358 | 24.358 | 100.0% | 1 | | -> idle_duration | 5.0 | 5.0 | 5.0 | 5.0 | 5.0 | 5.0 | 100.0% | 1 | +---------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 24.358356 Full duration: 71.859622 -------------------------------------------------------------------------------- test scenario NovaServers.boot_and_migrate_server args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": { "nics": [ { "net-id": "187f0eb3-1c97-45c4-afc0-1161d450165b" } ], "flavor": { "name": "rally-tiny-3e1f9ebd-fad7-43a1-b71f-33566334d587" }, "image": { "name": "Cirros-0.4.0-3e1f9ebd-fad7-43a1-b71f-33566334d587" } }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task b73a82fd-1541-46b5-a509-26eb362ef12f has 0 error(s) -------------------------------------------------------------------------------- +------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +---------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +---------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | nova.boot_server | 19.202 | 19.202 | 19.202 | 19.202 | 19.202 | 19.202 | 100.0% | 1 | | nova.migrate | 20.022 | 20.022 | 20.022 | 20.022 | 20.022 | 20.022 | 100.0% | 1 | | nova.resize_confirm | 4.516 | 4.516 | 4.516 | 4.516 | 4.516 | 4.516 | 100.0% | 1 | | nova.delete_server | 5.977 | 5.977 | 5.977 | 5.977 | 5.977 | 5.977 | 100.0% | 1 | | total | 49.718 | 49.718 | 49.718 | 49.718 | 49.718 | 49.718 | 100.0% | 1 | | -> duration | 48.718 | 48.718 | 48.718 | 48.718 | 48.718 | 48.718 | 100.0% | 1 | | -> idle_duration | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 
100.0% | 1 | +---------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 48.718261 Full duration: 62.992067 -------------------------------------------------------------------------------- test scenario NovaServers.boot_server_and_list_interfaces args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 }, "network": {} }, "args": { "flavor": { "name": "rally-tiny-3e1f9ebd-fad7-43a1-b71f-33566334d587" }, "auto_assign_nic": true, "image": { "name": "Cirros-0.4.0-3e1f9ebd-fad7-43a1-b71f-33566334d587" } }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task b73a82fd-1541-46b5-a509-26eb362ef12f has 0 error(s) -------------------------------------------------------------------------------- +-------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | nova.boot_server | 17.787 | 17.787 | 17.787 | 17.787 | 17.787 | 17.787 | 100.0% | 1 | | nova.list_interfaces | 0.402 | 0.402 | 0.402 | 0.402 | 0.402 | 0.402 | 100.0% | 1 | | total | 18.189 | 18.189 | 18.189 | 18.189 | 18.189 | 18.189 | 100.0% | 1 | | -> duration | 17.189 | 17.189 | 17.189 | 17.189 | 17.189 | 17.189 | 100.0% | 1 | | -> idle_duration | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 100.0% | 1 | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 17.188983 Full duration: 66.954108 -------------------------------------------------------------------------------- test scenario NovaServerGroups.create_and_delete_server_group args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": { "policies": [ "affinity" ] }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task b73a82fd-1541-46b5-a509-26eb362ef12f has 0 error(s) -------------------------------------------------------------------------------- +-----------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +--------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +--------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | nova.create_server_group | 1.564 | 1.564 | 1.564 | 1.564 | 1.564 | 1.564 | 100.0% | 1 | | nova.delete_server_group | 0.138 | 0.138 | 0.138 | 0.138 | 0.138 | 0.138 | 100.0% | 1 | | total | 1.702 | 1.702 | 1.702 | 1.702 | 1.702 | 1.702 | 100.0% | 1 | | -> duration | 1.702 | 1.702 | 1.702 | 1.702 | 1.702 | 1.702 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | 
+--------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.702364 Full duration: 13.694079 HINTS: * To plot HTML graphics with this data, run: rally task report b73a82fd-1541-46b5-a509-26eb362ef12f --out output.html * To generate a JUnit report, run: rally task export b73a82fd-1541-46b5-a509-26eb362ef12f --type junit --to output.xml * To get raw JSON output of task results, run: rally task report b73a82fd-1541-46b5-a509-26eb362ef12f --json --out output.json 2018-05-28 00:42:15,580 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'report', '--json', '--uuid', 'b73a82fd-1541-46b5-a509-26eb362ef12f'] 2018-05-28 00:42:18,292 - functest.opnfv_tests.openstack.rally.rally - DEBUG - saving json file 2018-05-28 00:42:18,293 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'report', '--html', '--uuid', 'b73a82fd-1541-46b5-a509-26eb362ef12f', '--out', '/home/opnfv/functest/results/rally/opnfv-nova.html'] 2018-05-28 00:42:18,320 - functest.opnfv_tests.openstack.rally.rally - INFO - Test scenario: "nova" OK. 2018-05-28 00:42:18,321 - functest.opnfv_tests.openstack.rally.rally - INFO - Starting test scenario "quotas" ... 2018-05-28 00:42:18,321 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Scenario fetched from : /usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/opnfv-quotas.yaml 2018-05-28 00:42:18,321 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Applying blacklist... 2018-05-28 00:42:18,336 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Blacklisted tests: [u'Quotas.nova_update_and_delete'] 2018-05-28 00:42:18,337 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'start', '--abort-on-sla-failure', '--task', '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/task.yaml', '--task-args', "{'smoke': True, 'tmpl_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/templates', 'floating_network': 'floating_net', 'service_list': ['quotas'], 'concurrency': 4, 'netid': '187f0eb3-1c97-45c4-afc0-1161d450165b', 'tenants_amount': 3, 'image_name': 'Cirros-0.4.0-3e1f9ebd-fad7-43a1-b71f-33566334d587', 'flavor_name': 'rally-tiny-3e1f9ebd-fad7-43a1-b71f-33566334d587', 'glance_image_location': '/home/opnfv/functest/images/cirros-0.4.0-x86_64-disk.img', 'use_existing_users': False, 'flavor_alt_name': 'rally-mini-3e1f9ebd-fad7-43a1-b71f-33566334d587', 'iterations': 10, 'users_amount': 2, 'sup_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/support', 'glance_image_format': 'qcow2'}"] 2018-05-28 00:43:32,749 - functest.opnfv_tests.openstack.rally.rally - DEBUG - task_id : e1f365c2-0142-493d-aac8-3b7540873272 2018-05-28 00:43:32,750 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'detailed', '--uuid', 'e1f365c2-0142-493d-aac8-3b7540873272'] 2018-05-28 00:43:33,749 - functest.opnfv_tests.openstack.rally.rally - INFO - -------------------------------------------------------------------------------- Task e1f365c2-0142-493d-aac8-3b7540873272: finished -------------------------------------------------------------------------------- test scenario Quotas.cinder_update_and_delete args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": { "max_quota": 1024 
}, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task e1f365c2-0142-493d-aac8-3b7540873272 has 0 error(s) -------------------------------------------------------------------------------- +-------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | quotas.update_quotas | 1.643 | 1.643 | 1.643 | 1.643 | 1.643 | 1.643 | 100.0% | 1 | | quotas.delete_quotas | 0.111 | 0.111 | 0.111 | 0.111 | 0.111 | 0.111 | 100.0% | 1 | | total | 1.755 | 1.755 | 1.755 | 1.755 | 1.755 | 1.755 | 100.0% | 1 | | -> duration | 1.755 | 1.755 | 1.755 | 1.755 | 1.755 | 1.755 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.754646 Full duration: 12.490913 -------------------------------------------------------------------------------- test scenario Quotas.cinder_update args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": { "max_quota": 1024 }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task e1f365c2-0142-493d-aac8-3b7540873272 has 0 error(s) -------------------------------------------------------------------------------- +-------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | quotas.update_quotas | 1.427 | 1.427 | 1.427 | 1.427 | 1.427 | 1.427 | 100.0% | 1 | | total | 1.427 | 1.427 | 1.427 | 1.427 | 1.427 | 1.427 | 100.0% | 1 | | -> duration | 1.427 | 1.427 | 1.427 | 1.427 | 1.427 | 1.427 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.426953 Full duration: 12.138048 -------------------------------------------------------------------------------- test scenario Quotas.neutron_update args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": { "max_quota": 1024 }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task e1f365c2-0142-493d-aac8-3b7540873272 has 0 error(s) -------------------------------------------------------------------------------- 
+-------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | quotas.update_quotas | 0.515 | 0.515 | 0.515 | 0.515 | 0.515 | 0.515 | 100.0% | 1 | | total | 1.321 | 1.321 | 1.321 | 1.321 | 1.321 | 1.321 | 100.0% | 1 | | -> duration | 1.321 | 1.321 | 1.321 | 1.321 | 1.321 | 1.321 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.321426 Full duration: 12.438197 -------------------------------------------------------------------------------- test scenario Quotas.nova_update args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "contexts": { "users": { "users_per_tenant": 1, "tenants": 1 } }, "args": { "max_quota": 1024 }, "sla": { "failure_rate": { "max": 0 } }, "hooks": [] } -------------------------------------------------------------------------------- Task e1f365c2-0142-493d-aac8-3b7540873272 has 0 error(s) -------------------------------------------------------------------------------- +-------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | quotas.update_quotas | 1.673 | 1.673 | 1.673 | 1.673 | 1.673 | 1.673 | 100.0% | 1 | | total | 1.673 | 1.673 | 1.673 | 1.673 | 1.673 | 1.673 | 100.0% | 1 | | -> duration | 1.673 | 1.673 | 1.673 | 1.673 | 1.673 | 1.673 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.673047 Full duration: 12.228285 HINTS: * To plot HTML graphics with this data, run: rally task report e1f365c2-0142-493d-aac8-3b7540873272 --out output.html * To generate a JUnit report, run: rally task export e1f365c2-0142-493d-aac8-3b7540873272 --type junit --to output.xml * To get raw JSON output of task results, run: rally task report e1f365c2-0142-493d-aac8-3b7540873272 --json --out output.json 2018-05-28 00:43:33,749 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'report', '--json', '--uuid', 'e1f365c2-0142-493d-aac8-3b7540873272'] 2018-05-28 00:43:36,644 - functest.opnfv_tests.openstack.rally.rally - DEBUG - saving json file 2018-05-28 00:43:36,645 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'report', '--html', '--uuid', 'e1f365c2-0142-493d-aac8-3b7540873272', '--out', '/home/opnfv/functest/results/rally/opnfv-quotas.html'] 2018-05-28 00:43:36,659 - functest.opnfv_tests.openstack.rally.rally - INFO - Test scenario: "quotas" OK. 
2018-05-28 00:43:36,668 - functest.opnfv_tests.openstack.rally.rally - INFO - Rally Summary Report:
+----------------+------------+----------------+-----------+
| Module         | Duration   | nb. Test Run   | Success   |
+----------------+------------+----------------+-----------+
| authenticate   | 01:05      | 6              | 100.00%   |
| glance         | 01:42      | 4              | 100.00%   |
| cinder         | 05:36      | 10             | 100.00%   |
| gnocchi        | 02:05      | 11             | 100.00%   |
| heat           | 01:42      | 4              | 100.00%   |
| keystone       | 03:39      | 11             | 100.00%   |
| neutron        | 07:57      | 11             | 100.00%   |
| nova           | 11:34      | 9              | 100.00%   |
| quotas         | 00:49      | 4              | 100.00%   |
|                |            |                |           |
| TOTAL:         | 00:36:11   | 70             | 100.00%   |
+----------------+------------+----------------+-----------+
2018-05-28 00:43:36,668 - functest.opnfv_tests.openstack.rally.rally - INFO - Rally 'rally_sanity' success_rate is 100.00% in 9/9 modules
2018-05-28 00:43:52,600 - xtesting.energy.energy - DEBUG - Restoring previous scenario (cloudify_ims/running)
2018-05-28 00:43:52,601 - xtesting.energy.energy - DEBUG - Submitting scenario (cloudify_ims/running)
2018-05-28 00:43:53,124 - xtesting.core.testcase - INFO - The results were successfully pushed to DB http://testresults.opnfv.org/test/api/v1/results
2018-05-28 00:43:53,125 - xtesting.ci.run_tests - INFO - Test result:
+----------------------+------------------+------------------+----------------+
| TEST CASE            | PROJECT          | DURATION         | RESULT         |
+----------------------+------------------+------------------+----------------+
| rally_sanity         | functest         | 42:52            | PASS           |
+----------------------+------------------+------------------+----------------+
2018-05-28 00:43:53,129 - xtesting.ci.run_tests - INFO - Running test case 'patrole'...
2018-05-28 00:43:53,224 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - cloud:
2018-05-28 00:43:53,224 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - domain: Default
2018-05-28 00:43:53,224 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - Creating Rally environment...
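[Editor's note] The rally_sanity verdict above is pushed to the result database named in TEST_DB_URL as soon as the test case finishes. For readers who want to check what was stored, a minimal query sketch against that endpoint follows; the build_tag and case query parameters are assumptions based on the public OPNFV test API and do not appear anywhere in this log.

# Fetch the rally_sanity record pushed by this run (query parameter names assumed, not logged).
curl -s "http://testresults.opnfv.org/test/api/v1/results?build_tag=jenkins-functest-fuel-baremetal-daily-master-231&case=rally_sanity"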
2018-05-28 00:43:56,687 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally deployment destroy --deployment opnfv-rally
2018-05-28 00:43:59,833 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally deployment create --fromenv --name opnfv-rally
+--------------------------------------+----------------------------+-------------+------------------+--------+
| uuid                                 | created_at                 | name        | status           | active |
+--------------------------------------+----------------------------+-------------+------------------+--------+
| cc745f77-5c77-45a9-a621-dcf984496790 | 2018-05-28T00:43:59.136069 | opnfv-rally | deploy->finished |        |
+--------------------------------------+----------------------------+-------------+------------------+--------+
Using deployment: cc745f77-5c77-45a9-a621-dcf984496790
~/.rally/openrc was updated
HINTS:
* To use standard OpenStack clients, set up your env by running: source ~/.rally/openrc
  OpenStack clients are now configured, e.g run: openstack image list
2018-05-28 00:44:04,091 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally deployment check
--------------------------------------------------------------------------------
Platform openstack:
--------------------------------------------------------------------------------
Available services:
+-------------+----------------+-----------+
| Service     | Service Type   | Status    |
+-------------+----------------+-----------+
| __unknown__ | alarming       | Available |
| __unknown__ | compute_legacy | Available |
| __unknown__ | event          | Available |
| __unknown__ | placement      | Available |
| __unknown__ | volumev2       | Available |
| __unknown__ | volumev3       | Available |
| ceilometer  | metering       | Available |
| cinder      | volume         | Available |
| cloud       | cloudformation | Available |
| designate   | dns            | Available |
| glance      | image          | Available |
| gnocchi     | metric         | Available |
| heat        | orchestration  | Available |
| keystone    | identity       | Available |
| neutron     | network        | Available |
| nova        | compute        | Available |
+-------------+----------------+-----------+
2018-05-28 00:44:04,092 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - Create verifier from existing repo...
2018-05-28 00:44:07,202 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally verify delete-verifier --id opnfv-tempest --force
2018-05-28 00:44:11,914 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally verify create-verifier --source /src/tempest --name opnfv-tempest --type tempest --system-wide
Using verifier 'opnfv-tempest' (UUID=1426bb2b-2d75-4c5d-a182-57edc53aa051) as the default verifier for the future CLI operations.
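[Editor's note] The conf_utils steps above amount to a short Rally CLI sequence. A condensed manual reproduction, using only commands and paths that appear in this log (the OpenStack credentials come from the env_file sourced at the start of the run), would be:

# Load the same OpenStack credentials Functest sourced earlier in this run.
. /var/lib/xtesting/conf/env_file
# Recreate the Rally deployment from those environment variables and sanity-check it.
rally deployment destroy --deployment opnfv-rally
rally deployment create --fromenv --name opnfv-rally
rally deployment check
# Rebuild the Tempest verifier from the pre-cloned /src/tempest repository.
rally verify delete-verifier --id opnfv-tempest --force
rally verify create-verifier --source /src/tempest --name opnfv-tempest --type tempest --system-wide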
2018-05-28 00:44:15,284 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Creating network with name: 'tempest-net-c5bc1f1b-d1ca-4671-94f2-e4920f280aa6' 2018-05-28 00:44:16,818 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - network: Munch({u'status': u'ACTIVE', u'subnets': [], u'description': u'', u'provider:physical_network': None, u'tags': [], u'ipv6_address_scope': None, u'updated_at': u'2018-05-28T00:44:16Z', u'is_default': False, u'revision_number': 2, u'port_security_enabled': True, u'provider:network_type': u'vxlan', u'id': u'21f4f825-dc5b-4593-a1e4-b6c95cd81a78', u'provider:segmentation_id': 87, u'router:external': False, u'availability_zone_hints': [], u'availability_zones': [], u'name': u'tempest-net-c5bc1f1b-d1ca-4671-94f2-e4920f280aa6', u'admin_state_up': True, u'tenant_id': u'17e0c72255804297b05647b8b64ec56a', u'created_at': u'2018-05-28T00:44:16Z', u'mtu': 1450, u'ipv4_address_scope': None, u'shared': False, u'project_id': u'17e0c72255804297b05647b8b64ec56a'}) 2018-05-28 00:44:18,239 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - subnet: Munch({u'description': u'', u'tags': [], u'updated_at': u'2018-05-28T00:44:17Z', u'ipv6_ra_mode': None, u'allocation_pools': [{u'start': u'192.168.150.2', u'end': u'192.168.150.254'}], u'host_routes': [], u'revision_number': 0, u'ipv6_address_mode': None, u'cidr': u'192.168.150.0/24', u'id': u'0469cef4-169d-429f-a25f-563363412fd3', u'subnetpool_id': None, u'service_types': [], u'name': u'tempest-subnet-c5bc1f1b-d1ca-4671-94f2-e4920f280aa6', u'enable_dhcp': True, u'network_id': u'21f4f825-dc5b-4593-a1e4-b6c95cd81a78', u'tenant_id': u'17e0c72255804297b05647b8b64ec56a', u'created_at': u'2018-05-28T00:44:17Z', u'dns_nameservers': [u'8.8.8.8'], u'gateway_ip': u'192.168.150.1', u'ip_version': 4, u'project_id': u'17e0c72255804297b05647b8b64ec56a'}) 2018-05-28 00:44:18,239 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - Creating two images for Tempest suite 2018-05-28 00:44:18,240 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Creating image with name: 'Cirros-0.4.0-c5bc1f1b-d1ca-4671-94f2-e4920f280aa6' 2018-05-28 00:44:19,686 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - image: Munch({u'status': u'queued', u'owner_specified.shade.object': u'images/Cirros-0.4.0-c5bc1f1b-d1ca-4671-94f2-e4920f280aa6', u'tags': [], u'container_format': u'bare', u'min_ram': 0, u'updated_at': u'2018-05-28T00:44:19Z', u'owner_specified.shade.sha256': u'a8dd75ecffd4cdd96072d60c2237b448e0c8b2bc94d57f10fdbc8c481d9005b8', u'locations': [], u'min_disk': 0, u'visibility': u'public', u'file': u'/v2/images/9fc25e87-d005-47df-8d90-110403fd6375/file', u'owner': u'17e0c72255804297b05647b8b64ec56a', u'virtual_size': None, u'owner_specified.shade.md5': u'443b7623e27ecf03dc9e01ee93f67afe', u'id': u'9fc25e87-d005-47df-8d90-110403fd6375', u'size': None, u'name': u'Cirros-0.4.0-c5bc1f1b-d1ca-4671-94f2-e4920f280aa6', u'checksum': None, u'self': u'/v2/images/9fc25e87-d005-47df-8d90-110403fd6375', u'disk_format': u'qcow2', u'protected': False, u'created_at': u'2018-05-28T00:44:19Z', u'schema': u'/v2/schemas/image'}) 2018-05-28 00:44:19,686 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Creating image with name: 'Cirros-0.4.0-1-c5bc1f1b-d1ca-4671-94f2-e4920f280aa6' 2018-05-28 00:44:20,601 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - image: Munch({u'status': u'queued', u'owner_specified.shade.object': u'images/Cirros-0.4.0-1-c5bc1f1b-d1ca-4671-94f2-e4920f280aa6', u'tags': [], u'container_format': 
u'bare', u'min_ram': 0, u'updated_at': u'2018-05-28T00:44:20Z', u'owner_specified.shade.sha256': u'a8dd75ecffd4cdd96072d60c2237b448e0c8b2bc94d57f10fdbc8c481d9005b8', u'locations': [], u'min_disk': 0, u'visibility': u'public', u'file': u'/v2/images/05577dc8-9285-4804-b984-0c2525e77019/file', u'owner': u'17e0c72255804297b05647b8b64ec56a', u'virtual_size': None, u'owner_specified.shade.md5': u'443b7623e27ecf03dc9e01ee93f67afe', u'id': u'05577dc8-9285-4804-b984-0c2525e77019', u'size': None, u'name': u'Cirros-0.4.0-1-c5bc1f1b-d1ca-4671-94f2-e4920f280aa6', u'checksum': None, u'self': u'/v2/images/05577dc8-9285-4804-b984-0c2525e77019', u'disk_format': u'qcow2', u'protected': False, u'created_at': u'2018-05-28T00:44:20Z', u'schema': u'/v2/schemas/image'}) 2018-05-28 00:44:20,602 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Creating two flavors for Tempest suite 2018-05-28 00:44:20,881 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - flavor: Munch({'name': u'opnfv_flavor-c5bc1f1b-d1ca-4671-94f2-e4920f280aa6', 'ephemeral': 0, 'ram': 512, 'is_disabled': False, 'properties': Munch({u'OS-FLV-DISABLED:disabled': False, u'OS-FLV-EXT-DATA:ephemeral': 0, u'os-flavor-access:is_public': True}), u'OS-FLV-DISABLED:disabled': False, 'vcpus': 1, 'extra_specs': Munch({}), 'location': Munch({'project': Munch({'domain_id': None, 'id': u'17e0c72255804297b05647b8b64ec56a', 'name': 'admin', 'domain_name': 'Default'}), 'zone': None, 'region_name': 'RegionOne', 'cloud': 'envvars'}), u'os-flavor-access:is_public': True, 'rxtx_factor': 1.0, 'is_public': True, u'OS-FLV-EXT-DATA:ephemeral': 0, 'disk': 1, 'id': u'4bd186ac-df5a-4b48-b793-297108dce020', 'swap': 0}) 2018-05-28 00:44:21,070 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - flavor: Munch({'name': u'opnfv_flavor_1-c5bc1f1b-d1ca-4671-94f2-e4920f280aa6', 'ephemeral': 0, 'ram': 512, 'is_disabled': False, 'properties': Munch({u'OS-FLV-DISABLED:disabled': False, u'OS-FLV-EXT-DATA:ephemeral': 0, u'os-flavor-access:is_public': True}), u'OS-FLV-DISABLED:disabled': False, 'vcpus': 1, 'extra_specs': Munch({}), 'location': Munch({'project': Munch({'domain_id': None, 'id': u'17e0c72255804297b05647b8b64ec56a', 'name': 'admin', 'domain_name': 'Default'}), 'zone': None, 'region_name': 'RegionOne', 'cloud': 'envvars'}), u'os-flavor-access:is_public': True, 'rxtx_factor': 1.0, 'is_public': True, u'OS-FLV-EXT-DATA:ephemeral': 0, 'disk': 1, 'id': u'fa219ded-ca05-4838-ad03-d165ef4ec7dd', 'swap': 0}) 2018-05-28 00:44:25,761 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally verify configure-verifier --reconfigure --id opnfv-tempest 2018-05-28 00:44:25,761 - functest.opnfv_tests.openstack.tempest.conf_utils - DEBUG - Looking for tempest.conf file... 2018-05-28 00:44:25,761 - functest.opnfv_tests.openstack.tempest.conf_utils - DEBUG - Updating selected tempest.conf parameters... 2018-05-28 00:44:25,762 - functest.opnfv_tests.openstack.tempest.conf_utils - DEBUG - Add/Update required params defined in tempest_conf.yaml into tempest.conf file 2018-05-28 00:44:25,765 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - Generating test case list... 
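[Editor's note] The "Generating test case list..." step that closes the entries above, together with the verification start shown in the next entries, reduces to a stestr selection plus a rally verify run. A manual equivalent reusing the exact filter and file paths recorded below is sketched here; the verifier UUID is specific to this run, and the placeholder in the last command stands for the verification UUID that rally prints:

# Select the Patrole image/network RBAC tests, excluding test_networks_multiprovider_rbac.
cd /root/.rally/verification/verifier-1426bb2b-2d75-4c5d-a182-57edc53aa051/repo
stestr list '(?!.*test_networks_multiprovider_rbac)(?=patrole_tempest_plugin.tests.api.(image|network))' \
    > /home/opnfv/functest/results/patrole/tempest-list.txt 2>/dev/null
# Run only that list through the opnfv-tempest verifier, then display the summary table.
rally verify start --load-list /home/opnfv/functest/results/patrole/tempest-list.txt
rally verify show --uuid <verification-uuid-printed-by-rally>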
2018-05-28 00:44:29,027 - functest.opnfv_tests.openstack.tempest.tempest - INFO - (cd /root/.rally/verification/verifier-1426bb2b-2d75-4c5d-a182-57edc53aa051/repo; stestr list '(?!.*test_networks_multiprovider_rbac)(?=patrole_tempest_plugin.tests.api.(image|network))' >/home/opnfv/functest/results/patrole/tempest-list.txt 2>/dev/null) 2018-05-28 00:44:29,028 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Starting Tempest test suite: '['rally', 'verify', 'start', '--load-list', '/home/opnfv/functest/results/patrole/tempest-list.txt']'. 2018-05-28 00:47:23,933 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Verification UUID: b272f6c6-9e28-47a1-b2bc-7de8d43bd215 2018-05-28 00:47:24,173 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Showing result for a verification: '['rally', 'verify', 'show', '--uuid', 'b272f6c6-9e28-47a1-b2bc-7de8d43bd215']'. 2018-05-28 00:47:25,065 - functest.opnfv_tests.openstack.tempest.tempest - INFO - +-----------------------------------------------------------------------------------------+ 2018-05-28 00:47:25,066 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Verification | 2018-05-28 00:47:25,066 - functest.opnfv_tests.openstack.tempest.tempest - INFO - +---------------------+-------------------------------------------------------------------+ 2018-05-28 00:47:25,066 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | UUID | b272f6c6-9e28-47a1-b2bc-7de8d43bd215 | 2018-05-28 00:47:25,066 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Status | finished | 2018-05-28 00:47:25,067 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Started at | 2018-05-28 00:44:31 | 2018-05-28 00:47:25,067 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Finished at | 2018-05-28 00:47:23 | 2018-05-28 00:47:25,067 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Duration | 0:02:52 | 2018-05-28 00:47:25,067 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Run arguments | load_list: (value is too long, use 'detailed' flag to display it) | 2018-05-28 00:47:25,067 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Tags | - | 2018-05-28 00:47:25,067 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Verifier name | opnfv-tempest (UUID: 1426bb2b-2d75-4c5d-a182-57edc53aa051) | 2018-05-28 00:47:25,067 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Verifier type | tempest (platform: openstack) | 2018-05-28 00:47:25,067 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Deployment name | opnfv-rally (UUID: cc745f77-5c77-45a9-a621-dcf984496790) | 2018-05-28 00:47:25,067 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Tests count | 136 | 2018-05-28 00:47:25,068 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Tests duration, sec | 135.357 | 2018-05-28 00:47:25,068 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Success | 136 | 2018-05-28 00:47:25,068 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Skipped | 0 | 2018-05-28 00:47:25,068 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Expected failures | 0 | 2018-05-28 00:47:25,068 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Unexpected success | 0 | 2018-05-28 00:47:25,068 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Failures | 0 | 2018-05-28 00:47:25,068 - functest.opnfv_tests.openstack.tempest.tempest - INFO - +---------------------+-------------------------------------------------------------------+ 2018-05-28 
00:47:25,068 - functest.opnfv_tests.openstack.tempest.tempest - INFO - +---------------------------------------------------------------------------------------------------------------------------------------------------------------------------+ 2018-05-28 00:47:25,069 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Tempest patrole success_rate is 100.0% 2018-05-28 00:47:32,586 - xtesting.core.testcase - INFO - The results were successfully pushed to DB http://testresults.opnfv.org/test/api/v1/results 2018-05-28 00:47:32,587 - xtesting.ci.run_tests - INFO - Test result: +-------------------+------------------+------------------+----------------+ | TEST CASE | PROJECT | DURATION | RESULT | +-------------------+------------------+------------------+----------------+ | patrole | functest | 03:17 | PASS | +-------------------+------------------+------------------+----------------+ 2018-05-28 00:47:32,591 - xtesting.ci.run_tests - INFO - Running test case 'snaps_smoke'... 2018-05-28 00:47:34,388 - functest.opnfv_tests.openstack.snaps.snaps_test_runner - INFO - Using flavor metadata 'None' 2018-05-28 02:04:44,014 - xtesting.core.unit - DEBUG - test_add_rule (snaps.openstack.tests.create_security_group_tests.CreateSecurityGroupTests) ... ok test_create_delete_group (snaps.openstack.tests.create_security_group_tests.CreateSecurityGroupTests) ... ok test_create_group_admin_user_to_new_project (snaps.openstack.tests.create_security_group_tests.CreateSecurityGroupTests) ... ok test_create_group_new_user_to_admin_project (snaps.openstack.tests.create_security_group_tests.CreateSecurityGroupTests) ... ok test_create_group_with_one_complex_rule (snaps.openstack.tests.create_security_group_tests.CreateSecurityGroupTests) ... ok test_create_group_with_one_simple_rule (snaps.openstack.tests.create_security_group_tests.CreateSecurityGroupTests) ... ok test_create_group_with_several_rules (snaps.openstack.tests.create_security_group_tests.CreateSecurityGroupTests) ... ok test_create_group_without_rules (snaps.openstack.tests.create_security_group_tests.CreateSecurityGroupTests) ... ok test_remove_rule_by_id (snaps.openstack.tests.create_security_group_tests.CreateSecurityGroupTests) ... ok test_remove_rule_by_setting (snaps.openstack.tests.create_security_group_tests.CreateSecurityGroupTests) ... ok test_create_delete_image (snaps.openstack.tests.create_image_tests.CreateImageSuccessTests) ... ok test_create_image_clean_file (snaps.openstack.tests.create_image_tests.CreateImageSuccessTests) ... ok test_create_image_clean_url (snaps.openstack.tests.create_image_tests.CreateImageSuccessTests) ... ok test_create_image_clean_url_properties (snaps.openstack.tests.create_image_tests.CreateImageSuccessTests) ... ok test_create_same_image (snaps.openstack.tests.create_image_tests.CreateImageSuccessTests) ... ok test_create_same_image_new_settings (snaps.openstack.tests.create_image_tests.CreateImageSuccessTests) ... ok test_bad_image_file (snaps.openstack.tests.create_image_tests.CreateImageNegativeTests) ... ok test_bad_image_image_type (snaps.openstack.tests.create_image_tests.CreateImageNegativeTests) ... ok test_bad_image_name (snaps.openstack.tests.create_image_tests.CreateImageNegativeTests) ... ok test_bad_image_url (snaps.openstack.tests.create_image_tests.CreateImageNegativeTests) ... ok test_create_three_part_image_from_file_3_creators (snaps.openstack.tests.create_image_tests.CreateMultiPartImageTests) ... 
ok test_create_three_part_image_from_url (snaps.openstack.tests.create_image_tests.CreateMultiPartImageTests) ... ok test_create_three_part_image_from_url_3_creators (snaps.openstack.tests.create_image_tests.CreateMultiPartImageTests) ... ok test_create_delete_keypair (snaps.openstack.tests.create_keypairs_tests.CreateKeypairsTests) ... ok test_create_keypair_from_file (snaps.openstack.tests.create_keypairs_tests.CreateKeypairsTests) ... ok test_create_keypair_large_key (snaps.openstack.tests.create_keypairs_tests.CreateKeypairsTests) ... ok test_create_keypair_only (snaps.openstack.tests.create_keypairs_tests.CreateKeypairsTests) ... ok test_create_keypair_save_both (snaps.openstack.tests.create_keypairs_tests.CreateKeypairsTests) ... ok test_create_keypair_save_pub_only (snaps.openstack.tests.create_keypairs_tests.CreateKeypairsTests) ... ok test_create_keypair_exist_files_delete (snaps.openstack.tests.create_keypairs_tests.CreateKeypairsCleanupTests) ... ok test_create_keypair_exist_files_keep (snaps.openstack.tests.create_keypairs_tests.CreateKeypairsCleanupTests) ... ok test_create_keypair_gen_files_delete_1 (snaps.openstack.tests.create_keypairs_tests.CreateKeypairsCleanupTests) ... ok test_create_keypair_gen_files_delete_2 (snaps.openstack.tests.create_keypairs_tests.CreateKeypairsCleanupTests) ... ok test_create_keypair_gen_files_keep (snaps.openstack.tests.create_keypairs_tests.CreateKeypairsCleanupTests) ... ok test_create_delete_network (snaps.openstack.tests.create_network_tests.CreateNetworkSuccessTests) ... ok test_create_network_router_admin_user_to_new_project (snaps.openstack.tests.create_network_tests.CreateNetworkSuccessTests) ... ok test_create_network_router_new_user_to_admin_project (snaps.openstack.tests.create_network_tests.CreateNetworkSuccessTests) ... ok test_create_network_with_router (snaps.openstack.tests.create_network_tests.CreateNetworkSuccessTests) ... ok test_create_network_without_router (snaps.openstack.tests.create_network_tests.CreateNetworkSuccessTests) ... ok test_create_networks_same_name (snaps.openstack.tests.create_network_tests.CreateNetworkSuccessTests) ... ok test_create_delete_router (snaps.openstack.tests.create_router_tests.CreateRouterSuccessTests) ... ok test_create_router_admin_state_True (snaps.openstack.tests.create_router_tests.CreateRouterSuccessTests) ... ok test_create_router_admin_state_false (snaps.openstack.tests.create_router_tests.CreateRouterSuccessTests) ... ok test_create_router_admin_user_to_new_project (snaps.openstack.tests.create_router_tests.CreateRouterSuccessTests) ... ok test_create_router_external_network (snaps.openstack.tests.create_router_tests.CreateRouterSuccessTests) ... ok test_create_router_new_user_as_admin_project (snaps.openstack.tests.create_router_tests.CreateRouterSuccessTests) ... ok test_create_router_private_network (snaps.openstack.tests.create_router_tests.CreateRouterSuccessTests) ... ok test_create_router_vanilla (snaps.openstack.tests.create_router_tests.CreateRouterSuccessTests) ... ok test_create_router_with_ext_port (snaps.openstack.tests.create_router_tests.CreateRouterSuccessTests) ... ok test_create_router_admin_ports (snaps.openstack.tests.create_router_tests.CreateRouterNegativeTests) ... ok test_create_router_invalid_gateway_name (snaps.openstack.tests.create_router_tests.CreateRouterNegativeTests) ... ok test_create_router_noname (snaps.openstack.tests.create_router_tests.CreateRouterNegativeTests) ... 
ok test_create_delete_qos (snaps.openstack.tests.create_qos_tests.CreateQoSTests) ... ok test_create_qos (snaps.openstack.tests.create_qos_tests.CreateQoSTests) ... ok test_create_same_qos (snaps.openstack.tests.create_qos_tests.CreateQoSTests) ... ok test_create_delete_volume_type (snaps.openstack.tests.create_volume_type_tests.CreateSimpleVolumeTypeSuccessTests) ... ok test_create_same_volume_type (snaps.openstack.tests.create_volume_type_tests.CreateSimpleVolumeTypeSuccessTests) ... ok test_create_volume_type (snaps.openstack.tests.create_volume_type_tests.CreateSimpleVolumeTypeSuccessTests) ... ok test_volume_type_with_encryption (snaps.openstack.tests.create_volume_type_tests.CreateVolumeTypeComplexTests) ... ok test_volume_type_with_qos (snaps.openstack.tests.create_volume_type_tests.CreateVolumeTypeComplexTests) ... ok test_volume_type_with_qos_and_encryption (snaps.openstack.tests.create_volume_type_tests.CreateVolumeTypeComplexTests) ... ok test_create_delete_volume (snaps.openstack.tests.create_volume_tests.CreateSimpleVolumeSuccessTests) ... ok test_create_same_volume (snaps.openstack.tests.create_volume_tests.CreateSimpleVolumeSuccessTests) ... ok test_create_volume_simple (snaps.openstack.tests.create_volume_tests.CreateSimpleVolumeSuccessTests) ... ok test_create_volume_bad_image (snaps.openstack.tests.create_volume_tests.CreateSimpleVolumeFailureTests) ... ok test_create_volume_bad_size (snaps.openstack.tests.create_volume_tests.CreateSimpleVolumeFailureTests) ... ok test_create_volume_bad_type (snaps.openstack.tests.create_volume_tests.CreateSimpleVolumeFailureTests) ... ok test_bad_volume_type (snaps.openstack.tests.create_volume_tests.CreateVolumeWithTypeTests) ... ok test_valid_volume_type (snaps.openstack.tests.create_volume_tests.CreateVolumeWithTypeTests) ... ok test_bad_image_name (snaps.openstack.tests.create_volume_tests.CreateVolumeWithImageTests) ... ok test_valid_volume_image (snaps.openstack.tests.create_volume_tests.CreateVolumeWithImageTests) ... ok test_check_vm_ip_dhcp (snaps.openstack.tests.create_instance_tests.SimpleHealthCheck) ... ok test_ping_via_router (snaps.openstack.tests.create_instance_tests.CreateInstanceTwoNetTests) ... ok test_create_admin_instance (snaps.openstack.tests.create_instance_tests.CreateInstanceSimpleTests) ... ok test_create_delete_instance (snaps.openstack.tests.create_instance_tests.CreateInstanceSimpleTests) ... ok test_set_allowed_address_pairs (snaps.openstack.tests.create_instance_tests.CreateInstancePortManipulationTests) ... ok test_set_allowed_address_pairs_bad_ip (snaps.openstack.tests.create_instance_tests.CreateInstancePortManipulationTests) ... ok test_set_allowed_address_pairs_bad_mac (snaps.openstack.tests.create_instance_tests.CreateInstancePortManipulationTests) ... ok test_set_custom_invalid_ip_one_subnet (snaps.openstack.tests.create_instance_tests.CreateInstancePortManipulationTests) ... ok test_set_custom_invalid_mac (snaps.openstack.tests.create_instance_tests.CreateInstancePortManipulationTests) ... ok test_set_custom_mac_and_ip (snaps.openstack.tests.create_instance_tests.CreateInstancePortManipulationTests) ... ok test_set_custom_valid_ip_one_subnet (snaps.openstack.tests.create_instance_tests.CreateInstancePortManipulationTests) ... ok test_set_custom_valid_mac (snaps.openstack.tests.create_instance_tests.CreateInstancePortManipulationTests) ... ok test_set_one_port_two_ip_one_subnet (snaps.openstack.tests.create_instance_tests.CreateInstancePortManipulationTests) ... 
ok test_set_one_port_two_ip_two_subnets (snaps.openstack.tests.create_instance_tests.CreateInstancePortManipulationTests) ... ok test_add_invalid_security_group (snaps.openstack.tests.create_instance_tests.InstanceSecurityGroupTests) ... ok test_add_same_security_group (snaps.openstack.tests.create_instance_tests.InstanceSecurityGroupTests) ... ok test_add_security_group (snaps.openstack.tests.create_instance_tests.InstanceSecurityGroupTests) ... ok test_remove_security_group (snaps.openstack.tests.create_instance_tests.InstanceSecurityGroupTests) ... ok test_remove_security_group_never_added (snaps.openstack.tests.create_instance_tests.InstanceSecurityGroupTests) ... ok test_deploy_vm_to_each_compute_node (snaps.openstack.tests.create_instance_tests.CreateInstanceOnComputeHost) ... ok test_create_instance_from_three_part_image (snaps.openstack.tests.create_instance_tests.CreateInstanceFromThreePartImage) ... ok test_create_instance_with_one_volume (snaps.openstack.tests.create_instance_tests.CreateInstanceVolumeTests) ... ok test_create_instance_with_two_volumes (snaps.openstack.tests.create_instance_tests.CreateInstanceVolumeTests) ... ok test_create_delete_stack (snaps.openstack.tests.create_stack_tests.CreateStackSuccessTests) ... ok test_create_same_stack (snaps.openstack.tests.create_stack_tests.CreateStackSuccessTests) ... ok test_create_stack_short_timeout (snaps.openstack.tests.create_stack_tests.CreateStackSuccessTests) ... ok test_create_stack_template_dict (snaps.openstack.tests.create_stack_tests.CreateStackSuccessTests) ... ok test_create_stack_template_file (snaps.openstack.tests.create_stack_tests.CreateStackSuccessTests) ... ok test_retrieve_network_creators (snaps.openstack.tests.create_stack_tests.CreateStackSuccessTests) ... ok test_retrieve_vm_inst_creators (snaps.openstack.tests.create_stack_tests.CreateStackSuccessTests) ... ok test_retrieve_volume_creator (snaps.openstack.tests.create_stack_tests.CreateStackVolumeTests) ... ok test_retrieve_volume_type_creator (snaps.openstack.tests.create_stack_tests.CreateStackVolumeTests) ... ok test_retrieve_flavor_creator (snaps.openstack.tests.create_stack_tests.CreateStackFlavorTests) ... ok test_retrieve_keypair_creator (snaps.openstack.tests.create_stack_tests.CreateStackKeypairTests) ... ok test_retrieve_security_group_creator (snaps.openstack.tests.create_stack_tests.CreateStackSecurityGroupTests) ... ok test_bad_stack_file (snaps.openstack.tests.create_stack_tests.CreateStackNegativeTests) ... ok test_missing_dependencies (snaps.openstack.tests.create_stack_tests.CreateStackNegativeTests) ... ok test_single_port_static (snaps.openstack.tests.create_instance_tests.CreateInstanceSingleNetworkTests) ... ok test_ssh_client_fip_after_active (snaps.openstack.tests.create_instance_tests.CreateInstanceSingleNetworkTests) ... ok test_ssh_client_fip_after_init (snaps.openstack.tests.create_instance_tests.CreateInstanceSingleNetworkTests) ... ok test_ssh_client_fip_after_reboot (snaps.openstack.tests.create_instance_tests.CreateInstanceSingleNetworkTests) ... ok test_ssh_client_fip_before_active (snaps.openstack.tests.create_instance_tests.CreateInstanceSingleNetworkTests) ... ok test_ssh_client_fip_reverse_engineer (snaps.openstack.tests.create_instance_tests.CreateInstanceSingleNetworkTests) ... ok test_ssh_client_fip_second_creator (snaps.openstack.tests.create_instance_tests.CreateInstanceSingleNetworkTests) ... ok test_connect_via_ssh_heat_vm (snaps.openstack.tests.create_stack_tests.CreateStackFloatingIpTests) ... 
ok test_connect_via_ssh_heat_vm_derived (snaps.openstack.tests.create_stack_tests.CreateStackFloatingIpTests) ... ok test_apply_simple_playbook (snaps.provisioning.tests.ansible_utils_tests.AnsibleProvisioningTests) ... ok test_apply_template_playbook (snaps.provisioning.tests.ansible_utils_tests.AnsibleProvisioningTests) ... ok ---------------------------------------------------------------------- Ran 119 tests in 4629.555s OK 2018-05-28 02:04:44,138 - xtesting.core.testcase - INFO - The results were successfully pushed to DB http://testresults.opnfv.org/test/api/v1/results 2018-05-28 02:04:44,138 - xtesting.ci.run_tests - INFO - Test result: +---------------------+------------------+------------------+----------------+ | TEST CASE | PROJECT | DURATION | RESULT | +---------------------+------------------+------------------+----------------+ | snaps_smoke | functest | 77:10 | PASS | +---------------------+------------------+------------------+----------------+ 2018-05-28 02:04:44,142 - xtesting.ci.run_tests - INFO - Running test case 'neutron_trunk'... 2018-05-28 02:04:44,241 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - cloud: 2018-05-28 02:04:44,242 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - domain: Default 2018-05-28 02:04:44,242 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - Creating Rally environment... 2018-05-28 02:04:47,820 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally deployment destroy --deployment opnfv-rally 2018-05-28 02:04:50,946 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally deployment create --fromenv --name opnfv-rally +--------------------------------------+----------------------------+-------------+------------------+--------+ | uuid | created_at | name | status | active | +--------------------------------------+----------------------------+-------------+------------------+--------+ | ad85b323-3a96-41be-90cb-0bf2daa65d90 | 2018-05-28T02:04:50.265289 | opnfv-rally | deploy->finished | | +--------------------------------------+----------------------------+-------------+------------------+--------+ Using deployment: ad85b323-3a96-41be-90cb-0bf2daa65d90 ~/.rally/openrc was updated HINTS: * To use standard OpenStack clients, set up your env by running: source ~/.rally/openrc OpenStack clients are now configured, e.g run: openstack image list 2018-05-28 02:04:55,471 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally deployment check -------------------------------------------------------------------------------- Platform openstack: -------------------------------------------------------------------------------- Available services: +-------------+----------------+-----------+ | Service | Service Type | Status | +-------------+----------------+-----------+ | __unknown__ | alarming | Available | | __unknown__ | compute_legacy | Available | | __unknown__ | event | Available | | __unknown__ | placement | Available | | __unknown__ | volumev2 | Available | | __unknown__ | volumev3 | Available | | ceilometer | metering | Available | | cinder | volume | Available | | cloud | cloudformation | Available | | designate | dns | Available | | glance | image | Available | | gnocchi | metric | Available | | heat | orchestration | Available | | keystone | identity | Available | | neutron | network | Available | | nova | compute | Available | +-------------+----------------+-----------+ 2018-05-28 02:04:55,471 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - Create verifier from 
existing repo... 2018-05-28 02:04:58,493 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally verify delete-verifier --id opnfv-tempest --force 2018-05-28 02:05:02,847 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally verify create-verifier --source /src/tempest --name opnfv-tempest --type tempest --system-wide Using verifier 'opnfv-tempest' (UUID=0194b4f7-924d-4ae2-aabb-c473f47d2457) as the default verifier for the future CLI operations. 2018-05-28 02:05:06,251 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Creating network with name: 'tempest-net-3f03303a-88f0-466a-abde-dc07f004b15c' 2018-05-28 02:05:07,456 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - network: Munch({u'status': u'ACTIVE', u'subnets': [], u'description': u'', u'provider:physical_network': None, u'tags': [], u'ipv6_address_scope': None, u'updated_at': u'2018-05-28T02:05:07Z', u'is_default': False, u'revision_number': 2, u'port_security_enabled': True, u'provider:network_type': u'vxlan', u'id': u'96523c91-70c0-42ab-8c75-841d9f6abc8b', u'provider:segmentation_id': 63, u'router:external': False, u'availability_zone_hints': [], u'availability_zones': [], u'name': u'tempest-net-3f03303a-88f0-466a-abde-dc07f004b15c', u'admin_state_up': True, u'tenant_id': u'17e0c72255804297b05647b8b64ec56a', u'created_at': u'2018-05-28T02:05:07Z', u'mtu': 1450, u'ipv4_address_scope': None, u'shared': False, u'project_id': u'17e0c72255804297b05647b8b64ec56a'}) 2018-05-28 02:05:09,000 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - subnet: Munch({u'description': u'', u'tags': [], u'updated_at': u'2018-05-28T02:05:08Z', u'ipv6_ra_mode': None, u'allocation_pools': [{u'start': u'192.168.150.2', u'end': u'192.168.150.254'}], u'host_routes': [], u'revision_number': 0, u'ipv6_address_mode': None, u'cidr': u'192.168.150.0/24', u'id': u'eafb5b3a-43fb-4f80-ba6e-f4cbd88eb361', u'subnetpool_id': None, u'service_types': [], u'name': u'tempest-subnet-3f03303a-88f0-466a-abde-dc07f004b15c', u'enable_dhcp': True, u'network_id': u'96523c91-70c0-42ab-8c75-841d9f6abc8b', u'tenant_id': u'17e0c72255804297b05647b8b64ec56a', u'created_at': u'2018-05-28T02:05:08Z', u'dns_nameservers': [u'8.8.8.8'], u'gateway_ip': u'192.168.150.1', u'ip_version': 4, u'project_id': u'17e0c72255804297b05647b8b64ec56a'}) 2018-05-28 02:05:09,000 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - Creating two images for Tempest suite 2018-05-28 02:05:09,000 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Creating image with name: 'Cirros-0.4.0-3f03303a-88f0-466a-abde-dc07f004b15c' 2018-05-28 02:05:10,383 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - image: Munch({u'status': u'queued', u'owner_specified.shade.object': u'images/Cirros-0.4.0-3f03303a-88f0-466a-abde-dc07f004b15c', u'tags': [], u'container_format': u'bare', u'min_ram': 0, u'updated_at': u'2018-05-28T02:05:10Z', u'owner_specified.shade.sha256': u'a8dd75ecffd4cdd96072d60c2237b448e0c8b2bc94d57f10fdbc8c481d9005b8', u'locations': [], u'min_disk': 0, u'visibility': u'public', u'file': u'/v2/images/cdeb0ee2-d5f0-4ab6-9955-76c70f51da41/file', u'owner': u'17e0c72255804297b05647b8b64ec56a', u'virtual_size': None, u'owner_specified.shade.md5': u'443b7623e27ecf03dc9e01ee93f67afe', u'id': u'cdeb0ee2-d5f0-4ab6-9955-76c70f51da41', u'size': None, u'name': u'Cirros-0.4.0-3f03303a-88f0-466a-abde-dc07f004b15c', u'checksum': None, u'self': u'/v2/images/cdeb0ee2-d5f0-4ab6-9955-76c70f51da41', u'disk_format': u'qcow2', u'protected': False, u'created_at': 
u'2018-05-28T02:05:10Z', u'schema': u'/v2/schemas/image'}) 2018-05-28 02:05:10,384 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Creating image with name: 'Cirros-0.4.0-1-3f03303a-88f0-466a-abde-dc07f004b15c' 2018-05-28 02:05:11,384 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - image: Munch({u'status': u'queued', u'owner_specified.shade.object': u'images/Cirros-0.4.0-1-3f03303a-88f0-466a-abde-dc07f004b15c', u'tags': [], u'container_format': u'bare', u'min_ram': 0, u'updated_at': u'2018-05-28T02:05:11Z', u'owner_specified.shade.sha256': u'a8dd75ecffd4cdd96072d60c2237b448e0c8b2bc94d57f10fdbc8c481d9005b8', u'locations': [], u'min_disk': 0, u'visibility': u'public', u'file': u'/v2/images/a76de289-8408-42f8-9471-677d0f8292a0/file', u'owner': u'17e0c72255804297b05647b8b64ec56a', u'virtual_size': None, u'owner_specified.shade.md5': u'443b7623e27ecf03dc9e01ee93f67afe', u'id': u'a76de289-8408-42f8-9471-677d0f8292a0', u'size': None, u'name': u'Cirros-0.4.0-1-3f03303a-88f0-466a-abde-dc07f004b15c', u'checksum': None, u'self': u'/v2/images/a76de289-8408-42f8-9471-677d0f8292a0', u'disk_format': u'qcow2', u'protected': False, u'created_at': u'2018-05-28T02:05:11Z', u'schema': u'/v2/schemas/image'}) 2018-05-28 02:05:11,384 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Creating two flavors for Tempest suite 2018-05-28 02:05:11,667 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - flavor: Munch({'name': u'opnfv_flavor-3f03303a-88f0-466a-abde-dc07f004b15c', 'ephemeral': 0, 'ram': 512, 'is_disabled': False, 'properties': Munch({u'OS-FLV-DISABLED:disabled': False, u'OS-FLV-EXT-DATA:ephemeral': 0, u'os-flavor-access:is_public': True}), u'OS-FLV-DISABLED:disabled': False, 'vcpus': 1, 'extra_specs': Munch({}), 'location': Munch({'project': Munch({'domain_id': None, 'id': u'17e0c72255804297b05647b8b64ec56a', 'name': 'admin', 'domain_name': 'Default'}), 'zone': None, 'region_name': 'RegionOne', 'cloud': 'envvars'}), u'os-flavor-access:is_public': True, 'rxtx_factor': 1.0, 'is_public': True, u'OS-FLV-EXT-DATA:ephemeral': 0, 'disk': 1, 'id': u'34ab191e-dcd3-4256-8e33-c927b6824c1b', 'swap': 0}) 2018-05-28 02:05:11,820 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - flavor: Munch({'name': u'opnfv_flavor_1-3f03303a-88f0-466a-abde-dc07f004b15c', 'ephemeral': 0, 'ram': 512, 'is_disabled': False, 'properties': Munch({u'OS-FLV-DISABLED:disabled': False, u'OS-FLV-EXT-DATA:ephemeral': 0, u'os-flavor-access:is_public': True}), u'OS-FLV-DISABLED:disabled': False, 'vcpus': 1, 'extra_specs': Munch({}), 'location': Munch({'project': Munch({'domain_id': None, 'id': u'17e0c72255804297b05647b8b64ec56a', 'name': 'admin', 'domain_name': 'Default'}), 'zone': None, 'region_name': 'RegionOne', 'cloud': 'envvars'}), u'os-flavor-access:is_public': True, 'rxtx_factor': 1.0, 'is_public': True, u'OS-FLV-EXT-DATA:ephemeral': 0, 'disk': 1, 'id': u'2255c25f-9e60-4c38-b730-4b586ba473fb', 'swap': 0}) 2018-05-28 02:05:16,645 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally verify configure-verifier --reconfigure --id opnfv-tempest 2018-05-28 02:05:16,645 - functest.opnfv_tests.openstack.tempest.conf_utils - DEBUG - Looking for tempest.conf file... 2018-05-28 02:05:16,646 - functest.opnfv_tests.openstack.tempest.conf_utils - DEBUG - Updating selected tempest.conf parameters... 
2018-05-28 02:05:16,649 - functest.opnfv_tests.openstack.tempest.conf_utils - DEBUG - Add/Update required params defined in tempest_conf.yaml into tempest.conf file 2018-05-28 02:05:16,656 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - Generating test case list... 2018-05-28 02:05:19,433 - functest.opnfv_tests.openstack.tempest.tempest - INFO - (cd /root/.rally/verification/verifier-0194b4f7-924d-4ae2-aabb-c473f47d2457/repo; stestr list 'neutron_tempest_plugin.(api|scenario).test_trunk' >/home/opnfv/functest/results/neutron_trunk/test_list.txt 2>/dev/null) 2018-05-28 02:05:19,434 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - Applying tempest blacklist... 2018-05-28 02:05:19,436 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - Tempest blacklist file does not exist. 2018-05-28 02:05:19,437 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Starting Tempest test suite: '['rally', 'verify', 'start', '--load-list', '/home/opnfv/functest/results/neutron_trunk/test_list.txt']'. 2018-05-28 02:09:24,988 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Verification UUID: 5ee51678-0867-48db-ad79-4cc6884a3b9d 2018-05-28 02:09:25,201 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Showing result for a verification: '['rally', 'verify', 'show', '--uuid', '5ee51678-0867-48db-ad79-4cc6884a3b9d']'. 2018-05-28 02:09:26,106 - functest.opnfv_tests.openstack.tempest.tempest - INFO - +-----------------------------------------------------------------------------------------+ 2018-05-28 02:09:26,106 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Verification | 2018-05-28 02:09:26,106 - functest.opnfv_tests.openstack.tempest.tempest - INFO - +---------------------+-------------------------------------------------------------------+ 2018-05-28 02:09:26,107 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | UUID | 5ee51678-0867-48db-ad79-4cc6884a3b9d | 2018-05-28 02:09:26,107 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Status | finished | 2018-05-28 02:09:26,107 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Started at | 2018-05-28 02:05:22 | 2018-05-28 02:09:26,107 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Finished at | 2018-05-28 02:09:24 | 2018-05-28 02:09:26,107 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Duration | 0:04:02 | 2018-05-28 02:09:26,107 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Run arguments | load_list: (value is too long, use 'detailed' flag to display it) | 2018-05-28 02:09:26,107 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Tags | - | 2018-05-28 02:09:26,107 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Verifier name | opnfv-tempest (UUID: 0194b4f7-924d-4ae2-aabb-c473f47d2457) | 2018-05-28 02:09:26,108 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Verifier type | tempest (platform: openstack) | 2018-05-28 02:09:26,108 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Deployment name | opnfv-rally (UUID: ad85b323-3a96-41be-90cb-0bf2daa65d90) | 2018-05-28 02:09:26,108 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Tests count | 52 | 2018-05-28 02:09:26,108 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Tests duration, sec | 222.298 | 2018-05-28 02:09:26,108 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Success | 43 | 2018-05-28 02:09:26,108 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Skipped | 9 | 2018-05-28 02:09:26,108 - 
functest.opnfv_tests.openstack.tempest.tempest - INFO - | Expected failures   | 0                                                                 |
2018-05-28 02:09:26,108 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Unexpected success  | 0                                                                 |
2018-05-28 02:09:26,108 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Failures            | 0                                                                 |
2018-05-28 02:09:26,109 - functest.opnfv_tests.openstack.tempest.tempest - INFO - +---------------------+-------------------------------------------------------------------+
2018-05-28 02:09:26,109 - functest.opnfv_tests.openstack.tempest.tempest - INFO - +------------------------------------------------------------------------------------------------------------------------------------------------------+
2018-05-28 02:09:26,111 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Tempest neutron_trunk success_rate is 100.0%
2018-05-28 02:09:33,609 - xtesting.core.testcase - INFO - The results were successfully pushed to DB http://testresults.opnfv.org/test/api/v1/results
2018-05-28 02:09:33,610 - xtesting.ci.run_tests - INFO - Test result:
+-----------------------+------------------+------------------+----------------+
| TEST CASE             | PROJECT          | DURATION         | RESULT         |
+-----------------------+------------------+------------------+----------------+
| neutron_trunk         | functest         | 04:27            | PASS           |
+-----------------------+------------------+------------------+----------------+
2018-05-28 02:09:33,615 - xtesting.ci.run_tests - INFO - Xtesting report:
+------------------------------+------------------+---------------+------------------+----------------+
| TEST CASE                    | PROJECT          | TIER          | DURATION         | RESULT         |
+------------------------------+------------------+---------------+------------------+----------------+
| vping_ssh                    | functest         | smoke         | 00:57            | PASS           |
| vping_userdata               | functest         | smoke         | 00:51            | PASS           |
| cinder_test                  | functest         | smoke         | 01:13            | PASS           |
| tempest_smoke_serial         | functest         | smoke         | 24:59            | PASS           |
| rally_sanity                 | functest         | smoke         | 42:52            | PASS           |
| patrole                      | functest         | smoke         | 03:17            | PASS           |
| snaps_smoke                  | functest         | smoke         | 77:10            | PASS           |
| neutron_trunk                | functest         | smoke         | 04:27            | PASS           |
| refstack_defcore             | functest         | smoke         | 00:00            | SKIP           |
| odl                          | functest         | smoke         | 00:00            | SKIP           |
+------------------------------+------------------+---------------+------------------+----------------+
2018-05-28 02:09:33,623 - xtesting.ci.run_tests - INFO - Execution exit value: Result.EX_OK
2018-05-28 02:10:03,866 - xtesting.ci.run_tests - INFO - Deployment description:
+--------------------------------------+----------------------------------------------------------+
| ENV VAR                              | VALUE                                                    |
+--------------------------------------+----------------------------------------------------------+
| BUILD_TAG                            | jenkins-functest-fuel-baremetal-daily-master-231         |
| ENERGY_RECORDER_API_URL              | http://energy.opnfv.fr/resources                         |
| ENERGY_RECORDER_API_PASSWORD         |                                                          |
| CI_LOOP                              | daily                                                    |
| TEST_DB_URL                          | http://testresults.opnfv.org/test/api/v1/results         |
| INSTALLER_TYPE                       | fuel                                                     |
| DEPLOY_SCENARIO                      | os-nosdn-nofeature-ha                                    |
| ENERGY_RECORDER_API_USER             |                                                          |
| NODE_NAME                            | lf-pod2                                                  |
+--------------------------------------+----------------------------------------------------------+
2018-05-28 02:10:03,871 - xtesting.ci.run_tests - INFO - Sourcing env file /var/lib/xtesting/conf/env_file export OS_IDENTITY_API_VERSION=3 export OS_PROJECT_DOMAIN_NAME=Default export OS_USER_DOMAIN_NAME=Default export OS_PROJECT_NAME=admin export OS_TENANT_NAME=admin export OS_USERNAME=admin export OS_PASSWORD=opnfv_secret export OS_REGION_NAME=RegionOne export
OS_INTERFACE=internal export OS_ENDPOINT_TYPE="internal" export OS_CACERT="/etc/ssl/certs/mcp_os_cacert" export VOLUME_DEVICE_NAME=vdc export OS_AUTH_URL=http://10.167.4.35:35357/v3 2018-05-28 02:10:03,871 - xtesting.ci.run_tests - DEBUG - Test args: all 2018-05-28 02:10:03,872 - xtesting.ci.run_tests - INFO - TESTS TO BE EXECUTED: +---------------+---------------+-----------------+---------------------+-------------------+ | TIERS | ORDER | CI LOOP | DESCRIPTION | TESTCASES | +---------------+---------------+-----------------+---------------------+-------------------+ +---------------+---------------+-----------------+---------------------+-------------------+ 2018-05-28 02:10:03,873 - xtesting.ci.run_tests - INFO - Xtesting report: +-----------------------------+------------------------+------------------+------------------+----------------+ | TEST CASE | PROJECT | TIER | DURATION | RESULT | +-----------------------------+------------------------+------------------+------------------+----------------+ | doctor-notification | doctor | features | 00:00 | SKIP | | bgpvpn | sdnvpn | features | 00:00 | SKIP | | functest-odl-sfc | sfc | features | 00:00 | SKIP | | barometercollectd | barometer | features | 00:00 | SKIP | | fds | fastdatastacks | features | 00:00 | SKIP | +-----------------------------+------------------------+------------------+------------------+----------------+ 2018-05-28 02:10:03,877 - xtesting.ci.run_tests - INFO - Execution exit value: Result.EX_OK 2018-05-28 02:11:14,577 - xtesting.ci.run_tests - INFO - Deployment description: +--------------------------------------+----------------------------------------------------------+ | ENV VAR | VALUE | +--------------------------------------+----------------------------------------------------------+ | BUILD_TAG | jenkins-functest-fuel-baremetal-daily-master-231 | | ENERGY_RECORDER_API_URL | http://energy.opnfv.fr/resources | | ENERGY_RECORDER_API_PASSWORD | | | CI_LOOP | daily | | TEST_DB_URL | http://testresults.opnfv.org/test/api/v1/results | | INSTALLER_TYPE | fuel | | DEPLOY_SCENARIO | os-nosdn-nofeature-ha | | ENERGY_RECORDER_API_USER | | | NODE_NAME | lf-pod2 | +--------------------------------------+----------------------------------------------------------+ 2018-05-28 02:11:14,581 - xtesting.ci.run_tests - INFO - Sourcing env file /var/lib/xtesting/conf/env_file export OS_IDENTITY_API_VERSION=3 export OS_PROJECT_DOMAIN_NAME=Default export OS_USER_DOMAIN_NAME=Default export OS_PROJECT_NAME=admin export OS_TENANT_NAME=admin export OS_USERNAME=admin export OS_PASSWORD=opnfv_secret export OS_REGION_NAME=RegionOne export OS_INTERFACE=internal export OS_ENDPOINT_TYPE="internal" export OS_CACERT="/etc/ssl/certs/mcp_os_cacert" export VOLUME_DEVICE_NAME=vdc export OS_AUTH_URL=http://10.167.4.35:35357/v3 2018-05-28 02:11:14,581 - xtesting.ci.run_tests - DEBUG - Test args: all 2018-05-28 02:11:14,582 - xtesting.ci.run_tests - INFO - TESTS TO BE EXECUTED: +---------------+---------------+--------------------------+---------------------------------------+--------------------------------------------+ | TIERS | ORDER | CI LOOP | DESCRIPTION | TESTCASES | +---------------+---------------+--------------------------+---------------------------------------+--------------------------------------------+ | vnf | 4 | (daily)|(weekly) | Collection of VNF test cases. 
| cloudify_ims vyos_vrouter juju_epc | +---------------+---------------+--------------------------+---------------------------------------+--------------------------------------------+ 2018-05-28 02:11:14,584 - xtesting.ci.run_tests - INFO - Running tier 'vnf' 2018-05-28 02:11:14,584 - xtesting.ci.run_tests - INFO - Running test case 'cloudify_ims'... 2018-05-28 02:11:15,104 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Orchestrator configuration {'requirements': {u'flavor': {u'ram_min': 4096, u'name': u'cloudify.medium'}, u'os_image': u'cloudify_manager_4.0'}} 2018-05-28 02:11:15,136 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - VNF configuration: {'inputs': {u'vellum_cluster_size': 1, u'agent_user': u'ubuntu', u'image_id': u'ubuntu_14.04', u'external_network_name': u'', u'dime_cluster_size': 1, u'key_pair_name': u'cloudify_ims_kp', u'bono_cluster_size': 1, u'flavor_id': u'cloudify.small', u'public_domain': u'clearwater.opnfv', u'homer_cluster_size': 1, u'release': u'repo122', u'private_key_path': u'/etc/cloudify/cloudify_ims.pem', u'sprout_cluster_size': 1}, 'requirements': {u'flavor': {u'ram_min': 2048, u'name': u'cloudify.small'}, u'network_quotas': {u'security_group': 20, u'security_group_rule': 100, u'port': 50}, u'compute_quotas': {u'cores': 50, u'instances': 15}}, 'descriptor': {u'file_name': u'/src/vims/openstack-blueprint.yaml', u'version': u'122', u'name': u'clearwater-opnfv'}} 2018-05-28 02:11:15,156 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Images needed for vIMS: {u'cloudify_manager_4.0': u'/home/opnfv/functest/images/cloudify-manager-premium-4.0.1.qcow2', u'ubuntu_14.04': u'/home/opnfv/functest/images/ubuntu-14.04-server-cloudimg-amd64-disk1.img'} 2018-05-28 02:11:15,590 - xtesting.energy.energy - INFO - API recorder available at : http://energy.opnfv.fr/resources/recorders/environment/lf-pod2 2018-05-28 02:11:15,591 - xtesting.energy.energy - DEBUG - Getting current scenario 2018-05-28 02:11:16,033 - xtesting.energy.energy - DEBUG - Starting recording 2018-05-28 02:11:16,033 - xtesting.energy.energy - DEBUG - Submitting scenario (cloudify_ims/running) 2018-05-28 02:11:16,420 - functest.core.vnf - INFO - Prepare VNF: cloudify_ims, description: Created by OPNFV Functest: cloudify_ims 2018-05-28 02:11:19,949 - functest.core.vnf - DEBUG - snaps creds: OSCreds - username=cloudify_ims-dcfb0a64-e70e-4881-a426-4ffbb06e6a56, password=38adaacb-3f7b-4b04-999c-25c2450b1388, auth_url=http://10.167.4.35:35357/v3, project_name=cloudify_ims-dcfb0a64-e70e-4881-a426-4ffbb06e6a56, identity_api_version=3.0, image_api_version=2.0, network_api_version=2.0, compute_api_version=2.0, heat_api_version=1, user_domain_id=default, user_domain_name=Default, project_domain_id=default, project_domain_name=Default, interface=internal, region_name=RegionOne, proxy_settings=None, cacert=/etc/ssl/certs/mcp_os_cacert 2018-05-28 02:11:19,949 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Additional pre-configuration steps 2018-05-28 02:11:21,965 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Get or create flavor for cloudify manager vm ... 2018-05-28 02:11:23,576 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Creating a second user to bypass issues ... 
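[Editor's note] The orchestrator requirements captured above call for a cloudify.medium flavor with at least 4096 MB of RAM plus the cloudify_manager_4.0 and ubuntu_14.04 images. As a rough illustration of what "get or create" means here, an equivalent manual preparation with the standard OpenStack CLI is sketched below; the vCPU and disk sizes are illustrative placeholders, not values taken from this log:

# Flavor for the Cloudify manager VM; only ram_min=4096 comes from the log, vcpus/disk are placeholders.
openstack flavor create cloudify.medium --ram 4096 --vcpus 2 --disk 40
# Upload the two images referenced in the vIMS configuration.
openstack image create cloudify_manager_4.0 --disk-format qcow2 --container-format bare \
    --file /home/opnfv/functest/images/cloudify-manager-premium-4.0.1.qcow2
openstack image create ubuntu_14.04 --disk-format qcow2 --container-format bare \
    --file /home/opnfv/functest/images/ubuntu-14.04-server-cloudimg-amd64-disk1.img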
2018-05-28 02:11:25,792 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - snaps creds: OSCreds - username=cloudify_network_bug-dcfb0a64-e70e-4881-a426-4ffbb06e6a56, password=7a5f8421-512f-4f6c-9349-4babbe2aae5f, auth_url=http://10.167.4.35:35357/v3, project_name=cloudify_ims-dcfb0a64-e70e-4881-a426-4ffbb06e6a56, identity_api_version=3.0, image_api_version=2.0, network_api_version=2.0, compute_api_version=2.0, heat_api_version=1, user_domain_id=default, user_domain_name=Default, project_domain_id=default, project_domain_name=Default, interface=internal, region_name=RegionOne, proxy_settings=None, cacert=/etc/ssl/certs/mcp_os_cacert 2018-05-28 02:11:25,792 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Creating keypair ... 2018-05-28 02:11:27,345 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Upload some OS images if it doesn't exist 2018-05-28 02:11:27,345 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - image: cloudify_manager_4.0, file: /home/opnfv/functest/images/cloudify-manager-premium-4.0.1.qcow2 2018-05-28 02:13:32,389 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - image: ubuntu_14.04, file: /home/opnfv/functest/images/ubuntu-14.04-server-cloudimg-amd64-disk1.img 2018-05-28 02:13:41,740 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Creating full network ... 2018-05-28 02:14:01,368 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Creating security group for cloudify manager vm 2018-05-28 02:14:06,473 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Creating cloudify manager VM 2018-05-28 02:15:59,679 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Set creds for cloudify manager {'keystone_password': '7a5f8421-512f-4f6c-9349-4babbe2aae5f', 'keystone_tenant_name': 'cloudify_ims-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'region': 'RegionOne', 'keystone_url': u'https://172.30.10.101:5000/v3', 'user_domain_name': 'Default', 'keystone_username': 'cloudify_network_bug-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'project_domain_name': 'Default'} 2018-05-28 02:15:59,680 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Attemps running status of the Manager 2018-05-28 02:16:11,816 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - status {u'status': u'running', u'services': [{u'instances': [{u'LoadState': u'loaded', u'Description': u'InfluxDB Service', u'MainPID': 810, u'state': u'running', u'Id': u'cloudify-influxdb.service', u'ActiveState': u'active', u'SubState': u'running'}], u'display_name': u'InfluxDB'}, {u'instances': [{u'LoadState': u'loaded', u'Description': u'Cloudify Management Worker Service', u'MainPID': 0, u'state': u'dead', u'Id': u'cloudify-mgmtworker.service', u'ActiveState': u'inactive', u'SubState': u'dead'}], u'display_name': u'Celery Management'}, {u'instances': [{u'LoadState': u'loaded', u'Description': u'LSB: Starts Logstash as a daemon.', u'MainPID': 0, u'state': u'running', u'Id': u'logstash.service', u'ActiveState': u'active', u'SubState': u'running'}], u'display_name': u'Logstash'}, {u'instances': [{u'LoadState': u'loaded', u'Description': u'RabbitMQ Service', u'MainPID': 2084, u'state': u'start-post', u'Id': u'cloudify-rabbitmq.service', u'ActiveState': u'activating', u'SubState': u'start-post'}], u'display_name': u'RabbitMQ'}, {u'instances': [{u'LoadState': u'loaded', u'Description': u'Cloudify AMQP InfluxDB Broker Service', u'MainPID': 0, u'state': u'dead', u'Id': u'cloudify-amqpinflux.service', u'ActiveState': u'inactive', u'SubState': u'dead'}], u'display_name': u'AMQP InfluxDB'}, {u'instances': [{u'LoadState': 
u'loaded', u'Description': u'PostgreSQL 9.5 database server', u'MainPID': 869, u'state': u'running', u'Id': u'postgresql-9.5.service', u'ActiveState': u'active', u'SubState': u'running'}], u'display_name': u'PostgreSQL'}, {u'instances': [{u'LoadState': u'loaded', u'Description': u'Cloudify REST Service', u'MainPID': 805, u'state': u'running', u'Id': u'cloudify-restservice.service', u'ActiveState': u'active', u'SubState': u'running'}], u'display_name': u'Manager Rest-Service'}, {u'instances': [{u'LoadState': u'loaded', u'Description': u'Cloudify Stage Service', u'MainPID': 817, u'state': u'running', u'Id': u'cloudify-stage.service', u'ActiveState': u'active', u'SubState': u'running'}], u'display_name': u'Cloudify Stage'}, {u'instances': [{u'LoadState': u'loaded', u'Description': u'Riemann Service', u'MainPID': 0, u'state': u'dead', u'Id': u'cloudify-riemann.service', u'ActiveState': u'inactive', u'SubState': u'dead'}], u'display_name': u'Riemann'}, {u'instances': [{u'LoadState': u'loaded', u'Description': u'nginx - high performance web server', u'MainPID': 891, u'state': u'running', u'Id': u'nginx.service', u'ActiveState': u'active', u'SubState': u'running'}], u'display_name': u'Webserver'}]} 2018-05-28 02:16:12,246 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - The current manager status is running 2018-05-28 02:16:12,246 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Put OpenStack creds in manager 2018-05-28 02:16:17,368 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Put private keypair in manager 2018-05-28 02:16:17,777 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SSH sudo cp ~/cloudify_ims.pem /etc/cloudify/ stdout: 2018-05-28 02:16:17,917 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SSH sudo chmod 444 /etc/cloudify/cloudify_ims.pem stdout: 2018-05-28 02:17:32,524 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SSH sudo yum install -y gcc python-devel stdout: Loaded plugins: fastestmirror Determining fastest mirrors * base: mirror.hostduplex.com * extras: sjc.edge.kernel.org * updates: repos.forethought.net Resolving Dependencies --> Running transaction check ---> Package gcc.x86_64 0:4.8.5-28.el7_5.1 will be installed --> Processing Dependency: libgomp = 4.8.5-28.el7_5.1 for package: gcc-4.8.5-28.el7_5.1.x86_64 --> Processing Dependency: cpp = 4.8.5-28.el7_5.1 for package: gcc-4.8.5-28.el7_5.1.x86_64 --> Processing Dependency: libgcc >= 4.8.5-28.el7_5.1 for package: gcc-4.8.5-28.el7_5.1.x86_64 --> Processing Dependency: glibc-devel >= 2.2.90-12 for package: gcc-4.8.5-28.el7_5.1.x86_64 --> Processing Dependency: libmpfr.so.4()(64bit) for package: gcc-4.8.5-28.el7_5.1.x86_64 --> Processing Dependency: libmpc.so.3()(64bit) for package: gcc-4.8.5-28.el7_5.1.x86_64 ---> Package python-devel.x86_64 0:2.7.5-68.el7 will be installed --> Processing Dependency: python(x86-64) = 2.7.5-68.el7 for package: python-devel-2.7.5-68.el7.x86_64 --> Running transaction check ---> Package cpp.x86_64 0:4.8.5-28.el7_5.1 will be installed ---> Package glibc-devel.x86_64 0:2.17-222.el7 will be installed --> Processing Dependency: glibc-headers = 2.17-222.el7 for package: glibc-devel-2.17-222.el7.x86_64 --> Processing Dependency: glibc = 2.17-222.el7 for package: glibc-devel-2.17-222.el7.x86_64 --> Processing Dependency: glibc-headers for package: glibc-devel-2.17-222.el7.x86_64 ---> Package libgcc.x86_64 0:4.8.5-11.el7 will be updated ---> Package libgcc.x86_64 0:4.8.5-28.el7_5.1 will be an update ---> Package libgomp.x86_64 0:4.8.5-11.el7 will be updated ---> Package 
libgomp.x86_64 0:4.8.5-28.el7_5.1 will be an update ---> Package libmpc.x86_64 0:1.0.1-3.el7 will be installed ---> Package mpfr.x86_64 0:3.1.1-4.el7 will be installed ---> Package python.x86_64 0:2.7.5-48.el7 will be updated ---> Package python.x86_64 0:2.7.5-68.el7 will be an update --> Processing Dependency: python-libs(x86-64) = 2.7.5-68.el7 for package: python-2.7.5-68.el7.x86_64 --> Running transaction check ---> Package glibc.x86_64 0:2.17-157.el7_3.1 will be updated --> Processing Dependency: glibc = 2.17-157.el7_3.1 for package: glibc-common-2.17-157.el7_3.1.x86_64 ---> Package glibc.x86_64 0:2.17-222.el7 will be an update ---> Package glibc-headers.x86_64 0:2.17-222.el7 will be installed --> Processing Dependency: kernel-headers >= 2.2.1 for package: glibc-headers-2.17-222.el7.x86_64 --> Processing Dependency: kernel-headers for package: glibc-headers-2.17-222.el7.x86_64 ---> Package python-libs.x86_64 0:2.7.5-48.el7 will be updated ---> Package python-libs.x86_64 0:2.7.5-68.el7 will be an update --> Processing Dependency: libcrypto.so.10(OPENSSL_1.0.2)(64bit) for package: python-libs-2.7.5-68.el7.x86_64 --> Running transaction check ---> Package glibc-common.x86_64 0:2.17-157.el7_3.1 will be updated ---> Package glibc-common.x86_64 0:2.17-222.el7 will be an update ---> Package kernel-headers.x86_64 0:3.10.0-862.3.2.el7 will be installed ---> Package openssl-libs.x86_64 1:1.0.1e-60.el7_3.1 will be updated --> Processing Dependency: openssl-libs(x86-64) = 1:1.0.1e-60.el7_3.1 for package: 1:openssl-1.0.1e-60.el7_3.1.x86_64 ---> Package openssl-libs.x86_64 1:1.0.2k-12.el7 will be an update --> Running transaction check ---> Package openssl.x86_64 1:1.0.1e-60.el7_3.1 will be updated ---> Package openssl.x86_64 1:1.0.2k-12.el7 will be an update --> Finished Dependency Resolution Dependencies Resolved ================================================================================ Package Arch Version Repository Size ================================================================================ Installing: gcc x86_64 4.8.5-28.el7_5.1 updates 16 M python-devel x86_64 2.7.5-68.el7 base 397 k Installing for dependencies: cpp x86_64 4.8.5-28.el7_5.1 updates 5.9 M glibc-devel x86_64 2.17-222.el7 base 1.1 M glibc-headers x86_64 2.17-222.el7 base 678 k kernel-headers x86_64 3.10.0-862.3.2.el7 updates 7.1 M libmpc x86_64 1.0.1-3.el7 base 51 k mpfr x86_64 3.1.1-4.el7 base 203 k Updating for dependencies: glibc x86_64 2.17-222.el7 base 3.6 M glibc-common x86_64 2.17-222.el7 base 11 M libgcc x86_64 4.8.5-28.el7_5.1 updates 101 k libgomp x86_64 4.8.5-28.el7_5.1 updates 156 k openssl x86_64 1:1.0.2k-12.el7 base 492 k openssl-libs x86_64 1:1.0.2k-12.el7 base 1.2 M python x86_64 2.7.5-68.el7 base 93 k python-libs x86_64 2.7.5-68.el7 base 5.6 M Transaction Summary ================================================================================ Install 2 Packages (+6 Dependent packages) Upgrade ( 8 Dependent packages) Total download size: 54 M Downloading packages: Delta RPMs disabled because /usr/bin/applydeltarpm not installed. 
-------------------------------------------------------------------------------- Total 3.4 MB/s | 54 MB 00:15 Running transaction check Running transaction test Transaction test succeeded Running transaction Updating : libgcc-4.8.5-28.el7_5.1.x86_64 1/24 Updating : glibc-common-2.17-222.el7.x86_64 2/24 Updating : glibc-2.17-222.el7.x86_64 3/24 warning: /etc/nsswitch.conf created as /etc/nsswitch.conf.rpmnew Installing : mpfr-3.1.1-4.el7.x86_64 4/24 Installing : libmpc-1.0.1-3.el7.x86_64 5/24 Updating : 1:openssl-libs-1.0.2k-12.el7.x86_64 6/24 Updating : python-libs-2.7.5-68.el7.x86_64 7/24 Updating : python-2.7.5-68.el7.x86_64 8/24 Installing : cpp-4.8.5-28.el7_5.1.x86_64 9/24 Updating : libgomp-4.8.5-28.el7_5.1.x86_64 10/24 Installing : kernel-headers-3.10.0-862.3.2.el7.x86_64 11/24 Installing : glibc-headers-2.17-222.el7.x86_64 12/24 Installing : glibc-devel-2.17-222.el7.x86_64 13/24 Installing : gcc-4.8.5-28.el7_5.1.x86_64 14/24 Installing : python-devel-2.7.5-68.el7.x86_64 15/24 Updating : 1:openssl-1.0.2k-12.el7.x86_64 16/24 Cleanup : 1:openssl-1.0.1e-60.el7_3.1.x86_64 17/24 Cleanup : python-2.7.5-48.el7.x86_64 18/24 Cleanup : python-libs-2.7.5-48.el7.x86_64 19/24 Cleanup : 1:openssl-libs-1.0.1e-60.el7_3.1.x86_64 20/24 Cleanup : libgomp-4.8.5-11.el7.x86_64 21/24 Cleanup : glibc-common-2.17-157.el7_3.1.x86_64 22/24 Cleanup : glibc-2.17-157.el7_3.1.x86_64 23/24 Cleanup : libgcc-4.8.5-11.el7.x86_64 24/24 Verifying : python-libs-2.7.5-68.el7.x86_64 1/24 Verifying : glibc-devel-2.17-222.el7.x86_64 2/24 Verifying : glibc-headers-2.17-222.el7.x86_64 3/24 Verifying : 1:openssl-libs-1.0.2k-12.el7.x86_64 4/24 Verifying : libgomp-4.8.5-28.el7_5.1.x86_64 5/24 Verifying : gcc-4.8.5-28.el7_5.1.x86_64 6/24 Verifying : glibc-2.17-222.el7.x86_64 7/24 Verifying : libgcc-4.8.5-28.el7_5.1.x86_64 8/24 Verifying : cpp-4.8.5-28.el7_5.1.x86_64 9/24 Verifying : python-devel-2.7.5-68.el7.x86_64 10/24 Verifying : libmpc-1.0.1-3.el7.x86_64 11/24 Verifying : glibc-common-2.17-222.el7.x86_64 12/24 Verifying : python-2.7.5-68.el7.x86_64 13/24 Verifying : mpfr-3.1.1-4.el7.x86_64 14/24 Verifying : 1:openssl-1.0.2k-12.el7.x86_64 15/24 Verifying : kernel-headers-3.10.0-862.3.2.el7.x86_64 16/24 Verifying : 1:openssl-1.0.1e-60.el7_3.1.x86_64 17/24 Verifying : 1:openssl-libs-1.0.1e-60.el7_3.1.x86_64 18/24 Verifying : glibc-common-2.17-157.el7_3.1.x86_64 19/24 Verifying : glibc-2.17-157.el7_3.1.x86_64 20/24 Verifying : python-libs-2.7.5-48.el7.x86_64 21/24 Verifying : libgcc-4.8.5-11.el7.x86_64 22/24 Verifying : python-2.7.5-48.el7.x86_64 23/24 Verifying : libgomp-4.8.5-11.el7.x86_64 24/24 Installed: gcc.x86_64 0:4.8.5-28.el7_5.1 python-devel.x86_64 0:2.7.5-68.el7 Dependency Installed: cpp.x86_64 0:4.8.5-28.el7_5.1 glibc-devel.x86_64 0:2.17-222.el7 glibc-headers.x86_64 0:2.17-222.el7 kernel-headers.x86_64 0:3.10.0-862.3.2.el7 libmpc.x86_64 0:1.0.1-3.el7 mpfr.x86_64 0:3.1.1-4.el7 Dependency Updated: glibc.x86_64 0:2.17-222.el7 glibc-common.x86_64 0:2.17-222.el7 libgcc.x86_64 0:4.8.5-28.el7_5.1 libgomp.x86_64 0:4.8.5-28.el7_5.1 openssl.x86_64 1:1.0.2k-12.el7 openssl-libs.x86_64 1:1.0.2k-12.el7 python.x86_64 0:2.7.5-68.el7 python-libs.x86_64 0:2.7.5-68.el7 Complete! 2018-05-28 02:17:40,967 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SSH cfy status stdout: Retrieving manager services status... 
[ip=127.0.0.1] Services: +--------------------------------+---------+ | service | status | +--------------------------------+---------+ | InfluxDB | running | | Celery Management | running | | Logstash | running | | RabbitMQ | running | | AMQP InfluxDB | running | | PostgreSQL | running | | Manager Rest-Service | running | | Cloudify Stage | running | | Riemann | running | | Webserver | running | +--------------------------------+---------+ 2018-05-28 02:17:40,968 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Upload VNFD 2018-05-28 02:17:53,643 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Get or create flavor for all clearwater vm 2018-05-28 02:17:55,326 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Create VNF Instance 2018-05-28 02:18:04,495 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting 'create_deployment_environment' workflow execution 2018-05-28 02:18:04,495 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugin: openstack 2018-05-28 02:18:04,495 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - [localhost] run: /opt/mgmtworker/env/bin/pip freeze 2018-05-28 02:18:04,495 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing deployment plugins 2018-05-28 02:18:04,495 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.operations.install_plugins' 2018-05-28 02:18:04,495 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.operations.install_plugins' 2018-05-28 02:18:04,495 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugin openstack [current_platform=linux_x86_64, current_distro=centos, current_distro_release=core] 2018-05-28 02:18:04,495 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Extracting archive: https://github.com/cloudify-cosmo/cloudify-openstack-plugin/archive/2.0.1.zip 2018-05-28 02:18:04,496 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugin from source 2018-05-28 02:18:09,898 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing from directory: /tmp/tmpcNPNRW [args=--prefix="/tmp/openstack-P6D8kd" --constraint="/tmp/openstack-P6D8kd/constraint.txt", package_name=cloudify-openstack-plugin] 2018-05-28 02:18:09,898 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - [localhost] run: /opt/mgmtworker/env/bin/pip install /tmp/tmpcNPNRW --prefix="/tmp/openstack-P6D8kd" --constraint="/tmp/openstack-P6D8kd/constraint.txt" 2018-05-28 02:19:29,945 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Retrieved package name: cloudify-openstack-plugin 2018-05-28 02:19:29,946 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Removing directory: /tmp/tmpcNPNRW 2018-05-28 02:19:29,946 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.operations.install_plugins' 2018-05-28 02:19:35,285 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - 'create_deployment_environment' workflow execution succeeded 2018-05-28 02:19:35,285 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating deployment work directory 2018-05-28 02:19:35,285 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Skipping starting deployment policy engine core - no policies defined 2018-05-28 02:19:35,433 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Start the VNF Instance deployment 2018-05-28 02:19:41,658 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting 'install' workflow execution 2018-05-28 02:19:41,658 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating 
node 2018-05-28 02:19:41,658 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-28 02:19:41,658 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-28 02:19:41,658 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.security_group.create' 2018-05-28 02:19:41,658 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.security_group.create' 2018-05-28 02:19:47,268 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule before transformations: {u'remote_ip_prefix': u'0.0.0.0/0', u'port': 22} 2018-05-28 02:19:47,269 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.security_group.create' 2018-05-28 02:19:47,269 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.security_group.create' 2018-05-28 02:19:47,269 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.security_group.create' 2018-05-28 02:19:47,269 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': 'tcp', 'ethertype': 'IPv4', 'port_range_max': 22, 'port_range_min': 22, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 02:19:47,269 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': 'tcp', 'ethertype': 'IPv4', 'port_range_max': 2380, 'port_range_min': 2380, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 02:19:47,269 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': 'tcp', 'ethertype': 'IPv4', 'port_range_max': 4000, 'port_range_min': 4000, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 02:19:47,269 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': u'tcp', 'ethertype': 'IPv4', 'port_range_max': 443, 'port_range_min': 443, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 02:19:47,270 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': u'tcp', 'ethertype': 'IPv4', 'port_range_max': 80, 'port_range_min': 80, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 02:19:47,270 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': u'udp', 'ethertype': 'IPv4', 'port_range_max': 161, 'port_range_min': 161, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 02:19:47,270 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule before transformations: {u'remote_ip_prefix': u'0.0.0.0/0', u'port': 2380} 2018-05-28 02:19:47,270 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule before transformations: {u'remote_ip_prefix': u'0.0.0.0/0', u'port': 4000} 2018-05-28 02:19:47,270 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule before transformations: {u'remote_ip_prefix': u'0.0.0.0/0', u'protocol': u'tcp', u'port': 443} 2018-05-28 02:19:47,270 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule before transformations: {u'remote_ip_prefix': u'0.0.0.0/0', u'protocol': u'tcp', u'port': 80} 2018-05-28 02:19:47,270 - 
functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule before transformations: {u'remote_ip_prefix': u'0.0.0.0/0', u'protocol': u'udp', u'port': 161} 2018-05-28 02:19:53,167 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-28 02:19:53,168 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-28 02:19:53,168 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-28 02:19:53,168 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-28 02:19:53,168 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-28 02:19:53,168 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-28 02:19:53,168 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-28 02:19:53,168 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.floatingip.create' 2018-05-28 02:19:53,168 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.floatingip.create' 2018-05-28 02:19:53,169 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.security_group.create' 2018-05-28 02:19:53,169 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.security_group.create' 2018-05-28 02:19:53,169 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.security_group.create' 2018-05-28 02:19:53,169 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.floatingip.create' 2018-05-28 02:19:53,169 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.floatingip.create' 2018-05-28 02:19:53,169 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.security_group.create' 2018-05-28 02:19:53,169 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': u'tcp', 'ethertype': 'IPv4', 'port_range_max': 7888, 'port_range_min': 7888, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 02:19:53,169 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': u'tcp', 'ethertype': 'IPv4', 'port_range_max': 9160, 'port_range_min': 9160, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 02:19:53,169 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule before transformations: {u'remote_ip_prefix': u'0.0.0.0/0', u'protocol': u'tcp', u'port': 7888} 2018-05-28 02:19:53,170 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule before transformations: {u'remote_ip_prefix': u'0.0.0.0/0', u'protocol': u'tcp', u'port': 9160} 2018-05-28 02:19:53,170 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-28 02:19:53,170 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-28 02:19:53,170 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-28 02:19:53,170 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.floatingip.create' 2018-05-28 02:19:53,170 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.security_group.create' 2018-05-28 02:19:53,170 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.security_group.create' 2018-05-28 02:19:53,170 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - 
Sending task 'neutron_plugin.security_group.create' 2018-05-28 02:19:53,170 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.keypair.create' 2018-05-28 02:19:53,171 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.security_group.create' 2018-05-28 02:19:53,171 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.security_group.create' 2018-05-28 02:19:53,171 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.security_group.create' 2018-05-28 02:19:53,171 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Floating IP creation response: {u'router_id': None, u'status': u'DOWN', u'description': u'', u'tags': [], u'tenant_id': u'98553474e7df4666aafb941accbcf7ba', u'created_at': u'2018-05-28T02:19:50Z', u'updated_at': u'2018-05-28T02:19:50Z', u'floating_network_id': u'11c92fd4-326a-487a-a640-1b09c88fcb5b', u'fixed_ip_address': None, u'floating_ip_address': u'172.30.10.122', u'revision_number': 0, u'project_id': u'98553474e7df4666aafb941accbcf7ba', u'port_id': None, u'id': u'572c53b8-192c-4e97-9c0b-aa3989ffbe35'} 2018-05-28 02:19:58,754 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-28 02:19:58,754 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.security_group.create' 2018-05-28 02:19:58,754 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Floating IP creation response: {u'router_id': None, u'status': u'DOWN', u'description': u'', u'tags': [], u'tenant_id': u'98553474e7df4666aafb941accbcf7ba', u'created_at': u'2018-05-28T02:19:50Z', u'updated_at': u'2018-05-28T02:19:50Z', u'floating_network_id': u'11c92fd4-326a-487a-a640-1b09c88fcb5b', u'fixed_ip_address': None, u'floating_ip_address': u'172.30.10.120', u'revision_number': 0, u'project_id': u'98553474e7df4666aafb941accbcf7ba', u'port_id': None, u'id': u'fbdcadc3-0c33-49c7-a61c-329d07c1192a'} 2018-05-28 02:19:58,754 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-28 02:19:58,755 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.security_group.create' 2018-05-28 02:19:58,755 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.security_group.create' 2018-05-28 02:19:58,755 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.security_group.create' 2018-05-28 02:19:58,755 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.floatingip.create' 2018-05-28 02:19:58,755 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.floatingip.create' 2018-05-28 02:19:58,755 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.security_group.create' 2018-05-28 02:19:58,755 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-28 02:19:58,755 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule before transformations: {u'remote_ip_prefix': u'0.0.0.0/0', u'protocol': u'tcp', u'port': 9888} 2018-05-28 02:19:58,755 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': 'tcp', 'ethertype': 'IPv4', 'port_range_max': 53, 'port_range_min': 53, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 02:19:58,755 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule after transformations: 
{'remote_group_id': None, 'direction': 'ingress', 'protocol': u'tcp', 'ethertype': 'IPv4', 'port_range_max': 9888, 'port_range_min': 9888, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 02:19:58,756 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': u'udp', 'ethertype': 'IPv4', 'port_range_max': 53, 'port_range_min': 53, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 02:19:58,756 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule before transformations: {u'remote_ip_prefix': u'0.0.0.0/0', u'port': 53} 2018-05-28 02:19:58,756 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule before transformations: {u'remote_ip_prefix': u'0.0.0.0/0', u'protocol': u'udp', u'port': 53} 2018-05-28 02:19:58,756 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-28 02:19:58,756 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-28 02:20:04,689 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-28 02:20:04,689 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-28 02:20:04,689 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-28 02:20:04,689 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': 'tcp', 'ethertype': 'IPv4', 'port_range_max': 6668, 'port_range_min': 6667, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 02:20:04,689 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': u'tcp', 'ethertype': 'IPv4', 'port_range_max': 10888, 'port_range_min': 10888, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 02:20:04,689 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': u'tcp', 'ethertype': 'IPv4', 'port_range_max': 11211, 'port_range_min': 11211, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 02:20:04,689 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': u'tcp', 'ethertype': 'IPv4', 'port_range_max': 11311, 'port_range_min': 11311, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 02:20:04,690 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': u'tcp', 'ethertype': 'IPv4', 'port_range_max': 5052, 'port_range_min': 5052, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 02:20:04,690 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': u'tcp', 'ethertype': 'IPv4', 'port_range_max': 5054, 'port_range_min': 5054, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 02:20:04,690 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': u'tcp', 'ethertype': 'IPv4', 'port_range_max': 5058, 'port_range_min': 5058, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 02:20:04,690 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': u'tcp', 
'ethertype': 'IPv4', 'port_range_max': 7000, 'port_range_min': 7000, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 02:20:04,690 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': u'tcp', 'ethertype': 'IPv4', 'port_range_max': 7253, 'port_range_min': 7253, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 02:20:04,690 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': u'tcp', 'ethertype': 'IPv4', 'port_range_max': 8888, 'port_range_min': 8888, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 02:20:04,690 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': u'tcp', 'ethertype': 'IPv4', 'port_range_max': 8889, 'port_range_min': 8889, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 02:20:04,690 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': u'tcp', 'ethertype': 'IPv4', 'port_range_max': 9160, 'port_range_min': 9160, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 02:20:04,691 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule before transformations: {u'remote_ip_prefix': u'0.0.0.0/0', u'port_range_max': 6668, u'port_range_min': 6667} 2018-05-28 02:20:04,691 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule before transformations: {u'remote_ip_prefix': u'0.0.0.0/0', u'protocol': u'tcp', u'port': 10888} 2018-05-28 02:20:04,691 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule before transformations: {u'remote_ip_prefix': u'0.0.0.0/0', u'protocol': u'tcp', u'port': 11211} 2018-05-28 02:20:04,691 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule before transformations: {u'remote_ip_prefix': u'0.0.0.0/0', u'protocol': u'tcp', u'port': 11311} 2018-05-28 02:20:04,691 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule before transformations: {u'remote_ip_prefix': u'0.0.0.0/0', u'protocol': u'tcp', u'port': 5052} 2018-05-28 02:20:04,691 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule before transformations: {u'remote_ip_prefix': u'0.0.0.0/0', u'protocol': u'tcp', u'port': 5054} 2018-05-28 02:20:04,691 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule before transformations: {u'remote_ip_prefix': u'0.0.0.0/0', u'protocol': u'tcp', u'port': 5058} 2018-05-28 02:20:04,691 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule before transformations: {u'remote_ip_prefix': u'0.0.0.0/0', u'protocol': u'tcp', u'port': 7000} 2018-05-28 02:20:04,691 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule before transformations: {u'remote_ip_prefix': u'0.0.0.0/0', u'protocol': u'tcp', u'port': 7253} 2018-05-28 02:20:04,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule before transformations: {u'remote_ip_prefix': u'0.0.0.0/0', u'protocol': u'tcp', u'port': 8888} 2018-05-28 02:20:04,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule before transformations: {u'remote_ip_prefix': u'0.0.0.0/0', u'protocol': u'tcp', u'port': 8889} 2018-05-28 02:20:04,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule before transformations: 
{u'remote_ip_prefix': u'0.0.0.0/0', u'protocol': u'tcp', u'port': 9160} 2018-05-28 02:20:04,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-28 02:20:04,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-28 02:20:04,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.security_group.create' 2018-05-28 02:20:04,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.security_group.create' 2018-05-28 02:20:04,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.floatingip.create' 2018-05-28 02:20:04,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.security_group.create' 2018-05-28 02:20:04,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-28 02:20:10,913 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule before transformations: {u'remote_ip_prefix': u'0.0.0.0/0', u'protocol': u'tcp', u'port': 3478} 2018-05-28 02:20:10,913 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': 'tcp', 'ethertype': 'IPv4', 'port_range_max': 6669, 'port_range_min': 6669, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 02:20:10,913 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': u'tcp', 'ethertype': 'IPv4', 'port_range_max': 3478, 'port_range_min': 3478, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 02:20:10,914 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': u'tcp', 'ethertype': 'IPv4', 'port_range_max': 5058, 'port_range_min': 5058, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 02:20:10,914 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': u'tcp', 'ethertype': 'IPv4', 'port_range_max': 5060, 'port_range_min': 5060, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 02:20:10,914 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': u'tcp', 'ethertype': 'IPv4', 'port_range_max': 5062, 'port_range_min': 5062, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 02:20:10,914 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': u'udp', 'ethertype': 'IPv4', 'port_range_max': 3478, 'port_range_min': 3478, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 02:20:10,914 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': u'udp', 'ethertype': 'IPv4', 'port_range_max': 5060, 'port_range_min': 5060, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 02:20:10,914 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': u'udp', 'ethertype': 'IPv4', 'port_range_max': 65535, 'port_range_min': 32768, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 02:20:10,914 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule before transformations: 
{u'remote_ip_prefix': u'0.0.0.0/0', u'port': 6669} 2018-05-28 02:20:10,914 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule before transformations: {u'remote_ip_prefix': u'0.0.0.0/0', u'port_range_max': 65535, u'port_range_min': 32768, u'protocol': u'udp'} 2018-05-28 02:20:10,914 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule before transformations: {u'remote_ip_prefix': u'0.0.0.0/0', u'protocol': u'tcp', u'port': 5058} 2018-05-28 02:20:10,915 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule before transformations: {u'remote_ip_prefix': u'0.0.0.0/0', u'protocol': u'tcp', u'port': 5060} 2018-05-28 02:20:10,915 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule before transformations: {u'remote_ip_prefix': u'0.0.0.0/0', u'protocol': u'tcp', u'port': 5062} 2018-05-28 02:20:10,915 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule before transformations: {u'remote_ip_prefix': u'0.0.0.0/0', u'protocol': u'udp', u'port': 3478} 2018-05-28 02:20:10,915 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Security group rule before transformations: {u'remote_ip_prefix': u'0.0.0.0/0', u'protocol': u'udp', u'port': 5060} 2018-05-28 02:20:10,915 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-28 02:20:10,915 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-28 02:20:10,915 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.keypair.create' 2018-05-28 02:20:10,915 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.security_group.create' 2018-05-28 02:20:10,915 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.security_group.create' 2018-05-28 02:20:10,916 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.security_group.create' 2018-05-28 02:20:10,916 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Floating IP creation response: {u'router_id': None, u'status': u'DOWN', u'description': u'', u'tags': [], u'tenant_id': u'98553474e7df4666aafb941accbcf7ba', u'created_at': u'2018-05-28T02:20:08Z', u'updated_at': u'2018-05-28T02:20:08Z', u'floating_network_id': u'11c92fd4-326a-487a-a640-1b09c88fcb5b', u'fixed_ip_address': None, u'floating_ip_address': u'172.30.10.123', u'revision_number': 0, u'project_id': u'98553474e7df4666aafb941accbcf7ba', u'port_id': None, u'id': u'fc11440a-aded-4971-a409-6849920728d3'} 2018-05-28 02:20:10,916 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-28 02:20:10,916 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-28 02:20:10,916 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-28 02:20:10,916 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.floatingip.create' 2018-05-28 02:20:10,916 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Using external resource keypair: cloudify_ims_kp-dcfb0a64-e70e-4881-a426-4ffbb06e6a56 2018-05-28 02:20:16,402 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-28 02:20:16,402 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-28 02:20:16,402 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.keypair.create' 2018-05-28 02:20:16,403 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-28 02:20:16,403 - 
functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-28 02:20:16,403 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-28 02:20:16,403 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-28 02:20:16,403 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-28 02:20:16,403 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-28 02:20:16,403 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-28 02:20:16,403 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.security_group.create' 2018-05-28 02:20:16,404 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-28 02:20:22,114 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-28 02:20:22,115 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-28 02:20:22,115 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.create' 2018-05-28 02:20:22,115 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.create' 2018-05-28 02:20:22,115 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.create' 2018-05-28 02:20:22,115 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.create' 2018-05-28 02:20:22,115 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.create' 2018-05-28 02:20:27,795 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-28 02:20:27,796 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-28 02:20:27,796 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-28 02:20:27,796 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-28 02:20:27,796 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-28 02:20:27,796 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.create' 2018-05-28 02:20:27,796 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.create' 2018-05-28 02:20:27,796 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.create' 2018-05-28 02:20:27,796 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.create' 2018-05-28 02:20:27,796 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.create' 2018-05-28 02:20:27,796 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-28 02:20:27,797 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.create' 2018-05-28 02:20:27,797 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.create' 2018-05-28 02:20:27,797 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.create' 2018-05-28 02:20:27,797 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Instance relationship target instances: [{u'external_id': u'cloudify_ims_kp-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', u'external_name': u'cloudify_ims_kp-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', u'external_type': u'keypair'}, {u'external_id': u'167f4e2f-3841-40f3-86bd-1a58a03874e1', u'external_name': u'clearwater-sg_base', u'external_type': u'security_group'}, {u'external_id': u'a9388b9c-94ce-4c34-a32e-ca8334d61e0b', u'external_name': 
u'clearwater-sg_dime', u'external_type': u'security_group'}] 2018-05-28 02:20:27,797 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Instance relationship target instances: [{u'external_id': u'cloudify_ims_kp-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', u'external_name': u'cloudify_ims_kp-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', u'external_type': u'keypair'}] 2018-05-28 02:20:27,797 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Instance relationship target instances: [{u'external_id': u'cloudify_ims_kp-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', u'external_name': u'cloudify_ims_kp-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', u'external_type': u'keypair'}, {u'external_id': u'167f4e2f-3841-40f3-86bd-1a58a03874e1', u'external_name': u'clearwater-sg_base', u'external_type': u'security_group'}, {u'external_id': u'79d5182f-1ea9-4417-86a0-0c57683ae199', u'external_name': u'clearwater-sg_vellum', u'external_type': u'security_group'}] 2018-05-28 02:20:27,797 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - server.create() server before transformations: {'meta': {}, 'name': 'server_clearwater-opnfv_dime_host_rcflbn'} 2018-05-28 02:20:27,797 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - server.create() server before transformations: {'meta': {}, 'name': 'server_clearwater-opnfv_proxy_host_r84dzn'} 2018-05-28 02:20:27,797 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - server.create() server before transformations: {'meta': {}, 'name': 'server_clearwater-opnfv_vellum_host_rw33sl'} 2018-05-28 02:20:27,798 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Instance relationship target instances: [{u'external_id': u'cloudify_ims_kp-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', u'external_name': u'cloudify_ims_kp-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', u'external_type': u'keypair'}, {u'external_id': u'167f4e2f-3841-40f3-86bd-1a58a03874e1', u'external_name': u'clearwater-sg_base', u'external_type': u'security_group'}, {u'external_id': u'24d93475-60fe-4c99-a5d4-d1c5ed6b2ef3', u'external_name': u'clearwater-sg_ellis', u'external_type': u'security_group'}, {u'external_id': u'572c53b8-192c-4e97-9c0b-aa3989ffbe35', u'floating_ip_address': u'172.30.10.122', u'external_type': u'floatingip'}] 2018-05-28 02:20:27,798 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Instance relationship target instances: [{u'external_id': u'cloudify_ims_kp-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', u'external_name': u'cloudify_ims_kp-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', u'external_type': u'keypair'}, {u'external_id': u'167f4e2f-3841-40f3-86bd-1a58a03874e1', u'external_name': u'clearwater-sg_base', u'external_type': u'security_group'}, {u'external_id': u'c2fccc4d-bb9c-470b-822c-8a683e7bd176', u'external_name': u'clearwater-sg_internal_sip', u'external_type': u'security_group'}, {u'external_id': u'208069b9-fff6-4bdc-b15b-dec7f5262122', u'external_name': u'clearwater-sg_sprout', u'external_type': u'security_group'}] 2018-05-28 02:20:27,798 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - server.create() server before transformations: {'meta': {}, 'name': 'server_clearwater-opnfv_ellis_host_l9ydgp'} 2018-05-28 02:20:27,798 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - server.create() server before transformations: {'meta': {}, 'name': 'server_clearwater-opnfv_sprout_host_u75pco'} 2018-05-28 02:20:27,798 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - server.create() server after transformations: {'name': 'server_clearwater-opnfv_dime_host_rcflbn', 'key_name': 
u'cloudify_ims_kp-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'image': u'07f7b5ce-c19e-4f09-9e81-19fe67e4909a', 'meta': {'cloudify_management_network_name': u'cloudify_ims_network-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'cloudify_management_network_id': u'cecec741-184b-4629-bdff-259df7fdcaff'}, 'nics': [{'net-id': u'cecec741-184b-4629-bdff-259df7fdcaff'}], 'flavor': u'e1a768aa-2791-43de-9f16-707027d20a08'} 2018-05-28 02:20:27,798 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Asking Nova to create server. All possible parameters are: name,key_name,image,meta,nics,flavor) 2018-05-28 02:20:27,798 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Asking Nova to create server. All possible parameters are: name,key_name,image,meta,nics,flavor) 2018-05-28 02:20:27,798 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Asking Nova to create server. All possible parameters are: name,key_name,image,meta,nics,flavor) 2018-05-28 02:20:27,798 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating VM with parameters: {'name': 'server_clearwater-opnfv_dime_host_rcflbn', 'key_name': u'cloudify_ims_kp-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'image': u'07f7b5ce-c19e-4f09-9e81-19fe67e4909a', 'meta': {'cloudify_management_network_name': u'cloudify_ims_network-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'cloudify_management_network_id': u'cecec741-184b-4629-bdff-259df7fdcaff'}, 'nics': [{'net-id': u'cecec741-184b-4629-bdff-259df7fdcaff'}], 'flavor': u'e1a768aa-2791-43de-9f16-707027d20a08'} 2018-05-28 02:20:27,799 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating VM with parameters: {'name': 'server_clearwater-opnfv_proxy_host_r84dzn', 'key_name': u'cloudify_ims_kp-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'image': u'07f7b5ce-c19e-4f09-9e81-19fe67e4909a', 'meta': {'cloudify_management_network_name': u'cloudify_ims_network-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'cloudify_management_network_id': u'cecec741-184b-4629-bdff-259df7fdcaff'}, 'nics': [{'net-id': u'cecec741-184b-4629-bdff-259df7fdcaff'}], 'flavor': u'e1a768aa-2791-43de-9f16-707027d20a08'} 2018-05-28 02:20:27,799 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating VM with parameters: {'name': 'server_clearwater-opnfv_vellum_host_rw33sl', 'key_name': u'cloudify_ims_kp-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'image': u'07f7b5ce-c19e-4f09-9e81-19fe67e4909a', 'meta': {'cloudify_management_network_name': u'cloudify_ims_network-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'cloudify_management_network_id': u'cecec741-184b-4629-bdff-259df7fdcaff'}, 'nics': [{'net-id': u'cecec741-184b-4629-bdff-259df7fdcaff'}], 'flavor': u'e1a768aa-2791-43de-9f16-707027d20a08'} 2018-05-28 02:20:27,799 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - server.create() server after transformations: {'name': 'server_clearwater-opnfv_proxy_host_r84dzn', 'key_name': u'cloudify_ims_kp-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'image': u'07f7b5ce-c19e-4f09-9e81-19fe67e4909a', 'meta': {'cloudify_management_network_name': u'cloudify_ims_network-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'cloudify_management_network_id': u'cecec741-184b-4629-bdff-259df7fdcaff'}, 'nics': [{'net-id': u'cecec741-184b-4629-bdff-259df7fdcaff'}], 'flavor': u'e1a768aa-2791-43de-9f16-707027d20a08'} 2018-05-28 02:20:27,799 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - server.create() server after transformations: {'name': 'server_clearwater-opnfv_vellum_host_rw33sl', 'key_name': u'cloudify_ims_kp-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'image': u'07f7b5ce-c19e-4f09-9e81-19fe67e4909a', 
'meta': {'cloudify_management_network_name': u'cloudify_ims_network-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'cloudify_management_network_id': u'cecec741-184b-4629-bdff-259df7fdcaff'}, 'nics': [{'net-id': u'cecec741-184b-4629-bdff-259df7fdcaff'}], 'flavor': u'e1a768aa-2791-43de-9f16-707027d20a08'} 2018-05-28 02:20:33,590 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Asking Nova to create server. All possible parameters are: name,key_name,image,meta,nics,flavor) 2018-05-28 02:20:33,591 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Asking Nova to create server. All possible parameters are: name,key_name,image,meta,nics,flavor) 2018-05-28 02:20:33,591 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating VM with parameters: {'name': 'server_clearwater-opnfv_ellis_host_l9ydgp', 'key_name': u'cloudify_ims_kp-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'image': u'07f7b5ce-c19e-4f09-9e81-19fe67e4909a', 'meta': {'cloudify_management_network_name': u'cloudify_ims_network-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'cloudify_management_network_id': u'cecec741-184b-4629-bdff-259df7fdcaff'}, 'nics': [{'net-id': u'cecec741-184b-4629-bdff-259df7fdcaff'}], 'flavor': u'e1a768aa-2791-43de-9f16-707027d20a08'} 2018-05-28 02:20:33,591 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating VM with parameters: {'name': 'server_clearwater-opnfv_sprout_host_u75pco', 'key_name': u'cloudify_ims_kp-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'image': u'07f7b5ce-c19e-4f09-9e81-19fe67e4909a', 'meta': {'cloudify_management_network_name': u'cloudify_ims_network-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'cloudify_management_network_id': u'cecec741-184b-4629-bdff-259df7fdcaff'}, 'nics': [{'net-id': u'cecec741-184b-4629-bdff-259df7fdcaff'}], 'flavor': u'e1a768aa-2791-43de-9f16-707027d20a08'} 2018-05-28 02:20:33,591 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - server.create() server after transformations: {'name': 'server_clearwater-opnfv_ellis_host_l9ydgp', 'key_name': u'cloudify_ims_kp-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'image': u'07f7b5ce-c19e-4f09-9e81-19fe67e4909a', 'meta': {'cloudify_management_network_name': u'cloudify_ims_network-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'cloudify_management_network_id': u'cecec741-184b-4629-bdff-259df7fdcaff'}, 'nics': [{'net-id': u'cecec741-184b-4629-bdff-259df7fdcaff'}], 'flavor': u'e1a768aa-2791-43de-9f16-707027d20a08'} 2018-05-28 02:20:33,591 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - server.create() server after transformations: {'name': 'server_clearwater-opnfv_sprout_host_u75pco', 'key_name': u'cloudify_ims_kp-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'image': u'07f7b5ce-c19e-4f09-9e81-19fe67e4909a', 'meta': {'cloudify_management_network_name': u'cloudify_ims_network-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'cloudify_management_network_id': u'cecec741-184b-4629-bdff-259df7fdcaff'}, 'nics': [{'net-id': u'cecec741-184b-4629-bdff-259df7fdcaff'}], 'flavor': u'e1a768aa-2791-43de-9f16-707027d20a08'} 2018-05-28 02:20:33,591 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.create' 2018-05-28 02:20:33,591 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.create' 2018-05-28 02:20:33,591 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.create' 2018-05-28 02:20:33,591 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.create' 2018-05-28 02:20:33,591 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task 
succeeded 'nova_plugin.server.create' 2018-05-28 02:20:33,591 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.create' 2018-05-28 02:20:33,591 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.create' 2018-05-28 02:20:33,591 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-28 02:20:33,591 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-28 02:20:33,592 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.create' 2018-05-28 02:20:33,592 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-28 02:20:33,592 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-28 02:20:40,803 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-28 02:20:40,803 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Instance relationship target instances: [{u'external_id': u'cloudify_ims_kp-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', u'external_name': u'cloudify_ims_kp-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', u'external_type': u'keypair'}, {u'external_id': u'167f4e2f-3841-40f3-86bd-1a58a03874e1', u'external_name': u'clearwater-sg_base', u'external_type': u'security_group'}, {u'external_id': u'b4d61087-b653-4112-aaf8-01fbe9c33f62', u'external_name': u'clearwater-sg_homer', u'external_type': u'security_group'}] 2018-05-28 02:20:46,700 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - server.create() server before transformations: {'meta': {}, 'name': 'server_clearwater-opnfv_homer_host_mjmnui'} 2018-05-28 02:20:46,700 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' 2018-05-28 02:20:46,701 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-28 02:20:46,701 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-28 02:20:46,701 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' 2018-05-28 02:20:46,701 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Instance relationship target instances: [{u'external_id': u'cloudify_ims_kp-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', u'external_name': u'cloudify_ims_kp-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', u'external_type': u'keypair'}, {u'external_id': u'167f4e2f-3841-40f3-86bd-1a58a03874e1', u'external_name': u'clearwater-sg_base', u'external_type': u'security_group'}, {u'external_id': u'208de46f-9faa-494c-a65f-67b848ffb9b9', u'external_name': u'clearwater-sg_bind', u'external_type': u'security_group'}, {u'external_id': u'fbdcadc3-0c33-49c7-a61c-329d07c1192a', u'floating_ip_address': u'172.30.10.120', u'external_type': u'floatingip'}] 2018-05-28 02:20:46,701 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Instance relationship target instances: [{u'external_id': u'cloudify_ims_kp-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', u'external_name': u'cloudify_ims_kp-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', u'external_type': u'keypair'}, {u'external_id': u'167f4e2f-3841-40f3-86bd-1a58a03874e1', u'external_name': u'clearwater-sg_base', u'external_type': u'security_group'}, {u'external_id': u'd2c824e0-7b94-4220-a808-5962bb09ae6d', u'external_name': u'clearwater-sg_bono', u'external_type': u'security_group'}, {u'external_id': u'c2fccc4d-bb9c-470b-822c-8a683e7bd176', u'external_name': u'clearwater-sg_internal_sip', u'external_type': u'security_group'}, {u'external_id': u'fc11440a-aded-4971-a409-6849920728d3', 
u'floating_ip_address': u'172.30.10.123', u'external_type': u'floatingip'}] 2018-05-28 02:20:46,701 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - server.create() server before transformations: {'meta': {}, 'name': 'server_clearwater-opnfv_bind_host_eehrez'} 2018-05-28 02:20:46,701 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - server.create() server before transformations: {'meta': {}, 'name': 'server_clearwater-opnfv_bono_host_0py57x'} 2018-05-28 02:20:46,701 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' 2018-05-28 02:20:46,701 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' 2018-05-28 02:20:46,702 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' 2018-05-28 02:20:46,702 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' 2018-05-28 02:20:46,702 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' 2018-05-28 02:20:46,702 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-28 02:20:46,702 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-28 02:20:46,702 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-28 02:20:46,702 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - server.create() server after transformations: {'name': 'server_clearwater-opnfv_homer_host_mjmnui', 'key_name': u'cloudify_ims_kp-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'image': u'07f7b5ce-c19e-4f09-9e81-19fe67e4909a', 'meta': {'cloudify_management_network_name': u'cloudify_ims_network-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'cloudify_management_network_id': u'cecec741-184b-4629-bdff-259df7fdcaff'}, 'nics': [{'net-id': u'cecec741-184b-4629-bdff-259df7fdcaff'}], 'flavor': u'e1a768aa-2791-43de-9f16-707027d20a08'} 2018-05-28 02:20:46,702 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Asking Nova to create server. All possible parameters are: name,key_name,image,meta,nics,flavor) 2018-05-28 02:20:46,702 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Asking Nova to create server. 
All possible parameters are: name,key_name,image,meta,nics,flavor) 2018-05-28 02:20:46,702 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating VM with parameters: {'name': 'server_clearwater-opnfv_bind_host_eehrez', 'key_name': u'cloudify_ims_kp-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'image': u'07f7b5ce-c19e-4f09-9e81-19fe67e4909a', 'meta': {'cloudify_management_network_name': u'cloudify_ims_network-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'cloudify_management_network_id': u'cecec741-184b-4629-bdff-259df7fdcaff'}, 'nics': [{'net-id': u'cecec741-184b-4629-bdff-259df7fdcaff'}], 'flavor': u'e1a768aa-2791-43de-9f16-707027d20a08'} 2018-05-28 02:20:46,703 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating VM with parameters: {'name': 'server_clearwater-opnfv_homer_host_mjmnui', 'key_name': u'cloudify_ims_kp-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'image': u'07f7b5ce-c19e-4f09-9e81-19fe67e4909a', 'meta': {'cloudify_management_network_name': u'cloudify_ims_network-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'cloudify_management_network_id': u'cecec741-184b-4629-bdff-259df7fdcaff'}, 'nics': [{'net-id': u'cecec741-184b-4629-bdff-259df7fdcaff'}], 'flavor': u'e1a768aa-2791-43de-9f16-707027d20a08'} 2018-05-28 02:20:46,703 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - server.create() server after transformations: {'name': 'server_clearwater-opnfv_bind_host_eehrez', 'key_name': u'cloudify_ims_kp-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'image': u'07f7b5ce-c19e-4f09-9e81-19fe67e4909a', 'meta': {'cloudify_management_network_name': u'cloudify_ims_network-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'cloudify_management_network_id': u'cecec741-184b-4629-bdff-259df7fdcaff'}, 'nics': [{'net-id': u'cecec741-184b-4629-bdff-259df7fdcaff'}], 'flavor': u'e1a768aa-2791-43de-9f16-707027d20a08'} 2018-05-28 02:20:46,703 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Asking Nova to create server. 
All possible parameters are: name,key_name,image,meta,nics,flavor) 2018-05-28 02:20:46,703 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating VM with parameters: {'name': 'server_clearwater-opnfv_bono_host_0py57x', 'key_name': u'cloudify_ims_kp-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'image': u'07f7b5ce-c19e-4f09-9e81-19fe67e4909a', 'meta': {'cloudify_management_network_name': u'cloudify_ims_network-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'cloudify_management_network_id': u'cecec741-184b-4629-bdff-259df7fdcaff'}, 'nics': [{'net-id': u'cecec741-184b-4629-bdff-259df7fdcaff'}], 'flavor': u'e1a768aa-2791-43de-9f16-707027d20a08'} 2018-05-28 02:20:46,703 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - server.create() server after transformations: {'name': 'server_clearwater-opnfv_bono_host_0py57x', 'key_name': u'cloudify_ims_kp-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'image': u'07f7b5ce-c19e-4f09-9e81-19fe67e4909a', 'meta': {'cloudify_management_network_name': u'cloudify_ims_network-dcfb0a64-e70e-4881-a426-4ffbb06e6a56', 'cloudify_management_network_id': u'cecec741-184b-4629-bdff-259df7fdcaff'}, 'nics': [{'net-id': u'cecec741-184b-4629-bdff-259df7fdcaff'}], 'flavor': u'e1a768aa-2791-43de-9f16-707027d20a08'} 2018-05-28 02:20:46,703 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.create' 2018-05-28 02:20:52,576 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Server is ACTIVE 2018-05-28 02:20:52,576 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' 2018-05-28 02:20:52,576 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' 2018-05-28 02:20:52,576 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... 
[retry_after=30] 2018-05-28 02:20:52,576 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:20:52,577 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-28 02:20:52,577 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-28 02:20:52,577 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-28 02:20:52,577 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating Agent 2018-05-28 02:20:52,577 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.create' 2018-05-28 02:20:52,577 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' 2018-05-28 02:20:52,577 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' 2018-05-28 02:20:52,577 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' 2018-05-28 02:20:52,577 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-28 02:20:52,577 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-28 02:20:52,578 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-28 02:20:52,578 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.create' 2018-05-28 02:20:52,578 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' 2018-05-28 02:20:52,578 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' 2018-05-28 02:20:52,578 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.create' 2018-05-28 02:20:52,578 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.create' 2018-05-28 02:20:52,578 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.start' 2018-05-28 02:20:59,633 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'cloudify_agent.installer.operations.create' -> Low level socket error connecting to host 10.67.79.11 on port 22: Unable to connect to port 22 on 10.67.79.11 (tried 1 time) 2018-05-28 02:20:59,633 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' 2018-05-28 02:20:59,633 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:20:59,633 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:20:59,633 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:20:59,633 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:20:59,634 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:20:59,634 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:20:59,634 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:20:59,634 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:20:59,634 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - fabric_env set by default value 2018-05-28 02:20:59,634 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Validating SSH connection 2018-05-28 
02:20:59,634 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Server is ACTIVE 2018-05-28 02:20:59,634 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.start' 2018-05-28 02:20:59,634 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating Agent 2018-05-28 02:20:59,635 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.create' 2018-05-28 02:20:59,635 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' 2018-05-28 02:20:59,635 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Server is ACTIVE 2018-05-28 02:20:59,635 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Server is ACTIVE 2018-05-28 02:20:59,635 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating Agent 2018-05-28 02:20:59,635 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.create' 2018-05-28 02:20:59,635 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] 2018-05-28 02:20:59,635 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.create' 2018-05-28 02:20:59,635 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.create' 2018-05-28 02:20:59,635 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.start' 2018-05-28 02:20:59,636 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:20:59,636 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:20:59,636 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:20:59,636 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:20:59,636 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating Agent 2018-05-28 02:20:59,636 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.create' 2018-05-28 02:20:59,636 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.create' 2018-05-28 02:20:59,636 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.start' 2018-05-28 02:20:59,636 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:20:59,636 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:20:59,637 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:20:59,637 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:20:59,637 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:20:59,637 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:20:59,637 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:20:59,637 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:20:59,637 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying 
function:setter on Attribute 2018-05-28 02:20:59,637 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:20:59,637 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:20:59,638 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:20:59,638 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:20:59,638 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:20:59,638 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - fabric_env set by default value 2018-05-28 02:20:59,638 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - fabric_env set by default value 2018-05-28 02:20:59,638 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Validating SSH connection 2018-05-28 02:21:05,092 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Validating SSH connection 2018-05-28 02:21:05,092 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'cloudify_agent.installer.operations.create' -> Low level socket error connecting to host 10.67.79.13 on port 22: Unable to connect to port 22 on 10.67.79.13 (tried 1 time) 2018-05-28 02:21:05,092 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:05,092 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:05,093 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:05,093 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:05,093 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:05,093 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:05,093 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:05,093 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:05,093 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:05,093 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - fabric_env set by default value 2018-05-28 02:21:05,093 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Validating SSH connection 2018-05-28 02:21:05,094 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... 
[retry_after=30] 2018-05-28 02:21:05,094 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'cloudify_agent.installer.operations.create' -> Low level socket error connecting to host 10.67.79.4 on port 22: Unable to connect to port 22 on 10.67.79.4 (tried 1 time) 2018-05-28 02:21:10,518 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'cloudify_agent.installer.operations.create' -> Low level socket error connecting to host 10.67.79.17 on port 22: Unable to connect to port 22 on 10.67.79.17 (tried 1 time) 2018-05-28 02:21:10,518 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.create' [retry 1/60] 2018-05-28 02:21:10,518 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.create' [retry 1/60] 2018-05-28 02:21:10,518 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:10,518 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:10,519 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:10,519 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:10,519 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:10,519 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:10,519 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:10,519 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:10,519 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:10,519 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - fabric_env set by default value 2018-05-28 02:21:10,519 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Validating SSH connection 2018-05-28 02:21:10,519 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'cloudify_agent.installer.operations.create' -> Low level socket error connecting to host 10.67.79.11 on port 22: Unable to connect to port 22 on 10.67.79.11 (tried 1 time) [retry 1/60] 2018-05-28 02:21:15,945 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.create' [retry 1/60] 2018-05-28 02:21:15,945 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.create' [retry 1/60] 2018-05-28 02:21:15,946 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:21,339 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:21,339 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:21,339 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:21,339 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:21,339 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:21,339 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:21,339 - 
functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:21,339 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:21,339 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - fabric_env set by default value 2018-05-28 02:21:21,340 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Validating SSH connection 2018-05-28 02:21:21,340 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 1/60] 2018-05-28 02:21:21,340 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.create' [retry 1/60] 2018-05-28 02:21:21,340 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.create' [retry 1/60] 2018-05-28 02:21:21,340 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 1/60] 2018-05-28 02:21:21,340 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:21,340 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:21,340 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:21,340 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:21,341 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:21,341 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:21,341 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:21,341 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:21,341 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:21,341 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - fabric_env set by default value 2018-05-28 02:21:21,341 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Validating SSH connection 2018-05-28 02:21:21,341 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Server is ACTIVE 2018-05-28 02:21:21,341 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.start' [retry 1/60] 2018-05-28 02:21:26,715 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating Agent 2018-05-28 02:21:26,715 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.create' 2018-05-28 02:21:26,716 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.create' 2018-05-28 02:21:26,716 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,716 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.create' [retry 1/60] 2018-05-28 02:21:26,716 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.create' [retry 1/60] 2018-05-28 02:21:26,716 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,716 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,716 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - 
Applying function:setter on Attribute 2018-05-28 02:21:26,716 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,717 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,717 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,717 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,717 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,717 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - fabric_env set by default value 2018-05-28 02:21:26,717 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Validating SSH connection 2018-05-28 02:21:26,717 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,717 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,717 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,717 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,718 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,718 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,718 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,718 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,718 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,718 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,718 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,718 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,718 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,718 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,719 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,719 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,719 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,719 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,719 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,719 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,719 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,719 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,719 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,719 - 
functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,720 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - broker_get_settings_from_manager set by default value 2018-05-28 02:21:26,720 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - disable_requiretty set by default value 2018-05-28 02:21:26,720 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - env set by default value 2018-05-28 02:21:26,720 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SSH connection is ready 2018-05-28 02:21:26,720 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - system_python set by default value 2018-05-28 02:21:26,720 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.create' [retry 2/60] 2018-05-28 02:21:26,720 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,720 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,720 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:26,720 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:32,206 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:32,206 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:32,206 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:32,206 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:32,206 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:32,207 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Attempting to locate wget on the host machine 2018-05-28 02:21:32,207 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating agent from package 2018-05-28 02:21:32,207 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating Agent sprout_host_u75pco 2018-05-28 02:21:32,207 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Downloading Agent Package from https://10.67.79.9:53333/resources/packages/agents/ubuntu-trusty-agent.tar.gz 2018-05-28 02:21:32,207 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - fabric_env set by default value 2018-05-28 02:21:32,207 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Untaring Agent package... 
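
The retries logged above are the crux of this phase: 'nova_plugin.server.start' is rescheduled until each VM leaves BUILD:spawning, and 'cloudify_agent.installer.operations.create' is rescheduled until port 22 on the VM answers, with retry_after=30 and up to 60 retries. A minimal standalone sketch of that wait-for-SSH loop (plain Python sockets, not the Cloudify plugin code; the host and the limits are taken from the log only for illustration):

    import socket
    import time

    def wait_for_ssh(host, port=22, retry_after=30, retries=60):
        # Poll the TCP port until it accepts a connection, mirroring the
        # "Unable to connect to port 22 ... Retrying... [retry_after=30]" entries above.
        for attempt in range(1, retries + 1):
            try:
                sock = socket.create_connection((host, port), timeout=10)
                sock.close()
                return True
            except OSError as exc:  # connection refused / timed out
                print("Unable to connect to port %d on %s (retry %d/%d): %s"
                      % (port, host, attempt, retries, exc))
                time.sleep(retry_after)
        return False

    # Example (10.67.79.11 is one of the agent hosts in this log):
    # wait_for_ssh("10.67.79.11")
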
2018-05-28 02:21:32,207 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Uploading SSL certificate from /etc/cloudify/ssl/cloudify_internal_cert.pem to /home/ubuntu/sprout_host_u75pco/cloudify/ssl/cloudify_internal_cert.pem 2018-05-28 02:21:32,207 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Validating SSH connection 2018-05-28 02:21:32,207 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 1/60] 2018-05-28 02:21:32,208 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SSH connection is ready 2018-05-28 02:21:32,208 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:32,208 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:32,208 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:32,208 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:32,208 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:32,208 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:32,208 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:32,208 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:32,209 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:32,209 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:32,209 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:32,209 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:32,209 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:32,209 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:32,209 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:32,209 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:32,210 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:32,210 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:32,210 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:32,210 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:32,210 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:32,210 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:32,210 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:32,210 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:32,210 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - broker_get_settings_from_manager set by default value 2018-05-28 02:21:32,210 - functest.opnfv_tests.vnf.ims.cloudify_ims - 
DEBUG - disable_requiretty set by default value 2018-05-28 02:21:32,210 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - env set by default value 2018-05-28 02:21:32,211 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - system_python set by default value 2018-05-28 02:21:32,211 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Attempting to locate wget on the host machine 2018-05-28 02:21:32,211 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating Agent ellis_host_l9ydgp 2018-05-28 02:21:32,211 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating agent from package 2018-05-28 02:21:32,211 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Downloading Agent Package from https://10.67.79.9:53333/resources/packages/agents/ubuntu-trusty-agent.tar.gz 2018-05-28 02:21:32,211 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Untaring Agent package... 2018-05-28 02:21:32,211 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Uploading SSL certificate from /etc/cloudify/ssl/cloudify_internal_cert.pem to /home/ubuntu/ellis_host_l9ydgp/cloudify/ssl/cloudify_internal_cert.pem 2018-05-28 02:21:32,211 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring... 2018-05-28 02:21:37,687 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 1/60] 2018-05-28 02:21:37,687 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Auto-correcting virtualenv /home/ubuntu/sprout_host_u75pco/env 2018-05-28 02:21:37,688 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Cloudify Agent will be created using the following environment: {'CLOUDIFY_BROKER_IP': '10.67.79.9', 'CLOUDIFY_DAEMON_MAX_WORKERS': '5', 'BROKER_SSL_CERT_PATH': '/home/ubuntu/sprout_host_u75pco/cloudify/ssl/cloudify_internal_cert.pem', 'CLOUDIFY_DAEMON_PROCESS_MANAGEMENT': 'init.d', 'LOCAL_REST_CERT_FILE': '/home/ubuntu/sprout_host_u75pco/cloudify/ssl/cloudify_internal_cert.pem', 'CLOUDIFY_REST_TENANT': 'default_tenant', 'CLOUDIFY_DAEMON_USER': 'ubuntu', 'REST_PORT': '53333', 'CLOUDIFY_DAEMON_MIN_WORKERS': '0', 'CLOUDIFY_BYPASS_MAINTENANCE_MODE': 'False', 'CLOUDIFY_DAEMON_NAME': 'sprout_host_u75pco', 'REST_HOST': '10.67.79.9', 'CLOUDIFY_DAEMON_WORKDIR': '/home/ubuntu/sprout_host_u75pco/work', 'CLOUDIFY_DAEMON_QUEUE': 'sprout_host_u75pco', 'CLOUDIFY_REST_TOKEN': 'WyIwIiwiMTNjOTk0MTg5Yzk1OWE0MDFkNDNhMzFmN2NkYzQyYzciXQ.Dez4tw.gMHyzhpL3v0mmt2_IbxFkktsKHc'} 2018-05-28 02:21:37,688 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Disabling requiretty directive in sudoers file 2018-05-28 02:21:37,688 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully configured cfy-agent 2018-05-28 02:21:37,688 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.create' [retry 1/60] 2018-05-28 02:21:37,688 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:37,688 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:37,688 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:37,688 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:37,689 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:37,689 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:37,689 - 
functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:37,689 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:37,689 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:37,689 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:37,689 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:37,689 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:37,690 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:37,690 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:37,690 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:37,690 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:37,690 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:37,690 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:37,690 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:37,690 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:37,690 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:37,691 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - broker_get_settings_from_manager set by default value 2018-05-28 02:21:37,691 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating... 2018-05-28 02:21:37,691 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - disable_requiretty set by default value 2018-05-28 02:21:37,691 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - env set by default value 2018-05-28 02:21:37,691 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - /home/ubuntu/sprout_host_u75pco/env/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:334: SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. You can upgrade to a newer version of Python to solve this. 
For more information, see https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings 2018-05-28 02:21:37,691 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SNIMissingWarning 2018-05-28 02:21:37,691 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SSH connection is ready 2018-05-28 02:21:37,691 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully created daemon: sprout_host_u75pco 2018-05-28 02:21:37,691 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - system_python set by default value 2018-05-28 02:21:37,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring Agent 2018-05-28 02:21:37,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.configure' 2018-05-28 02:21:37,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.create' [retry 2/60] 2018-05-28 02:21:37,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:37,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:37,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:37,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:37,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:37,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:37,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:37,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:37,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:37,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:37,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:43,083 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:43,084 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:43,084 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:43,084 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - env set by default value 2018-05-28 02:21:43,084 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:43,084 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:43,084 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:43,084 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:43,084 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:43,085 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:43,085 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - broker_get_settings_from_manager set by default value 2018-05-28 02:21:43,085 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - 
Applying function:setter on Attribute 2018-05-28 02:21:43,085 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:43,085 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating Agent vellum_host_rw33sl 2018-05-28 02:21:43,085 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:43,085 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:43,085 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:43,085 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - disable_requiretty set by default value 2018-05-28 02:21:43,086 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:43,086 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:43,086 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:43,086 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SSH connection is ready 2018-05-28 02:21:43,086 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring... 2018-05-28 02:21:43,086 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:43,086 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully configured cfy-agent 2018-05-28 02:21:43,086 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Disabling requiretty directive in sudoers file 2018-05-28 02:21:43,086 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:43,087 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:43,087 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Attempting to locate wget on the host machine 2018-05-28 02:21:43,087 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating agent from package 2018-05-28 02:21:43,087 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Attempting to locate wget on the host machine 2018-05-28 02:21:43,087 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:43,087 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Validating SSH connection 2018-05-28 02:21:43,087 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:43,087 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Uploading SSL certificate from /etc/cloudify/ssl/cloudify_internal_cert.pem to /home/ubuntu/bind_host_eehrez/cloudify/ssl/cloudify_internal_cert.pem 2018-05-28 02:21:43,087 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:43,088 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:43,088 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:43,088 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating agent from package 2018-05-28 02:21:43,088 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:43,088 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Auto-correcting virtualenv /home/ubuntu/ellis_host_l9ydgp/env 2018-05-28 02:21:43,088 - functest.opnfv_tests.vnf.ims.cloudify_ims 
- DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:43,088 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Untaring Agent package... 2018-05-28 02:21:43,088 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Downloading Agent Package from https://10.67.79.9:53333/resources/packages/agents/ubuntu-trusty-agent.tar.gz 2018-05-28 02:21:43,088 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating Agent bind_host_eehrez 2018-05-28 02:21:43,089 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Cloudify Agent will be created using the following environment: {'CLOUDIFY_BROKER_IP': '10.67.79.9', 'CLOUDIFY_DAEMON_MAX_WORKERS': '5', 'BROKER_SSL_CERT_PATH': '/home/ubuntu/ellis_host_l9ydgp/cloudify/ssl/cloudify_internal_cert.pem', 'CLOUDIFY_DAEMON_PROCESS_MANAGEMENT': 'init.d', 'LOCAL_REST_CERT_FILE': '/home/ubuntu/ellis_host_l9ydgp/cloudify/ssl/cloudify_internal_cert.pem', 'CLOUDIFY_REST_TENANT': 'default_tenant', 'CLOUDIFY_DAEMON_USER': 'ubuntu', 'REST_PORT': '53333', 'CLOUDIFY_DAEMON_MIN_WORKERS': '0', 'CLOUDIFY_BYPASS_MAINTENANCE_MODE': 'False', 'CLOUDIFY_DAEMON_NAME': 'ellis_host_l9ydgp', 'REST_HOST': '10.67.79.9', 'CLOUDIFY_DAEMON_WORKDIR': '/home/ubuntu/ellis_host_l9ydgp/work', 'CLOUDIFY_DAEMON_QUEUE': 'ellis_host_l9ydgp', 'CLOUDIFY_REST_TOKEN': 'WyIwIiwiMTNjOTk0MTg5Yzk1OWE0MDFkNDNhMzFmN2NkYzQyYzciXQ.Dez4tw.gMHyzhpL3v0mmt2_IbxFkktsKHc'} 2018-05-28 02:21:43,089 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Downloading Agent Package from https://10.67.79.9:53333/resources/packages/agents/ubuntu-trusty-agent.tar.gz 2018-05-28 02:21:43,089 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - fabric_env set by default value 2018-05-28 02:21:43,089 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.create' [retry 1/60] 2018-05-28 02:21:43,089 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - /home/ubuntu/ellis_host_l9ydgp/env/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:334: SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings 2018-05-28 02:21:48,440 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - /home/ubuntu/ellis_host_l9ydgp/env/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:334: SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings 2018-05-28 02:21:48,440 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Untaring Agent package... 
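
Each agent install above boils down to fetching ubuntu-trusty-agent.tar.gz from the manager's REST endpoint (10.67.79.9:53333), validating the TLS connection against the internal certificate that was just uploaded to the host, and unpacking the archive. A rough, self-contained equivalent using the requests and tarfile libraries (URL and certificate path copied from the log; illustrative only, not the cloudify-agent installer itself):

    import tarfile

    import requests  # third-party library, present in the agent virtualenvs shown above

    AGENT_URL = ("https://10.67.79.9:53333/resources/packages/agents/"
                 "ubuntu-trusty-agent.tar.gz")
    CA_CERT = "/home/ubuntu/sprout_host_u75pco/cloudify/ssl/cloudify_internal_cert.pem"

    def fetch_agent_package(url=AGENT_URL, cert=CA_CERT,
                            archive="ubuntu-trusty-agent.tar.gz", dest="agent"):
        # "Downloading Agent Package from ..." - verify against the uploaded internal cert.
        resp = requests.get(url, verify=cert, stream=True)
        resp.raise_for_status()
        with open(archive, "wb") as fh:
            for chunk in resp.iter_content(chunk_size=8192):
                fh.write(chunk)
        # "Untaring Agent package..."
        with tarfile.open(archive, "r:gz") as tar:
            tar.extractall(dest)
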
2018-05-28 02:21:48,441 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully created daemon: ellis_host_l9ydgp 2018-05-28 02:21:48,441 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SNIMissingWarning 2018-05-28 02:21:48,441 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.configure' 2018-05-28 02:21:48,441 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 1/60] 2018-05-28 02:21:48,441 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring Agent 2018-05-28 02:21:48,441 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring... 2018-05-28 02:21:48,441 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully configured cfy-agent 2018-05-28 02:21:48,441 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Auto-correcting virtualenv /home/ubuntu/vellum_host_rw33sl/env 2018-05-28 02:21:48,442 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Disabling requiretty directive in sudoers file 2018-05-28 02:21:48,442 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Cloudify Agent will be created using the following environment: {'CLOUDIFY_BROKER_IP': '10.67.79.9', 'CLOUDIFY_DAEMON_MAX_WORKERS': '5', 'BROKER_SSL_CERT_PATH': '/home/ubuntu/vellum_host_rw33sl/cloudify/ssl/cloudify_internal_cert.pem', 'CLOUDIFY_DAEMON_PROCESS_MANAGEMENT': 'init.d', 'LOCAL_REST_CERT_FILE': '/home/ubuntu/vellum_host_rw33sl/cloudify/ssl/cloudify_internal_cert.pem', 'CLOUDIFY_REST_TENANT': 'default_tenant', 'CLOUDIFY_DAEMON_USER': 'ubuntu', 'REST_PORT': '53333', 'CLOUDIFY_DAEMON_MIN_WORKERS': '0', 'CLOUDIFY_BYPASS_MAINTENANCE_MODE': 'False', 'CLOUDIFY_DAEMON_NAME': 'vellum_host_rw33sl', 'REST_HOST': '10.67.79.9', 'CLOUDIFY_DAEMON_WORKDIR': '/home/ubuntu/vellum_host_rw33sl/work', 'CLOUDIFY_DAEMON_QUEUE': 'vellum_host_rw33sl', 'CLOUDIFY_REST_TOKEN': 'WyIwIiwiMTNjOTk0MTg5Yzk1OWE0MDFkNDNhMzFmN2NkYzQyYzciXQ.Dez4tw.gMHyzhpL3v0mmt2_IbxFkktsKHc'} 2018-05-28 02:21:48,442 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Cloudify Agent will be created using the following environment: {'CLOUDIFY_BROKER_IP': '10.67.79.9', 'CLOUDIFY_DAEMON_MAX_WORKERS': '5', 'BROKER_SSL_CERT_PATH': '/home/ubuntu/bind_host_eehrez/cloudify/ssl/cloudify_internal_cert.pem', 'CLOUDIFY_DAEMON_PROCESS_MANAGEMENT': 'init.d', 'LOCAL_REST_CERT_FILE': '/home/ubuntu/bind_host_eehrez/cloudify/ssl/cloudify_internal_cert.pem', 'CLOUDIFY_REST_TENANT': 'default_tenant', 'CLOUDIFY_DAEMON_USER': 'ubuntu', 'REST_PORT': '53333', 'CLOUDIFY_DAEMON_MIN_WORKERS': '0', 'CLOUDIFY_BYPASS_MAINTENANCE_MODE': 'False', 'CLOUDIFY_DAEMON_NAME': 'bind_host_eehrez', 'REST_HOST': '10.67.79.9', 'CLOUDIFY_DAEMON_WORKDIR': '/home/ubuntu/bind_host_eehrez/work', 'CLOUDIFY_DAEMON_QUEUE': 'bind_host_eehrez', 'CLOUDIFY_REST_TOKEN': 'WyIwIiwiMTNjOTk0MTg5Yzk1OWE0MDFkNDNhMzFmN2NkYzQyYzciXQ.Dez4tw.gMHyzhpL3v0mmt2_IbxFkktsKHc'} 2018-05-28 02:21:48,442 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - /home/ubuntu/vellum_host_rw33sl/env/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:334: SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. You can upgrade to a newer version of Python to solve this. 
For more information, see https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings 2018-05-28 02:21:48,442 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating... 2018-05-28 02:21:48,442 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Auto-correcting virtualenv /home/ubuntu/bind_host_eehrez/env 2018-05-28 02:21:48,442 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully created daemon: vellum_host_rw33sl 2018-05-28 02:21:48,442 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Disabling requiretty directive in sudoers file 2018-05-28 02:21:48,442 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully configured cfy-agent 2018-05-28 02:21:48,443 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring... 2018-05-28 02:21:48,443 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SNIMissingWarning 2018-05-28 02:21:48,443 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.create' 2018-05-28 02:21:48,443 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring Agent 2018-05-28 02:21:48,443 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 1/60] 2018-05-28 02:21:48,443 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.configure' 2018-05-28 02:21:48,443 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully created daemon: bind_host_eehrez 2018-05-28 02:21:48,443 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SNIMissingWarning 2018-05-28 02:21:48,443 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating... 2018-05-28 02:21:48,444 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - /home/ubuntu/bind_host_eehrez/env/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:334: SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. You can upgrade to a newer version of Python to solve this. 
For more information, see https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings 2018-05-28 02:21:48,444 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Server is ACTIVE 2018-05-28 02:21:48,444 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.configure' 2018-05-28 02:21:48,444 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.create' [retry 1/60] 2018-05-28 02:21:48,444 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.configure' 2018-05-28 02:21:48,444 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring Agent 2018-05-28 02:21:48,444 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating Agent 2018-05-28 02:21:48,445 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.start' [retry 1/60] 2018-05-28 02:21:48,445 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.create' 2018-05-28 02:21:48,445 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.configure' 2018-05-28 02:21:48,445 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:48,445 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:48,445 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - system_python set by default value 2018-05-28 02:21:48,445 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:48,445 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:48,445 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:48,446 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:48,446 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:48,446 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - broker_get_settings_from_manager set by default value 2018-05-28 02:21:48,446 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:48,446 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:48,446 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,854 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,855 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,855 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,855 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,855 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,855 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,855 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,855 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - broker_get_settings_from_manager set 
by default value 2018-05-28 02:21:53,856 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,856 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,856 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,856 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,856 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,856 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,856 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,856 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,856 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - env set by default value 2018-05-28 02:21:53,857 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,857 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Validating SSH connection 2018-05-28 02:21:53,857 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Attempting to locate wget on the host machine 2018-05-28 02:21:53,857 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,857 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,857 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating Agent dime_host_rcflbn 2018-05-28 02:21:53,857 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Validating SSH connection 2018-05-28 02:21:53,857 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,857 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Downloading Agent Package from https://10.67.79.9:53333/resources/packages/agents/ubuntu-trusty-agent.tar.gz 2018-05-28 02:21:53,858 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Untaring Agent package... 
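Annotation: the "Downloading Agent Package from https://10.67.79.9:53333/resources/packages/agents/ubuntu-trusty-agent.tar.gz" and "Untaring Agent package..." lines above describe each agent installer fetching the tarball from the manager's resources endpoint and unpacking it on the target VM. The following is only a minimal Python sketch of that download-and-untar step, not the cloudify-agent installer code; the destination paths are assumptions.

# Illustrative sketch of the "Downloading Agent Package" / "Untaring Agent
# package..." steps in the log above; not the real cloudify-agent installer.
import tarfile

import requests


def fetch_agent_package(url, cert_path, dest_tarball, extract_dir):
    """Download the agent tarball over HTTPS and unpack it locally."""
    # Stream the download, validating against the manager's internal CA cert.
    resp = requests.get(url, verify=cert_path, stream=True)
    resp.raise_for_status()
    with open(dest_tarball, 'wb') as out:
        for chunk in resp.iter_content(chunk_size=8192):
            out.write(chunk)
    # Equivalent of "Untaring Agent package...".
    with tarfile.open(dest_tarball, 'r:gz') as tar:
        tar.extractall(extract_dir)


if __name__ == '__main__':
    fetch_agent_package(
        'https://10.67.79.9:53333/resources/packages/agents/'
        'ubuntu-trusty-agent.tar.gz',
        '/home/ubuntu/dime_host_rcflbn/cloudify/ssl/cloudify_internal_cert.pem',
        '/tmp/ubuntu-trusty-agent.tar.gz',      # assumed local paths
        '/home/ubuntu/dime_host_rcflbn')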
2018-05-28 02:21:53,858 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,858 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Uploading SSL certificate from /etc/cloudify/ssl/cloudify_internal_cert.pem to /home/ubuntu/dime_host_rcflbn/cloudify/ssl/cloudify_internal_cert.pem 2018-05-28 02:21:53,858 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating agent from package 2018-05-28 02:21:53,858 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,858 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,858 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,858 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SSH connection is ready 2018-05-28 02:21:53,858 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,859 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,859 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Server is ACTIVE 2018-05-28 02:21:53,859 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SSH connection is ready 2018-05-28 02:21:53,859 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,859 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,859 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,859 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,859 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,859 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring Agent sprout_host_u75pco 2018-05-28 02:21:53,860 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring Agent ellis_host_l9ydgp 2018-05-28 02:21:53,860 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,860 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,860 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:53,860 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.start' [retry 1/60] 2018-05-28 02:21:53,860 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring... 2018-05-28 02:21:59,441 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating start-on-boot entry 2018-05-28 02:21:59,441 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully configured daemon: ellis_host_l9ydgp 2018-05-28 02:21:59,442 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deploying celery configuration. 2018-05-28 02:21:59,442 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring... 2018-05-28 02:21:59,442 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating start-on-boot entry 2018-05-28 02:21:59,442 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deploying celery configuration. 2018-05-28 02:21:59,442 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring... 
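Annotation: "Uploading SSL certificate from /etc/cloudify/ssl/cloudify_internal_cert.pem to /home/ubuntu/dime_host_rcflbn/cloudify/ssl/cloudify_internal_cert.pem", preceded by "Validating SSH connection" / "SSH connection is ready", is a file copy over the agent's SSH session. A rough sketch of that copy using paramiko SFTP is shown below; host, user and key path are placeholders, and this is not the installer's own code.

# Sketch only: copy the internal CA certificate to the agent VM over SFTP.
import os

import paramiko


def upload_cert(host, user, key_file, local_cert, remote_cert):
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(host, username=user, key_filename=key_file)
    sftp = client.open_sftp()
    try:
        # Create the last directory level if it is missing
        # (parent directories are assumed to exist already).
        remote_dir = os.path.dirname(remote_cert)
        try:
            sftp.stat(remote_dir)
        except IOError:
            sftp.mkdir(remote_dir)
        sftp.put(local_cert, remote_cert)
    finally:
        sftp.close()
        client.close()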
2018-05-28 02:21:59,442 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.start' 2018-05-28 02:21:59,442 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.start' 2018-05-28 02:21:59,442 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.configure' 2018-05-28 02:21:59,442 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.create' 2018-05-28 02:21:59,443 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.configure' 2018-05-28 02:21:59,443 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.configure' 2018-05-28 02:21:59,443 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting Agent 2018-05-28 02:21:59,443 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.configure' 2018-05-28 02:21:59,443 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.create' 2018-05-28 02:21:59,443 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating Agent 2018-05-28 02:21:59,443 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting Agent 2018-05-28 02:21:59,443 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:59,443 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Validating SSH connection 2018-05-28 02:21:59,444 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Validating SSH connection 2018-05-28 02:21:59,444 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Disabling requiretty directive in sudoers file 2018-05-28 02:21:59,444 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring... 
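Annotation: the "[retry 1/60]" and "[retry 2/60]" markers attached to 'nova_plugin.server.start' above come from the workflow re-invoking the task until the server reports ACTIVE ("Server is ACTIVE"). The snippet below is a generic polling sketch of that pattern; the callable, interval and error handling are assumptions, not Cloudify's workflow engine.

# Generic retry/poll sketch matching the "[retry n/60]" pattern in the log.
# get_status is a placeholder callable, not a Cloudify or novaclient API.
import time


def wait_until_active(get_status, max_retries=60, interval=10):
    for attempt in range(1, max_retries + 1):
        status = get_status()
        if status == 'ACTIVE':
            return attempt
        if status == 'ERROR':
            raise RuntimeError('server went to ERROR state')
        time.sleep(interval)
    raise RuntimeError('server not ACTIVE after %d retries' % max_retries)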
2018-05-28 02:21:59,444 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:59,444 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully configured cfy-agent 2018-05-28 02:21:59,444 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Auto-correcting virtualenv /home/ubuntu/dime_host_rcflbn/env 2018-05-28 02:21:59,444 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Cloudify Agent will be created using the following environment: {'CLOUDIFY_BROKER_IP': '10.67.79.9', 'CLOUDIFY_DAEMON_MAX_WORKERS': '5', 'BROKER_SSL_CERT_PATH': '/home/ubuntu/dime_host_rcflbn/cloudify/ssl/cloudify_internal_cert.pem', 'CLOUDIFY_DAEMON_PROCESS_MANAGEMENT': 'init.d', 'LOCAL_REST_CERT_FILE': '/home/ubuntu/dime_host_rcflbn/cloudify/ssl/cloudify_internal_cert.pem', 'CLOUDIFY_REST_TENANT': 'default_tenant', 'CLOUDIFY_DAEMON_USER': 'ubuntu', 'REST_PORT': '53333', 'CLOUDIFY_DAEMON_MIN_WORKERS': '0', 'CLOUDIFY_BYPASS_MAINTENANCE_MODE': 'False', 'CLOUDIFY_DAEMON_NAME': 'dime_host_rcflbn', 'REST_HOST': '10.67.79.9', 'CLOUDIFY_DAEMON_WORKDIR': '/home/ubuntu/dime_host_rcflbn/work', 'CLOUDIFY_DAEMON_QUEUE': 'dime_host_rcflbn', 'CLOUDIFY_REST_TOKEN': 'WyIwIiwiMTNjOTk0MTg5Yzk1OWE0MDFkNDNhMzFmN2NkYzQyYzciXQ.Dez4tw.gMHyzhpL3v0mmt2_IbxFkktsKHc'} 2018-05-28 02:21:59,444 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:59,444 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:59,444 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:59,445 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:59,445 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:59,445 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:59,445 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.create' [retry 2/60] 2018-05-28 02:21:59,445 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:59,445 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:59,445 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating... 2018-05-28 02:21:59,445 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:59,445 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - /home/ubuntu/dime_host_rcflbn/env/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:334: SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. You can upgrade to a newer version of Python to solve this. 
For more information, see https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings 2018-05-28 02:21:59,445 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:59,446 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:59,446 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:59,446 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:59,446 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SSH connection is ready 2018-05-28 02:21:59,446 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:59,446 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:59,446 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Validating SSH connection 2018-05-28 02:21:59,446 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:59,446 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:21:59,447 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SNIMissingWarning 2018-05-28 02:22:05,140 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:05,141 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SSH connection is ready 2018-05-28 02:22:05,141 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SSH connection is ready 2018-05-28 02:22:05,141 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SNIMissingWarning 2018-05-28 02:22:05,141 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:05,141 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:05,141 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully created daemon: dime_host_rcflbn 2018-05-28 02:22:05,141 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:05,141 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:05,142 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:05,142 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:05,142 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:05,142 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating... 2018-05-28 02:22:05,142 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.configure' 2018-05-28 02:22:05,142 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring Agent 2018-05-28 02:22:05,142 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.create' 2018-05-28 02:22:05,142 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating start-on-boot entry 2018-05-28 02:22:05,142 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring... 
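Annotation: the repeated SNIMissingWarning entries are urllib3 (here the copy vendored inside requests, as the logged paths show) warning that the agent's Python 2.7 interpreter predates 2.7.9 and cannot send SNI. One commonly documented mitigation, shown purely as a sketch and not as something functest or the agent does here, is to install pyOpenSSL with its helper packages and inject it into urllib3; upgrading to Python >= 2.7.9 or 3.x removes the warning as well.

# Mitigation sketch for SNIMissingWarning on old Python 2.7 interpreters.
# Requires the extra packages pyOpenSSL, cryptography and idna to be
# installed first (e.g. `pip install pyopenssl cryptography idna`).
try:
    # Standalone urllib3 ...
    from urllib3.contrib import pyopenssl
except ImportError:
    # ... or the copy vendored inside requests, as in the paths logged above.
    from requests.packages.urllib3.contrib import pyopenssl

pyopenssl.inject_into_urllib3()  # TLS/SNI handling is now done by PyOpenSSL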
2018-05-28 02:22:05,143 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating start-on-boot entry 2018-05-28 02:22:05,143 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully configured daemon: vellum_host_rw33sl 2018-05-28 02:22:05,143 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully configured daemon: bind_host_eehrez 2018-05-28 02:22:05,143 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deploying celery configuration. 2018-05-28 02:22:05,143 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deploying celery configuration. 2018-05-28 02:22:05,143 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring... 2018-05-28 02:22:05,143 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.start' 2018-05-28 02:22:05,143 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.start' 2018-05-28 02:22:05,143 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.configure' 2018-05-28 02:22:05,144 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting Agent 2018-05-28 02:22:05,144 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:05,144 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:05,144 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Validating SSH connection 2018-05-28 02:22:05,144 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:05,144 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:05,144 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - fabric_env set by default value 2018-05-28 02:22:05,144 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:05,144 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:05,145 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:05,145 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:05,145 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:05,145 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.start' 2018-05-28 02:22:05,145 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting Agent 2018-05-28 02:22:05,145 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.configure' 2018-05-28 02:22:05,145 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.start' 2018-05-28 02:22:05,145 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SSH connection is ready 2018-05-28 02:22:05,145 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:05,146 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:05,146 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:05,146 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:05,146 - 
functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Validating SSH connection 2018-05-28 02:22:05,146 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,242 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,243 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Validating SSH connection 2018-05-28 02:22:11,243 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,243 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,243 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,243 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,243 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,243 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,243 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Validating SSH connection 2018-05-28 02:22:11,244 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,244 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,244 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting Agent sprout_host_u75pco 2018-05-28 02:22:11,244 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,244 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,244 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,244 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,244 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SSH connection is ready 2018-05-28 02:22:11,244 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SSH connection is ready 2018-05-28 02:22:11,245 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,245 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,245 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,245 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - env set by default value 2018-05-28 02:22:11,245 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,245 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,245 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,245 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - broker_get_settings_from_manager set by default value 2018-05-28 02:22:11,245 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,245 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,246 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,246 - 
functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,246 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - system_python set by default value 2018-05-28 02:22:11,246 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,246 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,246 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,246 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,246 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,246 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,246 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,247 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,247 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,247 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,247 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,247 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,247 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,247 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:11,247 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - disable_requiretty set by default value 2018-05-28 02:22:11,247 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating agent from package 2018-05-28 02:22:11,248 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Downloading Agent Package from https://10.67.79.9:53333/resources/packages/agents/ubuntu-trusty-agent.tar.gz 2018-05-28 02:22:11,248 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Attempting to locate wget on the host machine 2018-05-28 02:22:11,248 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting... 2018-05-28 02:22:17,420 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating Agent homer_host_mjmnui 2018-05-28 02:22:17,420 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Attempting to locate wget on the host machine 2018-05-28 02:22:17,420 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Downloading Agent Package from https://10.67.79.9:53333/resources/packages/agents/ubuntu-trusty-agent.tar.gz 2018-05-28 02:22:17,420 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Untaring Agent package... 2018-05-28 02:22:17,421 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting... 
2018-05-28 02:22:17,421 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.start' 2018-05-28 02:22:17,421 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting daemon with command: sudo service celeryd-sprout_host_u75pco start 2018-05-28 02:22:17,421 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SSH connection is ready 2018-05-28 02:22:17,421 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully started daemon: sprout_host_u75pco 2018-05-28 02:22:17,421 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting... 2018-05-28 02:22:17,421 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugins 2018-05-28 02:22:17,421 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.start' 2018-05-28 02:22:17,421 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.configure' 2018-05-28 02:22:17,422 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.operations.install_plugins' 2018-05-28 02:22:17,422 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.operations.install_plugins' 2018-05-28 02:22:17,422 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.start' 2018-05-28 02:22:17,422 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugins 2018-05-28 02:22:17,422 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.operations.install_plugins' 2018-05-28 02:22:17,422 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.operations.install_plugins' 2018-05-28 02:22:17,422 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - disable_requiretty set by default value 2018-05-28 02:22:17,422 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:17,422 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:17,423 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - system_python set by default value 2018-05-28 02:22:17,423 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:17,423 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - env set by default value 2018-05-28 02:22:17,423 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - [localhost] run: /home/ubuntu/ellis_host_l9ydgp/env/bin/pip freeze 2018-05-28 02:22:17,423 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:17,423 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:17,423 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:17,423 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - broker_get_settings_from_manager set by default value 2018-05-28 02:22:17,424 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:17,424 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:17,424 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:17,424 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:17,424 - 
functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:17,424 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:17,424 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugin: diamond 2018-05-28 02:22:17,424 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:17,424 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:17,425 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:17,425 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:17,425 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:17,425 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:17,425 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:17,425 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:17,425 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:17,425 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:17,425 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:17,425 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:17,426 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugin from source 2018-05-28 02:22:23,182 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Attempting to locate wget on the host machine 2018-05-28 02:22:23,182 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:23,182 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:23,182 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugin diamond [current_platform=linux_x86_64, current_distro=ubuntu, current_distro_release=trusty] 2018-05-28 02:22:23,182 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Validating SSH connection 2018-05-28 02:22:23,183 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugin from source 2018-05-28 02:22:23,183 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:23,183 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:23,183 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:23,183 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:23,183 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Untaring Agent package... 
2018-05-28 02:22:23,183 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Extracting archive: https://github.com/cloudify-cosmo/cloudify-diamond-plugin/archive/1.3.5.zip 2018-05-28 02:22:23,183 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - [localhost] run: /home/ubuntu/sprout_host_u75pco/env/bin/pip freeze 2018-05-28 02:22:23,183 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating agent from package 2018-05-28 02:22:23,184 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating Agent bono_host_0py57x 2018-05-28 02:22:23,184 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Downloading Agent Package from https://10.67.79.9:53333/resources/packages/agents/ubuntu-trusty-agent.tar.gz 2018-05-28 02:22:23,184 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Uploading SSL certificate from /etc/cloudify/ssl/cloudify_internal_cert.pem to /home/ubuntu/bono_host_0py57x/cloudify/ssl/cloudify_internal_cert.pem 2018-05-28 02:22:23,184 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:23,184 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugin: diamond 2018-05-28 02:22:23,184 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:23,184 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.operations.install_plugins' 2018-05-28 02:22:23,184 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.install' 2018-05-28 02:22:23,184 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.install' 2018-05-28 02:22:23,185 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting Agent vellum_host_rw33sl 2018-05-28 02:22:23,185 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:23,185 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Removing directory: /tmp/tmppE1ctA 2018-05-28 02:22:23,185 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Disabling requiretty directive in sudoers file 2018-05-28 02:22:23,185 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:23,185 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SSH connection is ready 2018-05-28 02:22:23,185 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Auto-correcting virtualenv /home/ubuntu/homer_host_mjmnui/env 2018-05-28 02:22:23,185 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:23,185 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:23,185 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:23,186 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:23,186 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring Agent dime_host_rcflbn 2018-05-28 02:22:23,186 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:23,186 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully configured cfy-agent 2018-05-28 02:22:23,186 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Skipping source plugin cloudify-diamond-plugin installation, as the plugin is already installed in the agent virtualenv. 
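Annotation: "Installing plugin from source", "Extracting archive: https://github.com/cloudify-cosmo/cloudify-diamond-plugin/archive/1.3.5.zip", the repeated "pip freeze" runs inside each agent virtualenv and the "Skipping source plugin cloudify-diamond-plugin installation, as the plugin is already installed in the agent virtualenv." messages describe a check-then-install flow. The sketch below mirrors that flow under the assumption that the virtualenv's pip binary is passed in; it is not the cloudify-agent plugin installer itself.

# Rough sketch of the check-then-install flow for a source plugin inside an
# agent virtualenv; venv_pip and the archive URL mirror values in the log.
import subprocess

PLUGIN_NAME = 'cloudify-diamond-plugin'
ARCHIVE = ('https://github.com/cloudify-cosmo/cloudify-diamond-plugin/'
           'archive/1.3.5.zip')


def ensure_source_plugin(venv_pip):
    """Install the plugin from its source archive unless already present."""
    frozen = subprocess.check_output([venv_pip, 'freeze'])
    if PLUGIN_NAME in frozen.decode('utf-8', 'replace'):
        print('Skipping %s: already installed in the agent virtualenv'
              % PLUGIN_NAME)
        return
    subprocess.check_call([venv_pip, 'install', ARCHIVE])


if __name__ == '__main__':
    ensure_source_plugin('/home/ubuntu/sprout_host_u75pco/env/bin/pip')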
2018-05-28 02:22:23,186 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SSH connection is ready 2018-05-28 02:22:23,186 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring... 2018-05-28 02:22:23,186 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:23,186 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:23,187 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:23,187 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Cloudify Agent will be created using the following environment: {'CLOUDIFY_BROKER_IP': '10.67.79.9', 'CLOUDIFY_DAEMON_MAX_WORKERS': '5', 'BROKER_SSL_CERT_PATH': '/home/ubuntu/homer_host_mjmnui/cloudify/ssl/cloudify_internal_cert.pem', 'CLOUDIFY_DAEMON_PROCESS_MANAGEMENT': 'init.d', 'LOCAL_REST_CERT_FILE': '/home/ubuntu/homer_host_mjmnui/cloudify/ssl/cloudify_internal_cert.pem', 'CLOUDIFY_REST_TENANT': 'default_tenant', 'CLOUDIFY_DAEMON_USER': 'ubuntu', 'REST_PORT': '53333', 'CLOUDIFY_DAEMON_MIN_WORKERS': '0', 'CLOUDIFY_BYPASS_MAINTENANCE_MODE': 'False', 'CLOUDIFY_DAEMON_NAME': 'homer_host_mjmnui', 'REST_HOST': '10.67.79.9', 'CLOUDIFY_DAEMON_WORKDIR': '/home/ubuntu/homer_host_mjmnui/work', 'CLOUDIFY_DAEMON_QUEUE': 'homer_host_mjmnui', 'CLOUDIFY_REST_TOKEN': 'WyIwIiwiMTNjOTk0MTg5Yzk1OWE0MDFkNDNhMzFmN2NkYzQyYzciXQ.Dez4tw.gMHyzhpL3v0mmt2_IbxFkktsKHc'} 2018-05-28 02:22:23,187 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.configure' 2018-05-28 02:22:23,187 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deploying celery configuration. 2018-05-28 02:22:23,187 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring... 2018-05-28 02:22:23,187 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating... 2018-05-28 02:22:23,187 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugin from source 2018-05-28 02:22:23,187 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Extracting archive: https://github.com/cloudify-cosmo/cloudify-diamond-plugin/archive/1.3.5.zip 2018-05-28 02:22:28,666 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring... 2018-05-28 02:22:28,667 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - /home/ubuntu/homer_host_mjmnui/env/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:334: SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings 2018-05-28 02:22:28,667 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deploying celery configuration. 
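Annotation: the "Cloudify Agent will be created using the following environment" dictionaries above (REST_HOST, REST_PORT, CLOUDIFY_DAEMON_NAME, CLOUDIFY_DAEMON_QUEUE, BROKER_SSL_CERT_PATH, ...) are exported into the process that creates each daemon. Below is a small sketch of merging such a mapping into a child process environment; the command is a placeholder, not the real cfy-agent invocation, and only values visible in the log are reused.

# Sketch: run a child command with the agent environment from the log merged
# into os.environ. The ['env'] command is a placeholder only.
import os
import subprocess

AGENT_ENV = {
    'REST_HOST': '10.67.79.9',
    'REST_PORT': '53333',
    'CLOUDIFY_DAEMON_NAME': 'homer_host_mjmnui',
    'CLOUDIFY_DAEMON_QUEUE': 'homer_host_mjmnui',
    'CLOUDIFY_DAEMON_USER': 'ubuntu',
    'CLOUDIFY_DAEMON_PROCESS_MANAGEMENT': 'init.d',
    'LOCAL_REST_CERT_FILE': '/home/ubuntu/homer_host_mjmnui/cloudify/ssl/'
                            'cloudify_internal_cert.pem',
}


def run_with_agent_env(cmd, extra_env):
    env = os.environ.copy()
    env.update(extra_env)
    subprocess.check_call(cmd, env=env)


if __name__ == '__main__':
    run_with_agent_env(['env'], AGENT_ENV)  # placeholder command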
2018-05-28 02:22:28,667 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugin from source 2018-05-28 02:22:28,667 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating start-on-boot entry 2018-05-28 02:22:28,667 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully created daemon: homer_host_mjmnui 2018-05-28 02:22:28,667 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring Agent 2018-05-28 02:22:28,667 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.configure' 2018-05-28 02:22:28,667 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.create' 2018-05-28 02:22:28,668 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting Agent 2018-05-28 02:22:28,668 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.start' 2018-05-28 02:22:28,668 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.start' 2018-05-28 02:22:28,668 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.start' 2018-05-28 02:22:28,668 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Skipping source plugin cloudify-diamond-plugin installation, as the plugin is already installed in the agent virtualenv. 2018-05-28 02:22:28,668 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Removing directory: /tmp/tmpZ7rnsN 2018-05-28 02:22:28,668 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.operations.install_plugins' 2018-05-28 02:22:28,668 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring... 2018-05-28 02:22:28,668 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.install' 2018-05-28 02:22:28,669 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.install' 2018-05-28 02:22:28,669 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:28,669 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:28,669 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Cloudify Agent will be created using the following environment: {'CLOUDIFY_BROKER_IP': '10.67.79.9', 'CLOUDIFY_DAEMON_MAX_WORKERS': '5', 'BROKER_SSL_CERT_PATH': '/home/ubuntu/bono_host_0py57x/cloudify/ssl/cloudify_internal_cert.pem', 'CLOUDIFY_DAEMON_PROCESS_MANAGEMENT': 'init.d', 'LOCAL_REST_CERT_FILE': '/home/ubuntu/bono_host_0py57x/cloudify/ssl/cloudify_internal_cert.pem', 'CLOUDIFY_REST_TENANT': 'default_tenant', 'CLOUDIFY_DAEMON_USER': 'ubuntu', 'REST_PORT': '53333', 'CLOUDIFY_DAEMON_MIN_WORKERS': '0', 'CLOUDIFY_BYPASS_MAINTENANCE_MODE': 'False', 'CLOUDIFY_DAEMON_NAME': 'bono_host_0py57x', 'REST_HOST': '10.67.79.9', 'CLOUDIFY_DAEMON_WORKDIR': '/home/ubuntu/bono_host_0py57x/work', 'CLOUDIFY_DAEMON_QUEUE': 'bono_host_0py57x', 'CLOUDIFY_REST_TOKEN': 'WyIwIiwiMTNjOTk0MTg5Yzk1OWE0MDFkNDNhMzFmN2NkYzQyYzciXQ.Dez4tw.gMHyzhpL3v0mmt2_IbxFkktsKHc'} 2018-05-28 02:22:28,669 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully configured cfy-agent 2018-05-28 02:22:28,669 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:28,669 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:28,669 - 
functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:28,669 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Disabling requiretty directive in sudoers file 2018-05-28 02:22:28,670 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:28,670 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:28,670 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Validating SSH connection 2018-05-28 02:22:28,670 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Validating SSH connection 2018-05-28 02:22:28,670 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Auto-correcting virtualenv /home/ubuntu/bono_host_0py57x/env 2018-05-28 02:22:28,670 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:28,670 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.create' 2018-05-28 02:22:28,670 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:28,671 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating... 2018-05-28 02:22:28,671 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:28,671 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully created daemon: bono_host_0py57x 2018-05-28 02:22:28,671 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully started daemon: vellum_host_rw33sl 2018-05-28 02:22:28,671 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:28,671 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:28,671 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:28,671 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting Agent dime_host_rcflbn 2018-05-28 02:22:28,671 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:28,671 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - /home/ubuntu/bono_host_0py57x/env/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:334: SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. You can upgrade to a newer version of Python to solve this. 
For more information, see https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings 2018-05-28 02:22:28,672 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting Agent bind_host_eehrez 2018-05-28 02:22:28,672 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:28,672 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:28,672 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SSH connection is ready 2018-05-28 02:22:28,672 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SSH connection is ready 2018-05-28 02:22:34,897 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:34,897 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:34,897 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:34,897 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating... 2018-05-28 02:22:34,898 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully started daemon: vellum_host_rw33sl 2018-05-28 02:22:34,898 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.install' 2018-05-28 02:22:34,898 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring Agent 2018-05-28 02:22:34,898 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.start' 2018-05-28 02:22:34,898 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.start' 2018-05-28 02:22:34,898 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.configure' 2018-05-28 02:22:34,898 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.configure' 2018-05-28 02:22:34,898 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.start' 2018-05-28 02:22:34,898 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.operations.install_plugins' 2018-05-28 02:22:34,899 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugins 2018-05-28 02:22:34,899 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.configure' 2018-05-28 02:22:34,899 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.operations.install_plugins' 2018-05-28 02:22:34,899 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:34,899 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:34,899 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugin: diamond 2018-05-28 02:22:34,899 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:34,899 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Validating SSH connection 2018-05-28 02:22:34,900 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:34,900 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:34,900 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:34,900 - 
functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:34,900 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:34,900 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SSH connection is ready 2018-05-28 02:22:34,900 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:34,900 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:34,900 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - [localhost] run: /home/ubuntu/vellum_host_rw33sl/env/bin/pip freeze 2018-05-28 02:22:34,901 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:34,901 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring Agent homer_host_mjmnui 2018-05-28 02:22:34,901 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:34,901 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.start' 2018-05-28 02:22:34,901 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Validating SSH connection 2018-05-28 02:22:34,901 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:34,901 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deploying celery configuration. 2018-05-28 02:22:34,901 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:34,901 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SSH connection is ready 2018-05-28 02:22:34,901 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:34,902 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully configured daemon: homer_host_mjmnui 2018-05-28 02:22:34,902 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:34,902 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:34,902 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating start-on-boot entry 2018-05-28 02:22:34,902 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:34,902 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring Agent bono_host_0py57x 2018-05-28 02:22:34,902 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring... 2018-05-28 02:22:34,902 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.configure' 2018-05-28 02:22:34,902 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.add_collectors' 2018-05-28 02:22:34,903 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting Agent 2018-05-28 02:22:41,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting Agent 2018-05-28 02:22:41,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.add_collectors' 2018-05-28 02:22:41,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deploying celery configuration. 2018-05-28 02:22:41,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring... 
2018-05-28 02:22:41,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully configured daemon: bono_host_0py57x 2018-05-28 02:22:41,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating start-on-boot entry 2018-05-28 02:22:41,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.start' 2018-05-28 02:22:41,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting daemon with command: sudo service celeryd-dime_host_rcflbn start 2018-05-28 02:22:41,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting... 2018-05-28 02:22:41,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:41,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:41,694 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:41,694 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Validating SSH connection 2018-05-28 02:22:41,694 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully started daemon: dime_host_rcflbn 2018-05-28 02:22:41,694 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:41,694 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.start' 2018-05-28 02:22:41,694 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting Agent 2018-05-28 02:22:41,694 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.start' 2018-05-28 02:22:41,694 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.configure' 2018-05-28 02:22:41,694 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SSH connection is ready 2018-05-28 02:22:41,695 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:41,695 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:41,695 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting Agent homer_host_mjmnui 2018-05-28 02:22:41,695 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:41,695 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:41,695 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:41,695 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.operations.install_plugins' 2018-05-28 02:22:41,695 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.operations.install_plugins' 2018-05-28 02:22:41,695 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.start' 2018-05-28 02:22:41,696 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugins 2018-05-28 02:22:41,696 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:41,696 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Validating SSH connection 2018-05-28 02:22:41,696 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugin: diamond 2018-05-28 02:22:41,696 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 
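Annotation: "Creating start-on-boot entry" followed by "Starting daemon with command: sudo service celeryd-dime_host_rcflbn start" and "Successfully started daemon" shows each agent being run as a per-host init.d celery service. A hedged sketch of starting and checking such a service is below; the service name comes from the log, while the status check is an assumption and not something the installer is shown doing here.

# Sketch: start the per-agent celery init.d service and confirm it is up.
import subprocess


def start_agent_daemon(name):
    service = 'celeryd-%s' % name
    subprocess.check_call(['sudo', 'service', service, 'start'])
    # Typical init scripts exit non-zero from `status` if the daemon is down
    # (assumption for illustration).
    subprocess.check_call(['sudo', 'service', service, 'status'])


if __name__ == '__main__':
    start_agent_daemon('dime_host_rcflbn')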
2018-05-28 02:22:41,696 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - [localhost] run: /home/ubuntu/dime_host_rcflbn/env/bin/pip freeze 2018-05-28 02:22:41,696 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:41,696 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:41,697 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.add_collectors' 2018-05-28 02:22:41,697 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.connect_security_group' 2018-05-28 02:22:41,697 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.connect_security_group' 2018-05-28 02:22:41,697 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:41,697 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SSH connection is ready 2018-05-28 02:22:41,697 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugin diamond [current_platform=linux_x86_64, current_distro=ubuntu, current_distro_release=trusty] 2018-05-28 02:22:41,697 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugin from source 2018-05-28 02:22:41,697 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:41,697 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Extracting archive: https://github.com/cloudify-cosmo/cloudify-diamond-plugin/archive/1.3.5.zip 2018-05-28 02:22:41,698 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:41,698 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:41,698 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:22:41,698 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting Agent bono_host_0py57x 2018-05-28 02:22:47,974 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.start' 2018-05-28 02:22:47,975 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.start' 2018-05-28 02:22:47,975 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.install' 2018-05-28 02:22:47,975 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Skipping source plugin cloudify-diamond-plugin installation, as the plugin is already installed in the agent virtualenv. 
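Annotation: the 'nova_plugin.server.connect_security_group' tasks above attach a security group to each booted server before the floating IP is connected. The sketch below shows the same operation done directly with python-novaclient, purely as an illustration; it is not the nova_plugin code driving these log lines, and every credential, server and group name is a placeholder.

# Illustrative sketch: attach a security group to a server with
# python-novaclient. All values below are placeholders.
from keystoneauth1 import loading, session
from novaclient import client

loader = loading.get_plugin_loader('password')
auth = loader.load_from_options(
    auth_url='http://keystone.example:5000/v3',   # placeholder endpoint
    username='admin', password='***',             # placeholder credentials
    project_name='admin',
    user_domain_name='Default', project_domain_name='Default')
nova = client.Client('2', session=session.Session(auth=auth))

server = nova.servers.find(name='server_bono_host')        # hypothetical name
nova.servers.add_security_group(server, 'ims_secgroup')    # hypothetical group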
2018-05-28 02:22:47,975 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Removing directory: /tmp/tmpZcXhl9 2018-05-28 02:22:47,975 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.operations.install_plugins' 2018-05-28 02:22:47,975 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.install' 2018-05-28 02:22:47,975 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.install' 2018-05-28 02:22:47,975 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.start' 2018-05-28 02:22:47,976 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.add_collectors' 2018-05-28 02:22:47,976 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.add_collectors' 2018-05-28 02:22:47,976 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting... 2018-05-28 02:22:47,976 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully started daemon: homer_host_mjmnui 2018-05-28 02:22:47,976 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting daemon with command: sudo service celeryd-homer_host_mjmnui start 2018-05-28 02:22:47,976 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.start' 2018-05-28 02:22:47,976 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugins 2018-05-28 02:22:47,976 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting daemon with command: sudo service celeryd-bono_host_0py57x start 2018-05-28 02:22:47,976 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugin diamond [current_platform=linux_x86_64, current_distro=ubuntu, current_distro_release=trusty] 2018-05-28 02:22:47,977 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting... 2018-05-28 02:22:47,977 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully started daemon: bono_host_0py57x 2018-05-28 02:22:47,977 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting daemon with command: sudo service celeryd-bind_host_eehrez start 2018-05-28 02:22:47,977 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting... 
2018-05-28 02:22:47,977 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugin from source 2018-05-28 02:22:47,977 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully started daemon: bind_host_eehrez 2018-05-28 02:22:47,977 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Extracting archive: https://github.com/cloudify-cosmo/cloudify-diamond-plugin/archive/1.3.5.zip 2018-05-28 02:22:47,977 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugins 2018-05-28 02:22:47,977 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.operations.install_plugins' 2018-05-28 02:22:47,978 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.start' 2018-05-28 02:22:47,978 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.operations.install_plugins' 2018-05-28 02:22:47,978 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.start' 2018-05-28 02:22:47,978 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.operations.install_plugins' 2018-05-28 02:22:47,978 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.operations.install_plugins' 2018-05-28 02:22:47,978 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugin: diamond 2018-05-28 02:22:47,978 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.operations.install_plugins' 2018-05-28 02:22:47,978 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.operations.install_plugins' 2018-05-28 02:22:47,978 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugins 2018-05-28 02:22:47,978 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - [localhost] run: /home/ubuntu/homer_host_mjmnui/env/bin/pip freeze 2018-05-28 02:22:47,979 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugin from source 2018-05-28 02:22:47,979 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugin: diamond 2018-05-28 02:22:47,979 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Extracting archive: https://github.com/cloudify-cosmo/cloudify-diamond-plugin/archive/1.3.5.zip 2018-05-28 02:22:47,979 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugin diamond [current_platform=linux_x86_64, current_distro=ubuntu, current_distro_release=trusty] 2018-05-28 02:22:47,979 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - [localhost] run: /home/ubuntu/bind_host_eehrez/env/bin/pip freeze 2018-05-28 02:22:47,979 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.connect_security_group' 2018-05-28 02:22:47,979 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.install' 2018-05-28 02:22:47,979 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.start' 2018-05-28 02:22:47,979 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.start' 2018-05-28 02:22:47,979 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.connect_security_group' 2018-05-28 02:22:47,980 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugin: diamond 2018-05-28 02:22:47,980 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - [localhost] run: /home/ubuntu/bono_host_0py57x/env/bin/pip freeze 2018-05-28 
02:22:47,980 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.connect_security_group' 2018-05-28 02:22:53,591 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugin from source 2018-05-28 02:22:53,591 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugin diamond [current_platform=linux_x86_64, current_distro=ubuntu, current_distro_release=trusty] 2018-05-28 02:22:53,591 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Extracting archive: https://github.com/cloudify-cosmo/cloudify-diamond-plugin/archive/1.3.5.zip 2018-05-28 02:22:53,591 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.operations.install_plugins' 2018-05-28 02:22:53,591 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Removing directory: /tmp/tmpA2cWOO 2018-05-28 02:22:53,591 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Skipping source plugin cloudify-diamond-plugin installation, as the plugin is already installed in the agent virtualenv. 2018-05-28 02:22:53,591 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Skipping source plugin cloudify-diamond-plugin installation, as the plugin is already installed in the agent virtualenv. 2018-05-28 02:22:53,592 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.install' 2018-05-28 02:22:53,592 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.install' 2018-05-28 02:22:53,592 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Removing directory: /tmp/tmpCyQO4X 2018-05-28 02:22:53,592 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.install' 2018-05-28 02:22:53,592 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.operations.install_plugins' 2018-05-28 02:22:53,592 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.install' 2018-05-28 02:22:53,592 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.install' 2018-05-28 02:22:53,592 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.add_collectors' 2018-05-28 02:22:53,592 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.start' 2018-05-28 02:22:53,592 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.start' 2018-05-28 02:22:53,593 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.start' 2018-05-28 02:22:53,593 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.add_collectors' 2018-05-28 02:22:53,593 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.add_collectors' 2018-05-28 02:22:53,593 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.add_collectors' 2018-05-28 02:22:53,593 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.start' 2018-05-28 02:22:53,593 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.connect_security_group' 2018-05-28 02:22:53,593 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.connect_floatingip' 2018-05-28 02:22:53,593 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.connect_security_group' 2018-05-28 02:22:53,593 - 
functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.connect_floatingip' 2018-05-28 02:22:53,593 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.connect_security_group' 2018-05-28 02:22:53,594 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.add_collectors' 2018-05-28 02:22:53,594 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.add_collectors' 2018-05-28 02:22:53,594 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.connect_security_group' 2018-05-28 02:22:53,594 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.connect_security_group' 2018-05-28 02:22:53,594 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.install' 2018-05-28 02:22:53,594 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.start' 2018-05-28 02:22:53,594 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.start' 2018-05-28 02:22:53,594 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.start' 2018-05-28 02:22:53,594 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.add_collectors' 2018-05-28 02:22:53,594 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.add_collectors' 2018-05-28 02:22:53,595 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.connect_floatingip' 2018-05-28 02:22:53,595 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.connect_security_group' 2018-05-28 02:22:53,595 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.connect_security_group' 2018-05-28 02:22:53,595 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.connect_security_group' 2018-05-28 02:22:53,595 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.connect_security_group' 2018-05-28 02:22:53,595 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.connect_security_group' 2018-05-28 02:22:53,595 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'nova_plugin.server.start' -> Request to https://172.30.10.101:5000/v3/auth/tokens timed out 2018-05-28 02:22:58,998 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.add_collectors' 2018-05-28 02:22:58,998 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.connect_security_group' 2018-05-28 02:22:58,998 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.connect_security_group' 2018-05-28 02:23:04,460 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.connect_security_group' 2018-05-28 02:23:04,460 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.connect_security_group' 2018-05-28 02:23:04,461 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.connect_security_group' 2018-05-28 02:23:04,461 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.connect_security_group' 2018-05-28 02:23:04,461 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 
'nova_plugin.server.connect_security_group' 2018-05-28 02:23:04,461 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.connect_security_group' 2018-05-28 02:23:04,461 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.connect_security_group' 2018-05-28 02:23:10,075 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 1/60] 2018-05-28 02:23:10,075 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 1/60] 2018-05-28 02:23:10,076 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.connect_security_group' 2018-05-28 02:23:10,076 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.connect_floatingip' 2018-05-28 02:23:10,076 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.connect_floatingip' 2018-05-28 02:23:15,423 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Server is ACTIVE 2018-05-28 02:23:15,423 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.start' [retry 1/60] 2018-05-28 02:23:15,423 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.create' 2018-05-28 02:23:15,423 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.create' 2018-05-28 02:23:15,423 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating Agent 2018-05-28 02:23:15,423 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:15,424 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:15,424 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - fabric_env set by default value 2018-05-28 02:23:15,424 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:15,424 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:15,424 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:15,424 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:15,424 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:15,424 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:15,424 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:15,425 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Validating SSH connection 2018-05-28 02:23:20,778 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.connect_floatingip' 2018-05-28 02:23:26,219 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SSH connection is ready 2018-05-28 02:23:26,219 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:26,220 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating agent from package 2018-05-28 02:23:26,220 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:26,220 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Downloading Agent Package from 
https://10.67.79.9:53333/resources/packages/agents/ubuntu-trusty-agent.tar.gz 2018-05-28 02:23:26,220 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:26,220 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating Agent proxy_host_r84dzn 2018-05-28 02:23:26,220 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:26,220 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:26,220 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:26,220 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - env set by default value 2018-05-28 02:23:26,220 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:26,221 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:26,221 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:26,221 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:26,221 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:26,221 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:26,221 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:26,221 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:26,221 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:26,221 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:26,221 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Uploading SSL certificate from /etc/cloudify/ssl/cloudify_internal_cert.pem to /home/ubuntu/proxy_host_r84dzn/cloudify/ssl/cloudify_internal_cert.pem 2018-05-28 02:23:26,221 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:26,222 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:26,222 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:26,222 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:26,222 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - disable_requiretty set by default value 2018-05-28 02:23:26,222 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:26,222 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - system_python set by default value 2018-05-28 02:23:26,222 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:26,222 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:26,222 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:26,222 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - broker_get_settings_from_manager set by default value 2018-05-28 02:23:26,223 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Attempting to locate wget on the host 
machine 2018-05-28 02:23:26,223 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Untaring Agent package... 2018-05-28 02:23:31,559 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring... 2018-05-28 02:23:31,559 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Cloudify Agent will be created using the following environment: {'CLOUDIFY_BROKER_IP': '10.67.79.9', 'CLOUDIFY_DAEMON_MAX_WORKERS': '5', 'BROKER_SSL_CERT_PATH': '/home/ubuntu/proxy_host_r84dzn/cloudify/ssl/cloudify_internal_cert.pem', 'CLOUDIFY_DAEMON_PROCESS_MANAGEMENT': 'init.d', 'LOCAL_REST_CERT_FILE': '/home/ubuntu/proxy_host_r84dzn/cloudify/ssl/cloudify_internal_cert.pem', 'CLOUDIFY_REST_TENANT': 'default_tenant', 'CLOUDIFY_DAEMON_USER': 'ubuntu', 'REST_PORT': '53333', 'CLOUDIFY_DAEMON_MIN_WORKERS': '0', 'CLOUDIFY_BYPASS_MAINTENANCE_MODE': 'False', 'CLOUDIFY_DAEMON_NAME': 'proxy_host_r84dzn', 'REST_HOST': '10.67.79.9', 'CLOUDIFY_DAEMON_WORKDIR': '/home/ubuntu/proxy_host_r84dzn/work', 'CLOUDIFY_DAEMON_QUEUE': 'proxy_host_r84dzn', 'CLOUDIFY_REST_TOKEN': 'WyIwIiwiMTNjOTk0MTg5Yzk1OWE0MDFkNDNhMzFmN2NkYzQyYzciXQ.Dez4tw.gMHyzhpL3v0mmt2_IbxFkktsKHc'} 2018-05-28 02:23:31,559 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully configured cfy-agent 2018-05-28 02:23:31,560 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Auto-correcting virtualenv /home/ubuntu/proxy_host_r84dzn/env 2018-05-28 02:23:31,560 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Disabling requiretty directive in sudoers file 2018-05-28 02:23:31,560 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.create' 2018-05-28 02:23:31,560 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SNIMissingWarning 2018-05-28 02:23:31,560 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating... 2018-05-28 02:23:31,560 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - /home/ubuntu/proxy_host_r84dzn/env/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:334: SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. You can upgrade to a newer version of Python to solve this. 
For more information, see https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings 2018-05-28 02:23:31,560 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully created daemon: proxy_host_r84dzn 2018-05-28 02:23:31,560 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.configure' 2018-05-28 02:23:31,560 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring Agent 2018-05-28 02:23:31,561 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.configure' 2018-05-28 02:23:31,561 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:31,561 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:31,561 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:31,561 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Validating SSH connection 2018-05-28 02:23:31,561 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:36,913 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:36,914 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring Agent proxy_host_r84dzn 2018-05-28 02:23:36,914 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SSH connection is ready 2018-05-28 02:23:36,914 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:36,914 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:36,914 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:36,914 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:36,914 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.configure' 2018-05-28 02:23:36,914 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deploying celery configuration. 2018-05-28 02:23:36,914 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring... 
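The SNIMissingWarning quoted above comes from the agent's Python 2.7 virtualenv, whose bundled urllib3 cannot send the TLS SNI extension. A common, general remedy (not something this Functest run applies) is to let urllib3 use pyOpenSSL; a minimal sketch, assuming pyOpenSSL, ndg-httpsclient and pyasn1 are installed in the same virtualenv:

    # Silence the SNIMissingWarning seen above on Python 2.7 by injecting pyOpenSSL
    # into urllib3 (assumes pyOpenSSL, ndg-httpsclient and pyasn1 are installed).
    import requests

    try:
        # vendored copy shipped with older requests releases, as in the agent env
        from requests.packages.urllib3.contrib import pyopenssl
    except ImportError:
        from urllib3.contrib import pyopenssl      # standalone urllib3

    pyopenssl.inject_into_urllib3()

    # After the injection, HTTPS requests include SNI and hostname verification works.
    print(requests.get('https://example.org', timeout=10).status_code)
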
2018-05-28 02:23:36,915 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating start-on-boot entry 2018-05-28 02:23:36,915 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully configured daemon: proxy_host_r84dzn 2018-05-28 02:23:36,915 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.start' 2018-05-28 02:23:36,915 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.start' 2018-05-28 02:23:36,915 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting Agent 2018-05-28 02:23:36,915 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:36,915 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:36,915 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:36,916 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Validating SSH connection 2018-05-28 02:23:36,916 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:36,916 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SSH connection is ready 2018-05-28 02:23:36,916 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:36,916 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:36,916 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:36,916 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:36,916 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:23:36,916 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting Agent proxy_host_r84dzn 2018-05-28 02:23:42,264 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting... 
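The "Sending task"/"Task started"/"Task succeeded" lines are the manager-side event stream for the install workflow. A minimal sketch of following the same stream with cloudify-rest-client, assuming a Cloudify 4.x manager; the address and credentials are placeholders, only the deployment id comes from this run, and the event field names are assumptions that vary between manager versions:

    # Follow a workflow execution on a Cloudify 4.x manager with cloudify-rest-client.
    # Manager address and credentials are placeholders; 'clearwater-opnfv' is the
    # deployment id used by this test.
    from cloudify_rest_client import CloudifyClient

    client = CloudifyClient(host='192.0.2.20',      # hypothetical manager address
                            username='admin',
                            password='admin',
                            tenant='default_tenant')

    execution = client.executions.list(deployment_id='clearwater-opnfv')[0]
    for event in client.events.list(execution_id=execution.id, include_logs=True):
        # 'reported_timestamp' and 'message' are assumed field names for 4.x events
        print(event.get('reported_timestamp'), event.get('message'))
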
2018-05-28 02:23:42,264 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.start' 2018-05-28 02:23:42,264 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully started daemon: proxy_host_r84dzn 2018-05-28 02:23:42,264 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting daemon with command: sudo service celeryd-proxy_host_r84dzn start 2018-05-28 02:23:42,265 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.operations.install_plugins' 2018-05-28 02:23:42,265 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.operations.install_plugins' 2018-05-28 02:23:42,265 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugins 2018-05-28 02:23:42,265 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugin: diamond 2018-05-28 02:23:42,265 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - [localhost] run: /home/ubuntu/proxy_host_r84dzn/env/bin/pip freeze 2018-05-28 02:23:47,612 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Extracting archive: https://github.com/cloudify-cosmo/cloudify-diamond-plugin/archive/1.3.5.zip 2018-05-28 02:23:47,612 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugin diamond [current_platform=linux_x86_64, current_distro=ubuntu, current_distro_release=trusty] 2018-05-28 02:23:47,612 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugin from source 2018-05-28 02:23:47,612 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Skipping source plugin cloudify-diamond-plugin installation, as the plugin is already installed in the agent virtualenv. 2018-05-28 02:23:47,612 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Removing directory: /tmp/tmp5gJEyu 2018-05-28 02:23:47,612 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.operations.install_plugins' 2018-05-28 02:23:47,612 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.install' 2018-05-28 02:23:47,612 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.install' 2018-05-28 02:23:47,612 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.install' 2018-05-28 02:23:47,613 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.start' 2018-05-28 02:23:47,613 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.start' 2018-05-28 02:23:47,613 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.start' 2018-05-28 02:23:52,955 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-28 02:23:52,955 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-28 02:23:52,955 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-28 02:23:52,955 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Downloaded scripts/monitoring/proxy_snmp/install_requirements.sh to /tmp/Y494M/install_requirements.sh 2018-05-28 02:23:52,955 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Executing: /tmp/Y494M/install_requirements.sh 2018-05-28 02:24:14,373 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'nova_plugin.server.connect_security_group' -> Unable to establish connection to 
https://172.30.10.101:8774/v2.1/98553474e7df4666aafb941accbcf7ba/servers/6a2cfc42-2b4c-490d-9b31-65c6b9ccf9d6: ('Connection aborted.', error(104, 'Connection reset by peer'))
2018-05-28 02:24:30,482 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.connect_security_group' [retry 1/60]
2018-05-28 02:24:30,483 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.connect_security_group' [retry 1/60]
2018-05-28 02:24:35,840 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'nova_plugin.server.connect_security_group' -> Invalid input for security_groups. Reason: Duplicate items in the list: '79d5182f-1ea9-4417-86a0-0c57683ae199'. Neutron server returns request_ids: ['req-7aa93fad-1cf5-4ced-979d-5c962aebef08'] [status_code=400] [retry 1/60]
2018-05-28 02:24:35,841 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - 'install' workflow execution failed: RuntimeError: Workflow failed: Task failed 'nova_plugin.server.connect_security_group' -> Invalid input for security_groups. Reason: Duplicate items in the list: '79d5182f-1ea9-4417-86a0-0c57683ae199'. Neutron server returns request_ids: ['req-7aa93fad-1cf5-4ced-979d-5c962aebef08'] [status_code=400]
2018-05-28 02:24:35,986 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - {u'status': u'failed', u'is_system_workflow': False, u'parameters': {}, u'blueprint_id': u'clearwater-opnfv', u'tenant_name': u'default_tenant', u'created_at': u'2018-05-28T02:19:35.626Z', u'created_by': u'admin', u'private_resource': False, u'workflow_id': u'install', u'error': u'Traceback (most recent call last):\n File "/tmp/pip-build-RgW6mD/cloudify-plugins-common/cloudify/dispatch.py", line 502, in _remote_workflow_child_thread\n File "/tmp/pip-build-RgW6mD/cloudify-plugins-common/cloudify/dispatch.py", line 533, in _execute_workflow_function\n File "/opt/mgmtworker/env/lib/python2.7/site-packages/cloudify/plugins/workflows.py", line 27, in install\n node_instances=set(ctx.node_instances))\n File "/opt/mgmtworker/env/lib/python2.7/site-packages/cloudify/plugins/lifecycle.py", line 28, in install_node_instances\n processor.install()\n File "/opt/mgmtworker/env/lib/python2.7/site-packages/cloudify/plugins/lifecycle.py", line 93, in install\n graph_finisher_func=self._finish_install)\n File "/opt/mgmtworker/env/lib/python2.7/site-packages/cloudify/plugins/lifecycle.py", line 114, in _process_node_instances\n self.graph.execute()\n File "/opt/mgmtworker/env/lib/python2.7/site-packages/cloudify/workflows/tasks_graph.py", line 133, in execute\n self._handle_terminated_task(task)\n File "/opt/mgmtworker/env/lib/python2.7/site-packages/cloudify/workflows/tasks_graph.py", line 207, in _handle_terminated_task\n raise RuntimeError(message)\nRuntimeError: Workflow failed: Task failed \'nova_plugin.server.connect_security_group\' -> Invalid input for security_groups.
Reason: Duplicate items in the list: \'79d5182f-1ea9-4417-86a0-0c57683ae199\'.\nNeutron server returns request_ids: [\'req-7aa93fad-1cf5-4ced-979d-5c962aebef08\'] [status_code=400]\n', u'deployment_id': u'clearwater-opnfv', u'id': u'bec945bb-cc4a-4c5c-a4ad-e1b578ccb0ac'} 2018-05-28 02:24:35,986 - xtesting.energy.energy - DEBUG - Restoring previous scenario (cloudify_ims/running) 2018-05-28 02:24:35,986 - xtesting.energy.energy - DEBUG - Submitting scenario (cloudify_ims/running) 2018-05-28 02:24:36,497 - xtesting.core.testcase - INFO - The results were successfully pushed to DB http://testresults.opnfv.org/test/api/v1/results 2018-05-28 02:24:36,498 - xtesting.ci.run_tests - INFO - Test result: +----------------------+------------------+------------------+----------------+ | TEST CASE | PROJECT | DURATION | RESULT | +----------------------+------------------+------------------+----------------+ | cloudify_ims | functest | 13:20 | FAIL | +----------------------+------------------+------------------+----------------+ 2018-05-28 02:24:36,501 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Deleting the current deployment 2018-05-28 02:24:42,578 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting 'uninstall' workflow execution 2018-05-28 02:24:42,578 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-28 02:24:42,578 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-28 02:24:42,578 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-28 02:24:42,579 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-28 02:24:42,579 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-28 02:24:42,579 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-28 02:24:42,579 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-28 02:24:42,579 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-28 02:24:42,579 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-28 02:24:42,579 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-28 02:24:42,579 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-28 02:24:42,579 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-28 02:24:42,580 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-28 02:24:42,580 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-28 02:24:42,580 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-28 02:24:42,580 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-28 02:24:42,580 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Downloaded scripts/clearwater/homer/stop-homer.sh to /tmp/5AI19/stop-homer.sh 2018-05-28 02:24:42,580 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-28 02:24:42,580 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-28 02:24:47,992 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Downloaded scripts/clearwater/bono/stop-bono.sh to /tmp/2YA19/stop-bono.sh 2018-05-28 02:24:47,992 - 
functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Executing: /tmp/2YA19/stop-bono.sh 2018-05-28 02:24:47,992 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Downloaded scripts/clearwater/ellis/stop-ellis.sh to /tmp/8C3MH/stop-ellis.sh 2018-05-28 02:24:47,992 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Downloaded scripts/clearwater/sprout/stop-sprout.sh to /tmp/ZFP8Z/stop-sprout.sh 2018-05-28 02:24:47,992 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping homer node 2018-05-28 02:24:47,992 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Executing: /tmp/5AI19/stop-homer.sh 2018-05-28 02:24:47,992 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Executing: /tmp/Q09UM/stop-dime.sh 2018-05-28 02:24:47,993 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Execution done (return_code=1): /tmp/Q09UM/stop-dime.sh 2018-05-28 02:24:47,993 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Downloaded scripts/clearwater/dime/stop-dime.sh to /tmp/Q09UM/stop-dime.sh 2018-05-28 02:24:47,993 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Downloaded scripts/clearwater/vellum/stop-vellum.sh to /tmp/XGL3J/stop-vellum.sh 2018-05-28 02:24:47,993 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Executing: /tmp/8C3MH/stop-ellis.sh 2018-05-28 02:24:47,993 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping vellum node 2018-05-28 02:24:47,993 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping bono node 2018-05-28 02:24:47,993 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping dime node 2018-05-28 02:24:47,993 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping sprout node 2018-05-28 02:24:47,993 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Executing: /tmp/XGL3J/stop-vellum.sh 2018-05-28 02:24:47,993 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Execution done (return_code=1): /tmp/2YA19/stop-bono.sh 2018-05-28 02:24:47,994 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Executing: /tmp/ZFP8Z/stop-sprout.sh 2018-05-28 02:24:47,994 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Execution done (return_code=1): /tmp/XGL3J/stop-vellum.sh 2018-05-28 02:24:47,994 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Execution done (return_code=1): /tmp/5AI19/stop-homer.sh 2018-05-28 02:24:47,994 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> sudo: unable to resolve host server-clearwater-opnfv-dime-host-rcflbn sudo: monit: command not found 2018-05-28 02:24:47,994 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> sudo: unable to resolve host server-clearwater-opnfv-homer-host-mjmnui sudo: monit: command not found 2018-05-28 02:24:47,994 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> sudo: unable to resolve host server-clearwater-opnfv-bono-host-0py57x sudo: monit: command not found 2018-05-28 02:24:47,994 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-28 02:24:47,994 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-28 02:24:47,994 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> sudo: unable to resolve host server-clearwater-opnfv-vellum-host-rw33sl sudo: monit: command not found 2018-05-28 02:24:47,995 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping ellis node 2018-05-28 02:24:47,995 - 
functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Execution done (return_code=1): /tmp/8C3MH/stop-ellis.sh 2018-05-28 02:24:47,995 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Execution done (return_code=1): /tmp/ZFP8Z/stop-sprout.sh 2018-05-28 02:24:47,995 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> sudo: unable to resolve host server-clearwater-opnfv-ellis-host-l9ydgp sudo: monit: command not found 2018-05-28 02:24:47,995 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> sudo: unable to resolve host server-clearwater-opnfv-sprout-host-u75pco sudo: monit: command not found 2018-05-28 02:24:47,995 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-28 02:24:47,995 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-28 02:24:47,995 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Downloaded scripts/bind/bind.py to /tmp/87299/bind.py 2018-05-28 02:24:47,995 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Downloaded scripts/bind/bind.py to /tmp/SJUR4/bind.py 2018-05-28 02:24:47,995 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> u'homer' 2018-05-28 02:24:47,996 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> u'dime' 2018-05-28 02:24:47,996 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> u'vellum' 2018-05-28 02:24:47,996 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-28 02:24:47,996 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Downloaded scripts/bind/bind.py to /tmp/INYVI/bind.py 2018-05-28 02:24:47,996 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Downloaded scripts/bind/bind.py to /tmp/O5ENL/bind.py 2018-05-28 02:24:47,996 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Downloaded scripts/bind/bind.py to /tmp/4OHZM/bind.py 2018-05-28 02:24:47,996 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-28 02:24:47,996 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-28 02:24:47,996 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-28 02:24:47,996 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> u'sprout' 2018-05-28 02:24:47,997 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-28 02:24:47,997 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-28 02:24:47,997 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> u'bono' 2018-05-28 02:24:47,997 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-28 02:24:53,517 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-28 02:24:53,517 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Downloaded scripts/bind/bind.py to /tmp/ZLGN0/bind.py 2018-05-28 02:24:53,517 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-28 02:24:53,517 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-28 02:24:53,517 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-28 
02:24:53,517 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-28 02:24:53,517 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-28 02:24:53,517 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-28 02:24:53,518 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-28 02:24:53,518 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> u'ellis' 2018-05-28 02:24:53,518 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-28 02:24:53,518 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-28 02:24:53,518 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Downloaded scripts/clearwater/other/remove-cluster-node.sh to /tmp/IKW5G/remove-cluster-node.sh 2018-05-28 02:24:53,518 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Executing: /tmp/ZFMBI/remove-cluster-node.sh 2018-05-28 02:24:53,518 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Remove node in ETCD cluster 2018-05-28 02:24:53,518 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Downloaded scripts/clearwater/other/remove-cluster-node.sh to /tmp/EWNXI/remove-cluster-node.sh 2018-05-28 02:24:53,518 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Downloaded scripts/clearwater/other/remove-cluster-node.sh to /tmp/8VPPG/remove-cluster-node.sh 2018-05-28 02:24:53,518 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Execution done (return_code=1): /tmp/EWNXI/remove-cluster-node.sh 2018-05-28 02:24:53,519 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Executing: /tmp/8VPPG/remove-cluster-node.sh 2018-05-28 02:24:53,519 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Executing: /tmp/EWNXI/remove-cluster-node.sh 2018-05-28 02:24:53,519 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Downloaded scripts/clearwater/other/remove-cluster-node.sh to /tmp/ZFMBI/remove-cluster-node.sh 2018-05-28 02:24:53,519 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Remove node in ETCD cluster 2018-05-28 02:24:53,519 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Executing: /tmp/IKW5G/remove-cluster-node.sh 2018-05-28 02:24:53,519 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Remove node in ETCD cluster 2018-05-28 02:24:53,519 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> sudo: unable to resolve host server-clearwater-opnfv-homer-host-mjmnui clearwater-etcd: unrecognized service 2018-05-28 02:24:53,519 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-28 02:24:53,519 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> sudo: unable to resolve host server-clearwater-opnfv-bono-host-0py57x clearwater-etcd: unrecognized service 2018-05-28 02:24:53,520 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> sudo: unable to resolve host server-clearwater-opnfv-dime-host-rcflbn clearwater-etcd: unrecognized service 2018-05-28 02:24:53,520 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-28 02:24:53,520 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> sudo: unable to resolve host server-clearwater-opnfv-vellum-host-rw33sl clearwater-etcd: 
unrecognized service 2018-05-28 02:24:53,520 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Remove node in ETCD cluster 2018-05-28 02:24:53,520 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Remove node in ETCD cluster 2018-05-28 02:24:53,520 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Execution done (return_code=1): /tmp/ZFMBI/remove-cluster-node.sh 2018-05-28 02:24:53,520 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Execution done (return_code=1): /tmp/U60HA/remove-cluster-node.sh 2018-05-28 02:24:53,520 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Downloaded scripts/clearwater/other/remove-cluster-node.sh to /tmp/U60HA/remove-cluster-node.sh 2018-05-28 02:24:53,520 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Executing: /tmp/U60HA/remove-cluster-node.sh 2018-05-28 02:24:53,520 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Downloaded scripts/clearwater/other/remove-cluster-node.sh to /tmp/F6CJP/remove-cluster-node.sh 2018-05-28 02:24:53,521 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Execution done (return_code=1): /tmp/IKW5G/remove-cluster-node.sh 2018-05-28 02:24:53,521 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Executing: /tmp/F6CJP/remove-cluster-node.sh 2018-05-28 02:24:53,521 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Execution done (return_code=1): /tmp/8VPPG/remove-cluster-node.sh 2018-05-28 02:24:53,521 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> sudo: unable to resolve host server-clearwater-opnfv-ellis-host-l9ydgp clearwater-etcd: unrecognized service 2018-05-28 02:24:53,521 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> sudo: unable to resolve host server-clearwater-opnfv-sprout-host-u75pco clearwater-etcd: unrecognized service 2018-05-28 02:24:53,521 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.stop' 2018-05-28 02:24:53,521 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-28 02:24:53,521 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.stop' 2018-05-28 02:24:53,521 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Execution done (return_code=1): /tmp/F6CJP/remove-cluster-node.sh 2018-05-28 02:24:53,521 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Remove node in ETCD cluster 2018-05-28 02:24:53,522 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'diamond_agent.tasks.stop' -> 'diamond_paths' 2018-05-28 02:24:53,522 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-28 02:24:53,522 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-28 02:24:58,929 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-28 02:24:58,929 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.uninstall' 2018-05-28 02:24:58,929 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-28 02:24:58,930 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.stop' 2018-05-28 02:24:58,930 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.stop' 2018-05-28 02:24:58,930 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.stop' 2018-05-28 02:24:58,930 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.stop' 2018-05-28 
02:24:58,930 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.stop' 2018-05-28 02:24:58,930 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.stop' 2018-05-28 02:24:58,930 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-28 02:24:58,930 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.stop' 2018-05-28 02:24:58,931 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-28 02:24:58,931 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.stop' 2018-05-28 02:24:58,931 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.stop' 2018-05-28 02:24:58,931 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-28 02:24:58,931 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'diamond_agent.tasks.uninstall' -> 'diamond_paths' 2018-05-28 02:24:58,931 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.stop' 2018-05-28 02:24:58,931 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-28 02:24:58,931 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping agent 2018-05-28 02:24:58,931 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.stop' 2018-05-28 02:24:58,931 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.stop' 2018-05-28 02:24:58,932 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-28 02:24:58,932 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.uninstall' 2018-05-28 02:24:58,932 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.uninstall' 2018-05-28 02:24:58,932 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-28 02:24:58,932 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.stop' 2018-05-28 02:24:58,932 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:24:58,932 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:24:58,932 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:24:58,932 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:24:58,933 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:24:58,933 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:24:58,933 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:24:58,933 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:24:58,933 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:24:58,933 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping Agent homer_host_mjmnui 2018-05-28 02:24:58,933 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.stop' 2018-05-28 02:24:58,933 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-28 02:24:58,933 - functest.opnfv_tests.vnf.ims.cloudify_ims - 
DEBUG - Stopping node 2018-05-28 02:24:58,933 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.uninstall' 2018-05-28 02:24:58,934 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.stop' 2018-05-28 02:24:58,934 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.stop' 2018-05-28 02:24:58,934 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.stop' 2018-05-28 02:24:58,934 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.uninstall' 2018-05-28 02:24:58,934 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.uninstall' 2018-05-28 02:24:58,934 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping agent 2018-05-28 02:24:58,934 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.stop' 2018-05-28 02:24:58,934 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.stop' 2018-05-28 02:24:58,934 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.stop' 2018-05-28 02:24:58,935 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.stop' 2018-05-28 02:25:04,401 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.stop' 2018-05-28 02:25:04,401 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.stop' 2018-05-28 02:25:04,402 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.stop' 2018-05-28 02:25:04,402 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.uninstall' 2018-05-28 02:25:04,402 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.stop' 2018-05-28 02:25:04,402 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.stop' 2018-05-28 02:25:04,402 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.uninstall' 2018-05-28 02:25:04,402 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.stop' 2018-05-28 02:25:04,402 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.uninstall' 2018-05-28 02:25:04,402 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.stop' 2018-05-28 02:25:04,402 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.stop' 2018-05-28 02:25:04,403 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.uninstall' 2018-05-28 02:25:04,403 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping agent 2018-05-28 02:25:04,403 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping agent 2018-05-28 02:25:04,403 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:04,403 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'diamond_agent.tasks.stop' -> 'diamond_paths' 2018-05-28 02:25:04,403 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping agent 2018-05-28 02:25:04,403 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.stop' 2018-05-28 02:25:04,403 - 
functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.uninstall' 2018-05-28 02:25:04,403 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.uninstall' 2018-05-28 02:25:04,404 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.stop' 2018-05-28 02:25:04,404 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:04,404 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:04,404 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:04,404 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:04,404 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping Agent vellum_host_rw33sl 2018-05-28 02:25:04,404 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:04,404 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:04,404 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:04,405 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:04,405 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.uninstall' 2018-05-28 02:25:04,405 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.stop' 2018-05-28 02:25:04,405 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'diamond_agent.tasks.uninstall' -> 'diamond_paths' 2018-05-28 02:25:04,405 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.uninstall' 2018-05-28 02:25:04,405 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.stop' 2018-05-28 02:25:04,405 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping agent 2018-05-28 02:25:04,405 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping Agent bono_host_0py57x 2018-05-28 02:25:04,405 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:04,406 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:04,406 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:04,406 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:04,406 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:04,406 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:04,406 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:04,406 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:04,406 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:04,407 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:04,407 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:04,407 - 
functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:04,407 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:09,751 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:09,751 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:09,751 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:09,752 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping Agent ellis_host_l9ydgp 2018-05-28 02:25:09,752 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:09,752 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:09,752 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:09,752 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:09,752 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:09,752 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:09,752 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:09,752 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:09,753 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:09,753 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:09,753 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:09,753 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping Agent sprout_host_u75pco 2018-05-28 02:25:09,753 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.uninstall' 2018-05-28 02:25:09,753 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping agent 2018-05-28 02:25:09,753 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.stop' 2018-05-28 02:25:09,753 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'diamond_agent.tasks.stop' -> timeout after 10 seconds (pid=1779) 2018-05-28 02:25:09,753 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.uninstall' 2018-05-28 02:25:09,754 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.uninstall' 2018-05-28 02:25:09,754 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping agent 2018-05-28 02:25:09,754 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.uninstall' 2018-05-28 02:25:09,754 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping... 
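The "Task failed 'diamond_agent.tasks.stop' -> timeout after 10 seconds" entries above show the agent stop being abandoned once a per-task deadline passes; the stop itself is just a service command on each Clearwater VM, as the "Stopping daemon with command: sudo service celeryd-<name> stop" lines below show. A minimal sketch of that pattern using only the Python standard library and local execution; the helper name is illustrative and this is not the actual cloudify-agent code path:

    import subprocess

    def stop_agent_daemon(name, timeout=10):
        # Mirrors "Stopping daemon with command: sudo service celeryd-<name> stop".
        # A TimeoutExpired here corresponds to the "timeout after N seconds"
        # failures reported above.
        cmd = ["sudo", "service", "celeryd-%s" % name, "stop"]
        try:
            subprocess.run(cmd, check=True, timeout=timeout)
            return True      # logged as "Successfully stopped daemon: <name>"
        except (subprocess.TimeoutExpired, subprocess.CalledProcessError):
            return False     # reported as a failed task
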
2018-05-28 02:25:09,754 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully stopped daemon: vellum_host_rw33sl 2018-05-28 02:25:09,754 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping daemon with command: sudo service celeryd-vellum_host_rw33sl stop 2018-05-28 02:25:09,754 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.delete' 2018-05-28 02:25:09,754 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.stop' 2018-05-28 02:25:09,754 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting agent 2018-05-28 02:25:09,755 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.stop' 2018-05-28 02:25:09,755 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.stop' 2018-05-28 02:25:09,755 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping... 2018-05-28 02:25:09,755 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping... 2018-05-28 02:25:09,755 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping daemon with command: sudo service celeryd-ellis_host_l9ydgp stop 2018-05-28 02:25:09,755 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully stopped daemon: bono_host_0py57x 2018-05-28 02:25:09,755 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping daemon with command: sudo service celeryd-bono_host_0py57x stop 2018-05-28 02:25:09,755 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully stopped daemon: ellis_host_l9ydgp 2018-05-28 02:25:09,755 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.stop' 2018-05-28 02:25:09,756 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.delete' 2018-05-28 02:25:09,756 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting agent 2018-05-28 02:25:09,756 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.stop' 2018-05-28 02:25:09,756 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:09,756 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:09,756 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:09,756 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping Agent bind_host_eehrez 2018-05-28 02:25:09,756 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:09,756 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:09,757 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:09,757 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:15,167 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:15,167 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:15,167 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.stop' 2018-05-28 02:25:15,167 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting agent 2018-05-28 02:25:15,168 - 
functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.stop' 2018-05-28 02:25:15,168 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.delete' 2018-05-28 02:25:15,168 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:15,168 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:15,168 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:15,168 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:15,168 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:15,168 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping Agent proxy_host_r84dzn 2018-05-28 02:25:15,169 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:15,169 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:15,169 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:15,169 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:15,169 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping Agent dime_host_rcflbn 2018-05-28 02:25:15,169 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:15,169 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:15,169 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:15,169 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:15,170 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:15,170 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:15,170 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:15,170 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:15,170 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:15,170 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping... 2018-05-28 02:25:15,170 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.stop' 2018-05-28 02:25:15,170 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.delete' 2018-05-28 02:25:15,170 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting agent 2018-05-28 02:25:15,171 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.delete' 2018-05-28 02:25:15,171 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping daemon with command: sudo service celeryd-proxy_host_r84dzn stop 2018-05-28 02:25:15,171 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully stopped daemon: proxy_host_r84dzn 2018-05-28 02:25:15,171 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping... 
2018-05-28 02:25:15,171 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully stopped daemon: dime_host_rcflbn 2018-05-28 02:25:15,171 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping daemon with command: sudo service celeryd-dime_host_rcflbn stop 2018-05-28 02:25:15,171 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting agent 2018-05-28 02:25:15,171 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.stop' 2018-05-28 02:25:15,171 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.delete' 2018-05-28 02:25:15,172 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.delete' 2018-05-28 02:25:20,548 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:20,548 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:20,548 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:20,548 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:20,548 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:20,549 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:20,549 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:20,549 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:20,549 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:20,549 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:20,549 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:20,549 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:20,549 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:20,549 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:20,550 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting Agent vellum_host_rw33sl 2018-05-28 02:25:20,550 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting Agent bono_host_0py57x 2018-05-28 02:25:20,550 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:20,550 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:20,550 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:20,550 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:20,550 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting... 
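Read per node rather than chronologically, the interleaved entries above and below follow the same teardown order for every Clearwater VM (bono, ellis, vellum, sprout, dime, and so on): stop and uninstall the diamond monitoring agent, stop and delete the Cloudify agent, then stop the Nova server, disconnect its floating IP and security groups, and finally delete it; the disconnect and delete tasks appear further below. A small sketch of that ordering; run_task is a hypothetical stand-in for the task queue that produces the "Sending task / Task started / Task succeeded" lines:

    # Per-node teardown order observed in the uninstall workflow log.
    TEARDOWN_STEPS = [
        "diamond_agent.tasks.stop",
        "diamond_agent.tasks.uninstall",
        "cloudify_agent.installer.operations.stop",
        "cloudify_agent.installer.operations.delete",
        "nova_plugin.server.stop",
        "nova_plugin.server.disconnect_floatingip",      # only on nodes with a floating IP
        "nova_plugin.server.disconnect_security_group",
        "nova_plugin.server.delete",
    ]

    def teardown_node(node_instance, run_task):
        # run_task() is hypothetical; each call shows up in the log as
        # "Sending task ..." / "Task started ..." / "Task succeeded|failed ...".
        for task in TEARDOWN_STEPS:
            run_task(node_instance, task)
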
2018-05-28 02:25:20,550 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully deleted daemon: bono_host_0py57x 2018-05-28 02:25:20,550 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting start-on-boot entry 2018-05-28 02:25:20,550 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting start-on-boot entry 2018-05-28 02:25:20,551 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully deleted daemon: vellum_host_rw33sl 2018-05-28 02:25:20,551 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting... 2018-05-28 02:25:20,551 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.delete' 2018-05-28 02:25:20,551 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.delete' 2018-05-28 02:25:20,551 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.delete' 2018-05-28 02:25:20,551 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.stop' 2018-05-28 02:25:20,551 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.stop' 2018-05-28 02:25:20,551 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.delete' 2018-05-28 02:25:26,015 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:26,015 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:26,015 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:26,015 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:26,015 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:26,015 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:26,016 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:26,016 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:26,016 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:26,016 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:26,016 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:26,016 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:26,016 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:26,016 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:26,016 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:26,017 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:26,017 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting Agent proxy_host_r84dzn 2018-05-28 02:25:26,017 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:26,017 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting Agent ellis_host_l9ydgp 2018-05-28 02:25:26,017 
- functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:26,017 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting... 2018-05-28 02:25:26,017 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully deleted daemon: ellis_host_l9ydgp 2018-05-28 02:25:26,017 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting... 2018-05-28 02:25:26,017 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting start-on-boot entry 2018-05-28 02:25:26,018 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully deleted daemon: proxy_host_r84dzn 2018-05-28 02:25:26,018 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting start-on-boot entry 2018-05-28 02:25:26,018 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.delete' 2018-05-28 02:25:26,018 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.delete' 2018-05-28 02:25:26,018 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.delete' 2018-05-28 02:25:26,018 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.stop' 2018-05-28 02:25:26,018 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.stop' 2018-05-28 02:25:26,018 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.stop' 2018-05-28 02:25:31,560 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:31,560 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:31,560 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:31,560 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:31,560 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:31,560 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:31,560 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:31,561 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting Agent dime_host_rcflbn 2018-05-28 02:25:31,561 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:31,561 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Applying function:setter on Attribute 2018-05-28 02:25:31,561 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting... 
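The 'nova_plugin.server.stop' and 'nova_plugin.server.delete' tasks above end with the plugin polling Nova until each instance is actually gone, which is what the 'Waiting for server "<id>" to be deleted. current status: SHUTOFF' messages below report. A sketch of that wait with python-novaclient and a keystoneauth1 session; the auth endpoint, credentials and polling interval are placeholders/assumptions, not values taken from this run:

    import time

    from keystoneauth1.identity import v3
    from keystoneauth1 import session
    from novaclient import client as nova_client
    from novaclient import exceptions as nova_exc

    auth = v3.Password(auth_url="http://keystone.example:5000/v3",   # placeholder
                       username="admin", password="***",             # placeholders
                       project_name="admin",
                       user_domain_name="Default",
                       project_domain_name="Default")
    nova = nova_client.Client("2", session=session.Session(auth=auth))

    def stop_delete_and_wait(server_id, interval=5):
        server = nova.servers.get(server_id)
        server.stop()        # 'nova_plugin.server.stop'
        server.delete()      # 'nova_plugin.server.delete'
        while True:
            try:
                server = nova.servers.get(server_id)
            except nova_exc.NotFound:
                return       # server no longer exists
            print('Waiting for server "%s" to be deleted. current status: %s'
                  % (server_id, server.status))
            time.sleep(interval)
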
2018-05-28 02:25:31,561 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.stop' 2018-05-28 02:25:31,561 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting start-on-boot entry 2018-05-28 02:25:31,561 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Successfully deleted daemon: dime_host_rcflbn 2018-05-28 02:25:31,561 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.disconnect_security_group' 2018-05-28 02:25:31,562 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.stop' 2018-05-28 02:25:31,562 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.stop' 2018-05-28 02:25:31,562 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.delete' 2018-05-28 02:25:31,562 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.stop' 2018-05-28 02:25:37,054 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.stop' 2018-05-28 02:25:37,054 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.stop' 2018-05-28 02:25:37,055 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.disconnect_security_group' 2018-05-28 02:25:37,055 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.stop' 2018-05-28 02:25:37,055 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.disconnect_floatingip' 2018-05-28 02:25:42,453 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.disconnect_floatingip' 2018-05-28 02:25:42,453 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.stop' 2018-05-28 02:25:42,454 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.stop' 2018-05-28 02:25:42,454 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.delete' 2018-05-28 02:25:42,454 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-28 02:25:47,784 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.disconnect_security_group' 2018-05-28 02:25:47,784 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.disconnect_security_group' 2018-05-28 02:25:47,784 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.disconnect_floatingip' 2018-05-28 02:25:58,442 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.disconnect_floatingip' 2018-05-28 02:25:58,442 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.disconnect_floatingip' 2018-05-28 02:25:58,442 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.disconnect_security_group' 2018-05-28 02:26:09,085 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.disconnect_floatingip' 2018-05-28 02:26:09,085 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.delete' 2018-05-28 02:26:09,085 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.disconnect_security_group' 2018-05-28 02:26:09,085 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - deleting server 2018-05-28 02:26:14,517 - 
functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Waiting for server "ee33612c-44d0-45b1-afa9-a6433da28c8a" to be deleted. current status: SHUTOFF 2018-05-28 02:26:19,881 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.delete' 2018-05-28 02:26:19,881 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.disconnect_security_group' 2018-05-28 02:26:30,934 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.disconnect_security_group' 2018-05-28 02:26:30,934 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.disconnect_security_group' 2018-05-28 02:26:30,934 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-28 02:26:30,934 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.delete' 2018-05-28 02:26:41,678 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.disconnect_security_group' 2018-05-28 02:26:41,678 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.disconnect_security_group' 2018-05-28 02:26:41,678 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.disconnect_security_group' 2018-05-28 02:26:47,017 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.delete' 2018-05-28 02:26:47,017 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'nova_plugin.server.stop' -> Unable to establish connection to https://172.30.10.101:8774/v2.1/98553474e7df4666aafb941accbcf7ba/servers/b1e33e78-1d07-42d1-a69e-0cc1a317cf35/action: ('Connection aborted.', error(104, 'Connection reset by peer')) 2018-05-28 02:26:47,017 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.disconnect_security_group' 2018-05-28 02:26:47,017 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - deleting server 2018-05-28 02:26:52,348 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.disconnect_security_group' 2018-05-28 02:26:52,348 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.disconnect_security_group' 2018-05-28 02:26:52,348 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.disconnect_security_group' 2018-05-28 02:26:52,348 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Waiting for server "6a2cfc42-2b4c-490d-9b31-65c6b9ccf9d6" to be deleted. 
current status: SHUTOFF 2018-05-28 02:26:57,898 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.delete' 2018-05-28 02:26:57,898 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.disconnect_security_group' 2018-05-28 02:26:57,898 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.disconnect_security_group' 2018-05-28 02:26:57,898 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.disconnect_security_group' 2018-05-28 02:26:57,898 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.disconnect_security_group' 2018-05-28 02:26:57,898 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-28 02:27:03,281 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-28 02:27:03,281 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.security_group.delete' 2018-05-28 02:27:08,603 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.disconnect_security_group' 2018-05-28 02:27:08,603 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.disconnect_security_group' 2018-05-28 02:27:08,603 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.disconnect_security_group' 2018-05-28 02:27:19,305 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.disconnect_security_group' 2018-05-28 02:27:19,305 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.security_group.delete' 2018-05-28 02:27:19,305 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-28 02:27:19,305 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.delete' 2018-05-28 02:27:19,305 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - deleting security_group 2018-05-28 02:27:24,654 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.security_group.delete' 2018-05-28 02:27:24,654 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.disconnect_security_group' 2018-05-28 02:27:30,094 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.disconnect_security_group' 2018-05-28 02:27:30,094 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.delete' 2018-05-28 02:27:35,420 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.delete' 2018-05-28 02:27:35,420 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-28 02:27:35,420 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - deleting server 2018-05-28 02:27:35,420 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Waiting for server "2561bd95-0f3e-43bc-a443-1fb637b9c409" to be deleted. 
current status: SHUTOFF 2018-05-28 02:27:46,101 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.delete' 2018-05-28 02:27:46,102 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.delete' 2018-05-28 02:27:46,102 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-28 02:27:46,102 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-28 02:27:46,102 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.floatingip.delete' 2018-05-28 02:27:46,102 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-28 02:27:46,102 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-28 02:27:46,102 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.security_group.delete' 2018-05-28 02:27:46,102 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - deleting server 2018-05-28 02:27:51,416 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Waiting for server "b1e33e78-1d07-42d1-a69e-0cc1a317cf35" to be deleted. current status: SHUTOFF 2018-05-28 02:27:56,743 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.delete' 2018-05-28 02:27:56,743 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-28 02:27:56,743 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.floatingip.delete' 2018-05-28 02:27:56,743 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-28 02:27:56,743 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.security_group.delete' 2018-05-28 02:27:56,743 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - deleting floatingip 2018-05-28 02:28:02,061 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.floatingip.delete' 2018-05-28 02:28:02,062 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.security_group.delete' 2018-05-28 02:28:02,062 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - deleting security_group 2018-05-28 02:28:02,062 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.security_group.delete' 2018-05-28 02:28:02,062 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.security_group.delete' 2018-05-28 02:28:07,394 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - deleting security_group 2018-05-28 02:28:07,394 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.security_group.delete' 2018-05-28 02:28:07,394 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'nova_plugin.server.disconnect_security_group' -> Unable to establish connection to https://172.30.10.101:8774/v2.1/98553474e7df4666aafb941accbcf7ba/servers/4e5a5bd8-c966-486e-aae9-944acd2e9eea/os-security-groups: ('Connection aborted.', error(104, 'Connection reset by peer')) 2018-05-28 02:28:07,394 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.delete' 2018-05-28 02:28:07,394 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.delete' 2018-05-28 02:28:07,394 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-28 02:28:12,819 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - deleting server 2018-05-28 02:28:12,819 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Waiting for 
server "4e5a5bd8-c966-486e-aae9-944acd2e9eea" to be deleted. current status: SHUTOFF 2018-05-28 02:28:23,483 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.delete' 2018-05-28 02:28:23,484 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-28 02:28:23,484 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-28 02:28:23,484 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-28 02:28:23,484 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.floatingip.delete' 2018-05-28 02:28:23,484 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-28 02:28:23,484 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.floatingip.delete' 2018-05-28 02:28:23,484 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.security_group.delete' 2018-05-28 02:28:23,484 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.security_group.delete' 2018-05-28 02:28:23,484 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - deleting floatingip 2018-05-28 02:28:23,484 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - deleting security_group 2018-05-28 02:28:28,890 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.security_group.delete' 2018-05-28 02:28:28,890 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.floatingip.delete' 2018-05-28 03:24:37,371 - functest.opnfv_tests.vnf.ims.cloudify_ims - ERROR - Some issue during the undeployment .. Traceback (most recent call last): File "/usr/lib/python2.7/site-packages/functest/opnfv_tests/vnf/ims/cloudify_ims.py", line 429, in clean wait_for_execution(cfy_client, execution, self.__logger) File "/usr/lib/python2.7/site-packages/functest/opnfv_tests/vnf/ims/cloudify_ims.py", line 510, in wait_for_execution execution.deployment_id)) RuntimeError: execution of operation uninstall for deployment clearwater-opnfv timed out 2018-05-28 03:24:37,373 - functest.core.vnf - INFO - Removing the VNF resources .. 2018-05-28 03:25:00,389 - functest.core.vnf - ERROR - Unexpected error cleaning - Unable to complete operation on subnet 87a50323-af91-4a30-9ed0-56d7d766997e: One or more ports have an IP allocation from this subnet. Neutron server returns request_ids: ['req-5821f788-1975-422e-b39f-ced119ca787b'] 2018-05-28 03:25:12,915 - xtesting.ci.run_tests - ERROR - The test case 'cloudify_ims' failed. 2018-05-28 03:25:12,915 - xtesting.ci.run_tests - INFO - Running test case 'vyos_vrouter'... 2018-05-28 03:25:14,314 - functest.opnfv_tests.vnf.router.utilvnf - DEBUG - Downloading the test data. 
2018-05-28 03:25:15,370 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Orchestrator configuration {'requirements': {u'flavor': {u'ram_min': 4096, u'name': u'cloudify.medium'}, u'os_image': u'cloudify_manager_4.0'}} 2018-05-28 03:25:15,370 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - name = functest.opnfv_tests.vnf.router.cloudify_vrouter 2018-05-28 03:25:15,409 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - VNF configuration: {'inputs': {u'region': u'RegionOne', u'external_network_name': u'admin_floating_net'}, 'requirements': {u'flavor': {u'ram_min': 2048, u'name': u'cloudify.medium'}}, 'descriptor': {u'file_name': u'/src/opnfv-vnf-vyos-blueprint/function-test-openstack-blueprint.yaml', u'version': u'fraser', u'name': u'vrouter-opnfv'}} 2018-05-28 03:25:15,417 - functest.opnfv_tests.vnf.router.utilvnf - DEBUG - Downloading the test data. 2018-05-28 03:25:15,447 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - Images needed for vrouter: {u'cloudify_manager_4.0': u'/home/opnfv/functest/images/cloudify-manager-premium-4.0.1.qcow2', u'vyos1.1.7': u'/home/opnfv/functest/images/vyos-1.1.7.img'} 2018-05-28 03:25:15,448 - functest.core.vnf - INFO - Prepare VNF: vyos_vrouter, description: Created by OPNFV Functest: vyos_vrouter 2018-05-28 03:25:19,255 - functest.core.vnf - DEBUG - snaps creds: OSCreds - username=vyos_vrouter-d8a3af0a-b9bf-490b-a0d2-ddc1f0264bf0, password=81880283-92ea-40b9-b35c-d90823875944, auth_url=http://10.167.4.35:35357/v3, project_name=vyos_vrouter-d8a3af0a-b9bf-490b-a0d2-ddc1f0264bf0, identity_api_version=3.0, image_api_version=2.0, network_api_version=2.0, compute_api_version=2.0, heat_api_version=1, user_domain_id=default, user_domain_name=Default, project_domain_id=default, project_domain_name=Default, interface=internal, region_name=RegionOne, proxy_settings=None, cacert=/etc/ssl/certs/mcp_os_cacert 2018-05-28 03:25:19,255 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - Additional pre-configuration steps 2018-05-28 03:25:19,255 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - Get or create flavor for cloudify manager vm ... 2018-05-28 03:25:23,704 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - snaps creds: OSCreds - username=cloudify_network_bug-d8a3af0a-b9bf-490b-a0d2-ddc1f0264bf0, password=afc5b45b-5c61-4569-9a71-1695d21575ac, auth_url=http://10.167.4.35:35357/v3, project_name=vyos_vrouter-d8a3af0a-b9bf-490b-a0d2-ddc1f0264bf0, identity_api_version=3.0, image_api_version=2.0, network_api_version=2.0, compute_api_version=2.0, heat_api_version=1, user_domain_id=default, user_domain_name=Default, project_domain_id=default, project_domain_name=Default, interface=internal, region_name=RegionOne, proxy_settings=None, cacert=/etc/ssl/certs/mcp_os_cacert 2018-05-28 03:25:23,705 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - Creating keypair ... 2018-05-28 03:25:25,516 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - Upload some OS images if it doesn't exist 2018-05-28 03:25:25,517 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - image: cloudify_manager_4.0, file: /home/opnfv/functest/images/cloudify-manager-premium-4.0.1.qcow2 2018-05-28 03:27:28,019 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - image: vyos1.1.7, file: /home/opnfv/functest/images/vyos-1.1.7.img 2018-05-28 03:27:43,712 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - Creating full network ... 
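The vrouter preparation above builds an OpenStack session from the credentials shown in the OSCreds dump and uploads the Cloudify manager and VyOS images only when they are not already registered ("Upload some OS images if it doesn't exist"). A sketch of that check-then-upload step with keystoneauth1 and python-glanceclient, reusing the auth URL, CA cert path, image names and file paths from the log; the username and password are placeholders and the disk_format values are assumptions:

    from keystoneauth1.identity import v3
    from keystoneauth1 import session
    from glanceclient import Client

    auth = v3.Password(auth_url="http://10.167.4.35:35357/v3",
                       username="<project user>", password="***",   # placeholders
                       project_name="<project>",
                       user_domain_name="Default",
                       project_domain_name="Default")
    sess = session.Session(auth=auth, verify="/etc/ssl/certs/mcp_os_cacert")
    glance = Client("2", session=sess)

    IMAGES = {
        "cloudify_manager_4.0":
            "/home/opnfv/functest/images/cloudify-manager-premium-4.0.1.qcow2",
        "vyos1.1.7": "/home/opnfv/functest/images/vyos-1.1.7.img",
    }

    def upload_if_missing(name, path, disk_format="qcow2"):
        # Skip the upload when an image with that name is already registered.
        if any(img.name == name for img in glance.images.list()):
            return
        # disk_format is an assumption; the .img file may need "raw" instead.
        image = glance.images.create(name=name, disk_format=disk_format,
                                     container_format="bare")
        with open(path, "rb") as data:
            glance.images.upload(image.id, data)

    for name, path in IMAGES.items():
        upload_if_missing(name, path)
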
2018-05-28 03:27:59,898 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - Creating security group for cloudify manager vm 2018-05-28 03:28:04,590 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - Creating cloudify manager VM 2018-05-28 03:30:11,064 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - Attemps running status of the Manager 2018-05-28 03:30:20,590 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - status {u'status': u'running', u'services': [{u'instances': [{u'LoadState': u'loaded', u'Description': u'InfluxDB Service', u'MainPID': 807, u'state': u'running', u'Id': u'cloudify-influxdb.service', u'ActiveState': u'active', u'SubState': u'running'}], u'display_name': u'InfluxDB'}, {u'instances': [{u'LoadState': u'loaded', u'Description': u'Cloudify Management Worker Service', u'MainPID': 0, u'state': u'dead', u'Id': u'cloudify-mgmtworker.service', u'ActiveState': u'inactive', u'SubState': u'dead'}], u'display_name': u'Celery Management'}, {u'instances': [{u'LoadState': u'loaded', u'Description': u'LSB: Starts Logstash as a daemon.', u'MainPID': 0, u'state': u'running', u'Id': u'logstash.service', u'ActiveState': u'active', u'SubState': u'running'}], u'display_name': u'Logstash'}, {u'instances': [{u'LoadState': u'loaded', u'Description': u'RabbitMQ Service', u'MainPID': 2095, u'state': u'start-post', u'Id': u'cloudify-rabbitmq.service', u'ActiveState': u'activating', u'SubState': u'start-post'}], u'display_name': u'RabbitMQ'}, {u'instances': [{u'LoadState': u'loaded', u'Description': u'Cloudify AMQP InfluxDB Broker Service', u'MainPID': 0, u'state': u'dead', u'Id': u'cloudify-amqpinflux.service', u'ActiveState': u'inactive', u'SubState': u'dead'}], u'display_name': u'AMQP InfluxDB'}, {u'instances': [{u'LoadState': u'loaded', u'Description': u'PostgreSQL 9.5 database server', u'MainPID': 892, u'state': u'running', u'Id': u'postgresql-9.5.service', u'ActiveState': u'active', u'SubState': u'running'}], u'display_name': u'PostgreSQL'}, {u'instances': [{u'LoadState': u'loaded', u'Description': u'Cloudify REST Service', u'MainPID': 809, u'state': u'running', u'Id': u'cloudify-restservice.service', u'ActiveState': u'active', u'SubState': u'running'}], u'display_name': u'Manager Rest-Service'}, {u'instances': [{u'LoadState': u'loaded', u'Description': u'Cloudify Stage Service', u'MainPID': 814, u'state': u'running', u'Id': u'cloudify-stage.service', u'ActiveState': u'active', u'SubState': u'running'}], u'display_name': u'Cloudify Stage'}, {u'instances': [{u'LoadState': u'loaded', u'Description': u'Riemann Service', u'MainPID': 0, u'state': u'dead', u'Id': u'cloudify-riemann.service', u'ActiveState': u'inactive', u'SubState': u'dead'}], u'display_name': u'Riemann'}, {u'instances': [{u'LoadState': u'loaded', u'Description': u'nginx - high performance web server', u'MainPID': 855, u'state': u'running', u'Id': u'nginx.service', u'ActiveState': u'active', u'SubState': u'running'}], u'display_name': u'Webserver'}]} 2018-05-28 03:30:20,915 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - The current manager status is running 2018-05-28 03:30:20,915 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - Put private keypair in manager 2018-05-28 03:30:24,820 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - SSH sudo cp ~/cloudify_vrouter.pem /etc/cloudify/ stdout: 2018-05-28 03:30:25,090 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - SSH sudo chmod 444 /etc/cloudify/cloudify_vrouter.pem stdout: 2018-05-28 03:31:42,207 - 
functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - SSH sudo yum install -y gcc python-devel stdout: Loaded plugins: fastestmirror Determining fastest mirrors * base: mirror.tocici.com * extras: mirrordenver.fdcservers.net * updates: mirrors.xmission.com Resolving Dependencies --> Running transaction check ---> Package gcc.x86_64 0:4.8.5-28.el7_5.1 will be installed --> Processing Dependency: libgomp = 4.8.5-28.el7_5.1 for package: gcc-4.8.5-28.el7_5.1.x86_64 --> Processing Dependency: cpp = 4.8.5-28.el7_5.1 for package: gcc-4.8.5-28.el7_5.1.x86_64 --> Processing Dependency: libgcc >= 4.8.5-28.el7_5.1 for package: gcc-4.8.5-28.el7_5.1.x86_64 --> Processing Dependency: glibc-devel >= 2.2.90-12 for package: gcc-4.8.5-28.el7_5.1.x86_64 --> Processing Dependency: libmpfr.so.4()(64bit) for package: gcc-4.8.5-28.el7_5.1.x86_64 --> Processing Dependency: libmpc.so.3()(64bit) for package: gcc-4.8.5-28.el7_5.1.x86_64 ---> Package python-devel.x86_64 0:2.7.5-68.el7 will be installed --> Processing Dependency: python(x86-64) = 2.7.5-68.el7 for package: python-devel-2.7.5-68.el7.x86_64 --> Running transaction check ---> Package cpp.x86_64 0:4.8.5-28.el7_5.1 will be installed ---> Package glibc-devel.x86_64 0:2.17-222.el7 will be installed --> Processing Dependency: glibc-headers = 2.17-222.el7 for package: glibc-devel-2.17-222.el7.x86_64 --> Processing Dependency: glibc = 2.17-222.el7 for package: glibc-devel-2.17-222.el7.x86_64 --> Processing Dependency: glibc-headers for package: glibc-devel-2.17-222.el7.x86_64 ---> Package libgcc.x86_64 0:4.8.5-11.el7 will be updated ---> Package libgcc.x86_64 0:4.8.5-28.el7_5.1 will be an update ---> Package libgomp.x86_64 0:4.8.5-11.el7 will be updated ---> Package libgomp.x86_64 0:4.8.5-28.el7_5.1 will be an update ---> Package libmpc.x86_64 0:1.0.1-3.el7 will be installed ---> Package mpfr.x86_64 0:3.1.1-4.el7 will be installed ---> Package python.x86_64 0:2.7.5-48.el7 will be updated ---> Package python.x86_64 0:2.7.5-68.el7 will be an update --> Processing Dependency: python-libs(x86-64) = 2.7.5-68.el7 for package: python-2.7.5-68.el7.x86_64 --> Running transaction check ---> Package glibc.x86_64 0:2.17-157.el7_3.1 will be updated --> Processing Dependency: glibc = 2.17-157.el7_3.1 for package: glibc-common-2.17-157.el7_3.1.x86_64 ---> Package glibc.x86_64 0:2.17-222.el7 will be an update ---> Package glibc-headers.x86_64 0:2.17-222.el7 will be installed --> Processing Dependency: kernel-headers >= 2.2.1 for package: glibc-headers-2.17-222.el7.x86_64 --> Processing Dependency: kernel-headers for package: glibc-headers-2.17-222.el7.x86_64 ---> Package python-libs.x86_64 0:2.7.5-48.el7 will be updated ---> Package python-libs.x86_64 0:2.7.5-68.el7 will be an update --> Processing Dependency: libcrypto.so.10(OPENSSL_1.0.2)(64bit) for package: python-libs-2.7.5-68.el7.x86_64 --> Running transaction check ---> Package glibc-common.x86_64 0:2.17-157.el7_3.1 will be updated ---> Package glibc-common.x86_64 0:2.17-222.el7 will be an update ---> Package kernel-headers.x86_64 0:3.10.0-862.3.2.el7 will be installed ---> Package openssl-libs.x86_64 1:1.0.1e-60.el7_3.1 will be updated --> Processing Dependency: openssl-libs(x86-64) = 1:1.0.1e-60.el7_3.1 for package: 1:openssl-1.0.1e-60.el7_3.1.x86_64 ---> Package openssl-libs.x86_64 1:1.0.2k-12.el7 will be an update --> Running transaction check ---> Package openssl.x86_64 1:1.0.1e-60.el7_3.1 will be updated ---> Package openssl.x86_64 1:1.0.2k-12.el7 will be an update --> Finished Dependency Resolution 
Dependencies Resolved ================================================================================ Package Arch Version Repository Size ================================================================================ Installing: gcc x86_64 4.8.5-28.el7_5.1 updates 16 M python-devel x86_64 2.7.5-68.el7 base 397 k Installing for dependencies: cpp x86_64 4.8.5-28.el7_5.1 updates 5.9 M glibc-devel x86_64 2.17-222.el7 base 1.1 M glibc-headers x86_64 2.17-222.el7 base 678 k kernel-headers x86_64 3.10.0-862.3.2.el7 updates 7.1 M libmpc x86_64 1.0.1-3.el7 base 51 k mpfr x86_64 3.1.1-4.el7 base 203 k Updating for dependencies: glibc x86_64 2.17-222.el7 base 3.6 M glibc-common x86_64 2.17-222.el7 base 11 M libgcc x86_64 4.8.5-28.el7_5.1 updates 101 k libgomp x86_64 4.8.5-28.el7_5.1 updates 156 k openssl x86_64 1:1.0.2k-12.el7 base 492 k openssl-libs x86_64 1:1.0.2k-12.el7 base 1.2 M python x86_64 2.7.5-68.el7 base 93 k python-libs x86_64 2.7.5-68.el7 base 5.6 M Transaction Summary ================================================================================ Install 2 Packages (+6 Dependent packages) Upgrade ( 8 Dependent packages) Total download size: 54 M Downloading packages: Delta RPMs disabled because /usr/bin/applydeltarpm not installed. -------------------------------------------------------------------------------- Total 14 MB/s | 54 MB 00:03 Running transaction check Running transaction test Transaction test succeeded Running transaction Updating : libgcc-4.8.5-28.el7_5.1.x86_64 1/24 Updating : glibc-common-2.17-222.el7.x86_64 2/24 Updating : glibc-2.17-222.el7.x86_64 3/24 warning: /etc/nsswitch.conf created as /etc/nsswitch.conf.rpmnew Installing : mpfr-3.1.1-4.el7.x86_64 4/24 Installing : libmpc-1.0.1-3.el7.x86_64 5/24 Updating : 1:openssl-libs-1.0.2k-12.el7.x86_64 6/24 Updating : python-libs-2.7.5-68.el7.x86_64 7/24 Updating : python-2.7.5-68.el7.x86_64 8/24 Installing : cpp-4.8.5-28.el7_5.1.x86_64 9/24 Updating : libgomp-4.8.5-28.el7_5.1.x86_64 10/24 Installing : kernel-headers-3.10.0-862.3.2.el7.x86_64 11/24 Installing : glibc-headers-2.17-222.el7.x86_64 12/24 Installing : glibc-devel-2.17-222.el7.x86_64 13/24 Installing : gcc-4.8.5-28.el7_5.1.x86_64 14/24 Installing : python-devel-2.7.5-68.el7.x86_64 15/24 Updating : 1:openssl-1.0.2k-12.el7.x86_64 16/24 Cleanup : 1:openssl-1.0.1e-60.el7_3.1.x86_64 17/24 Cleanup : python-2.7.5-48.el7.x86_64 18/24 Cleanup : python-libs-2.7.5-48.el7.x86_64 19/24 Cleanup : 1:openssl-libs-1.0.1e-60.el7_3.1.x86_64 20/24 Cleanup : libgomp-4.8.5-11.el7.x86_64 21/24 Cleanup : glibc-common-2.17-157.el7_3.1.x86_64 22/24 Cleanup : glibc-2.17-157.el7_3.1.x86_64 23/24 Cleanup : libgcc-4.8.5-11.el7.x86_64 24/24 Verifying : python-libs-2.7.5-68.el7.x86_64 1/24 Verifying : glibc-devel-2.17-222.el7.x86_64 2/24 Verifying : glibc-headers-2.17-222.el7.x86_64 3/24 Verifying : 1:openssl-libs-1.0.2k-12.el7.x86_64 4/24 Verifying : libgomp-4.8.5-28.el7_5.1.x86_64 5/24 Verifying : gcc-4.8.5-28.el7_5.1.x86_64 6/24 Verifying : glibc-2.17-222.el7.x86_64 7/24 Verifying : libgcc-4.8.5-28.el7_5.1.x86_64 8/24 Verifying : cpp-4.8.5-28.el7_5.1.x86_64 9/24 Verifying : python-devel-2.7.5-68.el7.x86_64 10/24 Verifying : libmpc-1.0.1-3.el7.x86_64 11/24 Verifying : glibc-common-2.17-222.el7.x86_64 12/24 Verifying : python-2.7.5-68.el7.x86_64 13/24 Verifying : mpfr-3.1.1-4.el7.x86_64 14/24 Verifying : 1:openssl-1.0.2k-12.el7.x86_64 15/24 Verifying : kernel-headers-3.10.0-862.3.2.el7.x86_64 16/24 Verifying : 1:openssl-1.0.1e-60.el7_3.1.x86_64 17/24 Verifying : 
1:openssl-libs-1.0.1e-60.el7_3.1.x86_64 18/24 Verifying : glibc-common-2.17-157.el7_3.1.x86_64 19/24 Verifying : glibc-2.17-157.el7_3.1.x86_64 20/24 Verifying : python-libs-2.7.5-48.el7.x86_64 21/24 Verifying : libgcc-4.8.5-11.el7.x86_64 22/24 Verifying : python-2.7.5-48.el7.x86_64 23/24 Verifying : libgomp-4.8.5-11.el7.x86_64 24/24 Installed: gcc.x86_64 0:4.8.5-28.el7_5.1 python-devel.x86_64 0:2.7.5-68.el7 Dependency Installed: cpp.x86_64 0:4.8.5-28.el7_5.1 glibc-devel.x86_64 0:2.17-222.el7 glibc-headers.x86_64 0:2.17-222.el7 kernel-headers.x86_64 0:3.10.0-862.3.2.el7 libmpc.x86_64 0:1.0.1-3.el7 mpfr.x86_64 0:3.1.1-4.el7 Dependency Updated: glibc.x86_64 0:2.17-222.el7 glibc-common.x86_64 0:2.17-222.el7 libgcc.x86_64 0:4.8.5-28.el7_5.1 libgomp.x86_64 0:4.8.5-28.el7_5.1 openssl.x86_64 1:1.0.2k-12.el7 openssl-libs.x86_64 1:1.0.2k-12.el7 python.x86_64 0:2.7.5-68.el7 python-libs.x86_64 0:2.7.5-68.el7 Complete! 2018-05-28 03:31:42,207 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - Get or create flavor for vrouter 2018-05-28 03:31:45,684 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - Upload VNFD 2018-05-28 03:31:54,500 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - Create VNF Instance 2018-05-28 03:32:13,984 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Installing plugin: openstack 2018-05-28 03:32:13,984 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - [localhost] run: /opt/mgmtworker/env/bin/pip freeze 2018-05-28 03:32:13,985 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Installing plugin openstack [current_platform=linux_x86_64, current_distro=centos, current_distro_release=core] 2018-05-28 03:32:13,985 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Installing plugin from source 2018-05-28 03:32:13,985 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Installing from directory: /tmp/tmpVtMzHR [args=--prefix="/tmp/openstack-cvvwBn" --constraint="/tmp/openstack-cvvwBn/constraint.txt", package_name=cloudify-openstack-plugin] 2018-05-28 03:32:13,985 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - [localhost] run: /opt/mgmtworker/env/bin/pip install /tmp/tmpVtMzHR --prefix="/tmp/openstack-cvvwBn" --constraint="/tmp/openstack-cvvwBn/constraint.txt" 2018-05-28 03:32:13,985 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Extracting archive: https://github.com/cloudify-cosmo/cloudify-openstack-plugin/archive/2.0.1.zip 2018-05-28 03:32:13,985 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'cloudify_agent.operations.install_plugins' 2018-05-28 03:32:13,985 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'cloudify_agent.operations.install_plugins' 2018-05-28 03:32:13,985 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Starting 'create_deployment_environment' workflow execution 2018-05-28 03:32:13,985 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Installing deployment plugins 2018-05-28 03:33:24,079 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Retrieved package name: cloudify-openstack-plugin 2018-05-28 03:33:24,079 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Removing directory: /tmp/tmpVtMzHR 2018-05-28 03:33:29,389 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'cloudify_agent.operations.install_plugins' 2018-05-28 03:33:29,389 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Creating deployment work directory 2018-05-28 
03:33:29,389 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - 'create_deployment_environment' workflow execution succeeded 2018-05-28 03:33:29,389 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Skipping starting deployment policy engine core - no policies defined 2018-05-28 03:33:29,528 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - Start the VNF Instance deployment 2018-05-28 03:33:35,730 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Starting 'install' workflow execution 2018-05-28 03:33:35,730 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Creating node 2018-05-28 03:33:35,730 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Creating node 2018-05-28 03:33:41,178 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.network.create' 2018-05-28 03:33:41,178 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.floatingip.create' 2018-05-28 03:33:41,178 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.floatingip.create' 2018-05-28 03:33:41,178 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.floatingip.create' 2018-05-28 03:33:41,179 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Creating node 2018-05-28 03:33:41,179 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.network.create' 2018-05-28 03:33:41,179 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Creating node 2018-05-28 03:33:41,179 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.floatingip.create' 2018-05-28 03:33:41,179 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Creating node 2018-05-28 03:33:41,179 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.network.create' 2018-05-28 03:33:41,179 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.network.create' 2018-05-28 03:33:41,179 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.security_group.create' 2018-05-28 03:33:41,179 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Creating node 2018-05-28 03:33:41,180 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Creating node 2018-05-28 03:33:41,180 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.security_group.create' 2018-05-28 03:33:41,180 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'nova_plugin.keypair.create' 2018-05-28 03:33:41,180 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.network.create' 2018-05-28 03:33:41,180 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Security group rule before transformations: {u'remote_ip_prefix': u'0.0.0.0/0', u'port': 22} 2018-05-28 03:33:41,180 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': 'tcp', 'ethertype': 'IPv4', 'port_range_max': 22, 'port_range_min': 22, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 03:33:41,180 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Security group rule before transformations: {u'port_range_min': 0, u'port_range_max': 0, u'protocol': u'icmp', u'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 03:33:46,525 - 
functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Security group rule before transformations: {u'port_range_min': 1, u'port_range_max': 65535, u'protocol': u'tcp', u'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 03:33:46,525 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Security group rule before transformations: {u'port_range_min': 1, u'port_range_max': 65535, u'protocol': u'udp', u'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 03:33:46,526 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': u'tcp', 'ethertype': 'IPv4', 'port_range_max': 65535, 'port_range_min': 1, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 03:33:46,526 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Using external resource network: floating_net 2018-05-28 03:33:46,526 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': u'udp', 'ethertype': 'IPv4', 'port_range_max': 65535, 'port_range_min': 1, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 03:33:46,526 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Security group rule after transformations: {'remote_group_id': None, 'direction': 'ingress', 'protocol': u'icmp', 'ethertype': 'IPv4', 'port_range_max': 0, 'port_range_min': 0, 'remote_ip_prefix': u'0.0.0.0/0'} 2018-05-28 03:33:46,526 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.network.create' 2018-05-28 03:33:46,526 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.network.create' 2018-05-28 03:33:46,526 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.network.create' 2018-05-28 03:33:46,526 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Configuring node 2018-05-28 03:33:46,526 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Starting node 2018-05-28 03:33:46,527 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'nova_plugin.keypair.create' 2018-05-28 03:33:46,527 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Configuring node 2018-05-28 03:33:46,527 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Floating IP creation response: {u'router_id': None, u'status': u'DOWN', u'description': u'', u'tags': [], u'tenant_id': u'8f028a14aabf403bbd9d90ade8067ede', u'created_at': u'2018-05-28T03:33:42Z', u'updated_at': u'2018-05-28T03:33:42Z', u'floating_network_id': u'11c92fd4-326a-487a-a640-1b09c88fcb5b', u'fixed_ip_address': None, u'floating_ip_address': u'172.30.10.123', u'revision_number': 0, u'project_id': u'8f028a14aabf403bbd9d90ade8067ede', u'port_id': None, u'id': u'7cb5d5a7-a5ff-4c15-bde7-73796ccb31b2'} 2018-05-28 03:33:46,527 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Floating IP creation response: {u'router_id': None, u'status': u'DOWN', u'description': u'', u'tags': [], u'tenant_id': u'8f028a14aabf403bbd9d90ade8067ede', u'created_at': u'2018-05-28T03:33:42Z', u'updated_at': u'2018-05-28T03:33:42Z', u'floating_network_id': u'11c92fd4-326a-487a-a640-1b09c88fcb5b', u'fixed_ip_address': None, u'floating_ip_address': u'172.30.10.125', u'revision_number': 0, u'project_id': u'8f028a14aabf403bbd9d90ade8067ede', u'port_id': None, u'id': u'df2c683c-2bd8-4b31-9a28-2725ab7cd276'} 2018-05-28 03:33:46,527 - 
functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Starting node 2018-05-28 03:33:46,527 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.router.create' 2018-05-28 03:33:46,527 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.floatingip.create' 2018-05-28 03:33:46,527 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.router.create' 2018-05-28 03:33:46,527 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Configuring node 2018-05-28 03:33:46,528 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.floatingip.create' 2018-05-28 03:33:46,528 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Creating node 2018-05-28 03:33:51,964 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Configuring node 2018-05-28 03:33:51,965 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Starting node 2018-05-28 03:33:51,965 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.network.create' 2018-05-28 03:33:51,965 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.security_group.create' 2018-05-28 03:33:51,965 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Configuring node 2018-05-28 03:33:51,965 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Configuring node 2018-05-28 03:33:51,965 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Starting node 2018-05-28 03:33:51,965 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'nova_plugin.keypair.create' 2018-05-28 03:33:51,965 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Configuring node 2018-05-28 03:33:51,966 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Starting node 2018-05-28 03:33:51,966 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Starting node 2018-05-28 03:33:51,966 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Creating node 2018-05-28 03:33:51,966 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.subnet.create' 2018-05-28 03:33:51,966 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Starting node 2018-05-28 03:33:51,966 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.subnet.create' 2018-05-28 03:33:57,924 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.router.create' 2018-05-28 03:33:57,924 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Configuring node 2018-05-28 03:33:57,924 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Starting node 2018-05-28 03:33:57,924 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.subnet.create' 2018-05-28 03:34:03,420 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Creating node 2018-05-28 03:34:03,420 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.subnet.create' 2018-05-28 03:34:03,420 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.subnet.create' 2018-05-28 03:34:03,420 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Configuring node 2018-05-28 03:34:03,421 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Starting node 2018-05-28 03:34:08,744 - 
functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Creating node 2018-05-28 03:34:08,744 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Creating node 2018-05-28 03:34:08,744 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.port.create' 2018-05-28 03:34:08,745 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.port.create' 2018-05-28 03:34:08,745 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.port.create' 2018-05-28 03:34:08,745 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.port.create' 2018-05-28 03:34:08,745 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.subnet.create' 2018-05-28 03:34:08,745 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Configuring node 2018-05-28 03:34:08,745 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.router.connect_subnet' 2018-05-28 03:34:08,745 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Starting node 2018-05-28 03:34:08,746 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.router.connect_subnet' 2018-05-28 03:34:14,159 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.port.create' 2018-05-28 03:34:14,159 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Configuring node 2018-05-28 03:34:14,159 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Configuring node 2018-05-28 03:34:14,159 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Starting node 2018-05-28 03:34:14,159 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.port.create' 2018-05-28 03:34:14,160 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.port.connect_security_group' 2018-05-28 03:34:14,160 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Starting node 2018-05-28 03:34:14,160 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.port.connect_security_group' 2018-05-28 03:34:14,160 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.port.connect_security_group' 2018-05-28 03:34:14,160 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.port.connect_security_group' 2018-05-28 03:34:19,485 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.router.connect_subnet' 2018-05-28 03:34:19,485 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - connect_security_group(): source_id=b6a10873-e767-4ca4-9abc-5113394abc0f target={u'external_id': u'b1fceb8c-abfb-4d5b-b096-e5f225b05176', u'external_name': u'vnf_test_security_group', u'external_type': u'security_group'} 2018-05-28 03:34:19,485 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Creating node 2018-05-28 03:34:19,485 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - connect_security_group(): source_id=a3a9ec9a-77ce-4e53-8b6f-fd4de29581cb target={u'external_id': u'b1fceb8c-abfb-4d5b-b096-e5f225b05176', u'external_name': u'vnf_test_security_group', u'external_type': u'security_group'} 2018-05-28 03:34:19,485 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Creating node 2018-05-28 03:34:19,485 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - 
Sending task 'neutron_plugin.port.create' 2018-05-28 03:34:19,485 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.port.create' 2018-05-28 03:34:19,486 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.port.create' 2018-05-28 03:34:19,486 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.port.create' 2018-05-28 03:34:19,486 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.port.connect_security_group' 2018-05-28 03:34:19,486 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.port.connect_security_group' 2018-05-28 03:34:24,799 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.port.create' 2018-05-28 03:34:24,800 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.port.create' 2018-05-28 03:34:24,800 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Configuring node 2018-05-28 03:34:24,800 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Configuring node 2018-05-28 03:34:30,178 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.port.connect_security_group' 2018-05-28 03:34:30,178 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Starting node 2018-05-28 03:34:30,178 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Starting node 2018-05-28 03:34:30,178 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.port.connect_security_group' 2018-05-28 03:34:30,178 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.port.connect_security_group' 2018-05-28 03:34:30,178 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.port.connect_security_group' 2018-05-28 03:34:30,179 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - connect_security_group(): source_id=6e6da789-d620-497f-9652-16dd06286974 target={u'external_id': u'b1fceb8c-abfb-4d5b-b096-e5f225b05176', u'external_name': u'vnf_test_security_group', u'external_type': u'security_group'} 2018-05-28 03:34:30,179 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - connect_security_group(): source_id=17fbaf0e-0b5b-46e9-9287-ec3582b8d18f target={u'external_id': u'b1fceb8c-abfb-4d5b-b096-e5f225b05176', u'external_name': u'vnf_test_security_group', u'external_type': u'security_group'} 2018-05-28 03:34:30,179 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.port.connect_security_group' 2018-05-28 03:34:35,499 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.floatingip.connect_port' 2018-05-28 03:34:35,500 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.floatingip.connect_port' 2018-05-28 03:34:35,500 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.floatingip.connect_port' 2018-05-28 03:34:35,500 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.floatingip.connect_port' 2018-05-28 03:34:35,500 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.port.connect_security_group' 2018-05-28 03:34:35,500 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.floatingip.connect_port' 2018-05-28 
03:34:40,849 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Creating node 2018-05-28 03:34:40,849 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'nova_plugin.server.create' 2018-05-28 03:34:40,850 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'nova_plugin.server.create' 2018-05-28 03:34:40,850 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.floatingip.connect_port' 2018-05-28 03:34:40,850 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Creating node 2018-05-28 03:34:40,850 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'nova_plugin.server.create' 2018-05-28 03:34:40,850 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'nova_plugin.server.create' 2018-05-28 03:34:40,850 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Instance relationship target instances: [{u'external_id': u'vnf_test_keypair', u'external_name': u'vnf_test_keypair', u'external_type': u'keypair'}, {u'external_id': u'6e6da789-d620-497f-9652-16dd06286974', u'fixed_ip_address': u'11.0.0.11', u'external_name': u'target_vnf_port', u'external_type': u'port', u'mac_address': u'fa:16:3e:03:aa:a0'}, {u'external_id': u'a3a9ec9a-77ce-4e53-8b6f-fd4de29581cb', u'fixed_ip_address': u'12.0.0.10', u'external_name': u'target_vnf_data_plane_port', u'external_type': u'port', u'mac_address': u'fa:16:3e:81:35:0d'}] 2018-05-28 03:34:40,850 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - server.create() server before transformations: {'meta': {}, 'name': u'target_vnf'} 2018-05-28 03:34:40,850 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Instance relationship target instances: [{u'external_id': u'vnf_test_keypair', u'external_name': u'vnf_test_keypair', u'external_type': u'keypair'}, {u'external_id': u'17fbaf0e-0b5b-46e9-9287-ec3582b8d18f', u'fixed_ip_address': u'11.0.0.13', u'external_name': u'reference_vnf_port', u'external_type': u'port', u'mac_address': u'fa:16:3e:56:a8:b3'}, {u'external_id': u'b6a10873-e767-4ca4-9abc-5113394abc0f', u'fixed_ip_address': u'12.0.0.4', u'external_name': u'reference_vnf_data_plane_port', u'external_type': u'port', u'mac_address': u'fa:16:3e:54:38:59'}] 2018-05-28 03:34:40,851 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - server.create() server before transformations: {'meta': {}, 'name': u'reference_vnf'} 2018-05-28 03:34:46,222 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - server.create() server after transformations: {'name': u'target_vnf', 'key_name': u'vnf_test_keypair', 'image': u'46d91cfa-dcee-4ef1-b441-ab28261511fb', 'meta': {'cloudify_management_network_name': u'management_plane_network', 'cloudify_management_network_id': u'c2e3e4d3-d150-48fa-9227-b89db593637e'}, 'nics': [{'port-id': u'6e6da789-d620-497f-9652-16dd06286974'}, {'port-id': u'a3a9ec9a-77ce-4e53-8b6f-fd4de29581cb'}], 'flavor': u'9f190396-2861-41af-9d34-b84e692e2cbb'} 2018-05-28 03:34:46,222 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Asking Nova to create server. 
All possible parameters are: name,key_name,image,meta,nics,flavor) 2018-05-28 03:34:46,222 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Creating VM with parameters: {'name': u'target_vnf', 'key_name': u'vnf_test_keypair', 'image': u'46d91cfa-dcee-4ef1-b441-ab28261511fb', 'meta': {'cloudify_management_network_name': u'management_plane_network', 'cloudify_management_network_id': u'c2e3e4d3-d150-48fa-9227-b89db593637e'}, 'nics': [{'port-id': u'6e6da789-d620-497f-9652-16dd06286974'}, {'port-id': u'a3a9ec9a-77ce-4e53-8b6f-fd4de29581cb'}], 'flavor': u'9f190396-2861-41af-9d34-b84e692e2cbb'} 2018-05-28 03:34:46,222 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Asking Nova to create server. All possible parameters are: name,key_name,image,meta,nics,flavor) 2018-05-28 03:34:46,223 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Creating VM with parameters: {'name': u'reference_vnf', 'key_name': u'vnf_test_keypair', 'image': u'46d91cfa-dcee-4ef1-b441-ab28261511fb', 'meta': {'cloudify_management_network_name': u'management_plane_network', 'cloudify_management_network_id': u'c2e3e4d3-d150-48fa-9227-b89db593637e'}, 'nics': [{'port-id': u'17fbaf0e-0b5b-46e9-9287-ec3582b8d18f'}, {'port-id': u'b6a10873-e767-4ca4-9abc-5113394abc0f'}], 'flavor': u'9f190396-2861-41af-9d34-b84e692e2cbb'} 2018-05-28 03:34:46,223 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - server.create() server after transformations: {'name': u'reference_vnf', 'key_name': u'vnf_test_keypair', 'image': u'46d91cfa-dcee-4ef1-b441-ab28261511fb', 'meta': {'cloudify_management_network_name': u'management_plane_network', 'cloudify_management_network_id': u'c2e3e4d3-d150-48fa-9227-b89db593637e'}, 'nics': [{'port-id': u'17fbaf0e-0b5b-46e9-9287-ec3582b8d18f'}, {'port-id': u'b6a10873-e767-4ca4-9abc-5113394abc0f'}], 'flavor': u'9f190396-2861-41af-9d34-b84e692e2cbb'} 2018-05-28 03:34:46,223 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'nova_plugin.server.create' 2018-05-28 03:34:51,717 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Configuring node 2018-05-28 03:34:51,717 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'nova_plugin.server.start' 2018-05-28 03:34:51,717 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Starting node 2018-05-28 03:34:51,718 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'nova_plugin.server.start' 2018-05-28 03:34:51,718 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'nova_plugin.server.create' 2018-05-28 03:34:51,718 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Configuring node 2018-05-28 03:34:51,718 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'nova_plugin.server.start' 2018-05-28 03:34:51,718 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Starting node 2018-05-28 03:34:51,718 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'nova_plugin.server.start' 2018-05-28 03:34:57,390 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] 2018-05-28 03:35:02,694 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... 
[retry_after=30] 2018-05-28 03:35:23,920 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'nova_plugin.server.start' [retry 1/60] 2018-05-28 03:35:23,921 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'nova_plugin.server.start' [retry 1/60] 2018-05-28 03:35:29,271 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'nova_plugin.server.start' [retry 1/60] 2018-05-28 03:35:29,271 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Server is ACTIVE 2018-05-28 03:35:29,271 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'nova_plugin.server.start' [retry 1/60] 2018-05-28 03:35:29,271 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'nova_plugin.server.start' [retry 1/60] 2018-05-28 03:35:34,571 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Server is ACTIVE 2018-05-28 03:35:34,571 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'nova_plugin.server.start' [retry 1/60] 2018-05-28 03:35:34,571 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - 'install' workflow execution succeeded 2018-05-28 03:35:34,703 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - {u'status': u'terminated', u'is_system_workflow': False, u'parameters': {}, u'blueprint_id': u'vrouter-opnfv', u'tenant_name': u'default_tenant', u'created_at': u'2018-05-28T03:33:29.720Z', u'created_by': u'admin', u'private_resource': False, u'workflow_id': u'install', u'error': u'', u'deployment_id': u'vrouter-opnfv', u'id': u'f3b916a7-17cd-4bfd-bae7-1581c9f8dcd2'} 2018-05-28 03:35:34,714 - functest.opnfv_tests.vnf.router.vrouter_base - INFO - BGP Interoperability test. 2018-05-28 03:35:34,864 - functest.opnfv_tests.vnf.router.utilvnf - DEBUG - {u'outputs': {u'vnfs': {u'reference_vnf': {u'vnf_name': u'reference_vnf', u'public_key_path': u'~/.ssh/vnf_test_keypair.pem', u'floating_ip': u'172.30.10.123'}, u'target_vnf': {u'vnf_name': u'target_vnf', u'public_key_path': u'~/.ssh/vnf_test_keypair.pem', u'floating_ip': u'172.30.10.125'}}, u'networks': {u'management_plane_network': {u'network_name': u'management_plane_network'}, u'data_plane_network': {u'network_name': u'data_plane_network'}}}, u'deployment_id': u'vrouter-opnfv'} 2018-05-28 03:35:35,014 - functest.opnfv_tests.vnf.router.utilvnf - DEBUG - {u'outputs': {u'vnfs': {u'reference_vnf': {u'vnf_name': u'reference_vnf', u'public_key_path': u'~/.ssh/vnf_test_keypair.pem', u'floating_ip': u'172.30.10.123'}, u'target_vnf': {u'vnf_name': u'target_vnf', u'public_key_path': u'~/.ssh/vnf_test_keypair.pem', u'floating_ip': u'172.30.10.125'}}, u'networks': {u'management_plane_network': {u'network_name': u'management_plane_network'}, u'data_plane_network': {u'network_name': u'data_plane_network'}}}, u'deployment_id': u'vrouter-opnfv'} 2018-05-28 03:35:35,014 - functest.opnfv_tests.vnf.router.utilvnf - DEBUG - vnf name : reference_vnf 2018-05-28 03:35:35,014 - functest.opnfv_tests.vnf.router.utilvnf - DEBUG - reference_vnf floating ip address : 172.30.10.123 2018-05-28 03:35:39,679 - functest.opnfv_tests.vnf.router.utilvnf - DEBUG - management_plane_network_ip of reference_vnf : 11.0.0.13 2018-05-28 03:35:39,679 - functest.opnfv_tests.vnf.router.utilvnf - DEBUG - management_plane_network_mac of reference_vnf : fa:16:3e:56:a8:b3 2018-05-28 03:35:44,006 - functest.opnfv_tests.vnf.router.utilvnf - DEBUG - data_plane_network_ip of reference_vnf : 12.0.0.4 2018-05-28 03:35:44,007 - 
functest.opnfv_tests.vnf.router.utilvnf - DEBUG - data_plane_network_mac of reference_vnf : fa:16:3e:54:38:59 2018-05-28 03:35:44,007 - functest.opnfv_tests.vnf.router.utilvnf - DEBUG - vnf name : target_vnf 2018-05-28 03:35:44,007 - functest.opnfv_tests.vnf.router.utilvnf - DEBUG - target_vnf floating ip address : 172.30.10.125 2018-05-28 03:35:48,250 - functest.opnfv_tests.vnf.router.utilvnf - DEBUG - management_plane_network_ip of target_vnf : 11.0.0.11 2018-05-28 03:35:48,250 - functest.opnfv_tests.vnf.router.utilvnf - DEBUG - management_plane_network_mac of target_vnf : fa:16:3e:03:aa:a0 2018-05-28 03:35:52,416 - functest.opnfv_tests.vnf.router.utilvnf - DEBUG - data_plane_network_ip of target_vnf : 12.0.0.10 2018-05-28 03:35:52,416 - functest.opnfv_tests.vnf.router.utilvnf - DEBUG - data_plane_network_mac of target_vnf : fa:16:3e:81:35:0d 2018-05-28 03:35:52,416 - functest.opnfv_tests.vnf.router.vrouter_base - DEBUG - request vnf's reboot. 2018-05-28 03:35:52,416 - functest.opnfv_tests.vnf.router.utilvnf - DEBUG - reboot the reference_vnf 2018-05-28 03:35:54,932 - functest.opnfv_tests.vnf.router.utilvnf - DEBUG - reboot the target_vnf 2018-05-28 03:36:27,722 - functest.opnfv_tests.vnf.router.test_controller.function_test_exec - DEBUG - init test exec 2018-05-28 03:36:27,735 - functest.opnfv_tests.vnf.router.utilvnf - DEBUG - Downloading the test data. 2018-05-28 03:36:27,755 - functest.opnfv_tests.vnf.router.vnf_controller.vnf_controller - DEBUG - init vnf controller 2018-05-28 03:36:27,765 - functest.opnfv_tests.vnf.router.utilvnf - DEBUG - Downloading the test data. 2018-05-28 03:36:27,782 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - initialize vm controller 2018-05-28 03:36:27,782 - functest.opnfv_tests.vnf.router.vnf_controller.command_generator - DEBUG - init command generator 2018-05-28 03:36:27,791 - functest.opnfv_tests.vnf.router.utilvnf - DEBUG - Downloading the test data. 2018-05-28 03:36:27,852 - functest.opnfv_tests.vnf.router.vrouter_base - INFO - vRouter test Start Time:'2018-05-28 03:36:27' 2018-05-28 03:36:27,852 - functest.opnfv_tests.vnf.router.test_controller.function_test_exec - DEBUG - Start config command target_vnf and reference_vnf 2018-05-28 03:36:27,852 - functest.opnfv_tests.vnf.router.test_controller.function_test_exec - DEBUG - Configuration to target vnf 2018-05-28 03:36:27,861 - functest.opnfv_tests.vnf.router.utilvnf - DEBUG - Downloading the test data. 2018-05-28 03:36:27,881 - functest.opnfv_tests.vnf.router.vnf_controller.ssh_client - INFO - SSH connect to 172.30.10.125. 2018-05-28 03:36:30,932 - functest.opnfv_tests.vnf.router.vnf_controller.ssh_client - INFO - SSH timeout for 172.30.10.125... 2018-05-28 03:37:00,963 - functest.opnfv_tests.vnf.router.vnf_controller.ssh_client - INFO - SSH connect to 172.30.10.125. 2018-05-28 03:37:01,234 - functest.opnfv_tests.vnf.router.vnf_controller.ssh_client - INFO - SSH connection established to 172.30.10.125. 
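The ssh_client lines above show the controller's connect-and-retry pattern against the VNF floating IP: an initial 'SSH connect to 172.30.10.125', a timeout while the rebooted VyOS instance is still coming up, and a successful reconnect about 30 seconds later. A minimal sketch of that pattern, assuming paramiko; the host, user and key values are placeholders and the real functest ssh_client may differ:

    # Connect-with-retry sketch (assumption: paramiko; values are placeholders).
    import time
    import paramiko

    def ssh_connect_with_retry(host, username, key_path, retries=10, wait=30, timeout=10):
        """Open an SSH session, waiting `wait` seconds between failed attempts."""
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        for attempt in range(1, retries + 1):
            try:
                print("SSH connect to %s. (attempt %d)" % (host, attempt))
                client.connect(host, username=username,
                               key_filename=key_path, timeout=timeout)
                print("SSH connection established to %s." % host)
                return client
            except Exception:
                print("SSH timeout for %s..." % host)
                time.sleep(wait)
        raise RuntimeError("could not reach %s after %d attempts" % (host, retries))

    # e.g. ssh = ssh_connect_with_retry("172.30.10.125", "vyos", "/path/to/vnf_test_keypair.pem")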
2018-05-28 03:37:02,265 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : configure 2018-05-28 03:37:03,267 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : configure [edit] vyos@vyos# 2018-05-28 03:37:08,272 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : sudo /sbin/ifconfig eth0 mtu 1450 2018-05-28 03:37:09,274 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : sudo /sbin/ifconfig eth0 mtu 1450 [edit] vyos@vyos# 2018-05-28 03:37:14,279 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : sudo /sbin/ifconfig eth1 mtu 1450 2018-05-28 03:37:15,281 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : sudo /sbin/ifconfig eth1 mtu 1450 [edit] vyos@vyos# 2018-05-28 03:37:20,286 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols static route 10.0.1.0/24 blackhole distance 1 2018-05-28 03:37:21,288 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols static route 10.0.1.0/24 blackhole distance 1 [edit] vyos@vyos# 2018-05-28 03:37:26,294 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols static route 10.0.2.0/24 blackhole distance 1 2018-05-28 03:37:27,296 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols static route 10.0.2.0/24 blackhole distance 1 [edit] vyos@vyos# 2018-05-28 03:37:32,301 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols static route 10.0.3.0/24 blackhole distance 1 2018-05-28 03:37:33,303 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols static route 10.0.3.0/24 blackhole distance 1 [edit] vyos@vyos# 2018-05-28 03:37:38,308 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols static route 10.0.4.0/24 blackhole distance 1 2018-05-28 03:37:39,310 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols static route 10.0.4.0/24 blackhole distance 1 [edit] vyos@vyos# 2018-05-28 03:37:44,315 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols static route 10.0.5.0/24 blackhole distance 1 2018-05-28 03:37:45,316 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols static route 10.0.5.0/24 blackhole distance 1 [edit] vyos@vyos# 2018-05-28 03:37:50,320 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols static route 10.0.6.0/24 blackhole distance 1 2018-05-28 03:37:51,321 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols static route 10.0.6.0/24 blackhole distance 1 [edit] vyos@vyos# 2018-05-28 03:37:56,326 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols static route 10.0.7.0/24 blackhole distance 1 2018-05-28 03:37:57,327 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols static route 10.0.7.0/24 blackhole distance 1 [edit] vyos@vyos# 2018-05-28 03:38:02,332 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols static route 10.0.8.0/24 blackhole distance 1 2018-05-28 03:38:03,333 - 
functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols static route 10.0.8.0/24 blackhole distance 1 [edit] vyos@vyos# 2018-05-28 03:38:08,338 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols static route 10.0.9.0/24 blackhole distance 1 2018-05-28 03:38:09,339 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols static route 10.0.9.0/24 blackhole distance 1 [edit] vyos@vyos# 2018-05-28 03:38:14,342 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols static route 10.0.10.0/24 blackhole distance 1 2018-05-28 03:38:15,343 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols static route 10.0.10.0/24 blackhole distance 1 [edit] vyos@vyos# 2018-05-28 03:38:20,347 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : commit 2018-05-28 03:38:22,350 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : commit [edit] vyos@vyos# 2018-05-28 03:38:27,356 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols bgp 65001 neighbor 12.0.0.4 ebgp-multihop '2' 2018-05-28 03:38:28,357 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols bgp 65001 neighbor 12.0.0.4 ebgp-multihop '2' [edit] vyos@vyos# 2018-05-28 03:38:33,361 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols bgp 65001 neighbor 12.0.0.4 remote-as 65002 2018-05-28 03:38:34,363 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols bgp 65001 neighbor 12.0.0.4 remote-as 65002 [edit] vyos@vyos# 2018-05-28 03:38:39,367 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols bgp 65001 neighbor 12.0.0.4 update-source 12.0.0.10 2018-05-28 03:38:40,369 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols bgp 65001 neighbor 12.0.0.4 update-source 12.0.0.10 [edit] vyos@vyos# 2018-05-28 03:38:45,369 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols bgp 65001 neighbor 12.0.0.4 soft-reconfiguration inbound 2018-05-28 03:38:46,371 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols bgp 65001 neighbor 12.0.0.4 soft-reconfiguration inboun d [edit] vyos@vyos# 2018-05-28 03:38:51,375 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols bgp 65001 neighbor 12.0.0.4 password lab0033 2018-05-28 03:38:52,376 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols bgp 65001 neighbor 12.0.0.4 password lab0033 [edit] vyos@vyos# 2018-05-28 03:38:57,379 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols bgp 65001 network 10.0.1.0/24 2018-05-28 03:38:58,380 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols bgp 65001 network 10.0.1.0/24 [edit] vyos@vyos# 2018-05-28 03:39:03,382 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols bgp 65001 network 10.0.2.0/24 2018-05-28 03:39:04,384 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols bgp 65001 network 10.0.2.0/24 [edit] vyos@vyos# 
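The configuration pushed to the target VNF between 03:37 and 03:40 is a fixed command sequence: enter configure mode, lower the interface MTUs, add ten blackhole static routes (10.0.1.0/24 through 10.0.10.0/24), commit, then declare the eBGP neighbor 12.0.0.4 under AS 65001 and advertise the same ten networks. A sketch of generating that list from a handful of parameters; the helper name and signature are assumptions for illustration, not the actual command_generator API:

    # Illustrative generator of the VyOS command sequence logged above.
    def build_bgp_commands(local_as, peer_as, peer_ip, update_source, router_id,
                           networks, password="lab0033"):
        cmds = ["configure",
                "sudo /sbin/ifconfig eth0 mtu 1450",
                "sudo /sbin/ifconfig eth1 mtu 1450"]
        # Blackhole static routes for the prefixes that will be advertised.
        cmds += ["set protocols static route %s blackhole distance 1" % net
                 for net in networks]
        cmds.append("commit")
        # eBGP neighbor definition, mirroring the logged 'set protocols bgp' lines.
        neigh = "set protocols bgp %d neighbor %s " % (local_as, peer_ip)
        cmds += [neigh + "ebgp-multihop '2'",
                 neigh + "remote-as %d" % peer_as,
                 neigh + "update-source %s" % update_source,
                 neigh + "soft-reconfiguration inbound",
                 neigh + "password %s" % password]
        cmds += ["set protocols bgp %d network %s" % (local_as, net)
                 for net in networks]
        cmds.append("set protocols bgp %d parameters router-id %s" % (local_as, router_id))
        cmds.append("commit")
        return cmds

    # target_vnf example matching this run:
    # build_bgp_commands(65001, 65002, "12.0.0.4", "12.0.0.10", "12.0.0.10",
    #                    ["10.0.%d.0/24" % i for i in range(1, 11)])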
2018-05-28 03:39:09,388 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols bgp 65001 network 10.0.3.0/24 2018-05-28 03:39:10,390 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols bgp 65001 network 10.0.3.0/24 [edit] vyos@vyos# 2018-05-28 03:39:15,395 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols bgp 65001 network 10.0.4.0/24 2018-05-28 03:39:16,397 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols bgp 65001 network 10.0.4.0/24 [edit] vyos@vyos# 2018-05-28 03:39:21,398 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols bgp 65001 network 10.0.5.0/24 2018-05-28 03:39:22,399 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols bgp 65001 network 10.0.5.0/24 [edit] vyos@vyos# 2018-05-28 03:39:27,404 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols bgp 65001 network 10.0.6.0/24 2018-05-28 03:39:28,405 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols bgp 65001 network 10.0.6.0/24 [edit] vyos@vyos# 2018-05-28 03:39:33,408 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols bgp 65001 network 10.0.7.0/24 2018-05-28 03:39:34,410 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols bgp 65001 network 10.0.7.0/24 [edit] vyos@vyos# 2018-05-28 03:39:39,415 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols bgp 65001 network 10.0.8.0/24 2018-05-28 03:39:40,417 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols bgp 65001 network 10.0.8.0/24 [edit] vyos@vyos# 2018-05-28 03:39:45,422 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols bgp 65001 network 10.0.9.0/24 2018-05-28 03:39:46,423 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols bgp 65001 network 10.0.9.0/24 [edit] vyos@vyos# 2018-05-28 03:39:51,428 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols bgp 65001 network 10.0.10.0/24 2018-05-28 03:39:52,430 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols bgp 65001 network 10.0.10.0/24 [edit] vyos@vyos# 2018-05-28 03:39:57,435 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols bgp 65001 parameters router-id 12.0.0.10 2018-05-28 03:39:58,436 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols bgp 65001 parameters router-id 12.0.0.10 [edit] vyos@vyos# 2018-05-28 03:40:03,441 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : commit 2018-05-28 03:40:05,658 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : commit [edit] vyos@vyos# 2018-05-28 03:40:10,725 - functest.opnfv_tests.vnf.router.test_controller.function_test_exec - DEBUG - Configuration to reference vnf 2018-05-28 03:40:10,745 - functest.opnfv_tests.vnf.router.utilvnf - DEBUG - Downloading the test data. 2018-05-28 03:40:10,780 - functest.opnfv_tests.vnf.router.vnf_controller.ssh_client - INFO - SSH connect to 172.30.10.123. 
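From 03:40 the same routine runs against the reference VNF (172.30.10.123) with the BGP roles mirrored: AS 65002, neighbor 12.0.0.10, update-source 12.0.0.4, router-id 12.0.0.4 and networks 10.1.1.0/24 through 10.1.10.0/24. That symmetry can be captured in a single per-VNF parameter table feeding the generator sketched above; the values are copied from this run, while the structure itself is only an illustration, not the functest data model:

    # Per-VNF BGP parameters for this run (values taken from the log).
    VNF_BGP_PARAMS = {
        "target_vnf":    dict(local_as=65001, peer_as=65002, peer_ip="12.0.0.4",
                              update_source="12.0.0.10", router_id="12.0.0.10",
                              networks=["10.0.%d.0/24" % i for i in range(1, 11)]),
        "reference_vnf": dict(local_as=65002, peer_as=65001, peer_ip="12.0.0.10",
                              update_source="12.0.0.4", router_id="12.0.0.4",
                              networks=["10.1.%d.0/24" % i for i in range(1, 11)]),
    }

    # Each router would then receive build_bgp_commands(**VNF_BGP_PARAMS[name])
    # over its management floating IP (172.30.10.125 / 172.30.10.123 here).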
2018-05-28 03:40:11,030 - functest.opnfv_tests.vnf.router.vnf_controller.ssh_client - INFO - SSH connection established to 172.30.10.123. 2018-05-28 03:40:12,061 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : configure 2018-05-28 03:40:13,063 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : configure [edit] vyos@vyos# 2018-05-28 03:40:18,068 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : sudo /sbin/ifconfig eth0 mtu 1450 2018-05-28 03:40:19,070 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : sudo /sbin/ifconfig eth0 mtu 1450 [edit] vyos@vyos# 2018-05-28 03:40:24,076 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : sudo /sbin/ifconfig eth1 mtu 1450 2018-05-28 03:40:25,078 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : sudo /sbin/ifconfig eth1 mtu 1450 [edit] vyos@vyos# 2018-05-28 03:40:30,083 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols static route 10.1.1.0/24 blackhole distance 1 2018-05-28 03:40:31,085 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols static route 10.1.1.0/24 blackhole distance 1 [edit] vyos@vyos# 2018-05-28 03:40:36,090 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols static route 10.1.2.0/24 blackhole distance 1 2018-05-28 03:40:37,092 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols static route 10.1.2.0/24 blackhole distance 1 [edit] vyos@vyos# 2018-05-28 03:40:42,098 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols static route 10.1.3.0/24 blackhole distance 1 2018-05-28 03:40:43,100 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols static route 10.1.3.0/24 blackhole distance 1 [edit] vyos@vyos# 2018-05-28 03:40:48,105 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols static route 10.1.4.0/24 blackhole distance 1 2018-05-28 03:40:49,107 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols static route 10.1.4.0/24 blackhole distance 1 [edit] vyos@vyos# 2018-05-28 03:40:54,113 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols static route 10.1.5.0/24 blackhole distance 1 2018-05-28 03:40:55,115 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols static route 10.1.5.0/24 blackhole distance 1 [edit] vyos@vyos# 2018-05-28 03:41:00,120 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols static route 10.1.6.0/24 blackhole distance 1 2018-05-28 03:41:01,122 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols static route 10.1.6.0/24 blackhole distance 1 [edit] vyos@vyos# 2018-05-28 03:41:06,127 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols static route 10.1.7.0/24 blackhole distance 1 2018-05-28 03:41:07,129 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols static route 10.1.7.0/24 blackhole distance 1 [edit] vyos@vyos# 2018-05-28 03:41:12,135 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - 
DEBUG - Command : set protocols static route 10.1.8.0/24 blackhole distance 1 2018-05-28 03:41:13,137 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols static route 10.1.8.0/24 blackhole distance 1 [edit] vyos@vyos# 2018-05-28 03:41:18,142 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols static route 10.1.9.0/24 blackhole distance 1 2018-05-28 03:41:19,144 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols static route 10.1.9.0/24 blackhole distance 1 [edit] vyos@vyos# 2018-05-28 03:41:24,150 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols static route 10.1.10.0/24 blackhole distance 1 2018-05-28 03:41:25,152 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols static route 10.1.10.0/24 blackhole distance 1 [edit] vyos@vyos# 2018-05-28 03:41:30,157 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : commit 2018-05-28 03:41:32,160 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : commit [edit] vyos@vyos# 2018-05-28 03:41:37,166 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols bgp 65002 neighbor 12.0.0.10 ebgp-multihop '2' 2018-05-28 03:41:38,167 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols bgp 65002 neighbor 12.0.0.10 ebgp-multihop '2' [edit] vyos@vyos# 2018-05-28 03:41:43,172 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols bgp 65002 neighbor 12.0.0.10 remote-as 65001 2018-05-28 03:41:44,173 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols bgp 65002 neighbor 12.0.0.10 remote-as 65001 [edit] vyos@vyos# 2018-05-28 03:41:49,179 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols bgp 65002 neighbor 12.0.0.10 update-source 12.0.0.4 2018-05-28 03:41:50,181 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols bgp 65002 neighbor 12.0.0.10 update-source 12.0.0.4 [edit] vyos@vyos# 2018-05-28 03:41:55,186 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols bgp 65002 neighbor 12.0.0.10 soft-reconfiguration inbound 2018-05-28 03:41:56,188 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols bgp 65002 neighbor 12.0.0.10 soft-reconfiguration inbou nd [edit] vyos@vyos# 2018-05-28 03:42:01,194 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols bgp 65002 neighbor 12.0.0.10 password lab0033 2018-05-28 03:42:02,196 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols bgp 65002 neighbor 12.0.0.10 password lab0033 [edit] vyos@vyos# 2018-05-28 03:42:07,201 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols bgp 65002 network 10.1.1.0/24 2018-05-28 03:42:08,203 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols bgp 65002 network 10.1.1.0/24 [edit] vyos@vyos# 2018-05-28 03:42:13,208 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols bgp 65002 network 10.1.2.0/24 2018-05-28 03:42:14,210 - 
functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols bgp 65002 network 10.1.2.0/24 [edit] vyos@vyos# 2018-05-28 03:42:19,213 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols bgp 65002 network 10.1.3.0/24 2018-05-28 03:42:20,215 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols bgp 65002 network 10.1.3.0/24 [edit] vyos@vyos# 2018-05-28 03:42:25,219 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols bgp 65002 network 10.1.4.0/24 2018-05-28 03:42:26,221 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols bgp 65002 network 10.1.4.0/24 [edit] vyos@vyos# 2018-05-28 03:42:31,226 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols bgp 65002 network 10.1.5.0/24 2018-05-28 03:42:32,228 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols bgp 65002 network 10.1.5.0/24 [edit] vyos@vyos# 2018-05-28 03:42:37,233 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols bgp 65002 network 10.1.6.0/24 2018-05-28 03:42:38,235 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols bgp 65002 network 10.1.6.0/24 [edit] vyos@vyos# 2018-05-28 03:42:43,240 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols bgp 65002 network 10.1.7.0/24 2018-05-28 03:42:44,242 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols bgp 65002 network 10.1.7.0/24 [edit] vyos@vyos# 2018-05-28 03:42:49,246 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols bgp 65002 network 10.1.8.0/24 2018-05-28 03:42:50,248 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols bgp 65002 network 10.1.8.0/24 [edit] vyos@vyos# 2018-05-28 03:42:55,253 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols bgp 65002 network 10.1.9.0/24 2018-05-28 03:42:56,255 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols bgp 65002 network 10.1.9.0/24 [edit] vyos@vyos# 2018-05-28 03:43:01,260 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols bgp 65002 network 10.1.10.0/24 2018-05-28 03:43:02,262 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols bgp 65002 network 10.1.10.0/24 [edit] vyos@vyos# 2018-05-28 03:43:07,268 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : set protocols bgp 65002 parameters router-id 12.0.0.4 2018-05-28 03:43:08,270 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : set protocols bgp 65002 parameters router-id 12.0.0.4 [edit] vyos@vyos# 2018-05-28 03:43:13,271 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Command : commit 2018-05-28 03:43:15,538 - functest.opnfv_tests.vnf.router.vnf_controller.vm_controller - DEBUG - Response : commit [edit] vyos@vyos# 2018-05-28 03:43:20,608 - functest.opnfv_tests.vnf.router.test_controller.function_test_exec - DEBUG - Finish config command. 
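Every 'Command :' / 'Response :' pair in the configuration phase above is one command written to an interactive shell on the VyOS VM and its echoed output read back roughly a second later. A crude stand-in for that loop, assuming a paramiko client like the one sketched earlier; the pacing, buffer size and prompt handling are assumptions, not the real vm_controller logic:

    # Push a command list over an interactive shell and log the echoes (sketch).
    import time

    def run_commands(ssh_client, commands, wait=1):
        shell = ssh_client.invoke_shell()        # interactive paramiko channel
        time.sleep(wait)
        shell.recv(65535)                        # drain banner and initial prompt
        for cmd in commands:
            shell.send(cmd + "\n")
            time.sleep(wait)                     # roughly the ~1 s gaps seen in the log
            response = shell.recv(65535).decode("utf-8", "replace")
            print("Command : %s" % cmd)
            print("Response : %s" % response.strip())
        shell.close()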
2018-05-28 03:43:20,608 - functest.opnfv_tests.vnf.router.test_controller.function_test_exec - DEBUG - Waiting for protocol stable. 2018-05-28 03:44:20,652 - functest.opnfv_tests.vnf.router.test_controller.function_test_exec - DEBUG - Start check method 2018-05-28 03:44:20,671 - functest.opnfv_tests.vnf.router.utilvnf - DEBUG - Downloading the test data. 2018-05-28 03:44:20,707 - functest.opnfv_tests.vnf.router.vnf_controller.ssh_client - INFO - SSH connect to 172.30.10.125. 2018-05-28 03:44:20,940 - functest.opnfv_tests.vnf.router.vnf_controller.ssh_client - INFO - SSH connection established to 172.30.10.125. 2018-05-28 03:44:21,947 - functest.opnfv_tests.vnf.router.vnf_controller.checker - DEBUG - init checker 2018-05-28 03:44:52,106 - functest.opnfv_tests.vnf.router.vnf_controller.vnf_controller - INFO - Test result: +-------------------------------+----------------+ | TEST ITEM | RESULT | +-------------------------------+----------------+ | Check bgp peer | OK | | Check bgp status | OK | | Check route advertise | OK | | Check route receive | OK | | Check route table | OK | +-------------------------------+----------------+ 2018-05-28 03:44:52,106 - functest.opnfv_tests.vnf.router.vnf_controller.vnf_controller - DEBUG - show ip bgp summary | no-more BGP router identifier 12.0.0.10, local AS number 65001 IPv4 Unicast - max multipaths: ebgp 1 ibgp 1 RIB entries 39, using 3744 bytes of memory Peers 1, using 4560 bytes of memory Neighbor V AS MsgRcvd MsgSent TblVer InQ OutQ Up/Down State/PfxRcd 12.0.0.4 4 65002 3 5 0 0 0 00:01:04 10 Total number of neighbors 1 vyos@vyos:~$ 2018-05-28 03:44:52,107 - functest.opnfv_tests.vnf.router.vnf_controller.vnf_controller - DEBUG - show ip bgp neighbors 12.0.0.4 | no-more BGP neighbor is 12.0.0.4, remote AS 65002, local AS 65001, external link BGP version 4, remote router ID 12.0.0.4 BGP state = Established, up for 00:01:10 Last read 20:23:52, hold time is 180, keepalive interval is 60 seconds Neighbor capabilities: 4 Byte AS: advertised and received Route refresh: advertised and received(old & new) Address family IPv4 Unicast: advertised and received Message statistics: Inq depth is 0 Outq depth is 0 Sent Rcvd Opens: 1 0 Notifications: 0 0 Updates: 1 1 Keepalives: 3 2 Route Refresh: 0 0 Capability: 0 0 Total: 5 3 Minimum time between advertisement runs is 30 seconds Update source is 12.0.0.10 For address family: IPv4 Unicast Inbound soft reconfiguration allowed Community attribute sent to this neighbor(both) 10 accepted prefixes Connections established 1; dropped 0 Last reset never External BGP neighbor may be up to 2 hops away. Local host: 12.0.0.10, Local port: 179 Foreign host: 12.0.0.4, Foreign port: 42117 Nexthop: 12.0.0.10 Nexthop global: fe80::f816:3eff:fe81:350d Nexthop local: :: BGP connection: non shared network Read thread: on Write thread: off vyos@vyos:~$ 2018-05-28 03:44:52,107 - functest.opnfv_tests.vnf.router.vnf_controller.vnf_controller - DEBUG - show ip bgp neighbors 12.0.0.4 advertised-routes | no-more BGP table version is 0, local router ID is 12.0.0.10 Status codes: s suppressed, d damped, h history, * valid, > best, i - internal, r RIB-failure, S Stale, R Removed Origin codes: i - IGP, e - EGP, ? 
- incomplete Network Next Hop Metric LocPrf Weight Path *> 10.0.1.0/24 12.0.0.10 0 32768 i *> 10.0.2.0/24 12.0.0.10 0 32768 i *> 10.0.3.0/24 12.0.0.10 0 32768 i *> 10.0.4.0/24 12.0.0.10 0 32768 i *> 10.0.5.0/24 12.0.0.10 0 32768 i *> 10.0.6.0/24 12.0.0.10 0 32768 i *> 10.0.7.0/24 12.0.0.10 0 32768 i *> 10.0.8.0/24 12.0.0.10 0 32768 i *> 10.0.9.0/24 12.0.0.10 0 32768 i *> 10.0.10.0/24 12.0.0.10 0 32768 i Total number of prefixes 10 vyos@vyos:~$ 2018-05-28 03:44:52,107 - functest.opnfv_tests.vnf.router.vnf_controller.vnf_controller - DEBUG - show ip bgp neighbors 12.0.0.4 received-routes | no-more BGP table version is 0, local router ID is 12.0.0.10 Status codes: s suppressed, d damped, h history, * valid, > best, i - internal, r RIB-failure, S Stale, R Removed Origin codes: i - IGP, e - EGP, ? - incomplete Network Next Hop Metric LocPrf Weight Path *> 10.1.1.0/24 12.0.0.4 0 0 65002 i *> 10.1.2.0/24 12.0.0.4 0 0 65002 i *> 10.1.3.0/24 12.0.0.4 0 0 65002 i *> 10.1.4.0/24 12.0.0.4 0 0 65002 i *> 10.1.5.0/24 12.0.0.4 0 0 65002 i *> 10.1.6.0/24 12.0.0.4 0 0 65002 i *> 10.1.7.0/24 12.0.0.4 0 0 65002 i *> 10.1.8.0/24 12.0.0.4 0 0 65002 i *> 10.1.9.0/24 12.0.0.4 0 0 65002 i *> 10.1.10.0/24 12.0.0.4 0 0 65002 i Total number of prefixes 10 vyos@vyos:~$ 2018-05-28 03:44:52,107 - functest.opnfv_tests.vnf.router.vnf_controller.vnf_controller - DEBUG - show ip bgp neighbors 12.0.0.4 routes | no-more BGP table version is 0, local router ID is 12.0.0.10 Status codes: s suppressed, d damped, h history, * valid, > best, i - internal, r RIB-failure, S Stale, R Removed Origin codes: i - IGP, e - EGP, ? - incomplete Network Next Hop Metric LocPrf Weight Path *> 10.1.1.0/24 12.0.0.4 0 0 65002 i *> 10.1.2.0/24 12.0.0.4 0 0 65002 i *> 10.1.3.0/24 12.0.0.4 0 0 65002 i *> 10.1.4.0/24 12.0.0.4 0 0 65002 i *> 10.1.5.0/24 12.0.0.4 0 0 65002 i *> 10.1.6.0/24 12.0.0.4 0 0 65002 i *> 10.1.7.0/24 12.0.0.4 0 0 65002 i *> 10.1.8.0/24 12.0.0.4 0 0 65002 i *> 10.1.9.0/24 12.0.0.4 0 0 65002 i *> 10.1.10.0/24 12.0.0.4 0 0 65002 i Total number of prefixes 10 vyos@vyos:~$ 2018-05-28 03:44:52,107 - functest.opnfv_tests.vnf.router.test_controller.function_test_exec - DEBUG - Finish check method. 
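The checker's five OK verdicts are derived from the 'show ip bgp ...' outputs dumped above: in 'show ip bgp summary' a numeric State/PfxRcd field for the neighbor means the session is Established and gives the number of prefixes received (10 here), and the advertised-routes/received-routes listings are matched against the ten expected networks on each side. A rough parsing sketch under those assumptions (not the actual checker code):

    import re

    def parse_bgp_summary(output, peer_ip):
        """Return (established, prefixes_received) for peer_ip from 'show ip bgp summary'."""
        for line in output.splitlines():
            fields = line.split()
            if fields and fields[0] == peer_ip:
                state = fields[-1]               # State/PfxRcd column
                # A number means Established; otherwise the field is the FSM state (Idle, Active, ...).
                return (True, int(state)) if state.isdigit() else (False, 0)
        return False, 0

    def check_routes(output, expected_networks):
        """True if every expected prefix appears as a valid/best route in the listing."""
        return all(re.search(r"\*>\s+%s\s" % re.escape(net), output)
                   for net in expected_networks)

    # e.g. parse_bgp_summary(summary_text, "12.0.0.4") -> (True, 10) for this run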
2018-05-28 03:44:52,107 - functest.opnfv_tests.vnf.router.vrouter_base - INFO - vRouter test duration :'504.3' 2018-05-28 03:44:52,108 - functest.opnfv_tests.vnf.router.utilvnf - DEBUG - delete the reference_vnf 2018-05-28 03:44:54,538 - functest.opnfv_tests.vnf.router.utilvnf - DEBUG - delete the target_vnf 2018-05-28 03:44:57,252 - xtesting.core.testcase - INFO - The results were successfully pushed to DB http://testresults.opnfv.org/test/api/v1/results 2018-05-28 03:44:57,254 - xtesting.ci.run_tests - INFO - Test result: +----------------------+------------------+------------------+----------------+ | TEST CASE | PROJECT | DURATION | RESULT | +----------------------+------------------+------------------+----------------+ | vyos_vrouter | functest | 19:42 | PASS | +----------------------+------------------+------------------+----------------+ 2018-05-28 03:44:57,257 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - Deleting the current deployment 2018-05-28 03:45:03,276 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Starting 'uninstall' workflow execution 2018-05-28 03:45:03,276 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'nova_plugin.server.stop' 2018-05-28 03:45:03,276 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Stopping node 2018-05-28 03:45:03,276 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Stopping node 2018-05-28 03:45:03,276 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'nova_plugin.server.stop' 2018-05-28 03:45:03,277 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'nova_plugin.server.stop' 2018-05-28 03:45:03,277 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'nova_plugin.server.stop' 2018-05-28 03:45:08,618 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task failed 'nova_plugin.server.stop' -> Instance 974ebe72-ee8c-43b0-a199-35d88637d711 could not be found. [status_code=404] 2018-05-28 03:45:08,618 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.port.detach' 2018-05-28 03:45:08,618 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task failed 'nova_plugin.server.stop' -> Instance a9c493be-5118-43a0-a14c-8ada55202699 could not be found. [status_code=404] 2018-05-28 03:45:08,619 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.port.detach' 2018-05-28 03:45:13,966 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Detaching port a3a9ec9a-77ce-4e53-8b6f-fd4de29581cb... 2018-05-28 03:45:13,966 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Detaching port b6a10873-e767-4ca4-9abc-5113394abc0f... 
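Right after the PASS verdict the blueprint's 'uninstall' workflow starts, and its nova_plugin.server.stop tasks immediately fail with 'Instance ... could not be found [status_code=404]', apparently because the two VNF servers had already been removed by the test's own teardown ('delete the reference_vnf' / 'delete the target_vnf' at 03:44:52). A generic sketch of making such cleanup idempotent by treating 404/NotFound as already-gone; the client handle and exception class are assumptions for illustration, not the actual plugin code:

    # Idempotent server cleanup sketch. `nova` stands for a novaclient-like handle;
    # python-novaclient and its NotFound exception are assumed to be available.
    from novaclient import exceptions as nova_exc

    def stop_and_delete(nova, server_id):
        for action in ("stop", "delete"):
            try:
                getattr(nova.servers, action)(server_id)
            except nova_exc.NotFound:
                # 404: the instance was deleted earlier (e.g. by the test teardown),
                # so treat it as already cleaned up instead of aborting the cleanup.
                print("Instance %s could not be found, skipping %s" % (server_id, action))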
2018-05-28 03:45:13,967 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.port.detach' 2018-05-28 03:45:13,967 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Successfully detached port a3a9ec9a-77ce-4e53-8b6f-fd4de29581cb 2018-05-28 03:45:13,967 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Successfully detached port b6a10873-e767-4ca4-9abc-5113394abc0f 2018-05-28 03:45:13,967 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.port.detach' 2018-05-28 03:45:13,967 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.port.detach' 2018-05-28 03:45:13,967 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.port.detach' 2018-05-28 03:45:19,286 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Detaching port 6e6da789-d620-497f-9652-16dd06286974... 2018-05-28 03:45:19,286 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Detaching port 17fbaf0e-0b5b-46e9-9287-ec3582b8d18f... 2018-05-28 03:45:19,286 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Successfully detached port 17fbaf0e-0b5b-46e9-9287-ec3582b8d18f 2018-05-28 03:45:19,286 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.port.detach' 2018-05-28 03:45:19,286 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.port.detach' 2018-05-28 03:45:19,286 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Successfully detached port 6e6da789-d620-497f-9652-16dd06286974 2018-05-28 03:45:19,287 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Deleting node 2018-05-28 03:45:19,287 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Deleting node 2018-05-28 03:45:19,287 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'nova_plugin.server.delete' 2018-05-28 03:45:19,287 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'nova_plugin.server.delete' 2018-05-28 03:45:19,287 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'nova_plugin.server.delete' 2018-05-28 03:45:19,287 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'nova_plugin.server.delete' 2018-05-28 03:45:19,287 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - deleting server 2018-05-28 03:45:19,287 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - deleting server 2018-05-28 03:45:24,875 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task failed 'nova_plugin.server.delete' -> Instance 974ebe72-ee8c-43b0-a199-35d88637d711 could not be found. [status_code=404] 2018-05-28 03:45:24,876 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task failed 'nova_plugin.server.delete' -> Instance a9c493be-5118-43a0-a14c-8ada55202699 could not be found. 
[status_code=404] 2018-05-28 03:45:24,876 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Stopping node 2018-05-28 03:45:24,876 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Stopping node 2018-05-28 03:45:24,876 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.floatingip.disconnect_port' 2018-05-28 03:45:24,876 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Stopping node 2018-05-28 03:45:24,876 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Stopping node 2018-05-28 03:45:24,876 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Stopping node 2018-05-28 03:45:24,877 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.floatingip.disconnect_port' 2018-05-28 03:45:30,637 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Deleting node 2018-05-28 03:45:30,637 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Deleting node 2018-05-28 03:45:30,637 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Deleting node 2018-05-28 03:45:30,638 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.port.delete' 2018-05-28 03:45:30,638 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.port.delete' 2018-05-28 03:45:30,638 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'nova_plugin.keypair.delete' 2018-05-28 03:45:30,638 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.port.delete' 2018-05-28 03:45:30,638 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.port.delete' 2018-05-28 03:45:30,638 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'nova_plugin.keypair.delete' 2018-05-28 03:45:30,638 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.floatingip.disconnect_port' 2018-05-28 03:45:30,638 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - deleting port 2018-05-28 03:45:30,638 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - deleting keypair 2018-05-28 03:45:30,639 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - deleting port 2018-05-28 03:45:30,639 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - deleting private key file at /etc/cloudify/.ssh/vnf_test_keypair.pem 2018-05-28 03:45:30,639 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Deleting node 2018-05-28 03:45:30,639 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.port.delete' 2018-05-28 03:45:30,639 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.port.delete' 2018-05-28 03:45:30,639 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.floatingip.disconnect_port' 2018-05-28 03:45:30,639 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'nova_plugin.keypair.delete' 2018-05-28 03:45:36,135 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Deleting node 2018-05-28 03:45:36,136 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.port.delete' 2018-05-28 03:45:36,136 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.port.delete' 2018-05-28 03:45:36,136 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 
'neutron_plugin.port.delete' 2018-05-28 03:45:36,136 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - deleting port 2018-05-28 03:45:36,136 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Stopping node 2018-05-28 03:45:36,136 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.port.delete' 2018-05-28 03:45:36,136 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - deleting port 2018-05-28 03:45:36,136 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Deleting node 2018-05-28 03:45:36,137 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.subnet.delete' 2018-05-28 03:45:36,137 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.subnet.delete' 2018-05-28 03:45:36,137 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.port.delete' 2018-05-28 03:45:36,137 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Stopping node 2018-05-28 03:45:36,137 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - deleting subnet 2018-05-28 03:45:36,137 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Deleting node 2018-05-28 03:45:36,137 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.floatingip.delete' 2018-05-28 03:45:36,137 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.floatingip.delete' 2018-05-28 03:45:36,137 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.port.delete' 2018-05-28 03:45:41,505 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.router.disconnect_subnet' 2018-05-28 03:45:41,506 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Stopping node 2018-05-28 03:45:41,506 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Stopping node 2018-05-28 03:45:41,506 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Stopping node 2018-05-28 03:45:41,506 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.router.disconnect_subnet' 2018-05-28 03:45:41,506 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - deleting floatingip 2018-05-28 03:45:41,506 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Deleting node 2018-05-28 03:45:41,506 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Deleting node 2018-05-28 03:45:41,506 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.floatingip.delete' 2018-05-28 03:45:41,507 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.security_group.delete' 2018-05-28 03:45:41,507 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.floatingip.delete' 2018-05-28 03:45:41,507 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.security_group.delete' 2018-05-28 03:45:41,507 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.floatingip.delete' 2018-05-28 03:45:41,507 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - deleting floatingip 2018-05-28 03:45:41,507 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.subnet.delete' 2018-05-28 03:45:46,875 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - deleting security_group 2018-05-28 
03:45:46,876 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Stopping node 2018-05-28 03:45:46,876 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Deleting node 2018-05-28 03:45:46,876 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.network.delete' 2018-05-28 03:45:46,876 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.network.delete' 2018-05-28 03:45:46,876 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.floatingip.delete' 2018-05-28 03:45:46,876 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.security_group.delete' 2018-05-28 03:45:46,876 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - deleting network 2018-05-28 03:45:46,876 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.router.disconnect_subnet' 2018-05-28 03:45:46,877 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Deleting node 2018-05-28 03:45:46,877 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.subnet.delete' 2018-05-28 03:45:46,877 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.subnet.delete' 2018-05-28 03:45:52,189 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.network.delete' 2018-05-28 03:45:52,189 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - deleting subnet 2018-05-28 03:45:57,618 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.subnet.delete' 2018-05-28 03:45:57,618 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Stopping node 2018-05-28 03:45:57,618 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Stopping node 2018-05-28 03:45:57,618 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Deleting node 2018-05-28 03:45:57,618 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Deleting node 2018-05-28 03:45:57,619 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.network.delete' 2018-05-28 03:45:57,619 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.router.delete' 2018-05-28 03:45:57,619 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.network.delete' 2018-05-28 03:45:57,619 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.router.delete' 2018-05-28 03:46:03,129 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - deleting network 2018-05-28 03:46:03,129 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - deleting router 2018-05-28 03:46:03,130 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.network.delete' 2018-05-28 03:46:03,130 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.router.delete' 2018-05-28 03:46:03,130 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Stopping node 2018-05-28 03:46:08,444 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Deleting node 2018-05-28 03:46:08,444 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Sending task 'neutron_plugin.network.delete' 2018-05-28 03:46:08,444 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task started 'neutron_plugin.network.delete' 2018-05-28 03:46:08,444 - 
functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - not deleting network since an external network is being used 2018-05-28 03:46:08,445 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Task succeeded 'neutron_plugin.network.delete' 2018-05-28 03:46:08,445 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - 'uninstall' workflow execution succeeded 2018-05-28 03:46:09,539 - functest.core.vnf - INFO - Removing the VNF resources .. 2018-05-28 03:46:47,969 - xtesting.ci.run_tests - INFO - Running test case 'juju_epc'... 2018-05-28 03:46:48,064 - functest.opnfv_tests.vnf.epc.juju_epc - DEBUG - VNF configuration: {'descriptor': {u'file_name': u'/src/epc-requirements/abot_charm/functest-abot-epc-bundle/bundle.yaml', u'version': u'1', u'name': u'abot-oai-epc'}, 'requirements': {u'flavor': {u'ram_min': 4096, u'name': u'm1.medium.juju'}}} 2018-05-28 03:46:48,088 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Additional pre-configuration steps 2018-05-28 03:46:48,088 - functest.core.vnf - INFO - Prepare VNF: juju_epc, description: Created by OPNFV Functest: juju_epc 2018-05-28 03:46:51,743 - functest.core.vnf - DEBUG - snaps creds: OSCreds - username=juju_epc-55686fa2-d6b3-4d63-b400-cd844eaefd18, password=252bf79b-691a-49ab-b2fa-43bbfd9ce989, auth_url=http://10.167.4.35:35357/v3, project_name=juju_epc-55686fa2-d6b3-4d63-b400-cd844eaefd18, identity_api_version=3.0, image_api_version=2.0, network_api_version=2.0, compute_api_version=2.0, heat_api_version=1, user_domain_id=default, user_domain_name=Default, project_domain_id=default, project_domain_name=Default, interface=internal, region_name=RegionOne, proxy_settings=None, cacert=/etc/ssl/certs/mcp_os_cacert 2018-05-28 03:46:51,745 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - ENV: +--------------------------------------+----------------------------------------------------------+ | ENV VAR | VALUE | +--------------------------------------+----------------------------------------------------------+ | DEPLOY_SCENARIO | os-nosdn-nofeature-ha | | BUILD_TAG | jenkins-functest-fuel-baremetal-daily-master-231 | | SDN_CONTROLLER_IP | | | ENERGY_RECORDER_API_PASSWORD | | | INSTALLER_TYPE | fuel | | NAMESERVER | 8.8.8.8 | | POD_ARCH | x86_64 | | CI_LOOP | daily | | TEST_DB_URL | http://testresults.opnfv.org/test/api/v1/results | | ENERGY_RECORDER_API_URL | http://energy.opnfv.fr/resources | | NODE_NAME | lf-pod2 | | VOLUME_DEVICE_NAME | vdc | | EXTERNAL_NETWORK | | | ENERGY_RECORDER_API_USER | | +--------------------------------------+----------------------------------------------------------+ 2018-05-28 03:46:52,568 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Creating Cloud for Abot-epc ..... 2018-05-28 03:46:53,811 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - juju add-cloud abot-epc -f /home/opnfv/functest/results/juju_epc/clouds.yaml --replace Since Juju 2 is being run for the first time, downloading latest cloud information. Fetching latest public cloud list... Updated your list of public clouds with 6 cloud regions added: added cloud region: - aws/eu-west-3 - google/asia-south1 - google/europe-west2 - google/europe-west3 - google/southamerica-east1 - google/us-east4 2018-05-28 03:46:53,811 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Creating Credentials for Abot-epc ..... 
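Note on the step logged just above: "juju add-cloud abot-epc -f /home/opnfv/functest/results/juju_epc/clouds.yaml --replace" registers the OpenStack endpoint (the OS_* variables sourced at the start of this run) as a Juju cloud before bootstrapping. The clouds.yaml content itself is not printed in this log; the snippet below is only a minimal, hypothetical sketch of producing such a file from those variables and invoking the same command. The field layout follows the public Juju clouds.yaml schema and the path is the one logged above; everything else is an assumption, not the functest implementation.

    # Hypothetical sketch: build a minimal Juju clouds.yaml for the OpenStack
    # endpoint from the OS_* variables sourced earlier, then register it.
    import os
    import subprocess
    import yaml  # PyYAML

    auth_url = os.environ.get("OS_AUTH_URL", "http://10.167.4.35:35357/v3")
    region = os.environ.get("OS_REGION_NAME", "RegionOne")

    clouds = {
        "clouds": {
            "abot-epc": {
                "type": "openstack",
                "auth-types": ["userpass"],
                "endpoint": auth_url,
                "regions": {region: {"endpoint": auth_url}},
            }
        }
    }

    path = "/home/opnfv/functest/results/juju_epc/clouds.yaml"
    with open(path, "w") as f:
        yaml.safe_dump(clouds, f, default_flow_style=False)

    # Same call as logged above.
    subprocess.check_call(["juju", "add-cloud", "abot-epc", "-f", path, "--replace"])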
2018-05-28 03:46:55,799 - functest.opnfv_tests.vnf.epc.juju_epc - DEBUG - snaps creds: OSCreds - username=juju_network_discovery_bug, password=8536c7b7-6586-46a8-8c29-f20d99e39483, auth_url=http://10.167.4.35:35357/v3, project_name=juju_epc-55686fa2-d6b3-4d63-b400-cd844eaefd18, identity_api_version=3.0, image_api_version=2.0, network_api_version=2.0, compute_api_version=2.0, heat_api_version=1, user_domain_id=default, user_domain_name=Default, project_domain_id=default, project_domain_name=Default, interface=internal, region_name=RegionOne, proxy_settings=None, cacert=/etc/ssl/certs/mcp_os_cacert 2018-05-28 03:46:56,026 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - juju add-credential abot-epc -f /home/opnfv/functest/results/juju_epc/credentials.yaml --replace Credentials updated for cloud "abot-epc". 2018-05-28 03:46:56,027 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Deploying Juju Orchestrator 2018-05-28 03:46:56,028 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Creating full network with nameserver: 8.8.8.8 2018-05-28 03:47:01,736 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Creating network Router .... 2018-05-28 03:47:13,165 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Creating Flavor .... 2018-05-28 03:47:14,394 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Upload some OS images if it doesn't exist 2018-05-28 03:47:14,405 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Images needed for vEPC: {u'trusty': u'/home/opnfv/functest/images/ubuntu-14.04-server-cloudimg-amd64-disk1.img', u'xenial': u'/home/opnfv/functest/images/ubuntu-16.04-server-cloudimg-amd64-disk1.img'} 2018-05-28 03:47:14,405 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - image: trusty, file: /home/opnfv/functest/images/ubuntu-14.04-server-cloudimg-amd64-disk1.img 2018-05-28 03:47:25,034 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - juju metadata generate-image -d /root -i 5673ec9b-0bee-4c44-9b28-72f2bbe076f5 -s trusty -r RegionOne -u https://172.30.10.101:5000/v3 WARNING model could not be opened: No controllers registered. Please either create a new controller using "juju bootstrap" or connect to another controller that you have been given access to using "juju register". Image metadata files have been written to: /root/images/streams/v1. For Juju to use this metadata, the files need to be put into the image metadata search path. There are 2 options: 1. Use the --metadata-source parameter when bootstrapping: juju bootstrap --metadata-source /root 2. Use image-metadata-url in $JUJU_DATA/environments.yaml (if $JUJU_DATA is not set it will try $XDG_DATA_HOME/juju and if not set either default to ~/.local/share/juju) Configure a http server to serve the contents of /root and set the value of image-metadata-url accordingly. 2018-05-28 03:47:25,035 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - image: xenial, file: /home/opnfv/functest/images/ubuntu-16.04-server-cloudimg-amd64-disk1.img 2018-05-28 03:47:35,985 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - juju metadata generate-image -d /root -i 616f36a0-a63f-4bdb-8b78-3bf079b7e53d -s xenial -r RegionOne -u https://172.30.10.101:5000/v3 WARNING model could not be opened: No controllers registered. Please either create a new controller using "juju bootstrap" or connect to another controller that you have been given access to using "juju register". Image metadata files have been written to: /root/images/streams/v1. For Juju to use this metadata, the files need to be put into the image metadata search path. There are 2 options: 1. 
Use the --metadata-source parameter when bootstrapping: juju bootstrap --metadata-source /root 2. Use image-metadata-url in $JUJU_DATA/environments.yaml (if $JUJU_DATA is not set it will try $XDG_DATA_HOME/juju and if not set either default to ~/.local/share/juju) Configure a http server to serve the contents of /root and set the value of image-metadata-url accordingly. 2018-05-28 03:47:35,986 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Network ID : dc4d35b3-89be-430d-8542-25b22b5f83eb 2018-05-28 03:47:35,986 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Starting Juju Bootstrap process... 2018-05-28 03:51:38,015 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - timeout -t 3600 juju bootstrap abot-epc abot-controller --metadata-source /root --constraints mem=2G --bootstrap-series xenial --config network=dc4d35b3-89be-430d-8542-25b22b5f83eb --config ssl-hostname-verification=false --config use-floating-ip=true --config use-default-secgroup=true --debug 03:47:36 INFO juju.cmd supercommand.go:63 running juju [2.2.5 gc go1.9.4] 03:47:36 DEBUG juju.cmd supercommand.go:64 args: []string{"juju", "bootstrap", "abot-epc", "abot-controller", "--metadata-source", "/root", "--constraints", "mem=2G", "--bootstrap-series", "xenial", "--config", "network=dc4d35b3-89be-430d-8542-25b22b5f83eb", "--config", "ssl-hostname-verification=false", "--config", "use-floating-ip=true", "--config", "use-default-secgroup=true", "--debug"} 03:47:36 DEBUG juju.cmd.juju.commands bootstrap.go:804 authenticating with region "" and credential "abot-epc" () 03:47:36 DEBUG juju.cmd.juju.commands bootstrap.go:932 provider attrs: map[network:dc4d35b3-89be-430d-8542-25b22b5f83eb external-network: use-floating-ip:true use-default-secgroup:true] 03:47:36 INFO cmd authkeys.go:114 Adding contents of "/root/.local/share/juju/ssh/juju_id_rsa.pub" to authorized-keys 03:47:36 DEBUG juju.cmd.juju.commands bootstrap.go:988 preparing controller with config: map[automatically-retry-hooks:true ftp-proxy: ignore-machine-addresses:false use-default-secgroup:true provisioner-harvest-mode:destroyed type:openstack agent-metadata-url: use-floating-ip:true apt-http-proxy: agent-stream:released http-proxy: uuid:17b2c0fe-8d44-47fa-84e4-667d8f071968 development:false image-stream:released proxy-ssh:false logforward-enabled:false disable-network-management:false network:dc4d35b3-89be-430d-8542-25b22b5f83eb test-mode:false max-status-history-age:336h authorized-keys:ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC30YzTwh3fkAsr/OqUfhDQ0IhHXWDfdH4m91LZx6kFy/73Q2iHgfmNcdrjIl/N81E8jgx+S7c4tSydXYXmCxYzcDVU+9OB2tHZWKlsv+ZzKTxx2CPqnHbbqOBIjamXSpQXAgv3LilW6Jww9RUQBD1/dl4ZFejBMa6Ct90uyx5pOPwXohOlvaEZiWYpf0HarKl3NfGFdF+k0oNj2JvHKoIHdUCEgt+tqHnir1SgFwtONvg3Baho5gEKaSouvniQffhCy6+jlqUdagTUeb/Io0r7t7s4kXK1wsNAVgFXPP33Va/hurKH/+neEitMMr2I3azCcmJ6uB5VakKlS2/MP/ir juju-client-key no-proxy:127.0.0.1,localhost,::1 apt-https-proxy: name:controller ssl-hostname-verification:false net-bond-reconfigure-delay:17 apt-ftp-proxy: external-network: image-metadata-url: firewall-mode:instance transmit-vendor-metrics:true update-status-hook-interval:5m default-series:xenial max-action-results-age:336h max-status-history-size:5G apt-no-proxy: https-proxy: resource-tags: apt-mirror: max-action-results-size:5G enable-os-refresh-update:true enable-os-upgrade:true logging-config:] 03:47:36 INFO juju.provider.openstack provider.go:144 opening model "controller" 03:47:37 DEBUG goose :1 auth details: 
&{Token:gAAAAABbC3vZxIzhvVJigEhL3NcL7YMm4qvO-A6jVxDhdCh6XDs89nKz4-U8W45FFVGljfbTtah-bQ4e0TJNyFIzmk8DWJLKAIuL01bLOOWevR0miAizPxAcVrqvhpQEYFFO73uiCVfrviuZaYCiifglje207TAfmA3MUpWZtHahwvxtVW9fsLU TenantId:3b12091bf57d4f2bbe141fc160c84724 UserId:c3331d4f03a649bda60653191ee9cfea Domain: RegionServiceURLs:map[RegionOne:map[volumev3:https://172.30.10.101:8776/v3/3b12091bf57d4f2bbe141fc160c84724 compute:https://172.30.10.101:8774/v2.1/3b12091bf57d4f2bbe141fc160c84724 orchestration:https://172.30.10.101:8004/v1/3b12091bf57d4f2bbe141fc160c84724 placement:https://172.30.10.101:8778 image:https://172.30.10.101:9292 compute_legacy:https://172.30.10.101:8774/v2/3b12091bf57d4f2bbe141fc160c84724 dns:https://172.30.10.101:9001/ volumev2:https://172.30.10.101:8776/v2/3b12091bf57d4f2bbe141fc160c84724 event:https://172.30.10.101:8977/ cloudformation:https://172.30.10.101:8000/v1 alarming:https://172.30.10.101:8042/ network:https://172.30.10.101:9696/ identity:https://172.30.10.101:5000/v3 metering:https://172.30.10.101:8777/ metric:https://172.30.10.101:8041/ volume:https://172.30.10.101:8776/v1/3b12091bf57d4f2bbe141fc160c84724]]} 03:47:37 INFO cmd bootstrap.go:482 Creating Juju controller "abot-controller" on abot-epc/RegionOne 03:47:37 DEBUG goose :1 performing API version discovery for "https://172.30.10.101:8774/" 03:47:38 DEBUG goose :1 discovered API versions: [{Version:{major:2 minor:0} Links:[{Href:https://172.30.10.101:8774/v2/ Rel:self}] Status:SUPPORTED} {Version:{major:2 minor:1} Links:[{Href:https://172.30.10.101:8774/v2.1/ Rel:self}] Status:CURRENT}] 03:47:38 INFO juju.cmd.juju.commands bootstrap.go:540 combined bootstrap constraints: mem=2048M 03:47:38 DEBUG juju.environs.bootstrap bootstrap.go:199 model "controller" supports service/machine networks: true 03:47:38 DEBUG juju.environs.bootstrap bootstrap.go:201 network management by juju enabled: true 03:47:38 DEBUG juju.environs.bootstrap bootstrap.go:685 no agent directory found, using default agent binary metadata source: https://streams.canonical.com/juju/tools 03:47:38 DEBUG juju.environs.bootstrap bootstrap.go:710 setting default image metadata source: /root/images 03:47:38 DEBUG juju.environs imagemetadata.go:46 new user image datasource registered: bootstrap metadata 03:47:38 INFO juju.environs.bootstrap bootstrap.go:728 custom image metadata added to search path 03:47:38 INFO cmd bootstrap.go:233 Loading image metadata 03:47:38 DEBUG juju.environs imagemetadata.go:112 obtained image datasource "bootstrap metadata" 03:47:38 DEBUG juju.environs imagemetadata.go:112 obtained image datasource "default cloud images" 03:47:38 DEBUG juju.environs imagemetadata.go:112 obtained image datasource "default ubuntu cloud images" 03:47:38 DEBUG juju.environs.bootstrap bootstrap.go:576 constraints for image metadata lookup &{{{RegionOne https://172.30.10.101:5000/v3} [win2016hv win2016nano opensuseleap precise win81 vivid quantal trusty yakkety zesty win2008r2 win8 raring win2012 win7 win10 centos7 saucy utopic win2012hv win2012r2 win2016 genericlinux wily xenial win2012hvr2] [amd64 i386 armhf arm64 ppc64el s390x] released}} 03:47:38 DEBUG juju.environs.bootstrap bootstrap.go:588 found 2 image metadata in bootstrap metadata 03:47:40 DEBUG juju.environs.bootstrap bootstrap.go:588 found 0 image metadata in default cloud images 03:47:41 DEBUG juju.environs.simplestreams simplestreams.go:457 skipping index "http://cloud-images.ubuntu.com/releases/streams/v1/index.sjson" because of missing information: index file has no data for cloud {RegionOne 
https://172.30.10.101:5000/v3} not found 03:47:41 DEBUG juju.environs.bootstrap bootstrap.go:584 ignoring image metadata in default ubuntu cloud images: index file has no data for cloud {RegionOne https://172.30.10.101:5000/v3} not found 03:47:41 DEBUG juju.environs.bootstrap bootstrap.go:592 found 2 image metadata from all image data sources 03:47:41 INFO cmd bootstrap.go:296 Looking for packaged Juju agent version 2.2.5 for amd64 03:47:41 INFO juju.environs.bootstrap tools.go:72 looking for bootstrap agent binaries: version=2.2.5 03:47:41 DEBUG juju.environs.tools tools.go:101 finding agent binaries in stream "released" 03:47:41 DEBUG juju.environs.tools tools.go:103 reading agent binaries with major.minor version 2.2 03:47:41 DEBUG juju.environs.tools tools.go:111 filtering agent binaries by version: 2.2.5 03:47:41 DEBUG juju.environs.tools tools.go:114 filtering agent binaries by series: xenial 03:47:41 DEBUG juju.environs.tools tools.go:117 filtering agent binaries by architecture: amd64 03:47:41 DEBUG juju.environs.tools urls.go:109 trying datasource "keystone catalog" 03:47:42 DEBUG juju.environs.simplestreams simplestreams.go:683 using default candidate for content id "com.ubuntu.juju:released:tools" are {20161007 mirrors:1.0 content-download streams/v1/cpc-mirrors.sjson []} 03:47:46 INFO juju.environs.bootstrap tools.go:74 found 1 packaged agent binaries 03:47:46 INFO cmd bootstrap.go:357 Starting new instance for initial controller Launching controller instance(s) on abot-epc/RegionOne... 03:47:48 DEBUG juju.environs.instances image.go:64 instance constraints {region: RegionOne, series: xenial, arches: [amd64], constraints: mem=2048M, storage: []} 03:47:48 DEBUG juju.environs.instances image.go:70 matching constraints {region: RegionOne, series: xenial, arches: [amd64], constraints: mem=2048M, storage: []} against possible image metadata [{Id:616f36a0-a63f-4bdb-8b78-3bf079b7e53d Arch:amd64 VirtType:}] 03:47:48 INFO juju.environs.instances image.go:106 find instance - using image with id: 616f36a0-a63f-4bdb-8b78-3bf079b7e53d 03:47:48 DEBUG juju.cloudconfig.instancecfg instancecfg.go:832 Setting numa ctl preference to false 03:47:48 DEBUG juju.service discovery.go:63 discovered init system "systemd" from series "xenial" 03:47:48 DEBUG juju.provider.openstack provider.go:1010 openstack user data; 2490 bytes 03:47:48 DEBUG juju.provider.openstack provider.go:1022 using network id "dc4d35b3-89be-430d-8542-25b22b5f83eb" 03:47:48 DEBUG goose :1 performing API version discovery for "https://172.30.10.101:9696/" 03:47:48 DEBUG goose :1 discovered API versions: [{Version:{major:2 minor:0} Links:[{Href:http://172.30.10.101:9696/v2.0/ Rel:self}] Status:CURRENT}] 03:47:56 INFO juju.provider.openstack provider.go:1146 trying to build instance in availability zone "nova" - instance "ca46f676-d1a6-49e4-a09f-45290cdc6dce" has status BUILD, wait 10 seconds before retry, attempt 1 - instance "ca46f676-d1a6-49e4-a09f-45290cdc6dce" has status BUILD, wait 10 seconds before retry, attempt 2 03:48:24 INFO juju.provider.openstack provider.go:1189 started instance "ca46f676-d1a6-49e4-a09f-45290cdc6dce" 03:48:24 DEBUG juju.provider.openstack provider.go:1193 allocating public IP address for openstack node 03:48:28 DEBUG juju.provider.openstack networking.go:216 allocated new public IP: 172.30.10.113 03:48:28 INFO juju.provider.openstack provider.go:1198 allocated public IP 172.30.10.113 - ca46f676-d1a6-49e4-a09f-45290cdc6dce (arch=amd64 mem=2G cores=1) 03:48:33 INFO juju.environs.bootstrap bootstrap.go:606 
newest version: 2.2.5 03:48:33 INFO juju.environs.bootstrap bootstrap.go:621 picked bootstrap agent binary version: 2.2.5 03:48:33 INFO juju.environs.bootstrap bootstrap.go:393 Installing Juju agent on bootstrap instance 03:48:36 INFO cmd bootstrap.go:485 Fetching Juju GUI 2.12.3 03:48:36 DEBUG juju.cloudconfig.instancecfg instancecfg.go:832 Setting numa ctl preference to false Waiting for address 03:48:38 DEBUG juju.provider.openstack provider.go:414 instance ca46f676-d1a6-49e4-a09f-45290cdc6dce has floating IP address: 172.30.10.113 Attempting to connect to 172.30.10.113:22 Attempting to connect to 172.16.0.6:22 03:48:38 DEBUG juju.provider.common bootstrap.go:497 connection attempt for 172.16.0.6 failed: ssh: connect to host 172.16.0.6 port 22: Connection refused 03:48:39 DEBUG juju.provider.common bootstrap.go:497 connection attempt for 172.30.10.113 failed: ssh: connect to host 172.30.10.113 port 22: Connection refused 03:48:43 DEBUG juju.provider.common bootstrap.go:497 connection attempt for 172.16.0.6 failed: ssh: connect to host 172.16.0.6 port 22: Connection refused 03:48:44 DEBUG juju.provider.common bootstrap.go:497 connection attempt for 172.30.10.113 failed: ssh: connect to host 172.30.10.113 port 22: Connection refused 03:48:48 DEBUG juju.provider.common bootstrap.go:497 connection attempt for 172.16.0.6 failed: ssh: connect to host 172.16.0.6 port 22: Connection refused 03:48:48 DEBUG juju.provider.openstack provider.go:414 instance ca46f676-d1a6-49e4-a09f-45290cdc6dce has floating IP address: 172.30.10.113 03:48:50 INFO juju.cloudconfig userdatacfg_unix.go:410 Fetching agent: curl -sSfw 'tools from %{url_effective} downloaded: HTTP %{http_code}; time %{time_total}s; size %{size_download} bytes; speed %{speed_download} bytes/s ' --retry 10 --insecure -o $bin/tools.tar.gz <[https://streams.canonical.com/juju/tools/agent/2.2.5/juju-2.2.5-ubuntu-amd64.tgz]> sudo: unable to resolve host juju-071968-controller-0 03:51:30 INFO cmd bootstrap.go:423 Bootstrap agent now started 03:51:32 DEBUG juju.provider.openstack provider.go:414 instance ca46f676-d1a6-49e4-a09f-45290cdc6dce has floating IP address: 172.30.10.113 03:51:32 INFO juju.juju api.go:308 API endpoints changed from [] to [172.30.10.113:17070 172.16.0.6:17070] 03:51:32 INFO cmd controller.go:82 Contacting Juju controller at 172.30.10.113 to verify accessibility... 
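For reference, the bootstrap invocation logged above hands the controller the private network ID, the floating-IP and default-security-group behaviour, and the image metadata generated earlier with "juju metadata generate-image -d /root" (hence "--metadata-source /root"). As a hedged illustration only, the same command can be assembled and run from Python roughly as below; the flags, timeout value and network ID are copied from the log, the error handling is an assumption.

    # Illustrative sketch of the bootstrap step seen in the log above.
    import subprocess

    network_id = "dc4d35b3-89be-430d-8542-25b22b5f83eb"  # from the log above

    cmd = [
        "timeout", "-t", "3600",
        "juju", "bootstrap", "abot-epc", "abot-controller",
        "--metadata-source", "/root",        # image metadata generated earlier
        "--constraints", "mem=2G",
        "--bootstrap-series", "xenial",
        "--config", "network=%s" % network_id,
        "--config", "ssl-hostname-verification=false",
        "--config", "use-floating-ip=true",
        "--config", "use-default-secgroup=true",
        "--debug",
    ]
    try:
        subprocess.check_call(cmd)
    except subprocess.CalledProcessError as exc:
        raise RuntimeError("juju bootstrap failed: %s" % exc)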
03:51:32 INFO juju.juju api.go:67 connecting to API addresses: [172.30.10.113:17070 172.16.0.6:17070] 03:51:34 DEBUG juju.api apiclient.go:863 successfully dialed "wss://172.30.10.113:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" 03:51:34 INFO juju.api apiclient.go:617 connection established to "wss://172.30.10.113:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" 03:51:35 DEBUG juju.api monitor.go:35 RPC connection died 03:51:35 INFO cmd controller.go:110 Still waiting for API to become available: upgrade in progress (upgrade in progress) 03:51:35 INFO juju.juju api.go:67 connecting to API addresses: [172.30.10.113:17070 172.16.0.6:17070] 03:51:35 DEBUG juju.api apiclient.go:863 successfully dialed "wss://172.30.10.113:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" 03:51:35 INFO juju.api apiclient.go:617 connection established to "wss://172.30.10.113:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" 03:51:35 DEBUG juju.api monitor.go:35 RPC connection died 03:51:35 INFO cmd controller.go:110 Still waiting for API to become available: upgrade in progress (upgrade in progress) 03:51:36 INFO juju.juju api.go:67 connecting to API addresses: [172.30.10.113:17070 172.16.0.6:17070] 03:51:36 DEBUG juju.api apiclient.go:863 successfully dialed "wss://172.30.10.113:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" 03:51:36 INFO juju.api apiclient.go:617 connection established to "wss://172.30.10.113:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" 03:51:37 DEBUG juju.api monitor.go:35 RPC connection died 03:51:37 INFO cmd controller.go:87 Bootstrap complete, "abot-controller" controller now available. 03:51:37 INFO cmd controller.go:88 Controller machines are in the "controller" model. 03:51:37 INFO cmd controller.go:89 Initial model "default" added. 03:51:37 INFO cmd supercommand.go:465 command finished 2018-05-28 03:51:38,016 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Upload VNFD 2018-05-28 03:51:38,016 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Get or create flavor for all Abot-EPC 2018-05-28 03:51:39,012 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Deploying Abot-epc bundle file ... 2018-05-28 03:52:12,454 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - juju deploy /src/epc-requirements/abot_charm/functest-abot-epc-bundle/bundle.yaml Deploying charm "local:xenial/abot-epc-basic-1" application abot-epc-basic exposed Deploying charm "cs:mysql-55" Deploying charm "local:trusty/oai-epc-3" Deploying charm "local:trusty/oai-hss-13" Related "mysql:db" and "oai-hss:db" Related "oai-hss:hss" and "oai-epc:hss" Related "oai-epc:epc" and "abot-epc-basic:epc" Related "oai-epc:ssh-abot-epc" and "abot-epc-basic:ssh-abot" Deploy of bundle completed. 2018-05-28 03:52:12,455 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Waiting for instances ..... 2018-05-28 04:33:52,010 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - timeout -t 3600 juju-wait INFO:root:All units idle since 2018-05-28 04:33:34.894705Z (abot-epc-basic/0, mysql/0, oai-epc/0, oai-hss/0) 2018-05-28 04:33:52,011 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Deployed Abot-epc on Openstack 2018-05-28 04:33:52,011 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Checking status of ABot and EPC units ... 
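The deploy phase logged above is two commands: "juju deploy <bundle.yaml>" to stand up the charms and their relations, then "timeout -t 3600 juju-wait" to block until every unit reports idle. A minimal sketch under the same assumptions (bundle path copied from the log, return-code handling assumed):

    # Minimal sketch of the logged deploy-and-wait sequence.
    import subprocess

    BUNDLE = "/src/epc-requirements/abot_charm/functest-abot-epc-bundle/bundle.yaml"

    # Deploy the ABot/OAI-EPC bundle (charms, relations, exposure) in one call.
    subprocess.check_call(["juju", "deploy", BUNDLE])

    # Block until all units settle; juju-wait exits non-zero if a unit errors.
    subprocess.check_call(["timeout", "-t", "3600", "juju-wait"])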
2018-05-28 04:33:52,287 - functest.opnfv_tests.vnf.epc.juju_epc - DEBUG - juju status Model Controller Cloud/Region Version SLA default abot-controller abot-epc/RegionOne 2.2.5 unsupported App Version Status Scale Charm Store Rev OS Notes abot-epc-basic active 1 abot-epc-basic local 1 ubuntu exposed mysql unknown 1 mysql jujucharms 55 ubuntu oai-epc active 1 oai-epc local 3 ubuntu oai-hss active 1 oai-hss local 13 ubuntu Unit Workload Agent Machine Public address Ports Message abot-epc-basic/0* active idle 0 172.30.10.124 80/tcp,5000/tcp ABot ready! EPC relation established, proceed to run tests... mysql/0* unknown idle 1 172.30.10.116 3306/tcp oai-epc/0* active idle 2 172.30.10.123 2152/udp OAI EPC is running and connected to HSS oai-hss/0* active idle 3 172.30.10.125 OAI HSS is running Machine State DNS Inst id Series AZ Message 0 started 172.30.10.124 43448662-7c7c-48a3-ad73-05456049cce1 xenial nova ACTIVE 1 started 172.30.10.116 1fece328-da5d-4278-ac2c-02beae330eff trusty nova ACTIVE 2 started 172.30.10.123 6f1214cb-47d1-4adb-880d-33edfbc43798 trusty nova ACTIVE 3 started 172.30.10.125 0be9e95c-6280-4df5-80e9-9bc13c9a8aed trusty nova ACTIVE Relation provider Requirer Interface Type abot-epc-basic:ssh-abot oai-epc:ssh-abot-epc ssh regular mysql:cluster mysql:cluster mysql-ha peer mysql:db oai-hss:db mysql regular oai-epc:epc abot-epc-basic:epc S1-C regular oai-hss:hss oai-epc:hss S6a-hss regular 2018-05-28 04:33:52,517 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - juju status --format short abot-epc-basic - abot-epc-basic/0: 172.30.10.124 (agent:idle, workload:active) 80/tcp, 5000/tcp 2018-05-28 04:33:52,518 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - abot-epc-basic workload is active 2018-05-28 04:33:52,783 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - juju status --format short oai-epc - oai-epc/0: 172.30.10.123 (agent:idle, workload:active) 2152/udp 2018-05-28 04:33:52,783 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - oai-epc workload is active 2018-05-28 04:33:53,047 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - juju status --format short oai-hss - oai-hss/0: 172.30.10.125 (agent:idle, workload:active) 2018-05-28 04:33:53,048 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - oai-hss workload is active 2018-05-28 04:33:56,213 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - List of Instance: [, , , , , ] 2018-05-28 04:33:57,335 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Instance: juju-9b3dc0a2-de1a-4778-8b41-5ab41ef63482-185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4 2018-05-28 04:33:57,335 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Adding Security group rule.... 2018-05-28 04:33:58,889 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Transferring the feature files to Abot_node ... 
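The per-application checks above shell out to "juju status --format short <app>" and look for an active workload in lines such as "abot-epc-basic/0: 172.30.10.124 (agent:idle, workload:active) 80/tcp, 5000/tcp". Below is a hedged sketch of that kind of check; the substring test is an assumption for illustration, not the exact functest logic.

    # Sketch: report whether every unit of an application shows workload:active
    # in the short-format status output quoted in the log above.
    import subprocess

    def workload_active(application):
        out = subprocess.check_output(
            ["juju", "status", "--format", "short", application])
        lines = [l for l in out.decode().splitlines() if application in l]
        return bool(lines) and all("workload:active" in l for l in lines)

    for app in ("abot-epc-basic", "oai-epc", "oai-hss"):
        state = "active" if workload_active(app) else "not active"
        print("%s workload is %s" % (app, state))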
2018-05-28 04:33:59,768 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - timeout -t 3600 juju scp -- -r -v /usr/lib/python2.7/site-packages/functest/opnfv_tests/vnf/epc/featureFiles abot-epc-basic/0:~/ 2018-05-28 04:33:59,769 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Copying the feature files within Abot_node 2018-05-28 04:34:00,605 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - timeout -t 3600 juju ssh abot-epc-basic/0 sudo cp -vfR ~/featureFiles/* /etc/rebaca-test-suite/featureFiles sudo: unable to resolve host juju-2cc5d4-default-0 '/home/ubuntu/featureFiles/000-local-commands.feature' -> '/etc/rebaca-test-suite/featureFiles/000-local-commands.feature' '/home/ubuntu/featureFiles/Attach_Procedure_Attach_With_GUTI.feature' -> '/etc/rebaca-test-suite/featureFiles/Attach_Procedure_Attach_With_GUTI.feature' '/home/ubuntu/featureFiles/Attach_Procedure_AttachWithIMSI.feature' -> '/etc/rebaca-test-suite/featureFiles/Attach_Procedure_AttachWithIMSI.feature' '/home/ubuntu/featureFiles/Attach_Procedure_AttachWithIMSI_Wrong_MCC.feature' -> '/etc/rebaca-test-suite/featureFiles/Attach_Procedure_AttachWithIMSI_Wrong_MCC.feature' '/home/ubuntu/featureFiles/Attach_Procedure_AttachWithIMSI_Wrong_MNC.feature' -> '/etc/rebaca-test-suite/featureFiles/Attach_Procedure_AttachWithIMSI_Wrong_MNC.feature' '/home/ubuntu/featureFiles/Attach_Procedure_AttachWithIMSI_Wrong_TAC.feature' -> '/etc/rebaca-test-suite/featureFiles/Attach_Procedure_AttachWithIMSI_Wrong_TAC.feature' '/home/ubuntu/featureFiles/Attach_Procedure_With_DNS_Server_Addr_Request.feature' -> '/etc/rebaca-test-suite/featureFiles/Attach_Procedure_With_DNS_Server_Addr_Request.feature' '/home/ubuntu/featureFiles/Attach_Procedure_With_DSMIPv6_HA_Addr_Request.feature' -> '/etc/rebaca-test-suite/featureFiles/Attach_Procedure_With_DSMIPv6_HA_Addr_Request.feature' '/home/ubuntu/featureFiles/Attach_Procedure_With_DSMIPv6_HN_Prefix_Request.feature' -> '/etc/rebaca-test-suite/featureFiles/Attach_Procedure_With_DSMIPv6_HN_Prefix_Request.feature' '/home/ubuntu/featureFiles/Attach_Procedure_With_DSMIPv6_IPv4_HA_Addr_Request.feature' -> '/etc/rebaca-test-suite/featureFiles/Attach_Procedure_With_DSMIPv6_IPv4_HA_Addr_Request.feature' '/home/ubuntu/featureFiles/Attach_Procedure_With_IM_CN_SS_Signalling_Flag.feature' -> '/etc/rebaca-test-suite/featureFiles/Attach_Procedure_With_IM_CN_SS_Signalling_Flag.feature' '/home/ubuntu/featureFiles/Attach_Procedure_With_IP_Addr_Alloc_DHCPv4.feature' -> '/etc/rebaca-test-suite/featureFiles/Attach_Procedure_With_IP_Addr_Alloc_DHCPv4.feature' '/home/ubuntu/featureFiles/Attach_Procedure_With_IP_Addr_Alloc_NAS_Signalling.feature' -> '/etc/rebaca-test-suite/featureFiles/Attach_Procedure_With_IP_Addr_Alloc_NAS_Signalling.feature' '/home/ubuntu/featureFiles/Attach_Procedure_With_MS_Support_Bearer_Ctrl_Indication.feature' -> '/etc/rebaca-test-suite/featureFiles/Attach_Procedure_With_MS_Support_Bearer_Ctrl_Indication.feature' '/home/ubuntu/featureFiles/Attach_Procedure_With_P_CSCF_Addr_Request.feature' -> '/etc/rebaca-test-suite/featureFiles/Attach_Procedure_With_P_CSCF_Addr_Request.feature' '/home/ubuntu/featureFiles/Auth_Accept.feature' -> '/etc/rebaca-test-suite/featureFiles/Auth_Accept.feature' '/home/ubuntu/featureFiles/Auth_NotAccept_by_Network_GUTIattach_AuthReject_re_Auth.feature' -> '/etc/rebaca-test-suite/featureFiles/Auth_NotAccept_by_Network_GUTIattach_AuthReject_re_Auth.feature' '/home/ubuntu/featureFiles/Auth_NotAccept_by_Network_GUTIattach_IMSIdiff_AuthReject_re_Auth.feature' -> 
'/etc/rebaca-test-suite/featureFiles/Auth_NotAccept_by_Network_GUTIattach_IMSIdiff_AuthReject_re_Auth.feature' '/home/ubuntu/featureFiles/Auth_NotAccept_by_Network_IMSIattach_AuthReject_re_Auth.feature' -> '/etc/rebaca-test-suite/featureFiles/Auth_NotAccept_by_Network_IMSIattach_AuthReject_re_Auth.feature' '/home/ubuntu/featureFiles/Auth_NotAccept_by_UE_GUTIattach_MAC_code_failure.feature' -> '/etc/rebaca-test-suite/featureFiles/Auth_NotAccept_by_UE_GUTIattach_MAC_code_failure.feature' '/home/ubuntu/featureFiles/Auth_NotAccept_by_UE_IMSIattach_MAC_code_failure.feature' -> '/etc/rebaca-test-suite/featureFiles/Auth_NotAccept_by_UE_IMSIattach_MAC_code_failure.feature' '/home/ubuntu/featureFiles/Auth_NotAccept_by_UE_non_EPS_AuthUnaccpt.feature' -> '/etc/rebaca-test-suite/featureFiles/Auth_NotAccept_by_UE_non_EPS_AuthUnaccpt.feature' '/home/ubuntu/featureFiles/Auth_NotAccept_by_UE_SQN_failure.feature' -> '/etc/rebaca-test-suite/featureFiles/Auth_NotAccept_by_UE_SQN_failure.feature' '/home/ubuntu/featureFiles/ResourceBundle.xml' -> '/etc/rebaca-test-suite/featureFiles/ResourceBundle.xml' Connection to 172.30.10.124 closed. 2018-05-28 04:34:00,606 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Running VNF Test cases.... 2018-05-28 04:34:01,111 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - juju run-action abot-epc-basic/0 run tagnames=TS_24_301 Action queued with id: c13a2385-ade1-4a2a-8a1a-a221db3dfb06 2018-05-28 04:46:07,723 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - timeout -t 3600 juju-wait INFO:root:All units idle since 2018-05-28 04:45:50.597522Z (abot-epc-basic/0, mysql/0, oai-epc/0, oai-hss/0) 2018-05-28 04:46:07,723 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Getting results from Abot node.... 2018-05-28 04:46:08,379 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - timeout -t 3600 juju scp -- -v abot-epc-basic/0:/var/lib/abot-epc-basic/artifacts/TestResults.json /home/opnfv/functest/results/juju_epc/. 2018-05-28 04:46:08,379 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Parsing the Test results... 2018-05-28 04:46:08,410 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - {'failures': 22, 'skipped': 0, 'passed': 256} 2018-05-28 04:46:08,411 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Test VNF result: Passed: 256, Failed:22, Skipped: 0 2018-05-28 04:46:08,642 - xtesting.core.testcase - INFO - The results were successfully pushed to DB http://testresults.opnfv.org/test/api/v1/results 2018-05-28 04:46:08,643 - xtesting.ci.run_tests - INFO - Test result: +-------------------+------------------+------------------+----------------+ | TEST CASE | PROJECT | DURATION | RESULT | +-------------------+------------------+------------------+----------------+ | juju_epc | functest | 59:20 | PASS | +-------------------+------------------+------------------+----------------+ 2018-05-28 04:46:10,903 - functest.opnfv_tests.vnf.epc.juju_epc - DEBUG - juju debug-log --replay --no-tail machine-2: 03:55:25 INFO juju.cmd running jujud [2.2.5 gc go1.8] machine-2: 03:55:25 DEBUG juju.cmd args: []string{"/var/lib/juju/tools/machine-2/jujud", "machine", "--data-dir", "/var/lib/juju", "--machine-id", "2", "--debug"} machine-2: 03:55:25 DEBUG juju.agent read agent config, format "2.0" machine-2: 03:55:25 DEBUG juju.wrench couldn't read wrench directory: stat /var/lib/juju/wrench: no such file or directory machine-2: 03:55:25 INFO juju.worker.upgradesteps upgrade steps for 2.2.5 have already been run. 
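On the result-collection step shown above: after "juju run-action abot-epc-basic/0 run tagnames=TS_24_301" and the "juju scp" of TestResults.json, the run is summarised as {'failures': 22, 'skipped': 0, 'passed': 256}. The JSON schema itself is not visible in this log, so the sketch below covers only the final step, turning such a summary into a pass rate; the 80 % threshold is an illustrative assumption, not the project's acceptance criteria.

    # Sketch: compute a pass rate from the summary dict logged above.
    # The threshold is illustrative, not the functest acceptance criteria.
    summary = {"failures": 22, "skipped": 0, "passed": 256}

    executed = summary["passed"] + summary["failures"]
    pass_rate = 100.0 * summary["passed"] / executed if executed else 0.0

    print("Passed: %(passed)d, Failed: %(failures)d, Skipped: %(skipped)d" % summary)
    print("Pass rate: %.1f%%" % pass_rate)   # 92.1% for this run
    assert pass_rate >= 80.0, "pass rate below illustrative threshold"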
machine-2: 03:55:25 DEBUG juju.worker start "engine" machine-2: 03:55:25 INFO juju.worker start "engine" machine-2: 03:55:25 DEBUG juju.worker.dependency "upgrade-check-flag" manifold worker stopped: "upgrade-check-gate" not running: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "termination-signal-handler" manifold worker started machine-2: 03:55:25 DEBUG juju.worker.dependency "host-key-reporter" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "upgrade-steps-gate" manifold worker started machine-2: 03:55:25 DEBUG juju.worker.dependency "state" manifold worker stopped: "agent" not running: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "agent" manifold worker started machine-2: 03:55:25 DEBUG juju.worker.dependency "upgrader" manifold worker stopped: "agent" not running: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "upgrade-steps-runner" manifold worker stopped: "agent" not running: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "proxy-config-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "api-address-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "serving-info-setter" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "unconverted-api-workers" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "disk-manager" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "upgrade-steps-flag" manifold worker started machine-2: 03:55:25 DEBUG juju.worker.dependency "machiner" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "ssh-authkeys-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "migration-minion" manifold worker stopped: "api-caller" not running: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "migration-fortress" manifold worker stopped: "upgrade-check-flag" not running: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "logging-config-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "upgrade-check-gate" manifold worker started machine-2: 03:55:25 DEBUG juju.worker.dependency "api-config-watcher" manifold worker started machine-2: 03:55:25 DEBUG juju.worker.dependency "log-sender" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "storage-provisioner" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "unconverted-state-workers" manifold worker stopped: "state" not running: dependency not available machine-2: 03:55:25 DEBUG juju.worker.introspection introspection worker listening on "@jujud-machine-2" machine-2: 03:55:25 DEBUG juju.worker 
"engine" started machine-2: 03:55:25 DEBUG juju.worker.dependency "migration-inactive-flag" manifold worker stopped: "api-caller" not running: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "upgrader" manifold worker stopped: machine-2: 03:55:25 DEBUG juju.worker.introspection stats worker now serving machine-2: 03:55:25 DEBUG juju.worker.apicaller connecting with old password machine-2: 03:55:25 DEBUG juju.worker.dependency "unit-agent-deployer" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "tools-version-checker" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "state-config-watcher" manifold worker started machine-2: 03:55:25 DEBUG juju.worker.dependency "central-hub" manifold worker stopped: "state-config-watcher" not running: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "reboot-executor" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "mgo-txn-resumer" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "ssh-identity-writer" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "machine-action-runner" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "state" manifold worker stopped: machine-2: 03:55:25 DEBUG juju.worker.dependency "upgrade-check-flag" manifold worker started machine-2: 03:55:25 DEBUG juju.worker.dependency "migration-fortress" manifold worker stopped: machine-2: 03:55:25 DEBUG juju.worker.dependency "machiner" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "upgrade-steps-runner" manifold worker stopped: "api-caller" not running: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "serving-info-setter" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-2: 03:55:25 DEBUG juju.api successfully dialed "wss://172.16.0.6:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" machine-2: 03:55:25 INFO juju.api connection established to "wss://172.16.0.6:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" machine-2: 03:55:25 DEBUG juju.api successfully dialed "wss://172.30.10.113:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" machine-2: 03:55:25 DEBUG juju.worker.dependency "disk-manager" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "migration-fortress" manifold worker stopped: "upgrade-check-flag" not set: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "host-key-reporter" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "upgrader" manifold worker stopped: "api-caller" not running: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "api-address-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "state" manifold worker 
stopped: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "proxy-config-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-2: 03:55:25 DEBUG juju.worker.dependency "central-hub" manifold worker stopped: dependency not available machine-2: 03:55:26 DEBUG juju.worker.apicaller connected machine-2: 03:55:26 DEBUG juju.worker.apicaller changing password... machine-2: 03:55:26 DEBUG juju.worker.apicaller password changed machine-2: 03:55:26 DEBUG juju.api RPC connection died machine-2: 03:55:26 DEBUG juju.worker.dependency "api-caller" manifold worker stopped: restart immediately machine-2: 03:55:26 DEBUG juju.worker.apicaller connecting with current password machine-2: 03:55:26 DEBUG juju.api successfully dialed "wss://172.30.10.113:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" machine-2: 03:55:26 INFO juju.api connection established to "wss://172.30.10.113:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" machine-2: 03:55:26 DEBUG juju.api successfully dialed "wss://172.16.0.6:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" machine-2: 03:55:27 DEBUG juju.worker.apicaller connected machine-2: 03:55:27 DEBUG juju.worker.dependency "api-caller" manifold worker started machine-2: 03:55:27 DEBUG juju.worker.dependency "mgo-txn-resumer" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-2: 03:55:27 DEBUG juju.worker.dependency "unconverted-api-workers" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-2: 03:55:27 DEBUG juju.worker.dependency "host-key-reporter" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-2: 03:55:27 DEBUG juju.worker.dependency "api-address-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-2: 03:55:27 DEBUG juju.worker.dependency "migration-inactive-flag" manifold worker started machine-2: 03:55:27 DEBUG juju.worker.dependency "machine-action-runner" manifold worker stopped: machine-2: 03:55:27 DEBUG juju.worker.dependency "proxy-config-updater" manifold worker stopped: machine-2: 03:55:27 DEBUG juju.worker.dependency "serving-info-setter" manifold worker stopped: machine-2: 03:55:27 DEBUG juju.worker.dependency "disk-manager" manifold worker stopped: machine-2: 03:55:27 DEBUG juju.worker.dependency "machiner" manifold worker stopped: machine-2: 03:55:27 DEBUG juju.worker.dependency "ssh-authkeys-updater" manifold worker stopped: machine-2: 03:55:27 DEBUG juju.worker.dependency "logging-config-updater" manifold worker stopped: machine-2: 03:55:27 DEBUG juju.worker.dependency "log-sender" manifold worker stopped: machine-2: 03:55:27 DEBUG juju.worker.dependency "storage-provisioner" manifold worker stopped: machine-2: 03:55:27 DEBUG juju.worker.dependency "unit-agent-deployer" manifold worker stopped: machine-2: 03:55:27 DEBUG juju.worker.dependency "tools-version-checker" manifold worker stopped: machine-2: 03:55:27 DEBUG juju.worker.dependency "reboot-executor" manifold worker stopped: machine-2: 03:55:27 DEBUG juju.worker.dependency "ssh-identity-writer" manifold worker stopped: machine-2: 03:55:27 DEBUG juju.worker.dependency "mgo-txn-resumer" manifold worker stopped: "migration-fortress" not running: dependency not available machine-2: 03:55:27 DEBUG juju.worker.dependency "api-address-updater" manifold worker stopped: "migration-fortress" not running: dependency not available 
machine-2: 03:55:27 DEBUG juju.worker.dependency "serving-info-setter" manifold worker stopped: "migration-fortress" not running: dependency not available machine-2: 03:55:27 DEBUG juju.worker.dependency "unit-agent-deployer" manifold worker stopped: "migration-fortress" not running: dependency not available machine-2: 03:55:27 DEBUG juju.worker.dependency "log-sender" manifold worker stopped: "migration-fortress" not running: dependency not available machine-2: 03:55:27 DEBUG juju.worker.dependency "machine-action-runner" manifold worker stopped: "migration-fortress" not running: dependency not available machine-2: 03:55:27 DEBUG juju.worker.dependency "storage-provisioner" manifold worker stopped: "migration-fortress" not running: dependency not available machine-2: 03:55:27 DEBUG juju.worker.dependency "ssh-identity-writer" manifold worker stopped: "migration-fortress" not running: dependency not available machine-2: 03:55:27 DEBUG juju.worker.dependency "upgrader" manifold worker started machine-2: 03:55:27 DEBUG juju.worker.dependency "migration-minion" manifold worker stopped: "migration-fortress" not running: dependency not available machine-2: 03:55:27 DEBUG juju.worker.dependency "proxy-config-updater" manifold worker stopped: "migration-fortress" not running: dependency not available machine-2: 03:55:27 DEBUG juju.worker.dependency "disk-manager" manifold worker stopped: "migration-fortress" not running: dependency not available machine-2: 03:55:27 DEBUG juju.worker.dependency "tools-version-checker" manifold worker stopped: "migration-fortress" not running: dependency not available machine-2: 03:55:27 DEBUG juju.worker.dependency "logging-config-updater" manifold worker stopped: "migration-fortress" not running: dependency not available machine-2: 03:55:27 DEBUG juju.worker.dependency "upgrade-steps-runner" manifold worker started machine-2: 03:55:27 DEBUG juju.wrench couldn't read wrench directory: stat /var/lib/juju/wrench: no such file or directory machine-2: 03:55:27 DEBUG juju.worker.dependency "upgrade-steps-runner" manifold worker stopped: machine-2: 03:55:27 DEBUG juju.worker.dependency "ssh-authkeys-updater" manifold worker stopped: "migration-fortress" not running: dependency not available machine-2: 03:55:27 DEBUG juju.worker.dependency "reboot-executor" manifold worker stopped: "migration-fortress" not running: dependency not available machine-2: 03:55:27 DEBUG juju.worker.dependency "host-key-reporter" manifold worker stopped: "migration-fortress" not running: dependency not available machine-2: 03:55:27 DEBUG juju.worker.dependency "machiner" manifold worker stopped: "migration-fortress" not running: dependency not available machine-2: 03:55:27 DEBUG juju.worker.dependency "unconverted-api-workers" manifold worker stopped: "migration-fortress" not running: dependency not available machine-2: 03:55:27 INFO juju.worker.upgrader abort check blocked until version event received machine-2: 03:55:27 INFO juju.worker.upgrader unblocking abort check machine-2: 03:55:27 INFO juju.worker.upgrader desired tool version: 2.2.5 machine-2: 03:55:27 DEBUG juju.worker.dependency "upgrade-check-flag" manifold worker stopped: gate unlocked machine-2: 03:55:27 DEBUG juju.worker.dependency "upgrade-check-flag" manifold worker started machine-2: 03:55:27 DEBUG juju.worker.dependency "migration-fortress" manifold worker stopped: machine-2: 03:55:27 DEBUG juju.worker.dependency "migration-fortress" manifold worker started machine-2: 03:55:27 DEBUG juju.worker.dependency "migration-minion" 
manifold worker started machine-2: 03:55:27 INFO juju.worker.migrationminion migration phase is now: NONE machine-2: 03:55:27 DEBUG juju.worker.dependency "api-address-updater" manifold worker started machine-2: 03:55:27 DEBUG juju.worker.dependency "reboot-executor" manifold worker started machine-2: 03:55:27 DEBUG juju.worker.dependency "host-key-reporter" manifold worker started machine-2: 03:55:27 DEBUG juju.worker.reboot Reboot worker got action: noop machine-2: 03:55:27 DEBUG juju.network no lxc bridge addresses to filter for machine machine-2: 03:55:27 DEBUG juju.network cannot get "lxdbr0" addresses: route ip+net: no such network interface (ignoring) machine-2: 03:55:27 DEBUG juju.network cannot get "virbr0" addresses: route ip+net: no such network interface (ignoring) machine-2: 03:55:27 DEBUG juju.network including address public:172.30.10.113 for machine machine-2: 03:55:27 DEBUG juju.network including address local-cloud:172.16.0.6 for machine machine-2: 03:55:27 DEBUG juju.network including address local-machine:127.0.0.1 for machine machine-2: 03:55:27 DEBUG juju.network including address local-machine:::1 for machine machine-2: 03:55:27 DEBUG juju.network addresses after filtering: [public:172.30.10.113 local-cloud:172.16.0.6 local-machine:127.0.0.1 local-machine:::1] machine-2: 03:55:27 DEBUG juju.worker.apiaddressupdater updating API hostPorts to [[172.30.10.113:17070 172.16.0.6:17070 127.0.0.1:17070 [::1]:17070]] machine-2: 03:55:27 DEBUG juju.agent API server address details [["172.30.10.113:17070" "172.16.0.6:17070" "127.0.0.1:17070" "[::1]:17070"]] written to agent config as ["172.16.0.6:17070" "172.30.10.113:17070"] machine-2: 03:55:28 DEBUG juju.worker.dependency "log-sender" manifold worker started machine-2: 03:55:28 DEBUG juju.worker.dependency "machine-action-runner" manifold worker started machine-2: 03:55:28 DEBUG juju.worker.dependency "proxy-config-updater" manifold worker started machine-2: 03:55:28 DEBUG juju.worker.dependency "disk-manager" manifold worker started machine-2: 03:55:28 DEBUG juju.worker.logger initial log config: "=DEBUG" machine-2: 03:55:28 DEBUG juju.worker.dependency "logging-config-updater" manifold worker started machine-2: 03:55:28 DEBUG juju.worker.logger logger setup machine-2: 03:55:28 DEBUG juju.worker.dependency "storage-provisioner" manifold worker started machine-2: 03:55:28 DEBUG juju.worker.dependency "ssh-authkeys-updater" manifold worker started machine-2: 03:55:28 DEBUG juju.worker.dependency "unit-agent-deployer" manifold worker started machine-2: 03:55:28 DEBUG juju.worker.dependency "tools-version-checker" manifold worker stopped: dependency not available machine-2: 03:55:28 DEBUG juju.worker.dependency "machiner" manifold worker started machine-2: 03:55:28 DEBUG juju.worker.dependency "ssh-identity-writer" manifold worker stopped: dependency not available machine-2: 03:55:28 DEBUG juju.worker.dependency "mgo-txn-resumer" manifold worker stopped: dependency not available machine-2: 03:55:28 DEBUG juju.worker.dependency "serving-info-setter" manifold worker stopped: resource permanently unavailable machine-2: 03:55:28 DEBUG juju.worker.proxyupdater new proxy settings proxy.Settings{Http:"", Https:"", Ftp:"", NoProxy:"127.0.0.1,172.16.0.6,172.30.10.113,::1,localhost", AutoNoProxy:""} machine-2: 03:55:28 DEBUG juju.tools.lxdclient connecting to LXD remote "local": "unix:///var/lib/lxd/unix.socket" machine-2: 03:55:28 ERROR juju.worker.proxyupdater can't connect to the local LXD server: LXD socket not found; is LXD installed 
& running? Please install LXD by running: $ sudo apt-get install lxd and then configure it with: $ newgrp lxd $ lxd init machine-2: 03:55:28 DEBUG juju.worker.proxyupdater new apt proxy settings proxy.Settings{Http:"", Https:"", Ftp:"", NoProxy:"127.0.0.1,::1,localhost", AutoNoProxy:""} machine-2: 03:55:28 DEBUG juju.container.kvm kvm-ok output: INFO: /dev/kvm exists KVM acceleration can be used machine-2: 03:55:28 DEBUG juju.service discovered init system "upstart" from series "trusty" machine-2: 03:55:28 DEBUG juju.worker.logger reconfiguring logging from "=DEBUG" to "=DEBUG;unit=DEBUG" machine-2: 03:55:28 INFO juju.worker.deployer checking unit "oai-epc/0" machine-2: 03:55:28 DEBUG juju.network no lxc bridge addresses to filter for machine machine-2: 03:55:28 DEBUG juju.network cannot get "lxdbr0" addresses: route ip+net: no such network interface (ignoring) machine-2: 03:55:28 DEBUG juju.network cannot get "virbr0" addresses: route ip+net: no such network interface (ignoring) machine-2: 03:55:28 DEBUG juju.network including address local-machine:127.0.0.1 for machine machine-2: 03:55:28 DEBUG juju.network including address local-cloud:172.16.0.12 for machine machine-2: 03:55:28 DEBUG juju.network including address local-machine:::1 for machine machine-2: 03:55:28 DEBUG juju.network addresses after filtering: [local-machine:127.0.0.1 local-cloud:172.16.0.12 local-machine:::1] machine-2: 03:55:28 INFO juju.worker.machiner setting addresses for "machine-2" to [local-machine:127.0.0.1 local-cloud:172.16.0.12 local-machine:::1] machine-2: 03:55:28 DEBUG juju.utils.ssh reading authorised keys file /home/ubuntu/.ssh/authorized_keys machine-2: 03:55:28 DEBUG juju.utils.ssh reading authorised keys file /home/ubuntu/.ssh/authorized_keys machine-2: 03:55:28 DEBUG juju.utils.ssh writing authorised keys file /home/ubuntu/.ssh/authorized_keys machine-2: 03:55:28 DEBUG juju.worker.storageprovisioner filesystems alive: [], dying: [], dead: [] machine-2: 03:55:28 DEBUG juju.worker.storageprovisioner filesystem attachment alive: [], dying: [], dead: [] machine-2: 03:55:28 DEBUG juju.worker.storageprovisioner volumes alive: [], dying: [], dead: [] machine-2: 03:55:28 DEBUG juju.worker.storageprovisioner volume attachments alive: [], dying: [], dead: [] machine-2: 03:55:28 INFO juju.worker.diskmanager block devices changed: [{vda [] 10240 true } {vdb [/dev/disk/by-label/config-2 /dev/disk/by-uuid/D703-43B8] config-2 D703-43B8 64 vfat false }] machine-2: 03:55:28 INFO juju.worker.authenticationworker "machine-2" key updater worker started machine-2: 03:55:28 DEBUG juju.worker.hostkeyreporter 4 SSH host keys reported for machine 2 machine-2: 03:55:28 DEBUG juju.worker.dependency "host-key-reporter" manifold worker stopped: resource permanently unavailable machine-2: 03:55:28 DEBUG juju.worker start "2-container-watcher" machine-2: 03:55:28 DEBUG juju.worker start "stateconverter" machine-2: 03:55:28 INFO juju.worker start "2-container-watcher" machine-2: 03:55:28 DEBUG juju.worker.dependency "unconverted-api-workers" manifold worker started machine-2: 03:55:28 DEBUG juju.worker "2-container-watcher" started machine-2: 03:55:28 INFO juju.worker start "stateconverter" machine-2: 03:55:28 DEBUG juju.worker "stateconverter" started machine-2: 03:55:28 DEBUG juju.cmd.jujud upgrades done, starting worker "2-container-watcher" machine-2: 03:55:28 INFO juju.worker.deployer deploying unit "oai-epc/0" machine-2: 03:55:29 DEBUG juju.service discovered init system "upstart" from local host machine-2: 03:55:29 INFO 
juju.worker.machiner "machine-2" started machine-2: 03:55:29 DEBUG juju.worker.deployer state addresses: ["172.16.0.6:37017"] machine-2: 03:55:29 DEBUG juju.worker.deployer API addresses: ["172.16.0.6:17070" "172.30.10.113:17070"] machine-2: 03:55:29 INFO juju.service Installing and starting service &{Service:{Name:jujud-unit-oai-epc-0 Conf:{Desc:juju unit agent for oai-epc/0 Transient:false AfterStopped: Env:map[JUJU_CONTAINER_TYPE:] Limit:map[] Timeout:300 ExecStart:'/var/lib/juju/tools/unit-oai-epc-0/jujud' unit --data-dir '/var/lib/juju' --unit-name oai-epc/0 --debug ExecStopPost: Logfile:/var/log/juju/unit-oai-epc-0.log ExtraScript: ServiceBinary:/var/lib/juju/tools/unit-oai-epc-0/jujud ServiceArgs:[unit --data-dir /var/lib/juju --unit-name oai-epc/0 --debug]}}} unit-oai-epc-0: 03:55:29 INFO juju.cmd running jujud [2.2.5 gc go1.8] unit-oai-epc-0: 03:55:29 DEBUG juju.cmd args: []string{"/var/lib/juju/tools/unit-oai-epc-0/jujud", "unit", "--data-dir", "/var/lib/juju", "--unit-name", "oai-epc/0", "--debug"} unit-oai-epc-0: 03:55:29 DEBUG juju.agent read agent config, format "2.0" unit-oai-epc-0: 03:55:29 DEBUG juju.worker start "api" unit-oai-epc-0: 03:55:29 INFO juju.worker start "api" unit-oai-epc-0: 03:55:29 DEBUG juju.worker.dependency "proxy-config-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-oai-epc-0: 03:55:29 DEBUG juju.worker.dependency "charm-dir" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-oai-epc-0: 03:55:29 DEBUG juju.worker.dependency "metric-sender" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-oai-epc-0: 03:55:29 DEBUG juju.worker.dependency "agent" manifold worker started unit-oai-epc-0: 03:55:29 DEBUG juju.worker.dependency "upgrader" manifold worker stopped: "api-caller" not running: dependency not available unit-oai-epc-0: 03:55:29 DEBUG juju.worker.introspection introspection worker listening on "@jujud-unit-oai-epc-0" unit-oai-epc-0: 03:55:29 DEBUG juju.worker "api" started unit-oai-epc-0: 03:55:29 DEBUG juju.worker.introspection stats worker now serving unit-oai-epc-0: 03:55:29 DEBUG juju.worker.dependency "hook-retry-strategy" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-oai-epc-0: 03:55:29 DEBUG juju.worker.dependency "migration-fortress" manifold worker started unit-oai-epc-0: 03:55:29 DEBUG juju.worker.dependency "logging-config-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-oai-epc-0: 03:55:29 DEBUG juju.worker.apicaller connecting with old password unit-oai-epc-0: 03:55:29 DEBUG juju.worker.dependency "metric-collect" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-oai-epc-0: 03:55:29 DEBUG juju.worker.dependency "migration-inactive-flag" manifold worker stopped: "api-caller" not running: dependency not available unit-oai-epc-0: 03:55:29 DEBUG juju.worker.dependency "leadership-tracker" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-oai-epc-0: 03:55:29 DEBUG juju.worker.dependency "uniter" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-oai-epc-0: 03:55:29 DEBUG juju.worker.dependency "api-config-watcher" manifold worker started unit-oai-epc-0: 03:55:29 DEBUG juju.worker.dependency "migration-minion" manifold worker stopped: "api-caller" not running: dependency not 
available unit-oai-epc-0: 03:55:29 DEBUG juju.worker.dependency "metric-spool" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-oai-epc-0: 03:55:29 DEBUG juju.worker.dependency "meter-status" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-oai-epc-0: 03:55:29 DEBUG juju.worker.dependency "log-sender" manifold worker stopped: "api-caller" not running: dependency not available unit-oai-epc-0: 03:55:29 DEBUG juju.worker.dependency "api-address-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-oai-epc-0: 03:55:29 DEBUG juju.worker.dependency "proxy-config-updater" manifold worker stopped: unit-oai-epc-0: 03:55:29 DEBUG juju.worker.dependency "metric-sender" manifold worker stopped: unit-oai-epc-0: 03:55:29 DEBUG juju.api successfully dialed "wss://172.16.0.6:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" unit-oai-epc-0: 03:55:29 INFO juju.api connection established to "wss://172.16.0.6:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" unit-oai-epc-0: 03:55:29 DEBUG juju.api successfully dialed "wss://172.30.10.113:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" unit-oai-epc-0: 03:55:29 DEBUG juju.worker.dependency "metric-collect" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-oai-epc-0: 03:55:29 DEBUG juju.worker.dependency "logging-config-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-oai-epc-0: 03:55:29 DEBUG juju.worker.dependency "proxy-config-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-oai-epc-0: 03:55:29 DEBUG juju.worker.dependency "migration-minion" manifold worker stopped: "api-caller" not running: dependency not available unit-oai-epc-0: 03:55:29 DEBUG juju.worker.dependency "metric-sender" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-oai-epc-0: 03:55:29 DEBUG juju.worker.dependency "uniter" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-oai-epc-0: 03:55:29 DEBUG juju.worker.dependency "leadership-tracker" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-oai-epc-0: 03:55:29 DEBUG juju.worker.dependency "api-address-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-oai-epc-0: 03:55:29 DEBUG juju.worker.dependency "charm-dir" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-oai-epc-0: 03:55:29 DEBUG juju.worker.dependency "hook-retry-strategy" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-oai-epc-0: 03:55:29 DEBUG juju.worker.dependency "meter-status" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-oai-epc-0: 03:55:29 DEBUG juju.worker.dependency "metric-spool" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-oai-epc-0: 03:55:30 DEBUG juju.worker.apicaller connected unit-oai-epc-0: 03:55:30 DEBUG juju.worker.apicaller changing password... 
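The "can't connect to the local LXD server" error logged by juju.worker.proxyupdater on machine-2 above appears non-fatal in this run (the agent keeps starting its remaining workers), and the fix it asks for is the one spelled out in the error text itself. A minimal sketch of that remediation, only worth running on the affected machine if LXD containers are actually wanted there:

    # Install and initialise LXD, as suggested by the Juju error message above.
    sudo apt-get install lxd
    # Pick up the new 'lxd' group in the current shell, then run the interactive setup.
    newgrp lxd
    lxd init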
unit-oai-epc-0: 03:55:30 DEBUG juju.worker.apicaller password changed unit-oai-epc-0: 03:55:30 DEBUG juju.api RPC connection died unit-oai-epc-0: 03:55:30 DEBUG juju.worker.dependency "api-caller" manifold worker stopped: restart immediately unit-oai-epc-0: 03:55:30 DEBUG juju.worker.apicaller connecting with current password unit-oai-epc-0: 03:55:30 DEBUG juju.api successfully dialed "wss://172.16.0.6:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" unit-oai-epc-0: 03:55:30 INFO juju.api connection established to "wss://172.16.0.6:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" unit-oai-epc-0: 03:55:30 DEBUG juju.api successfully dialed "wss://172.30.10.113:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" machine-2: 03:55:30 DEBUG juju.worker.machiner observed network config updated for "machine-2" to [{1 127.0.0.0/8 65536 0 lo loopback false false loopback 127.0.0.1 [] [] []} {1 ::1/128 65536 0 lo loopback false false loopback ::1 [] [] []} {2 fa:16:3e:02:71:55 172.16.0.0/24 1450 0 eth0 ethernet false false static 172.16.0.12 [] [] []} {2 fa:16:3e:02:71:55 1450 0 eth0 ethernet false false manual [] [] []}] unit-oai-epc-0: 03:55:31 DEBUG juju.worker.apicaller connected unit-oai-epc-0: 03:55:31 DEBUG juju.worker.dependency "api-caller" manifold worker started unit-oai-epc-0: 03:55:31 DEBUG juju.worker.dependency "api-address-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-oai-epc-0: 03:55:31 DEBUG juju.worker.dependency "leadership-tracker" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-oai-epc-0: 03:55:31 DEBUG juju.worker.dependency "migration-inactive-flag" manifold worker started unit-oai-epc-0: 03:55:31 DEBUG juju.worker.dependency "hook-retry-strategy" manifold worker stopped: unit-oai-epc-0: 03:55:31 DEBUG juju.worker.dependency "proxy-config-updater" manifold worker stopped: unit-oai-epc-0: 03:55:31 DEBUG juju.worker.dependency "metric-sender" manifold worker stopped: unit-oai-epc-0: 03:55:31 DEBUG juju.worker.dependency "logging-config-updater" manifold worker stopped: unit-oai-epc-0: 03:55:31 DEBUG juju.worker.dependency "uniter" manifold worker stopped: unit-oai-epc-0: 03:55:31 DEBUG juju.worker.dependency "meter-status" manifold worker stopped: unit-oai-epc-0: 03:55:31 DEBUG juju.worker.dependency "log-sender" manifold worker started unit-oai-epc-0: 03:55:31 DEBUG juju.worker.dependency "upgrader" manifold worker started unit-oai-epc-0: 03:55:31 DEBUG juju.worker.dependency "migration-minion" manifold worker started unit-oai-epc-0: 03:55:31 INFO juju.worker.migrationminion migration phase is now: NONE unit-oai-epc-0: 03:55:31 DEBUG juju.worker.dependency "leadership-tracker" manifold worker started unit-oai-epc-0: 03:55:31 DEBUG juju.worker.leadership oai-epc/0 making initial claim for oai-epc leadership unit-oai-epc-0: 03:55:31 DEBUG juju.worker.dependency "charm-dir" manifold worker started unit-oai-epc-0: 03:55:31 DEBUG juju.worker.dependency "proxy-config-updater" manifold worker started unit-oai-epc-0: 03:55:31 DEBUG juju.worker.dependency "uniter" manifold worker stopped: "leadership-tracker" not running: dependency not available unit-oai-epc-0: 03:55:31 DEBUG juju.worker.dependency "metric-collect" manifold worker stopped: unit-oai-epc-0: 03:55:31 DEBUG juju.worker.logger initial log config: "=DEBUG" unit-oai-epc-0: 03:55:31 DEBUG juju.worker.dependency "logging-config-updater" manifold worker started unit-oai-epc-0: 03:55:31 DEBUG juju.worker.logger 
logger setup unit-oai-epc-0: 03:55:31 DEBUG juju.worker.dependency "api-address-updater" manifold worker started unit-oai-epc-0: 03:55:31 DEBUG juju.worker.dependency "metric-sender" manifold worker stopped: "metric-spool" not running: dependency not available unit-oai-epc-0: 03:55:31 DEBUG juju.worker.proxyupdater new proxy settings proxy.Settings{Http:"", Https:"", Ftp:"", NoProxy:"127.0.0.1,172.16.0.6,172.30.10.113,::1,localhost", AutoNoProxy:""} unit-oai-epc-0: 03:55:31 DEBUG juju.worker.proxyupdater new apt proxy settings proxy.Settings{Http:"", Https:"", Ftp:"", NoProxy:"127.0.0.1,::1,localhost", AutoNoProxy:""} unit-oai-epc-0: 03:55:31 DEBUG juju.worker.logger reconfiguring logging from "=DEBUG" to "=DEBUG;unit=DEBUG" unit-oai-epc-0: 03:55:31 DEBUG juju.network no lxc bridge addresses to filter for machine unit-oai-epc-0: 03:55:31 DEBUG juju.network cannot get "lxdbr0" addresses: route ip+net: no such network interface (ignoring) unit-oai-epc-0: 03:55:31 DEBUG juju.network cannot get "virbr0" addresses: route ip+net: no such network interface (ignoring) unit-oai-epc-0: 03:55:31 DEBUG juju.network including address public:172.30.10.113 for machine unit-oai-epc-0: 03:55:31 DEBUG juju.network including address local-cloud:172.16.0.6 for machine unit-oai-epc-0: 03:55:31 DEBUG juju.network including address local-machine:127.0.0.1 for machine unit-oai-epc-0: 03:55:31 DEBUG juju.network including address local-machine:::1 for machine unit-oai-epc-0: 03:55:31 DEBUG juju.network addresses after filtering: [public:172.30.10.113 local-cloud:172.16.0.6 local-machine:127.0.0.1 local-machine:::1] unit-oai-epc-0: 03:55:31 DEBUG juju.worker.apiaddressupdater updating API hostPorts to [[172.30.10.113:17070 172.16.0.6:17070 127.0.0.1:17070 [::1]:17070]] unit-oai-epc-0: 03:55:31 DEBUG juju.worker.dependency "metric-collect" manifold worker stopped: "metric-spool" not running: dependency not available unit-oai-epc-0: 03:55:31 DEBUG juju.worker.dependency "uniter" manifold worker stopped: "hook-retry-strategy" not running: dependency not available unit-oai-epc-0: 03:55:31 DEBUG juju.agent API server address details [["172.30.10.113:17070" "172.16.0.6:17070" "127.0.0.1:17070" "[::1]:17070"]] written to agent config as ["172.16.0.6:17070" "172.30.10.113:17070"] unit-oai-epc-0: 03:55:32 DEBUG juju.worker.dependency "meter-status" manifold worker started unit-oai-epc-0: 03:55:32 DEBUG juju.worker.dependency "metric-spool" manifold worker started unit-oai-epc-0: 03:55:32 DEBUG juju.worker.dependency "hook-retry-strategy" manifold worker started unit-oai-epc-0: 03:55:32 DEBUG juju.worker.meterstatus got meter status change signal from watcher unit-oai-epc-0: 03:55:32 DEBUG juju.worker.meterstatus acquire lock "machine-lock" for meter status hook execution unit-oai-epc-0: 03:55:32 DEBUG juju.worker.meterstatus lock "machine-lock" acquired unit-oai-epc-0: 03:55:32 DEBUG juju.worker.meterstatus release lock "machine-lock" for meter status hook execution unit-oai-epc-0: 03:55:32 INFO juju.worker.meterstatus skipped "meter-status-changed" hook (missing) unit-oai-epc-0: 03:55:32 DEBUG juju.worker.dependency "uniter" manifold worker started unit-oai-epc-0: 03:55:32 INFO juju.worker.upgrader abort check blocked until version event received unit-oai-epc-0: 03:55:32 INFO juju.worker.upgrader unblocking abort check unit-oai-epc-0: 03:55:32 DEBUG juju.worker.dependency "metric-sender" manifold worker started unit-oai-epc-0: 03:55:32 INFO juju.worker.upgrader desired tool version: 2.2.5 unit-oai-epc-0: 03:55:32 INFO 
juju.worker.leadership oai-epc/0 promoted to leadership of oai-epc unit-oai-epc-0: 03:55:32 INFO worker.uniter.jujuc ensure jujuc symlinks in /var/lib/juju/tools/unit-oai-epc-0 unit-oai-epc-0: 03:55:32 INFO worker.uniter.jujuc was a symlink, now looking at /var/lib/juju/tools/2.2.5-trusty-amd64 unit-oai-epc-0: 03:55:32 DEBUG worker.uniter.jujuc jujud path /var/lib/juju/tools/2.2.5-trusty-amd64/jujud unit-oai-epc-0: 03:55:32 DEBUG juju.worker.uniter starting juju-run listener on unix:/var/lib/juju/agents/unit-oai-epc-0/run.socket unit-oai-epc-0: 03:55:32 INFO juju.worker.uniter unit "oai-epc/0" started unit-oai-epc-0: 03:55:32 DEBUG juju.worker.uniter juju-run listener running unit-oai-epc-0: 03:55:32 INFO juju.worker.uniter resuming charm install unit-oai-epc-0: 03:55:32 DEBUG juju.worker.uniter.operation running operation install local:trusty/oai-epc-3 unit-oai-epc-0: 03:55:32 DEBUG juju.worker.uniter.operation preparing operation "install local:trusty/oai-epc-3" unit-oai-epc-0: 03:55:32 INFO juju.worker.uniter.charm downloading local:trusty/oai-epc-3 from API server unit-oai-epc-0: 03:55:32 INFO juju.downloader downloading from local:trusty/oai-epc-3 unit-oai-epc-0: 03:55:32 DEBUG httpbakery client do GET https://172.16.0.6:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/charms?file=%2A&url=local%3Atrusty%2Foai-epc-3 { unit-oai-epc-0: 03:55:32 DEBUG httpbakery } -> error unit-oai-epc-0: 03:55:32 INFO juju.downloader download complete ("local:trusty/oai-epc-3") unit-oai-epc-0: 03:55:32 INFO juju.downloader download verified ("local:trusty/oai-epc-3") unit-oai-epc-0: 03:55:32 DEBUG juju.worker.uniter.operation executing operation "install local:trusty/oai-epc-3" unit-oai-epc-0: 03:55:32 DEBUG juju.worker.uniter.charm preparing to deploy charm "local:trusty/oai-epc-3" unit-oai-epc-0: 03:55:32 DEBUG juju.worker.uniter.charm deploying charm "local:trusty/oai-epc-3" unit-oai-epc-0: 03:55:35 DEBUG juju.worker.uniter.charm finishing deploy of charm "local:trusty/oai-epc-3" unit-oai-epc-0: 03:55:35 DEBUG juju.worker.uniter.operation committing operation "install local:trusty/oai-epc-3" unit-oai-epc-0: 03:55:35 INFO juju.worker.uniter hooks are retried true unit-oai-epc-0: 03:55:35 DEBUG juju.worker.uniter.remotestate got action change: [] ok=true unit-oai-epc-0: 03:55:35 DEBUG juju.worker.uniter.remotestate got service change unit-oai-epc-0: 03:55:35 DEBUG juju.worker.uniter.remotestate got leader settings change: ok=true unit-oai-epc-0: 03:55:35 DEBUG juju.worker.uniter.remotestate got unit change unit-oai-epc-0: 03:55:35 DEBUG juju.worker.uniter.remotestate got storage change: [] ok=true unit-oai-epc-0: 03:55:35 DEBUG juju.worker.uniter.remotestate got address change: ok=true unit-oai-epc-0: 03:55:35 DEBUG juju.worker.uniter.remotestate got config change: ok=true unit-oai-epc-0: 03:55:35 DEBUG juju.worker.uniter.remotestate got relations change: ok=true unit-oai-epc-0: 03:55:35 INFO juju.worker.uniter.storage initial storage attachments ready unit-oai-epc-0: 03:55:35 INFO juju.worker.uniter found queued "install" hook unit-oai-epc-0: 03:55:35 DEBUG juju.worker.uniter.operation running operation run install hook unit-oai-epc-0: 03:55:35 DEBUG juju.worker.uniter acquire lock "machine-lock" for uniter hook execution unit-oai-epc-0: 03:55:35 DEBUG juju.worker.uniter lock "machine-lock" acquired unit-oai-epc-0: 03:55:35 DEBUG juju.worker.uniter.operation preparing operation "run install hook" unit-oai-epc-0: 03:55:35 DEBUG juju.worker.uniter.operation executing operation "run install hook" 
unit-oai-epc-0: 03:55:36 DEBUG juju.worker.uniter [AGENT-STATUS] executing: running install hook unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install + export DEBIAN_FRONTEND=noninteractive unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install + DEBIAN_FRONTEND=noninteractive unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install + source /var/lib/juju/agents/unit-oai-epc-0/charm/utils/common unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install ++ validated_mme_public_ip=0 unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install + set_env_paths unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install + openair_path=/srv/openair-cn unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install + build_path=/srv/openair-cn/BUILD unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install + hss_path=/srv/openair-cn/BUILD/HSS unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install + epc_path=/srv/openair-cn/BUILD/EPC unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install + build_run_scripts=/srv/openair-cn/SCRIPTS unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install + tools_path=/srv/openair-cn/BUILD/TOOLS unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install + epc_conf_path=/usr/local/etc/oai unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install + epc_exec_name=mme_gw unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install + open-port 2152/udp unit-oai-epc-0: 03:55:36 DEBUG worker.uniter.jujuc running hook tool "open-port" unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install ++ config-get branch unit-oai-epc-0: 03:55:36 DEBUG worker.uniter.jujuc running hook tool "config-get" unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install + branch=v0.3.2-branch unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install ++ config-get revision unit-oai-epc-0: 03:55:36 DEBUG worker.uniter.jujuc running hook tool "config-get" unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install + revision=HEAD unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install + echo v0.3.2-branch unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install + echo HEAD unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install + update_hostname unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install ++ sed 's|/|-|' unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install ++ echo oai-epc/0 unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install + HOSTNAME=oai-epc-0 unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install + echo oai-epc-0 unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install + hostname oai-epc-0 unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install ++ hostname unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install + echo '127.0.0.1 oai-epc-0' unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install + install_packages unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install ++ grep -c install unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install ++ dpkg --get-selections git unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install + gitAlreadyInstalled=0 unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install + true unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install + '[' '!' 0 -eq 1 ']' unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install + apt-get install -y --force-yes git unit-oai-epc-0: 03:55:36 DEBUG unit.oai-epc/0.install Reading package lists... unit-oai-epc-0: 03:55:37 DEBUG unit.oai-epc/0.install Building dependency tree... unit-oai-epc-0: 03:55:37 DEBUG unit.oai-epc/0.install Reading state information... 
unit-oai-epc-0: 03:55:37 DEBUG unit.oai-epc/0.install The following extra packages will be installed: unit-oai-epc-0: 03:55:37 DEBUG unit.oai-epc/0.install git-man liberror-perl unit-oai-epc-0: 03:55:37 DEBUG unit.oai-epc/0.install Suggested packages: unit-oai-epc-0: 03:55:37 DEBUG unit.oai-epc/0.install git-daemon-run git-daemon-sysvinit git-doc git-el git-email git-gui gitk unit-oai-epc-0: 03:55:37 DEBUG unit.oai-epc/0.install gitweb git-arch git-bzr git-cvs git-mediawiki git-svn unit-oai-epc-0: 03:55:37 DEBUG unit.oai-epc/0.install The following NEW packages will be installed: unit-oai-epc-0: 03:55:37 DEBUG unit.oai-epc/0.install git git-man liberror-perl unit-oai-epc-0: 03:55:37 DEBUG unit.oai-epc/0.install 0 upgraded, 3 newly installed, 0 to remove and 0 not upgraded. unit-oai-epc-0: 03:55:37 DEBUG unit.oai-epc/0.install Need to get 2973 kB of archives. unit-oai-epc-0: 03:55:37 DEBUG unit.oai-epc/0.install After this operation, 21.9 MB of additional disk space will be used. unit-oai-epc-0: 03:55:37 DEBUG unit.oai-epc/0.install Get:1 http://nova.clouds.archive.ubuntu.com/ubuntu/ trusty/main liberror-perl all 0.17-1.1 [21.1 kB] unit-oai-epc-0: 03:55:37 DEBUG unit.oai-epc/0.install Get:2 http://nova.clouds.archive.ubuntu.com/ubuntu/ trusty-updates/main git-man all 1:1.9.1-1ubuntu0.7 [699 kB] unit-oai-epc-0: 03:55:38 DEBUG unit.oai-epc/0.install Get:3 http://nova.clouds.archive.ubuntu.com/ubuntu/ trusty-updates/main git amd64 1:1.9.1-1ubuntu0.7 [2252 kB] unit-oai-epc-0: 03:55:38 DEBUG unit.oai-epc/0.install Fetched 2973 kB in 0s (3310 kB/s) unit-oai-epc-0: 03:55:38 DEBUG unit.oai-epc/0.install Selecting previously unselected package liberror-perl. unit-oai-epc-0: 03:55:38 DEBUG unit.oai-epc/0.install (Reading database ... 52372 files and directories currently installed.) unit-oai-epc-0: 03:55:38 DEBUG unit.oai-epc/0.install Preparing to unpack .../liberror-perl_0.17-1.1_all.deb ... unit-oai-epc-0: 03:55:39 DEBUG unit.oai-epc/0.install Unpacking liberror-perl (0.17-1.1) ... unit-oai-epc-0: 03:55:39 DEBUG unit.oai-epc/0.install Selecting previously unselected package git-man. unit-oai-epc-0: 03:55:39 DEBUG unit.oai-epc/0.install Preparing to unpack .../git-man_1%3a1.9.1-1ubuntu0.7_all.deb ... unit-oai-epc-0: 03:55:40 DEBUG unit.oai-epc/0.install Unpacking git-man (1:1.9.1-1ubuntu0.7) ... unit-oai-epc-0: 03:55:41 DEBUG unit.oai-epc/0.install Selecting previously unselected package git. unit-oai-epc-0: 03:55:41 DEBUG unit.oai-epc/0.install Preparing to unpack .../git_1%3a1.9.1-1ubuntu0.7_amd64.deb ... unit-oai-epc-0: 03:55:41 DEBUG unit.oai-epc/0.install Unpacking git (1:1.9.1-1ubuntu0.7) ... unit-oai-epc-0: 03:55:45 DEBUG unit.oai-epc/0.install Processing triggers for man-db (2.6.7.1-1ubuntu1) ... machine-0: 03:55:45 INFO juju.cmd running jujud [2.2.5 gc go1.8] machine-0: 03:55:45 DEBUG juju.cmd args: []string{"/var/lib/juju/tools/machine-0/jujud", "machine", "--data-dir", "/var/lib/juju", "--machine-id", "0", "--debug"} machine-0: 03:55:45 DEBUG juju.agent read agent config, format "2.0" machine-0: 03:55:46 DEBUG juju.wrench couldn't read wrench directory: stat /var/lib/juju/wrench: no such file or directory machine-0: 03:55:46 INFO juju.worker.upgradesteps upgrade steps for 2.2.5 have already been run. 
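The xtrace output from the oai-epc/0 install hook above shows two recurring patterns: the unit name is normalised into a hostname (oai-epc/0 becomes oai-epc-0), and each package is installed only if dpkg does not already report it. A self-contained bash sketch of both patterns; the install_if_missing helper and the /etc/hosts append are illustrative, not taken from the charm itself:

    #!/bin/bash
    # Sketch of the patterns seen in the install hook trace above (assumes root, as in a hook).
    set -eu

    unit_name="oai-epc/0"                             # inside a real hook this is $JUJU_UNIT_NAME
    host_name="$(echo "$unit_name" | sed 's|/|-|')"   # oai-epc/0 -> oai-epc-0
    hostname "$host_name"
    echo "127.0.0.1 ${host_name}" >> /etc/hosts       # the trace only shows the echo; the append target is assumed

    install_if_missing() {
        local pkg="$1"
        # dpkg --get-selections lists installed packages with the "install" selection;
        # -w avoids also matching "deinstall" for packages that were removed.
        if ! dpkg --get-selections "$pkg" 2>/dev/null | grep -qw install; then
            apt-get install -y "$pkg"                 # the trusty-era hook used --force-yes here
        fi
    }

    install_if_missing git
    install_if_missing virt-what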
machine-0: 03:55:46 DEBUG juju.worker start "engine" machine-0: 03:55:46 INFO juju.worker start "engine" machine-0: 03:55:46 DEBUG juju.worker.dependency "migration-minion" manifold worker stopped: "agent" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "ssh-authkeys-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "api-config-watcher" manifold worker stopped: "agent" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "logging-config-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "migration-inactive-flag" manifold worker stopped: "api-caller" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "tools-version-checker" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "api-caller" manifold worker stopped: "agent" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "state-config-watcher" manifold worker stopped: "agent" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "unconverted-state-workers" manifold worker stopped: "state" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "upgrade-check-gate" manifold worker started machine-0: 03:55:46 DEBUG juju.worker.dependency "proxy-config-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "log-sender" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "machine-action-runner" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "agent" manifold worker started machine-0: 03:55:46 DEBUG juju.worker.dependency "central-hub" manifold worker stopped: "state-config-watcher" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.introspection introspection worker listening on "@jujud-machine-0" machine-0: 03:55:46 DEBUG juju.worker "engine" started machine-0: 03:55:46 DEBUG juju.worker.introspection stats worker now serving machine-0: 03:55:46 DEBUG juju.worker.dependency "machiner" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "upgrade-check-flag" manifold worker started machine-0: 03:55:46 DEBUG juju.worker.dependency "reboot-executor" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "disk-manager" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "host-key-reporter" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "termination-signal-handler" manifold worker started machine-0: 03:55:46 DEBUG juju.worker.dependency "ssh-identity-writer" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "mgo-txn-resumer" manifold worker stopped: 
"migration-inactive-flag" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "upgrade-steps-runner" manifold worker stopped: "api-caller" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "upgrade-steps-gate" manifold worker started machine-0: 03:55:46 DEBUG juju.worker.dependency "upgrade-steps-flag" manifold worker stopped: "upgrade-steps-gate" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "upgrader" manifold worker stopped: "api-caller" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "migration-fortress" manifold worker stopped: "upgrade-check-flag" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "serving-info-setter" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "unconverted-api-workers" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "api-address-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "state" manifold worker stopped: "state-config-watcher" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "unit-agent-deployer" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "storage-provisioner" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "tools-version-checker" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "ssh-authkeys-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "machine-action-runner" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "migration-minion" manifold worker stopped: "api-caller" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "state-config-watcher" manifold worker started machine-0: 03:55:46 DEBUG juju.worker.dependency "migration-fortress" manifold worker stopped: "upgrade-check-flag" not set: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "upgrade-steps-flag" manifold worker started machine-0: 03:55:46 DEBUG juju.worker.dependency "api-config-watcher" manifold worker started machine-0: 03:55:46 DEBUG juju.worker.dependency "api-caller" manifold worker stopped: machine-0: 03:55:46 DEBUG juju.worker.dependency "central-hub" manifold worker stopped: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "proxy-config-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "upgrader" manifold worker stopped: "api-caller" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "upgrade-steps-runner" manifold worker stopped: "api-caller" not running: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "logging-config-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency 
not available machine-0: 03:55:46 DEBUG juju.worker.dependency "migration-fortress" manifold worker stopped: "upgrade-check-flag" not set: dependency not available machine-0: 03:55:46 DEBUG juju.worker.dependency "state" manifold worker stopped: dependency not available machine-0: 03:55:46 DEBUG juju.worker.apicaller connecting with old password machine-0: 03:55:46 DEBUG juju.api successfully dialed "wss://172.16.0.6:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" machine-0: 03:55:46 INFO juju.api connection established to "wss://172.16.0.6:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" machine-0: 03:55:46 DEBUG juju.api successfully dialed "wss://172.30.10.113:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" machine-0: 03:55:46 DEBUG juju.worker.apicaller connected machine-0: 03:55:46 DEBUG juju.worker.apicaller changing password... machine-0: 03:55:46 DEBUG juju.worker.apicaller password changed machine-0: 03:55:46 DEBUG juju.api RPC connection died machine-0: 03:55:46 DEBUG juju.worker.dependency "api-caller" manifold worker stopped: restart immediately machine-0: 03:55:46 DEBUG juju.worker.apicaller connecting with current password machine-0: 03:55:46 DEBUG juju.api successfully dialed "wss://172.16.0.6:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" machine-0: 03:55:46 INFO juju.api connection established to "wss://172.16.0.6:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" machine-0: 03:55:46 DEBUG juju.api successfully dialed "wss://172.30.10.113:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" unit-oai-epc-0: 03:55:46 DEBUG unit.oai-epc/0.install Setting up liberror-perl (0.17-1.1) ... unit-oai-epc-0: 03:55:47 DEBUG unit.oai-epc/0.install Setting up git-man (1:1.9.1-1ubuntu0.7) ... machine-0: 03:55:47 DEBUG juju.worker.apicaller connected machine-0: 03:55:47 DEBUG juju.worker.dependency "api-caller" manifold worker started machine-0: 03:55:47 DEBUG juju.worker.dependency "proxy-config-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-0: 03:55:47 DEBUG juju.worker.dependency "log-sender" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-0: 03:55:47 DEBUG juju.worker.dependency "reboot-executor" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-0: 03:55:47 DEBUG juju.worker.dependency "logging-config-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-0: 03:55:47 DEBUG juju.worker.dependency "unit-agent-deployer" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-0: 03:55:47 DEBUG juju.worker.dependency "serving-info-setter" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-0: 03:55:47 DEBUG juju.worker.dependency "storage-provisioner" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-0: 03:55:47 DEBUG juju.worker.dependency "migration-inactive-flag" manifold worker started machine-0: 03:55:47 DEBUG juju.worker.dependency "machiner" manifold worker stopped: machine-0: 03:55:47 DEBUG juju.worker.dependency "ssh-authkeys-updater" manifold worker stopped: machine-0: 03:55:47 DEBUG juju.worker.dependency "tools-version-checker" manifold worker stopped: machine-0: 03:55:47 DEBUG juju.worker.dependency "machine-action-runner" manifold worker stopped: machine-0: 03:55:47 DEBUG juju.worker.dependency "disk-manager" 
manifold worker stopped: machine-0: 03:55:47 DEBUG juju.worker.dependency "host-key-reporter" manifold worker stopped: machine-0: 03:55:47 DEBUG juju.worker.dependency "ssh-identity-writer" manifold worker stopped: machine-0: 03:55:47 DEBUG juju.worker.dependency "mgo-txn-resumer" manifold worker stopped: machine-0: 03:55:47 DEBUG juju.worker.dependency "unconverted-api-workers" manifold worker stopped: machine-0: 03:55:47 DEBUG juju.worker.dependency "api-address-updater" manifold worker stopped: machine-0: 03:55:47 DEBUG juju.worker.dependency "migration-minion" manifold worker stopped: "migration-fortress" not running: dependency not available machine-0: 03:55:47 DEBUG juju.worker.dependency "ssh-identity-writer" manifold worker stopped: "migration-fortress" not running: dependency not available machine-0: 03:55:47 DEBUG juju.worker.dependency "reboot-executor" manifold worker stopped: "migration-fortress" not running: dependency not available machine-0: 03:55:47 DEBUG juju.worker.dependency "tools-version-checker" manifold worker stopped: "migration-fortress" not running: dependency not available machine-0: 03:55:47 DEBUG juju.worker.dependency "host-key-reporter" manifold worker stopped: "migration-fortress" not running: dependency not available machine-0: 03:55:47 DEBUG juju.worker.dependency "upgrader" manifold worker started machine-0: 03:55:47 DEBUG juju.worker.dependency "disk-manager" manifold worker stopped: "migration-fortress" not running: dependency not available machine-0: 03:55:47 DEBUG juju.worker.dependency "upgrade-steps-runner" manifold worker started machine-0: 03:55:47 DEBUG juju.wrench couldn't read wrench directory: stat /var/lib/juju/wrench: no such file or directory machine-0: 03:55:47 DEBUG juju.worker.dependency "upgrade-steps-runner" manifold worker stopped: machine-0: 03:55:47 DEBUG juju.worker.dependency "ssh-authkeys-updater" manifold worker stopped: "migration-fortress" not running: dependency not available machine-0: 03:55:47 DEBUG juju.worker.dependency "mgo-txn-resumer" manifold worker stopped: "migration-fortress" not running: dependency not available machine-0: 03:55:47 DEBUG juju.worker.dependency "storage-provisioner" manifold worker stopped: "migration-fortress" not running: dependency not available machine-0: 03:55:47 DEBUG juju.worker.dependency "log-sender" manifold worker stopped: "migration-fortress" not running: dependency not available machine-0: 03:55:47 DEBUG juju.worker.dependency "unit-agent-deployer" manifold worker stopped: "migration-fortress" not running: dependency not available machine-0: 03:55:47 DEBUG juju.worker.dependency "proxy-config-updater" manifold worker stopped: "migration-fortress" not running: dependency not available machine-0: 03:55:47 DEBUG juju.worker.dependency "api-address-updater" manifold worker stopped: "migration-fortress" not running: dependency not available machine-0: 03:55:47 DEBUG juju.worker.dependency "machine-action-runner" manifold worker stopped: "migration-fortress" not running: dependency not available machine-0: 03:55:47 DEBUG juju.worker.dependency "machiner" manifold worker stopped: "migration-fortress" not running: dependency not available machine-0: 03:55:47 DEBUG juju.worker.dependency "serving-info-setter" manifold worker stopped: "migration-fortress" not running: dependency not available machine-0: 03:55:47 DEBUG juju.worker.dependency "unconverted-api-workers" manifold worker stopped: "migration-fortress" not running: dependency not available machine-0: 03:55:47 DEBUG 
juju.worker.dependency "logging-config-updater" manifold worker stopped: "migration-fortress" not running: dependency not available machine-0: 03:55:47 INFO juju.worker.upgrader abort check blocked until version event received machine-0: 03:55:47 INFO juju.worker.upgrader unblocking abort check unit-oai-epc-0: 03:55:47 DEBUG unit.oai-epc/0.install Setting up git (1:1.9.1-1ubuntu0.7) ... machine-0: 03:55:47 INFO juju.worker.upgrader desired tool version: 2.2.5 machine-0: 03:55:47 DEBUG juju.worker.dependency "upgrade-check-flag" manifold worker stopped: gate unlocked machine-0: 03:55:47 DEBUG juju.worker.dependency "migration-fortress" manifold worker stopped: "upgrade-check-flag" not running: dependency not available machine-0: 03:55:47 DEBUG juju.worker.dependency "upgrade-check-flag" manifold worker started machine-0: 03:55:47 DEBUG juju.worker.dependency "migration-fortress" manifold worker started machine-0: 03:55:47 DEBUG juju.worker.dependency "migration-minion" manifold worker started machine-0: 03:55:47 INFO juju.worker.migrationminion migration phase is now: NONE machine-0: 03:55:47 DEBUG juju.worker.dependency "log-sender" manifold worker started machine-0: 03:55:47 DEBUG juju.worker.dependency "api-address-updater" manifold worker started machine-0: 03:55:47 DEBUG juju.worker.dependency "machine-action-runner" manifold worker started machine-0: 03:55:47 DEBUG juju.worker.dependency "ssh-authkeys-updater" manifold worker started machine-0: 03:55:47 DEBUG juju.worker.dependency "proxy-config-updater" manifold worker started machine-0: 03:55:47 DEBUG juju.worker.logger initial log config: "=DEBUG" machine-0: 03:55:47 DEBUG juju.worker.dependency "logging-config-updater" manifold worker started machine-0: 03:55:47 DEBUG juju.worker.dependency "reboot-executor" manifold worker started machine-0: 03:55:47 DEBUG juju.worker.dependency "storage-provisioner" manifold worker started machine-0: 03:55:47 DEBUG juju.worker.logger logger setup machine-0: 03:55:47 DEBUG juju.worker.dependency "machiner" manifold worker started machine-0: 03:55:47 DEBUG juju.worker.dependency "host-key-reporter" manifold worker started machine-0: 03:55:47 DEBUG juju.worker.dependency "disk-manager" manifold worker started machine-0: 03:55:47 INFO juju.worker.diskmanager block devices changed: [{vda [/dev/disk/by-path/virtio-pci-0000:00:04.0] 10240 true } {vdb [/dev/disk/by-label/config-2 /dev/disk/by-path/virtio-pci-0000:00:05.0 /dev/disk/by-uuid/D019-492C] config-2 D019-492C 64 vfat false }] machine-0: 03:55:47 DEBUG juju.worker.dependency "serving-info-setter" manifold worker stopped: resource permanently unavailable machine-0: 03:55:47 DEBUG juju.worker.dependency "unit-agent-deployer" manifold worker started machine-0: 03:55:47 DEBUG juju.utils.ssh reading authorised keys file /home/ubuntu/.ssh/authorized_keys machine-0: 03:55:47 DEBUG juju.utils.ssh reading authorised keys file /home/ubuntu/.ssh/authorized_keys machine-0: 03:55:47 DEBUG juju.utils.ssh writing authorised keys file /home/ubuntu/.ssh/authorized_keys machine-0: 03:55:47 DEBUG juju.worker.proxyupdater new proxy settings proxy.Settings{Http:"", Https:"", Ftp:"", NoProxy:"127.0.0.1,172.16.0.6,172.30.10.113,::1,localhost", AutoNoProxy:""} machine-0: 03:55:47 DEBUG juju.tools.lxdclient connecting to LXD remote "local": "unix:///var/lib/lxd/unix.socket" machine-0: 03:55:47 DEBUG juju.worker.dependency "ssh-identity-writer" manifold worker stopped: dependency not available machine-0: 03:55:47 DEBUG juju.network no lxc bridge addresses to filter for 
machine machine-0: 03:55:47 DEBUG juju.network cannot get "lxdbr0" addresses: route ip+net: no such network interface (ignoring) machine-0: 03:55:47 DEBUG juju.network cannot get "virbr0" addresses: route ip+net: no such network interface (ignoring) machine-0: 03:55:47 DEBUG juju.network including address local-machine:127.0.0.1 for machine machine-0: 03:55:47 DEBUG juju.network including address local-cloud:172.16.0.8 for machine machine-0: 03:55:47 DEBUG juju.network including address local-machine:::1 for machine machine-0: 03:55:47 DEBUG juju.network addresses after filtering: [local-machine:127.0.0.1 local-cloud:172.16.0.8 local-machine:::1] machine-0: 03:55:47 INFO juju.worker.machiner setting addresses for "machine-0" to [local-machine:127.0.0.1 local-cloud:172.16.0.8 local-machine:::1] machine-0: 03:55:47 DEBUG juju.worker.dependency "tools-version-checker" manifold worker stopped: dependency not available machine-0: 03:55:47 DEBUG juju.worker.logger reconfiguring logging from "=DEBUG" to "=DEBUG;unit=DEBUG" machine-0: 03:55:47 DEBUG juju.network no lxc bridge addresses to filter for machine machine-0: 03:55:47 DEBUG juju.network cannot get "lxdbr0" addresses: route ip+net: no such network interface (ignoring) machine-0: 03:55:47 DEBUG juju.network cannot get "virbr0" addresses: route ip+net: no such network interface (ignoring) machine-0: 03:55:47 DEBUG juju.network including address public:172.30.10.113 for machine machine-0: 03:55:47 DEBUG juju.network including address local-cloud:172.16.0.6 for machine machine-0: 03:55:47 DEBUG juju.network including address local-machine:127.0.0.1 for machine machine-0: 03:55:47 DEBUG juju.network including address local-machine:::1 for machine machine-0: 03:55:47 DEBUG juju.network addresses after filtering: [public:172.30.10.113 local-cloud:172.16.0.6 local-machine:127.0.0.1 local-machine:::1] machine-0: 03:55:47 DEBUG juju.worker.apiaddressupdater updating API hostPorts to [[172.30.10.113:17070 172.16.0.6:17070 127.0.0.1:17070 [::1]:17070]] machine-0: 03:55:47 DEBUG juju.worker.reboot Reboot worker got action: noop machine-0: 03:55:47 DEBUG juju.service discovered init system "systemd" from series "xenial" machine-0: 03:55:47 DEBUG juju.worker.dependency "mgo-txn-resumer" manifold worker stopped: dependency not available machine-0: 03:55:47 DEBUG juju.container.kvm kvm-ok output: INFO: /dev/kvm exists KVM acceleration can be used machine-0: 03:55:47 DEBUG juju.worker.storageprovisioner filesystems alive: [], dying: [], dead: [] machine-0: 03:55:47 DEBUG juju.worker.storageprovisioner volumes alive: [], dying: [], dead: [] machine-0: 03:55:47 DEBUG juju.agent API server address details [["172.30.10.113:17070" "172.16.0.6:17070" "127.0.0.1:17070" "[::1]:17070"]] written to agent config as ["172.16.0.6:17070" "172.30.10.113:17070"] machine-0: 03:55:47 DEBUG juju.worker.storageprovisioner filesystem attachment alive: [], dying: [], dead: [] machine-0: 03:55:47 INFO juju.worker.authenticationworker "machine-0" key updater worker started machine-0: 03:55:47 DEBUG juju.worker.storageprovisioner volume attachments alive: [], dying: [], dead: [] machine-0: 03:55:48 INFO juju.worker.deployer checking unit "abot-epc-basic/0" machine-0: 03:55:48 DEBUG juju.worker.hostkeyreporter 4 SSH host keys reported for machine 0 machine-0: 03:55:48 DEBUG juju.worker.dependency "host-key-reporter" manifold worker stopped: resource permanently unavailable machine-0: 03:55:48 INFO juju.worker.deployer deploying unit "abot-epc-basic/0" machine-0: 03:55:48 INFO 
juju.worker.machiner "machine-0" started machine-0: 03:55:49 DEBUG juju.worker start "0-container-watcher" machine-0: 03:55:49 DEBUG juju.worker start "stateconverter" machine-0: 03:55:49 DEBUG juju.worker.dependency "unconverted-api-workers" manifold worker started machine-0: 03:55:49 INFO juju.worker start "0-container-watcher" machine-0: 03:55:49 DEBUG juju.worker "0-container-watcher" started machine-0: 03:55:49 INFO juju.worker start "stateconverter" machine-0: 03:55:49 DEBUG juju.cmd.jujud upgrades done, starting worker "0-container-watcher" machine-0: 03:55:49 DEBUG juju.worker "stateconverter" started machine-0: 03:55:49 DEBUG juju.service discovered init system "systemd" from local host unit-oai-epc-0: 03:55:49 DEBUG unit.oai-epc/0.install ++ grep -c install unit-oai-epc-0: 03:55:49 DEBUG unit.oai-epc/0.install ++ dpkg --get-selections at unit-oai-epc-0: 03:55:49 DEBUG unit.oai-epc/0.install + atAlreadyInstalled=1 unit-oai-epc-0: 03:55:49 DEBUG unit.oai-epc/0.install + '[' '!' 1 -eq 1 ']' unit-oai-epc-0: 03:55:49 DEBUG unit.oai-epc/0.install ++ grep -c install unit-oai-epc-0: 03:55:49 DEBUG unit.oai-epc/0.install ++ dpkg --get-selections virt-what unit-oai-epc-0: 03:55:49 DEBUG unit.oai-epc/0.install + virtwhatAlreadyInstalled=0 unit-oai-epc-0: 03:55:49 DEBUG unit.oai-epc/0.install + true unit-oai-epc-0: 03:55:49 DEBUG unit.oai-epc/0.install + '[' '!' 0 -eq 1 ']' unit-oai-epc-0: 03:55:49 DEBUG unit.oai-epc/0.install + apt-get install -y --force-yes virt-what unit-oai-epc-0: 03:55:49 DEBUG unit.oai-epc/0.install Reading package lists... machine-0: 03:55:49 DEBUG juju.worker.deployer state addresses: ["172.16.0.6:37017"] machine-0: 03:55:49 DEBUG juju.worker.deployer API addresses: ["172.16.0.6:17070" "172.30.10.113:17070"] machine-0: 03:55:49 INFO juju.service Installing and starting service &{Service:{Name:jujud-unit-abot-epc-basic-0 Conf:{Desc:juju unit agent for abot-epc-basic/0 Transient:false AfterStopped: Env:map[JUJU_CONTAINER_TYPE:] Limit:map[] Timeout:300 ExecStart:/var/lib/juju/init/jujud-unit-abot-epc-basic-0/exec-start.sh ExecStopPost: Logfile:/var/log/juju/unit-abot-epc-basic-0.log ExtraScript: ServiceBinary:/var/lib/juju/tools/unit-abot-epc-basic-0/jujud ServiceArgs:[unit --data-dir /var/lib/juju --unit-name abot-epc-basic/0 --debug]}} ConfName:jujud-unit-abot-epc-basic-0.service UnitName:jujud-unit-abot-epc-basic-0.service Dirname:/var/lib/juju/init/jujud-unit-abot-epc-basic-0 Script:[35 33 47 117 115 114 47 98 105 110 47 101 110 118 32 98 97 115 104 10 10 35 32 83 101 116 32 117 112 32 108 111 103 103 105 110 103 46 10 116 111 117 99 104 32 39 47 118 97 114 47 108 111 103 47 106 117 106 117 47 117 110 105 116 45 97 98 111 116 45 101 112 99 45 98 97 115 105 99 45 48 46 108 111 103 39 10 99 104 111 119 110 32 115 121 115 108 111 103 58 115 121 115 108 111 103 32 39 47 118 97 114 47 108 111 103 47 106 117 106 117 47 117 110 105 116 45 97 98 111 116 45 101 112 99 45 98 97 115 105 99 45 48 46 108 111 103 39 10 99 104 109 111 100 32 48 54 48 48 32 39 47 118 97 114 47 108 111 103 47 106 117 106 117 47 117 110 105 116 45 97 98 111 116 45 101 112 99 45 98 97 115 105 99 45 48 46 108 111 103 39 10 101 120 101 99 32 62 62 32 39 47 118 97 114 47 108 111 103 47 106 117 106 117 47 117 110 105 116 45 97 98 111 116 45 101 112 99 45 98 97 115 105 99 45 48 46 108 111 103 39 10 101 120 101 99 32 50 62 38 49 10 10 35 32 82 117 110 32 116 104 101 32 115 99 114 105 112 116 46 10 39 47 118 97 114 47 108 105 98 47 106 117 106 117 47 116 111 111 108 115 47 117 110 105 116 45 97 98 111 116 45 
101 112 99 45 98 97 115 105 99 45 48 47 106 117 106 117 100 39 32 117 110 105 116 32 45 45 100 97 116 97 45 100 105 114 32 39 47 118 97 114 47 108 105 98 47 106 117 106 117 39 32 45 45 117 110 105 116 45 110 97 109 101 32 97 98 111 116 45 101 112 99 45 98 97 115 105 99 47 48 32 45 45 100 101 98 117 103]} unit-oai-epc-0: 03:55:49 DEBUG unit.oai-epc/0.install Building dependency tree... unit-oai-epc-0: 03:55:49 DEBUG unit.oai-epc/0.install Reading state information... unit-oai-epc-0: 03:55:49 DEBUG unit.oai-epc/0.install The following NEW packages will be installed: unit-oai-epc-0: 03:55:49 DEBUG unit.oai-epc/0.install virt-what machine-0: 03:55:50 DEBUG juju.service.systemd service "jujud-unit-abot-epc-basic-0" successfully installed machine-0: 03:55:50 DEBUG juju.service.systemd service "jujud-unit-abot-epc-basic-0" not running unit-oai-epc-0: 03:55:50 DEBUG unit.oai-epc/0.install 0 upgraded, 1 newly installed, 0 to remove and 0 not upgraded. unit-oai-epc-0: 03:55:50 DEBUG unit.oai-epc/0.install Need to get 13.6 kB of archives. unit-oai-epc-0: 03:55:50 DEBUG unit.oai-epc/0.install After this operation, 66.6 kB of additional disk space will be used. unit-oai-epc-0: 03:55:50 DEBUG unit.oai-epc/0.install Get:1 http://nova.clouds.archive.ubuntu.com/ubuntu/ trusty/universe virt-what amd64 1.13-1 [13.6 kB] machine-0: 03:55:50 DEBUG juju.service.systemd service "jujud-unit-abot-epc-basic-0" successfully started unit-abot-epc-basic-0: 03:55:50 INFO juju.cmd running jujud [2.2.5 gc go1.8] unit-abot-epc-basic-0: 03:55:50 DEBUG juju.cmd args: []string{"/var/lib/juju/tools/unit-abot-epc-basic-0/jujud", "unit", "--data-dir", "/var/lib/juju", "--unit-name", "abot-epc-basic/0", "--debug"} unit-abot-epc-basic-0: 03:55:50 DEBUG juju.agent read agent config, format "2.0" unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker start "api" unit-abot-epc-basic-0: 03:55:50 INFO juju.worker start "api" unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "hook-retry-strategy" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "metric-spool" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "api-caller" manifold worker stopped: "agent" not running: dependency not available unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "migration-inactive-flag" manifold worker stopped: "api-caller" not running: dependency not available unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "uniter" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "migration-fortress" manifold worker started unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "upgrader" manifold worker stopped: "agent" not running: dependency not available unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "logging-config-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "charm-dir" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "leadership-tracker" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.introspection introspection worker 
listening on "@jujud-unit-abot-epc-basic-0" unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker "api" started unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.introspection stats worker now serving unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "proxy-config-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "agent" manifold worker started unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "api-config-watcher" manifold worker stopped: "agent" not running: dependency not available unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "log-sender" manifold worker stopped: "api-caller" not running: dependency not available unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "meter-status" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "metric-sender" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "metric-collect" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "migration-minion" manifold worker stopped: "agent" not running: dependency not available unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "api-address-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "hook-retry-strategy" manifold worker stopped: unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "metric-spool" manifold worker stopped: unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "uniter" manifold worker stopped: unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "logging-config-updater" manifold worker stopped: unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "leadership-tracker" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "meter-status" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "proxy-config-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "hook-retry-strategy" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "metric-spool" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "metric-collect" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "api-config-watcher" manifold worker started unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "api-caller" manifold worker stopped: unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "logging-config-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "upgrader" manifold worker stopped: "api-caller" not running: dependency not available 
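In the "Installing and starting service" entry for jujud-unit-abot-epc-basic-0 above, the Script field is logged as a raw byte slice. Decoded to text, it is evidently the exec-start.sh wrapper referenced by ExecStart, reproduced here for readability:

    #!/usr/bin/env bash

    # Set up logging.
    touch '/var/log/juju/unit-abot-epc-basic-0.log'
    chown syslog:syslog '/var/log/juju/unit-abot-epc-basic-0.log'
    chmod 0600 '/var/log/juju/unit-abot-epc-basic-0.log'
    exec >> '/var/log/juju/unit-abot-epc-basic-0.log'
    exec 2>&1

    # Run the script.
    '/var/lib/juju/tools/unit-abot-epc-basic-0/jujud' unit --data-dir '/var/lib/juju' --unit-name abot-epc-basic/0 --debug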
unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "uniter" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "api-address-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "metric-sender" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "migration-minion" manifold worker stopped: "api-caller" not running: dependency not available unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.dependency "charm-dir" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-oai-epc-0: 03:55:50 DEBUG unit.oai-epc/0.install Fetched 13.6 kB in 0s (36.4 kB/s) unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.apicaller connecting with old password unit-abot-epc-basic-0: 03:55:50 DEBUG juju.api successfully dialed "wss://172.16.0.6:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" unit-abot-epc-basic-0: 03:55:50 DEBUG juju.api successfully dialed "wss://172.30.10.113:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" unit-abot-epc-basic-0: 03:55:50 INFO juju.api connection established to "wss://172.16.0.6:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" machine-0: 03:55:50 DEBUG juju.worker.machiner observed network config updated for "machine-0" to [{1 127.0.0.0/8 65536 0 lo loopback false false loopback 127.0.0.1 [] [] []} {1 ::1/128 65536 0 lo loopback false false loopback ::1 [] [] []} {2 fa:16:3e:34:5b:53 172.16.0.0/24 1450 0 ens3 ethernet false false static 172.16.0.8 [] [] []} {2 fa:16:3e:34:5b:53 1450 0 ens3 ethernet false false manual [] [] []} {3 76:77:0f:0b:9e:f4 1500 0 lxdbr0 bridge false false manual [] [] []} {3 76:77:0f:0b:9e:f4 1500 0 lxdbr0 bridge false false manual [] [] []}] unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.apicaller connected unit-abot-epc-basic-0: 03:55:50 DEBUG juju.worker.apicaller changing password... unit-oai-epc-0: 03:55:51 DEBUG unit.oai-epc/0.install Selecting previously unselected package virt-what. unit-oai-epc-0: 03:55:51 DEBUG unit.oai-epc/0.install (Reading database ... 53121 files and directories currently installed.) unit-oai-epc-0: 03:55:51 DEBUG unit.oai-epc/0.install Preparing to unpack .../virt-what_1.13-1_amd64.deb ... unit-oai-epc-0: 03:55:51 DEBUG unit.oai-epc/0.install Unpacking virt-what (1.13-1) ... unit-abot-epc-basic-0: 03:55:51 DEBUG juju.worker.apicaller password changed unit-abot-epc-basic-0: 03:55:51 DEBUG juju.api RPC connection died unit-abot-epc-basic-0: 03:55:51 DEBUG juju.worker.dependency "api-caller" manifold worker stopped: restart immediately unit-abot-epc-basic-0: 03:55:51 DEBUG juju.worker.apicaller connecting with current password unit-abot-epc-basic-0: 03:55:51 DEBUG juju.api successfully dialed "wss://172.16.0.6:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" unit-abot-epc-basic-0: 03:55:51 INFO juju.api connection established to "wss://172.16.0.6:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" unit-abot-epc-basic-0: 03:55:51 DEBUG juju.api successfully dialed "wss://172.30.10.113:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" unit-oai-epc-0: 03:55:51 DEBUG unit.oai-epc/0.install Processing triggers for man-db (2.6.7.1-1ubuntu1) ... 
unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.apicaller connected unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.dependency "api-caller" manifold worker started unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.dependency "leadership-tracker" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.dependency "log-sender" manifold worker started unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.dependency "upgrader" manifold worker started unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.dependency "hook-retry-strategy" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.dependency "uniter" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.dependency "api-address-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.dependency "metric-sender" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.dependency "migration-minion" manifold worker started unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.dependency "meter-status" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-abot-epc-basic-0: 03:55:52 INFO juju.worker.migrationminion migration phase is now: NONE unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.dependency "proxy-config-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.dependency "migration-inactive-flag" manifold worker started unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.dependency "logging-config-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.dependency "metric-collect" manifold worker stopped: "metric-spool" not running: dependency not available unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.dependency "metric-sender" manifold worker stopped: "metric-spool" not running: dependency not available unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.dependency "charm-dir" manifold worker started unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.leadership abot-epc-basic/0 making initial claim for abot-epc-basic leadership unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.logger initial log config: "=DEBUG" unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.proxyupdater new proxy settings proxy.Settings{Http:"", Https:"", Ftp:"", NoProxy:"127.0.0.1,172.16.0.6,172.30.10.113,::1,localhost", AutoNoProxy:""} unit-abot-epc-basic-0: 03:55:52 DEBUG juju.network no lxc bridge addresses to filter for machine unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.dependency "leadership-tracker" manifold worker started unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.dependency "hook-retry-strategy" manifold worker started unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.dependency "proxy-config-updater" manifold worker started unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.dependency "api-address-updater" manifold worker started unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.dependency "uniter" manifold worker stopped: "leadership-tracker" not running: dependency not available unit-abot-epc-basic-0: 
03:55:52 DEBUG juju.worker.dependency "metric-spool" manifold worker started unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.dependency "metric-collect" manifold worker stopped: unit-abot-epc-basic-0: 03:55:52 DEBUG juju.network "lxdbr0" has addresses [fe80::7477:fff:fe0b:9ef4/64 fe80::1/64] unit-abot-epc-basic-0: 03:55:52 DEBUG juju.network cannot get "virbr0" addresses: route ip+net: no such network interface (ignoring) unit-abot-epc-basic-0: 03:55:52 DEBUG juju.network including address public:172.30.10.113 for machine unit-abot-epc-basic-0: 03:55:52 DEBUG juju.network including address local-cloud:172.16.0.6 for machine unit-abot-epc-basic-0: 03:55:52 DEBUG juju.network including address local-machine:127.0.0.1 for machine unit-abot-epc-basic-0: 03:55:52 DEBUG juju.network including address local-machine:::1 for machine unit-abot-epc-basic-0: 03:55:52 DEBUG juju.network addresses after filtering: [public:172.30.10.113 local-cloud:172.16.0.6 local-machine:127.0.0.1 local-machine:::1] unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.apiaddressupdater updating API hostPorts to [[172.30.10.113:17070 172.16.0.6:17070 127.0.0.1:17070 [::1]:17070]] unit-abot-epc-basic-0: 03:55:52 DEBUG juju.agent API server address details [["172.30.10.113:17070" "172.16.0.6:17070" "127.0.0.1:17070" "[::1]:17070"]] written to agent config as ["172.16.0.6:17070" "172.30.10.113:17070"] unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.dependency "logging-config-updater" manifold worker started unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.proxyupdater new apt proxy settings proxy.Settings{Http:"", Https:"", Ftp:"", NoProxy:"127.0.0.1,::1,localhost", AutoNoProxy:""} unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.logger logger setup unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.logger reconfiguring logging from "=DEBUG" to "=DEBUG;unit=DEBUG" unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.dependency "meter-status" manifold worker started unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.meterstatus got meter status change signal from watcher unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.meterstatus acquire lock "machine-lock" for meter status hook execution unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.meterstatus lock "machine-lock" acquired unit-abot-epc-basic-0: 03:55:52 DEBUG juju.worker.meterstatus release lock "machine-lock" for meter status hook execution unit-abot-epc-basic-0: 03:55:52 INFO juju.worker.meterstatus skipped "meter-status-changed" hook (missing) unit-abot-epc-basic-0: 03:55:52 INFO juju.worker.upgrader abort check blocked until version event received unit-abot-epc-basic-0: 03:55:53 INFO juju.worker.upgrader unblocking abort check unit-abot-epc-basic-0: 03:55:53 INFO juju.worker.upgrader desired tool version: 2.2.5 unit-abot-epc-basic-0: 03:55:53 DEBUG juju.worker.dependency "metric-sender" manifold worker started unit-abot-epc-basic-0: 03:55:53 DEBUG juju.worker.dependency "uniter" manifold worker started unit-abot-epc-basic-0: 03:55:53 INFO juju.worker.leadership abot-epc-basic/0 promoted to leadership of abot-epc-basic unit-oai-epc-0: 03:55:53 DEBUG unit.oai-epc/0.install Setting up virt-what (1.13-1) ... 
unit-abot-epc-basic-0: 03:55:53 INFO worker.uniter.jujuc ensure jujuc symlinks in /var/lib/juju/tools/unit-abot-epc-basic-0 unit-abot-epc-basic-0: 03:55:53 INFO worker.uniter.jujuc was a symlink, now looking at /var/lib/juju/tools/2.2.5-xenial-amd64 unit-abot-epc-basic-0: 03:55:53 DEBUG worker.uniter.jujuc jujud path /var/lib/juju/tools/2.2.5-xenial-amd64/jujud unit-abot-epc-basic-0: 03:55:53 DEBUG juju.worker.uniter starting juju-run listener on unix:/var/lib/juju/agents/unit-abot-epc-basic-0/run.socket unit-abot-epc-basic-0: 03:55:53 INFO juju.worker.uniter unit "abot-epc-basic/0" started unit-abot-epc-basic-0: 03:55:53 DEBUG juju.worker.uniter juju-run listener running unit-abot-epc-basic-0: 03:55:53 INFO juju.worker.uniter resuming charm install unit-abot-epc-basic-0: 03:55:53 DEBUG juju.worker.uniter.operation running operation install local:xenial/abot-epc-basic-1 unit-abot-epc-basic-0: 03:55:53 DEBUG juju.worker.uniter.operation preparing operation "install local:xenial/abot-epc-basic-1" unit-abot-epc-basic-0: 03:55:53 INFO juju.worker.uniter.charm downloading local:xenial/abot-epc-basic-1 from API server unit-abot-epc-basic-0: 03:55:53 INFO juju.downloader downloading from local:xenial/abot-epc-basic-1 unit-abot-epc-basic-0: 03:55:53 DEBUG httpbakery client do GET https://172.16.0.6:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/charms?file=%2A&url=local%3Axenial%2Fabot-epc-basic-1 { unit-abot-epc-basic-0: 03:55:53 DEBUG httpbakery } -> error unit-abot-epc-basic-0: 03:55:53 INFO juju.downloader download complete ("local:xenial/abot-epc-basic-1") unit-abot-epc-basic-0: 03:55:53 INFO juju.downloader download verified ("local:xenial/abot-epc-basic-1") unit-abot-epc-basic-0: 03:55:54 DEBUG juju.worker.uniter.operation executing operation "install local:xenial/abot-epc-basic-1" unit-abot-epc-basic-0: 03:55:54 DEBUG juju.worker.uniter.charm preparing to deploy charm "local:xenial/abot-epc-basic-1" unit-abot-epc-basic-0: 03:55:54 DEBUG juju.worker.uniter.charm deploying charm "local:xenial/abot-epc-basic-1" unit-abot-epc-basic-0: 03:55:57 DEBUG juju.worker.uniter.charm finishing deploy of charm "local:xenial/abot-epc-basic-1" unit-abot-epc-basic-0: 03:55:57 DEBUG juju.worker.uniter.operation committing operation "install local:xenial/abot-epc-basic-1" unit-abot-epc-basic-0: 03:55:57 INFO juju.worker.uniter hooks are retried true unit-abot-epc-basic-0: 03:55:57 DEBUG juju.worker.uniter.remotestate got relations change: ok=true unit-abot-epc-basic-0: 03:55:57 DEBUG juju.worker.uniter.remotestate got address change: ok=true unit-abot-epc-basic-0: 03:55:57 DEBUG juju.worker.uniter.remotestate got config change: ok=true unit-abot-epc-basic-0: 03:55:57 DEBUG juju.worker.uniter.remotestate got service change unit-abot-epc-basic-0: 03:55:57 DEBUG juju.worker.uniter.remotestate got action change: [] ok=true unit-abot-epc-basic-0: 03:55:57 DEBUG juju.worker.uniter.remotestate got unit change unit-abot-epc-basic-0: 03:55:57 DEBUG juju.worker.uniter.remotestate got storage change: [] ok=true unit-abot-epc-basic-0: 03:55:57 DEBUG juju.worker.uniter.remotestate got leader settings change: ok=true unit-abot-epc-basic-0: 03:55:57 INFO juju.worker.uniter.storage initial storage attachments ready unit-abot-epc-basic-0: 03:55:57 INFO juju.worker.uniter found queued "install" hook unit-abot-epc-basic-0: 03:55:57 DEBUG juju.worker.uniter.operation running operation run install hook unit-abot-epc-basic-0: 03:55:57 DEBUG juju.worker.uniter acquire lock "machine-lock" for uniter hook execution 
unit-abot-epc-basic-0: 03:55:57 DEBUG juju.worker.uniter lock "machine-lock" acquired unit-abot-epc-basic-0: 03:55:57 DEBUG juju.worker.uniter.operation preparing operation "run install hook" unit-abot-epc-basic-0: 03:55:58 DEBUG juju.worker.uniter.operation executing operation "run install hook" unit-oai-epc-0: 03:55:58 DEBUG unit.oai-epc/0.install + PASSWORD=linux unit-oai-epc-0: 03:55:58 DEBUG unit.oai-epc/0.install + debconf-set-selections unit-oai-epc-0: 03:55:58 DEBUG unit.oai-epc/0.install + echo 'mysql-server mysql-server/root_password password linux' unit-abot-epc-basic-0: 03:55:58 DEBUG juju.worker.uniter [AGENT-STATUS] executing: running install hook unit-oai-epc-0: 03:55:58 DEBUG unit.oai-epc/0.install + debconf-set-selections unit-oai-epc-0: 03:55:58 DEBUG unit.oai-epc/0.install + echo 'mysql-server mysql-server/root_password_again password linux' unit-abot-epc-basic-0: 03:55:59 DEBUG worker.uniter.jujuc running hook tool "config-get" unit-abot-epc-basic-0: 03:55:59 DEBUG worker.uniter.jujuc running hook tool "config-get" unit-abot-epc-basic-0: 03:55:59 DEBUG unit.abot-epc-basic/0.install sudo: unable to resolve host juju-2cc5d4-default-0 unit-abot-epc-basic-0: 03:55:59 DEBUG unit.abot-epc-basic/0.install << ABOT Installer - Success: Checking sudo privileges >> unit-abot-epc-basic-0: 03:55:59 DEBUG unit.abot-epc-basic/0.install << ABOT Installer - Success: Application Name abot-functest-basic >> unit-abot-epc-basic-0: 03:55:59 DEBUG unit.abot-epc-basic/0.install << ABOT Installer - Success: Version 3.1.0 >> unit-abot-epc-basic-0: 03:55:59 DEBUG unit.abot-epc-basic/0.install << ABOT Installer - Success: External Tester none >> unit-abot-epc-basic-0: 03:55:59 DEBUG unit.abot-epc-basic/0.install Oracle Java Prerequisites... unit-oai-epc-0: 03:55:59 DEBUG unit.oai-epc/0.install + apt install -y mysql-client unit-oai-epc-0: 03:55:59 DEBUG unit.oai-epc/0.install unit-oai-epc-0: 03:55:59 DEBUG unit.oai-epc/0.install WARNING: apt does not have a stable CLI interface yet. Use with caution in scripts. unit-oai-epc-0: 03:55:59 DEBUG unit.oai-epc/0.install unit-oai-epc-0: 03:55:59 DEBUG unit.oai-epc/0.install Reading package lists... unit-oai-epc-0: 03:55:59 DEBUG unit.oai-epc/0.install Building dependency tree... unit-oai-epc-0: 03:55:59 DEBUG unit.oai-epc/0.install Reading state information... unit-oai-epc-0: 03:55:59 DEBUG unit.oai-epc/0.install The following extra packages will be installed: unit-oai-epc-0: 03:55:59 DEBUG unit.oai-epc/0.install libdbd-mysql-perl libdbi-perl libmysqlclient18 libterm-readkey-perl unit-oai-epc-0: 03:55:59 DEBUG unit.oai-epc/0.install mysql-client-5.5 mysql-client-core-5.5 mysql-common unit-oai-epc-0: 03:55:59 DEBUG unit.oai-epc/0.install Suggested packages: unit-oai-epc-0: 03:55:59 DEBUG unit.oai-epc/0.install libclone-perl libmldbm-perl libnet-daemon-perl libplrpc-perl unit-oai-epc-0: 03:55:59 DEBUG unit.oai-epc/0.install libsql-statement-perl unit-oai-epc-0: 03:55:59 DEBUG unit.oai-epc/0.install The following NEW packages will be installed: unit-oai-epc-0: 03:55:59 DEBUG unit.oai-epc/0.install libdbd-mysql-perl libdbi-perl libmysqlclient18 libterm-readkey-perl unit-oai-epc-0: 03:55:59 DEBUG unit.oai-epc/0.install mysql-client mysql-client-5.5 mysql-client-core-5.5 mysql-common unit-oai-epc-0: 03:56:00 DEBUG unit.oai-epc/0.install 0 upgraded, 8 newly installed, 0 to remove and 0 not upgraded. unit-oai-epc-0: 03:56:00 DEBUG unit.oai-epc/0.install Need to get 3911 kB of archives. 
unit-oai-epc-0: 03:56:00 DEBUG unit.oai-epc/0.install After this operation, 43.8 MB of additional disk space will be used. unit-oai-epc-0: 03:56:00 DEBUG unit.oai-epc/0.install Get:1 http://nova.clouds.archive.ubuntu.com/ubuntu/ trusty-updates/main mysql-common all 5.5.60-0ubuntu0.14.04.1 [12.7 kB] unit-oai-epc-0: 03:56:00 DEBUG unit.oai-epc/0.install Get:2 http://nova.clouds.archive.ubuntu.com/ubuntu/ trusty-updates/main libmysqlclient18 amd64 5.5.60-0ubuntu0.14.04.1 [597 kB] unit-oai-epc-0: 03:56:00 DEBUG unit.oai-epc/0.install Get:3 http://nova.clouds.archive.ubuntu.com/ubuntu/ trusty/main libdbi-perl amd64 1.630-1 [879 kB] unit-oai-epc-0: 03:56:00 DEBUG unit.oai-epc/0.install Get:4 http://nova.clouds.archive.ubuntu.com/ubuntu/ trusty-updates/main libdbd-mysql-perl amd64 4.025-1ubuntu0.1 [87.6 kB] unit-oai-epc-0: 03:56:00 DEBUG unit.oai-epc/0.install Get:5 http://nova.clouds.archive.ubuntu.com/ubuntu/ trusty/main libterm-readkey-perl amd64 2.31-1 [27.4 kB] unit-oai-epc-0: 03:56:00 DEBUG unit.oai-epc/0.install Get:6 http://nova.clouds.archive.ubuntu.com/ubuntu/ trusty-updates/main mysql-client-core-5.5 amd64 5.5.60-0ubuntu0.14.04.1 [707 kB] unit-oai-epc-0: 03:56:01 DEBUG unit.oai-epc/0.install Get:7 http://nova.clouds.archive.ubuntu.com/ubuntu/ trusty-updates/main mysql-client-5.5 amd64 5.5.60-0ubuntu0.14.04.1 [1589 kB] unit-oai-epc-0: 03:56:01 DEBUG unit.oai-epc/0.install Get:8 http://nova.clouds.archive.ubuntu.com/ubuntu/ trusty-updates/main mysql-client all 5.5.60-0ubuntu0.14.04.1 [10.9 kB] unit-oai-epc-0: 03:56:01 DEBUG unit.oai-epc/0.install Fetched 3911 kB in 1s (3026 kB/s) unit-abot-epc-basic-0: 03:56:01 DEBUG unit.abot-epc-basic/0.install gpg: keyring `/tmp/tmposntw9kx/secring.gpg' created unit-abot-epc-basic-0: 03:56:01 DEBUG unit.abot-epc-basic/0.install gpg: keyring `/tmp/tmposntw9kx/pubring.gpg' created unit-abot-epc-basic-0: 03:56:01 DEBUG unit.abot-epc-basic/0.install gpg: requesting key EEA14886 from hkp server keyserver.ubuntu.com unit-oai-epc-0: 03:56:01 DEBUG unit.oai-epc/0.install Selecting previously unselected package mysql-common. unit-oai-epc-0: 03:56:01 DEBUG unit.oai-epc/0.install (Reading database ... 53128 files and directories currently installed.) unit-oai-epc-0: 03:56:01 DEBUG unit.oai-epc/0.install Preparing to unpack .../mysql-common_5.5.60-0ubuntu0.14.04.1_all.deb ... unit-oai-epc-0: 03:56:01 DEBUG unit.oai-epc/0.install Progress: [ 0%] Progress: [ 2%] Unpacking mysql-common (5.5.60-0ubuntu0.14.04.1) ... 
unit-abot-epc-basic-0: 03:56:01 DEBUG unit.abot-epc-basic/0.install gpg: /tmp/tmposntw9kx/trustdb.gpg: trustdb created unit-abot-epc-basic-0: 03:56:01 DEBUG unit.abot-epc-basic/0.install gpg: key EEA14886: public key "Launchpad VLC" imported unit-abot-epc-basic-0: 03:56:01 DEBUG unit.abot-epc-basic/0.install gpg: no ultimately trusted keys found unit-abot-epc-basic-0: 03:56:01 DEBUG unit.abot-epc-basic/0.install gpg: Total number processed: 1 unit-abot-epc-basic-0: 03:56:01 DEBUG unit.abot-epc-basic/0.install gpg: imported: 1 (RSA: 1) unit-abot-epc-basic-0: 03:56:02 DEBUG unit.abot-epc-basic/0.install OK unit-abot-epc-basic-0: 03:56:02 DEBUG unit.abot-epc-basic/0.install Get:1 http://ppa.launchpad.net/webupd8team/java/ubuntu xenial InRelease [17.5 kB] unit-abot-epc-basic-0: 03:56:02 DEBUG unit.abot-epc-basic/0.install Hit:2 http://security.ubuntu.com/ubuntu xenial-security InRelease unit-abot-epc-basic-0: 03:56:02 DEBUG unit.abot-epc-basic/0.install Hit:3 http://nova.clouds.archive.ubuntu.com/ubuntu xenial InRelease unit-abot-epc-basic-0: 03:56:02 DEBUG unit.abot-epc-basic/0.install Hit:4 http://nova.clouds.archive.ubuntu.com/ubuntu xenial-updates InRelease unit-oai-epc-0: 03:56:02 DEBUG unit.oai-epc/0.install Progress: [ 5%] Progress: [ 7%] Selecting previously unselected package libmysqlclient18:amd64. unit-oai-epc-0: 03:56:02 DEBUG unit.oai-epc/0.install Preparing to unpack .../libmysqlclient18_5.5.60-0ubuntu0.14.04.1_amd64.deb ... unit-abot-epc-basic-0: 03:56:02 DEBUG unit.abot-epc-basic/0.install Hit:5 http://nova.clouds.archive.ubuntu.com/ubuntu xenial-backports InRelease unit-oai-epc-0: 03:56:03 DEBUG unit.oai-epc/0.install Progress: [ 10%] Unpacking libmysqlclient18:amd64 (5.5.60-0ubuntu0.14.04.1) ... unit-abot-epc-basic-0: 03:56:03 DEBUG unit.abot-epc-basic/0.install Get:6 http://ppa.launchpad.net/webupd8team/java/ubuntu xenial/main amd64 Packages [1556 B] unit-abot-epc-basic-0: 03:56:03 DEBUG unit.abot-epc-basic/0.install Get:7 http://ppa.launchpad.net/webupd8team/java/ubuntu xenial/main Translation-en [928 B] unit-oai-epc-0: 03:56:04 DEBUG unit.oai-epc/0.install Progress: [ 12%] Progress: [ 15%] Selecting previously unselected package libdbi-perl. unit-oai-epc-0: 03:56:04 DEBUG unit.oai-epc/0.install Preparing to unpack .../libdbi-perl_1.630-1_amd64.deb ... unit-oai-epc-0: 03:56:04 DEBUG unit.oai-epc/0.install Progress: [ 17%] Unpacking libdbi-perl (1.630-1) ... unit-oai-epc-0: 03:56:05 DEBUG unit.oai-epc/0.install Progress: [ 20%] Progress: [ 22%] Selecting previously unselected package libdbd-mysql-perl. unit-oai-epc-0: 03:56:05 DEBUG unit.oai-epc/0.install Preparing to unpack .../libdbd-mysql-perl_4.025-1ubuntu0.1_amd64.deb ... unit-oai-epc-0: 03:56:05 DEBUG unit.oai-epc/0.install Progress: [ 25%] Unpacking libdbd-mysql-perl (4.025-1ubuntu0.1) ... unit-oai-epc-0: 03:56:06 DEBUG unit.oai-epc/0.install Progress: [ 27%] Progress: [ 30%] Selecting previously unselected package libterm-readkey-perl. unit-oai-epc-0: 03:56:06 DEBUG unit.oai-epc/0.install Preparing to unpack .../libterm-readkey-perl_2.31-1_amd64.deb ... unit-oai-epc-0: 03:56:06 DEBUG unit.oai-epc/0.install Progress: [ 32%] Unpacking libterm-readkey-perl (2.31-1) ... unit-oai-epc-0: 03:56:07 DEBUG unit.oai-epc/0.install Progress: [ 35%] Progress: [ 37%] Selecting previously unselected package mysql-client-core-5.5. unit-oai-epc-0: 03:56:07 DEBUG unit.oai-epc/0.install Preparing to unpack .../mysql-client-core-5.5_5.5.60-0ubuntu0.14.04.1_amd64.deb ... 
unit-oai-epc-0: 03:56:07 DEBUG unit.oai-epc/0.install Progress: [ 40%] Unpacking mysql-client-core-5.5 (5.5.60-0ubuntu0.14.04.1) ... unit-oai-epc-0: 03:56:08 DEBUG unit.oai-epc/0.install Progress: [ 42%] Progress: [ 45%] Selecting previously unselected package mysql-client-5.5. unit-oai-epc-0: 03:56:08 DEBUG unit.oai-epc/0.install Preparing to unpack .../mysql-client-5.5_5.5.60-0ubuntu0.14.04.1_amd64.deb ... unit-oai-epc-0: 03:56:08 DEBUG unit.oai-epc/0.install Progress: [ 47%] Unpacking mysql-client-5.5 (5.5.60-0ubuntu0.14.04.1) ... unit-abot-epc-basic-0: 03:56:10 DEBUG unit.abot-epc-basic/0.install Fetched 20.0 kB in 1s (12.4 kB/s) unit-oai-epc-0: 03:56:10 DEBUG unit.oai-epc/0.install Progress: [ 50%] Progress: [ 52%] Selecting previously unselected package mysql-client. unit-oai-epc-0: 03:56:10 DEBUG unit.oai-epc/0.install Preparing to unpack .../mysql-client_5.5.60-0ubuntu0.14.04.1_all.deb ... unit-oai-epc-0: 03:56:10 DEBUG unit.oai-epc/0.install Progress: [ 55%] Unpacking mysql-client (5.5.60-0ubuntu0.14.04.1) ... unit-oai-epc-0: 03:56:11 DEBUG unit.oai-epc/0.install Progress: [ 57%] Progress: [ 60%] Processing triggers for man-db (2.6.7.1-1ubuntu1) ... unit-oai-epc-0: 03:56:13 DEBUG unit.oai-epc/0.install Setting up mysql-common (5.5.60-0ubuntu0.14.04.1) ... unit-oai-epc-0: 03:56:14 DEBUG unit.oai-epc/0.install Progress: [ 62%] Progress: [ 65%] Setting up libmysqlclient18:amd64 (5.5.60-0ubuntu0.14.04.1) ... unit-abot-epc-basic-0: 03:56:14 DEBUG unit.abot-epc-basic/0.install Reading package lists... unit-abot-epc-basic-0: 03:56:14 DEBUG unit.abot-epc-basic/0.install sudo: unable to resolve host juju-2cc5d4-default-0 machine-0: 03:56:15 INFO juju.tools.lxdclient using LXD API version "1.0" unit-oai-epc-0: 03:56:15 DEBUG unit.oai-epc/0.install Progress: [ 67%] Progress: [ 70%] Setting up libdbi-perl (1.630-1) ... unit-abot-epc-basic-0: 03:56:15 DEBUG unit.abot-epc-basic/0.install sudo: unable to resolve host juju-2cc5d4-default-0 unit-oai-epc-0: 03:56:15 DEBUG unit.oai-epc/0.install Progress: [ 72%] Progress: [ 75%] Setting up libdbd-mysql-perl (4.025-1ubuntu0.1) ... unit-abot-epc-basic-0: 03:56:16 DEBUG unit.abot-epc-basic/0.install Reading package lists... unit-abot-epc-basic-0: 03:56:16 DEBUG unit.abot-epc-basic/0.install Building dependency tree... unit-abot-epc-basic-0: 03:56:16 DEBUG unit.abot-epc-basic/0.install Reading state information... 
unit-abot-epc-basic-0: 03:56:16 DEBUG unit.abot-epc-basic/0.install The following additional packages will be installed: unit-abot-epc-basic-0: 03:56:16 DEBUG unit.abot-epc-basic/0.install binutils gsfonts gsfonts-x11 java-common libfontenc1 libxfont1 unit-abot-epc-basic-0: 03:56:16 DEBUG unit.abot-epc-basic/0.install oracle-java8-set-default x11-common xfonts-encodings xfonts-utils unit-abot-epc-basic-0: 03:56:16 DEBUG unit.abot-epc-basic/0.install Suggested packages: unit-abot-epc-basic-0: 03:56:16 DEBUG unit.abot-epc-basic/0.install binutils-doc binfmt-support visualvm ttf-baekmuk | ttf-unfonts unit-abot-epc-basic-0: 03:56:16 DEBUG unit.abot-epc-basic/0.install | ttf-unfonts-core ttf-kochi-gothic | ttf-sazanami-gothic ttf-kochi-mincho unit-abot-epc-basic-0: 03:56:16 DEBUG unit.abot-epc-basic/0.install | ttf-sazanami-mincho ttf-arphic-uming firefox | firefox-2 | iceweasel unit-abot-epc-basic-0: 03:56:16 DEBUG unit.abot-epc-basic/0.install | mozilla-firefox | iceape-browser | mozilla-browser | epiphany-gecko unit-abot-epc-basic-0: 03:56:16 DEBUG unit.abot-epc-basic/0.install | epiphany-webkit | epiphany-browser | galeon | midbrowser unit-abot-epc-basic-0: 03:56:16 DEBUG unit.abot-epc-basic/0.install | moblin-web-browser | xulrunner | xulrunner-1.9 | konqueror unit-abot-epc-basic-0: 03:56:16 DEBUG unit.abot-epc-basic/0.install | chromium-browser | midori | google-chrome unit-abot-epc-basic-0: 03:56:16 DEBUG unit.abot-epc-basic/0.install The following NEW packages will be installed: unit-abot-epc-basic-0: 03:56:16 DEBUG unit.abot-epc-basic/0.install binutils gsfonts gsfonts-x11 java-common libfontenc1 libxfont1 unit-abot-epc-basic-0: 03:56:16 DEBUG unit.abot-epc-basic/0.install oracle-java8-installer oracle-java8-set-default x11-common xfonts-encodings unit-abot-epc-basic-0: 03:56:16 DEBUG unit.abot-epc-basic/0.install xfonts-utils machine-0: 03:56:16 DEBUG juju.worker.proxyupdater new apt proxy settings proxy.Settings{Http:"", Https:"", Ftp:"", NoProxy:"127.0.0.1,::1,localhost", AutoNoProxy:""} unit-oai-epc-0: 03:56:16 DEBUG unit.oai-epc/0.install Progress: [ 77%] Progress: [ 80%] Setting up libterm-readkey-perl (2.31-1) ... unit-abot-epc-basic-0: 03:56:16 DEBUG unit.abot-epc-basic/0.install 0 upgraded, 11 newly installed, 0 to remove and 0 not upgraded. unit-abot-epc-basic-0: 03:56:16 DEBUG unit.abot-epc-basic/0.install Need to get 6520 kB of archives. unit-abot-epc-basic-0: 03:56:16 DEBUG unit.abot-epc-basic/0.install After this operation, 20.5 MB of additional disk space will be used. unit-abot-epc-basic-0: 03:56:16 DEBUG unit.abot-epc-basic/0.install Get:1 http://ppa.launchpad.net/webupd8team/java/ubuntu xenial/main amd64 oracle-java8-installer all 8u171-1~webupd8~0 [33.3 kB] unit-abot-epc-basic-0: 03:56:16 DEBUG unit.abot-epc-basic/0.install Get:2 http://nova.clouds.archive.ubuntu.com/ubuntu xenial-updates/main amd64 binutils amd64 2.26.1-1ubuntu1~16.04.6 [2311 kB] unit-abot-epc-basic-0: 03:56:17 DEBUG unit.abot-epc-basic/0.install Get:3 http://ppa.launchpad.net/webupd8team/java/ubuntu xenial/main amd64 oracle-java8-set-default all 8u171-1~webupd8~0 [6846 B] unit-abot-epc-basic-0: 03:56:18 DEBUG unit.abot-epc-basic/0.install Get:4 http://nova.clouds.archive.ubuntu.com/ubuntu xenial/main amd64 java-common all 0.56ubuntu2 [7742 B] unit-oai-epc-0: 03:56:18 DEBUG unit.oai-epc/0.install Progress: [ 82%] Progress: [ 85%] Setting up mysql-client-core-5.5 (5.5.60-0ubuntu0.14.04.1) ... 
unit-abot-epc-basic-0: 03:56:18 DEBUG unit.abot-epc-basic/0.install Get:5 http://nova.clouds.archive.ubuntu.com/ubuntu xenial/main amd64 gsfonts all 1:8.11+urwcyr1.0.7~pre44-4.2ubuntu1 [3374 kB] unit-abot-epc-basic-0: 03:56:18 DEBUG unit.abot-epc-basic/0.install Get:6 http://nova.clouds.archive.ubuntu.com/ubuntu xenial/main amd64 libfontenc1 amd64 1:1.1.3-1 [13.9 kB] unit-abot-epc-basic-0: 03:56:18 DEBUG unit.abot-epc-basic/0.install Get:7 http://nova.clouds.archive.ubuntu.com/ubuntu xenial-updates/main amd64 libxfont1 amd64 1:1.5.1-1ubuntu0.16.04.4 [95.0 kB] unit-abot-epc-basic-0: 03:56:18 DEBUG unit.abot-epc-basic/0.install Get:8 http://nova.clouds.archive.ubuntu.com/ubuntu xenial/main amd64 x11-common all 1:7.7+13ubuntu3 [22.4 kB] unit-abot-epc-basic-0: 03:56:19 DEBUG unit.abot-epc-basic/0.install Get:9 http://nova.clouds.archive.ubuntu.com/ubuntu xenial/main amd64 xfonts-encodings all 1:1.0.4-2 [573 kB] unit-oai-epc-0: 03:56:19 DEBUG unit.oai-epc/0.install Progress: [ 87%] Progress: [ 90%] Setting up mysql-client-5.5 (5.5.60-0ubuntu0.14.04.1) ... unit-abot-epc-basic-0: 03:56:19 DEBUG unit.abot-epc-basic/0.install Get:10 http://nova.clouds.archive.ubuntu.com/ubuntu xenial-updates/main amd64 xfonts-utils amd64 1:7.7+3ubuntu0.16.04.2 [74.6 kB] unit-abot-epc-basic-0: 03:56:19 DEBUG unit.abot-epc-basic/0.install Get:11 http://nova.clouds.archive.ubuntu.com/ubuntu xenial/universe amd64 gsfonts-x11 all 0.24 [7314 B] unit-abot-epc-basic-0: 03:56:19 DEBUG unit.abot-epc-basic/0.install Preconfiguring packages ... unit-oai-epc-0: 03:56:20 DEBUG unit.oai-epc/0.install Progress: [ 92%] Progress: [ 95%] Setting up mysql-client (5.5.60-0ubuntu0.14.04.1) ... unit-abot-epc-basic-0: 03:56:20 DEBUG unit.abot-epc-basic/0.install Fetched 6520 kB in 2s (2298 kB/s) unit-oai-epc-0: 03:56:20 DEBUG unit.oai-epc/0.install Progress: [ 97%] Progress: [100%] Processing triggers for libc-bin (2.19-0ubuntu6.14) ... unit-abot-epc-basic-0: 03:56:21 DEBUG unit.abot-epc-basic/0.install Selecting previously unselected package binutils. unit-abot-epc-basic-0: 03:56:21 DEBUG unit.abot-epc-basic/0.install (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 54317 files and directories currently installed.) unit-abot-epc-basic-0: 03:56:21 DEBUG unit.abot-epc-basic/0.install Preparing to unpack .../binutils_2.26.1-1ubuntu1~16.04.6_amd64.deb ... unit-abot-epc-basic-0: 03:56:22 DEBUG unit.abot-epc-basic/0.install Unpacking binutils (2.26.1-1ubuntu1~16.04.6) ... 
unit-oai-epc-0: 03:56:24 DEBUG unit.oai-epc/0.install ++ virt-what unit-oai-epc-0: 03:56:24 DEBUG unit.oai-epc/0.install + machine_type=kvm unit-oai-epc-0: 03:56:24 DEBUG unit.oai-epc/0.install + clone_repro unit-oai-epc-0: 03:56:24 DEBUG unit.oai-epc/0.install + juju-log 'Fetching and installing openair5G EPC' unit-oai-epc-0: 03:56:24 DEBUG worker.uniter.jujuc running hook tool "juju-log" unit-oai-epc-0: 03:56:24 INFO unit.oai-epc/0.juju-log Fetching and installing openair5G EPC unit-oai-epc-0: 03:56:24 DEBUG unit.oai-epc/0.install + status-set maintenance 'Fetching and installing Openair5G EPC' unit-oai-epc-0: 03:56:24 DEBUG worker.uniter.jujuc running hook tool "status-set" unit-oai-epc-0: 03:56:25 DEBUG unit.oai-epc/0.install + sed -ne '/-BEGIN CERTIFICATE-/,/-END CERTIFICATE-/p' unit-oai-epc-0: 03:56:25 DEBUG unit.oai-epc/0.install + openssl s_client -showcerts -connect gitlab.eurecom.fr:443 unit-oai-epc-0: 03:56:25 DEBUG unit.oai-epc/0.install + echo -n unit-oai-epc-0: 03:56:25 DEBUG unit.oai-epc/0.install + '[' -d /srv/openair-cn ']' unit-oai-epc-0: 03:56:25 DEBUG unit.oai-epc/0.install + cp -f /etc/hosts /home unit-oai-epc-0: 03:56:25 DEBUG unit.oai-epc/0.install + git clone https://gitlab.eurecom.fr/oai/openair-cn.git /srv/openair-cn unit-oai-epc-0: 03:56:25 DEBUG unit.oai-epc/0.install Cloning into '/srv/openair-cn'... unit-abot-epc-basic-0: 03:56:25 DEBUG unit.abot-epc-basic/0.install Selecting previously unselected package java-common. unit-abot-epc-basic-0: 03:56:25 DEBUG unit.abot-epc-basic/0.install Preparing to unpack .../java-common_0.56ubuntu2_all.deb ... unit-abot-epc-basic-0: 03:56:26 DEBUG unit.abot-epc-basic/0.install Unpacking java-common (0.56ubuntu2) ... unit-abot-epc-basic-0: 03:56:26 DEBUG unit.abot-epc-basic/0.install Processing triggers for libc-bin (2.23-0ubuntu10) ... unit-abot-epc-basic-0: 03:56:26 DEBUG unit.abot-epc-basic/0.install Processing triggers for man-db (2.7.5-1) ... unit-abot-epc-basic-0: 03:56:28 DEBUG unit.abot-epc-basic/0.install Setting up binutils (2.26.1-1ubuntu1~16.04.6) ... unit-abot-epc-basic-0: 03:56:28 DEBUG unit.abot-epc-basic/0.install Processing triggers for libc-bin (2.23-0ubuntu10) ... unit-abot-epc-basic-0: 03:56:29 DEBUG unit.abot-epc-basic/0.install Selecting previously unselected package oracle-java8-installer. unit-abot-epc-basic-0: 03:56:29 DEBUG unit.abot-epc-basic/0.install (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 54533 files and directories currently installed.) unit-abot-epc-basic-0: 03:56:29 DEBUG unit.abot-epc-basic/0.install Preparing to unpack .../oracle-java8-installer_8u171-1~webupd8~0_all.deb ... unit-abot-epc-basic-0: 03:56:29 DEBUG unit.abot-epc-basic/0.install oracle-license-v1-1 license has already been accepted unit-abot-epc-basic-0: 03:56:29 DEBUG unit.abot-epc-basic/0.install Unpacking oracle-java8-installer (8u171-1~webupd8~0) ... 
machine-1: 03:56:30 INFO juju.cmd running jujud [2.2.5 gc go1.8] machine-1: 03:56:30 DEBUG juju.cmd args: []string{"/var/lib/juju/tools/machine-1/jujud", "machine", "--data-dir", "/var/lib/juju", "--machine-id", "1", "--debug"} machine-1: 03:56:30 DEBUG juju.agent read agent config, format "2.0" machine-1: 03:56:30 DEBUG juju.wrench couldn't read wrench directory: stat /var/lib/juju/wrench: no such file or directory machine-1: 03:56:30 INFO juju.worker.upgradesteps upgrade steps for 2.2.5 have already been run. machine-1: 03:56:30 DEBUG juju.worker start "engine" machine-1: 03:56:30 INFO juju.worker start "engine" machine-1: 03:56:30 DEBUG juju.worker.dependency "serving-info-setter" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "reboot-executor" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "api-address-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "ssh-identity-writer" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "state" manifold worker stopped: "agent" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "log-sender" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "mgo-txn-resumer" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "termination-signal-handler" manifold worker started machine-1: 03:56:30 DEBUG juju.worker.dependency "agent" manifold worker started machine-1: 03:56:30 DEBUG juju.worker.dependency "logging-config-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "api-caller" manifold worker stopped: "agent" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "upgrader" manifold worker stopped: "agent" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "migration-fortress" manifold worker stopped: "upgrade-check-flag" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "machine-action-runner" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "migration-minion" manifold worker stopped: "api-caller" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.introspection introspection worker listening on "@jujud-machine-1" machine-1: 03:56:30 DEBUG juju.worker "engine" started machine-1: 03:56:30 DEBUG juju.worker.introspection stats worker now serving machine-1: 03:56:30 DEBUG juju.worker.dependency "upgrade-check-flag" manifold worker stopped: "upgrade-check-gate" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "proxy-config-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "machiner" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "host-key-reporter" manifold worker 
stopped: "migration-inactive-flag" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "unconverted-state-workers" manifold worker stopped: "state" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "ssh-authkeys-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "central-hub" manifold worker stopped: "state-config-watcher" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "api-config-watcher" manifold worker started machine-1: 03:56:30 DEBUG juju.worker.dependency "api-caller" manifold worker stopped: machine-1: 03:56:30 DEBUG juju.worker.dependency "upgrade-steps-runner" manifold worker stopped: "api-caller" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "unit-agent-deployer" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "upgrade-steps-gate" manifold worker started machine-1: 03:56:30 DEBUG juju.worker.dependency "storage-provisioner" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "upgrade-steps-flag" manifold worker stopped: "upgrade-steps-gate" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "state-config-watcher" manifold worker started machine-1: 03:56:30 DEBUG juju.worker.dependency "upgrade-check-gate" manifold worker started machine-1: 03:56:30 DEBUG juju.worker.dependency "migration-inactive-flag" manifold worker stopped: "api-caller" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "unconverted-api-workers" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "disk-manager" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "tools-version-checker" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "upgrader" manifold worker stopped: machine-1: 03:56:30 DEBUG juju.worker.dependency "state" manifold worker stopped: machine-1: 03:56:30 DEBUG juju.worker.dependency "mgo-txn-resumer" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "serving-info-setter" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "upgrader" manifold worker stopped: "api-caller" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "reboot-executor" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "upgrade-steps-flag" manifold worker started machine-1: 03:56:30 DEBUG juju.worker.dependency "upgrade-steps-runner" manifold worker stopped: "api-caller" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "ssh-identity-writer" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "machine-action-runner" manifold worker stopped: "migration-inactive-flag" not 
running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "api-address-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "central-hub" manifold worker stopped: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "state" manifold worker stopped: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "logging-config-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-1: 03:56:30 DEBUG juju.worker.dependency "upgrade-check-flag" manifold worker started machine-1: 03:56:30 DEBUG juju.worker.dependency "migration-fortress" manifold worker stopped: machine-1: 03:56:30 DEBUG juju.worker.apicaller connecting with old password unit-abot-epc-basic-0: 03:56:30 DEBUG unit.abot-epc-basic/0.install Processing triggers for mime-support (3.59ubuntu1) ... machine-1: 03:56:30 DEBUG juju.api successfully dialed "wss://172.16.0.6:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" machine-1: 03:56:30 INFO juju.api connection established to "wss://172.16.0.6:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" machine-1: 03:56:30 DEBUG juju.api successfully dialed "wss://172.30.10.113:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" machine-1: 03:56:30 DEBUG juju.worker.dependency "migration-fortress" manifold worker stopped: "upgrade-check-flag" not set: dependency not available unit-abot-epc-basic-0: 03:56:30 DEBUG unit.abot-epc-basic/0.install Processing triggers for shared-mime-info (1.5-2ubuntu0.1) ... machine-1: 03:56:31 DEBUG juju.worker.apicaller connected machine-1: 03:56:31 DEBUG juju.worker.apicaller changing password... machine-1: 03:56:31 DEBUG juju.worker.apicaller password changed machine-1: 03:56:31 DEBUG juju.api RPC connection died machine-1: 03:56:31 DEBUG juju.worker.dependency "api-caller" manifold worker stopped: restart immediately machine-1: 03:56:31 DEBUG juju.worker.apicaller connecting with current password machine-1: 03:56:31 DEBUG juju.api successfully dialed "wss://172.16.0.6:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" machine-1: 03:56:31 INFO juju.api connection established to "wss://172.16.0.6:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" machine-1: 03:56:31 DEBUG juju.api successfully dialed "wss://172.30.10.113:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" machine-1: 03:56:31 DEBUG juju.worker.apicaller connected machine-1: 03:56:31 DEBUG juju.worker.dependency "api-caller" manifold worker started machine-1: 03:56:31 DEBUG juju.worker.dependency "mgo-txn-resumer" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-1: 03:56:31 DEBUG juju.worker.dependency "upgrade-steps-runner" manifold worker started machine-1: 03:56:31 DEBUG juju.wrench couldn't read wrench directory: stat /var/lib/juju/wrench: no such file or directory machine-1: 03:56:31 DEBUG juju.worker.dependency "upgrade-steps-runner" manifold worker stopped: machine-1: 03:56:31 DEBUG juju.worker.dependency "reboot-executor" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-1: 03:56:31 DEBUG juju.worker.dependency "ssh-identity-writer" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-1: 03:56:31 DEBUG juju.worker.dependency "logging-config-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available 
machine-1: 03:56:31 DEBUG juju.worker.dependency "machiner" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-1: 03:56:31 DEBUG juju.worker.dependency "api-address-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-1: 03:56:32 DEBUG juju.worker.dependency "host-key-reporter" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-1: 03:56:32 DEBUG juju.worker.dependency "tools-version-checker" manifold worker stopped: "migration-inactive-flag" not running: dependency not available machine-1: 03:56:32 DEBUG juju.worker.dependency "migration-inactive-flag" manifold worker started machine-1: 03:56:32 DEBUG juju.worker.dependency "serving-info-setter" manifold worker stopped: machine-1: 03:56:32 DEBUG juju.worker.dependency "log-sender" manifold worker stopped: machine-1: 03:56:32 DEBUG juju.worker.dependency "machine-action-runner" manifold worker stopped: machine-1: 03:56:32 DEBUG juju.worker.dependency "proxy-config-updater" manifold worker stopped: machine-1: 03:56:32 DEBUG juju.worker.dependency "ssh-authkeys-updater" manifold worker stopped: machine-1: 03:56:32 DEBUG juju.worker.dependency "unit-agent-deployer" manifold worker stopped: machine-1: 03:56:32 DEBUG juju.worker.dependency "storage-provisioner" manifold worker stopped: machine-1: 03:56:32 DEBUG juju.worker.dependency "unconverted-api-workers" manifold worker stopped: machine-1: 03:56:32 DEBUG juju.worker.dependency "disk-manager" manifold worker stopped: machine-1: 03:56:32 DEBUG juju.worker.dependency "tools-version-checker" manifold worker stopped: "migration-fortress" not running: dependency not available machine-1: 03:56:32 DEBUG juju.worker.dependency "ssh-identity-writer" manifold worker stopped: "migration-fortress" not running: dependency not available machine-1: 03:56:32 DEBUG juju.worker.dependency "migration-minion" manifold worker stopped: "migration-fortress" not running: dependency not available machine-1: 03:56:32 DEBUG juju.worker.dependency "mgo-txn-resumer" manifold worker stopped: "migration-fortress" not running: dependency not available machine-1: 03:56:32 DEBUG juju.worker.dependency "disk-manager" manifold worker stopped: "migration-fortress" not running: dependency not available machine-1: 03:56:32 DEBUG juju.worker.dependency "upgrader" manifold worker started machine-1: 03:56:32 DEBUG juju.worker.dependency "logging-config-updater" manifold worker stopped: "migration-fortress" not running: dependency not available machine-1: 03:56:32 DEBUG juju.worker.dependency "api-address-updater" manifold worker stopped: "migration-fortress" not running: dependency not available machine-1: 03:56:32 DEBUG juju.worker.dependency "host-key-reporter" manifold worker stopped: "migration-fortress" not running: dependency not available machine-1: 03:56:32 DEBUG juju.worker.dependency "machine-action-runner" manifold worker stopped: "migration-fortress" not running: dependency not available machine-1: 03:56:32 DEBUG juju.worker.dependency "log-sender" manifold worker stopped: "migration-fortress" not running: dependency not available machine-1: 03:56:32 DEBUG juju.worker.dependency "storage-provisioner" manifold worker stopped: "migration-fortress" not running: dependency not available machine-1: 03:56:32 DEBUG juju.worker.dependency "reboot-executor" manifold worker stopped: "migration-fortress" not running: dependency not available machine-1: 03:56:32 DEBUG juju.worker.dependency 
"ssh-authkeys-updater" manifold worker stopped: "migration-fortress" not running: dependency not available machine-1: 03:56:32 DEBUG juju.worker.dependency "proxy-config-updater" manifold worker stopped: "migration-fortress" not running: dependency not available machine-1: 03:56:32 DEBUG juju.worker.dependency "machiner" manifold worker stopped: "migration-fortress" not running: dependency not available machine-1: 03:56:32 DEBUG juju.worker.dependency "serving-info-setter" manifold worker stopped: "migration-fortress" not running: dependency not available machine-1: 03:56:32 DEBUG juju.worker.dependency "unit-agent-deployer" manifold worker stopped: "migration-fortress" not running: dependency not available machine-1: 03:56:32 DEBUG juju.worker.dependency "unconverted-api-workers" manifold worker stopped: "migration-fortress" not running: dependency not available machine-1: 03:56:32 INFO juju.worker.upgrader abort check blocked until version event received machine-1: 03:56:32 INFO juju.worker.upgrader unblocking abort check machine-1: 03:56:32 INFO juju.worker.upgrader desired tool version: 2.2.5 machine-1: 03:56:32 DEBUG juju.worker.dependency "upgrade-check-flag" manifold worker stopped: gate unlocked machine-1: 03:56:32 DEBUG juju.worker.dependency "migration-fortress" manifold worker stopped: "upgrade-check-flag" not running: dependency not available machine-1: 03:56:32 DEBUG juju.worker.dependency "upgrade-check-flag" manifold worker started machine-1: 03:56:32 DEBUG juju.worker.dependency "migration-fortress" manifold worker started machine-1: 03:56:32 DEBUG juju.worker.dependency "migration-minion" manifold worker started machine-1: 03:56:32 INFO juju.worker.migrationminion migration phase is now: NONE machine-1: 03:56:32 DEBUG juju.worker.dependency "disk-manager" manifold worker started machine-1: 03:56:32 DEBUG juju.worker.dependency "api-address-updater" manifold worker started machine-1: 03:56:32 DEBUG juju.worker.dependency "reboot-executor" manifold worker started machine-1: 03:56:32 DEBUG juju.worker.dependency "proxy-config-updater" manifold worker started machine-1: 03:56:32 DEBUG juju.worker.logger initial log config: "=DEBUG" machine-1: 03:56:32 DEBUG juju.worker.dependency "logging-config-updater" manifold worker started machine-1: 03:56:32 DEBUG juju.worker.dependency "log-sender" manifold worker started machine-1: 03:56:32 DEBUG juju.worker.logger logger setup machine-1: 03:56:32 DEBUG juju.worker.dependency "tools-version-checker" manifold worker stopped: dependency not available machine-1: 03:56:32 DEBUG juju.worker.dependency "ssh-authkeys-updater" manifold worker started machine-1: 03:56:32 DEBUG juju.worker.dependency "machine-action-runner" manifold worker started machine-1: 03:56:32 DEBUG juju.worker.dependency "storage-provisioner" manifold worker started machine-1: 03:56:32 DEBUG juju.worker.dependency "host-key-reporter" manifold worker started machine-1: 03:56:32 DEBUG juju.worker.dependency "machiner" manifold worker started machine-1: 03:56:32 DEBUG juju.worker.dependency "serving-info-setter" manifold worker stopped: resource permanently unavailable machine-1: 03:56:32 DEBUG juju.worker.dependency "ssh-identity-writer" manifold worker stopped: dependency not available machine-1: 03:56:32 DEBUG juju.worker.logger reconfiguring logging from "=DEBUG" to "=DEBUG;unit=DEBUG" machine-1: 03:56:32 DEBUG juju.worker.dependency "unit-agent-deployer" manifold worker started machine-1: 03:56:32 DEBUG juju.utils.ssh reading authorised keys file 
/home/ubuntu/.ssh/authorized_keys machine-1: 03:56:32 DEBUG juju.utils.ssh reading authorised keys file /home/ubuntu/.ssh/authorized_keys machine-1: 03:56:32 DEBUG juju.utils.ssh writing authorised keys file /home/ubuntu/.ssh/authorized_keys machine-1: 03:56:32 DEBUG juju.worker.reboot Reboot worker got action: noop machine-1: 03:56:32 DEBUG juju.network no lxc bridge addresses to filter for machine machine-1: 03:56:32 DEBUG juju.network cannot get "lxdbr0" addresses: route ip+net: no such network interface (ignoring) machine-1: 03:56:32 DEBUG juju.network cannot get "virbr0" addresses: route ip+net: no such network interface (ignoring) machine-1: 03:56:32 DEBUG juju.network including address public:172.30.10.113 for machine machine-1: 03:56:32 DEBUG juju.network including address local-cloud:172.16.0.6 for machine machine-1: 03:56:32 DEBUG juju.network including address local-machine:127.0.0.1 for machine machine-1: 03:56:32 DEBUG juju.network including address local-machine:::1 for machine machine-1: 03:56:32 DEBUG juju.network addresses after filtering: [public:172.30.10.113 local-cloud:172.16.0.6 local-machine:127.0.0.1 local-machine:::1] machine-1: 03:56:32 DEBUG juju.worker.apiaddressupdater updating API hostPorts to [[172.30.10.113:17070 172.16.0.6:17070 127.0.0.1:17070 [::1]:17070]] machine-1: 03:56:32 DEBUG juju.worker.proxyupdater new proxy settings proxy.Settings{Http:"", Https:"", Ftp:"", NoProxy:"127.0.0.1,172.16.0.6,172.30.10.113,::1,localhost", AutoNoProxy:""} machine-1: 03:56:32 DEBUG juju.tools.lxdclient connecting to LXD remote "local": "unix:///var/lib/lxd/unix.socket" machine-1: 03:56:32 ERROR juju.worker.proxyupdater can't connect to the local LXD server: LXD socket not found; is LXD installed & running? Please install LXD by running: $ sudo apt-get install lxd and then configure it with: $ newgrp lxd $ lxd init machine-1: 03:56:32 DEBUG juju.worker.proxyupdater new apt proxy settings proxy.Settings{Http:"", Https:"", Ftp:"", NoProxy:"127.0.0.1,::1,localhost", AutoNoProxy:""} machine-1: 03:56:32 DEBUG juju.network no lxc bridge addresses to filter for machine machine-1: 03:56:32 DEBUG juju.network cannot get "lxdbr0" addresses: route ip+net: no such network interface (ignoring) machine-1: 03:56:32 DEBUG juju.worker.dependency "mgo-txn-resumer" manifold worker stopped: dependency not available machine-1: 03:56:32 DEBUG juju.network cannot get "virbr0" addresses: route ip+net: no such network interface (ignoring) machine-1: 03:56:32 DEBUG juju.network including address local-machine:127.0.0.1 for machine machine-1: 03:56:32 DEBUG juju.network including address local-cloud:172.16.0.15 for machine machine-1: 03:56:32 DEBUG juju.network including address local-machine:::1 for machine machine-1: 03:56:32 DEBUG juju.network addresses after filtering: [local-machine:127.0.0.1 local-cloud:172.16.0.15 local-machine:::1] machine-1: 03:56:32 INFO juju.worker.machiner setting addresses for "machine-1" to [local-machine:127.0.0.1 local-cloud:172.16.0.15 local-machine:::1] machine-1: 03:56:32 DEBUG juju.container.kvm kvm-ok output: INFO: /dev/kvm exists KVM acceleration can be used machine-1: 03:56:32 DEBUG juju.service discovered init system "upstart" from series "trusty" machine-1: 03:56:32 DEBUG juju.worker.storageprovisioner volume attachments alive: [], dying: [], dead: [] machine-1: 03:56:32 INFO juju.worker.diskmanager block devices changed: [{vda [] 10240 true } {vdb [/dev/disk/by-label/config-2 /dev/disk/by-uuid/D3BE-DB4B] config-2 D3BE-DB4B 64 vfat false }] machine-1: 
03:56:32 INFO juju.worker.deployer checking unit "mysql/0" machine-1: 03:56:32 DEBUG juju.worker.storageprovisioner filesystem attachment alive: [], dying: [], dead: [] machine-1: 03:56:32 DEBUG juju.worker.storageprovisioner filesystems alive: [], dying: [], dead: [] machine-1: 03:56:32 DEBUG juju.worker.storageprovisioner volumes alive: [], dying: [], dead: [] machine-1: 03:56:32 DEBUG juju.agent API server address details [["172.30.10.113:17070" "172.16.0.6:17070" "127.0.0.1:17070" "[::1]:17070"]] written to agent config as ["172.16.0.6:17070" "172.30.10.113:17070"] machine-1: 03:56:32 INFO juju.worker.authenticationworker "machine-1" key updater worker started machine-1: 03:56:33 DEBUG juju.worker.hostkeyreporter 4 SSH host keys reported for machine 1 machine-1: 03:56:33 DEBUG juju.worker.dependency "host-key-reporter" manifold worker stopped: resource permanently unavailable machine-1: 03:56:33 INFO juju.worker.deployer deploying unit "mysql/0" machine-1: 03:56:33 DEBUG juju.worker start "1-container-watcher" machine-1: 03:56:33 DEBUG juju.worker start "stateconverter" machine-1: 03:56:33 DEBUG juju.worker.dependency "unconverted-api-workers" manifold worker started machine-1: 03:56:33 INFO juju.worker start "1-container-watcher" machine-1: 03:56:33 DEBUG juju.worker "1-container-watcher" started machine-1: 03:56:33 INFO juju.worker start "stateconverter" machine-1: 03:56:33 DEBUG juju.cmd.jujud upgrades done, starting worker "1-container-watcher" machine-1: 03:56:33 DEBUG juju.worker "stateconverter" started machine-1: 03:56:33 INFO juju.worker.machiner "machine-1" started unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + cd /srv/openair-cn unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + git checkout v0.3.2-branch unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install Switched to a new branch 'v0.3.2-branch' unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install Branch v0.3.2-branch set up to track remote branch v0.3.2-branch from origin. unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + shopt -s nocasematch unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + [[ HEAD != \H\E\A\D ]] unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + cd - unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install /var/lib/juju/agents/unit-oai-epc-0/charm unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + sed -i /phpmyadmin/d /srv/openair-cn/BUILD/TOOLS/build_helper unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + sed -i -r 's/(check_kernel_release_and_install_xtables_addons_oai[^()]+)/#\1/' /srv/openair-cn/BUILD/TOOLS/build_helper unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + juju_install_kernel unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + juju-log 'Check the kernel and update if required' unit-oai-epc-0: 03:56:33 DEBUG worker.uniter.jujuc running hook tool "juju-log" unit-oai-epc-0: 03:56:33 INFO unit.oai-epc/0.juju-log Check the kernel and update if required machine-1: 03:56:33 DEBUG juju.service discovered init system "upstart" from local host unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + check_current_kernel unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + required_kern_release=3.19 unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install ++ cut -d . -f1 unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install ++ echo 3.19 unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + required_kern_version=3 unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install ++ cut -d . 
-f2 unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install ++ echo 3.19 unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + required_kern_major_revision=19 unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install ++ uname -r unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + current_kern_release=3.13.0-149-generic unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install ++ cut -d . -f1 unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install ++ echo 3.13.0-149-generic unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + current_kern_version=3 unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install ++ cut -d . -f2 unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install ++ echo 3.13.0-149-generic unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + current_kern_major_revision=13 unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + '[' 3 -gt 3 ']' unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + '[' 3 -eq 3 ']' unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + '[' 13 -ge 19 ']' unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + REQUIRED_KERNEL_IS_INSTALLED=false unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + [[ false == true ]] unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + '[' kvm == lxc -o kvm == docker ']' unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + '[' kvm == '' ']' unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + check_current_kernel unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + required_kern_release=3.19 machine-1: 03:56:33 DEBUG juju.worker.deployer state addresses: ["172.16.0.6:37017"] machine-1: 03:56:33 DEBUG juju.worker.deployer API addresses: ["172.16.0.6:17070" "172.30.10.113:17070"] unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install ++ cut -d . -f1 unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install ++ echo 3.19 unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + required_kern_version=3 unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install ++ cut -d . -f2 unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install ++ echo 3.19 unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + required_kern_major_revision=19 unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install ++ uname -r unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + current_kern_release=3.13.0-149-generic unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install ++ cut -d . -f1 unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install ++ echo 3.13.0-149-generic unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + current_kern_version=3 unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install ++ cut -d . 
-f2 unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install ++ echo 3.13.0-149-generic unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + current_kern_major_revision=13 unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + '[' 3 -gt 3 ']' unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + '[' 3 -eq 3 ']' unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + '[' 13 -ge 19 ']' unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + REQUIRED_KERNEL_IS_INSTALLED=false unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + [[ false == false ]] unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install ++ cat /var/lib/juju/agents/unit-oai-epc-0/charm/.reboot unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install cat: /var/lib/juju/agents/unit-oai-epc-0/charm/.reboot: No such file or directory unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + '[' '' '!=' reboot ']' unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + juju-log 'installing the required kernel and reboot' unit-oai-epc-0: 03:56:33 DEBUG worker.uniter.jujuc running hook tool "juju-log" unit-oai-epc-0: 03:56:33 INFO unit.oai-epc/0.juju-log installing the required kernel and reboot unit-oai-epc-0: 03:56:33 DEBUG unit.oai-epc/0.install + status-set maintenance 'installing the required kernel and rebooting' machine-1: 03:56:33 INFO juju.service Installing and starting service &{Service:{Name:jujud-unit-mysql-0 Conf:{Desc:juju unit agent for mysql/0 Transient:false AfterStopped: Env:map[JUJU_CONTAINER_TYPE:] Limit:map[] Timeout:300 ExecStart:'/var/lib/juju/tools/unit-mysql-0/jujud' unit --data-dir '/var/lib/juju' --unit-name mysql/0 --debug ExecStopPost: Logfile:/var/log/juju/unit-mysql-0.log ExtraScript: ServiceBinary:/var/lib/juju/tools/unit-mysql-0/jujud ServiceArgs:[unit --data-dir /var/lib/juju --unit-name mysql/0 --debug]}}} unit-oai-epc-0: 03:56:33 DEBUG worker.uniter.jujuc running hook tool "status-set" unit-mysql-0: 03:56:34 INFO juju.cmd running jujud [2.2.5 gc go1.8] unit-mysql-0: 03:56:34 DEBUG juju.cmd args: []string{"/var/lib/juju/tools/unit-mysql-0/jujud", "unit", "--data-dir", "/var/lib/juju", "--unit-name", "mysql/0", "--debug"} unit-mysql-0: 03:56:34 DEBUG juju.agent read agent config, format "2.0" unit-mysql-0: 03:56:34 DEBUG juju.worker start "api" unit-mysql-0: 03:56:34 INFO juju.worker start "api" unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "api-config-watcher" manifold worker stopped: "agent" not running: dependency not available unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "migration-inactive-flag" manifold worker stopped: "api-caller" not running: dependency not available unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "proxy-config-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "charm-dir" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "api-caller" manifold worker stopped: "agent" not running: dependency not available unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "log-sender" manifold worker stopped: "api-caller" not running: dependency not available unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "upgrader" manifold worker stopped: "agent" not running: dependency not available unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "migration-minion" manifold worker stopped: "agent" not running: dependency not available unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency 
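For reference, the check_current_kernel logic being traced above (the '+' lines in the install hook) reduces to the comparison below: the oai-epc charm requires a kernel of at least 3.19, the Trusty guest reports 3.13.0-149-generic, so REQUIRED_KERNEL_IS_INSTALLED stays false and the hook goes on to install a newer kernel and schedule a reboot. This is a condensed, illustrative sketch of that logic; the variable names are taken from the trace, but the standalone script is not the charm's actual build_helper code:

  #!/bin/bash
  # Sketch of the version check traced in the oai-epc install hook above.
  required_kern_release=3.19
  required_kern_version=$(echo "$required_kern_release" | cut -d . -f1)          # 3
  required_kern_major_revision=$(echo "$required_kern_release" | cut -d . -f2)   # 19

  current_kern_release=$(uname -r)                                               # e.g. 3.13.0-149-generic
  current_kern_version=$(echo "$current_kern_release" | cut -d . -f1)            # 3
  current_kern_major_revision=$(echo "$current_kern_release" | cut -d . -f2)     # 13

  REQUIRED_KERNEL_IS_INSTALLED=false
  if [ "$current_kern_version" -gt "$required_kern_version" ]; then
      REQUIRED_KERNEL_IS_INSTALLED=true
  elif [ "$current_kern_version" -eq "$required_kern_version" ] && \
       [ "$current_kern_major_revision" -ge "$required_kern_major_revision" ]; then
      REQUIRED_KERNEL_IS_INSTALLED=true
  fi
  echo "required kernel installed: $REQUIRED_KERNEL_IS_INSTALLED"   # false on this 3.13 host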
"logging-config-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "metric-collect" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "agent" manifold worker started unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "metric-sender" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-mysql-0: 03:56:34 DEBUG juju.worker.introspection introspection worker listening on "@jujud-unit-mysql-0" unit-mysql-0: 03:56:34 DEBUG juju.worker "api" started unit-mysql-0: 03:56:34 DEBUG juju.worker.introspection stats worker now serving unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "meter-status" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "migration-fortress" manifold worker started unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "api-address-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "leadership-tracker" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "hook-retry-strategy" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "uniter" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "metric-spool" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "proxy-config-updater" manifold worker stopped: unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "migration-minion" manifold worker stopped: unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "logging-config-updater" manifold worker stopped: unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "metric-collect" manifold worker stopped: unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "proxy-config-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-mysql-0: 03:56:34 DEBUG juju.worker.apicaller connecting with old password unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "uniter" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "hook-retry-strategy" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "metric-sender" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "meter-status" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "metric-collect" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "metric-spool" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-mysql-0: 03:56:34 DEBUG juju.api successfully dialed "wss://172.16.0.6:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" unit-mysql-0: 03:56:34 INFO juju.api 
connection established to "wss://172.16.0.6:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "logging-config-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-mysql-0: 03:56:34 DEBUG juju.api successfully dialed "wss://172.30.10.113:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "api-config-watcher" manifold worker started unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "migration-minion" manifold worker stopped: "api-caller" not running: dependency not available unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "leadership-tracker" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "upgrader" manifold worker stopped: "api-caller" not running: dependency not available unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "charm-dir" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "api-address-updater" manifold worker stopped: "migration-inactive-flag" not running: dependency not available unit-mysql-0: 03:56:34 DEBUG juju.worker.apicaller connected unit-mysql-0: 03:56:34 DEBUG juju.worker.apicaller changing password... unit-oai-epc-0: 03:56:34 DEBUG unit.oai-epc/0.install + install_required_kernel unit-oai-epc-0: 03:56:34 DEBUG unit.oai-epc/0.install + version=3.19 unit-oai-epc-0: 03:56:34 DEBUG unit.oai-epc/0.install + wget -r -e robots=off --accept-regex '(.*generic.*amd64)|(all).deb' http://kernel.ubuntu.com/~kernel-ppa/mainline/v3.19-vivid/ unit-mysql-0: 03:56:34 DEBUG juju.worker.apicaller password changed unit-mysql-0: 03:56:34 DEBUG juju.api RPC connection died unit-mysql-0: 03:56:34 DEBUG juju.worker.dependency "api-caller" manifold worker stopped: restart immediately unit-oai-epc-0: 03:56:34 DEBUG unit.oai-epc/0.install --2018-05-28 03:56:34-- http://kernel.ubuntu.com/~kernel-ppa/mainline/v3.19-vivid/ unit-oai-epc-0: 03:56:34 DEBUG unit.oai-epc/0.install Resolving kernel.ubuntu.com (kernel.ubuntu.com)... 91.189.94.216 unit-mysql-0: 03:56:34 DEBUG juju.worker.apicaller connecting with current password unit-mysql-0: 03:56:34 DEBUG juju.api successfully dialed "wss://172.30.10.113:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" unit-mysql-0: 03:56:34 DEBUG juju.api error dialing websocket: dial tcp 172.16.0.6:17070: operation was canceled unit-mysql-0: 03:56:34 INFO juju.api connection established to "wss://172.30.10.113:17070/model/185eadf2-f0e5-47e1-8c09-69aa2d2cc5d4/api" unit-oai-epc-0: 03:56:35 DEBUG unit.oai-epc/0.install Connecting to kernel.ubuntu.com (kernel.ubuntu.com)|91.189.94.216|:80... connected. unit-oai-epc-0: 03:56:35 DEBUG unit.oai-epc/0.install HTTP request sent, awaiting response... 200 OK unit-oai-epc-0: 03:56:35 DEBUG unit.oai-epc/0.install Length: unspecified [text/html] unit-oai-epc-0: 03:56:35 DEBUG unit.oai-epc/0.install Saving to: 'kernel.ubuntu.com/~kernel-ppa/mainline/v3.19-vivid/index.html' unit-oai-epc-0: 03:56:35 DEBUG unit.oai-epc/0.install unit-oai-epc-0: 03:56:35 DEBUG unit.oai-epc/0.install 0K ....... 
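The install_required_kernel step that starts here mirrors the v3.19 mainline kernel packages from kernel.ubuntu.com with the wget command shown above. Roughly equivalent stand-alone commands are sketched below, assuming the same mirror layout; the dpkg and reboot steps are not visible in this part of the trace and are included only as a hedged guess at how the downloaded packages get used:

  version=3.19
  # Mirror only the generic amd64 and 'all' .deb packages for v3.19-vivid,
  # matching the hook's wget invocation above.
  wget -r -e robots=off --accept-regex '(.*generic.*amd64)|(all).deb' \
      "http://kernel.ubuntu.com/~kernel-ppa/mainline/v${version}-vivid/"
  # Assumed follow-up (not shown in this excerpt): install the mirrored
  # packages and reboot so the 3.19 kernel is picked up.
  sudo dpkg -i kernel.ubuntu.com/~kernel-ppa/mainline/v${version}-vivid/*.deb
  sudo reboot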
unit-oai-epc-0: 03:56:35 DEBUG unit.oai-epc/0.install 2018-05-28 03:56:35 (63.2 MB/s) - 'kernel.ubuntu.com/~kernel-ppa/mainline/v3.19-vivid/index.html' saved [8048] unit-oai-epc-0: 03:56:35 DEBUG unit.oai-epc/0.install --2018-05-28 03:56:35-- http://kernel.ubuntu.com/~kernel-ppa/mainline/v3.19-vivid/linux-headers-3.19.0-031900-generic_3.19.0-031900.201504091832_amd64.deb unit-oai-epc-0: 03:56:35 DEBUG unit.oai-epc/0.install Reusing existing connection to kernel.ubuntu.com:80. unit-oai-epc-0: 03:56:35 DEBUG unit.oai-epc/0.install HTTP request sent, awaiting response... 200 OK unit-oai-epc-0: 03:56:35 DEBUG unit.oai-epc/0.install Length: 1153230 (1.1M) [application/x-debian-package] unit-oai-epc-0: 03:56:35 DEBUG unit.oai-epc/0.install Saving to: 'kernel.ubuntu.com/~kernel-ppa/mainline/v3.19-vivid/linux-headers-3.19.0-031900-generic_3.19.0-031900.201504091832_amd64.deb' [wget progress dots trimmed]
machine-1: 03:56:35 DEBUG juju.worker.machiner observed network config updated for "machine-1" to [{1 127.0.0.0/8 65536 0 lo loopback false false loopback 127.0.0.1 [] [] []} {1 ::1/128 65536 0 lo loopback false false loopback ::1 [] [] []} {2 fa:16:3e:0d:53:b0 172.16.0.0/24 1450 0 eth0 ethernet false false static 172.16.0.15 [] [] []} {2 fa:16:3e:0d:53:b0 1450 0 eth0 ethernet false false manual [] [] []}]
unit-oai-epc-0: 03:56:36 DEBUG unit.oai-epc/0.install 2018-05-28 03:56:36 (1.49 MB/s) - 'kernel.ubuntu.com/~kernel-ppa/mainline/v3.19-vivid/linux-headers-3.19.0-031900-generic_3.19.0-031900.201504091832_amd64.deb' saved [1153230/1153230] unit-oai-epc-0: 03:56:36 DEBUG unit.oai-epc/0.install --2018-05-28 03:56:36-- http://kernel.ubuntu.com/~kernel-ppa/mainline/v3.19-vivid/linux-headers-3.19.0-031900_3.19.0-031900.201504091832_all.deb unit-oai-epc-0: 03:56:36 DEBUG unit.oai-epc/0.install Reusing existing connection to kernel.ubuntu.com:80. unit-oai-epc-0: 03:56:36 DEBUG unit.oai-epc/0.install HTTP request sent, awaiting response... 200 OK unit-oai-epc-0: 03:56:36 DEBUG unit.oai-epc/0.install Length: 13381952 (13M) [application/x-debian-package] unit-oai-epc-0: 03:56:36 DEBUG unit.oai-epc/0.install Saving to: 'kernel.ubuntu.com/~kernel-ppa/mainline/v3.19-vivid/linux-headers-3.19.0-031900_3.19.0-031900.201504091832_all.deb' [wget progress dots trimmed; the capture breaks off at roughly 30% of this download] unit-oai-epc-0