2018-05-25 06:23:44,349 - xtesting.ci.run_tests - INFO - Deployment description:
+--------------------------------------+----------------------------------------------------------+
| ENV VAR                              | VALUE                                                    |
+--------------------------------------+----------------------------------------------------------+
| BUILD_TAG                            | jenkins-functest-apex-baremetal-daily-fraser-149         |
| ENERGY_RECORDER_API_URL              | http://energy.opnfv.fr/resources                         |
| ENERGY_RECORDER_API_PASSWORD         |                                                          |
| CI_LOOP                              | daily                                                    |
| TEST_DB_URL                          | http://testresults.opnfv.org/test/api/v1/results         |
| INSTALLER_TYPE                       | apex                                                     |
| DEPLOY_SCENARIO                      | os-ovn-nofeature-noha                                    |
| ENERGY_RECORDER_API_USER             |                                                          |
| NODE_NAME                            | lf-pod1                                                  |
+--------------------------------------+----------------------------------------------------------+
2018-05-25 06:23:44,352 - xtesting.ci.run_tests - INFO - Sourcing env file /var/lib/xtesting/conf/env_file
# Clear any old environment that may conflict.
for key in $( set | awk '{FS="="} /^OS_/ {print $1}' ); do unset $key ; done
export OS_USERNAME=admin
export OS_USER_DOMAIN_NAME=Default
export OS_PROJECT_DOMAIN_NAME=Default
export OS_BAREMETAL_API_VERSION=1.34
export NOVA_VERSION=1.1
export OS_PROJECT_NAME=admin
export OS_PASSWORD=EZhwZcgCD6CaJWGE7BDRjvxtq
export OS_NO_CACHE=True
export COMPUTE_API_VERSION=1.1
export no_proxy=,172.30.9.26,192.30.9.4
export OS_VOLUME_API_VERSION=3
export OS_CLOUDNAME=overcloud
export OS_AUTH_URL=http://172.30.9.26:5000/v3
export IRONIC_API_VERSION=1.34
export OS_IDENTITY_API_VERSION=3
export OS_IMAGE_API_VERSION=2
export OS_AUTH_TYPE=password
export PYTHONWARNINGS="ignore:Certificate has no, ignore:A true SSLContext object is not available"
# Add OS_CLOUDNAME to PS1
if [ -z "${CLOUDPROMPT_ENABLED:-}" ]; then
    export PS1=${PS1:-""}
    export PS1=\${OS_CLOUDNAME:+"(\$OS_CLOUDNAME)"}\ $PS1
    export CLOUDPROMPT_ENABLED=1
fi
export SDN_CONTROLLER_IP=192.30.9.4
export OS_REGION_NAME=regionOne
2018-05-25 06:23:44,353 - xtesting.ci.run_tests - DEBUG - Test args: all
2018-05-25 06:23:44,353 - xtesting.ci.run_tests - INFO - TESTS TO BE EXECUTED:
+---------------------+---------------+--------------------------+-------------------------------------------------+------------------------------------+
| TIERS               | ORDER         | CI LOOP                  | DESCRIPTION                                     | TESTCASES                          |
+---------------------+---------------+--------------------------+-------------------------------------------------+------------------------------------+
| healthcheck         | 0             | (daily)|(weekly)         | First tier to be executed to verify the         | connection_check api_check         |
|                     |               |                          | basic operations in the VIM.                    | snaps_health_check                 |
+---------------------+---------------+--------------------------+-------------------------------------------------+------------------------------------+
2018-05-25 06:23:44,355 - xtesting.ci.run_tests - INFO - Running tier 'healthcheck'
2018-05-25 06:23:44,355 - xtesting.ci.run_tests - INFO - Running test case 'connection_check'...
2018-05-25 06:23:46,327 - functest.opnfv_tests.openstack.snaps.snaps_test_runner - INFO - Using flavor metadata 'None'
2018-05-25 06:23:54,266 - xtesting.core.unit - DEBUG - test_glance_connect_fail (snaps.openstack.utils.tests.glance_utils_tests.GlanceSmokeTests) ... ok
test_glance_connect_success (snaps.openstack.utils.tests.glance_utils_tests.GlanceSmokeTests) ... ok
test_keystone_connect_fail (snaps.openstack.utils.tests.keystone_utils_tests.KeystoneSmokeTests) ... ok
test_keystone_connect_success (snaps.openstack.utils.tests.keystone_utils_tests.KeystoneSmokeTests) ... ok
test_neutron_connect_fail (snaps.openstack.utils.tests.neutron_utils_tests.NeutronSmokeTests) ... ok
test_neutron_connect_success (snaps.openstack.utils.tests.neutron_utils_tests.NeutronSmokeTests) ... ok
test_retrieve_ext_network_name (snaps.openstack.utils.tests.neutron_utils_tests.NeutronSmokeTests) ... ok
test_nova_connect_fail (snaps.openstack.utils.tests.nova_utils_tests.NovaSmokeTests) ... ok
test_nova_connect_success (snaps.openstack.utils.tests.nova_utils_tests.NovaSmokeTests) ... ok
test_nova_get_hypervisor_hosts (snaps.openstack.utils.tests.nova_utils_tests.NovaSmokeTests) ... ok
test_heat_connect_fail (snaps.openstack.utils.tests.heat_utils_tests.HeatSmokeTests) ... ok
test_heat_connect_success (snaps.openstack.utils.tests.heat_utils_tests.HeatSmokeTests) ... ok
test_cinder_connect_fail (snaps.openstack.utils.tests.cinder_utils_tests.CinderSmokeTests) ... ok
test_cinder_connect_success (snaps.openstack.utils.tests.cinder_utils_tests.CinderSmokeTests) ... ok
----------------------------------------------------------------------
Ran 14 tests in 7.927s

OK
2018-05-25 06:23:54,379 - xtesting.core.testcase - INFO - The results were successfully pushed to DB http://testresults.opnfv.org/test/api/v1/results
2018-05-25 06:23:54,379 - xtesting.ci.run_tests - INFO - Test result:
+--------------------------+------------------+------------------+----------------+
| TEST CASE                | PROJECT          | DURATION         | RESULT         |
+--------------------------+------------------+------------------+----------------+
| connection_check         | functest         | 00:08            | PASS           |
+--------------------------+------------------+------------------+----------------+
2018-05-25 06:23:54,382 - xtesting.ci.run_tests - INFO - Running test case 'api_check'...
2018-05-25 06:23:55,570 - functest.opnfv_tests.openstack.snaps.snaps_test_runner - INFO - Using flavor metadata 'None'
2018-05-25 06:33:12,904 - xtesting.core.unit - DEBUG - test_create_project_minimal (snaps.openstack.utils.tests.keystone_utils_tests.KeystoneUtilsTests) ... ok
test_create_user_minimal (snaps.openstack.utils.tests.keystone_utils_tests.KeystoneUtilsTests) ... ok
test_get_endpoint_fail_without_proper_credentials (snaps.openstack.utils.tests.keystone_utils_tests.KeystoneUtilsTests) ... ok
test_get_endpoint_fail_without_proper_service (snaps.openstack.utils.tests.keystone_utils_tests.KeystoneUtilsTests) ... ok
test_get_endpoint_success (snaps.openstack.utils.tests.keystone_utils_tests.KeystoneUtilsTests) ... ok
test_get_endpoint_with_each_interface (snaps.openstack.utils.tests.keystone_utils_tests.KeystoneUtilsTests) ... ok
test_grant_user_role_to_project (snaps.openstack.utils.tests.keystone_utils_tests.KeystoneUtilsTests) ... ok
test_create_admin_user (snaps.openstack.tests.create_user_tests.CreateUserSuccessTests) ... ok
test_create_delete_user (snaps.openstack.tests.create_user_tests.CreateUserSuccessTests) ... ok
test_create_user (snaps.openstack.tests.create_user_tests.CreateUserSuccessTests) ... ok
test_create_user_2x (snaps.openstack.tests.create_user_tests.CreateUserSuccessTests) ... ok
test_create_delete_project (snaps.openstack.tests.create_project_tests.CreateProjectSuccessTests) ... ok
test_create_project (snaps.openstack.tests.create_project_tests.CreateProjectSuccessTests) ... ok
test_create_project_2x (snaps.openstack.tests.create_project_tests.CreateProjectSuccessTests) ... ok
test_create_project_bad_domain (snaps.openstack.tests.create_project_tests.CreateProjectSuccessTests) ... ok
test_create_project_quota_override (snaps.openstack.tests.create_project_tests.CreateProjectSuccessTests) ... ok
test_update_quotas (snaps.openstack.tests.create_project_tests.CreateProjectSuccessTests) ... ok
test_create_project_sec_grp_one_user (snaps.openstack.tests.create_project_tests.CreateProjectUserTests) ... ok
test_create_project_sec_grp_two_users (snaps.openstack.tests.create_project_tests.CreateProjectUserTests) ... ok
test_create_image_minimal_file (snaps.openstack.utils.tests.glance_utils_tests.GlanceUtilsTests) ... ok
test_create_image_minimal_url (snaps.openstack.utils.tests.glance_utils_tests.GlanceUtilsTests) ... ok
test_create_network (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsNetworkTests) ... ok
test_create_network_empty_name (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsNetworkTests) ... ok
test_create_network_null_name (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsNetworkTests) ... ok
test_create_subnet (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsSubnetTests) ... ok
test_create_subnet_empty_cidr (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsSubnetTests) ... ok
test_create_subnet_empty_name (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsSubnetTests) ... ok
test_create_subnet_null_cidr (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsSubnetTests) ... ok
test_create_subnet_null_name (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsSubnetTests) ... ok
test_add_interface_router (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsRouterTests) ... ok
test_add_interface_router_missing_subnet (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsRouterTests) ... ok
test_add_interface_router_null_router (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsRouterTests) ... ok
test_add_interface_router_null_subnet (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsRouterTests) ... ok
test_create_port (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsRouterTests) ... ok
test_create_port_empty_name (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsRouterTests) ... ok
test_create_port_invalid_ip (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsRouterTests) ... ok
test_create_port_invalid_ip_to_subnet (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsRouterTests) ... ok
test_create_port_null_ip (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsRouterTests) ... ok
test_create_port_null_name (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsRouterTests) ... ok
test_create_port_null_network_object (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsRouterTests) ... ok
test_create_router_simple (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsRouterTests) ... ok
test_create_router_with_public_interface (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsRouterTests) ... ok
test_create_delete_simple_sec_grp (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsSecurityGroupTests) ... ok
test_create_sec_grp_no_name (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsSecurityGroupTests) ... ok
test_create_sec_grp_no_rules (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsSecurityGroupTests) ... ok
test_create_sec_grp_one_rule (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsSecurityGroupTests) ... ok
test_get_sec_grp_by_id (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsSecurityGroupTests) ... ok
test_floating_ips (snaps.openstack.utils.tests.neutron_utils_tests.NeutronUtilsFloatingIpTests) ... ok
test_create_delete_keypair (snaps.openstack.utils.tests.nova_utils_tests.NovaUtilsKeypairTests) ... ok
test_create_key_from_file (snaps.openstack.utils.tests.nova_utils_tests.NovaUtilsKeypairTests) ... ok
test_create_keypair (snaps.openstack.utils.tests.nova_utils_tests.NovaUtilsKeypairTests) ... ok
test_create_delete_flavor (snaps.openstack.utils.tests.nova_utils_tests.NovaUtilsFlavorTests) ... ok
test_create_flavor (snaps.openstack.utils.tests.nova_utils_tests.NovaUtilsFlavorTests) ... ok
test_create_instance (snaps.openstack.utils.tests.nova_utils_tests.NovaUtilsInstanceTests) ... ok
test_add_remove_volume (snaps.openstack.utils.tests.nova_utils_tests.NovaUtilsInstanceVolumeTests) ... ok
test_attach_volume_nowait (snaps.openstack.utils.tests.nova_utils_tests.NovaUtilsInstanceVolumeTests) ... ok
test_detach_volume_nowait (snaps.openstack.utils.tests.nova_utils_tests.NovaUtilsInstanceVolumeTests) ... ok
test_create_clean_flavor (snaps.openstack.tests.create_flavor_tests.CreateFlavorTests) ... ok
test_create_delete_flavor (snaps.openstack.tests.create_flavor_tests.CreateFlavorTests) ... ok
test_create_flavor (snaps.openstack.tests.create_flavor_tests.CreateFlavorTests) ... ok
test_create_flavor_all_settings (snaps.openstack.tests.create_flavor_tests.CreateFlavorTests) ... ok
test_create_flavor_existing (snaps.openstack.tests.create_flavor_tests.CreateFlavorTests) ... ok
test_create_stack (snaps.openstack.utils.tests.heat_utils_tests.HeatUtilsCreateSimpleStackTests) ... ok
test_create_stack_x2 (snaps.openstack.utils.tests.heat_utils_tests.HeatUtilsCreateSimpleStackTests) ... ok
test_get_settings_from_stack (snaps.openstack.utils.tests.heat_utils_tests.HeatUtilsCreateComplexStackTests) ... ok
test_create_flavor_with_stack (snaps.openstack.utils.tests.heat_utils_tests.HeatUtilsFlavorTests) ... ok
test_create_keypair_with_stack (snaps.openstack.utils.tests.heat_utils_tests.HeatUtilsKeypairTests) ... ok
test_create_security_group_with_stack (snaps.openstack.utils.tests.heat_utils_tests.HeatUtilsSecurityGroupTests) ... ok
test_create_delete_qos (snaps.openstack.utils.tests.cinder_utils_tests.CinderUtilsQoSTests) ... ok
test_create_qos_back (snaps.openstack.utils.tests.cinder_utils_tests.CinderUtilsQoSTests) ... ok
test_create_qos_both (snaps.openstack.utils.tests.cinder_utils_tests.CinderUtilsQoSTests) ... ok
test_create_qos_front (snaps.openstack.utils.tests.cinder_utils_tests.CinderUtilsQoSTests) ... ok
test_create_delete_volume (snaps.openstack.utils.tests.cinder_utils_tests.CinderUtilsVolumeTests) ... ok
test_create_simple_volume (snaps.openstack.utils.tests.cinder_utils_tests.CinderUtilsVolumeTests) ... ok
test_create_delete_volume_type (snaps.openstack.utils.tests.cinder_utils_tests.CinderUtilsSimpleVolumeTypeTests) ... ok
test_create_simple_volume_type (snaps.openstack.utils.tests.cinder_utils_tests.CinderUtilsSimpleVolumeTypeTests) ... ok
test_create_bad_key_size (snaps.openstack.utils.tests.cinder_utils_tests.CinderUtilsAddEncryptionTests) ... ok
test_create_delete_encryption (snaps.openstack.utils.tests.cinder_utils_tests.CinderUtilsAddEncryptionTests) ... ok
test_create_simple_encryption (snaps.openstack.utils.tests.cinder_utils_tests.CinderUtilsAddEncryptionTests) ... ok
test_create_with_all_attrs (snaps.openstack.utils.tests.cinder_utils_tests.CinderUtilsAddEncryptionTests) ... ok
test_create_with_encryption (snaps.openstack.utils.tests.cinder_utils_tests.CinderUtilsVolumeTypeCompleteTests) ... ok
test_create_with_invalid_qos (snaps.openstack.utils.tests.cinder_utils_tests.CinderUtilsVolumeTypeCompleteTests) ... ok
test_create_with_qos (snaps.openstack.utils.tests.cinder_utils_tests.CinderUtilsVolumeTypeCompleteTests) ... ok
test_create_with_qos_and_encryption (snaps.openstack.utils.tests.cinder_utils_tests.CinderUtilsVolumeTypeCompleteTests) ... ok
----------------------------------------------------------------------
Ran 84 tests in 557.280s

OK
2018-05-25 06:33:13,019 - xtesting.core.testcase - INFO - The results were successfully pushed to DB http://testresults.opnfv.org/test/api/v1/results
2018-05-25 06:33:13,019 - xtesting.ci.run_tests - INFO - Test result:
+-------------------+------------------+------------------+----------------+
| TEST CASE         | PROJECT          | DURATION         | RESULT         |
+-------------------+------------------+------------------+----------------+
| api_check         | functest         | 09:17            | PASS           |
+-------------------+------------------+------------------+----------------+
2018-05-25 06:33:13,022 - xtesting.ci.run_tests - INFO - Running test case 'snaps_health_check'...
2018-05-25 06:33:13,682 - functest.opnfv_tests.openstack.snaps.snaps_test_runner - INFO - Using flavor metadata 'None'
2018-05-25 06:33:46,009 - xtesting.core.unit - DEBUG - test_check_vm_ip_dhcp (snaps.openstack.tests.create_instance_tests.SimpleHealthCheck) ... ok
----------------------------------------------------------------------
Ran 1 test in 32.326s

OK
2018-05-25 06:33:46,222 - xtesting.core.testcase - INFO - The results were successfully pushed to DB http://testresults.opnfv.org/test/api/v1/results
2018-05-25 06:33:46,223 - xtesting.ci.run_tests - INFO - Test result:
+----------------------------+------------------+------------------+----------------+
| TEST CASE                  | PROJECT          | DURATION         | RESULT         |
+----------------------------+------------------+------------------+----------------+
| snaps_health_check         | functest         | 00:32            | PASS           |
+----------------------------+------------------+------------------+----------------+
2018-05-25 06:33:46,225 - xtesting.ci.run_tests - INFO - Xtesting report:
+----------------------------+------------------+---------------------+------------------+----------------+
| TEST CASE                  | PROJECT          | TIER                | DURATION         | RESULT         |
+----------------------------+------------------+---------------------+------------------+----------------+
| connection_check           | functest         | healthcheck         | 00:08            | PASS           |
| api_check                  | functest         | healthcheck         | 09:17            | PASS           |
| snaps_health_check         | functest         | healthcheck         | 00:32            | PASS           |
+----------------------------+------------------+---------------------+------------------+----------------+
2018-05-25 06:33:46,227 - xtesting.ci.run_tests - INFO - Execution exit value: Result.EX_OK
2018-05-25 06:33:49,031 - xtesting.ci.run_tests - INFO - Deployment description:
+--------------------------------------+----------------------------------------------------------+
| ENV VAR                              | VALUE                                                    |
+--------------------------------------+----------------------------------------------------------+
| BUILD_TAG                            | jenkins-functest-apex-baremetal-daily-fraser-149         |
| ENERGY_RECORDER_API_URL              | http://energy.opnfv.fr/resources                         |
| ENERGY_RECORDER_API_PASSWORD         |                                                          |
| CI_LOOP                              | daily                                                    |
| TEST_DB_URL                          | http://testresults.opnfv.org/test/api/v1/results         |
| INSTALLER_TYPE                       | apex                                                     |
| DEPLOY_SCENARIO                      | os-ovn-nofeature-noha                                    |
| ENERGY_RECORDER_API_USER             |                                                          |
| NODE_NAME                            | lf-pod1                                                  |
+--------------------------------------+----------------------------------------------------------+
2018-05-25 06:33:49,033 - xtesting.ci.run_tests - INFO - Sourcing env file /var/lib/xtesting/conf/env_file
# Clear any old environment that may conflict.
for key in $( set | awk '{FS="="} /^OS_/ {print $1}' ); do unset $key ; done
export OS_USERNAME=admin
export OS_USER_DOMAIN_NAME=Default
export OS_PROJECT_DOMAIN_NAME=Default
export OS_BAREMETAL_API_VERSION=1.34
export NOVA_VERSION=1.1
export OS_PROJECT_NAME=admin
export OS_PASSWORD=EZhwZcgCD6CaJWGE7BDRjvxtq
export OS_NO_CACHE=True
export COMPUTE_API_VERSION=1.1
export no_proxy=,172.30.9.26,192.30.9.4
export OS_VOLUME_API_VERSION=3
export OS_CLOUDNAME=overcloud
export OS_AUTH_URL=http://172.30.9.26:5000/v3
export IRONIC_API_VERSION=1.34
export OS_IDENTITY_API_VERSION=3
export OS_IMAGE_API_VERSION=2
export OS_AUTH_TYPE=password
export PYTHONWARNINGS="ignore:Certificate has no, ignore:A true SSLContext object is not available"
# Add OS_CLOUDNAME to PS1
if [ -z "${CLOUDPROMPT_ENABLED:-}" ]; then
    export PS1=${PS1:-""}
    export PS1=\${OS_CLOUDNAME:+"(\$OS_CLOUDNAME)"}\ $PS1
    export CLOUDPROMPT_ENABLED=1
fi
export SDN_CONTROLLER_IP=192.30.9.4
export OS_REGION_NAME=regionOne
2018-05-25 06:33:49,033 - xtesting.ci.run_tests - DEBUG - Test args: all
2018-05-25 06:33:49,034 - xtesting.ci.run_tests - INFO - TESTS TO BE EXECUTED:
+---------------+---------------+--------------------------+------------------------------------------+----------------------------------------------+
| TIERS         | ORDER         | CI LOOP                  | DESCRIPTION                              | TESTCASES                                    |
+---------------+---------------+--------------------------+------------------------------------------+----------------------------------------------+
| smoke         | 1             | (daily)|(weekly)         | Set of basic Functional tests to         | vping_ssh vping_userdata                     |
|               |               |                          | validate the OPNFV scenarios.            | tempest_smoke_serial rally_sanity            |
|               |               |                          |                                          | refstack_defcore patrole snaps_smoke         |
|               |               |                          |                                          | neutron_trunk                                |
+---------------+---------------+--------------------------+------------------------------------------+----------------------------------------------+
2018-05-25 06:33:49,036 - xtesting.ci.run_tests - INFO - Running tier 'smoke'
2018-05-25 06:33:49,036 - xtesting.ci.run_tests - INFO - Running test case 'vping_ssh'...
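Note: the "Deployment description" table and the sourced env file above contain everything needed to re-run a single test case from this log by hand. A minimal sketch follows, assuming the per-tier opnfv/functest-healthcheck container and the run_tests entry point used by this Fraser CI job; the local file names env_file and overcloudrc, and the image tag, are illustrative and not taken from this log:

$ # CI variables from the "Deployment description" table
$ cat env_file
INSTALLER_TYPE=apex
DEPLOY_SCENARIO=os-ovn-nofeature-noha
NODE_NAME=lf-pod1
CI_LOOP=daily
TEST_DB_URL=http://testresults.opnfv.org/test/api/v1/results
$ # OpenStack credentials (the overcloudrc dumped above), mounted at the path
$ # the log shows being sourced (/var/lib/xtesting/conf/env_file)
$ sudo docker run --env-file env_file \
      -v $(pwd)/overcloudrc:/var/lib/xtesting/conf/env_file \
      opnfv/functest-healthcheck run_tests -t connection_check

Running run_tests -t all inside the container reproduces a whole tier, as done by the Jenkins job shown here.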
2018-05-25 06:33:50,719 - functest.opnfv_tests.openstack.vping.vping_base - DEBUG - ext_net: Munch({u'status': u'ACTIVE', u'subnets': [u'c4948979-8bd8-458e-ab3b-52404bb21603'], u'description': u'', u'provider:physical_network': u'datacentre', u'tags': [], u'ipv6_address_scope': None, u'updated_at': u'2018-05-25T06:17:02Z', u'is_default': False, u'revision_number': 5, u'port_security_enabled': True, u'mtu': 1500, u'id': u'c6bc58df-e3b9-4bf6-8e71-7467451b945c', u'provider:segmentation_id': None, u'router:external': True, u'availability_zone_hints': [], u'availability_zones': [], u'qos_policy_id': None, u'admin_state_up': True, u'name': u'external', u'created_at': u'2018-05-25T06:16:58Z', u'provider:network_type': u'flat', u'tenant_id': u'e44c27cbc727498995754a89e3628dd4', u'ipv4_address_scope': None, u'shared': False, u'project_id': u'e44c27cbc727498995754a89e3628dd4'}) 2018-05-25 06:33:51,118 - xtesting.energy.energy - INFO - API recorder available at : http://energy.opnfv.fr/resources/recorders/environment/lf-pod1 2018-05-25 06:33:51,119 - xtesting.energy.energy - DEBUG - Getting current scenario 2018-05-25 06:33:51,586 - xtesting.energy.energy - DEBUG - Starting recording 2018-05-25 06:33:51,586 - xtesting.energy.energy - DEBUG - Submitting scenario (vping_ssh/running) 2018-05-25 06:33:51,998 - functest.opnfv_tests.openstack.vping.vping_ssh - INFO - Begin virtual environment setup 2018-05-25 06:33:51,999 - functest.opnfv_tests.openstack.vping.vping_ssh - INFO - vPing Start Time:'2018-05-25 06:33:51' 2018-05-25 06:33:51,999 - functest.opnfv_tests.openstack.vping.vping_ssh - INFO - Creating image with name: 'functest-vping--4eddadfa-15d2-429b-8da3-e0e69b022aec' 2018-05-25 06:33:51,999 - functest.opnfv_tests.openstack.vping.vping_ssh - INFO - Image metadata: None 2018-05-25 06:33:53,618 - functest.opnfv_tests.openstack.vping.vping_ssh - DEBUG - image: Munch({u'status': u'queued', u'owner_specified.shade.object': u'images/functest-vping--4eddadfa-15d2-429b-8da3-e0e69b022aec', u'tags': [], u'container_format': u'bare', u'min_ram': 0, u'updated_at': u'2018-05-25T06:33:52Z', u'owner_specified.shade.sha256': u'a8dd75ecffd4cdd96072d60c2237b448e0c8b2bc94d57f10fdbc8c481d9005b8', u'locations': [], u'min_disk': 0, u'visibility': u'shared', u'file': u'/v2/images/e60dd2b0-a843-4bd4-b3db-d9da719154fe/file', u'owner': u'51534bd63d854b6c878cd0603da66c99', u'virtual_size': None, u'owner_specified.shade.md5': u'443b7623e27ecf03dc9e01ee93f67afe', u'id': u'e60dd2b0-a843-4bd4-b3db-d9da719154fe', u'size': None, u'name': u'functest-vping--4eddadfa-15d2-429b-8da3-e0e69b022aec', u'checksum': None, u'self': u'/v2/images/e60dd2b0-a843-4bd4-b3db-d9da719154fe', u'disk_format': u'qcow2', u'protected': False, u'created_at': u'2018-05-25T06:33:52Z', u'schema': u'/v2/schemas/image'}) 2018-05-25 06:33:53,618 - functest.opnfv_tests.openstack.vping.vping_ssh - INFO - Creating network with name: 'vping-net-4eddadfa-15d2-429b-8da3-e0e69b022aec' 2018-05-25 06:33:53,870 - functest.opnfv_tests.openstack.vping.vping_ssh - DEBUG - network: Munch({u'status': u'ACTIVE', u'subnets': [], u'description': u'', u'provider:physical_network': None, u'tags': [], u'ipv6_address_scope': None, u'updated_at': u'2018-05-25T06:33:53Z', u'is_default': False, u'revision_number': 3, u'port_security_enabled': True, u'provider:network_type': u'geneve', u'id': u'c46f6548-4787-4440-b064-894932b373dd', u'provider:segmentation_id': 22, u'router:external': False, u'availability_zone_hints': [], u'availability_zones': [], u'qos_policy_id': None, 
u'admin_state_up': True, u'name': u'vping-net-4eddadfa-15d2-429b-8da3-e0e69b022aec', u'created_at': u'2018-05-25T06:33:53Z', u'mtu': 1442, u'tenant_id': u'51534bd63d854b6c878cd0603da66c99', u'ipv4_address_scope': None, u'shared': False, u'project_id': u'51534bd63d854b6c878cd0603da66c99'}) 2018-05-25 06:33:54,373 - functest.opnfv_tests.openstack.vping.vping_ssh - DEBUG - subnet: Munch({u'description': u'', u'tags': [], u'updated_at': u'2018-05-25T06:33:54Z', u'ipv6_ra_mode': None, u'allocation_pools': [{u'start': u'192.168.130.2', u'end': u'192.168.130.254'}], u'host_routes': [], u'revision_number': 0, u'ipv6_address_mode': None, u'cidr': u'192.168.130.0/24', u'id': u'f08d2066-13c5-44e1-afd2-81cff551ec94', u'subnetpool_id': None, u'service_types': [], u'name': u'vping-subnet-4eddadfa-15d2-429b-8da3-e0e69b022aec', u'enable_dhcp': True, u'network_id': u'c46f6548-4787-4440-b064-894932b373dd', u'tenant_id': u'51534bd63d854b6c878cd0603da66c99', u'created_at': u'2018-05-25T06:33:54Z', u'dns_nameservers': [u'8.8.8.8'], u'gateway_ip': u'192.168.130.1', u'ip_version': 4, u'project_id': u'51534bd63d854b6c878cd0603da66c99'}) 2018-05-25 06:33:54,373 - functest.opnfv_tests.openstack.vping.vping_ssh - INFO - Creating router with name: 'vping-router-4eddadfa-15d2-429b-8da3-e0e69b022aec' 2018-05-25 06:33:55,754 - functest.opnfv_tests.openstack.vping.vping_ssh - DEBUG - router: Munch({u'status': u'ACTIVE', u'external_gateway_info': {u'network_id': u'c6bc58df-e3b9-4bf6-8e71-7467451b945c', u'enable_snat': True, u'external_fixed_ips': [{u'subnet_id': u'c4948979-8bd8-458e-ab3b-52404bb21603', u'ip_address': u'172.30.9.206'}]}, u'description': u'', u'tags': [], u'tenant_id': u'51534bd63d854b6c878cd0603da66c99', u'created_at': u'2018-05-25T06:33:54Z', u'admin_state_up': True, u'updated_at': u'2018-05-25T06:33:55Z', u'revision_number': 2, u'routes': [], u'project_id': u'51534bd63d854b6c878cd0603da66c99', u'id': u'ebebd818-09ae-42cf-9ff5-c252a9ef351d', u'name': u'vping-router-4eddadfa-15d2-429b-8da3-e0e69b022aec'}) 2018-05-25 06:33:57,224 - functest.opnfv_tests.openstack.vping.vping_ssh - INFO - Creating flavor with name: 'vping-flavor-4eddadfa-15d2-429b-8da3-e0e69b022aec' 2018-05-25 06:33:57,406 - functest.opnfv_tests.openstack.vping.vping_ssh - DEBUG - flavor: Munch({'name': u'vping-flavor-4eddadfa-15d2-429b-8da3-e0e69b022aec', 'ephemeral': 0, 'ram': 512, 'is_disabled': False, 'properties': Munch({u'OS-FLV-DISABLED:disabled': False, u'OS-FLV-EXT-DATA:ephemeral': 0, u'os-flavor-access:is_public': True}), u'OS-FLV-DISABLED:disabled': False, 'vcpus': 1, 'extra_specs': Munch({}), 'location': Munch({'project': Munch({'domain_id': None, 'id': u'51534bd63d854b6c878cd0603da66c99', 'name': 'admin', 'domain_name': 'Default'}), 'zone': None, 'region_name': 'regionOne', 'cloud': 'envvars'}), u'os-flavor-access:is_public': True, 'rxtx_factor': 1.0, 'is_public': True, u'OS-FLV-EXT-DATA:ephemeral': 0, 'disk': 1, 'id': u'071c2ea3-a469-4123-9025-8bf73ddf0bc9', 'swap': 0}) 2018-05-25 06:33:57,851 - functest.opnfv_tests.openstack.vping.vping_ssh - INFO - Creating VM 1 instance with name: 'opnfv-vping-1-4eddadfa-15d2-429b-8da3-e0e69b022aec' 2018-05-25 06:34:06,658 - functest.opnfv_tests.openstack.vping.vping_ssh - DEBUG - vm1: Munch({'vm_state': u'active', u'OS-EXT-STS:task_state': None, 'addresses': Munch({u'vping-net-4eddadfa-15d2-429b-8da3-e0e69b022aec': [Munch({u'OS-EXT-IPS-MAC:mac_addr': u'fa:16:3e:20:06:0d', u'version': 4, u'addr': u'192.168.130.9', u'OS-EXT-IPS:type': u'fixed'})]}), 'terminated_at': None, 'image': 
Munch({u'id': u'e60dd2b0-a843-4bd4-b3db-d9da719154fe'}), u'OS-EXT-AZ:availability_zone': u'nova', u'OS-EXT-STS:vm_state': u'active', u'OS-EXT-SRV-ATTR:instance_name': u'instance-0000000b', u'OS-SRV-USG:launched_at': u'2018-05-25T06:26:38.000000', 'flavor': Munch({u'id': u'071c2ea3-a469-4123-9025-8bf73ddf0bc9'}), 'az': u'nova', 'id': u'54fef8e7-21cc-4f19-ac31-252567caedcc', 'security_groups': [Munch({u'name': u'vping-sg-4eddadfa-15d2-429b-8da3-e0e69b022aec'})], u'os-extended-volumes:volumes_attached': [], 'user_id': u'3f8554483ae64176989c964091a3d18b', 'disk_config': u'MANUAL', u'OS-DCF:diskConfig': u'MANUAL', 'networks': {}, 'accessIPv4': '', 'accessIPv6': '', 'location': Munch({'project': Munch({'domain_id': None, 'id': u'51534bd63d854b6c878cd0603da66c99', 'name': 'admin', 'domain_name': 'Default'}), 'zone': u'nova', 'region_name': 'regionOne', 'cloud': 'envvars'}), 'power_state': 1, 'public_v4': '', 'progress': 0, u'OS-EXT-STS:power_state': 1, 'interface_ip': '', 'launched_at': u'2018-05-25T06:26:38.000000', 'metadata': Munch({}), 'status': u'ACTIVE', 'updated': u'2018-05-25T06:34:04Z', 'hostId': u'b04b179bb822edeed16993bea1ea4334fc709be32dfa47bb73e85f9a', u'OS-EXT-SRV-ATTR:host': u'overcloud-novacompute-1.opnfvlf.org', u'OS-SRV-USG:terminated_at': None, 'key_name': None, 'public_v6': '', 'private_v4': u'192.168.130.9', 'cloud': 'envvars', 'host_id': u'b04b179bb822edeed16993bea1ea4334fc709be32dfa47bb73e85f9a', 'task_state': None, 'properties': Munch({u'OS-EXT-STS:task_state': None, u'OS-EXT-SRV-ATTR:host': u'overcloud-novacompute-1.opnfvlf.org', u'OS-SRV-USG:terminated_at': None, u'OS-DCF:diskConfig': u'MANUAL', u'os-extended-volumes:volumes_attached': [], u'OS-EXT-STS:vm_state': u'active', u'OS-EXT-SRV-ATTR:instance_name': u'instance-0000000b', u'OS-SRV-USG:launched_at': u'2018-05-25T06:26:38.000000', u'OS-EXT-SRV-ATTR:hypervisor_hostname': u'overcloud-novacompute-1.opnfvlf.org', u'OS-EXT-STS:power_state': 1, u'OS-EXT-AZ:availability_zone': u'nova'}), 'project_id': u'51534bd63d854b6c878cd0603da66c99', u'OS-EXT-SRV-ATTR:hypervisor_hostname': u'overcloud-novacompute-1.opnfvlf.org', 'name': u'opnfv-vping-1-4eddadfa-15d2-429b-8da3-e0e69b022aec', 'adminPass': u'3cKNESzrfymy', 'tenant_id': u'51534bd63d854b6c878cd0603da66c99', 'region': 'regionOne', 'created': u'2018-05-25T06:33:58Z', 'has_config_drive': True, 'volumes': [], 'config_drive': u'True'}) 2018-05-25 06:34:08,318 - functest.opnfv_tests.openstack.vping.vping_ssh - DEBUG - vm1 console: [ 0.000000] Initializing cgroup subsys cpuset [ 0.000000] Initializing cgroup subsys cpu [ 0.000000] Initializing cgroup subsys cpuacct [ 0.000000] Linux version 4.4.0-28-generic (buildd@lcy01-13) (gcc version 5.3.1 20160413 (Ubuntu 5.3.1-14ubuntu2.1) ) #47-Ubuntu SMP Fri Jun 24 10:09:13 UTC 2016 (Ubuntu 4.4.0-28.47-generic 4.4.13) [ 0.000000] Command line: LABEL=cirros-rootfs ro console=tty1 console=ttyS0 [ 0.000000] KERNEL supported cpus: [ 0.000000] Intel GenuineIntel [ 0.000000] AMD AuthenticAMD [ 0.000000] Centaur CentaurHauls [ 0.000000] x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 [ 0.000000] x86/fpu: Supporting XSAVE feature 0x01: 'x87 floating point registers' [ 0.000000] x86/fpu: Supporting XSAVE feature 0x02: 'SSE registers' [ 0.000000] x86/fpu: Supporting XSAVE feature 0x04: 'AVX registers' [ 0.000000] x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format. [ 0.000000] x86/fpu: Using 'eager' FPU context switches. 
[ 0.000000] e820: BIOS-provided physical RAM map: [ 0.000000] BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable [ 0.000000] BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved [ 0.000000] BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved [ 0.000000] BIOS-e820: [mem 0x0000000000100000-0x000000001ffdbfff] usable [ 0.000000] BIOS-e820: [mem 0x000000001ffdc000-0x000000001fffffff] reserved [ 0.000000] BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved [ 0.000000] BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved [ 0.000000] NX (Execute Disable) protection: active [ 0.000000] SMBIOS 2.8 present. [ 0.000000] Hypervisor detected: KVM [ 0.000000] e820: last_pfn = 0x1ffdc max_arch_pfn = 0x400000000 [ 0.000000] x86/PAT: Configuration [0-7]: WB WC UC- UC WB WC UC- WT [ 0.000000] found SMP MP-table at [mem 0x000f72f0-0x000f72ff] mapped at [ffff8800000f72f0] [ 0.000000] Scanning 1 areas for low memory corruption [ 0.000000] Using GB pages for direct mapping [ 0.000000] RAMDISK: [mem 0x1fb16000-0x1ffcbfff] [ 0.000000] ACPI: Early table checksum verification disabled [ 0.000000] ACPI: RSDP 0x00000000000F70A0 000014 (v00 BOCHS ) [ 0.000000] ACPI: RSDT 0x000000001FFE14C9 00002C (v01 BOCHS BXPCRSDT 00000001 BXPC 00000001) [ 0.000000] ACPI: FACP 0x000000001FFE13DD 000074 (v01 BOCHS BXPCFACP 00000001 BXPC 00000001) [ 0.000000] ACPI: DSDT 0x000000001FFE0040 00139D (v01 BOCHS BXPCDSDT 00000001 BXPC 00000001) [ 0.000000] ACPI: FACS 0x000000001FFE0000 000040 [ 0.000000] ACPI: APIC 0x000000001FFE1451 000078 (v01 BOCHS BXPCAPIC 00000001 BXPC 00000001) [ 0.000000] No NUMA configuration found [ 0.000000] Faking a node at [mem 0x0000000000000000-0x000000001ffdbfff] [ 0.000000] NODE_DATA(0) allocated [mem 0x1ffd7000-0x1ffdbfff] [ 0.000000] kvm-clock: Using msrs 4b564d01 and 4b564d00 [ 0.000000] kvm-clock: cpu 0, msr 0:1ffd3001, primary cpu clock [ 0.000000] kvm-clock: using sched offset of 505977637 cycles [ 0.000000] clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns [ 0.000000] Zone ranges: [ 0.000000] DMA [mem 0x0000000000001000-0x0000000000ffffff] [ 0.000000] DMA32 [mem 0x0000000001000000-0x000000001ffdbfff] [ 0.000000] Normal empty [ 0.000000] Device empty [ 0.000000] Movable zone start for each node [ 0.000000] Early memory node ranges [ 0.000000] node 0: [mem 0x0000000000001000-0x000000000009efff] [ 0.000000] node 0: [mem 0x0000000000100000-0x000000001ffdbfff] [ 0.000000] Initmem setup node 0 [mem 0x0000000000001000-0x000000001ffdbfff] [ 0.000000] ACPI: PM-Timer IO Port: 0x608 [ 0.000000] ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) [ 0.000000] IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) [ 0.000000] Using ACPI (MADT) for SMP configuration information [ 0.000000] smpboot: Allowing 1 CPUs, 0 hotplug CPUs [ 0.000000] PM: Registered nosave memory: [mem 0x00000000-0x00000fff] [ 0.000000] PM: Registered nosave memory: [mem 0x0009f000-0x0009ffff] [ 0.000000] PM: Registered nosave memory: [mem 0x000a0000-0x000effff] [ 0.000000] PM: Registered nosave memory: [mem 0x000f0000-0x000fffff] [ 0.000000] e820: [mem 
0x20000000-0xfeffbfff] available for PCI devices [ 0.000000] Booting paravirtualized kernel on KVM [ 0.000000] clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 7645519600211568 ns [ 0.000000] setup_percpu: NR_CPUS:256 nr_cpumask_bits:256 nr_cpu_ids:1 nr_node_ids:1 [ 0.000000] PERCPU: Embedded 33 pages/cpu @ffff88001f800000 s98008 r8192 d28968 u2097152 [ 0.000000] KVM setup async PF for cpu 0 [ 0.000000] kvm-stealtime: cpu 0, msr 1f80d940 [ 0.000000] PV qspinlock hash table entries: 256 (order: 0, 4096 bytes) [ 0.000000] Built 1 zonelists in Node order, mobility grouping on. Total pages: 128869 [ 0.000000] Policy zone: DMA32 [ 0.000000] Kernel command line: LABEL=cirros-rootfs ro console=tty1 console=ttyS0 [ 0.000000] PID hash table entries: 2048 (order: 2, 16384 bytes) [ 0.000000] Memory: 491796K/523752K available (8368K kernel code, 1280K rwdata, 3928K rodata, 1480K init, 1292K bss, 31956K reserved, 0K cma-reserved) [ 0.000000] SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=1, Nodes=1 [ 0.000000] Hierarchical RCU implementation. [ 0.000000] Build-time adjustment of leaf fanout to 64. [ 0.000000] RCU restricting CPUs from NR_CPUS=256 to nr_cpu_ids=1. [ 0.000000] RCU: Adjusting geometry for rcu_fanout_leaf=64, nr_cpu_ids=1 [ 0.000000] NR_IRQS:16640 nr_irqs:256 16 [ 0.000000] Console: colour VGA+ 80x25 [ 0.000000] console [tty1] enabled [ 0.000000] console [ttyS0] enabled [ 0.000000] tsc: Detected 3491.910 MHz processor [ 0.128665] Calibrating delay loop (skipped) preset value.. 6983.82 BogoMIPS (lpj=13967640) [ 0.130287] pid_max: default: 32768 minimum: 301 [ 0.131165] ACPI: Core revision 20150930 [ 0.132540] ACPI: 1 ACPI AML tables successfully acquired and loaded [ 0.133832] Security Framework initialized [ 0.134636] Yama: becoming mindful. [ 0.135354] AppArmor: AppArmor initialized [ 0.136211] Dentry cache hash table entries: 65536 (order: 7, 524288 bytes) [ 0.137522] Inode-cache hash table entries: 32768 (order: 6, 262144 bytes) [ 0.138772] Mount-cache hash table entries: 1024 (order: 1, 8192 bytes) [ 0.139955] Mountpoint-cache hash table entries: 1024 (order: 1, 8192 bytes) [ 0.141315] Initializing cgroup subsys io [ 0.142110] Initializing cgroup subsys memory [ 0.142956] Initializing cgroup subsys devices [ 0.143824] Initializing cgroup subsys freezer [ 0.144677] Initializing cgroup subsys net_cls [ 0.145526] Initializing cgroup subsys perf_event [ 0.146416] Initializing cgroup subsys net_prio [ 0.147276] Initializing cgroup subsys hugetlb [ 0.148143] Initializing cgroup subsys pids [ 0.149005] CPU: Physical Processor ID: 0 [ 0.150511] mce: CPU supports 10 MCE banks [ 0.151361] Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 [ 0.152353] Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 [ 0.163007] Freeing SMP alternatives memory: 28K (ffffffff820b4000 - ffffffff820bb000) [ 0.169340] ftrace: allocating 31920 entries in 125 pages [ 0.195756] smpboot: Max logical packages: 1 [ 0.196596] smpboot: APIC(0) Converting physical 0 to logical package 0 [ 0.197926] x2apic enabled [ 0.198726] Switched APIC routing to physical x2apic. [ 0.200471] ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 [ 0.201570] smpboot: CPU0: Intel Core Processor (Haswell, no TSX) (family: 0x6, model: 0x3c, stepping: 0x1) [ 0.203532] Performance Events: unsupported p6 CPU model 60 no PMU driver, software events only. 
[ 0.205286] KVM setup paravirtual spinlock [ 0.206635] x86: Booted up 1 node, 1 CPUs [ 0.207429] smpboot: Total of 1 processors activated (6983.82 BogoMIPS) [ 0.208792] devtmpfs: initialized [ 0.210497] evm: security.selinux [ 0.211183] evm: security.SMACK64 [ 0.211864] evm: security.SMACK64EXEC [ 0.212596] evm: security.SMACK64TRANSMUTE [ 0.213386] evm: security.SMACK64MMAP [ 0.214115] evm: security.ima [ 0.214746] evm: security.capability [ 0.215538] clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 7645041785100000 ns [ 0.217358] pinctrl core: initialized pinctrl subsystem [ 0.218445] RTC time: 6:26:38, date: 05/25/18 [ 0.219383] NET: Registered protocol family 16 [ 0.220340] cpuidle: using governor ladder [ 0.221141] cpuidle: using governor menu [ 0.221917] PCCT header not found. [ 0.222663] ACPI: bus type PCI registered [ 0.223449] acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 [ 0.224668] PCI: Using configuration type 1 for base access [ 0.226418] ACPI: Added _OSI(Module Device) [ 0.227244] ACPI: Added _OSI(Processor Device) [ 0.228089] ACPI: Added _OSI(3.0 _SCP Extensions) [ 0.228974] ACPI: Added _OSI(Processor Aggregator Device) [ 0.231101] ACPI: Interpreter enabled [ 0.231868] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S1_] (20150930/hwxface-580) [ 0.233683] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S2_] (20150930/hwxface-580) [ 0.235485] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S3_] (20150930/hwxface-580) [ 0.237302] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S4_] (20150930/hwxface-580) [ 0.239107] ACPI: (supports S0 S5) [ 0.239800] ACPI: Using IOAPIC for interrupt routing [ 0.240854] PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug [ 0.243917] ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) [ 0.245049] acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI] [ 0.246258] acpi PNP0A03:00: _OSC failed (AE_NOT_FOUND); disabling ASPM [ 0.247430] acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended PCI configuration space under this bridge. 
[ 0.249625] acpiphp: Slot [3] registered [ 0.250416] acpiphp: Slot [4] registered [ 0.251198] acpiphp: Slot [5] registered [ 0.251988] acpiphp: Slot [6] registered [ 0.252780] acpiphp: Slot [7] registered [ 0.253571] acpiphp: Slot [8] registered [ 0.254360] acpiphp: Slot [9] registered [ 0.255140] acpiphp: Slot [10] registered [ 0.255935] acpiphp: Slot [11] registered [ 0.256739] acpiphp: Slot [12] registered [ 0.257537] acpiphp: Slot [13] registered [ 0.258337] acpiphp: Slot [14] registered [ 0.259131] acpiphp: Slot [15] registered [ 0.270054] acpiphp: Slot [16] registered [ 0.270855] acpiphp: Slot [17] registered [ 0.271653] acpiphp: Slot [18] registered [ 0.272446] acpiphp: Slot [19] registered [ 0.273235] acpiphp: Slot [20] registered [ 0.274030] acpiphp: Slot [21] registered [ 0.274833] acpiphp: Slot [22] registered [ 0.275632] acpiphp: Slot [23] registered [ 0.276428] acpiphp: Slot [24] registered [ 0.277221] acpiphp: Slot [25] registered [ 0.278017] acpiphp: Slot [26] registered [ 0.278828] acpiphp: Slot [27] registered [ 0.279629] acpiphp: Slot [28] registered [ 0.280428] acpiphp: Slot [29] registered [ 0.281219] acpiphp: Slot [30] registered [ 0.282024] acpiphp: Slot [31] registered [ 0.282823] PCI host bridge to bus 0000:00 [ 0.283623] pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] [ 0.284808] pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] [ 0.285991] pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] [ 0.287413] pci_bus 0000:00: root bus resource [mem 0x20000000-0xfebfffff window] [ 0.288859] pci_bus 0000:00: root bus resource [bus 00-ff] [ 0.295335] pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] [ 0.296610] pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6] [ 0.297772] pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] [ 0.299024] pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376] [ 0.306225] pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI [ 0.307653] pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB [ 0.356260] ACPI: PCI Interrupt Link [LNKA] (IRQs 5 *10 11) [ 0.357723] ACPI: PCI Interrupt Link [LNKB] (IRQs 5 *10 11) [ 0.359099] ACPI: PCI Interrupt Link [LNKC] (IRQs 5 10 *11) [ 0.360472] ACPI: PCI Interrupt Link [LNKD] (IRQs 5 10 *11) [ 0.361825] ACPI: PCI Interrupt Link [LNKS] (IRQs *9) [ 0.363886] ACPI: Enabled 2 GPEs in block 00 to 0F [ 0.365011] vgaarb: setting as boot device: PCI:0000:00:02.0 [ 0.366017] vgaarb: device added: PCI:0000:00:02.0,decodes=io+mem,owns=io+mem,locks=none [ 0.367536] vgaarb: loaded [ 0.368123] vgaarb: bridge control possible 0000:00:02.0 [ 0.369224] SCSI subsystem initialized [ 0.370015] ACPI: bus type USB registered [ 0.370797] usbcore: registered new interface driver usbfs [ 0.371794] usbcore: registered new interface driver hub [ 0.372818] usbcore: registered new device driver usb [ 0.373830] PCI: Using ACPI for IRQ routing [ 0.374827] NetLabel: Initializing [ 0.375509] NetLabel: domain hash size = 128 [ 0.376335] NetLabel: protocols = UNLABELED CIPSOv4 [ 0.377255] NetLabel: unlabeled traffic allowed by default [ 0.378315] clocksource: Switched to clocksource kvm-clock [ 0.383796] AppArmor: AppArmor Filesystem Enabled [ 0.384736] pnp: PnP ACPI init [ 0.385623] pnp: PnP ACPI: found 5 devices [ 0.391955] clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns [ 0.393668] NET: Registered protocol family 2 [ 0.394602] TCP established hash table entries: 4096 (order: 3, 32768 bytes) [ 0.395851] 
TCP bind hash table entries: 4096 (order: 4, 65536 bytes) [ 0.396991] TCP: Hash tables configured (established 4096 bind 4096) [ 0.398145] UDP hash table entries: 256 (order: 1, 8192 bytes) [ 0.399229] UDP-Lite hash table entries: 256 (order: 1, 8192 bytes) [ 0.400386] NET: Registered protocol family 1 [ 0.401227] pci 0000:00:00.0: Limiting direct PCI/PCI transfers [ 0.402321] pci 0000:00:01.0: PIIX3: Enabling Passive Release [ 0.403402] pci 0000:00:01.0: Activating ISA DMA hang workarounds [ 0.417301] ACPI: PCI Interrupt Link [LNKD] enabled at IRQ 11 [ 0.431577] Trying to unpack rootfs image as initramfs... [ 0.481863] Freeing initrd memory: 4824K (ffff88001fb16000 - ffff88001ffcc000) [ 0.495798] Scanning for low memory corruption every 60 seconds [ 0.497110] futex hash table entries: 256 (order: 2, 16384 bytes) [ 0.498247] audit: initializing netlink subsys (disabled) [ 0.499270] audit: type=2000 audit(1527229599.524:1): initialized [ 0.500571] Initialise system trusted keyring [ 0.501493] HugeTLB registered 1 GB page size, pre-allocated 0 pages [ 0.502662] HugeTLB registered 2 MB page size, pre-allocated 0 pages [ 0.504625] zbud: loaded [ 0.505329] VFS: Disk quotas dquot_6.6.0 [ 0.506140] VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) [ 0.507644] fuse init (API version 7.23) [ 0.508539] Key type big_key registered [ 0.509337] Allocating IMA MOK and blacklist keyrings. [ 0.510439] Key type asymmetric registered [ 0.511254] Asymmetric key parser 'x509' registered [ 0.512205] Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) [ 0.514040] io scheduler noop registered [ 0.514843] io scheduler deadline registered (default) [ 0.515845] io scheduler cfq registered [ 0.516672] pci_hotplug: PCI Hot Plug PCI Core version: 0.5 [ 0.517711] pciehp: PCI Express Hot Plug Controller Driver version: 0.4 [ 0.518972] input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0 [ 0.520425] ACPI: Power Button [PWRF] [ 0.521250] GHES: HEST is not enabled! [ 0.534935] ACPI: PCI Interrupt Link [LNKC] enabled at IRQ 10 [ 0.563108] ACPI: PCI Interrupt Link [LNKA] enabled at IRQ 10 [ 0.565374] Serial: 8250/16550 driver, 32 ports, IRQ sharing enabled [ 0.590839] 00:04: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A [ 0.593048] Linux agpgart interface v0.103 [ 0.595077] brd: module loaded [ 0.596235] loop: module loaded [ 0.600934] GPT:Primary header thinks Alt. header is not at the end of the disk. [ 0.602388] GPT:90111 != 2097151 [ 0.603059] GPT:Alternate GPT header not at the end of the disk. [ 0.604140] GPT:90111 != 2097151 [ 0.604811] GPT: Use GNU Parted to correct GPT errors. 
[ 0.605769] vda: vda1 vda15 [ 0.607519] scsi host0: ata_piix [ 0.608242] scsi host1: ata_piix [ 0.608947] ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc0a0 irq 14 [ 0.610151] ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc0a8 irq 15 [ 0.611585] libphy: Fixed MDIO Bus: probed [ 0.612395] tun: Universal TUN/TAP device driver, 1.6 [ 0.613334] tun: (C) 1999-2004 Max Krasnyansky [ 0.615548] PPP generic driver version 2.4.2 [ 0.617234] ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver [ 0.623240] ehci-pci: EHCI PCI platform driver [ 0.624601] ehci-platform: EHCI generic platform driver [ 0.626049] ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver [ 0.627724] ohci-pci: OHCI PCI platform driver [ 0.629097] ohci-platform: OHCI generic platform driver [ 0.630424] uhci_hcd: USB Universal Host Controller Interface driver [ 0.645188] uhci_hcd 0000:00:01.2: UHCI Host Controller [ 0.646311] uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1 [ 0.647883] uhci_hcd 0000:00:01.2: detected 2 ports [ 0.649034] uhci_hcd 0000:00:01.2: irq 11, io base 0x0000c040 [ 0.650284] usb usb1: New USB device found, idVendor=1d6b, idProduct=0001 [ 0.652001] usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1 [ 0.653953] usb usb1: Product: UHCI Host Controller [ 0.655138] usb usb1: Manufacturer: Linux 4.4.0-28-generic uhci_hcd [ 0.656573] usb usb1: SerialNumber: 0000:00:01.2 [ 0.657758] hub 1-0:1.0: USB hub found [ 0.658800] hub 1-0:1.0: 2 ports detected [ 0.659924] i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 [ 0.662400] serio: i8042 KBD port at 0x60,0x64 irq 1 [ 0.663490] serio: i8042 AUX port at 0x60,0x64 irq 12 [ 0.664638] mousedev: PS/2 mouse device common for all mice [ 0.666042] input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1 [ 0.668254] rtc_cmos 00:00: RTC can wake from S4 [ 0.669456] rtc_cmos 00:00: rtc core: registered rtc_cmos as rtc0 [ 0.670779] rtc_cmos 00:00: alarms up to one day, y3k, 114 bytes nvram [ 0.672164] i2c /dev entries driver [ 0.673054] device-mapper: uevent: version 1.0.3 [ 0.674131] device-mapper: ioctl: 4.34.0-ioctl (2015-10-28) initialised: dm-devel@redhat.com [ 0.676079] ledtrig-cpu: registered to indicate activity on CPUs [ 0.677561] NET: Registered protocol family 10 [ 0.678701] NET: Registered protocol family 17 [ 0.679697] Key type dns_resolver registered [ 0.680742] microcode: CPU0 sig=0x306c1, pf=0x1, revision=0x1 [ 0.681989] microcode: Microcode Update Driver: v2.01 , Peter Oruba [ 0.684090] registered taskstats version 1 [ 0.685021] Loading compiled-in X.509 certificates [ 0.686579] Loaded X.509 cert 'Build time autogenerated kernel key: 6ea974e07bd0b30541f4d838a3b7a8a80d5ca9af' [ 0.688700] zswap: loaded using pool lzo/zbud [ 0.690727] Key type trusted registered [ 0.692777] Key type encrypted registered [ 0.693742] AppArmor: AppArmor sha1 policy hashing enabled [ 0.694942] ima: No TPM chip found, activating TPM-bypass! [ 0.696133] evm: HMAC attrs: 0x1 [ 0.697170] Magic number: 2:880:418 [ 0.698177] rtc_cmos 00:00: setting system clock to 2018-05-25 06:26:39 UTC (1527229599) [ 0.700078] BIOS EDD facility v0.16 2004-Jun-25, 0 devices found [ 0.701347] EDD information not available. 
[ 0.766856] ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 [ 0.768521] ata1.00: configured for MWDMA2 [ 0.769872] scsi 0:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 [ 0.782494] sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray [ 0.784096] cdrom: Uniform CD-ROM driver Revision: 3.20 [ 0.785599] sr 0:0:0:0: Attached scsi generic sg0 type 5 [ 0.787735] Freeing unused kernel memory: 1480K (ffffffff81f42000 - ffffffff820b4000) [ 0.789519] Write protecting the kernel read-only data: 14336k [ 0.791343] Freeing unused kernel memory: 1860K (ffff88000182f000 - ffff880001a00000) [ 0.793298] Freeing unused kernel memory: 168K (ffff880001dd6000 - ffff880001e00000) info: initramfs: up at 0.63 modprobe: module virtio_pci not found in modules.dep modprobe: module virtio_blk not found in modules.dep modprobe: module virtio_net not found in modules.dep modprobe: module vfat not found in modules.dep modprobe: module nls_cp437 not found in modules.dep info: copying initramfs to /dev/vda1 info: initramfs loading root from /dev/vda1 info: /etc/init.d/rc.sysinit: up at 1.40 info: container: none Starting logging: OK modprobe: module virtio_pci not found in modules.dep modprobe: module virtio_blk not found in modules.dep modprobe: module virtio_net not found in modules.dep modprobe: module vfat not found in modules.dep modprobe: module nls_cp437 not found in modules.dep WARN: /etc/rc3.d/S10-load-modules failed Initializing random number generator... [ 1.603973] random: dd urandom read with 13 bits of entropy available done. Starting acpid: OK Starting network... udhcpc (v1.23.2) started Sending discover... Sending select for 192.168.130.9... Lease of 192.168.130.9 obtained, lease time 43200 Top of dropbear init script Starting dropbear sshd: OK GROWROOT: CHANGED: partition=1 start=18432 old: size=71647 end=90079 new: size=2078687,end=2097119 no userdata for datasource === system information === Platform: RDO OpenStack Compute Container: none Arch: x86_64 CPU(s): 1 @ 3491.910 MHz Cores/Sockets/Threads: 1/1/1 Virt-type: RAM Size: 488MB Disks: NAME MAJ:MIN SIZE LABEL MOUNTPOINT sr0 11:0 468992 config-2 vda 253:0 1073741824 vda1 253:1 1064287744 cirros-rootfs / vda15 253:15 8388608 === sshd host keys === -----BEGIN SSH HOST KEY KEYS----- ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQCfbr9vJQ814Ct8q6JeQ5L6vomI8r8VTd3qMHsLInE35+pWaitrdD/pTNa3lE0DbM1YDuBKeoVelJsjbtmlIPjcERfjP7YKMNBHlIoEXZrkNhaaUplEmRyUBBkGBjE+6LGrTkduHxyv4aU2ZAYSD3qvubizcoBb+GXcXfPMx6Mo1UHeOqnpi/pqlz1bMUzHloISLL8kzkPR6mEu9IlnP5EmA5ZRzRokUPZABdl2zMiMkPna+hZjGfUcIZsrZR4apu3iWaOSb5YAgltb1tLRkTy+zxwT8Umb4+TTOHrOaY7wvjKkrUAxJ+dYOpUEw4DQ+4FzjIys1zHDQxamT3WXGwPN root@opnfv-vping-1-4eddadfa-15d2-429b-8da3-e0e69b022aec ssh-dss AAAAB3NzaC1kc3MAAACBAJKzeKYtFGkPAXFjFJuhjug8E40FrDUbx3jqoRG2920ZCaM+Btn8vndKAApfFZfOzces70sBDE0iJjD/E3iW5sEzbiotZTaBRve++kl5BhyVWdB2Hj2rCxXj0a34CS3zTFKW0Vzv6HA9U8befLtiTNrr0JE9cn1Ng7LK0lO8on33AAAAFQDMmZUO/n+nI8MFgfoiIIWyZzo/AQAAAIAe4VRK4CSmq6p937lyJryJT8lMmncEHIve8MzSQjCYeJuBPMOM0egjwcQ48KGgo3fZJk1Dxh8Yr4KYMnuzVF1RjABExN8WE5N8Cx6hgreWlNnjcJRx9EsEqpi0rqakYvcbTgPVXSEGNeFw4g4HC7GGR5FYbu2x6ANpbUEaW7us1gAAAIB3Ru2hNNayOdPgmQGDvwpEAdzUMHcj6WU5MNxXnh3VbldTvrU5ybJHdP9E0Wb57PsIcwj4KA/8hr+jyK19xSIJwkRNH2noDGNHIGUtsu2CEc8227peoKhDDcIwpJPQuf6QzfLB6KCWAvjVbQEtdrdGpbp0DT7kZW9dHjgJp+zmHA== root@opnfv-vping-1-4eddadfa-15d2-429b-8da3-e0e69b022aec -----END SSH HOST KEY KEYS----- === network info === if-info: lo,up,127.0.0.1,8,, if-info: eth0,up,192.168.130.9,24,fe80::f816:3eff:fe20:60d/64, ip-route:default via 192.168.130.1 dev eth0 
ip-route:192.168.130.0/24 dev eth0 src 192.168.130.9 ip-route6:fe80::/64 dev eth0 metric 256 ip-route6:unreachable default dev lo metric -1 error -101 ip-route6:ff00::/8 dev eth0 metric 256 ip-route6:unreachable default dev lo metric -1 error -101 === datasource: configdrive local === instance-id: 54fef8e7-21cc-4f19-ac31-252567caedcc name: opnfv-vping-1-4eddadfa-15d2-429b-8da3-e0e69b022aec availability-zone: nova local-hostname: opnfv-vping-1-4eddadfa-15d2-429b-8da3-e0e69b022aec launch-index: 0 === cirros: current=0.4.0 uptime=2.07 === ____ ____ ____ / __/ __ ____ ____ / __ \/ __/ / /__ / // __// __// /_/ /\ \ \___//_//_/ /_/ \____/___/ http://cirros-cloud.net login as 'cirros' user. default password: 'gocubsgo'. use 'sudo' for root. opnfv-vping-1-4eddadfa-15d2-429b-8da3-e0e69b022aec login: 2018-05-25 06:34:08,319 - functest.opnfv_tests.openstack.vping.vping_ssh - INFO - Creating keypair with name: 'vping-keypair-4eddadfa-15d2-429b-8da3-e0e69b022aec' 2018-05-25 06:34:08,656 - functest.opnfv_tests.openstack.vping.vping_ssh - DEBUG - keypair: Munch({'public_key': u'ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC7B5FiB0B7Do95xR4tZWzPbN98bJXqgITNCwHi3aeD5L2X5e8M4rzEXic4sf0FwYzGwQBXfq94ZOPzcTk8HJfpdPJCZ1Y/1AFwAHzqTacBl1TUaLkm3kL7G7kxAaCV+cfIunIm6swkKAZbMIPwmU6rLhgvQShI2O1occrpRlbPTZPJ3YpNE9lvk93R1EjuW2tIdDjY+gmLO6HBSsFO+i3qYROYRhYqeKaoXBrT/oZB6PG8G4YZrVGdaeJAeaaALXi8iw6WiIHnjNkW7+85HYgv4EzqkiFxq3svubJFCP1qZjgX/M3/vGdYmJ0AoxMlbwd5hnviDO6Qbx0hzotW0lZF Generated-by-Nova', 'private_key': u'-----BEGIN RSA PRIVATE KEY-----\nMIIEpQIBAAKCAQEAuweRYgdAew6PecUeLWVsz2zffGyV6oCEzQsB4t2ng+S9l+Xv\nDOK8xF4nOLH9BcGMxsEAV36veGTj83E5PByX6XTyQmdWP9QBcAB86k2nAZdU1Gi5\nJt5C+xu5MQGglfnHyLpyJurMJCgGWzCD8JlOqy4YL0EoSNjtaHHK6UZWz02Tyd2K\nTRPZb5Pd0dRI7ltrSHQ42PoJizuhwUrBTvot6mETmEYWKnimqFwa0/6GQejxvBuG\nGa1RnWniQHmmgC14vIsOloiB54zZFu/vOR2IL+BM6pIhcat7L7myRQj9amY4F/zN\n/7xnWJidAKMTJW8HeYZ74gzukG8dIc6LVtJWRQIDAQABAoIBAQC0sygk1SrYegXn\nOarhY2gQtHjshyEFE6y7SpJE9bD+focrdk1TXtHQy8MLRPRYllsEQL6qykyQfrG6\nqD8LM/kV5xaVT7AGNTg6VU5bNjFQGT0tiAyzX/TJFk0D6zWTEWSULIdT0HDx0fXq\nLuKbGBPo0b0uEr7wOx6NVbwdTAdde0VOzp8O3NOZH3oFAzB4Tli6iuAwLHJcPjZS\nNr6OlqXzmORi1rgJkEJ8jk0iW6Ac3CuPDf6PTXBkixB5SuZ6prUvQimXJveC21HW\nK3jY60jpxKPtS+aX2ssVKVtzRmOQOjDSrBt1qNVh3SK3ZWLKP8bH2ka3olwa+S/u\nQ8UVI+YhAoGBAPgeZ1kXFLhQi5J0KTgcfZBOJMwXYdpPoP1925CN39HBJHOqLbEw\ne2sKhxVn0egAFocnMucKlCBok+pmzwAv0oXvoERMqEMSC8tSL4x0KN7ksBLzTbFa\nHOTPnZd7BTjR57avytuvtcu2Zi+ELQAkyU4UzXbDdAxqMj2hk0FldWsJAoGBAMD4\naavqJV3MkAScbmTpoNFZBg2dnPHs/De5B4Ql+boCvm0Lsqb/dlVQkRqOinqi2nCm\nOxSNTzM6XFurxdPoqcWWZlAq+hkOhgUVCVu0YgFmJBYjC70uaBT5hkJ/Qopgs5P5\nHhEEKEg3FUmLrcIL68okIZclS9IIVJPh6YRxD9RdAoGBAJ91ssff/JIEOd8yxnbo\nYI5Imn+MG3hZqsafh2fctlaxAYNQgLMazIbbqjtIkO/Adrn/qEgyVUaKz11bG3gs\nQ+mOOnsKpS0NwQS32hUzZjzxznMvaOQtXNp0z/xVtOJyjK+tRPtxbq3wmLW7BczM\n149V8UJ9lOyRp55SZDgoQ5E5AoGBALhVlDwQ85jirEB7Xkkvk9vneozPHvlLNLPW\nIIPv8tnpfRaVshcsuVFOIQ6JU2dK4ffyE0XSpvF8snUvZU7EVkjVHu893qLI6OU5\n7zKW4XgMpjQvTitthSdkJQiooFunfGPB+SKwIfq6A6+5qkZPNPJoCV5k1kTQiFqr\n13IYvtJRAoGAIOzePFzub/aF3JkHSNpzCuM0+qjd79sMwW/zCYJaheyGhuPc0bsZ\n4FpjkR0qg23OaumyDLFEKJQGyG07+nmuxcKMII9lBEXnmPSHqUYwLI9PpW5pc8hs\n6NBe7toS5nBNf1yCLTP9TsKupOkT6GBDZ67e8NtZ56MCRg2tUPqdgRQ=\n-----END RSA PRIVATE KEY-----\n', 'user_id': u'3f8554483ae64176989c964091a3d18b', 'name': u'vping-keypair-4eddadfa-15d2-429b-8da3-e0e69b022aec', 'created_at': '2018-05-25T06:34:08.656317', 'properties': Munch({}), 'location': Munch({'project': Munch({'domain_id': None, 'id': u'51534bd63d854b6c878cd0603da66c99', 'name': 'admin', 'domain_name': 'Default'}), 'zone': None, 
'region_name': 'regionOne', 'cloud': 'envvars'}), 'fingerprint': u'f4:a7:e2:51:23:ad:30:0a:09:a2:0d:cb:34:b9:3a:1b', 'type': 'ssh', 'id': u'vping-keypair-4eddadfa-15d2-429b-8da3-e0e69b022aec'}) 2018-05-25 06:34:08,656 - functest.opnfv_tests.openstack.vping.vping_ssh - DEBUG - private_key: -----BEGIN RSA PRIVATE KEY----- MIIEpQIBAAKCAQEAuweRYgdAew6PecUeLWVsz2zffGyV6oCEzQsB4t2ng+S9l+Xv DOK8xF4nOLH9BcGMxsEAV36veGTj83E5PByX6XTyQmdWP9QBcAB86k2nAZdU1Gi5 Jt5C+xu5MQGglfnHyLpyJurMJCgGWzCD8JlOqy4YL0EoSNjtaHHK6UZWz02Tyd2K TRPZb5Pd0dRI7ltrSHQ42PoJizuhwUrBTvot6mETmEYWKnimqFwa0/6GQejxvBuG Ga1RnWniQHmmgC14vIsOloiB54zZFu/vOR2IL+BM6pIhcat7L7myRQj9amY4F/zN /7xnWJidAKMTJW8HeYZ74gzukG8dIc6LVtJWRQIDAQABAoIBAQC0sygk1SrYegXn OarhY2gQtHjshyEFE6y7SpJE9bD+focrdk1TXtHQy8MLRPRYllsEQL6qykyQfrG6 qD8LM/kV5xaVT7AGNTg6VU5bNjFQGT0tiAyzX/TJFk0D6zWTEWSULIdT0HDx0fXq LuKbGBPo0b0uEr7wOx6NVbwdTAdde0VOzp8O3NOZH3oFAzB4Tli6iuAwLHJcPjZS Nr6OlqXzmORi1rgJkEJ8jk0iW6Ac3CuPDf6PTXBkixB5SuZ6prUvQimXJveC21HW K3jY60jpxKPtS+aX2ssVKVtzRmOQOjDSrBt1qNVh3SK3ZWLKP8bH2ka3olwa+S/u Q8UVI+YhAoGBAPgeZ1kXFLhQi5J0KTgcfZBOJMwXYdpPoP1925CN39HBJHOqLbEw e2sKhxVn0egAFocnMucKlCBok+pmzwAv0oXvoERMqEMSC8tSL4x0KN7ksBLzTbFa HOTPnZd7BTjR57avytuvtcu2Zi+ELQAkyU4UzXbDdAxqMj2hk0FldWsJAoGBAMD4 aavqJV3MkAScbmTpoNFZBg2dnPHs/De5B4Ql+boCvm0Lsqb/dlVQkRqOinqi2nCm OxSNTzM6XFurxdPoqcWWZlAq+hkOhgUVCVu0YgFmJBYjC70uaBT5hkJ/Qopgs5P5 HhEEKEg3FUmLrcIL68okIZclS9IIVJPh6YRxD9RdAoGBAJ91ssff/JIEOd8yxnbo YI5Imn+MG3hZqsafh2fctlaxAYNQgLMazIbbqjtIkO/Adrn/qEgyVUaKz11bG3gs Q+mOOnsKpS0NwQS32hUzZjzxznMvaOQtXNp0z/xVtOJyjK+tRPtxbq3wmLW7BczM 149V8UJ9lOyRp55SZDgoQ5E5AoGBALhVlDwQ85jirEB7Xkkvk9vneozPHvlLNLPW IIPv8tnpfRaVshcsuVFOIQ6JU2dK4ffyE0XSpvF8snUvZU7EVkjVHu893qLI6OU5 7zKW4XgMpjQvTitthSdkJQiooFunfGPB+SKwIfq6A6+5qkZPNPJoCV5k1kTQiFqr 13IYvtJRAoGAIOzePFzub/aF3JkHSNpzCuM0+qjd79sMwW/zCYJaheyGhuPc0bsZ 4FpjkR0qg23OaumyDLFEKJQGyG07+nmuxcKMII9lBEXnmPSHqUYwLI9PpW5pc8hs 6NBe7toS5nBNf1yCLTP9TsKupOkT6GBDZ67e8NtZ56MCRg2tUPqdgRQ= -----END RSA PRIVATE KEY----- 2018-05-25 06:34:09,437 - functest.opnfv_tests.openstack.vping.vping_ssh - INFO - Creating VM 2 instance with name: 'opnfv-vping-2-ssh--4eddadfa-15d2-429b-8da3-e0e69b022aec' 2018-05-25 06:34:17,629 - functest.opnfv_tests.openstack.vping.vping_ssh - DEBUG - vm2: Munch({'vm_state': u'active', u'OS-EXT-STS:task_state': None, 'addresses': Munch({u'vping-net-4eddadfa-15d2-429b-8da3-e0e69b022aec': [Munch({u'OS-EXT-IPS-MAC:mac_addr': u'fa:16:3e:88:a8:a9', u'version': 4, u'addr': u'192.168.130.7', u'OS-EXT-IPS:type': u'fixed'})]}), 'terminated_at': None, 'image': Munch({u'id': u'e60dd2b0-a843-4bd4-b3db-d9da719154fe'}), u'OS-EXT-AZ:availability_zone': u'nova', u'OS-EXT-STS:vm_state': u'active', u'OS-EXT-SRV-ATTR:instance_name': u'instance-0000000c', u'OS-SRV-USG:launched_at': u'2018-05-25T06:26:41.000000', 'flavor': Munch({u'id': u'071c2ea3-a469-4123-9025-8bf73ddf0bc9'}), 'az': u'nova', 'id': u'441495d7-eeff-45fe-891d-7ea22145b754', 'security_groups': [Munch({u'name': u'vping-sg-4eddadfa-15d2-429b-8da3-e0e69b022aec'})], u'os-extended-volumes:volumes_attached': [], 'user_id': u'3f8554483ae64176989c964091a3d18b', 'disk_config': u'MANUAL', u'OS-DCF:diskConfig': u'MANUAL', 'networks': {}, 'accessIPv4': '', 'accessIPv6': '', 'location': Munch({'project': Munch({'domain_id': None, 'id': u'51534bd63d854b6c878cd0603da66c99', 'name': 'admin', 'domain_name': 'Default'}), 'zone': u'nova', 'region_name': 'regionOne', 'cloud': 'envvars'}), 'power_state': 1, 'public_v4': '', 'progress': 0, u'OS-EXT-STS:power_state': 1, 'interface_ip': '', 'launched_at': 
u'2018-05-25T06:26:41.000000', 'metadata': Munch({}), 'status': u'ACTIVE', 'updated': u'2018-05-25T06:34:15Z', 'hostId': u'd27fc3f0d3057d1c48851918bfc2a1dff425f6c67f06bbabc2f8d673', u'OS-EXT-SRV-ATTR:host': u'overcloud-novacompute-0.opnfvlf.org', u'OS-SRV-USG:terminated_at': None, 'key_name': u'vping-keypair-4eddadfa-15d2-429b-8da3-e0e69b022aec', 'public_v6': '', 'private_v4': u'192.168.130.7', 'cloud': 'envvars', 'host_id': u'd27fc3f0d3057d1c48851918bfc2a1dff425f6c67f06bbabc2f8d673', 'task_state': None, 'properties': Munch({u'OS-EXT-STS:task_state': None, u'OS-EXT-SRV-ATTR:host': u'overcloud-novacompute-0.opnfvlf.org', u'OS-SRV-USG:terminated_at': None, u'OS-DCF:diskConfig': u'MANUAL', u'os-extended-volumes:volumes_attached': [], u'OS-EXT-STS:vm_state': u'active', u'OS-EXT-SRV-ATTR:instance_name': u'instance-0000000c', u'OS-SRV-USG:launched_at': u'2018-05-25T06:26:41.000000', u'OS-EXT-SRV-ATTR:hypervisor_hostname': u'overcloud-novacompute-0.opnfvlf.org', u'OS-EXT-STS:power_state': 1, u'OS-EXT-AZ:availability_zone': u'nova'}), 'project_id': u'51534bd63d854b6c878cd0603da66c99', u'OS-EXT-SRV-ATTR:hypervisor_hostname': u'overcloud-novacompute-0.opnfvlf.org', 'name': u'opnfv-vping-2-ssh--4eddadfa-15d2-429b-8da3-e0e69b022aec', 'adminPass': u'jRYyASe5pndT', 'tenant_id': u'51534bd63d854b6c878cd0603da66c99', 'region': 'regionOne', 'created': u'2018-05-25T06:34:10Z', 'has_config_drive': True, 'volumes': [], 'config_drive': u'True'}) 2018-05-25 06:34:19,506 - functest.opnfv_tests.openstack.vping.vping_ssh - DEBUG - floating_ip2: Munch({'status': u'DOWN', 'router_id': u'ebebd818-09ae-42cf-9ff5-c252a9ef351d', 'properties': Munch({u'tags': []}), 'description': u'', u'tags': [], 'tenant_id': u'51534bd63d854b6c878cd0603da66c99', 'created_at': u'2018-05-25T06:34:19Z', 'attached': True, 'updated_at': u'2018-05-25T06:34:19Z', 'id': u'62f1a800-3b3a-48e0-8533-fb5a54c82ba5', 'floating_network_id': u'c6bc58df-e3b9-4bf6-8e71-7467451b945c', 'fixed_ip_address': u'192.168.130.7', 'floating_ip_address': u'172.30.9.201', 'location': Munch({'project': Munch({'domain_id': None, 'id': u'51534bd63d854b6c878cd0603da66c99', 'name': 'admin', 'domain_name': 'Default'}), 'zone': None, 'region_name': 'regionOne', 'cloud': 'envvars'}), 'revision_number': 0, 'router': u'ebebd818-09ae-42cf-9ff5-c252a9ef351d', 'project_id': u'51534bd63d854b6c878cd0603da66c99', 'port_id': u'fddf176f-24ce-43ea-af24-cc0562405fe5', 'port': u'fddf176f-24ce-43ea-af24-cc0562405fe5', 'network': u'c6bc58df-e3b9-4bf6-8e71-7467451b945c'}) 2018-05-25 06:34:21,182 - functest.opnfv_tests.openstack.vping.vping_ssh - DEBUG - vm2 console: [ 0.000000] Initializing cgroup subsys cpuset [ 0.000000] Initializing cgroup subsys cpu [ 0.000000] Initializing cgroup subsys cpuacct [ 0.000000] Linux version 4.4.0-28-generic (buildd@lcy01-13) (gcc version 5.3.1 20160413 (Ubuntu 5.3.1-14ubuntu2.1) ) #47-Ubuntu SMP Fri Jun 24 10:09:13 UTC 2016 (Ubuntu 4.4.0-28.47-generic 4.4.13) [ 0.000000] Command line: LABEL=cirros-rootfs ro console=tty1 console=ttyS0 [ 0.000000] KERNEL supported cpus: [ 0.000000] Intel GenuineIntel [ 0.000000] AMD AuthenticAMD [ 0.000000] Centaur CentaurHauls [ 0.000000] x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 [ 0.000000] x86/fpu: Supporting XSAVE feature 0x01: 'x87 floating point registers' [ 0.000000] x86/fpu: Supporting XSAVE feature 0x02: 'SSE registers' [ 0.000000] x86/fpu: Supporting XSAVE feature 0x04: 'AVX registers' [ 0.000000] x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format. 
[ 0.000000] x86/fpu: Using 'eager' FPU context switches. [ 0.000000] e820: BIOS-provided physical RAM map: [ 0.000000] BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable [ 0.000000] BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved [ 0.000000] BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved [ 0.000000] BIOS-e820: [mem 0x0000000000100000-0x000000001ffdbfff] usable [ 0.000000] BIOS-e820: [mem 0x000000001ffdc000-0x000000001fffffff] reserved [ 0.000000] BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved [ 0.000000] BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved [ 0.000000] NX (Execute Disable) protection: active [ 0.000000] SMBIOS 2.8 present. [ 0.000000] Hypervisor detected: KVM [ 0.000000] e820: last_pfn = 0x1ffdc max_arch_pfn = 0x400000000 [ 0.000000] x86/PAT: Configuration [0-7]: WB WC UC- UC WB WC UC- WT [ 0.000000] found SMP MP-table at [mem 0x000f72f0-0x000f72ff] mapped at [ffff8800000f72f0] [ 0.000000] Scanning 1 areas for low memory corruption [ 0.000000] Using GB pages for direct mapping [ 0.000000] RAMDISK: [mem 0x1fb16000-0x1ffcbfff] [ 0.000000] ACPI: Early table checksum verification disabled [ 0.000000] ACPI: RSDP 0x00000000000F70A0 000014 (v00 BOCHS ) [ 0.000000] ACPI: RSDT 0x000000001FFE14C9 00002C (v01 BOCHS BXPCRSDT 00000001 BXPC 00000001) [ 0.000000] ACPI: FACP 0x000000001FFE13DD 000074 (v01 BOCHS BXPCFACP 00000001 BXPC 00000001) [ 0.000000] ACPI: DSDT 0x000000001FFE0040 00139D (v01 BOCHS BXPCDSDT 00000001 BXPC 00000001) [ 0.000000] ACPI: FACS 0x000000001FFE0000 000040 [ 0.000000] ACPI: APIC 0x000000001FFE1451 000078 (v01 BOCHS BXPCAPIC 00000001 BXPC 00000001) [ 0.000000] No NUMA configuration found [ 0.000000] Faking a node at [mem 0x0000000000000000-0x000000001ffdbfff] [ 0.000000] NODE_DATA(0) allocated [mem 0x1ffd7000-0x1ffdbfff] [ 0.000000] kvm-clock: Using msrs 4b564d01 and 4b564d00 [ 0.000000] kvm-clock: cpu 0, msr 0:1ffd3001, primary cpu clock [ 0.000000] kvm-clock: using sched offset of 545983507 cycles [ 0.000000] clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns [ 0.000000] Zone ranges: [ 0.000000] DMA [mem 0x0000000000001000-0x0000000000ffffff] [ 0.000000] DMA32 [mem 0x0000000001000000-0x000000001ffdbfff] [ 0.000000] Normal empty [ 0.000000] Device empty [ 0.000000] Movable zone start for each node [ 0.000000] Early memory node ranges [ 0.000000] node 0: [mem 0x0000000000001000-0x000000000009efff] [ 0.000000] node 0: [mem 0x0000000000100000-0x000000001ffdbfff] [ 0.000000] Initmem setup node 0 [mem 0x0000000000001000-0x000000001ffdbfff] [ 0.000000] ACPI: PM-Timer IO Port: 0x608 [ 0.000000] ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) [ 0.000000] IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) [ 0.000000] Using ACPI (MADT) for SMP configuration information [ 0.000000] smpboot: Allowing 1 CPUs, 0 hotplug CPUs [ 0.000000] PM: Registered nosave memory: [mem 0x00000000-0x00000fff] [ 0.000000] PM: Registered nosave memory: [mem 0x0009f000-0x0009ffff] [ 0.000000] PM: Registered nosave memory: [mem 0x000a0000-0x000effff] [ 0.000000] PM: Registered nosave memory: [mem 
0x000f0000-0x000fffff] [ 0.000000] e820: [mem 0x20000000-0xfeffbfff] available for PCI devices [ 0.000000] Booting paravirtualized kernel on KVM [ 0.000000] clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 7645519600211568 ns [ 0.000000] setup_percpu: NR_CPUS:256 nr_cpumask_bits:256 nr_cpu_ids:1 nr_node_ids:1 [ 0.000000] PERCPU: Embedded 33 pages/cpu @ffff88001f800000 s98008 r8192 d28968 u2097152 [ 0.000000] KVM setup async PF for cpu 0 [ 0.000000] kvm-stealtime: cpu 0, msr 1f80d940 [ 0.000000] PV qspinlock hash table entries: 256 (order: 0, 4096 bytes) [ 0.000000] Built 1 zonelists in Node order, mobility grouping on. Total pages: 128869 [ 0.000000] Policy zone: DMA32 [ 0.000000] Kernel command line: LABEL=cirros-rootfs ro console=tty1 console=ttyS0 [ 0.000000] PID hash table entries: 2048 (order: 2, 16384 bytes) [ 0.000000] Memory: 491796K/523752K available (8368K kernel code, 1280K rwdata, 3928K rodata, 1480K init, 1292K bss, 31956K reserved, 0K cma-reserved) [ 0.000000] SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=1, Nodes=1 [ 0.000000] Hierarchical RCU implementation. [ 0.000000] Build-time adjustment of leaf fanout to 64. [ 0.000000] RCU restricting CPUs from NR_CPUS=256 to nr_cpu_ids=1. [ 0.000000] RCU: Adjusting geometry for rcu_fanout_leaf=64, nr_cpu_ids=1 [ 0.000000] NR_IRQS:16640 nr_irqs:256 16 [ 0.000000] Console: colour VGA+ 80x25 [ 0.000000] console [tty1] enabled [ 0.000000] console [ttyS0] enabled [ 0.000000] tsc: Detected 3491.912 MHz processor [ 0.146311] Calibrating delay loop (skipped) preset value.. 6983.82 BogoMIPS (lpj=13967648) [ 0.147926] pid_max: default: 32768 minimum: 301 [ 0.148799] ACPI: Core revision 20150930 [ 0.150587] ACPI: 1 ACPI AML tables successfully acquired and loaded [ 0.152246] Security Framework initialized [ 0.153280] Yama: becoming mindful. [ 0.154217] AppArmor: AppArmor initialized [ 0.155306] Dentry cache hash table entries: 65536 (order: 7, 524288 bytes) [ 0.156965] Inode-cache hash table entries: 32768 (order: 6, 262144 bytes) [ 0.158584] Mount-cache hash table entries: 1024 (order: 1, 8192 bytes) [ 0.160086] Mountpoint-cache hash table entries: 1024 (order: 1, 8192 bytes) [ 0.161842] Initializing cgroup subsys io [ 0.162867] Initializing cgroup subsys memory [ 0.163945] Initializing cgroup subsys devices [ 0.165046] Initializing cgroup subsys freezer [ 0.166140] Initializing cgroup subsys net_cls [ 0.167225] Initializing cgroup subsys perf_event [ 0.168367] Initializing cgroup subsys net_prio [ 0.169483] Initializing cgroup subsys hugetlb [ 0.170574] Initializing cgroup subsys pids [ 0.171669] CPU: Physical Processor ID: 0 [ 0.173404] mce: CPU supports 10 MCE banks [ 0.174459] Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 [ 0.175706] Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 [ 0.188096] Freeing SMP alternatives memory: 28K (ffffffff820b4000 - ffffffff820bb000) [ 0.194759] ftrace: allocating 31920 entries in 125 pages [ 0.223295] smpboot: Max logical packages: 1 [ 0.224175] smpboot: APIC(0) Converting physical 0 to logical package 0 [ 0.225511] x2apic enabled [ 0.226305] Switched APIC routing to physical x2apic. [ 0.228054] ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 [ 0.229165] smpboot: CPU0: Intel Core Processor (Haswell, no TSX) (family: 0x6, model: 0x3c, stepping: 0x1) [ 0.231123] Performance Events: unsupported p6 CPU model 60 no PMU driver, software events only. 
[ 0.232875] KVM setup paravirtual spinlock [ 0.234192] x86: Booted up 1 node, 1 CPUs [ 0.234981] smpboot: Total of 1 processors activated (6983.82 BogoMIPS) [ 0.236364] devtmpfs: initialized [ 0.238079] evm: security.selinux [ 0.238766] evm: security.SMACK64 [ 0.239445] evm: security.SMACK64EXEC [ 0.240171] evm: security.SMACK64TRANSMUTE [ 0.240961] evm: security.SMACK64MMAP [ 0.241695] evm: security.ima [ 0.242322] evm: security.capability [ 0.243125] clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 7645041785100000 ns [ 0.244944] pinctrl core: initialized pinctrl subsystem [ 0.246038] RTC time: 6:26:41, date: 05/25/18 [ 0.246983] NET: Registered protocol family 16 [ 0.247948] cpuidle: using governor ladder [ 0.248768] cpuidle: using governor menu [ 0.249538] PCCT header not found. [ 0.250283] ACPI: bus type PCI registered [ 0.251076] acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 [ 0.252295] PCI: Using configuration type 1 for base access [ 0.254053] ACPI: Added _OSI(Module Device) [ 0.254880] ACPI: Added _OSI(Processor Device) [ 0.255724] ACPI: Added _OSI(3.0 _SCP Extensions) [ 0.256601] ACPI: Added _OSI(Processor Aggregator Device) [ 0.258753] ACPI: Interpreter enabled [ 0.259517] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S1_] (20150930/hwxface-580) [ 0.261313] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S2_] (20150930/hwxface-580) [ 0.263100] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S3_] (20150930/hwxface-580) [ 0.264912] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S4_] (20150930/hwxface-580) [ 0.266724] ACPI: (supports S0 S5) [ 0.267425] ACPI: Using IOAPIC for interrupt routing [ 0.268557] PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug [ 0.271644] ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) [ 0.272763] acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI] [ 0.273961] acpi PNP0A03:00: _OSC failed (AE_NOT_FOUND); disabling ASPM [ 0.275119] acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended PCI configuration space under this bridge. 
[ 0.277290] acpiphp: Slot [3] registered [ 0.278079] acpiphp: Slot [4] registered [ 0.278865] acpiphp: Slot [5] registered [ 0.279645] acpiphp: Slot [6] registered [ 0.280423] acpiphp: Slot [7] registered [ 0.281205] acpiphp: Slot [8] registered [ 0.281997] acpiphp: Slot [9] registered [ 0.282778] acpiphp: Slot [10] registered [ 0.283570] acpiphp: Slot [11] registered [ 0.284369] acpiphp: Slot [12] registered [ 0.285164] acpiphp: Slot [13] registered [ 0.285967] acpiphp: Slot [14] registered [ 0.286764] acpiphp: Slot [15] registered [ 0.297683] acpiphp: Slot [16] registered [ 0.298483] acpiphp: Slot [17] registered [ 0.299275] acpiphp: Slot [18] registered [ 0.300084] acpiphp: Slot [19] registered [ 0.300875] acpiphp: Slot [20] registered [ 0.301674] acpiphp: Slot [21] registered [ 0.302474] acpiphp: Slot [22] registered [ 0.303264] acpiphp: Slot [23] registered [ 0.304061] acpiphp: Slot [24] registered [ 0.304856] acpiphp: Slot [25] registered [ 0.305648] acpiphp: Slot [26] registered [ 0.306449] acpiphp: Slot [27] registered [ 0.307242] acpiphp: Slot [28] registered [ 0.308044] acpiphp: Slot [29] registered [ 0.308842] acpiphp: Slot [30] registered [ 0.309635] acpiphp: Slot [31] registered [ 0.310425] PCI host bridge to bus 0000:00 [ 0.311217] pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] [ 0.312406] pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] [ 0.313585] pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] [ 0.314998] pci_bus 0000:00: root bus resource [mem 0x20000000-0xfebfffff window] [ 0.316420] pci_bus 0000:00: root bus resource [bus 00-ff] [ 0.322725] pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] [ 0.323979] pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6] [ 0.325128] pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] [ 0.326361] pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376] [ 0.333409] pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI [ 0.334847] pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB [ 0.384399] ACPI: PCI Interrupt Link [LNKA] (IRQs 5 *10 11) [ 0.385898] ACPI: PCI Interrupt Link [LNKB] (IRQs 5 *10 11) [ 0.387320] ACPI: PCI Interrupt Link [LNKC] (IRQs 5 10 *11) [ 0.388726] ACPI: PCI Interrupt Link [LNKD] (IRQs 5 10 *11) [ 0.390099] ACPI: PCI Interrupt Link [LNKS] (IRQs *9) [ 0.391368] ACPI: Enabled 2 GPEs in block 00 to 0F [ 0.392903] vgaarb: setting as boot device: PCI:0000:00:02.0 [ 0.393980] vgaarb: device added: PCI:0000:00:02.0,decodes=io+mem,owns=io+mem,locks=none [ 0.395515] vgaarb: loaded [ 0.396113] vgaarb: bridge control possible 0000:00:02.0 [ 0.397269] SCSI subsystem initialized [ 0.398097] ACPI: bus type USB registered [ 0.398914] usbcore: registered new interface driver usbfs [ 0.399939] usbcore: registered new interface driver hub [ 0.400938] usbcore: registered new device driver usb [ 0.401992] PCI: Using ACPI for IRQ routing [ 0.403026] NetLabel: Initializing [ 0.403737] NetLabel: domain hash size = 128 [ 0.404584] NetLabel: protocols = UNLABELED CIPSOv4 [ 0.405529] NetLabel: unlabeled traffic allowed by default [ 0.406631] clocksource: Switched to clocksource kvm-clock [ 0.412501] AppArmor: AppArmor Filesystem Enabled [ 0.413476] pnp: PnP ACPI init [ 0.414390] pnp: PnP ACPI: found 5 devices [ 0.421110] clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns [ 0.423392] NET: Registered protocol family 2 [ 0.424656] TCP established hash table entries: 4096 (order: 3, 32768 bytes) [ 0.426385] 
TCP bind hash table entries: 4096 (order: 4, 65536 bytes) [ 0.427979] TCP: Hash tables configured (established 4096 bind 4096) [ 0.429562] UDP hash table entries: 256 (order: 1, 8192 bytes) [ 0.431016] UDP-Lite hash table entries: 256 (order: 1, 8192 bytes) [ 0.432589] NET: Registered protocol family 1 [ 0.433736] pci 0000:00:00.0: Limiting direct PCI/PCI transfers [ 0.435212] pci 0000:00:01.0: PIIX3: Enabling Passive Release [ 0.436635] pci 0000:00:01.0: Activating ISA DMA hang workarounds [ 0.451056] ACPI: PCI Interrupt Link [LNKD] enabled at IRQ 11 [ 0.465671] Trying to unpack rootfs image as initramfs... [ 0.516258] Freeing initrd memory: 4824K (ffff88001fb16000 - ffff88001ffcc000) [ 0.530907] Scanning for low memory corruption every 60 seconds [ 0.532626] futex hash table entries: 256 (order: 2, 16384 bytes) [ 0.534169] audit: initializing netlink subsys (disabled) [ 0.535567] audit: type=2000 audit(1527229602.795:1): initialized [ 0.537282] Initialise system trusted keyring [ 0.538485] HugeTLB registered 1 GB page size, pre-allocated 0 pages [ 0.540058] HugeTLB registered 2 MB page size, pre-allocated 0 pages [ 0.542419] zbud: loaded [ 0.550247] VFS: Disk quotas dquot_6.6.0 [ 0.551408] VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) [ 0.553359] fuse init (API version 7.23) [ 0.554519] Key type big_key registered [ 0.555573] Allocating IMA MOK and blacklist keyrings. [ 0.557024] Key type asymmetric registered [ 0.558111] Asymmetric key parser 'x509' registered [ 0.559392] Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) [ 0.561330] io scheduler noop registered [ 0.562375] io scheduler deadline registered (default) [ 0.563694] io scheduler cfq registered [ 0.564792] pci_hotplug: PCI Hot Plug PCI Core version: 0.5 [ 0.566186] pciehp: PCI Express Hot Plug Controller Driver version: 0.4 [ 0.567878] input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0 [ 0.569794] ACPI: Power Button [PWRF] [ 0.571267] GHES: HEST is not enabled! [ 0.585269] ACPI: PCI Interrupt Link [LNKC] enabled at IRQ 10 [ 0.614486] ACPI: PCI Interrupt Link [LNKA] enabled at IRQ 10 [ 0.617349] Serial: 8250/16550 driver, 32 ports, IRQ sharing enabled [ 0.641597] 00:04: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A [ 0.644180] Linux agpgart interface v0.103 [ 0.646471] brd: module loaded [ 0.647828] loop: module loaded [ 0.652590] GPT:Primary header thinks Alt. header is not at the end of the disk. [ 0.654529] GPT:90111 != 2097151 [ 0.655423] GPT:Alternate GPT header not at the end of the disk. [ 0.656895] GPT:90111 != 2097151 [ 0.657781] GPT: Use GNU Parted to correct GPT errors. 
[ 0.659069] vda: vda1 vda15 [ 0.660714] scsi host0: ata_piix [ 0.661643] scsi host1: ata_piix [ 0.662536] ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc0a0 irq 14 [ 0.664137] ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc0a8 irq 15 [ 0.665965] libphy: Fixed MDIO Bus: probed [ 0.667231] tun: Universal TUN/TAP device driver, 1.6 [ 0.668518] tun: (C) 1999-2004 Max Krasnyansky [ 0.671042] PPP generic driver version 2.4.2 [ 0.672364] ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver [ 0.674111] ehci-pci: EHCI PCI platform driver [ 0.675379] ehci-platform: EHCI generic platform driver [ 0.676817] ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver [ 0.678459] ohci-pci: OHCI PCI platform driver [ 0.679711] ohci-platform: OHCI generic platform driver [ 0.681144] uhci_hcd: USB Universal Host Controller Interface driver [ 0.696130] uhci_hcd 0000:00:01.2: UHCI Host Controller [ 0.697587] uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1 [ 0.699704] uhci_hcd 0000:00:01.2: detected 2 ports [ 0.701161] uhci_hcd 0000:00:01.2: irq 11, io base 0x0000c040 [ 0.702786] usb usb1: New USB device found, idVendor=1d6b, idProduct=0001 [ 0.704575] usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1 [ 0.706715] usb usb1: Product: UHCI Host Controller [ 0.708076] usb usb1: Manufacturer: Linux 4.4.0-28-generic uhci_hcd [ 0.709748] usb usb1: SerialNumber: 0000:00:01.2 [ 0.711118] hub 1-0:1.0: USB hub found [ 0.712224] hub 1-0:1.0: 2 ports detected [ 0.713486] i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 [ 0.716276] serio: i8042 KBD port at 0x60,0x64 irq 1 [ 0.717662] serio: i8042 AUX port at 0x60,0x64 irq 12 [ 0.719102] mousedev: PS/2 mouse device common for all mice [ 0.720854] input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1 [ 0.723875] rtc_cmos 00:00: RTC can wake from S4 [ 0.725911] rtc_cmos 00:00: rtc core: registered rtc_cmos as rtc0 [ 0.728047] rtc_cmos 00:00: alarms up to one day, y3k, 114 bytes nvram [ 0.730247] i2c /dev entries driver [ 0.731621] device-mapper: uevent: version 1.0.3 [ 0.733325] device-mapper: ioctl: 4.34.0-ioctl (2015-10-28) initialised: dm-devel@redhat.com [ 0.736211] ledtrig-cpu: registered to indicate activity on CPUs [ 0.738554] NET: Registered protocol family 10 [ 0.739632] NET: Registered protocol family 17 [ 0.740503] Key type dns_resolver registered [ 0.741428] microcode: CPU0 sig=0x306c1, pf=0x1, revision=0x1 [ 0.742499] microcode: Microcode Update Driver: v2.01 , Peter Oruba [ 0.744233] registered taskstats version 1 [ 0.745050] Loading compiled-in X.509 certificates [ 0.746474] Loaded X.509 cert 'Build time autogenerated kernel key: 6ea974e07bd0b30541f4d838a3b7a8a80d5ca9af' [ 0.748314] zswap: loaded using pool lzo/zbud [ 0.750166] Key type trusted registered [ 0.751703] Key type encrypted registered [ 0.752525] AppArmor: AppArmor sha1 policy hashing enabled [ 0.753529] ima: No TPM chip found, activating TPM-bypass! [ 0.754542] evm: HMAC attrs: 0x1 [ 0.755485] Magic number: 2:880:418 [ 0.756320] rtc_cmos 00:00: setting system clock to 2018-05-25 06:26:42 UTC (1527229602) [ 0.757884] BIOS EDD facility v0.16 2004-Jun-25, 0 devices found [ 0.758983] EDD information not available. 
[ 0.819182] ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 [ 0.820602] ata1.00: configured for MWDMA2 [ 0.821791] scsi 0:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 [ 0.834254] sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray [ 0.835465] cdrom: Uniform CD-ROM driver Revision: 3.20 [ 0.836703] sr 0:0:0:0: Attached scsi generic sg0 type 5 [ 0.838553] Freeing unused kernel memory: 1480K (ffffffff81f42000 - ffffffff820b4000) [ 0.840088] Write protecting the kernel read-only data: 14336k [ 0.841630] Freeing unused kernel memory: 1860K (ffff88000182f000 - ffff880001a00000) [ 0.843402] Freeing unused kernel memory: 168K (ffff880001dd6000 - ffff880001e00000) info: initramfs: up at 0.65 modprobe: module virtio_pci not found in modules.dep modprobe: module virtio_blk not found in modules.dep modprobe: module virtio_net not found in modules.dep modprobe: module vfat not found in modules.dep modprobe: module nls_cp437 not found in modules.dep [ 0.899891] random: blkid urandom read with 10 bits of entropy available info: copying initramfs to /dev/vda1 info: initramfs loading root from /dev/vda1 info: /etc/init.d/rc.sysinit: up at 1.05 info: container: none Starting logging: OK modprobe: module virtio_pci not found in modules.dep modprobe: module virtio_blk not found in modules.dep modprobe: module virtio_net not found in modules.dep modprobe: module vfat not found in modules.dep modprobe: module nls_cp437 not found in modules.dep WARN: /etc/rc3.d/S10-load-modules failed Initializing random number generator... done. Starting acpid: OK Starting network... udhcpc (v1.23.2) started Sending discover... Sending select for 192.168.130.7... Lease of 192.168.130.7 obtained, lease time 43200 Top of dropbear init script Starting dropbear sshd: OK GROWROOT: CHANGED: partition=1 start=18432 old: size=71647 end=90079 new: size=2078687,end=2097119 no userdata for datasource === system information === Platform: RDO OpenStack Compute Container: none Arch: x86_64 CPU(s): 1 @ 3491.912 MHz Cores/Sockets/Threads: 1/1/1 Virt-type: RAM Size: 488MB Disks: NAME MAJ:MIN SIZE LABEL MOUNTPOINT sr0 11:0 468992 config-2 vda 253:0 1073741824 vda1 253:1 1064287744 cirros-rootfs / vda15 253:15 8388608 === sshd host keys === -----BEGIN SSH HOST KEY KEYS----- ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDbhUEryUpxEOGgQJydpxivB/y1IAwSdhOXO/Q4/4SsMoqs6vgFDlG/7XsAkXca70fbkzfH9zp5i3f3aCSg22Np3w4Rg5aeAHPv/5LpNYwOSOzBRWPgn47JPv82cmzNu8uRAoxBvm9iVEEehHjJiNaACTD4qqjX/9kcId/IaTT0ca/pu1MZZjsIFYFyQNFzOW3BnG8TucZLLhuz5K2MSVYHziLFby8BrgzEzTlvjumzm0ebCMhwMqvXOnMseDs1Jubj2whsMEb1VswESAW/Amcc8tnxKSbQ2qXPsu+IOPYhYheQ9CvhvhBJyAypMGO+o4m9I2V8K9hfnaEwUD46rqLz root@opnfv-vping-2-ssh--4eddadfa-15d2-429b-8da3-e0e69b022aec ssh-dss AAAAB3NzaC1kc3MAAACBAM6k2bxnyPzkCxKTRo1FmHdlDNlCtPx5yImZkwAjlghRiayhZLJFYx6wXvrPdqtdy8coKQQqZoRFnhzOnytnIguXL+KOJQuJjzvrxXZn0Hht3WtEmfkF328ao6uFc1DVB2r6zGcqO9VlN3NAzVWSiyc3Iu0OGhWNHRKYctGdExK9AAAAFQDK7IlLjWrzcQw6MrrU4egX5viLzQAAAIEAxlj/FKinTDAaUSlVzARzlSGLoESQ0ARz0++Re50ct6UROqzU8VkNmfWx+Qd8vWbtHUNpTbBlHF4rSyd9mVFPDfLPvZRHmjk6ZizFsGuQosRSnAbJBkVOLe8XzQ8bfljLTNcrYUIhv8RTlTQnxgU/VXGjhlgSBvDEqWv4yAs8780AAACAfW1MYhp0zqr6xsfaKE3uSPzk0Dr2D6Ag1v4Y4h7kp+pgK/0FuzaH7jUjtu3kVPOmQPAmgfGSPcXLFUF5Cu6hj0gytia9GJXB7JUjqV8PgWM/QKia2AP98fUvQVg1kvFxwv/i+0n4DtGxG2NdQyHFLDta7etxhJec/43tds78qE0= root@opnfv-vping-2-ssh--4eddadfa-15d2-429b-8da3-e0e69b022aec -----END SSH HOST KEY KEYS----- === network info === if-info: lo,up,127.0.0.1,8,, if-info: eth0,up,192.168.130.7,24,fe80::f816:3eff:fe88:a8a9/64, ip-route:default via 192.168.130.1 dev eth0 
ip-route:192.168.130.0/24 dev eth0 src 192.168.130.7 ip-route6:fe80::/64 dev eth0 metric 256 ip-route6:unreachable default dev lo metric -1 error -101 ip-route6:ff00::/8 dev eth0 metric 256 ip-route6:unreachable default dev lo metric -1 error -101 === datasource: configdrive local === instance-id: 441495d7-eeff-45fe-891d-7ea22145b754 name: opnfv-vping-2-ssh--4eddadfa-15d2-429b-8da3-e0e69b022aec availability-zone: nova local-hostname: opnfv-vping-2-ssh--4eddadfa-15d2-429b-8da3-e0e69b022aec launch-index: 0 === cirros: current=0.4.0 latest=0.4.0 uptime=2.83 === ____ ____ ____ / __/ __ ____ ____ / __ \/ __/ / /__ / // __// __// /_/ /\ \ \___//_//_/ /_/ \____/___/ http://cirros-cloud.net login as 'cirros' user. default password: 'gocubsgo'. use 'sudo' for root. opnfv-vping-2-ssh--4eddadfa-15d2-429b-8da3-e0e69b022aec login: 2018-05-25 06:34:21,183 - functest.opnfv_tests.openstack.vping.vping_ssh - INFO - Begin test execution 2018-05-25 06:34:23,886 - functest.opnfv_tests.openstack.vping.vping_ssh - DEBUG - ssh: 2018-05-25 06:34:23,890 - functest.opnfv_tests.openstack.vping.vping_ssh - DEBUG - ping output: >> 2018-05-25 06:34:23,894 - xtesting.energy.energy - DEBUG - Restoring previous scenario (default/default) 2018-05-25 06:34:23,894 - xtesting.energy.energy - DEBUG - Submitting scenario (default/default) 2018-05-25 06:34:24,433 - xtesting.core.testcase - INFO - The results were successfully pushed to DB http://testresults.opnfv.org/test/api/v1/results 2018-05-25 06:34:24,433 - xtesting.ci.run_tests - INFO - Test result: +-------------------+------------------+------------------+----------------+ | TEST CASE | PROJECT | DURATION | RESULT | +-------------------+------------------+------------------+----------------+ | vping_ssh | functest | 00:32 | PASS | +-------------------+------------------+------------------+----------------+ 2018-05-25 06:34:34,842 - xtesting.ci.run_tests - INFO - Running test case 'vping_userdata'... 
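Note (illustrative, not part of the captured log): the vping_ssh pass recorded above amounts to SSH-ing into VM2 through its floating IP (172.30.9.201 in this run) with the generated keypair and pinging VM1's fixed address (192.168.130.9 in this run). A minimal sketch of that check, assuming paramiko is available and reusing this run's addresses purely as placeholders; it is an approximation of the idea, not the functest implementation itself:

    # Illustrative sketch only -- not taken from the functest code base.
    import io
    import paramiko

    FLOATING_IP = "172.30.9.201"     # VM2 floating IP from this run (placeholder)
    TARGET_IP = "192.168.130.9"      # VM1 fixed IP from this run (placeholder)
    PRIVATE_KEY = "-----BEGIN RSA PRIVATE KEY-----\n..."  # keypair private key (elided)

    def vping_via_ssh(host, target, key_str, user="cirros"):
        """SSH into the jump VM and ping the target fixed IP once."""
        key = paramiko.RSAKey.from_private_key(io.StringIO(key_str))
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(host, username=user, pkey=key, timeout=10)
        try:
            _, stdout, _ = client.exec_command("ping -c 1 %s" % target)
            output = stdout.read().decode()
            # CirrOS ping prints "64 bytes from ..." on a successful reply.
            return "64 bytes from" in output, output
        finally:
            client.close()

    if __name__ == "__main__":
        ok, out = vping_via_ssh(FLOATING_IP, TARGET_IP, PRIVATE_KEY)
        print("PASS" if ok else "FAIL")
        print(out)

The vping_userdata variant that follows performs the same reachability check without SSH: the ping loop is injected as instance userdata and the result is read back from the VM console log.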
2018-05-25 06:34:35,524 - functest.opnfv_tests.openstack.vping.vping_base - DEBUG - ext_net: Munch({u'status': u'ACTIVE', u'subnets': [u'c4948979-8bd8-458e-ab3b-52404bb21603'], u'description': u'', u'provider:physical_network': u'datacentre', u'tags': [], u'ipv6_address_scope': None, u'updated_at': u'2018-05-25T06:17:02Z', u'is_default': False, u'revision_number': 5, u'port_security_enabled': True, u'mtu': 1500, u'id': u'c6bc58df-e3b9-4bf6-8e71-7467451b945c', u'provider:segmentation_id': None, u'router:external': True, u'availability_zone_hints': [], u'availability_zones': [], u'qos_policy_id': None, u'admin_state_up': True, u'name': u'external', u'created_at': u'2018-05-25T06:16:58Z', u'provider:network_type': u'flat', u'tenant_id': u'e44c27cbc727498995754a89e3628dd4', u'ipv4_address_scope': None, u'shared': False, u'project_id': u'e44c27cbc727498995754a89e3628dd4'}) 2018-05-25 06:34:35,525 - functest.opnfv_tests.openstack.vping.vping_userdata - INFO - Begin virtual environment setup 2018-05-25 06:34:35,525 - functest.opnfv_tests.openstack.vping.vping_userdata - INFO - vPing Start Time:'2018-05-25 06:34:35' 2018-05-25 06:34:35,525 - functest.opnfv_tests.openstack.vping.vping_userdata - INFO - Creating image with name: 'functest-vping--3c5cf7d5-54bc-477f-960a-04cb9f119225' 2018-05-25 06:34:35,525 - functest.opnfv_tests.openstack.vping.vping_userdata - INFO - Image metadata: None 2018-05-25 06:34:36,521 - functest.opnfv_tests.openstack.vping.vping_userdata - DEBUG - image: Munch({u'status': u'queued', u'owner_specified.shade.object': u'images/functest-vping--3c5cf7d5-54bc-477f-960a-04cb9f119225', u'tags': [], u'container_format': u'bare', u'min_ram': 0, u'updated_at': u'2018-05-25T06:34:35Z', u'owner_specified.shade.sha256': u'a8dd75ecffd4cdd96072d60c2237b448e0c8b2bc94d57f10fdbc8c481d9005b8', u'locations': [], u'min_disk': 0, u'visibility': u'shared', u'file': u'/v2/images/914afe03-14b0-4fea-bd5c-f45dcc675518/file', u'owner': u'51534bd63d854b6c878cd0603da66c99', u'virtual_size': None, u'owner_specified.shade.md5': u'443b7623e27ecf03dc9e01ee93f67afe', u'id': u'914afe03-14b0-4fea-bd5c-f45dcc675518', u'size': None, u'name': u'functest-vping--3c5cf7d5-54bc-477f-960a-04cb9f119225', u'checksum': None, u'self': u'/v2/images/914afe03-14b0-4fea-bd5c-f45dcc675518', u'disk_format': u'qcow2', u'protected': False, u'created_at': u'2018-05-25T06:34:35Z', u'schema': u'/v2/schemas/image'}) 2018-05-25 06:34:36,521 - functest.opnfv_tests.openstack.vping.vping_userdata - INFO - Creating network with name: 'vping-net-3c5cf7d5-54bc-477f-960a-04cb9f119225' 2018-05-25 06:34:36,851 - functest.opnfv_tests.openstack.vping.vping_userdata - DEBUG - network: Munch({u'status': u'ACTIVE', u'subnets': [], u'description': u'', u'provider:physical_network': None, u'tags': [], u'ipv6_address_scope': None, u'updated_at': u'2018-05-25T06:34:36Z', u'is_default': False, u'revision_number': 3, u'port_security_enabled': True, u'provider:network_type': u'geneve', u'id': u'9a86ece1-fdda-46ce-b15b-f50eed0cc669', u'provider:segmentation_id': 40, u'router:external': False, u'availability_zone_hints': [], u'availability_zones': [], u'qos_policy_id': None, u'admin_state_up': True, u'name': u'vping-net-3c5cf7d5-54bc-477f-960a-04cb9f119225', u'created_at': u'2018-05-25T06:34:36Z', u'mtu': 1442, u'tenant_id': u'51534bd63d854b6c878cd0603da66c99', u'ipv4_address_scope': None, u'shared': False, u'project_id': u'51534bd63d854b6c878cd0603da66c99'}) 2018-05-25 06:34:37,307 - functest.opnfv_tests.openstack.vping.vping_userdata - DEBUG - subnet: 
Munch({u'description': u'', u'tags': [], u'updated_at': u'2018-05-25T06:34:37Z', u'ipv6_ra_mode': None, u'allocation_pools': [{u'start': u'192.168.130.2', u'end': u'192.168.130.254'}], u'host_routes': [], u'revision_number': 0, u'ipv6_address_mode': None, u'cidr': u'192.168.130.0/24', u'id': u'0c6817c8-5d83-4248-bde9-846c4fc5cdcb', u'subnetpool_id': None, u'service_types': [], u'name': u'vping-subnet-3c5cf7d5-54bc-477f-960a-04cb9f119225', u'enable_dhcp': True, u'network_id': u'9a86ece1-fdda-46ce-b15b-f50eed0cc669', u'tenant_id': u'51534bd63d854b6c878cd0603da66c99', u'created_at': u'2018-05-25T06:34:37Z', u'dns_nameservers': [u'8.8.8.8'], u'gateway_ip': u'192.168.130.1', u'ip_version': 4, u'project_id': u'51534bd63d854b6c878cd0603da66c99'}) 2018-05-25 06:34:37,307 - functest.opnfv_tests.openstack.vping.vping_userdata - INFO - Creating router with name: 'vping-router-3c5cf7d5-54bc-477f-960a-04cb9f119225' 2018-05-25 06:34:38,772 - functest.opnfv_tests.openstack.vping.vping_userdata - DEBUG - router: Munch({u'status': u'ACTIVE', u'external_gateway_info': {u'network_id': u'c6bc58df-e3b9-4bf6-8e71-7467451b945c', u'enable_snat': True, u'external_fixed_ips': [{u'subnet_id': u'c4948979-8bd8-458e-ab3b-52404bb21603', u'ip_address': u'172.30.9.208'}]}, u'description': u'', u'tags': [], u'tenant_id': u'51534bd63d854b6c878cd0603da66c99', u'created_at': u'2018-05-25T06:34:37Z', u'admin_state_up': True, u'updated_at': u'2018-05-25T06:34:38Z', u'revision_number': 2, u'routes': [], u'project_id': u'51534bd63d854b6c878cd0603da66c99', u'id': u'e6ecefa1-907f-48ca-bfde-a46f64e9e448', u'name': u'vping-router-3c5cf7d5-54bc-477f-960a-04cb9f119225'}) 2018-05-25 06:34:40,304 - functest.opnfv_tests.openstack.vping.vping_userdata - INFO - Creating flavor with name: 'vping-flavor-3c5cf7d5-54bc-477f-960a-04cb9f119225' 2018-05-25 06:34:40,514 - functest.opnfv_tests.openstack.vping.vping_userdata - DEBUG - flavor: Munch({'name': u'vping-flavor-3c5cf7d5-54bc-477f-960a-04cb9f119225', 'ephemeral': 0, 'ram': 512, 'is_disabled': False, 'properties': Munch({u'OS-FLV-DISABLED:disabled': False, u'OS-FLV-EXT-DATA:ephemeral': 0, u'os-flavor-access:is_public': True}), u'OS-FLV-DISABLED:disabled': False, 'vcpus': 1, 'extra_specs': Munch({}), 'location': Munch({'project': Munch({'domain_id': None, 'id': u'51534bd63d854b6c878cd0603da66c99', 'name': 'admin', 'domain_name': 'Default'}), 'zone': None, 'region_name': 'regionOne', 'cloud': 'envvars'}), u'os-flavor-access:is_public': True, 'rxtx_factor': 1.0, 'is_public': True, u'OS-FLV-EXT-DATA:ephemeral': 0, 'disk': 1, 'id': u'4872ec2f-0d20-4a55-9741-be26f0da69a7', 'swap': 0}) 2018-05-25 06:34:40,957 - functest.opnfv_tests.openstack.vping.vping_userdata - INFO - Creating VM 1 instance with name: 'opnfv-vping-1-3c5cf7d5-54bc-477f-960a-04cb9f119225' 2018-05-25 06:34:47,553 - functest.opnfv_tests.openstack.vping.vping_userdata - DEBUG - vm1: Munch({'vm_state': u'active', u'OS-EXT-STS:task_state': None, 'addresses': Munch({u'vping-net-3c5cf7d5-54bc-477f-960a-04cb9f119225': [Munch({u'OS-EXT-IPS-MAC:mac_addr': u'fa:16:3e:97:65:97', u'version': 4, u'addr': u'192.168.130.10', u'OS-EXT-IPS:type': u'fixed'})]}), 'terminated_at': None, 'image': Munch({u'id': u'914afe03-14b0-4fea-bd5c-f45dcc675518'}), u'OS-EXT-AZ:availability_zone': u'nova', u'OS-EXT-STS:vm_state': u'active', u'OS-EXT-SRV-ATTR:instance_name': u'instance-0000000d', u'OS-SRV-USG:launched_at': u'2018-05-25T06:27:20.000000', 'flavor': Munch({u'id': u'4872ec2f-0d20-4a55-9741-be26f0da69a7'}), 'az': u'nova', 'id': 
u'ba1a0923-1193-4c8b-ab86-dd8bdd333574', 'security_groups': [Munch({u'name': u'vping-sg-3c5cf7d5-54bc-477f-960a-04cb9f119225'})], u'os-extended-volumes:volumes_attached': [], 'user_id': u'3f8554483ae64176989c964091a3d18b', 'disk_config': u'MANUAL', u'OS-DCF:diskConfig': u'MANUAL', 'networks': {}, 'accessIPv4': '', 'accessIPv6': '', 'location': Munch({'project': Munch({'domain_id': None, 'id': u'51534bd63d854b6c878cd0603da66c99', 'name': 'admin', 'domain_name': 'Default'}), 'zone': u'nova', 'region_name': 'regionOne', 'cloud': 'envvars'}), 'power_state': 1, 'public_v4': '', 'progress': 0, u'OS-EXT-STS:power_state': 1, 'interface_ip': '', 'launched_at': u'2018-05-25T06:27:20.000000', 'metadata': Munch({}), 'status': u'ACTIVE', 'updated': u'2018-05-25T06:34:46Z', 'hostId': u'b04b179bb822edeed16993bea1ea4334fc709be32dfa47bb73e85f9a', u'OS-EXT-SRV-ATTR:host': u'overcloud-novacompute-1.opnfvlf.org', u'OS-SRV-USG:terminated_at': None, 'key_name': None, 'public_v6': '', 'private_v4': u'192.168.130.10', 'cloud': 'envvars', 'host_id': u'b04b179bb822edeed16993bea1ea4334fc709be32dfa47bb73e85f9a', 'task_state': None, 'properties': Munch({u'OS-EXT-STS:task_state': None, u'OS-EXT-SRV-ATTR:host': u'overcloud-novacompute-1.opnfvlf.org', u'OS-SRV-USG:terminated_at': None, u'OS-DCF:diskConfig': u'MANUAL', u'os-extended-volumes:volumes_attached': [], u'OS-EXT-STS:vm_state': u'active', u'OS-EXT-SRV-ATTR:instance_name': u'instance-0000000d', u'OS-SRV-USG:launched_at': u'2018-05-25T06:27:20.000000', u'OS-EXT-SRV-ATTR:hypervisor_hostname': u'overcloud-novacompute-1.opnfvlf.org', u'OS-EXT-STS:power_state': 1, u'OS-EXT-AZ:availability_zone': u'nova'}), 'project_id': u'51534bd63d854b6c878cd0603da66c99', u'OS-EXT-SRV-ATTR:hypervisor_hostname': u'overcloud-novacompute-1.opnfvlf.org', 'name': u'opnfv-vping-1-3c5cf7d5-54bc-477f-960a-04cb9f119225', 'adminPass': u'J9Mx2Wqn6pTQ', 'tenant_id': u'51534bd63d854b6c878cd0603da66c99', 'region': 'regionOne', 'created': u'2018-05-25T06:34:41Z', 'has_config_drive': True, 'volumes': [], 'config_drive': u'True'}) 2018-05-25 06:34:49,749 - functest.opnfv_tests.openstack.vping.vping_userdata - DEBUG - vm1 console: [ 0.000000] Initializing cgroup subsys cpuset [ 0.000000] Initializing cgroup subsys cpu [ 0.000000] Initializing cgroup subsys cpuacct [ 0.000000] Linux version 4.4.0-28-generic (buildd@lcy01-13) (gcc version 5.3.1 20160413 (Ubuntu 5.3.1-14ubuntu2.1) ) #47-Ubuntu SMP Fri Jun 24 10:09:13 UTC 2016 (Ubuntu 4.4.0-28.47-generic 4.4.13) [ 0.000000] Command line: LABEL=cirros-rootfs ro console=tty1 console=ttyS0 [ 0.000000] KERNEL supported cpus: [ 0.000000] Intel GenuineIntel [ 0.000000] AMD AuthenticAMD [ 0.000000] Centaur CentaurHauls [ 0.000000] x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 [ 0.000000] x86/fpu: Supporting XSAVE feature 0x01: 'x87 floating point registers' [ 0.000000] x86/fpu: Supporting XSAVE feature 0x02: 'SSE registers' [ 0.000000] x86/fpu: Supporting XSAVE feature 0x04: 'AVX registers' [ 0.000000] x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format. [ 0.000000] x86/fpu: Using 'eager' FPU context switches. 
[ 0.000000] e820: BIOS-provided physical RAM map: [ 0.000000] BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable [ 0.000000] BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved [ 0.000000] BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved [ 0.000000] BIOS-e820: [mem 0x0000000000100000-0x000000001ffdbfff] usable [ 0.000000] BIOS-e820: [mem 0x000000001ffdc000-0x000000001fffffff] reserved [ 0.000000] BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved [ 0.000000] BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved [ 0.000000] NX (Execute Disable) protection: active [ 0.000000] SMBIOS 2.8 present. [ 0.000000] Hypervisor detected: KVM [ 0.000000] e820: last_pfn = 0x1ffdc max_arch_pfn = 0x400000000 [ 0.000000] x86/PAT: Configuration [0-7]: WB WC UC- UC WB WC UC- WT [ 0.000000] found SMP MP-table at [mem 0x000f72f0-0x000f72ff] mapped at [ffff8800000f72f0] [ 0.000000] Scanning 1 areas for low memory corruption [ 0.000000] Using GB pages for direct mapping [ 0.000000] RAMDISK: [mem 0x1fb16000-0x1ffcbfff] [ 0.000000] ACPI: Early table checksum verification disabled [ 0.000000] ACPI: RSDP 0x00000000000F70A0 000014 (v00 BOCHS ) [ 0.000000] ACPI: RSDT 0x000000001FFE14C9 00002C (v01 BOCHS BXPCRSDT 00000001 BXPC 00000001) [ 0.000000] ACPI: FACP 0x000000001FFE13DD 000074 (v01 BOCHS BXPCFACP 00000001 BXPC 00000001) [ 0.000000] ACPI: DSDT 0x000000001FFE0040 00139D (v01 BOCHS BXPCDSDT 00000001 BXPC 00000001) [ 0.000000] ACPI: FACS 0x000000001FFE0000 000040 [ 0.000000] ACPI: APIC 0x000000001FFE1451 000078 (v01 BOCHS BXPCAPIC 00000001 BXPC 00000001) [ 0.000000] No NUMA configuration found [ 0.000000] Faking a node at [mem 0x0000000000000000-0x000000001ffdbfff] [ 0.000000] NODE_DATA(0) allocated [mem 0x1ffd7000-0x1ffdbfff] [ 0.000000] kvm-clock: Using msrs 4b564d01 and 4b564d00 [ 0.000000] kvm-clock: cpu 0, msr 0:1ffd3001, primary cpu clock [ 0.000000] kvm-clock: using sched offset of 469491788 cycles [ 0.000000] clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns [ 0.000000] Zone ranges: [ 0.000000] DMA [mem 0x0000000000001000-0x0000000000ffffff] [ 0.000000] DMA32 [mem 0x0000000001000000-0x000000001ffdbfff] [ 0.000000] Normal empty [ 0.000000] Device empty [ 0.000000] Movable zone start for each node [ 0.000000] Early memory node ranges [ 0.000000] node 0: [mem 0x0000000000001000-0x000000000009efff] [ 0.000000] node 0: [mem 0x0000000000100000-0x000000001ffdbfff] [ 0.000000] Initmem setup node 0 [mem 0x0000000000001000-0x000000001ffdbfff] [ 0.000000] ACPI: PM-Timer IO Port: 0x608 [ 0.000000] ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) [ 0.000000] IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) [ 0.000000] Using ACPI (MADT) for SMP configuration information [ 0.000000] smpboot: Allowing 1 CPUs, 0 hotplug CPUs [ 0.000000] PM: Registered nosave memory: [mem 0x00000000-0x00000fff] [ 0.000000] PM: Registered nosave memory: [mem 0x0009f000-0x0009ffff] [ 0.000000] PM: Registered nosave memory: [mem 0x000a0000-0x000effff] [ 0.000000] PM: Registered nosave memory: [mem 0x000f0000-0x000fffff] [ 0.000000] e820: [mem 
0x20000000-0xfeffbfff] available for PCI devices [ 0.000000] Booting paravirtualized kernel on KVM [ 0.000000] clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 7645519600211568 ns [ 0.000000] setup_percpu: NR_CPUS:256 nr_cpumask_bits:256 nr_cpu_ids:1 nr_node_ids:1 [ 0.000000] PERCPU: Embedded 33 pages/cpu @ffff88001f800000 s98008 r8192 d28968 u2097152 [ 0.000000] KVM setup async PF for cpu 0 [ 0.000000] kvm-stealtime: cpu 0, msr 1f80d940 [ 0.000000] PV qspinlock hash table entries: 256 (order: 0, 4096 bytes) [ 0.000000] Built 1 zonelists in Node order, mobility grouping on. Total pages: 128869 [ 0.000000] Policy zone: DMA32 [ 0.000000] Kernel command line: LABEL=cirros-rootfs ro console=tty1 console=ttyS0 [ 0.000000] PID hash table entries: 2048 (order: 2, 16384 bytes) [ 0.000000] Memory: 491796K/523752K available (8368K kernel code, 1280K rwdata, 3928K rodata, 1480K init, 1292K bss, 31956K reserved, 0K cma-reserved) [ 0.000000] SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=1, Nodes=1 [ 0.000000] Hierarchical RCU implementation. [ 0.000000] Build-time adjustment of leaf fanout to 64. [ 0.000000] RCU restricting CPUs from NR_CPUS=256 to nr_cpu_ids=1. [ 0.000000] RCU: Adjusting geometry for rcu_fanout_leaf=64, nr_cpu_ids=1 [ 0.000000] NR_IRQS:16640 nr_irqs:256 16 [ 0.000000] Console: colour VGA+ 80x25 [ 0.000000] console [tty1] enabled [ 0.000000] console [ttyS0] enabled [ 0.000000] tsc: Detected 3491.910 MHz processor [ 0.127217] Calibrating delay loop (skipped) preset value.. 6983.82 BogoMIPS (lpj=13967640) [ 0.128847] pid_max: default: 32768 minimum: 301 [ 0.129718] ACPI: Core revision 20150930 [ 0.131081] ACPI: 1 ACPI AML tables successfully acquired and loaded [ 0.132352] Security Framework initialized [ 0.133145] Yama: becoming mindful. [ 0.133854] AppArmor: AppArmor initialized [ 0.134702] Dentry cache hash table entries: 65536 (order: 7, 524288 bytes) [ 0.136000] Inode-cache hash table entries: 32768 (order: 6, 262144 bytes) [ 0.137239] Mount-cache hash table entries: 1024 (order: 1, 8192 bytes) [ 0.138419] Mountpoint-cache hash table entries: 1024 (order: 1, 8192 bytes) [ 0.139770] Initializing cgroup subsys io [ 0.140557] Initializing cgroup subsys memory [ 0.141393] Initializing cgroup subsys devices [ 0.142238] Initializing cgroup subsys freezer [ 0.143091] Initializing cgroup subsys net_cls [ 0.143939] Initializing cgroup subsys perf_event [ 0.144824] Initializing cgroup subsys net_prio [ 0.145683] Initializing cgroup subsys hugetlb [ 0.146534] Initializing cgroup subsys pids [ 0.147389] CPU: Physical Processor ID: 0 [ 0.148884] mce: CPU supports 10 MCE banks [ 0.149709] Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 [ 0.150690] Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 [ 0.161713] Freeing SMP alternatives memory: 28K (ffffffff820b4000 - ffffffff820bb000) [ 0.168043] ftrace: allocating 31920 entries in 125 pages [ 0.194246] smpboot: Max logical packages: 1 [ 0.195080] smpboot: APIC(0) Converting physical 0 to logical package 0 [ 0.196397] x2apic enabled [ 0.197194] Switched APIC routing to physical x2apic. [ 0.198929] ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 [ 0.200011] smpboot: CPU0: Intel Core Processor (Haswell, no TSX) (family: 0x6, model: 0x3c, stepping: 0x1) [ 0.201943] Performance Events: unsupported p6 CPU model 60 no PMU driver, software events only. 
[ 0.203671] KVM setup paravirtual spinlock [ 0.204935] x86: Booted up 1 node, 1 CPUs [ 0.205720] smpboot: Total of 1 processors activated (6983.82 BogoMIPS) [ 0.207066] devtmpfs: initialized [ 0.208762] evm: security.selinux [ 0.209442] evm: security.SMACK64 [ 0.210117] evm: security.SMACK64EXEC [ 0.210839] evm: security.SMACK64TRANSMUTE [ 0.211623] evm: security.SMACK64MMAP [ 0.212341] evm: security.ima [ 0.212968] evm: security.capability [ 0.213753] clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 7645041785100000 ns [ 0.215541] pinctrl core: initialized pinctrl subsystem [ 0.216602] RTC time: 6:27:20, date: 05/25/18 [ 0.217537] NET: Registered protocol family 16 [ 0.218485] cpuidle: using governor ladder [ 0.219281] cpuidle: using governor menu [ 0.220042] PCCT header not found. [ 0.220769] ACPI: bus type PCI registered [ 0.221552] acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 [ 0.222761] PCI: Using configuration type 1 for base access [ 0.224477] ACPI: Added _OSI(Module Device) [ 0.225294] ACPI: Added _OSI(Processor Device) [ 0.226135] ACPI: Added _OSI(3.0 _SCP Extensions) [ 0.227011] ACPI: Added _OSI(Processor Aggregator Device) [ 0.229126] ACPI: Interpreter enabled [ 0.229873] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S1_] (20150930/hwxface-580) [ 0.231672] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S2_] (20150930/hwxface-580) [ 0.233455] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S3_] (20150930/hwxface-580) [ 0.235249] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S4_] (20150930/hwxface-580) [ 0.237039] ACPI: (supports S0 S5) [ 0.237724] ACPI: Using IOAPIC for interrupt routing [ 0.238644] PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug [ 0.241805] ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) [ 0.242905] acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI] [ 0.244093] acpi PNP0A03:00: _OSC failed (AE_NOT_FOUND); disabling ASPM [ 0.245248] acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended PCI configuration space under this bridge. 
[ 0.247396] acpiphp: Slot [3] registered [ 0.248177] acpiphp: Slot [4] registered [ 0.248952] acpiphp: Slot [5] registered [ 0.249735] acpiphp: Slot [6] registered [ 0.250507] acpiphp: Slot [7] registered [ 0.251282] acpiphp: Slot [8] registered [ 0.252057] acpiphp: Slot [9] registered [ 0.252829] acpiphp: Slot [10] registered [ 0.253613] acpiphp: Slot [11] registered [ 0.254403] acpiphp: Slot [12] registered [ 0.255198] acpiphp: Slot [13] registered [ 0.255985] acpiphp: Slot [14] registered [ 0.256770] acpiphp: Slot [15] registered [ 0.267545] acpiphp: Slot [16] registered [ 0.268346] acpiphp: Slot [17] registered [ 0.269139] acpiphp: Slot [18] registered [ 0.269924] acpiphp: Slot [19] registered [ 0.270708] acpiphp: Slot [20] registered [ 0.271489] acpiphp: Slot [21] registered [ 0.272283] acpiphp: Slot [22] registered [ 0.273071] acpiphp: Slot [23] registered [ 0.273856] acpiphp: Slot [24] registered [ 0.274642] acpiphp: Slot [25] registered [ 0.275424] acpiphp: Slot [26] registered [ 0.276208] acpiphp: Slot [27] registered [ 0.277002] acpiphp: Slot [28] registered [ 0.277793] acpiphp: Slot [29] registered [ 0.278581] acpiphp: Slot [30] registered [ 0.279368] acpiphp: Slot [31] registered [ 0.280152] PCI host bridge to bus 0000:00 [ 0.280940] pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] [ 0.282112] pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] [ 0.283282] pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] [ 0.284687] pci_bus 0000:00: root bus resource [mem 0x20000000-0xfebfffff window] [ 0.286090] pci_bus 0000:00: root bus resource [bus 00-ff] [ 0.292349] pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] [ 0.293585] pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6] [ 0.294719] pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] [ 0.295942] pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376] [ 0.303305] pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI [ 0.304697] pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB [ 0.355125] ACPI: PCI Interrupt Link [LNKA] (IRQs 5 *10 11) [ 0.356560] ACPI: PCI Interrupt Link [LNKB] (IRQs 5 *10 11) [ 0.357958] ACPI: PCI Interrupt Link [LNKC] (IRQs 5 10 *11) [ 0.359340] ACPI: PCI Interrupt Link [LNKD] (IRQs 5 10 *11) [ 0.360691] ACPI: PCI Interrupt Link [LNKS] (IRQs *9) [ 0.361912] ACPI: Enabled 2 GPEs in block 00 to 0F [ 0.363028] vgaarb: setting as boot device: PCI:0000:00:02.0 [ 0.364052] vgaarb: device added: PCI:0000:00:02.0,decodes=io+mem,owns=io+mem,locks=none [ 0.365550] vgaarb: loaded [ 0.366141] vgaarb: bridge control possible 0000:00:02.0 [ 0.367250] SCSI subsystem initialized [ 0.368050] ACPI: bus type USB registered [ 0.368846] usbcore: registered new interface driver usbfs [ 0.369852] usbcore: registered new interface driver hub [ 0.370832] usbcore: registered new device driver usb [ 0.371859] PCI: Using ACPI for IRQ routing [ 0.372859] NetLabel: Initializing [ 0.373550] NetLabel: domain hash size = 128 [ 0.374383] NetLabel: protocols = UNLABELED CIPSOv4 [ 0.375306] NetLabel: unlabeled traffic allowed by default [ 0.376369] clocksource: Switched to clocksource kvm-clock [ 0.381909] AppArmor: AppArmor Filesystem Enabled [ 0.382843] pnp: PnP ACPI init [ 0.383744] pnp: PnP ACPI: found 5 devices [ 0.390095] clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns [ 0.391750] NET: Registered protocol family 2 [ 0.392685] TCP established hash table entries: 4096 (order: 3, 32768 bytes) [ 0.393912] 
TCP bind hash table entries: 4096 (order: 4, 65536 bytes) [ 0.395041] TCP: Hash tables configured (established 4096 bind 4096) [ 0.396158] UDP hash table entries: 256 (order: 1, 8192 bytes) [ 0.397203] UDP-Lite hash table entries: 256 (order: 1, 8192 bytes) [ 0.398342] NET: Registered protocol family 1 [ 0.399188] pci 0000:00:00.0: Limiting direct PCI/PCI transfers [ 0.400246] pci 0000:00:01.0: PIIX3: Enabling Passive Release [ 0.401288] pci 0000:00:01.0: Activating ISA DMA hang workarounds [ 0.415321] ACPI: PCI Interrupt Link [LNKD] enabled at IRQ 11 [ 0.429567] Trying to unpack rootfs image as initramfs... [ 0.480100] Freeing initrd memory: 4824K (ffff88001fb16000 - ffff88001ffcc000) [ 0.493867] Scanning for low memory corruption every 60 seconds [ 0.495196] futex hash table entries: 256 (order: 2, 16384 bytes) [ 0.496327] audit: initializing netlink subsys (disabled) [ 0.497342] audit: type=2000 audit(1527229641.191:1): initialized [ 0.498634] Initialise system trusted keyring [ 0.499539] HugeTLB registered 1 GB page size, pre-allocated 0 pages [ 0.500695] HugeTLB registered 2 MB page size, pre-allocated 0 pages [ 0.502632] zbud: loaded [ 0.503319] VFS: Disk quotas dquot_6.6.0 [ 0.504126] VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) [ 0.505616] fuse init (API version 7.23) [ 0.506493] Key type big_key registered [ 0.507280] Allocating IMA MOK and blacklist keyrings. [ 0.508752] Key type asymmetric registered [ 0.509559] Asymmetric key parser 'x509' registered [ 0.510493] Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) [ 0.511932] io scheduler noop registered [ 0.512720] io scheduler deadline registered (default) [ 0.513692] io scheduler cfq registered [ 0.514506] pci_hotplug: PCI Hot Plug PCI Core version: 0.5 [ 0.515535] pciehp: PCI Express Hot Plug Controller Driver version: 0.4 [ 0.516787] input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0 [ 0.518213] ACPI: Power Button [PWRF] [ 0.519028] GHES: HEST is not enabled! [ 0.532690] ACPI: PCI Interrupt Link [LNKC] enabled at IRQ 10 [ 0.562265] ACPI: PCI Interrupt Link [LNKA] enabled at IRQ 10 [ 0.564481] Serial: 8250/16550 driver, 32 ports, IRQ sharing enabled [ 0.588478] 00:04: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A [ 0.590564] Linux agpgart interface v0.103 [ 0.592542] brd: module loaded [ 0.593706] loop: module loaded [ 0.597114] GPT:Primary header thinks Alt. header is not at the end of the disk. [ 0.598530] GPT:90111 != 2097151 [ 0.599191] GPT:Alternate GPT header not at the end of the disk. [ 0.600251] GPT:90111 != 2097151 [ 0.600920] GPT: Use GNU Parted to correct GPT errors. 
[ 0.601862] vda: vda1 vda15 [ 0.603195] scsi host0: ata_piix [ 0.603916] scsi host1: ata_piix [ 0.604614] ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc0a0 irq 14 [ 0.605828] ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc0a8 irq 15 [ 0.607234] libphy: Fixed MDIO Bus: probed [ 0.608034] tun: Universal TUN/TAP device driver, 1.6 [ 0.608963] tun: (C) 1999-2004 Max Krasnyansky [ 0.611285] PPP generic driver version 2.4.2 [ 0.612426] ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver [ 0.615383] ehci-pci: EHCI PCI platform driver [ 0.616477] ehci-platform: EHCI generic platform driver [ 0.617695] ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver [ 0.619078] ohci-pci: OHCI PCI platform driver [ 0.620135] ohci-platform: OHCI generic platform driver [ 0.621355] uhci_hcd: USB Universal Host Controller Interface driver [ 0.636232] uhci_hcd 0000:00:01.2: UHCI Host Controller [ 0.637439] uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1 [ 0.639200] uhci_hcd 0000:00:01.2: detected 2 ports [ 0.640397] uhci_hcd 0000:00:01.2: irq 11, io base 0x0000c040 [ 0.641745] usb usb1: New USB device found, idVendor=1d6b, idProduct=0001 [ 0.643230] usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1 [ 0.644931] usb usb1: Product: UHCI Host Controller [ 0.646328] usb usb1: Manufacturer: Linux 4.4.0-28-generic uhci_hcd [ 0.647738] usb usb1: SerialNumber: 0000:00:01.2 [ 0.648874] hub 1-0:1.0: USB hub found [ 0.649794] hub 1-0:1.0: 2 ports detected [ 0.650848] i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 [ 0.653260] serio: i8042 KBD port at 0x60,0x64 irq 1 [ 0.654385] serio: i8042 AUX port at 0x60,0x64 irq 12 [ 0.655592] mousedev: PS/2 mouse device common for all mice [ 0.657079] input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1 [ 0.659269] rtc_cmos 00:00: RTC can wake from S4 [ 0.660561] rtc_cmos 00:00: rtc core: registered rtc_cmos as rtc0 [ 0.661984] rtc_cmos 00:00: alarms up to one day, y3k, 114 bytes nvram [ 0.663442] i2c /dev entries driver [ 0.664319] device-mapper: uevent: version 1.0.3 [ 0.665415] device-mapper: ioctl: 4.34.0-ioctl (2015-10-28) initialised: dm-devel@redhat.com [ 0.667369] ledtrig-cpu: registered to indicate activity on CPUs [ 0.668877] NET: Registered protocol family 10 [ 0.670068] NET: Registered protocol family 17 [ 0.671132] Key type dns_resolver registered [ 0.672220] microcode: CPU0 sig=0x306c1, pf=0x1, revision=0x1 [ 0.673582] microcode: Microcode Update Driver: v2.01 , Peter Oruba [ 0.675749] registered taskstats version 1 [ 0.676745] Loading compiled-in X.509 certificates [ 0.678347] Loaded X.509 cert 'Build time autogenerated kernel key: 6ea974e07bd0b30541f4d838a3b7a8a80d5ca9af' [ 0.680598] zswap: loaded using pool lzo/zbud [ 0.682169] Key type trusted registered [ 0.683858] Key type encrypted registered [ 0.684850] AppArmor: AppArmor sha1 policy hashing enabled [ 0.686114] ima: No TPM chip found, activating TPM-bypass! [ 0.687361] evm: HMAC attrs: 0x1 [ 0.688844] Magic number: 2:433:469 [ 0.689842] rtc_cmos 00:00: setting system clock to 2018-05-25 06:27:21 UTC (1527229641) [ 0.691722] BIOS EDD facility v0.16 2004-Jun-25, 0 devices found [ 0.693100] EDD information not available. 
[ 0.761218] ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 [ 0.762890] ata1.00: configured for MWDMA2 [ 0.764292] scsi 0:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 [ 0.776920] sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray [ 0.778414] cdrom: Uniform CD-ROM driver Revision: 3.20 [ 0.779870] sr 0:0:0:0: Attached scsi generic sg0 type 5 [ 0.781939] Freeing unused kernel memory: 1480K (ffffffff81f42000 - ffffffff820b4000) [ 0.783779] Write protecting the kernel read-only data: 14336k [ 0.785572] Freeing unused kernel memory: 1860K (ffff88000182f000 - ffff880001a00000) [ 0.787703] Freeing unused kernel memory: 168K (ffff880001dd6000 - ffff880001e00000) info: initramfs: up at 0.62 modprobe: module virtio_pci not found in modules.dep modprobe: module virtio_blk not found in modules.dep modprobe: module virtio_net not found in modules.dep modprobe: module vfat not found in modules.dep modprobe: module nls_cp437 not found in modules.dep info: copying initramfs to /dev/vda1 info: initramfs loading root from /dev/vda1 info: /etc/init.d/rc.sysinit: up at 2.46 info: container: none Starting logging: OK modprobe: module virtio_pci not found in modules.dep modprobe: module virtio_blk not found in modules.dep modprobe: module virtio_net not found in modules.dep modprobe: module vfat not found in modules.dep modprobe: module nls_cp437 not found in modules.dep WARN: /etc/rc3.d/S10-load-modules failed Initializing random number generator... [ 2.677763] random: dd urandom read with 13 bits of entropy available done. Starting acpid: OK Starting network... udhcpc (v1.23.2) started Sending discover... Sending select for 192.168.130.10... Lease of 192.168.130.10 obtained, lease time 43200 Top of dropbear init script Starting dropbear sshd: 2018-05-25 06:34:49,750 - functest.opnfv_tests.openstack.vping.vping_userdata - INFO - Creating VM 2 instance with name: 'opnfv-vping-2-userdata--3c5cf7d5-54bc-477f-960a-04cb9f119225' 2018-05-25 06:34:57,819 - functest.opnfv_tests.openstack.vping.vping_userdata - DEBUG - vm2: Munch({'vm_state': u'active', u'OS-EXT-STS:task_state': None, 'addresses': Munch({u'vping-net-3c5cf7d5-54bc-477f-960a-04cb9f119225': [Munch({u'OS-EXT-IPS-MAC:mac_addr': u'fa:16:3e:5d:3a:9e', u'version': 4, u'addr': u'192.168.130.8', u'OS-EXT-IPS:type': u'fixed'})]}), 'terminated_at': None, 'image': Munch({u'id': u'914afe03-14b0-4fea-bd5c-f45dcc675518'}), u'OS-EXT-AZ:availability_zone': u'nova', u'OS-EXT-STS:vm_state': u'active', u'OS-EXT-SRV-ATTR:instance_name': u'instance-0000000e', u'OS-SRV-USG:launched_at': u'2018-05-25T06:27:23.000000', 'flavor': Munch({u'id': u'4872ec2f-0d20-4a55-9741-be26f0da69a7'}), 'az': u'nova', 'id': u'e3eec34c-02fb-486f-b6e1-0a1a81ad8868', 'security_groups': [Munch({u'name': u'default'})], u'os-extended-volumes:volumes_attached': [], 'user_id': u'3f8554483ae64176989c964091a3d18b', 'disk_config': u'MANUAL', u'OS-DCF:diskConfig': u'MANUAL', 'networks': {}, 'accessIPv4': '', 'accessIPv6': '', 'location': Munch({'project': Munch({'domain_id': None, 'id': u'51534bd63d854b6c878cd0603da66c99', 'name': 'admin', 'domain_name': 'Default'}), 'zone': u'nova', 'region_name': 'regionOne', 'cloud': 'envvars'}), 'power_state': 1, 'public_v4': '', 'progress': 0, u'OS-EXT-STS:power_state': 1, 'interface_ip': '', 'launched_at': u'2018-05-25T06:27:23.000000', 'metadata': Munch({}), 'status': u'ACTIVE', 'updated': u'2018-05-25T06:34:57Z', 'hostId': u'd27fc3f0d3057d1c48851918bfc2a1dff425f6c67f06bbabc2f8d673', u'OS-EXT-SRV-ATTR:host': u'overcloud-novacompute-0.opnfvlf.org', 
u'OS-SRV-USG:terminated_at': None, 'key_name': None, 'public_v6': '', 'private_v4': u'192.168.130.8', 'cloud': 'envvars', 'host_id': u'd27fc3f0d3057d1c48851918bfc2a1dff425f6c67f06bbabc2f8d673', 'task_state': None, 'properties': Munch({u'OS-EXT-STS:task_state': None, u'OS-EXT-SRV-ATTR:host': u'overcloud-novacompute-0.opnfvlf.org', u'OS-SRV-USG:terminated_at': None, u'OS-DCF:diskConfig': u'MANUAL', u'os-extended-volumes:volumes_attached': [], u'OS-EXT-STS:vm_state': u'active', u'OS-EXT-SRV-ATTR:instance_name': u'instance-0000000e', u'OS-SRV-USG:launched_at': u'2018-05-25T06:27:23.000000', u'OS-EXT-SRV-ATTR:hypervisor_hostname': u'overcloud-novacompute-0.opnfvlf.org', u'OS-EXT-STS:power_state': 1, u'OS-EXT-AZ:availability_zone': u'nova'}), 'project_id': u'51534bd63d854b6c878cd0603da66c99', u'OS-EXT-SRV-ATTR:hypervisor_hostname': u'overcloud-novacompute-0.opnfvlf.org', 'name': u'opnfv-vping-2-userdata--3c5cf7d5-54bc-477f-960a-04cb9f119225', 'adminPass': u'2EeRSJTJh52U', 'tenant_id': u'51534bd63d854b6c878cd0603da66c99', 'region': 'regionOne', 'created': u'2018-05-25T06:34:50Z', 'has_config_drive': True, 'volumes': [], 'config_drive': u'True'}) 2018-05-25 06:34:59,535 - functest.opnfv_tests.openstack.vping.vping_userdata - DEBUG - vm2 console: [ 0.000000] Initializing cgroup subsys cpuset [ 0.000000] Initializing cgroup subsys cpu [ 0.000000] Initializing cgroup subsys cpuacct [ 0.000000] Linux version 4.4.0-28-generic (buildd@lcy01-13) (gcc version 5.3.1 20160413 (Ubuntu 5.3.1-14ubuntu2.1) ) #47-Ubuntu SMP Fri Jun 24 10:09:13 UTC 2016 (Ubuntu 4.4.0-28.47-generic 4.4.13) [ 0.000000] Command line: LABEL=cirros-rootfs ro console=tty1 console=ttyS0 [ 0.000000] KERNEL supported cpus: [ 0.000000] Intel GenuineIntel [ 0.000000] AMD AuthenticAMD [ 0.000000] Centaur CentaurHauls [ 0.000000] x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 [ 0.000000] x86/fpu: Supporting XSAVE feature 0x01: 'x87 floating point registers' [ 0.000000] x86/fpu: Supporting XSAVE feature 0x02: 'SSE registers' [ 0.000000] x86/fpu: Supporting XSAVE feature 0x04: 'AVX registers' [ 0.000000] x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format. [ 0.000000] x86/fpu: Using 'eager' FPU context switches. [ 0.000000] e820: BIOS-provided physical RAM map: [ 0.000000] BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable [ 0.000000] BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved [ 0.000000] BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved [ 0.000000] BIOS-e820: [mem 0x0000000000100000-0x000000001ffdbfff] usable [ 0.000000] BIOS-e820: [mem 0x000000001ffdc000-0x000000001fffffff] reserved [ 0.000000] BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved [ 0.000000] BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved [ 0.000000] NX (Execute Disable) protection: active [ 0.000000] SMBIOS 2.8 present. 
[ 0.000000] Hypervisor detected: KVM [ 0.000000] e820: last_pfn = 0x1ffdc max_arch_pfn = 0x400000000 [ 0.000000] x86/PAT: Configuration [0-7]: WB WC UC- UC WB WC UC- WT [ 0.000000] found SMP MP-table at [mem 0x000f72f0-0x000f72ff] mapped at [ffff8800000f72f0] [ 0.000000] Scanning 1 areas for low memory corruption [ 0.000000] Using GB pages for direct mapping [ 0.000000] RAMDISK: [mem 0x1fb16000-0x1ffcbfff] [ 0.000000] ACPI: Early table checksum verification disabled [ 0.000000] ACPI: RSDP 0x00000000000F70A0 000014 (v00 BOCHS ) [ 0.000000] ACPI: RSDT 0x000000001FFE14C9 00002C (v01 BOCHS BXPCRSDT 00000001 BXPC 00000001) [ 0.000000] ACPI: FACP 0x000000001FFE13DD 000074 (v01 BOCHS BXPCFACP 00000001 BXPC 00000001) [ 0.000000] ACPI: DSDT 0x000000001FFE0040 00139D (v01 BOCHS BXPCDSDT 00000001 BXPC 00000001) [ 0.000000] ACPI: FACS 0x000000001FFE0000 000040 [ 0.000000] ACPI: APIC 0x000000001FFE1451 000078 (v01 BOCHS BXPCAPIC 00000001 BXPC 00000001) [ 0.000000] No NUMA configuration found [ 0.000000] Faking a node at [mem 0x0000000000000000-0x000000001ffdbfff] [ 0.000000] NODE_DATA(0) allocated [mem 0x1ffd7000-0x1ffdbfff] [ 0.000000] kvm-clock: Using msrs 4b564d01 and 4b564d00 [ 0.000000] kvm-clock: cpu 0, msr 0:1ffd3001, primary cpu clock [ 0.000000] kvm-clock: using sched offset of 469491788 cycles [ 0.000000] clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns [ 0.000000] Zone ranges: [ 0.000000] DMA [mem 0x0000000000001000-0x0000000000ffffff] [ 0.000000] DMA32 [mem 0x0000000001000000-0x000000001ffdbfff] [ 0.000000] Normal empty [ 0.000000] Device empty [ 0.000000] Movable zone start for each node [ 0.000000] Early memory node ranges [ 0.000000] node 0: [mem 0x0000000000001000-0x000000000009efff] [ 0.000000] node 0: [mem 0x0000000000100000-0x000000001ffdbfff] [ 0.000000] Initmem setup node 0 [mem 0x0000000000001000-0x000000001ffdbfff] [ 0.000000] ACPI: PM-Timer IO Port: 0x608 [ 0.000000] ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) [ 0.000000] IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) [ 0.000000] Using ACPI (MADT) for SMP configuration information [ 0.000000] smpboot: Allowing 1 CPUs, 0 hotplug CPUs [ 0.000000] PM: Registered nosave memory: [mem 0x00000000-0x00000fff] [ 0.000000] PM: Registered nosave memory: [mem 0x0009f000-0x0009ffff] [ 0.000000] PM: Registered nosave memory: [mem 0x000a0000-0x000effff] [ 0.000000] PM: Registered nosave memory: [mem 0x000f0000-0x000fffff] [ 0.000000] e820: [mem 0x20000000-0xfeffbfff] available for PCI devices [ 0.000000] Booting paravirtualized kernel on KVM [ 0.000000] clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 7645519600211568 ns [ 0.000000] setup_percpu: NR_CPUS:256 nr_cpumask_bits:256 nr_cpu_ids:1 nr_node_ids:1 [ 0.000000] PERCPU: Embedded 33 pages/cpu @ffff88001f800000 s98008 r8192 d28968 u2097152 [ 0.000000] KVM setup async PF for cpu 0 [ 0.000000] kvm-stealtime: cpu 0, msr 1f80d940 [ 0.000000] PV qspinlock hash table entries: 256 (order: 0, 4096 bytes) [ 0.000000] Built 1 zonelists in Node order, mobility grouping on. 
Total pages: 128869 [ 0.000000] Policy zone: DMA32 [ 0.000000] Kernel command line: LABEL=cirros-rootfs ro console=tty1 console=ttyS0 [ 0.000000] PID hash table entries: 2048 (order: 2, 16384 bytes) [ 0.000000] Memory: 491796K/523752K available (8368K kernel code, 1280K rwdata, 3928K rodata, 1480K init, 1292K bss, 31956K reserved, 0K cma-reserved) [ 0.000000] SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=1, Nodes=1 [ 0.000000] Hierarchical RCU implementation. [ 0.000000] Build-time adjustment of leaf fanout to 64. [ 0.000000] RCU restricting CPUs from NR_CPUS=256 to nr_cpu_ids=1. [ 0.000000] RCU: Adjusting geometry for rcu_fanout_leaf=64, nr_cpu_ids=1 [ 0.000000] NR_IRQS:16640 nr_irqs:256 16 [ 0.000000] Console: colour VGA+ 80x25 [ 0.000000] console [tty1] enabled [ 0.000000] console [ttyS0] enabled [ 0.000000] tsc: Detected 3491.910 MHz processor [ 0.127217] Calibrating delay loop (skipped) preset value.. 6983.82 BogoMIPS (lpj=13967640) [ 0.128847] pid_max: default: 32768 minimum: 301 [ 0.129718] ACPI: Core revision 20150930 [ 0.131081] ACPI: 1 ACPI AML tables successfully acquired and loaded [ 0.132352] Security Framework initialized [ 0.133145] Yama: becoming mindful. [ 0.133854] AppArmor: AppArmor initialized [ 0.134702] Dentry cache hash table entries: 65536 (order: 7, 524288 bytes) [ 0.136000] Inode-cache hash table entries: 32768 (order: 6, 262144 bytes) [ 0.137239] Mount-cache hash table entries: 1024 (order: 1, 8192 bytes) [ 0.138419] Mountpoint-cache hash table entries: 1024 (order: 1, 8192 bytes) [ 0.139770] Initializing cgroup subsys io [ 0.140557] Initializing cgroup subsys memory [ 0.141393] Initializing cgroup subsys devices [ 0.142238] Initializing cgroup subsys freezer [ 0.143091] Initializing cgroup subsys net_cls [ 0.143939] Initializing cgroup subsys perf_event [ 0.144824] Initializing cgroup subsys net_prio [ 0.145683] Initializing cgroup subsys hugetlb [ 0.146534] Initializing cgroup subsys pids [ 0.147389] CPU: Physical Processor ID: 0 [ 0.148884] mce: CPU supports 10 MCE banks [ 0.149709] Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 [ 0.150690] Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 [ 0.161713] Freeing SMP alternatives memory: 28K (ffffffff820b4000 - ffffffff820bb000) [ 0.168043] ftrace: allocating 31920 entries in 125 pages [ 0.194246] smpboot: Max logical packages: 1 [ 0.195080] smpboot: APIC(0) Converting physical 0 to logical package 0 [ 0.196397] x2apic enabled [ 0.197194] Switched APIC routing to physical x2apic. [ 0.198929] ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 [ 0.200011] smpboot: CPU0: Intel Core Processor (Haswell, no TSX) (family: 0x6, model: 0x3c, stepping: 0x1) [ 0.201943] Performance Events: unsupported p6 CPU model 60 no PMU driver, software events only. 
[ 0.203671] KVM setup paravirtual spinlock [ 0.204935] x86: Booted up 1 node, 1 CPUs [ 0.205720] smpboot: Total of 1 processors activated (6983.82 BogoMIPS) [ 0.207066] devtmpfs: initialized [ 0.208762] evm: security.selinux [ 0.209442] evm: security.SMACK64 [ 0.210117] evm: security.SMACK64EXEC [ 0.210839] evm: security.SMACK64TRANSMUTE [ 0.211623] evm: security.SMACK64MMAP [ 0.212341] evm: security.ima [ 0.212968] evm: security.capability [ 0.213753] clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 7645041785100000 ns [ 0.215541] pinctrl core: initialized pinctrl subsystem [ 0.216602] RTC time: 6:27:20, date: 05/25/18 [ 0.217537] NET: Registered protocol family 16 [ 0.218485] cpuidle: using governor ladder [ 0.219281] cpuidle: using governor menu [ 0.220042] PCCT header not found. [ 0.220769] ACPI: bus type PCI registered [ 0.221552] acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 [ 0.222761] PCI: Using configuration type 1 for base access [ 0.224477] ACPI: Added _OSI(Module Device) [ 0.225294] ACPI: Added _OSI(Processor Device) [ 0.226135] ACPI: Added _OSI(3.0 _SCP Extensions) [ 0.227011] ACPI: Added _OSI(Processor Aggregator Device) [ 0.229126] ACPI: Interpreter enabled [ 0.229873] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S1_] (20150930/hwxface-580) [ 0.231672] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S2_] (20150930/hwxface-580) [ 0.233455] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S3_] (20150930/hwxface-580) [ 0.235249] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S4_] (20150930/hwxface-580) [ 0.237039] ACPI: (supports S0 S5) [ 0.237724] ACPI: Using IOAPIC for interrupt routing [ 0.238644] PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug [ 0.241805] ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) [ 0.242905] acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI] [ 0.244093] acpi PNP0A03:00: _OSC failed (AE_NOT_FOUND); disabling ASPM [ 0.245248] acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended PCI configuration space under this bridge. 
[ 0.247396] acpiphp: Slot [3] registered [ 0.248177] acpiphp: Slot [4] registered [ 0.248952] acpiphp: Slot [5] registered [ 0.249735] acpiphp: Slot [6] registered [ 0.250507] acpiphp: Slot [7] registered [ 0.251282] acpiphp: Slot [8] registered [ 0.252057] acpiphp: Slot [9] registered [ 0.252829] acpiphp: Slot [10] registered [ 0.253613] acpiphp: Slot [11] registered [ 0.254403] acpiphp: Slot [12] registered [ 0.255198] acpiphp: Slot [13] registered [ 0.255985] acpiphp: Slot [14] registered [ 0.256770] acpiphp: Slot [15] registered [ 0.267545] acpiphp: Slot [16] registered [ 0.268346] acpiphp: Slot [17] registered [ 0.269139] acpiphp: Slot [18] registered [ 0.269924] acpiphp: Slot [19] registered [ 0.270708] acpiphp: Slot [20] registered [ 0.271489] acpiphp: Slot [21] registered [ 0.272283] acpiphp: Slot [22] registered [ 0.273071] acpiphp: Slot [23] registered [ 0.273856] acpiphp: Slot [24] registered [ 0.274642] acpiphp: Slot [25] registered [ 0.275424] acpiphp: Slot [26] registered [ 0.276208] acpiphp: Slot [27] registered [ 0.277002] acpiphp: Slot [28] registered [ 0.277793] acpiphp: Slot [29] registered [ 0.278581] acpiphp: Slot [30] registered [ 0.279368] acpiphp: Slot [31] registered [ 0.280152] PCI host bridge to bus 0000:00 [ 0.280940] pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] [ 0.282112] pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] [ 0.283282] pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] [ 0.284687] pci_bus 0000:00: root bus resource [mem 0x20000000-0xfebfffff window] [ 0.286090] pci_bus 0000:00: root bus resource [bus 00-ff] [ 0.292349] pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] [ 0.293585] pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6] [ 0.294719] pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] [ 0.295942] pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376] [ 0.303305] pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI [ 0.304697] pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB [ 0.355125] ACPI: PCI Interrupt Link [LNKA] (IRQs 5 *10 11) [ 0.356560] ACPI: PCI Interrupt Link [LNKB] (IRQs 5 *10 11) [ 0.357958] ACPI: PCI Interrupt Link [LNKC] (IRQs 5 10 *11) [ 0.359340] ACPI: PCI Interrupt Link [LNKD] (IRQs 5 10 *11) [ 0.360691] ACPI: PCI Interrupt Link [LNKS] (IRQs *9) [ 0.361912] ACPI: Enabled 2 GPEs in block 00 to 0F [ 0.363028] vgaarb: setting as boot device: PCI:0000:00:02.0 [ 0.364052] vgaarb: device added: PCI:0000:00:02.0,decodes=io+mem,owns=io+mem,locks=none [ 0.365550] vgaarb: loaded [ 0.366141] vgaarb: bridge control possible 0000:00:02.0 [ 0.367250] SCSI subsystem initialized [ 0.368050] ACPI: bus type USB registered [ 0.368846] usbcore: registered new interface driver usbfs [ 0.369852] usbcore: registered new interface driver hub [ 0.370832] usbcore: registered new device driver usb [ 0.371859] PCI: Using ACPI for IRQ routing [ 0.372859] NetLabel: Initializing [ 0.373550] NetLabel: domain hash size = 128 [ 0.374383] NetLabel: protocols = UNLABELED CIPSOv4 [ 0.375306] NetLabel: unlabeled traffic allowed by default [ 0.376369] clocksource: Switched to clocksource kvm-clock [ 0.381909] AppArmor: AppArmor Filesystem Enabled [ 0.382843] pnp: PnP ACPI init [ 0.383744] pnp: PnP ACPI: found 5 devices [ 0.390095] clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns [ 0.391750] NET: Registered protocol family 2 [ 0.392685] TCP established hash table entries: 4096 (order: 3, 32768 bytes) [ 0.393912] 
TCP bind hash table entries: 4096 (order: 4, 65536 bytes) [ 0.395041] TCP: Hash tables configured (established 4096 bind 4096) [ 0.396158] UDP hash table entries: 256 (order: 1, 8192 bytes) [ 0.397203] UDP-Lite hash table entries: 256 (order: 1, 8192 bytes) [ 0.398342] NET: Registered protocol family 1 [ 0.399188] pci 0000:00:00.0: Limiting direct PCI/PCI transfers [ 0.400246] pci 0000:00:01.0: PIIX3: Enabling Passive Release [ 0.401288] pci 0000:00:01.0: Activating ISA DMA hang workarounds [ 0.415321] ACPI: PCI Interrupt Link [LNKD] enabled at IRQ 11 [ 0.429567] Trying to unpack rootfs image as initramfs... [ 0.480100] Freeing initrd memory: 4824K (ffff88001fb16000 - ffff88001ffcc000) [ 0.493867] Scanning for low memory corruption every 60 seconds [ 0.495196] futex hash table entries: 256 (order: 2, 16384 bytes) [ 0.496327] audit: initializing netlink subsys (disabled) [ 0.497342] audit: type=2000 audit(1527229641.191:1): initialized [ 0.498634] Initialise system trusted keyring [ 0.499539] HugeTLB registered 1 GB page size, pre-allocated 0 pages [ 0.500695] HugeTLB registered 2 MB page size, pre-allocated 0 pages [ 0.502632] zbud: loaded [ 0.503319] VFS: Disk quotas dquot_6.6.0 [ 0.504126] VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) [ 0.505616] fuse init (API version 7.23) [ 0.506493] Key type big_key registered [ 0.507280] Allocating IMA MOK and blacklist keyrings. [ 0.508752] Key type asymmetric registered [ 0.509559] Asymmetric key parser 'x509' registered [ 0.510493] Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) [ 0.511932] io scheduler noop registered [ 0.512720] io scheduler deadline registered (default) [ 0.513692] io scheduler cfq registered [ 0.514506] pci_hotplug: PCI Hot Plug PCI Core version: 0.5 [ 0.515535] pciehp: PCI Express Hot Plug Controller Driver version: 0.4 [ 0.516787] input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0 [ 0.518213] ACPI: Power Button [PWRF] [ 0.519028] GHES: HEST is not enabled! [ 0.532690] ACPI: PCI Interrupt Link [LNKC] enabled at IRQ 10 [ 0.562265] ACPI: PCI Interrupt Link [LNKA] enabled at IRQ 10 [ 0.564481] Serial: 8250/16550 driver, 32 ports, IRQ sharing enabled [ 0.588478] 00:04: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A [ 0.590564] Linux agpgart interface v0.103 [ 0.592542] brd: module loaded [ 0.593706] loop: module loaded [ 0.597114] GPT:Primary header thinks Alt. header is not at the end of the disk. [ 0.598530] GPT:90111 != 2097151 [ 0.599191] GPT:Alternate GPT header not at the end of the disk. [ 0.600251] GPT:90111 != 2097151 [ 0.600920] GPT: Use GNU Parted to correct GPT errors. 
[ 0.601862] vda: vda1 vda15 [ 0.603195] scsi host0: ata_piix [ 0.603916] scsi host1: ata_piix [ 0.604614] ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc0a0 irq 14 [ 0.605828] ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc0a8 irq 15 [ 0.607234] libphy: Fixed MDIO Bus: probed [ 0.608034] tun: Universal TUN/TAP device driver, 1.6 [ 0.608963] tun: (C) 1999-2004 Max Krasnyansky [ 0.611285] PPP generic driver version 2.4.2 [ 0.612426] ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver [ 0.615383] ehci-pci: EHCI PCI platform driver [ 0.616477] ehci-platform: EHCI generic platform driver [ 0.617695] ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver [ 0.619078] ohci-pci: OHCI PCI platform driver [ 0.620135] ohci-platform: OHCI generic platform driver [ 0.621355] uhci_hcd: USB Universal Host Controller Interface driver [ 0.636232] uhci_hcd 0000:00:01.2: UHCI Host Controller [ 0.637439] uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1 [ 0.639200] uhci_hcd 0000:00:01.2: detected 2 ports [ 0.640397] uhci_hcd 0000:00:01.2: irq 11, io base 0x0000c040 [ 0.641745] usb usb1: New USB device found, idVendor=1d6b, idProduct=0001 [ 0.643230] usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1 [ 0.644931] usb usb1: Product: UHCI Host Controller [ 0.646328] usb usb1: Manufacturer: Linux 4.4.0-28-generic uhci_hcd [ 0.647738] usb usb1: SerialNumber: 0000:00:01.2 [ 0.648874] hub 1-0:1.0: USB hub found [ 0.649794] hub 1-0:1.0: 2 ports detected [ 0.650848] i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 [ 0.653260] serio: i8042 KBD port at 0x60,0x64 irq 1 [ 0.654385] serio: i8042 AUX port at 0x60,0x64 irq 12 [ 0.655592] mousedev: PS/2 mouse device common for all mice [ 0.657079] input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1 [ 0.659269] rtc_cmos 00:00: RTC can wake from S4 [ 0.660561] rtc_cmos 00:00: rtc core: registered rtc_cmos as rtc0 [ 0.661984] rtc_cmos 00:00: alarms up to one day, y3k, 114 bytes nvram [ 0.663442] i2c /dev entries driver [ 0.664319] device-mapper: uevent: version 1.0.3 [ 0.665415] device-mapper: ioctl: 4.34.0-ioctl (2015-10-28) initialised: dm-devel@redhat.com [ 0.667369] ledtrig-cpu: registered to indicate activity on CPUs [ 0.668877] NET: Registered protocol family 10 [ 0.670068] NET: Registered protocol family 17 [ 0.671132] Key type dns_resolver registered [ 0.672220] microcode: CPU0 sig=0x306c1, pf=0x1, revision=0x1 [ 0.673582] microcode: Microcode Update Driver: v2.01 , Peter Oruba [ 0.675749] registered taskstats version 1 [ 0.676745] Loading compiled-in X.509 certificates [ 0.678347] Loaded X.509 cert 'Build time autogenerated kernel key: 6ea974e07bd0b30541f4d838a3b7a8a80d5ca9af' [ 0.680598] zswap: loaded using pool lzo/zbud [ 0.682169] Key type trusted registered [ 0.683858] Key type encrypted registered [ 0.684850] AppArmor: AppArmor sha1 policy hashing enabled [ 0.686114] ima: No TPM chip found, activating TPM-bypass! [ 0.687361] evm: HMAC attrs: 0x1 [ 0.688844] Magic number: 2:433:469 [ 0.689842] rtc_cmos 00:00: setting system clock to 2018-05-25 06:27:21 UTC (1527229641) [ 0.691722] BIOS EDD facility v0.16 2004-Jun-25, 0 devices found [ 0.693100] EDD information not available. 
[ 0.761218] ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 [ 0.762890] ata1.00: configured for MWDMA2 [ 0.764292] scsi 0:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 [ 0.776920] sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray [ 0.778414] cdrom: Uniform CD-ROM driver Revision: 3.20 [ 0.779870] sr 0:0:0:0: Attached scsi generic sg0 type 5 [ 0.781939] Freeing unused kernel memory: 1480K (ffffffff81f42000 - ffffffff820b4000) [ 0.783779] Write protecting the kernel read-only data: 14336k [ 0.785572] Freeing unused kernel memory: 1860K (ffff88000182f000 - ffff880001a00000) [ 0.787703] Freeing unused kernel memory: 168K (ffff880001dd6000 - ffff880001e00000) info: initramfs: up at 0.62 modprobe: module virtio_pci not found in modules.dep modprobe: module virtio_blk not found in modules.dep modprobe: module virtio_net not found in modules.dep modprobe: module vfat not found in modules.dep modprobe: module nls_cp437 not found in modules.dep info: copying initramfs to /dev/vda1 info: initramfs loading root from /dev/vda1 info: /etc/init.d/rc.sysinit: up at 2.46 info: container: none Starting logging: OK modprobe: module virtio_pci not found in modules.dep modprobe: module virtio_blk not found in modules.dep modprobe: module virtio_net not found in modules.dep modprobe: module vfat not found in modules.dep modprobe: module nls_cp437 not found in modules.dep WARN: /etc/rc3.d/S10-load-modules failed Initializing random number generator... [ 2.677763] random: dd urandom read with 13 bits of entropy available done. Starting acpid: OK Starting network... udhcpc (v1.23.2) started Sending discover... Sending select for 192.168.130.10... Lease of 192.168.130.10 obtained, lease time 43200 Top of dropbear init script Starting dropbear sshd: OK GROWROOT: CHANGED: partition=1 start=18432 old: size=71647 end=90079 new: size=2078687,end=2097119 no userdata for datasource === system information === Platform: RDO OpenStack Compute Container: none Arch: x86_64 CPU(s): 1 @ 3491.910 MHz Cores/Sockets/Threads: 1/1/1 Virt-type: RAM Size: 488MB Disks: NAME MAJ:MIN SIZE LABEL MOUNTPOINT sr0 11:0 468992 config-2 vda 253:0 1073741824 vda1 253:1 1064287744 cirros-rootfs / vda15 253:15 8388608 === sshd host keys === -----BEGIN SSH HOST KEY KEYS----- ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQCZej2JsLmyuMEXcnR6EklWWL4BGizAEGiU03M0dvDXbGREvn5G7dwhBfr8FLIJ9uFYdILrJMeW53TR7kX3u6ROqMoSfMA5OZrUjzsYVCV21XJm/6um7kSBkKwHYsbnbP67PhAuawnGiEJ81kmnoQyg7dorXhZQh9tRs3iY137OSGRxqf0ctGsOnW/9Hx+NUT23DESeMiRk2UdovfSAv7TyuVl7REDUCzLsr7d7DOlZIYbbgs0gjmD3B97jlWA/2ie0SMl2VALOaiaJXlG8rNzrUg4t+3TzZfITi8baC6BvMfKLv56VlwC2imRxxGZDQWCYntLhsusWdCfyk8T3NRfF root@opnfv-vping-1-3c5cf7d5-54bc-477f-960a-04cb9f119225 ssh-dss AAAAB3NzaC1kc3MAAACBALGrdj3Y6EynLLY142munawtpuCxdhXrXJZoi+trUpUGuBUnpMI8QxM3NEiVYc1ulhmf5GaNfayWAL43AAG73IMGy0VcIeuw6C1BmQV30oH2SarnJmApfaL7A54N1Y3pLsZeizjmoUPez066+U5PqfWyVIMGKpFHyRtk9dQkx+7xAAAAFQCWVXZaIMqdNAaNHtmHY5YamTEMOQAAAIEAky+TAhka2r2uDr7EHGtC8/hIc+igMZX4iSWMzLgtSyzHxQKHtehcII16nq6lTBVVocLDK/TcQVzV5xZDgm+9f4s85mn+rZrS5c7D4eEyxpzZIkbxQJ1Gn1hGvawukIaMpZDDxFRh4yUSArsJutTdsGYkQzJkyPLgSEC0AQZ9/vwAAACAEVip5iOwX3UE8qmQOun1Z46/HWZAcL+5CfxM9FBsfuTSLWaH6vxCDoOhL/QKfDWpKbSbnpcdQt8TpoTBrRFKq1kjIqEK/AKu4rs+bL+sDN/KIfYkBKkPiS+yMfSQ/8eiPzI5OwSWksWW6EJO1Nn+vSXQ4UjPk4IL7ak0dq4RTdU= root@opnfv-vping-1-3c5cf7d5-54bc-477f-960a-04cb9f119225 -----END SSH HOST KEY KEYS----- === network info === if-info: lo,up,127.0.0.1,8,, if-info: eth0,up,192.168.130.10,24,fe80::f816:3eff:fe97:6597/64, ip-route:default via 192.168.130.1 dev eth0 
ip-route:192.168.130.0/24 dev eth0 src 192.168.130.10 ip-route6:fe80::/64 dev eth0 metric 256 ip-route6:unreachable default dev lo metric -1 error -101 ip-route6:ff00::/8 dev eth0 metric 256 ip-route6:unreachable default dev lo metric -1 error -101 === datasource: configdrive local === instance-id: ba1a0923-1193-4c8b-ab86-dd8bdd333574 name: opnfv-vping-1-3c5cf7d5-54bc-477f-960a-04cb9f119225 availability-zone: nova local-hostname: opnfv-vping-1-3c5cf7d5-54bc-477f-960a-04cb9f119225 launch-index: 0 === cirros: current=0.4.0 uptime=3.22 === ____ ____ ____ / __/ __ ____ ____ / __ \/ __/ / /__ / // __// __// /_/ /\ \ \___//_//_/ /_/ \____/___/ http://cirros-cloud.net login as 'cirros' user. default password: 'gocubsgo'. use 'sudo' for root. opnfv-vping-1-3c5cf7d5-54bc-477f-960a-04cb9f119225 login: /dev/root resized successfully [took 4.92s] 2018-05-25 06:34:59,536 - functest.opnfv_tests.openstack.vping.vping_userdata - INFO - Begin test execution 2018-05-25 06:34:59,536 - functest.opnfv_tests.openstack.vping.vping_userdata - INFO - Waiting for ping... 2018-05-25 06:35:01,817 - functest.opnfv_tests.openstack.vping.vping_userdata - DEBUG - console: [ 0.000000] Initializing cgroup subsys cpuset [ 0.000000] Initializing cgroup subsys cpu [ 0.000000] Initializing cgroup subsys cpuacct [ 0.000000] Linux version 4.4.0-28-generic (buildd@lcy01-13) (gcc version 5.3.1 20160413 (Ubuntu 5.3.1-14ubuntu2.1) ) #47-Ubuntu SMP Fri Jun 24 10:09:13 UTC 2016 (Ubuntu 4.4.0-28.47-generic 4.4.13) [ 0.000000] Command line: LABEL=cirros-rootfs ro console=tty1 console=ttyS0 [ 0.000000] KERNEL supported cpus: [ 0.000000] Intel GenuineIntel [ 0.000000] AMD AuthenticAMD [ 0.000000] Centaur CentaurHauls [ 0.000000] x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 [ 0.000000] x86/fpu: Supporting XSAVE feature 0x01: 'x87 floating point registers' [ 0.000000] x86/fpu: Supporting XSAVE feature 0x02: 'SSE registers' [ 0.000000] x86/fpu: Supporting XSAVE feature 0x04: 'AVX registers' [ 0.000000] x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format. [ 0.000000] x86/fpu: Using 'eager' FPU context switches. [ 0.000000] e820: BIOS-provided physical RAM map: [ 0.000000] BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable [ 0.000000] BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved [ 0.000000] BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved [ 0.000000] BIOS-e820: [mem 0x0000000000100000-0x000000001ffdbfff] usable [ 0.000000] BIOS-e820: [mem 0x000000001ffdc000-0x000000001fffffff] reserved [ 0.000000] BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved [ 0.000000] BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved [ 0.000000] NX (Execute Disable) protection: active [ 0.000000] SMBIOS 2.8 present. 
[ 0.000000] Hypervisor detected: KVM [ 0.000000] e820: last_pfn = 0x1ffdc max_arch_pfn = 0x400000000 [ 0.000000] x86/PAT: Configuration [0-7]: WB WC UC- UC WB WC UC- WT [ 0.000000] found SMP MP-table at [mem 0x000f72f0-0x000f72ff] mapped at [ffff8800000f72f0] [ 0.000000] Scanning 1 areas for low memory corruption [ 0.000000] Using GB pages for direct mapping [ 0.000000] RAMDISK: [mem 0x1fb16000-0x1ffcbfff] [ 0.000000] ACPI: Early table checksum verification disabled [ 0.000000] ACPI: RSDP 0x00000000000F70A0 000014 (v00 BOCHS ) [ 0.000000] ACPI: RSDT 0x000000001FFE14C9 00002C (v01 BOCHS BXPCRSDT 00000001 BXPC 00000001) [ 0.000000] ACPI: FACP 0x000000001FFE13DD 000074 (v01 BOCHS BXPCFACP 00000001 BXPC 00000001) [ 0.000000] ACPI: DSDT 0x000000001FFE0040 00139D (v01 BOCHS BXPCDSDT 00000001 BXPC 00000001) [ 0.000000] ACPI: FACS 0x000000001FFE0000 000040 [ 0.000000] ACPI: APIC 0x000000001FFE1451 000078 (v01 BOCHS BXPCAPIC 00000001 BXPC 00000001) [ 0.000000] No NUMA configuration found [ 0.000000] Faking a node at [mem 0x0000000000000000-0x000000001ffdbfff] [ 0.000000] NODE_DATA(0) allocated [mem 0x1ffd7000-0x1ffdbfff] [ 0.000000] kvm-clock: Using msrs 4b564d01 and 4b564d00 [ 0.000000] kvm-clock: cpu 0, msr 0:1ffd3001, primary cpu clock [ 0.000000] kvm-clock: using sched offset of 550094147 cycles [ 0.000000] clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns [ 0.000000] Zone ranges: [ 0.000000] DMA [mem 0x0000000000001000-0x0000000000ffffff] [ 0.000000] DMA32 [mem 0x0000000001000000-0x000000001ffdbfff] [ 0.000000] Normal empty [ 0.000000] Device empty [ 0.000000] Movable zone start for each node [ 0.000000] Early memory node ranges [ 0.000000] node 0: [mem 0x0000000000001000-0x000000000009efff] [ 0.000000] node 0: [mem 0x0000000000100000-0x000000001ffdbfff] [ 0.000000] Initmem setup node 0 [mem 0x0000000000001000-0x000000001ffdbfff] [ 0.000000] ACPI: PM-Timer IO Port: 0x608 [ 0.000000] ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) [ 0.000000] IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) [ 0.000000] Using ACPI (MADT) for SMP configuration information [ 0.000000] smpboot: Allowing 1 CPUs, 0 hotplug CPUs [ 0.000000] PM: Registered nosave memory: [mem 0x00000000-0x00000fff] [ 0.000000] PM: Registered nosave memory: [mem 0x0009f000-0x0009ffff] [ 0.000000] PM: Registered nosave memory: [mem 0x000a0000-0x000effff] [ 0.000000] PM: Registered nosave memory: [mem 0x000f0000-0x000fffff] [ 0.000000] e820: [mem 0x20000000-0xfeffbfff] available for PCI devices [ 0.000000] Booting paravirtualized kernel on KVM [ 0.000000] clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 7645519600211568 ns [ 0.000000] setup_percpu: NR_CPUS:256 nr_cpumask_bits:256 nr_cpu_ids:1 nr_node_ids:1 [ 0.000000] PERCPU: Embedded 33 pages/cpu @ffff88001f800000 s98008 r8192 d28968 u2097152 [ 0.000000] KVM setup async PF for cpu 0 [ 0.000000] kvm-stealtime: cpu 0, msr 1f80d940 [ 0.000000] PV qspinlock hash table entries: 256 (order: 0, 4096 bytes) [ 0.000000] Built 1 zonelists in Node order, mobility grouping on. 
Total pages: 128869 [ 0.000000] Policy zone: DMA32 [ 0.000000] Kernel command line: LABEL=cirros-rootfs ro console=tty1 console=ttyS0 [ 0.000000] PID hash table entries: 2048 (order: 2, 16384 bytes) [ 0.000000] Memory: 491796K/523752K available (8368K kernel code, 1280K rwdata, 3928K rodata, 1480K init, 1292K bss, 31956K reserved, 0K cma-reserved) [ 0.000000] SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=1, Nodes=1 [ 0.000000] Hierarchical RCU implementation. [ 0.000000] Build-time adjustment of leaf fanout to 64. [ 0.000000] RCU restricting CPUs from NR_CPUS=256 to nr_cpu_ids=1. [ 0.000000] RCU: Adjusting geometry for rcu_fanout_leaf=64, nr_cpu_ids=1 [ 0.000000] NR_IRQS:16640 nr_irqs:256 16 [ 0.000000] Console: colour VGA+ 80x25 [ 0.000000] console [tty1] enabled [ 0.000000] console [ttyS0] enabled [ 0.000000] tsc: Detected 3491.912 MHz processor [ 0.137362] Calibrating delay loop (skipped) preset value.. 6983.82 BogoMIPS (lpj=13967648) [ 0.139043] pid_max: default: 32768 minimum: 301 [ 0.139979] ACPI: Core revision 20150930 [ 0.141364] ACPI: 1 ACPI AML tables successfully acquired and loaded [ 0.142682] Security Framework initialized [ 0.143502] Yama: becoming mindful. [ 0.144231] AppArmor: AppArmor initialized [ 0.145096] Dentry cache hash table entries: 65536 (order: 7, 524288 bytes) [ 0.146457] Inode-cache hash table entries: 32768 (order: 6, 262144 bytes) [ 0.147736] Mount-cache hash table entries: 1024 (order: 1, 8192 bytes) [ 0.148962] Mountpoint-cache hash table entries: 1024 (order: 1, 8192 bytes) [ 0.150348] Initializing cgroup subsys io [ 0.151240] Initializing cgroup subsys memory [ 0.152113] Initializing cgroup subsys devices [ 0.152997] Initializing cgroup subsys freezer [ 0.153871] Initializing cgroup subsys net_cls [ 0.154733] Initializing cgroup subsys perf_event [ 0.155639] Initializing cgroup subsys net_prio [ 0.156527] Initializing cgroup subsys hugetlb [ 0.157401] Initializing cgroup subsys pids [ 0.158303] CPU: Physical Processor ID: 0 [ 0.159967] mce: CPU supports 10 MCE banks [ 0.160994] Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 [ 0.162206] Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 [ 0.173186] Freeing SMP alternatives memory: 28K (ffffffff820b4000 - ffffffff820bb000) [ 0.179921] ftrace: allocating 31920 entries in 125 pages [ 0.206501] smpboot: Max logical packages: 1 [ 0.207561] smpboot: APIC(0) Converting physical 0 to logical package 0 [ 0.209179] x2apic enabled [ 0.210111] Switched APIC routing to physical x2apic. [ 0.212091] ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 [ 0.213487] smpboot: CPU0: Intel Core Processor (Haswell, no TSX) (family: 0x6, model: 0x3c, stepping: 0x1) [ 0.215891] Performance Events: unsupported p6 CPU model 60 no PMU driver, software events only. 
[ 0.218024] KVM setup paravirtual spinlock [ 0.219467] x86: Booted up 1 node, 1 CPUs [ 0.220435] smpboot: Total of 1 processors activated (6983.82 BogoMIPS) [ 0.222090] devtmpfs: initialized [ 0.223929] evm: security.selinux [ 0.224750] evm: security.SMACK64 [ 0.225561] evm: security.SMACK64EXEC [ 0.226476] evm: security.SMACK64TRANSMUTE [ 0.227465] evm: security.SMACK64MMAP [ 0.228334] evm: security.ima [ 0.229096] evm: security.capability [ 0.230044] clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 7645041785100000 ns [ 0.232283] pinctrl core: initialized pinctrl subsystem [ 0.233589] RTC time: 6:27:23, date: 05/25/18 [ 0.234720] NET: Registered protocol family 16 [ 0.235874] cpuidle: using governor ladder [ 0.236870] cpuidle: using governor menu [ 0.237815] PCCT header not found. [ 0.238697] ACPI: bus type PCI registered [ 0.239664] acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 [ 0.241165] PCI: Using configuration type 1 for base access [ 0.243157] ACPI: Added _OSI(Module Device) [ 0.244172] ACPI: Added _OSI(Processor Device) [ 0.245215] ACPI: Added _OSI(3.0 _SCP Extensions) [ 0.246300] ACPI: Added _OSI(Processor Aggregator Device) [ 0.248624] ACPI: Interpreter enabled [ 0.249558] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S1_] (20150930/hwxface-580) [ 0.251808] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S2_] (20150930/hwxface-580) [ 0.254058] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S3_] (20150930/hwxface-580) [ 0.256296] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S4_] (20150930/hwxface-580) [ 0.258645] ACPI: (supports S0 S5) [ 0.259501] ACPI: Using IOAPIC for interrupt routing [ 0.260653] PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug [ 0.264146] ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) [ 0.265538] acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI] [ 0.267060] acpi PNP0A03:00: _OSC failed (AE_NOT_FOUND); disabling ASPM [ 0.268532] acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended PCI configuration space under this bridge. 
[ 0.271223] acpiphp: Slot [3] registered [ 0.272193] acpiphp: Slot [4] registered [ 0.273160] acpiphp: Slot [5] registered [ 0.274122] acpiphp: Slot [6] registered [ 0.275089] acpiphp: Slot [7] registered [ 0.276054] acpiphp: Slot [8] registered [ 0.277017] acpiphp: Slot [9] registered [ 0.277978] acpiphp: Slot [10] registered [ 0.278956] acpiphp: Slot [11] registered [ 0.279938] acpiphp: Slot [12] registered [ 0.280918] acpiphp: Slot [13] registered [ 0.281895] acpiphp: Slot [14] registered [ 0.282875] acpiphp: Slot [15] registered [ 0.294102] acpiphp: Slot [16] registered [ 0.295098] acpiphp: Slot [17] registered [ 0.296078] acpiphp: Slot [18] registered [ 0.297045] acpiphp: Slot [19] registered [ 0.298019] acpiphp: Slot [20] registered [ 0.299011] acpiphp: Slot [21] registered [ 0.299992] acpiphp: Slot [22] registered [ 0.300977] acpiphp: Slot [23] registered [ 0.301954] acpiphp: Slot [24] registered [ 0.302932] acpiphp: Slot [25] registered [ 0.303908] acpiphp: Slot [26] registered [ 0.304884] acpiphp: Slot [27] registered [ 0.305864] acpiphp: Slot [28] registered [ 0.306845] acpiphp: Slot [29] registered [ 0.307822] acpiphp: Slot [30] registered [ 0.308789] acpiphp: Slot [31] registered [ 0.309764] PCI host bridge to bus 0000:00 [ 0.310755] pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] [ 0.312321] pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] [ 0.313831] pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] [ 0.315548] pci_bus 0000:00: root bus resource [mem 0x20000000-0xfebfffff window] [ 0.317261] pci_bus 0000:00: root bus resource [bus 00-ff] [ 0.323687] pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] [ 0.325232] pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6] [ 0.326653] pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] [ 0.328189] pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376] [ 0.334667] pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI [ 0.336391] pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB [ 0.381911] ACPI: PCI Interrupt Link [LNKA] (IRQs 5 *10 11) [ 0.383640] ACPI: PCI Interrupt Link [LNKB] (IRQs 5 *10 11) [ 0.385343] ACPI: PCI Interrupt Link [LNKC] (IRQs 5 10 *11) [ 0.387016] ACPI: PCI Interrupt Link [LNKD] (IRQs 5 10 *11) [ 0.389292] ACPI: PCI Interrupt Link [LNKS] (IRQs *9) [ 0.390768] ACPI: Enabled 2 GPEs in block 00 to 0F [ 0.392119] vgaarb: setting as boot device: PCI:0000:00:02.0 [ 0.393444] vgaarb: device added: PCI:0000:00:02.0,decodes=io+mem,owns=io+mem,locks=none [ 0.395333] vgaarb: loaded [ 0.396102] vgaarb: bridge control possible 0000:00:02.0 [ 0.397487] SCSI subsystem initialized [ 0.398444] ACPI: bus type USB registered [ 0.399465] usbcore: registered new interface driver usbfs [ 0.400696] usbcore: registered new interface driver hub [ 0.401899] usbcore: registered new device driver usb [ 0.403176] PCI: Using ACPI for IRQ routing [ 0.404352] NetLabel: Initializing [ 0.405222] NetLabel: domain hash size = 128 [ 0.406261] NetLabel: protocols = UNLABELED CIPSOv4 [ 0.407386] NetLabel: unlabeled traffic allowed by default [ 0.408712] clocksource: Switched to clocksource kvm-clock [ 0.414341] AppArmor: AppArmor Filesystem Enabled [ 0.415527] pnp: PnP ACPI init [ 0.416578] pnp: PnP ACPI: found 5 devices [ 0.423138] clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns [ 0.425231] NET: Registered protocol family 2 [ 0.426355] TCP established hash table entries: 4096 (order: 3, 32768 bytes) [ 0.427940] 
TCP bind hash table entries: 4096 (order: 4, 65536 bytes) [ 0.429394] TCP: Hash tables configured (established 4096 bind 4096) [ 0.430848] UDP hash table entries: 256 (order: 1, 8192 bytes) [ 0.432204] UDP-Lite hash table entries: 256 (order: 1, 8192 bytes) [ 0.433647] NET: Registered protocol family 1 [ 0.434701] pci 0000:00:00.0: Limiting direct PCI/PCI transfers [ 0.436065] pci 0000:00:01.0: PIIX3: Enabling Passive Release [ 0.437395] pci 0000:00:01.0: Activating ISA DMA hang workarounds [ 0.451576] ACPI: PCI Interrupt Link [LNKD] enabled at IRQ 11 [ 0.466021] Trying to unpack rootfs image as initramfs... [ 0.516045] Freeing initrd memory: 4824K (ffff88001fb16000 - ffff88001ffcc000) [ 0.530493] Scanning for low memory corruption every 60 seconds [ 0.532126] futex hash table entries: 256 (order: 2, 16384 bytes) [ 0.533530] audit: initializing netlink subsys (disabled) [ 0.534778] audit: type=2000 audit(1527229644.433:1): initialized [ 0.536332] Initialise system trusted keyring [ 0.537446] HugeTLB registered 1 GB page size, pre-allocated 0 pages [ 0.538875] HugeTLB registered 2 MB page size, pre-allocated 0 pages [ 0.541111] zbud: loaded [ 0.543846] VFS: Disk quotas dquot_6.6.0 [ 0.544836] VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) [ 0.546630] fuse init (API version 7.23) [ 0.547682] Key type big_key registered [ 0.548656] Allocating IMA MOK and blacklist keyrings. [ 0.549982] Key type asymmetric registered [ 0.550975] Asymmetric key parser 'x509' registered [ 0.552128] Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) [ 0.554266] io scheduler noop registered [ 0.555229] io scheduler deadline registered (default) [ 0.556429] io scheduler cfq registered [ 0.557437] pci_hotplug: PCI Hot Plug PCI Core version: 0.5 [ 0.558722] pciehp: PCI Express Hot Plug Controller Driver version: 0.4 [ 0.560262] input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0 [ 0.562033] ACPI: Power Button [PWRF] [ 0.563024] GHES: HEST is not enabled! [ 0.576643] ACPI: PCI Interrupt Link [LNKC] enabled at IRQ 10 [ 0.605175] ACPI: PCI Interrupt Link [LNKA] enabled at IRQ 10 [ 0.607710] Serial: 8250/16550 driver, 32 ports, IRQ sharing enabled [ 0.631887] 00:04: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A [ 0.634313] Linux agpgart interface v0.103 [ 0.636520] brd: module loaded [ 0.637813] loop: module loaded [ 0.642904] GPT:Primary header thinks Alt. header is not at the end of the disk. [ 0.644630] GPT:90111 != 2097151 [ 0.645458] GPT:Alternate GPT header not at the end of the disk. [ 0.646832] GPT:90111 != 2097151 [ 0.647630] GPT: Use GNU Parted to correct GPT errors. 
[ 0.648814] vda: vda1 vda15 [ 0.650323] scsi host0: ata_piix [ 0.651194] scsi host1: ata_piix [ 0.652044] ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc0a0 irq 14 [ 0.653552] ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc0a8 irq 15 [ 0.655297] libphy: Fixed MDIO Bus: probed [ 0.656302] tun: Universal TUN/TAP device driver, 1.6 [ 0.657467] tun: (C) 1999-2004 Max Krasnyansky [ 0.659791] PPP generic driver version 2.4.2 [ 0.660931] ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver [ 0.662405] ehci-pci: EHCI PCI platform driver [ 0.663431] ehci-platform: EHCI generic platform driver [ 0.664625] ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver [ 0.666030] ohci-pci: OHCI PCI platform driver [ 0.667124] ohci-platform: OHCI generic platform driver [ 0.668319] uhci_hcd: USB Universal Host Controller Interface driver [ 0.682704] uhci_hcd 0000:00:01.2: UHCI Host Controller [ 0.683928] uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1 [ 0.685640] uhci_hcd 0000:00:01.2: detected 2 ports [ 0.686851] uhci_hcd 0000:00:01.2: irq 11, io base 0x0000c040 [ 0.688170] usb usb1: New USB device found, idVendor=1d6b, idProduct=0001 [ 0.689649] usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1 [ 0.691377] usb usb1: Product: UHCI Host Controller [ 0.692498] usb usb1: Manufacturer: Linux 4.4.0-28-generic uhci_hcd [ 0.693920] usb usb1: SerialNumber: 0000:00:01.2 [ 0.695033] hub 1-0:1.0: USB hub found [ 0.695975] hub 1-0:1.0: 2 ports detected [ 0.697052] i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 [ 0.699502] serio: i8042 KBD port at 0x60,0x64 irq 1 [ 0.700675] serio: i8042 AUX port at 0x60,0x64 irq 12 [ 0.701915] mousedev: PS/2 mouse device common for all mice [ 0.703391] input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1 [ 0.705576] rtc_cmos 00:00: RTC can wake from S4 [ 0.706866] rtc_cmos 00:00: rtc core: registered rtc_cmos as rtc0 [ 0.708310] rtc_cmos 00:00: alarms up to one day, y3k, 114 bytes nvram [ 0.709765] i2c /dev entries driver [ 0.710695] device-mapper: uevent: version 1.0.3 [ 0.711857] device-mapper: ioctl: 4.34.0-ioctl (2015-10-28) initialised: dm-devel@redhat.com [ 0.713852] ledtrig-cpu: registered to indicate activity on CPUs [ 0.715374] NET: Registered protocol family 10 [ 0.716561] NET: Registered protocol family 17 [ 0.717652] Key type dns_resolver registered [ 0.718752] microcode: CPU0 sig=0x306c1, pf=0x1, revision=0x1 [ 0.720090] microcode: Microcode Update Driver: v2.01 , Peter Oruba [ 0.722204] registered taskstats version 1 [ 0.723213] Loading compiled-in X.509 certificates [ 0.724927] Loaded X.509 cert 'Build time autogenerated kernel key: 6ea974e07bd0b30541f4d838a3b7a8a80d5ca9af' [ 0.727202] zswap: loaded using pool lzo/zbud [ 0.728761] Key type trusted registered [ 0.730489] Key type encrypted registered [ 0.731481] AppArmor: AppArmor sha1 policy hashing enabled [ 0.732742] ima: No TPM chip found, activating TPM-bypass! [ 0.734012] evm: HMAC attrs: 0x1 [ 0.735067] Magic number: 2:433:469 [ 0.736084] rtc_cmos 00:00: setting system clock to 2018-05-25 06:27:24 UTC (1527229644) [ 0.738048] BIOS EDD facility v0.16 2004-Jun-25, 0 devices found [ 0.739413] EDD information not available. 
[ 0.809231] ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 [ 0.810883] ata1.00: configured for MWDMA2 [ 0.812262] scsi 0:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 [ 0.824974] sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray [ 0.826453] cdrom: Uniform CD-ROM driver Revision: 3.20 [ 0.827955] sr 0:0:0:0: Attached scsi generic sg0 type 5 [ 0.830064] Freeing unused kernel memory: 1480K (ffffffff81f42000 - ffffffff820b4000) [ 0.831935] Write protecting the kernel read-only data: 14336k [ 0.833759] Freeing unused kernel memory: 1860K (ffff88000182f000 - ffff880001a00000) [ 0.835923] Freeing unused kernel memory: 168K (ffff880001dd6000 - ffff880001e00000) info: initramfs: up at 0.65 modprobe: module virtio_pci not found in modules.dep modprobe: module virtio_blk not found in modules.dep modprobe: module virtio_net not found in modules.dep modprobe: module vfat not found in modules.dep modprobe: module nls_cp437 not found in modules.dep info: copying initramfs to /dev/vda1 info: initramfs loading root from /dev/vda1 info: /etc/init.d/rc.sysinit: up at 2.12 info: container: none Starting logging: OK modprobe: module virtio_pci not found in modules.dep modprobe: module virtio_blk not found in modules.dep modprobe: module virtio_net not found in modules.dep modprobe: module vfat not found in modules.dep modprobe: module nls_cp437 not found in modules.dep WARN: /etc/rc3.d/S10-load-modules failed Initializing random number generator... [ 2.337795] random: dd urandom read with 13 bits of entropy available done. Starting acpid: OK Starting network... udhcpc (v1.23.2) started Sending discover... Sending select for 192.168.130.8... Lease of 192.168.130.8 obtained, lease time 43200 Top of dropbear init script Starting dropbear sshd: OK GROWROOT: CHANGED: partition=1 start=18432 old: size=71647 end=90079 new: size=2078687,end=2097119 vPing OK === system information === Platform: RDO OpenStack Compute Container: none Arch: x86_64 CPU(s): 1 @ 3491.912 MHz Cores/Sockets/Threads: 1/1/1 Virt-type: RAM Size: 488MB Disks: NAME MAJ:MIN SIZE LABEL MOUNTPOINT sr0 11:0 489472 config-2 vda 253:0 1073741824 vda1 253:1 1064287744 cirros-rootfs / vda15 253:15 8388608 === sshd host keys === -----BEGIN SSH HOST KEY KEYS----- ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQCbJMRnP2utMmDmM4WtsXZ2Z902wBZJsgJegs/A/wkA58aVku/NJokavZoxYVbfbmfS1Lm/0HQEiJimWNJsB3hdYIeTTGeiJYetlMsvZwrgVDERQACTpSBZJVzQo3pQVPxO6Sm5/2qp2f9W0wCocsDlFY8qmi19PGBhKMGFqB4nIrYq4qJXjV8RYbFEHi/Y08M5gwQcNqXw8Iq9m3LF06siP9IbaLpcM2pcStzzX9DzxbKrZdSyssygh5rem6jJDCSZb73+M4mK/CRp1zTyFeDkVZ0wysfoexLt54/1QtG+fFWirnS2NuX1jc0qHC+SPk/J6GyPqNxQfGJr/Nau5FQ1 root@opnfv-vping-2-userdata--3c5cf7d5-54bc-477f-960a-04cb9f119225 ssh-dss AAAAB3NzaC1kc3MAAACBAJzarrehpUrhSIepec9eaoGMdfEBlXIQRYY9cJYD04ZfLWQ3BNOm016Pg3mQsxTjMc5uPOFO3bgUrBkTddNN5dAP3mZsqZPZeeu2+z7E527tWlVXQQvBBzH7PkCnwUT3m4JaAHE/4pdsEU7IYBn1L5VHGBYBdphHIf+lowG87rOZAAAAFQD2AsOrwzCQpOy6D/NwDWs6jBhH2QAAAIAj6i/X2IU0jsDBWEfwygIH/qgxQx2vnf/NxOexRZjB+dyMoGHcvAdE8dEb2iAUdeS9lYwlwrKqzvFdXSHPGyNYK7ZoE7te6PqRyZY0+vUL70HAvYMv+0JxkZ2+jJOSSO+Yi3W54gVCE7uolq0QyRUJLji4RXT0ba6oSxqnHo+J8gAAAIEAgYOr13L7EgRHaNUQoutlS9bdgFmkkEM5KOqfFW+UeJOvl55OzqQu2wfH/+1YXfGMGLlvvZbpoDf/CksGzz3RP7xF3kJjINK2oJyHWznk4Bkoq50FDlExxTPdeg8nY2ZmoZr2TH3YRNTSocEbuIc8EV0Fkb4/MPpHkMNxFumJiUc= root@opnfv-vping-2-userdata--3c5cf7d5-54bc-477f-960a-04cb9f119225 -----END SSH HOST KEY KEYS----- === network info === if-info: lo,up,127.0.0.1,8,, if-info: eth0,up,192.168.130.8,24,fe80::f816:3eff:fe5d:3a9e/64, ip-route:default via 192.168.130.1 dev eth0 
ip-route:192.168.130.0/24 dev eth0 src 192.168.130.8
ip-route6:fe80::/64 dev eth0 metric 256
ip-route6:unreachable default dev lo metric -1 error -101
ip-route6:ff00::/8 dev eth0 metric 256
ip-route6:unreachable default dev lo metric -1 error -101
=== datasource: configdrive local ===
instance-id: e3eec34c-02fb-486f-b6e1-0a1a81ad8868
name: opnfv-vping-2-userdata--3c5cf7d5-54bc-477f-960a-04cb9f119225
availability-zone: nova
local-hostname: opnfv-vping-2-userdata--3c5cf7d5-54bc-477f-960a-04cb9f119225
launch-index: 0
=== cirros: current=0.4.0 latest=0.4.0 uptime=3.21 ===
2018-05-25 06:35:01,818 - functest.opnfv_tests.openstack.vping.vping_userdata - INFO - vPing detected!
2018-05-25 06:35:01,929 - xtesting.core.testcase - INFO - The results were successfully pushed to DB http://testresults.opnfv.org/test/api/v1/results
2018-05-25 06:35:01,929 - xtesting.ci.run_tests - INFO - Test result:
+------------------------+------------------+------------------+----------------+
| TEST CASE              | PROJECT          | DURATION         | RESULT         |
+------------------------+------------------+------------------+----------------+
| vping_userdata         | functest         | 00:26            | PASS           |
+------------------------+------------------+------------------+----------------+
2018-05-25 06:35:16,023 - xtesting.ci.run_tests - INFO - Running test case 'tempest_smoke_serial'...
2018-05-25 06:35:16,130 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - cloud:
2018-05-25 06:35:16,131 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - domain: Default
2018-05-25 06:35:16,131 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - Creating Rally environment...
2018-05-25 06:35:20,924 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally deployment create --fromenv --name opnfv-rally
2018-05-25 06:35:20.676 14 INFO rally.deployment.engines.existing [-] Save deployment 'opnfv-rally' (uuid=eacfb6a2-8084-4330-bc86-c203d0d8abde) with 'openstack' platform.
+--------------------------------------+---------------------+-------------+------------------+--------+
| uuid                                 | created_at          | name        | status           | active |
+--------------------------------------+---------------------+-------------+------------------+--------+
| eacfb6a2-8084-4330-bc86-c203d0d8abde | 2018-05-25T06:35:20 | opnfv-rally | deploy->finished |        |
+--------------------------------------+---------------------+-------------+------------------+--------+
Using deployment: eacfb6a2-8084-4330-bc86-c203d0d8abde
~/.rally/openrc was updated
HINTS:
* To use standard OpenStack clients, set up your env by running:
  source ~/.rally/openrc
  OpenStack clients are now configured, e.g run: openstack image list
2018-05-25 06:35:24,021 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally deployment check
--------------------------------------------------------------------------------
Platform openstack:
--------------------------------------------------------------------------------
Available services:
+-------------+----------------+-----------+
| Service     | Service Type   | Status    |
+-------------+----------------+-----------+
| __unknown__ | alarming       | Available |
| __unknown__ | key-manager    | Available |
| __unknown__ | placement      | Available |
| __unknown__ | policy         | Available |
| __unknown__ | volumev2       | Available |
| __unknown__ | volumev3       | Available |
| ceilometer  | metering       | Available |
| cinder      | volume         | Available |
| cloud       | cloudformation | Available |
| glance      | image          | Available |
| gnocchi     | metric         | Available |
| heat        | orchestration  | Available |
| keystone    | identity       | Available |
| neutron     | network        | Available |
| nova        | compute        | Available |
+-------------+----------------+-----------+
2018-05-25 06:35:24,022 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - Create verifier from existing repo...
2018-05-25 06:35:29,570 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally verify create-verifier --source /src/tempest --name opnfv-tempest --type tempest --system-wide
2018-05-25 06:35:28.144 23 INFO rally.api [-] Creating verifier 'opnfv-tempest'.
2018-05-25 06:35:28.274 23 INFO rally.verification.manager [-] Cloning verifier repo from /src/tempest.
2018-05-25 06:35:29.387 23 INFO rally.api [-] Verifier 'opnfv-tempest' (UUID=958d68a1-b4b4-4151-b0d6-568036179a97) has been successfully created! Using verifier 'opnfv-tempest' (UUID=958d68a1-b4b4-4151-b0d6-568036179a97) as the default verifier for the future CLI operations.
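For reference, the Rally bootstrap steps echoed above can be replayed by hand. A minimal sketch, assuming the OS_* credentials are already exported in the shell; the names and paths (opnfv-rally, opnfv-tempest, /src/tempest) are simply the ones appearing in this log and may differ on other deployments:

  # Register the running cloud as a Rally deployment built from the OS_* environment variables
  rally deployment create --fromenv --name opnfv-rally
  # Verify that the deployment can reach the OpenStack service catalog
  rally deployment check
  # Build a Tempest verifier from the local Tempest source tree, using system-wide packages
  rally verify create-verifier --source /src/tempest --name opnfv-tempest --type tempest --system-wide
  # Standard OpenStack clients can reuse the generated credentials file
  source ~/.rally/openrc
  openstack image list

The configure-verifier and verify start invocations further down in this log reuse the same verifier (opnfv-tempest) against the same deployment (opnfv-rally).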
2018-05-25 06:35:32,644 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Creating network with name: 'tempest-net-c5eeb3f6-909d-402f-a393-b6ad037f433f' 2018-05-25 06:35:33,531 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - network: Munch({u'status': u'ACTIVE', u'subnets': [], u'description': u'', u'provider:physical_network': None, u'tags': [], u'ipv6_address_scope': None, u'updated_at': u'2018-05-25T06:35:33Z', u'is_default': False, u'revision_number': 3, u'port_security_enabled': True, u'provider:network_type': u'geneve', u'id': u'665a1f3a-d049-476c-82ad-fc4ecf1686b3', u'provider:segmentation_id': 14, u'router:external': False, u'availability_zone_hints': [], u'availability_zones': [], u'qos_policy_id': None, u'admin_state_up': True, u'name': u'tempest-net-c5eeb3f6-909d-402f-a393-b6ad037f433f', u'created_at': u'2018-05-25T06:35:33Z', u'mtu': 1442, u'tenant_id': u'51534bd63d854b6c878cd0603da66c99', u'ipv4_address_scope': None, u'shared': False, u'project_id': u'51534bd63d854b6c878cd0603da66c99'}) 2018-05-25 06:35:34,082 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - subnet: Munch({u'description': u'', u'tags': [], u'updated_at': u'2018-05-25T06:35:33Z', u'ipv6_ra_mode': None, u'allocation_pools': [{u'start': u'192.168.150.2', u'end': u'192.168.150.254'}], u'host_routes': [], u'revision_number': 0, u'ipv6_address_mode': None, u'cidr': u'192.168.150.0/24', u'id': u'33b484ca-7858-43f1-8d21-59ea2510389e', u'subnetpool_id': None, u'service_types': [], u'name': u'tempest-subnet-c5eeb3f6-909d-402f-a393-b6ad037f433f', u'enable_dhcp': True, u'network_id': u'665a1f3a-d049-476c-82ad-fc4ecf1686b3', u'tenant_id': u'51534bd63d854b6c878cd0603da66c99', u'created_at': u'2018-05-25T06:35:33Z', u'dns_nameservers': [u'8.8.8.8'], u'gateway_ip': u'192.168.150.1', u'ip_version': 4, u'project_id': u'51534bd63d854b6c878cd0603da66c99'}) 2018-05-25 06:35:34,082 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - Creating two images for Tempest suite 2018-05-25 06:35:34,082 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Creating image with name: 'Cirros-0.4.0-c5eeb3f6-909d-402f-a393-b6ad037f433f' 2018-05-25 06:35:35,051 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - image: Munch({u'status': u'queued', u'owner_specified.shade.object': u'images/Cirros-0.4.0-c5eeb3f6-909d-402f-a393-b6ad037f433f', u'tags': [], u'container_format': u'bare', u'min_ram': 0, u'updated_at': u'2018-05-25T06:35:34Z', u'owner_specified.shade.sha256': u'a8dd75ecffd4cdd96072d60c2237b448e0c8b2bc94d57f10fdbc8c481d9005b8', u'locations': [], u'min_disk': 0, u'visibility': u'public', u'file': u'/v2/images/6fcc5c05-87c7-417b-bf38-671bc2de378e/file', u'owner': u'51534bd63d854b6c878cd0603da66c99', u'virtual_size': None, u'owner_specified.shade.md5': u'443b7623e27ecf03dc9e01ee93f67afe', u'id': u'6fcc5c05-87c7-417b-bf38-671bc2de378e', u'size': None, u'name': u'Cirros-0.4.0-c5eeb3f6-909d-402f-a393-b6ad037f433f', u'checksum': None, u'self': u'/v2/images/6fcc5c05-87c7-417b-bf38-671bc2de378e', u'disk_format': u'qcow2', u'protected': False, u'created_at': u'2018-05-25T06:35:34Z', u'schema': u'/v2/schemas/image'}) 2018-05-25 06:35:35,051 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Creating image with name: 'Cirros-0.4.0-1-c5eeb3f6-909d-402f-a393-b6ad037f433f' 2018-05-25 06:35:36,100 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - image: Munch({u'status': u'queued', u'owner_specified.shade.object': u'images/Cirros-0.4.0-1-c5eeb3f6-909d-402f-a393-b6ad037f433f', u'tags': [], 
u'container_format': u'bare', u'min_ram': 0, u'updated_at': u'2018-05-25T06:35:35Z', u'owner_specified.shade.sha256': u'a8dd75ecffd4cdd96072d60c2237b448e0c8b2bc94d57f10fdbc8c481d9005b8', u'locations': [], u'min_disk': 0, u'visibility': u'public', u'file': u'/v2/images/b02f4fb8-722d-4b91-bab7-6e4e77dcb5e9/file', u'owner': u'51534bd63d854b6c878cd0603da66c99', u'virtual_size': None, u'owner_specified.shade.md5': u'443b7623e27ecf03dc9e01ee93f67afe', u'id': u'b02f4fb8-722d-4b91-bab7-6e4e77dcb5e9', u'size': None, u'name': u'Cirros-0.4.0-1-c5eeb3f6-909d-402f-a393-b6ad037f433f', u'checksum': None, u'self': u'/v2/images/b02f4fb8-722d-4b91-bab7-6e4e77dcb5e9', u'disk_format': u'qcow2', u'protected': False, u'created_at': u'2018-05-25T06:35:35Z', u'schema': u'/v2/schemas/image'}) 2018-05-25 06:35:36,100 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Creating two flavors for Tempest suite 2018-05-25 06:35:36,316 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - flavor: Munch({'name': u'opnfv_flavor-c5eeb3f6-909d-402f-a393-b6ad037f433f', 'ephemeral': 0, 'ram': 512, 'is_disabled': False, 'properties': Munch({u'OS-FLV-DISABLED:disabled': False, u'OS-FLV-EXT-DATA:ephemeral': 0, u'os-flavor-access:is_public': True}), u'OS-FLV-DISABLED:disabled': False, 'vcpus': 1, 'extra_specs': Munch({}), 'location': Munch({'project': Munch({'domain_id': None, 'id': u'51534bd63d854b6c878cd0603da66c99', 'name': 'admin', 'domain_name': 'Default'}), 'zone': None, 'region_name': 'regionOne', 'cloud': 'envvars'}), u'os-flavor-access:is_public': True, 'rxtx_factor': 1.0, 'is_public': True, u'OS-FLV-EXT-DATA:ephemeral': 0, 'disk': 1, 'id': u'3b029cd2-d17e-4727-a028-8da4f7693577', 'swap': 0}) 2018-05-25 06:35:36,407 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - flavor: Munch({'name': u'opnfv_flavor_1-c5eeb3f6-909d-402f-a393-b6ad037f433f', 'ephemeral': 0, 'ram': 512, 'is_disabled': False, 'properties': Munch({u'OS-FLV-DISABLED:disabled': False, u'OS-FLV-EXT-DATA:ephemeral': 0, u'os-flavor-access:is_public': True}), u'OS-FLV-DISABLED:disabled': False, 'vcpus': 1, 'extra_specs': Munch({}), 'location': Munch({'project': Munch({'domain_id': None, 'id': u'51534bd63d854b6c878cd0603da66c99', 'name': 'admin', 'domain_name': 'Default'}), 'zone': None, 'region_name': 'regionOne', 'cloud': 'envvars'}), u'os-flavor-access:is_public': True, 'rxtx_factor': 1.0, 'is_public': True, u'OS-FLV-EXT-DATA:ephemeral': 0, 'disk': 1, 'id': u'f3bb4cbe-806d-459c-8f86-dfd54df42352', 'swap': 0}) 2018-05-25 06:35:39,406 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally verify configure-verifier --reconfigure --id opnfv-tempest 2018-05-25 06:35:38.573 42 INFO rally.api [-] Configuring verifier 'opnfv-tempest' (UUID=958d68a1-b4b4-4151-b0d6-568036179a97) for deployment 'opnfv-rally' (UUID=eacfb6a2-8084-4330-bc86-c203d0d8abde). 2018-05-25 06:35:39.244 42 INFO rally.api [-] Verifier 'opnfv-tempest' (UUID=958d68a1-b4b4-4151-b0d6-568036179a97) has been successfully configured for deployment 'opnfv-rally' (UUID=eacfb6a2-8084-4330-bc86-c203d0d8abde)! 2018-05-25 06:35:39,406 - functest.opnfv_tests.openstack.tempest.conf_utils - DEBUG - Looking for tempest.conf file... 2018-05-25 06:35:39,407 - functest.opnfv_tests.openstack.tempest.conf_utils - DEBUG - Updating selected tempest.conf parameters... 
2018-05-25 06:35:39,409 - functest.opnfv_tests.openstack.tempest.conf_utils - DEBUG - Add/Update required params defined in tempest_conf.yaml into tempest.conf file
2018-05-25 06:35:39,412 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - Generating test case list...
2018-05-25 06:35:42,517 - functest.opnfv_tests.openstack.tempest.tempest - INFO - (cd /root/.rally/verification/verifier-958d68a1-b4b4-4151-b0d6-568036179a97/repo; testr list-tests '^tempest\.(api|scenario).*\[.*\bsmoke\b.*\]$' >/home/opnfv/functest/results/tempest/test_list.txt 2>/dev/null)
2018-05-25 06:35:42,517 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - Applying tempest blacklist...
2018-05-25 06:35:42,519 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - Tempest blacklist file does not exist.
2018-05-25 06:35:42,520 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Starting Tempest test suite: '['rally', 'verify', 'start', '--load-list', u'/home/opnfv/functest/results/tempest/test_list.txt', '--concurrency', '1']'.
2018-05-25 06:35:45,070 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:35:45.069 51 INFO rally.api [-] Starting verification (UUID=9ff7f108-61a5-42fd-906b-5dbb30d7b56d) for deployment 'opnfv-rally' (UUID=eacfb6a2-8084-4330-bc86-c203d0d8abde) by verifier 'opnfv-tempest' (UUID=958d68a1-b4b4-4151-b0d6-568036179a97).
2018-05-25 06:35:45,071 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - Verification UUID: 9ff7f108-61a5-42fd-906b-5dbb30d7b56d
2018-05-25 06:35:51,173 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:35:51.173 51 INFO opnfv-tempest [-] {0} tempest.api.compute.flavors.test_flavors.FlavorsV2TestJSON.test_get_flavor ... success [0.172s]
2018-05-25 06:35:51,232 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:35:51.232 51 INFO opnfv-tempest [-] {0} tempest.api.compute.flavors.test_flavors.FlavorsV2TestJSON.test_list_flavors ... success [0.059s]
2018-05-25 06:35:56,192 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:35:56.191 51 INFO opnfv-tempest [-] {0} tempest.api.compute.security_groups.test_security_group_rules.SecurityGroupRulesTestJSON.test_security_group_rules_create ... success [0.832s]
2018-05-25 06:35:57,601 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:35:57.601 51 INFO opnfv-tempest [-] {0} tempest.api.compute.security_groups.test_security_group_rules.SecurityGroupRulesTestJSON.test_security_group_rules_list ... success [1.409s]
2018-05-25 06:36:05,165 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:36:05.165 51 INFO opnfv-tempest [-] {0} tempest.api.compute.security_groups.test_security_groups.SecurityGroupsTestJSON.test_security_groups_create_list_delete ... success [2.152s]
2018-05-25 06:36:17,228 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:36:17.228 51 INFO opnfv-tempest [-] {0} tempest.api.compute.servers.test_attach_interfaces.AttachInterfacesTestJSON.test_add_remove_fixed_ip ... success [7.447s]
2018-05-25 06:36:45,184 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:36:45.184 51 INFO opnfv-tempest [-] {0} tempest.api.compute.servers.test_create_server.ServersTestBootFromVolume.test_list_servers ...
success [0.066s] 2018-05-25 06:36:45,185 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:36:45.185 51 INFO opnfv-tempest [-] {0} tempest.api.compute.servers.test_create_server.ServersTestBootFromVolume.test_verify_server_details ... success [0.001s] 2018-05-25 06:37:08,834 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:37:08.834 51 INFO opnfv-tempest [-] {0} tempest.api.compute.servers.test_create_server.ServersTestJSON.test_list_servers ... success [0.064s] 2018-05-25 06:37:08,835 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:37:08.835 51 INFO opnfv-tempest [-] {0} tempest.api.compute.servers.test_create_server.ServersTestJSON.test_verify_server_details ... success [0.001s] 2018-05-25 06:37:31,306 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:37:31.305 51 INFO opnfv-tempest [-] {0} tempest.api.compute.servers.test_create_server.ServersTestManualDisk.test_list_servers ... success [0.064s] 2018-05-25 06:37:31,307 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:37:31.307 51 INFO opnfv-tempest [-] {0} tempest.api.compute.servers.test_create_server.ServersTestManualDisk.test_verify_server_details ... success [0.001s] 2018-05-25 06:38:05,421 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:38:05.421 51 INFO opnfv-tempest [-] {0} tempest.api.compute.servers.test_server_actions.ServerActionsTestJSON.test_reboot_server_hard ... success [11.317s] 2018-05-25 06:38:23,233 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:38:23.233 51 INFO opnfv-tempest [-] {0} tempest.api.compute.servers.test_server_addresses.ServerAddressesTestJSON.test_list_server_addresses ... success [0.048s] 2018-05-25 06:38:23,326 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:38:23.325 51 INFO opnfv-tempest [-] {0} tempest.api.compute.servers.test_server_addresses.ServerAddressesTestJSON.test_list_server_addresses_by_network ... success [0.093s] 2018-05-25 06:38:30,003 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:38:30.003 51 INFO opnfv-tempest [-] {0} tempest.api.compute.test_versions.TestVersions.test_get_version_details ... success [0.773s] 2018-05-25 06:38:30,012 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:38:30.012 51 INFO opnfv-tempest [-] {0} tempest.api.compute.test_versions.TestVersions.test_list_api_versions ... success [0.009s] 2018-05-25 06:38:31,118 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:38:31.118 51 INFO opnfv-tempest [-] {0} tempest.api.identity.admin.v2.test_services.ServicesTestJSON ... skip: Identity api v2 is not enabled 2018-05-25 06:38:31,119 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:38:31.119 51 INFO opnfv-tempest [-] {0} tempest.api.identity.admin.v2.test_users.UsersTestJSON ... skip: Identity api v2 is not enabled 2018-05-25 06:38:34,912 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:38:34.912 51 INFO opnfv-tempest [-] {0} tempest.api.identity.admin.v3.test_credentials.CredentialsTestJSON.test_credentials_create_get_update_delete ... success [0.328s] 2018-05-25 06:38:39,740 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:38:39.740 51 INFO opnfv-tempest [-] {0} tempest.api.identity.admin.v3.test_domains.DefaultDomainTestJSON.test_default_domain_exists ... 
success [0.075s] 2018-05-25 06:38:45,344 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:38:45.343 51 INFO opnfv-tempest [-] {0} tempest.api.identity.admin.v3.test_domains.DomainsTestJSON.test_create_update_delete_domain ... success [0.575s] 2018-05-25 06:38:51,322 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:38:51.321 51 INFO opnfv-tempest [-] {0} tempest.api.identity.admin.v3.test_endpoints.EndPointsTestJSON.test_update_endpoint ... success [0.387s] 2018-05-25 06:38:59,536 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:38:59.536 51 INFO opnfv-tempest [-] {0} tempest.api.identity.admin.v3.test_groups.GroupsV3TestJSON.test_group_users_add_list_delete ... success [3.009s] 2018-05-25 06:39:04,704 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:39:04.703 51 INFO opnfv-tempest [-] {0} tempest.api.identity.admin.v3.test_policies.PoliciesTestJSON.test_create_update_delete_policy ... success [0.319s] 2018-05-25 06:39:09,757 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:39:09.757 51 INFO opnfv-tempest [-] {0} tempest.api.identity.admin.v3.test_regions.RegionsTestJSON.test_create_region_with_specific_id ... success [0.157s] 2018-05-25 06:39:16,105 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:39:16.105 51 INFO opnfv-tempest [-] {0} tempest.api.identity.admin.v3.test_roles.RolesV3TestJSON.test_role_create_update_show_list ... success [0.411s] 2018-05-25 06:39:21,816 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:39:21.815 51 INFO opnfv-tempest [-] {0} tempest.api.identity.admin.v3.test_services.ServicesTestJSON.test_create_update_get_service ... success [0.397s] 2018-05-25 06:39:28,866 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:39:28.865 51 INFO opnfv-tempest [-] {0} tempest.api.identity.admin.v3.test_trusts.TrustsV3TestJSON.test_get_trusts_all ... success [2.591s] 2018-05-25 06:39:30,664 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:39:30.663 51 INFO opnfv-tempest [-] {0} tempest.api.identity.v2.test_api_discovery.TestApiDiscovery ... skip: Identity api v2 is not enabled 2018-05-25 06:39:32,556 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:39:32.556 51 INFO opnfv-tempest [-] {0} tempest.api.identity.v3.test_api_discovery.TestApiDiscovery.test_api_media_types ... success [0.068s] 2018-05-25 06:39:32,623 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:39:32.622 51 INFO opnfv-tempest [-] {0} tempest.api.identity.v3.test_api_discovery.TestApiDiscovery.test_api_version_resources ... success [0.066s] 2018-05-25 06:39:32,697 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:39:32.697 51 INFO opnfv-tempest [-] {0} tempest.api.identity.v3.test_api_discovery.TestApiDiscovery.test_api_version_statuses ... success [0.073s] 2018-05-25 06:39:32,703 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:39:32.703 51 INFO opnfv-tempest [-] {0} tempest.api.identity.v3.test_api_discovery.TestApiDiscovery.test_list_api_versions ... success [0.006s] 2018-05-25 06:39:36,158 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:39:36.158 51 INFO opnfv-tempest [-] {0} tempest.api.image.v2.test_images.BasicOperationsImagesTest.test_delete_image ... 
success [0.417s] 2018-05-25 06:39:36,830 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:39:36.829 51 INFO opnfv-tempest [-] {0} tempest.api.image.v2.test_images.BasicOperationsImagesTest.test_register_upload_get_image_file ... success [0.671s] 2018-05-25 06:39:37,925 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:39:37.925 51 INFO opnfv-tempest [-] {0} tempest.api.image.v2.test_images.BasicOperationsImagesTest.test_update_image ... success [1.095s] 2018-05-25 06:39:43,328 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:39:43.328 51 INFO opnfv-tempest [-] {0} tempest.api.image.v2.test_versions.VersionsTest.test_list_versions ... success [0.006s] 2018-05-25 06:39:48,124 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:39:48.123 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_extensions.ExtensionsTestJSON.test_list_show_extensions ... success [1.875s] 2018-05-25 06:39:59,610 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:39:59.610 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_floating_ips.FloatingIPTestJSON.test_create_floating_ip_specifying_a_fixed_ip_address ... success [1.922s] 2018-05-25 06:40:01,838 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:40:01.838 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_floating_ips.FloatingIPTestJSON.test_create_list_show_update_delete_floating_ip ... success [2.227s] 2018-05-25 06:40:11,208 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:40:11.208 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.BulkNetworkOpsIpV6Test.test_bulk_create_delete_network ... success [2.295s] 2018-05-25 06:40:14,350 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:40:14.350 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.BulkNetworkOpsIpV6Test.test_bulk_create_delete_port ... success [3.141s] 2018-05-25 06:40:18,763 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:40:18.763 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.BulkNetworkOpsIpV6Test.test_bulk_create_delete_subnet ... success [4.412s] 2018-05-25 06:40:26,692 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:40:26.691 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.BulkNetworkOpsTest.test_bulk_create_delete_network ... success [2.618s] 2018-05-25 06:40:29,737 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:40:29.736 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.BulkNetworkOpsTest.test_bulk_create_delete_port ... success [3.044s] 2018-05-25 06:40:32,170 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:40:32.170 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.BulkNetworkOpsTest.test_bulk_create_delete_subnet ... success [2.433s] 2018-05-25 06:40:43,358 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:40:43.357 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.NetworksIpV6Test.test_create_update_delete_network_subnet ... success [2.557s] 2018-05-25 06:40:43,517 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:40:43.517 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.NetworksIpV6Test.test_external_network_visibility ... 
success [0.160s] 2018-05-25 06:40:43,623 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:40:43.623 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.NetworksIpV6Test.test_list_networks ... success [0.105s] 2018-05-25 06:40:43,671 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:40:43.671 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.NetworksIpV6Test.test_list_subnets ... success [0.048s] 2018-05-25 06:40:43,763 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:40:43.763 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.NetworksIpV6Test.test_show_network ... success [0.091s] 2018-05-25 06:40:43,965 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:40:43.964 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.NetworksIpV6Test.test_show_subnet ... success [0.201s] 2018-05-25 06:40:52,454 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:40:52.453 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.NetworksTest.test_create_update_delete_network_subnet ... success [2.923s] 2018-05-25 06:40:52,751 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:40:52.751 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.NetworksTest.test_external_network_visibility ... success [0.298s] 2018-05-25 06:40:53,041 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:40:53.040 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.NetworksTest.test_list_networks ... success [0.288s] 2018-05-25 06:40:53,095 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:40:53.095 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.NetworksTest.test_list_subnets ... success [0.054s] 2018-05-25 06:40:53,197 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:40:53.197 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.NetworksTest.test_show_network ... success [0.102s] 2018-05-25 06:40:53,247 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:40:53.246 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.NetworksTest.test_show_subnet ... success [0.049s] 2018-05-25 06:41:01,048 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:41:01.047 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_ports.PortsIpV6TestJSON.test_create_port_in_allowed_allocation_pools ... success [2.883s] 2018-05-25 06:41:03,958 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:41:03.958 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_ports.PortsIpV6TestJSON.test_create_port_with_no_securitygroups ... success [2.910s] 2018-05-25 06:41:05,423 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:41:05.423 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_ports.PortsIpV6TestJSON.test_create_update_delete_port ... success [1.464s] 2018-05-25 06:41:05,474 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:41:05.473 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_ports.PortsIpV6TestJSON.test_list_ports ... success [0.050s] 2018-05-25 06:41:05,524 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:41:05.524 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_ports.PortsIpV6TestJSON.test_show_port ... 
success [0.051s] 2018-05-25 06:41:13,580 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:41:13.580 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_ports.PortsTestJSON.test_create_port_in_allowed_allocation_pools ... success [2.533s] 2018-05-25 06:41:16,482 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:41:16.482 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_ports.PortsTestJSON.test_create_port_with_no_securitygroups ... success [2.901s] 2018-05-25 06:41:17,933 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:41:17.933 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_ports.PortsTestJSON.test_create_update_delete_port ... success [1.451s] 2018-05-25 06:41:17,987 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:41:17.987 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_ports.PortsTestJSON.test_list_ports ... success [0.054s] 2018-05-25 06:41:18,039 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:41:18.038 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_ports.PortsTestJSON.test_show_port ... success [0.051s] 2018-05-25 06:41:30,108 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:41:30.107 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_routers.RoutersIpV6Test.test_add_multiple_router_interfaces ... success [7.327s] 2018-05-25 06:41:34,462 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:41:34.462 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_routers.RoutersIpV6Test.test_add_remove_router_interface_with_port_id ... success [4.354s] 2018-05-25 06:41:38,545 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:41:38.544 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_routers.RoutersIpV6Test.test_add_remove_router_interface_with_subnet_id ... success [4.081s] 2018-05-25 06:41:41,698 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:41:41.698 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_routers.RoutersIpV6Test.test_create_show_list_update_delete_router ... success [3.153s] 2018-05-25 06:41:55,558 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:41:55.558 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_routers.RoutersTest.test_add_multiple_router_interfaces ... success [7.196s] 2018-05-25 06:42:00,373 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:42:00.373 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_routers.RoutersTest.test_add_remove_router_interface_with_port_id ... success [4.815s] 2018-05-25 06:42:04,288 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:42:04.288 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_routers.RoutersTest.test_add_remove_router_interface_with_subnet_id ... success [3.914s] 2018-05-25 06:42:07,419 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:42:07.418 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_routers.RoutersTest.test_create_show_list_update_delete_router ... success [3.130s] 2018-05-25 06:42:15,501 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:42:15.500 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_security_groups.SecGroupIPv6Test.test_create_list_update_show_delete_security_group ... 
success [0.940s] 2018-05-25 06:42:17,190 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:42:17.190 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_security_groups.SecGroupIPv6Test.test_create_show_delete_security_group_rule ... success [1.687s] 2018-05-25 06:42:17,232 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:42:17.231 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_security_groups.SecGroupIPv6Test.test_list_security_groups ... success [0.044s] 2018-05-25 06:42:21,249 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:42:21.248 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_security_groups.SecGroupTest.test_create_list_update_show_delete_security_group ... success [1.290s] 2018-05-25 06:42:22,664 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:42:22.664 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_security_groups.SecGroupTest.test_create_show_delete_security_group_rule ... success [1.414s] 2018-05-25 06:42:22,834 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:42:22.834 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_security_groups.SecGroupTest.test_list_security_groups ... success [0.173s] 2018-05-25 06:42:26,133 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:42:26.132 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_subnetpools_extensions.SubnetPoolsTestJSON.test_create_list_show_update_delete_subnetpools ... success [0.607s] 2018-05-25 06:42:29,300 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:42:29.300 51 INFO opnfv-tempest [-] {0} tempest.api.network.test_versions.NetworksApiDiscovery.test_api_version_resources ... success [0.006s] 2018-05-25 06:42:30,345 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:42:30.345 51 INFO opnfv-tempest [-] {0} tempest.api.object_storage.test_account_quotas.AccountQuotasTest ... skip: AccountQuotasTest skipped as swift is not available 2018-05-25 06:42:30,347 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:42:30.347 51 INFO opnfv-tempest [-] {0} tempest.api.object_storage.test_account_services.AccountTest ... skip: AccountTest skipped as swift is not available 2018-05-25 06:42:30,350 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:42:30.349 51 INFO opnfv-tempest [-] {0} tempest.api.object_storage.test_container_quotas.ContainerQuotasTest ... skip: ContainerQuotasTest skipped as swift is not available 2018-05-25 06:42:30,350 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:42:30.350 51 INFO opnfv-tempest [-] {0} tempest.api.object_storage.test_container_services.ContainerTest ... skip: ContainerTest skipped as swift is not available 2018-05-25 06:42:30,351 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:42:30.351 51 INFO opnfv-tempest [-] {0} tempest.api.object_storage.test_object_services.ObjectTest ... skip: ObjectTest skipped as swift is not available 2018-05-25 06:42:32,121 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:42:32.121 51 INFO opnfv-tempest [-] {0} tempest.api.volume.test_versions.VersionsTest.test_list_versions ... success [0.013s] 2018-05-25 06:42:46,060 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:42:46.060 51 INFO opnfv-tempest [-] {0} tempest.api.volume.test_volumes_actions.VolumesActionsTest.test_attach_detach_volume_to_instance ... 
success [9.406s]
2018-05-25 06:42:56,151 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:42:56.151 51 INFO opnfv-tempest [-] {0} tempest.api.volume.test_volumes_get.VolumesGetTest.test_volume_create_get_update_delete ... success [6.181s]
2018-05-25 06:43:06,270 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:43:06.270 51 INFO opnfv-tempest [-] {0} tempest.api.volume.test_volumes_get.VolumesGetTest.test_volume_create_get_update_delete_from_image ... success [10.115s]
2018-05-25 06:43:13,974 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:43:13.974 51 INFO opnfv-tempest [-] {0} tempest.api.volume.test_volumes_list.VolumesListTestJSON.test_volume_list ... success [0.037s]
2018-05-25 06:44:11,370 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:44:11.369 51 INFO opnfv-tempest [-] {0} tempest.scenario.test_network_basic_ops.TestNetworkBasicOps.test_network_basic_ops ... success [51.950s]
2018-05-25 06:46:47,070 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:46:47.069 51 INFO opnfv-tempest [-] {0} tempest.scenario.test_server_basic_ops.TestServerBasicOps.test_server_basic_ops ... fail [148.201s]
2018-05-25 06:47:53,439 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 06:47:53.438 51 INFO opnfv-tempest [-] {0} tempest.scenario.test_server_multinode.TestServerMultinode.test_schedule_to_all_nodes ... success [52.042s]
2018-05-25 06:48:02,722 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Showing result for a verification: '['rally', 'verify', 'show', '--uuid', '9ff7f108-61a5-42fd-906b-5dbb30d7b56d']'.
2018-05-25 06:48:03,728 - functest.opnfv_tests.openstack.tempest.tempest - INFO - +-----------------------------------------------------------------------------------------+
2018-05-25 06:48:03,728 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Verification |
2018-05-25 06:48:03,728 - functest.opnfv_tests.openstack.tempest.tempest - INFO - +---------------------+-------------------------------------------------------------------+
2018-05-25 06:48:03,728 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | UUID | 9ff7f108-61a5-42fd-906b-5dbb30d7b56d |
2018-05-25 06:48:03,728 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Status | failed |
2018-05-25 06:48:03,728 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Started at | 2018-05-25 06:35:44 |
2018-05-25 06:48:03,729 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Finished at | 2018-05-25 06:48:02 |
2018-05-25 06:48:03,729 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Duration | 0:12:18 |
2018-05-25 06:48:03,729 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Run arguments | concurrency: 1 |
2018-05-25 06:48:03,729 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | | load_list: (value is too long, use 'detailed' flag to display it) |
2018-05-25 06:48:03,729 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Tags | - |
2018-05-25 06:48:03,729 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Verifier name | opnfv-tempest (UUID: 958d68a1-b4b4-4151-b0d6-568036179a97) |
2018-05-25 06:48:03,729 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Verifier type | tempest (platform: openstack) |
2018-05-25 06:48:03,729 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Deployment name | opnfv-rally (UUID: eacfb6a2-8084-4330-bc86-c203d0d8abde) |
2018-05-25 06:48:03,729 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Tests count | 109 |
2018-05-25 06:48:03,730 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Tests duration, sec | 724.171 |
2018-05-25 06:48:03,730 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Success | 89 |
2018-05-25 06:48:03,730 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Skipped | 19 |
2018-05-25 06:48:03,730 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Expected failures | 0 |
2018-05-25 06:48:03,730 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Unexpected success | 0 |
2018-05-25 06:48:03,730 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Failures | 1 |
2018-05-25 06:48:03,730 - functest.opnfv_tests.openstack.tempest.tempest - INFO - +---------------------+-------------------------------------------------------------------+
2018-05-25 06:48:03,730 - functest.opnfv_tests.openstack.tempest.tempest - INFO - +-----------------------------------------------------------------------------------------------------------------------------------------------------+
2018-05-25 06:48:03,801 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Tempest tempest_smoke_serial success_rate is 98.8888888889%
2018-05-25 06:48:07,841 - xtesting.core.testcase - INFO - The results were successfully pushed to DB http://testresults.opnfv.org/test/api/v1/results
2018-05-25 06:48:07,841 - xtesting.ci.run_tests - INFO - Test result:
+------------------------------+------------------+------------------+----------------+
| TEST CASE | PROJECT | DURATION | RESULT |
+------------------------------+------------------+------------------+----------------+
| tempest_smoke_serial | functest | 12:35 | FAIL |
+------------------------------+------------------+------------------+----------------+
2018-05-25 06:48:07,846 - xtesting.ci.run_tests - ERROR - The test case 'tempest_smoke_serial' failed.
2018-05-25 06:48:07,846 - xtesting.ci.run_tests - INFO - Running test case 'rally_sanity'...
2018-05-25 06:48:07,929 - xtesting.energy.energy - DEBUG - Getting current scenario
2018-05-25 06:48:08,396 - xtesting.energy.energy - DEBUG - Starting recording
2018-05-25 06:48:08,396 - xtesting.energy.energy - DEBUG - Submitting scenario (rally_sanity/running)
2018-05-25 06:48:08,812 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - Creating Rally environment...
2018-05-25 06:48:11,319 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally deployment destroy --deployment opnfv-rally
2018-05-25 06:48:10.914 97 INFO rally.deployment.engine [-] Deployment eacfb6a2-8084-4330-bc86-c203d0d8abde | Starting: Destroy cloud and free allocated resources.
2018-05-25 06:48:11.003 97 INFO rally.deployment.engine [-] Deployment eacfb6a2-8084-4330-bc86-c203d0d8abde | Completed: Destroy cloud and free allocated resources.
2018-05-25 06:48:11.045 97 INFO rally.api [-] Deleting all verifications created by verifier 'opnfv-tempest' (UUID=958d68a1-b4b4-4151-b0d6-568036179a97) for deployment 'opnfv-rally'.
2018-05-25 06:48:11.063 97 INFO rally.api [-] Deleting verification (UUID=9ff7f108-61a5-42fd-906b-5dbb30d7b56d).
2018-05-25 06:48:11.099 97 INFO rally.api [-] Verification has been successfully deleted!
2018-05-25 06:48:11.099 97 INFO rally.api [-] Deleting deployment-specific data for verifier 'opnfv-tempest' (UUID=958d68a1-b4b4-4151-b0d6-568036179a97).
2018-05-25 06:48:11.111 97 INFO rally.api [-] Deployment-specific data has been successfully deleted!
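For reference, the success_rate of 98.8888888889% logged above for tempest_smoke_serial is consistent with counting only executed tests: the 19 skips are left out of the denominator, so it is 89 passes out of 90 executed tests. A quick check of that arithmetic:

    # Reproduce the reported Tempest success rate from the verification counters above.
    tests_count = 109
    success = 89
    skipped = 19
    failures = 1

    executed = tests_count - skipped           # 90 tests actually ran
    success_rate = 100.0 * success / executed  # 98.888..., matching the log
    print('success_rate is {0}%'.format(success_rate))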
2018-05-25 06:48:13,685 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally deployment create --fromenv --name opnfv-rally
2018-05-25 06:48:13.393 100 INFO rally.deployment.engines.existing [-] Save deployment 'opnfv-rally' (uuid=814d9c0f-dc1c-4d5e-acad-54e99d5557b6) with 'openstack' platform.
+--------------------------------------+---------------------+-------------+------------------+--------+
| uuid | created_at | name | status | active |
+--------------------------------------+---------------------+-------------+------------------+--------+
| 814d9c0f-dc1c-4d5e-acad-54e99d5557b6 | 2018-05-25T06:48:13 | opnfv-rally | deploy->finished | |
+--------------------------------------+---------------------+-------------+------------------+--------+
Using deployment: 814d9c0f-dc1c-4d5e-acad-54e99d5557b6
~/.rally/openrc was updated
HINTS:
* To use standard OpenStack clients, set up your env by running: source ~/.rally/openrc
OpenStack clients are now configured, e.g run: openstack image list
2018-05-25 06:48:16,999 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally deployment check
--------------------------------------------------------------------------------
Platform openstack:
--------------------------------------------------------------------------------
Available services:
+-------------+----------------+-----------+
| Service | Service Type | Status |
+-------------+----------------+-----------+
| __unknown__ | alarming | Available |
| __unknown__ | key-manager | Available |
| __unknown__ | placement | Available |
| __unknown__ | policy | Available |
| __unknown__ | volumev2 | Available |
| __unknown__ | volumev3 | Available |
| ceilometer | metering | Available |
| cinder | volume | Available |
| cloud | cloudformation | Available |
| glance | image | Available |
| gnocchi | metric | Available |
| heat | orchestration | Available |
| keystone | identity | Available |
| neutron | network | Available |
| nova | compute | Available |
+-------------+----------------+-----------+
2018-05-25 06:48:16,999 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Validating the test name...
2018-05-25 06:48:17,815 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Creating image 'Cirros-0.4.0-072c6be1-d397-4e5f-8d5e-7eaca68dfe70'...
2018-05-25 06:48:19,769 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Creating network 'rally-net-072c6be1-d397-4e5f-8d5e-7eaca68dfe70'...
2018-05-25 06:48:21,388 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Creating router 'rally-router-072c6be1-d397-4e5f-8d5e-7eaca68dfe70'...
2018-05-25 06:48:24,363 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Creating flavor 'rally-tiny-072c6be1-d397-4e5f-8d5e-7eaca68dfe70'...
2018-05-25 06:48:24,434 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Creating flavor 'rally-mini-072c6be1-d397-4e5f-8d5e-7eaca68dfe70'...
2018-05-25 06:48:24,508 - functest.opnfv_tests.openstack.rally.rally - INFO - Starting test scenario "authenticate" ...
2018-05-25 06:48:24,508 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Scenario fetched from : /usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/opnfv-authenticate.yaml
2018-05-25 06:48:24,509 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Applying blacklist...
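The "Applying blacklist..." step removes excluded Rally scenarios before the task is started; the log immediately below reports the single blacklisted entry. A rough sketch of the effect of that filtering over a plain list of scenario names; the helper is hypothetical, not functest's actual implementation:

    # Hypothetical sketch of the blacklist effect: drop excluded Rally scenarios
    # (here Quotas.nova_update_and_delete, as reported just below) from the run list.
    def apply_blacklist(scenarios, blacklisted):
        excluded = set(blacklisted)
        return [name for name in scenarios if name not in excluded]

    # apply_blacklist(['Authenticate.keystone', 'Quotas.nova_update_and_delete'],
    #                 ['Quotas.nova_update_and_delete'])
    # -> ['Authenticate.keystone']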
2018-05-25 06:48:24,528 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Blacklisted tests: [u'Quotas.nova_update_and_delete'] 2018-05-25 06:48:24,528 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'start', '--abort-on-sla-failure', '--task', '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/task.yaml', '--task-args', "{'smoke': True, 'tmpl_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/templates', 'floating_network': 'external', 'service_list': ['authenticate'], 'concurrency': 4, 'netid': '290ecdf3-eee5-491f-9897-262e8d1f460e', 'tenants_amount': 3, 'image_name': 'Cirros-0.4.0-072c6be1-d397-4e5f-8d5e-7eaca68dfe70', 'flavor_name': 'rally-tiny-072c6be1-d397-4e5f-8d5e-7eaca68dfe70', 'glance_image_location': '/home/opnfv/functest/images/cirros-0.4.0-x86_64-disk.img', 'use_existing_users': False, 'flavor_alt_name': 'rally-mini-072c6be1-d397-4e5f-8d5e-7eaca68dfe70', 'iterations': 10, 'users_amount': 2, 'sup_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/support', 'glance_image_format': 'qcow2'}"] 2018-05-25 06:49:23,711 - functest.opnfv_tests.openstack.rally.rally - DEBUG - task_id : 9c9f3cab-2af3-41de-b564-a9b22aae4e88 2018-05-25 06:49:23,711 - functest.opnfv_tests.openstack.rally.rally - DEBUG - /home/opnfv/functest/results/rally does not exist, we create it. 2018-05-25 06:49:23,712 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'detailed', '9c9f3cab-2af3-41de-b564-a9b22aae4e88'] 2018-05-25 06:49:24,707 - functest.opnfv_tests.openstack.rally.rally - INFO - -------------------------------------------------------------------------------- Task 9c9f3cab-2af3-41de-b564-a9b22aae4e88: finished -------------------------------------------------------------------------------- test scenario Authenticate.keystone args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": {}, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 } } } -------------------------------------------------------------------------------- Task 9c9f3cab-2af3-41de-b564-a9b22aae4e88 has 0 error(s) -------------------------------------------------------------------------------- +--------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | authenticate.keystone | 0.446 | 0.446 | 0.446 | 0.446 | 0.446 | 0.446 | 100.0% | 1 | | total | 0.447 | 0.447 | 0.447 | 0.447 | 0.447 | 0.447 | 100.0% | 1 | | -> duration | 0.447 | 0.447 | 0.447 | 0.447 | 0.447 | 0.447 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.446574 Full duration: 6.502996 -------------------------------------------------------------------------------- test scenario Authenticate.validate_cinder args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { 
"repetitions": 2 }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 } } } -------------------------------------------------------------------------------- Task 9c9f3cab-2af3-41de-b564-a9b22aae4e88 has 0 error(s) -------------------------------------------------------------------------------- +---------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | authenticate.validate_cinder | 0.307 | 0.307 | 0.307 | 0.307 | 0.307 | 0.307 | 100.0% | 1 | | total | 0.728 | 0.728 | 0.728 | 0.728 | 0.728 | 0.728 | 100.0% | 1 | | -> duration | 0.728 | 0.728 | 0.728 | 0.728 | 0.728 | 0.728 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.727754 Full duration: 6.720313 -------------------------------------------------------------------------------- test scenario Authenticate.validate_glance args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "repetitions": 2 }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 } } } -------------------------------------------------------------------------------- Task 9c9f3cab-2af3-41de-b564-a9b22aae4e88 has 0 error(s) -------------------------------------------------------------------------------- +---------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | authenticate.validate_glance | 0.204 | 0.204 | 0.204 | 0.204 | 0.204 | 0.204 | 100.0% | 1 | | total | 0.602 | 0.602 | 0.602 | 0.602 | 0.602 | 0.602 | 100.0% | 1 | | -> duration | 0.602 | 0.602 | 0.602 | 0.602 | 0.602 | 0.602 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.601515 Full duration: 6.612025 -------------------------------------------------------------------------------- test scenario Authenticate.validate_heat args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "repetitions": 2 }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 } } } -------------------------------------------------------------------------------- Task 9c9f3cab-2af3-41de-b564-a9b22aae4e88 has 0 error(s) -------------------------------------------------------------------------------- 
+-------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | authenticate.validate_heat | 0.328 | 0.328 | 0.328 | 0.328 | 0.328 | 0.328 | 100.0% | 1 | | total | 0.729 | 0.729 | 0.729 | 0.729 | 0.729 | 0.729 | 100.0% | 1 | | -> duration | 0.729 | 0.729 | 0.729 | 0.729 | 0.729 | 0.729 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.729148 Full duration: 6.805415 -------------------------------------------------------------------------------- test scenario Authenticate.validate_neutron args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "repetitions": 2 }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 } } } -------------------------------------------------------------------------------- Task 9c9f3cab-2af3-41de-b564-a9b22aae4e88 has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | authenticate.validate_neutron | 0.339 | 0.339 | 0.339 | 0.339 | 0.339 | 0.339 | 100.0% | 1 | | total | 0.723 | 0.723 | 0.723 | 0.723 | 0.723 | 0.723 | 100.0% | 1 | | -> duration | 0.723 | 0.723 | 0.723 | 0.723 | 0.723 | 0.723 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.723415 Full duration: 6.76983 -------------------------------------------------------------------------------- test scenario Authenticate.validate_nova args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "repetitions": 2 }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 } } } -------------------------------------------------------------------------------- Task 9c9f3cab-2af3-41de-b564-a9b22aae4e88 has 0 error(s) -------------------------------------------------------------------------------- +-------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | 
Success | Count | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | authenticate.validate_nova | 0.198 | 0.198 | 0.198 | 0.198 | 0.198 | 0.198 | 100.0% | 1 | | total | 0.634 | 0.634 | 0.634 | 0.634 | 0.634 | 0.634 | 100.0% | 1 | | -> duration | 0.634 | 0.634 | 0.634 | 0.634 | 0.634 | 0.634 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.634016 Full duration: 6.490188 HINTS: * To plot HTML graphics with this data, run: rally task report 9c9f3cab-2af3-41de-b564-a9b22aae4e88 --out output.html * To generate a JUnit report, run: rally task export 9c9f3cab-2af3-41de-b564-a9b22aae4e88 --type junit --to output.xml * To get raw JSON output of task results, run: rally task report 9c9f3cab-2af3-41de-b564-a9b22aae4e88 --json --out output.json 2018-05-25 06:49:24,707 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'results', '9c9f3cab-2af3-41de-b564-a9b22aae4e88'] 2018-05-25 06:49:25,744 - functest.opnfv_tests.openstack.rally.rally - DEBUG - saving json file 2018-05-25 06:49:25,745 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'report', '9c9f3cab-2af3-41de-b564-a9b22aae4e88', '--out', u'/home/opnfv/functest/results/rally/opnfv-authenticate.html'] 2018-05-25 06:49:25,752 - functest.opnfv_tests.openstack.rally.rally - INFO - Test scenario: "authenticate" OK. 2018-05-25 06:49:25,753 - functest.opnfv_tests.openstack.rally.rally - INFO - Starting test scenario "glance" ... 2018-05-25 06:49:25,753 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Scenario fetched from : /usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/sanity/opnfv-glance.yaml 2018-05-25 06:49:25,753 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Applying blacklist... 
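Each scenario in this tier is launched with the same rally task start invocation; only service_list and the resulting task_id change, as the command echoed right after this shows. A sketch of how such a command line can be assembled and executed (the wrapper itself is illustrative, not functest's code):

    # Illustrative wrapper for the per-scenario "rally task start" call seen in the log.
    import subprocess

    def run_rally_scenario(task_file, task_args):
        cmd = [
            'rally', 'task', 'start',
            '--abort-on-sla-failure',       # abort the task as soon as an SLA fails
            '--task', task_file,
            '--task-args', str(task_args),  # dict literal, as echoed in the log
        ]
        return subprocess.check_output(cmd)

    # e.g. run_rally_scenario(
    #          '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/task.yaml',
    #          {'smoke': True, 'service_list': ['glance'], 'concurrency': 4, 'iterations': 10})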
2018-05-25 06:49:25,770 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Blacklisted tests: [u'Quotas.nova_update_and_delete'] 2018-05-25 06:49:25,771 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'start', '--abort-on-sla-failure', '--task', '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/task.yaml', '--task-args', "{'smoke': True, 'tmpl_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/templates', 'floating_network': 'external', 'service_list': ['glance'], 'concurrency': 4, 'netid': '290ecdf3-eee5-491f-9897-262e8d1f460e', 'tenants_amount': 3, 'image_name': 'Cirros-0.4.0-072c6be1-d397-4e5f-8d5e-7eaca68dfe70', 'flavor_name': 'rally-tiny-072c6be1-d397-4e5f-8d5e-7eaca68dfe70', 'glance_image_location': '/home/opnfv/functest/images/cirros-0.4.0-x86_64-disk.img', 'use_existing_users': False, 'flavor_alt_name': 'rally-mini-072c6be1-d397-4e5f-8d5e-7eaca68dfe70', 'iterations': 10, 'users_amount': 2, 'sup_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/support', 'glance_image_format': 'qcow2'}"] 2018-05-25 06:50:52,589 - functest.opnfv_tests.openstack.rally.rally - DEBUG - task_id : 9f67a6ac-1be5-45d1-97a2-6ccdcd03bff1 2018-05-25 06:50:52,589 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'detailed', '9f67a6ac-1be5-45d1-97a2-6ccdcd03bff1'] 2018-05-25 06:50:53,590 - functest.opnfv_tests.openstack.rally.rally - INFO - -------------------------------------------------------------------------------- Task 9f67a6ac-1be5-45d1-97a2-6ccdcd03bff1: finished -------------------------------------------------------------------------------- test scenario GlanceImages.create_and_delete_image args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "container_format": "bare", "disk_format": "qcow2", "image_location": "/home/opnfv/functest/images/cirros-0.4.0-x86_64-disk.img" }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 } } } -------------------------------------------------------------------------------- Task 9f67a6ac-1be5-45d1-97a2-6ccdcd03bff1 has 0 error(s) -------------------------------------------------------------------------------- +---------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | glance_v2.create_image | 3.909 | 3.909 | 3.909 | 3.909 | 3.909 | 3.909 | 100.0% | 1 | | -> glance_v2.get_image (x2) | 0.071 | 0.071 | 0.071 | 0.071 | 0.071 | 0.071 | 100.0% | 1 | | -> glance_v2.upload_data | 1.597 | 1.597 | 1.597 | 1.597 | 1.597 | 1.597 | 100.0% | 1 | | glance_v2.delete_image | 1.506 | 1.506 | 1.506 | 1.506 | 1.506 | 1.506 | 100.0% | 1 | | total | 5.415 | 5.415 | 5.415 | 5.415 | 5.415 | 5.415 | 100.0% | 1 | | -> duration | 5.415 | 5.415 | 5.415 | 5.415 | 5.415 | 5.415 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | 
+------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 5.414528 Full duration: 12.89831 -------------------------------------------------------------------------------- test scenario GlanceImages.create_and_list_image args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "container_format": "bare", "disk_format": "qcow2", "image_location": "/home/opnfv/functest/images/cirros-0.4.0-x86_64-disk.img" }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 } } } -------------------------------------------------------------------------------- Task 9f67a6ac-1be5-45d1-97a2-6ccdcd03bff1 has 0 error(s) -------------------------------------------------------------------------------- +---------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | glance_v2.create_image | 3.215 | 3.215 | 3.215 | 3.215 | 3.215 | 3.215 | 100.0% | 1 | | -> glance_v2.get_image (x2) | 0.137 | 0.137 | 0.137 | 0.137 | 0.137 | 0.137 | 100.0% | 1 | | -> glance_v2.upload_data | 0.841 | 0.841 | 0.841 | 0.841 | 0.841 | 0.841 | 100.0% | 1 | | glance_v2.list_images | 0.037 | 0.037 | 0.037 | 0.037 | 0.037 | 0.037 | 100.0% | 1 | | total | 3.252 | 3.252 | 3.252 | 3.252 | 3.252 | 3.252 | 100.0% | 1 | | -> duration | 3.252 | 3.252 | 3.252 | 3.252 | 3.252 | 3.252 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 3.252384 Full duration: 13.249348 -------------------------------------------------------------------------------- test scenario GlanceImages.list_images args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": {}, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 } } } -------------------------------------------------------------------------------- Task 9f67a6ac-1be5-45d1-97a2-6ccdcd03bff1 has 0 error(s) -------------------------------------------------------------------------------- +--------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | glance_v2.list_images | 0.174 | 0.174 | 0.174 | 0.174 | 0.174 | 0.174 | 100.0% | 1 | | total | 0.175 | 0.175 | 0.175 | 0.175 | 0.175 | 0.175 | 100.0% | 1 | | -> duration | 0.175 | 0.175 | 0.175 | 0.175 | 0.175 | 0.175 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | 
+-----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.174671 Full duration: 7.044558 -------------------------------------------------------------------------------- test scenario GlanceImages.create_image_and_boot_instances args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "container_format": "bare", "disk_format": "qcow2", "image_location": "/home/opnfv/functest/images/cirros-0.4.0-x86_64-disk.img", "flavor": { "name": "rally-tiny-072c6be1-d397-4e5f-8d5e-7eaca68dfe70" }, "number_instances": 2, "nics": [ { "net-id": "290ecdf3-eee5-491f-9897-262e8d1f460e" } ] }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 }, "quotas": { "nova": { "cores": -1, "floating_ips": -1, "instances": -1, "ram": -1, "security_group_rules": -1, "security_groups": -1 } } } } -------------------------------------------------------------------------------- Task 9f67a6ac-1be5-45d1-97a2-6ccdcd03bff1 has 0 error(s) -------------------------------------------------------------------------------- +---------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | glance_v2.create_image | 4.091 | 4.091 | 4.091 | 4.091 | 4.091 | 4.091 | 100.0% | 1 | | -> glance_v2.get_image (x2) | 0.052 | 0.052 | 0.052 | 0.052 | 0.052 | 0.052 | 100.0% | 1 | | -> glance_v2.upload_data | 1.799 | 1.799 | 1.799 | 1.799 | 1.799 | 1.799 | 100.0% | 1 | | nova.boot_servers | 10.565 | 10.565 | 10.565 | 10.565 | 10.565 | 10.565 | 100.0% | 1 | | total | 14.657 | 14.657 | 14.657 | 14.657 | 14.657 | 14.657 | 100.0% | 1 | | -> duration | 13.657 | 13.657 | 13.657 | 13.657 | 13.657 | 13.657 | 100.0% | 1 | | -> idle_duration | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 100.0% | 1 | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 13.656909 Full duration: 38.241136 HINTS: * To plot HTML graphics with this data, run: rally task report 9f67a6ac-1be5-45d1-97a2-6ccdcd03bff1 --out output.html * To generate a JUnit report, run: rally task export 9f67a6ac-1be5-45d1-97a2-6ccdcd03bff1 --type junit --to output.xml * To get raw JSON output of task results, run: rally task report 9f67a6ac-1be5-45d1-97a2-6ccdcd03bff1 --json --out output.json 2018-05-25 06:50:53,590 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'results', '9f67a6ac-1be5-45d1-97a2-6ccdcd03bff1'] 2018-05-25 06:50:54,632 - functest.opnfv_tests.openstack.rally.rally - DEBUG - saving json file 2018-05-25 06:50:54,632 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'report', '9f67a6ac-1be5-45d1-97a2-6ccdcd03bff1', '--out', u'/home/opnfv/functest/results/rally/opnfv-glance.html'] 2018-05-25 06:50:54,639 - functest.opnfv_tests.openstack.rally.rally - INFO - Test scenario: "glance" OK. 2018-05-25 06:50:54,640 - functest.opnfv_tests.openstack.rally.rally - INFO - Starting test scenario "cinder" ... 
2018-05-25 06:50:54,640 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Scenario fetched from : /usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/sanity/opnfv-cinder.yaml 2018-05-25 06:50:54,640 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Applying blacklist... 2018-05-25 06:50:54,656 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Blacklisted tests: [u'Quotas.nova_update_and_delete'] 2018-05-25 06:50:54,657 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'start', '--abort-on-sla-failure', '--task', '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/task.yaml', '--task-args', "{'smoke': True, 'tmpl_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/templates', 'floating_network': 'external', 'service_list': ['cinder'], 'concurrency': 4, 'netid': '290ecdf3-eee5-491f-9897-262e8d1f460e', 'tenants_amount': 3, 'image_name': 'Cirros-0.4.0-072c6be1-d397-4e5f-8d5e-7eaca68dfe70', 'flavor_name': 'rally-tiny-072c6be1-d397-4e5f-8d5e-7eaca68dfe70', 'glance_image_location': '/home/opnfv/functest/images/cirros-0.4.0-x86_64-disk.img', 'use_existing_users': False, 'flavor_alt_name': 'rally-mini-072c6be1-d397-4e5f-8d5e-7eaca68dfe70', 'iterations': 10, 'users_amount': 2, 'sup_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/support', 'glance_image_format': 'qcow2'}"] 2018-05-25 06:53:54,420 - functest.opnfv_tests.openstack.rally.rally - DEBUG - task_id : c477076c-7df4-4546-9967-e65eb1959286 2018-05-25 06:53:54,421 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'detailed', 'c477076c-7df4-4546-9967-e65eb1959286'] 2018-05-25 06:53:55,466 - functest.opnfv_tests.openstack.rally.rally - INFO - -------------------------------------------------------------------------------- Task c477076c-7df4-4546-9967-e65eb1959286: finished -------------------------------------------------------------------------------- test scenario CinderVolumes.create_and_delete_snapshot args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "force": false }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 }, "quotas": { "cinder": { "gigabytes": -1, "snapshots": -1, "volumes": -1 } }, "volumes": { "size": 1, "volumes_per_tenant": 1 } } } -------------------------------------------------------------------------------- Task c477076c-7df4-4546-9967-e65eb1959286 has 0 error(s) -------------------------------------------------------------------------------- +------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +---------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +---------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | cinder_v2.create_snapshot | 3.126 | 3.126 | 3.126 | 3.126 | 3.126 | 3.126 | 100.0% | 1 | | cinder_v2.delete_snapshot | 2.55 | 2.55 | 2.55 | 2.55 | 2.55 | 2.55 | 100.0% | 1 | | total | 5.677 | 5.677 | 5.677 | 5.677 | 5.677 | 5.677 | 100.0% | 1 | | -> duration | 5.677 | 5.677 | 5.677 | 5.677 | 5.677 | 5.677 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 
| 0.0 | 0.0 | 100.0% | 1 | +---------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 5.677022 Full duration: 21.502597 -------------------------------------------------------------------------------- test scenario CinderVolumes.create_and_delete_volume args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "size": { "max": 1, "min": 1 } }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 }, "quotas": { "cinder": { "gigabytes": -1, "snapshots": -1, "volumes": -1 } } } } -------------------------------------------------------------------------------- Task c477076c-7df4-4546-9967-e65eb1959286 has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | cinder_v2.create_volume | 3.051 | 3.051 | 3.051 | 3.051 | 3.051 | 3.051 | 100.0% | 1 | | cinder_v2.delete_volume | 2.566 | 2.566 | 2.566 | 2.566 | 2.566 | 2.566 | 100.0% | 1 | | total | 5.617 | 5.617 | 5.617 | 5.617 | 5.617 | 5.617 | 100.0% | 1 | | -> duration | 5.617 | 5.617 | 5.617 | 5.617 | 5.617 | 5.617 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 5.617188 Full duration: 14.650543 -------------------------------------------------------------------------------- test scenario CinderVolumes.create_and_delete_volume args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "image": { "name": "Cirros-0.4.0-072c6be1-d397-4e5f-8d5e-7eaca68dfe70" }, "size": 1 }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 }, "quotas": { "cinder": { "gigabytes": -1, "snapshots": -1, "volumes": -1 } } } } -------------------------------------------------------------------------------- Task c477076c-7df4-4546-9967-e65eb1959286 has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | cinder_v2.create_volume | 7.464 | 7.464 | 7.464 | 7.464 | 7.464 | 7.464 | 100.0% | 1 | | cinder_v2.delete_volume | 2.495 | 2.495 | 2.495 | 2.495 | 2.495 | 2.495 | 100.0% | 1 | | total | 9.959 | 9.959 | 9.959 | 9.959 | 9.959 | 9.959 | 100.0% | 1 | | -> duration | 9.959 | 9.959 | 9.959 | 9.959 | 9.959 | 9.959 | 100.0% | 1 | | -> idle_duration | 0.0 | 
0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 9.959399 Full duration: 19.705584 -------------------------------------------------------------------------------- test scenario CinderVolumes.create_and_delete_volume args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "size": 1 }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 }, "quotas": { "cinder": { "gigabytes": -1, "snapshots": -1, "volumes": -1 } } } } -------------------------------------------------------------------------------- Task c477076c-7df4-4546-9967-e65eb1959286 has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | cinder_v2.create_volume | 2.898 | 2.898 | 2.898 | 2.898 | 2.898 | 2.898 | 100.0% | 1 | | cinder_v2.delete_volume | 2.562 | 2.562 | 2.562 | 2.562 | 2.562 | 2.562 | 100.0% | 1 | | total | 5.46 | 5.46 | 5.46 | 5.46 | 5.46 | 5.46 | 100.0% | 1 | | -> duration | 5.46 | 5.46 | 5.46 | 5.46 | 5.46 | 5.46 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 5.460418 Full duration: 14.195307 -------------------------------------------------------------------------------- test scenario CinderVolumes.create_and_extend_volume args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "new_size": 2, "size": 1 }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 }, "quotas": { "cinder": { "gigabytes": -1, "snapshots": -1, "volumes": -1 } } } } -------------------------------------------------------------------------------- Task c477076c-7df4-4546-9967-e65eb1959286 has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | cinder_v2.create_volume | 2.945 | 2.945 | 2.945 | 2.945 | 2.945 | 2.945 | 100.0% | 1 | | cinder_v2.extend_volume | 2.556 | 2.556 | 2.556 | 2.556 | 2.556 | 2.556 | 100.0% | 1 | | cinder_v2.delete_volume | 2.443 | 2.443 | 2.443 | 2.443 | 2.443 | 2.443 | 100.0% | 1 | | total | 7.944 | 7.944 | 7.944 | 7.944 | 7.944 | 7.944 | 100.0% | 1 | | -> duration | 7.944 | 7.944 | 7.944 | 7.944 | 7.944 | 7.944 | 100.0% | 1 | | -> 
idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 7.943636 Full duration: 17.016206 -------------------------------------------------------------------------------- test scenario CinderVolumes.create_from_volume_and_delete_volume args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "size": 1 }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 }, "quotas": { "cinder": { "gigabytes": -1, "snapshots": -1, "volumes": -1 } }, "volumes": { "size": 1, "volumes_per_tenant": 1 } } } -------------------------------------------------------------------------------- Task c477076c-7df4-4546-9967-e65eb1959286 has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | cinder_v2.create_volume | 3.195 | 3.195 | 3.195 | 3.195 | 3.195 | 3.195 | 100.0% | 1 | | cinder_v2.delete_volume | 2.563 | 2.563 | 2.563 | 2.563 | 2.563 | 2.563 | 100.0% | 1 | | total | 5.759 | 5.759 | 5.759 | 5.759 | 5.759 | 5.759 | 100.0% | 1 | | -> duration | 5.759 | 5.759 | 5.759 | 5.759 | 5.759 | 5.759 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 5.758905 Full duration: 21.44175 -------------------------------------------------------------------------------- test scenario CinderQos.create_and_list_qos args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "consumer": "both", "write_iops_sec": "10", "read_iops_sec": "1000" }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 } } } -------------------------------------------------------------------------------- Task c477076c-7df4-4546-9967-e65eb1959286 has 0 error(s) -------------------------------------------------------------------------------- +-------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | cinder_v2.create_qos | 0.546 | 0.546 | 0.546 | 0.546 | 0.546 | 0.546 | 100.0% | 1 | | cinder_v2.list_qos | 0.176 | 0.176 | 0.176 | 0.176 | 0.176 | 0.176 | 100.0% | 1 | | total | 0.723 | 0.723 | 0.723 | 0.723 | 0.723 | 0.723 | 100.0% | 1 | | -> duration | 0.723 | 0.723 | 0.723 | 0.723 | 0.723 | 0.723 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | 
+----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.722504 Full duration: 11.03071 -------------------------------------------------------------------------------- test scenario CinderQos.create_and_set_qos args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "consumer": "back-end", "write_iops_sec": "10", "read_iops_sec": "1000", "set_consumer": "both", "set_write_iops_sec": "11", "set_read_iops_sec": "1001" }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 } } } -------------------------------------------------------------------------------- Task c477076c-7df4-4546-9967-e65eb1959286 has 0 error(s) -------------------------------------------------------------------------------- +-------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | cinder_v2.create_qos | 0.569 | 0.569 | 0.569 | 0.569 | 0.569 | 0.569 | 100.0% | 1 | | cinder_v2.set_qos | 0.184 | 0.184 | 0.184 | 0.184 | 0.184 | 0.184 | 100.0% | 1 | | total | 0.754 | 0.754 | 0.754 | 0.754 | 0.754 | 0.754 | 100.0% | 1 | | -> duration | 0.754 | 0.754 | 0.754 | 0.754 | 0.754 | 0.754 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.753634 Full duration: 11.015782 -------------------------------------------------------------------------------- test scenario CinderVolumeTypes.create_and_list_volume_types args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "description": "rally tests creating types" }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 } } } -------------------------------------------------------------------------------- Task c477076c-7df4-4546-9967-e65eb1959286 has 0 error(s) -------------------------------------------------------------------------------- +---------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | cinder_v2.create_volume_type | 0.565 | 0.565 | 0.565 | 0.565 | 0.565 | 0.565 | 100.0% | 1 | | cinder_v2.list_types | 0.164 | 0.164 | 0.164 | 0.164 | 0.164 | 0.164 | 100.0% | 1 | | total | 0.729 | 0.729 | 0.729 | 0.729 | 0.729 | 0.729 | 100.0% | 1 | | -> duration | 0.729 | 0.729 | 0.729 | 0.729 | 0.729 | 0.729 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | 
+------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.729214 Full duration: 11.226499 -------------------------------------------------------------------------------- test scenario CinderVolumeTypes.create_volume_type_and_encryption_type args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "description": "rally tests creating types", "provider": "LuksEncryptor", "cipher": "aes-xts-plain64", "key_size": 512, "control_location": "front-end" }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 } } } -------------------------------------------------------------------------------- Task c477076c-7df4-4546-9967-e65eb1959286 has 0 error(s) -------------------------------------------------------------------------------- +-------------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +----------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +----------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | cinder_v2.create_volume_type | 0.773 | 0.773 | 0.773 | 0.773 | 0.773 | 0.773 | 100.0% | 1 | | cinder_v2.create_encryption_type | 0.185 | 0.185 | 0.185 | 0.185 | 0.185 | 0.185 | 100.0% | 1 | | total | 0.958 | 0.958 | 0.958 | 0.958 | 0.958 | 0.958 | 100.0% | 1 | | -> duration | 0.958 | 0.958 | 0.958 | 0.958 | 0.958 | 0.958 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +----------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.95843 Full duration: 11.127874 HINTS: * To plot HTML graphics with this data, run: rally task report c477076c-7df4-4546-9967-e65eb1959286 --out output.html * To generate a JUnit report, run: rally task export c477076c-7df4-4546-9967-e65eb1959286 --type junit --to output.xml * To get raw JSON output of task results, run: rally task report c477076c-7df4-4546-9967-e65eb1959286 --json --out output.json 2018-05-25 06:53:55,467 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'results', 'c477076c-7df4-4546-9967-e65eb1959286'] 2018-05-25 06:53:56,570 - functest.opnfv_tests.openstack.rally.rally - DEBUG - saving json file 2018-05-25 06:53:56,570 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'report', 'c477076c-7df4-4546-9967-e65eb1959286', '--out', u'/home/opnfv/functest/results/rally/opnfv-cinder.html'] 2018-05-25 06:53:56,578 - functest.opnfv_tests.openstack.rally.rally - INFO - Test scenario: "cinder" OK. 2018-05-25 06:53:56,579 - functest.opnfv_tests.openstack.rally.rally - INFO - Starting test scenario "heat" ... 2018-05-25 06:53:56,579 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Scenario fetched from : /usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/sanity/opnfv-heat.yaml 2018-05-25 06:53:56,579 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Applying blacklist... 
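[Note] Each scenario is launched through the same wrapper invocation, differing only in the service_list entry of --task-args; the DEBUG lines that follow show the full command for "heat". In principle a single scenario could be replayed by hand with the command exactly as logged. A sketch, assuming the same Functest container, paths and credentials (the remaining task arguments are those shown in the DEBUG line below, elided here for brevity):

rally task start --abort-on-sla-failure \
    --task /usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/task.yaml \
    --task-args "{'smoke': True, 'service_list': ['heat'], ...}"   # remaining args as logged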
2018-05-25 06:53:56,596 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Blacklisted tests: [u'Quotas.nova_update_and_delete'] 2018-05-25 06:53:56,596 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'start', '--abort-on-sla-failure', '--task', '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/task.yaml', '--task-args', "{'smoke': True, 'tmpl_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/templates', 'floating_network': 'external', 'service_list': ['heat'], 'concurrency': 4, 'netid': '290ecdf3-eee5-491f-9897-262e8d1f460e', 'tenants_amount': 3, 'image_name': 'Cirros-0.4.0-072c6be1-d397-4e5f-8d5e-7eaca68dfe70', 'flavor_name': 'rally-tiny-072c6be1-d397-4e5f-8d5e-7eaca68dfe70', 'glance_image_location': '/home/opnfv/functest/images/cirros-0.4.0-x86_64-disk.img', 'use_existing_users': False, 'flavor_alt_name': 'rally-mini-072c6be1-d397-4e5f-8d5e-7eaca68dfe70', 'iterations': 10, 'users_amount': 2, 'sup_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/support', 'glance_image_format': 'qcow2'}"] 2018-05-25 06:55:08,476 - functest.opnfv_tests.openstack.rally.rally - DEBUG - task_id : d4919069-1bd3-4441-83f9-df1f8545b43b 2018-05-25 06:55:08,476 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'detailed', 'd4919069-1bd3-4441-83f9-df1f8545b43b'] 2018-05-25 06:55:09,475 - functest.opnfv_tests.openstack.rally.rally - INFO - -------------------------------------------------------------------------------- Task d4919069-1bd3-4441-83f9-df1f8545b43b: finished -------------------------------------------------------------------------------- test scenario HeatStacks.create_update_delete_stack args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "template_path": "/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/templates/autoscaling_policy.yaml.template", "updated_template_path": "/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/templates/updated_autoscaling_policy_inplace.yaml.template" }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 }, "roles": [ "heat_stack_owner" ] } } -------------------------------------------------------------------------------- Task d4919069-1bd3-4441-83f9-df1f8545b43b has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | heat.create_stack | 4.213 | 4.213 | 4.213 | 4.213 | 4.213 | 4.213 | 100.0% | 1 | | heat.update_stack | 2.588 | 2.588 | 2.588 | 2.588 | 2.588 | 2.588 | 100.0% | 1 | | heat.delete_stack | 2.346 | 2.346 | 2.346 | 2.346 | 2.346 | 2.346 | 100.0% | 1 | | total | 9.147 | 9.147 | 9.147 | 9.147 | 9.147 | 9.147 | 100.0% | 1 | | -> duration | 5.147 | 5.147 | 5.147 | 5.147 | 5.147 | 5.147 | 100.0% | 1 | | -> idle_duration | 4.0 | 4.0 | 4.0 | 4.0 | 4.0 | 4.0 | 100.0% | 1 | 
+-------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 5.146975 Full duration: 17.981953 -------------------------------------------------------------------------------- test scenario HeatStacks.create_check_delete_stack args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "template_path": "/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/templates/random_strings.yaml.template" }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 }, "roles": [ "heat_stack_owner" ] } } -------------------------------------------------------------------------------- Task d4919069-1bd3-4441-83f9-df1f8545b43b has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | heat.create_stack | 4.376 | 4.376 | 4.376 | 4.376 | 4.376 | 4.376 | 100.0% | 1 | | heat.check_stack | 1.375 | 1.375 | 1.375 | 1.375 | 1.375 | 1.375 | 100.0% | 1 | | heat.delete_stack | 2.388 | 2.388 | 2.388 | 2.388 | 2.388 | 2.388 | 100.0% | 1 | | total | 8.14 | 8.14 | 8.14 | 8.14 | 8.14 | 8.14 | 100.0% | 1 | | -> duration | 6.14 | 6.14 | 6.14 | 6.14 | 6.14 | 6.14 | 100.0% | 1 | | -> idle_duration | 2.0 | 2.0 | 2.0 | 2.0 | 2.0 | 2.0 | 100.0% | 1 | +-------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 6.139715 Full duration: 15.581301 -------------------------------------------------------------------------------- test scenario HeatStacks.create_suspend_resume_delete_stack args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "template_path": "/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/templates/random_strings.yaml.template" }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 }, "roles": [ "heat_stack_owner" ] } } -------------------------------------------------------------------------------- Task d4919069-1bd3-4441-83f9-df1f8545b43b has 0 error(s) -------------------------------------------------------------------------------- +-----------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +--------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +--------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | heat.create_stack | 4.328 | 4.328 | 4.328 | 4.328 | 4.328 | 4.328 | 100.0% | 1 | | heat.suspend_stack | 0.492 | 0.492 | 0.492 | 0.492 | 0.492 | 0.492 | 100.0% | 1 | | heat.resume_stack | 1.367 | 1.367 | 1.367 | 1.367 | 1.367 | 1.367 | 100.0% | 1 | | heat.delete_stack | 2.278 | 2.278 | 2.278 
| 2.278 | 2.278 | 2.278 | 100.0% | 1 | | total | 8.465 | 8.465 | 8.465 | 8.465 | 8.465 | 8.465 | 100.0% | 1 | | -> duration | 6.465 | 6.465 | 6.465 | 6.465 | 6.465 | 6.465 | 100.0% | 1 | | -> idle_duration | 2.0 | 2.0 | 2.0 | 2.0 | 2.0 | 2.0 | 100.0% | 1 | +--------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 6.464602 Full duration: 16.024037 -------------------------------------------------------------------------------- test scenario HeatStacks.list_stacks_and_resources args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": {}, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 }, "roles": [ "heat_stack_owner" ] } } -------------------------------------------------------------------------------- Task d4919069-1bd3-4441-83f9-df1f8545b43b has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | heat.list_stacks | 0.546 | 0.546 | 0.546 | 0.546 | 0.546 | 0.546 | 100.0% | 1 | | total | 0.547 | 0.547 | 0.547 | 0.547 | 0.547 | 0.547 | 100.0% | 1 | | -> duration | 0.547 | 0.547 | 0.547 | 0.547 | 0.547 | 0.547 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.546528 Full duration: 7.439005 HINTS: * To plot HTML graphics with this data, run: rally task report d4919069-1bd3-4441-83f9-df1f8545b43b --out output.html * To generate a JUnit report, run: rally task export d4919069-1bd3-4441-83f9-df1f8545b43b --type junit --to output.xml * To get raw JSON output of task results, run: rally task report d4919069-1bd3-4441-83f9-df1f8545b43b --json --out output.json 2018-05-25 06:55:09,476 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'results', 'd4919069-1bd3-4441-83f9-df1f8545b43b'] 2018-05-25 06:55:10,580 - functest.opnfv_tests.openstack.rally.rally - DEBUG - saving json file 2018-05-25 06:55:10,581 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'report', 'd4919069-1bd3-4441-83f9-df1f8545b43b', '--out', u'/home/opnfv/functest/results/rally/opnfv-heat.html'] 2018-05-25 06:55:10,588 - functest.opnfv_tests.openstack.rally.rally - INFO - Test scenario: "heat" OK. 2018-05-25 06:55:10,588 - functest.opnfv_tests.openstack.rally.rally - INFO - Starting test scenario "keystone" ... 2018-05-25 06:55:10,588 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Scenario fetched from : /usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/opnfv-keystone.yaml 2018-05-25 06:55:10,588 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Applying blacklist... 
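[Note] Once a task finishes, the wrapper fetches the per-action response times with 'rally task detailed <task_id>', as seen throughout this log. The same commands can be run interactively to check the tasks launched so far; a sketch, assuming the standard rally CLI:

rally task list                                             # all tasks with their status
rally task detailed d4919069-1bd3-4441-83f9-df1f8545b43b    # per-action times for the heat task printed above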
2018-05-25 06:55:10,606 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Blacklisted tests: [u'Quotas.nova_update_and_delete'] 2018-05-25 06:55:10,607 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'start', '--abort-on-sla-failure', '--task', '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/task.yaml', '--task-args', "{'smoke': True, 'tmpl_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/templates', 'floating_network': 'external', 'service_list': ['keystone'], 'concurrency': 4, 'netid': '290ecdf3-eee5-491f-9897-262e8d1f460e', 'tenants_amount': 3, 'image_name': 'Cirros-0.4.0-072c6be1-d397-4e5f-8d5e-7eaca68dfe70', 'flavor_name': 'rally-tiny-072c6be1-d397-4e5f-8d5e-7eaca68dfe70', 'glance_image_location': '/home/opnfv/functest/images/cirros-0.4.0-x86_64-disk.img', 'use_existing_users': False, 'flavor_alt_name': 'rally-mini-072c6be1-d397-4e5f-8d5e-7eaca68dfe70', 'iterations': 10, 'users_amount': 2, 'sup_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/support', 'glance_image_format': 'qcow2'}"] 2018-05-25 06:58:07,610 - functest.opnfv_tests.openstack.rally.rally - DEBUG - task_id : 887ce1c5-741d-40f2-b5d1-2f642b396369 2018-05-25 06:58:07,610 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'detailed', '887ce1c5-741d-40f2-b5d1-2f642b396369'] 2018-05-25 06:58:08,654 - functest.opnfv_tests.openstack.rally.rally - INFO - -------------------------------------------------------------------------------- Task 887ce1c5-741d-40f2-b5d1-2f642b396369: finished -------------------------------------------------------------------------------- test scenario KeystoneBasic.add_and_remove_user_role args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": {}, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 } } } -------------------------------------------------------------------------------- Task 887ce1c5-741d-40f2-b5d1-2f642b396369 has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | keystone_v3.create_role | 0.466 | 0.466 | 0.466 | 0.466 | 0.466 | 0.466 | 100.0% | 1 | | keystone_v3.add_role | 0.104 | 0.104 | 0.104 | 0.104 | 0.104 | 0.104 | 100.0% | 1 | | keystone_v3.revoke_role | 0.12 | 0.12 | 0.12 | 0.12 | 0.12 | 0.12 | 100.0% | 1 | | total | 0.691 | 0.691 | 0.691 | 0.691 | 0.691 | 0.691 | 100.0% | 1 | | -> duration | 0.691 | 0.691 | 0.691 | 0.691 | 0.691 | 0.691 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.691061 Full duration: 13.452509 -------------------------------------------------------------------------------- test scenario KeystoneBasic.create_add_and_list_user_roles args position 0 args values: { 
"runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": {}, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 } } } -------------------------------------------------------------------------------- Task 887ce1c5-741d-40f2-b5d1-2f642b396369 has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | keystone_v3.create_role | 0.466 | 0.466 | 0.466 | 0.466 | 0.466 | 0.466 | 100.0% | 1 | | keystone_v3.add_role | 0.11 | 0.11 | 0.11 | 0.11 | 0.11 | 0.11 | 100.0% | 1 | | keystone_v3.list_roles | 0.093 | 0.093 | 0.093 | 0.093 | 0.093 | 0.093 | 100.0% | 1 | | total | 0.668 | 0.668 | 0.668 | 0.668 | 0.668 | 0.668 | 100.0% | 1 | | -> duration | 0.668 | 0.668 | 0.668 | 0.668 | 0.668 | 0.668 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.668178 Full duration: 13.081484 -------------------------------------------------------------------------------- test scenario KeystoneBasic.create_and_list_tenants args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": {}, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 } } } -------------------------------------------------------------------------------- Task 887ce1c5-741d-40f2-b5d1-2f642b396369 has 0 error(s) -------------------------------------------------------------------------------- +-------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | keystone_v3.create_project | 0.702 | 0.702 | 0.702 | 0.702 | 0.702 | 0.702 | 100.0% | 1 | | keystone_v3.list_projects | 0.076 | 0.076 | 0.076 | 0.076 | 0.076 | 0.076 | 100.0% | 1 | | total | 0.779 | 0.779 | 0.779 | 0.779 | 0.779 | 0.779 | 100.0% | 1 | | -> duration | 0.779 | 0.779 | 0.779 | 0.779 | 0.779 | 0.779 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.778736 Full duration: 13.711578 -------------------------------------------------------------------------------- test scenario KeystoneBasic.create_and_delete_role args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": {}, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, 
"users_per_tenant": 1 } } } -------------------------------------------------------------------------------- Task 887ce1c5-741d-40f2-b5d1-2f642b396369 has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | keystone_v3.create_role | 0.473 | 0.473 | 0.473 | 0.473 | 0.473 | 0.473 | 100.0% | 1 | | keystone_v3.delete_role | 0.104 | 0.104 | 0.104 | 0.104 | 0.104 | 0.104 | 100.0% | 1 | | total | 0.577 | 0.577 | 0.577 | 0.577 | 0.577 | 0.577 | 100.0% | 1 | | -> duration | 0.577 | 0.577 | 0.577 | 0.577 | 0.577 | 0.577 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.57719 Full duration: 11.211544 -------------------------------------------------------------------------------- test scenario KeystoneBasic.create_and_delete_service args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": {}, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 } } } -------------------------------------------------------------------------------- Task 887ce1c5-741d-40f2-b5d1-2f642b396369 has 0 error(s) -------------------------------------------------------------------------------- +-------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | keystone_v3.create_service | 0.462 | 0.462 | 0.462 | 0.462 | 0.462 | 0.462 | 100.0% | 1 | | keystone_v3.delete_service | 0.092 | 0.092 | 0.092 | 0.092 | 0.092 | 0.092 | 100.0% | 1 | | total | 0.554 | 0.554 | 0.554 | 0.554 | 0.554 | 0.554 | 100.0% | 1 | | -> duration | 0.554 | 0.554 | 0.554 | 0.554 | 0.554 | 0.554 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.553651 Full duration: 11.076583 -------------------------------------------------------------------------------- test scenario KeystoneBasic.get_entities args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": {}, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 } } } -------------------------------------------------------------------------------- Task 887ce1c5-741d-40f2-b5d1-2f642b396369 has 0 error(s) -------------------------------------------------------------------------------- 
+-------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | keystone_v3.create_project | 0.619 | 0.619 | 0.619 | 0.619 | 0.619 | 0.619 | 100.0% | 1 | | keystone_v3.create_user | 0.425 | 0.425 | 0.425 | 0.425 | 0.425 | 0.425 | 100.0% | 1 | | -> keystone_v3.list_roles | 0.075 | 0.075 | 0.075 | 0.075 | 0.075 | 0.075 | 100.0% | 1 | | -> keystone_v3.add_role | 0.107 | 0.107 | 0.107 | 0.107 | 0.107 | 0.107 | 100.0% | 1 | | keystone_v3.create_role | 0.088 | 0.088 | 0.088 | 0.088 | 0.088 | 0.088 | 100.0% | 1 | | keystone_v3.get_project | 0.083 | 0.083 | 0.083 | 0.083 | 0.083 | 0.083 | 100.0% | 1 | | keystone_v3.get_user | 0.088 | 0.088 | 0.088 | 0.088 | 0.088 | 0.088 | 100.0% | 1 | | keystone_v3.get_role | 0.075 | 0.075 | 0.075 | 0.075 | 0.075 | 0.075 | 100.0% | 1 | | keystone_v3.list_services | 0.075 | 0.075 | 0.075 | 0.075 | 0.075 | 0.075 | 100.0% | 1 | | keystone_v3.get_services | 0.076 | 0.076 | 0.076 | 0.076 | 0.076 | 0.076 | 100.0% | 1 | | total | 1.619 | 1.619 | 1.619 | 1.619 | 1.619 | 1.619 | 100.0% | 1 | | -> duration | 1.619 | 1.619 | 1.619 | 1.619 | 1.619 | 1.619 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.618915 Full duration: 18.217961 -------------------------------------------------------------------------------- test scenario KeystoneBasic.create_update_and_delete_tenant args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": {}, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 } } } -------------------------------------------------------------------------------- Task 887ce1c5-741d-40f2-b5d1-2f642b396369 has 0 error(s) -------------------------------------------------------------------------------- +-------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | keystone_v3.create_project | 0.639 | 0.639 | 0.639 | 0.639 | 0.639 | 0.639 | 100.0% | 1 | | keystone_v3.update_project | 0.096 | 0.096 | 0.096 | 0.096 | 0.096 | 0.096 | 100.0% | 1 | | keystone_v3.delete_project | 0.122 | 0.122 | 0.122 | 0.122 | 0.122 | 0.122 | 100.0% | 1 | | total | 0.857 | 0.857 | 0.857 | 0.857 | 0.857 | 0.857 | 100.0% | 1 | | -> duration | 0.857 | 0.857 | 0.857 | 0.857 | 0.857 | 0.857 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.856752 
Full duration: 11.498177 -------------------------------------------------------------------------------- test scenario KeystoneBasic.create_user args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": {}, "sla": { "failure_rate": { "max": 0 } }, "context": {} } -------------------------------------------------------------------------------- Task 887ce1c5-741d-40f2-b5d1-2f642b396369 has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | keystone_v3.create_user | 0.622 | 0.622 | 0.622 | 0.622 | 0.622 | 0.622 | 100.0% | 1 | | total | 0.709 | 0.709 | 0.709 | 0.709 | 0.709 | 0.709 | 100.0% | 1 | | -> duration | 0.709 | 0.709 | 0.709 | 0.709 | 0.709 | 0.709 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.709466 Full duration: 13.184057 -------------------------------------------------------------------------------- test scenario KeystoneBasic.create_tenant args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": {}, "sla": { "failure_rate": { "max": 0 } }, "context": {} } -------------------------------------------------------------------------------- Task 887ce1c5-741d-40f2-b5d1-2f642b396369 has 0 error(s) -------------------------------------------------------------------------------- +-------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | keystone_v3.create_project | 0.614 | 0.614 | 0.614 | 0.614 | 0.614 | 0.614 | 100.0% | 1 | | total | 0.615 | 0.615 | 0.615 | 0.615 | 0.615 | 0.615 | 100.0% | 1 | | -> duration | 0.615 | 0.615 | 0.615 | 0.615 | 0.615 | 0.615 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +----------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.614784 Full duration: 13.449371 -------------------------------------------------------------------------------- test scenario KeystoneBasic.create_and_list_users args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": {}, "sla": { "failure_rate": { "max": 0 } }, "context": {} } -------------------------------------------------------------------------------- Task 887ce1c5-741d-40f2-b5d1-2f642b396369 has 0 error(s) -------------------------------------------------------------------------------- 
+----------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | keystone_v3.create_user | 0.628 | 0.628 | 0.628 | 0.628 | 0.628 | 0.628 | 100.0% | 1 | | keystone_v3.list_users | 0.174 | 0.174 | 0.174 | 0.174 | 0.174 | 0.174 | 100.0% | 1 | | total | 0.891 | 0.891 | 0.891 | 0.891 | 0.891 | 0.891 | 100.0% | 1 | | -> duration | 0.891 | 0.891 | 0.891 | 0.891 | 0.891 | 0.891 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.891061 Full duration: 13.448793 -------------------------------------------------------------------------------- test scenario KeystoneBasic.create_tenant_with_users args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "users_per_tenant": 10 }, "sla": { "failure_rate": { "max": 0 } }, "context": {} } -------------------------------------------------------------------------------- Task 887ce1c5-741d-40f2-b5d1-2f642b396369 has 0 error(s) -------------------------------------------------------------------------------- +--------------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-----------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-----------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | keystone_v3.create_project | 0.615 | 0.615 | 0.615 | 0.615 | 0.615 | 0.615 | 100.0% | 1 | | keystone_v3.create_users | 4.375 | 4.375 | 4.375 | 4.375 | 4.375 | 4.375 | 100.0% | 1 | | -> keystone_v3.create_user (x10) | 4.374 | 4.374 | 4.374 | 4.374 | 4.374 | 4.374 | 100.0% | 1 | | --> keystone_v3.list_roles (x10) | 0.756 | 0.756 | 0.756 | 0.756 | 0.756 | 0.756 | 100.0% | 1 | | --> keystone_v3.add_role (x10) | 1.102 | 1.102 | 1.102 | 1.102 | 1.102 | 1.102 | 100.0% | 1 | | total | 5.875 | 5.875 | 5.875 | 5.875 | 5.875 | 5.875 | 100.0% | 1 | | -> duration | 5.875 | 5.875 | 5.875 | 5.875 | 5.875 | 5.875 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-----------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 5.874602 Full duration: 22.846669 HINTS: * To plot HTML graphics with this data, run: rally task report 887ce1c5-741d-40f2-b5d1-2f642b396369 --out output.html * To generate a JUnit report, run: rally task export 887ce1c5-741d-40f2-b5d1-2f642b396369 --type junit --to output.xml * To get raw JSON output of task results, run: rally task report 887ce1c5-741d-40f2-b5d1-2f642b396369 --json --out output.json 2018-05-25 06:58:08,655 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'results', 
'887ce1c5-741d-40f2-b5d1-2f642b396369'] 2018-05-25 06:58:09,705 - functest.opnfv_tests.openstack.rally.rally - DEBUG - saving json file 2018-05-25 06:58:09,706 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'report', '887ce1c5-741d-40f2-b5d1-2f642b396369', '--out', u'/home/opnfv/functest/results/rally/opnfv-keystone.html'] 2018-05-25 06:58:09,713 - functest.opnfv_tests.openstack.rally.rally - INFO - Test scenario: "keystone" OK. 2018-05-25 06:58:09,714 - functest.opnfv_tests.openstack.rally.rally - INFO - Starting test scenario "neutron" ... 2018-05-25 06:58:09,714 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Scenario fetched from : /usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/sanity/opnfv-neutron.yaml 2018-05-25 06:58:09,714 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Applying blacklist... 2018-05-25 06:58:09,731 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Blacklisted tests: [u'Quotas.nova_update_and_delete'] 2018-05-25 06:58:09,732 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'start', '--abort-on-sla-failure', '--task', '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/task.yaml', '--task-args', "{'smoke': True, 'tmpl_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/templates', 'floating_network': 'external', 'service_list': ['neutron'], 'concurrency': 4, 'netid': '290ecdf3-eee5-491f-9897-262e8d1f460e', 'tenants_amount': 3, 'image_name': 'Cirros-0.4.0-072c6be1-d397-4e5f-8d5e-7eaca68dfe70', 'flavor_name': 'rally-tiny-072c6be1-d397-4e5f-8d5e-7eaca68dfe70', 'glance_image_location': '/home/opnfv/functest/images/cirros-0.4.0-x86_64-disk.img', 'use_existing_users': False, 'flavor_alt_name': 'rally-mini-072c6be1-d397-4e5f-8d5e-7eaca68dfe70', 'iterations': 10, 'users_amount': 2, 'sup_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/support', 'glance_image_format': 'qcow2'}"] 2018-05-25 07:02:25,706 - functest.opnfv_tests.openstack.rally.rally - DEBUG - task_id : a12d2510-995a-4e2b-9496-124b64314d64 2018-05-25 07:02:25,707 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'detailed', 'a12d2510-995a-4e2b-9496-124b64314d64'] 2018-05-25 07:02:26,663 - functest.opnfv_tests.openstack.rally.rally - INFO - -------------------------------------------------------------------------------- Task a12d2510-995a-4e2b-9496-124b64314d64: finished -------------------------------------------------------------------------------- test scenario NeutronNetworks.create_and_delete_networks args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "network_create_args": {} }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 }, "quotas": { "neutron": { "network": -1 } } } } -------------------------------------------------------------------------------- Task a12d2510-995a-4e2b-9496-124b64314d64 has 0 error(s) -------------------------------------------------------------------------------- +---------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) 
| Success | Count | +------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | neutron.create_network | 0.892 | 0.892 | 0.892 | 0.892 | 0.892 | 0.892 | 100.0% | 1 | | neutron.delete_network | 0.633 | 0.633 | 0.633 | 0.633 | 0.633 | 0.633 | 100.0% | 1 | | total | 1.525 | 1.525 | 1.525 | 1.525 | 1.525 | 1.525 | 100.0% | 1 | | -> duration | 1.525 | 1.525 | 1.525 | 1.525 | 1.525 | 1.525 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.524789 Full duration: 13.699532 -------------------------------------------------------------------------------- test scenario NeutronNetworks.create_and_delete_ports args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "network_create_args": {}, "port_create_args": {}, "ports_per_network": 1 }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 }, "network": {}, "quotas": { "neutron": { "network": -1, "port": -1 } } } } -------------------------------------------------------------------------------- Task a12d2510-995a-4e2b-9496-124b64314d64 has 0 error(s) -------------------------------------------------------------------------------- +------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +---------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +---------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | neutron.create_port | 1.593 | 1.593 | 1.593 | 1.593 | 1.593 | 1.593 | 100.0% | 1 | | neutron.delete_port | 0.49 | 0.49 | 0.49 | 0.49 | 0.49 | 0.49 | 100.0% | 1 | | total | 2.084 | 2.084 | 2.084 | 2.084 | 2.084 | 2.084 | 100.0% | 1 | | -> duration | 2.084 | 2.084 | 2.084 | 2.084 | 2.084 | 2.084 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +---------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 2.083567 Full duration: 22.721224 -------------------------------------------------------------------------------- test scenario NeutronNetworks.create_and_delete_routers args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "network_create_args": {}, "router_create_args": {}, "subnet_cidr_start": "1.1.0.0/30", "subnet_create_args": {}, "subnets_per_network": 1 }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 }, "network": {}, "quotas": { "neutron": { "network": -1, "subnet": -1, "port": -1, "router": -1 } } } } -------------------------------------------------------------------------------- Task a12d2510-995a-4e2b-9496-124b64314d64 has 0 error(s) -------------------------------------------------------------------------------- +------------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | 
+---------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +---------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | neutron.create_network | 0.761 | 0.761 | 0.761 | 0.761 | 0.761 | 0.761 | 100.0% | 1 | | neutron.create_subnet | 0.594 | 0.594 | 0.594 | 0.594 | 0.594 | 0.594 | 100.0% | 1 | | neutron.create_router | 0.151 | 0.151 | 0.151 | 0.151 | 0.151 | 0.151 | 100.0% | 1 | | neutron.add_interface_router | 1.51 | 1.51 | 1.51 | 1.51 | 1.51 | 1.51 | 100.0% | 1 | | neutron.remove_interface_router | 0.959 | 0.959 | 0.959 | 0.959 | 0.959 | 0.959 | 100.0% | 1 | | neutron.delete_router | 0.23 | 0.23 | 0.23 | 0.23 | 0.23 | 0.23 | 100.0% | 1 | | total | 4.205 | 4.205 | 4.205 | 4.205 | 4.205 | 4.205 | 100.0% | 1 | | -> duration | 4.205 | 4.205 | 4.205 | 4.205 | 4.205 | 4.205 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +---------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 4.205075 Full duration: 28.754821 -------------------------------------------------------------------------------- test scenario NeutronNetworks.create_and_delete_subnets args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "network_create_args": {}, "subnet_cidr_start": "1.1.0.0/30", "subnet_create_args": {}, "subnets_per_network": 1 }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 }, "network": {}, "quotas": { "neutron": { "network": -1, "subnet": -1 } } } } -------------------------------------------------------------------------------- Task a12d2510-995a-4e2b-9496-124b64314d64 has 0 error(s) -------------------------------------------------------------------------------- +--------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | neutron.create_subnet | 1.18 | 1.18 | 1.18 | 1.18 | 1.18 | 1.18 | 100.0% | 1 | | neutron.delete_subnet | 0.377 | 0.377 | 0.377 | 0.377 | 0.377 | 0.377 | 100.0% | 1 | | total | 1.558 | 1.558 | 1.558 | 1.558 | 1.558 | 1.558 | 100.0% | 1 | | -> duration | 1.558 | 1.558 | 1.558 | 1.558 | 1.558 | 1.558 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.558002 Full duration: 21.74969 -------------------------------------------------------------------------------- test scenario NeutronNetworks.create_and_list_networks args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "network_create_args": {} }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 }, "quotas": { "neutron": { "network": -1 } } } } 
-------------------------------------------------------------------------------- Task a12d2510-995a-4e2b-9496-124b64314d64 has 0 error(s) -------------------------------------------------------------------------------- +---------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | neutron.create_network | 0.978 | 0.978 | 0.978 | 0.978 | 0.978 | 0.978 | 100.0% | 1 | | neutron.list_networks | 0.155 | 0.155 | 0.155 | 0.155 | 0.155 | 0.155 | 100.0% | 1 | | total | 1.134 | 1.134 | 1.134 | 1.134 | 1.134 | 1.134 | 100.0% | 1 | | -> duration | 1.134 | 1.134 | 1.134 | 1.134 | 1.134 | 1.134 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.133526 Full duration: 15.268924 -------------------------------------------------------------------------------- test scenario NeutronNetworks.create_and_list_ports args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "network_create_args": {}, "port_create_args": {}, "ports_per_network": 1 }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 }, "network": {}, "quotas": { "neutron": { "network": -1, "port": -1 } } } } -------------------------------------------------------------------------------- Task a12d2510-995a-4e2b-9496-124b64314d64 has 0 error(s) -------------------------------------------------------------------------------- +------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +---------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +---------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | neutron.create_port | 1.379 | 1.379 | 1.379 | 1.379 | 1.379 | 1.379 | 100.0% | 1 | | neutron.list_ports | 0.083 | 0.083 | 0.083 | 0.083 | 0.083 | 0.083 | 100.0% | 1 | | total | 1.463 | 1.463 | 1.463 | 1.463 | 1.463 | 1.463 | 100.0% | 1 | | -> duration | 1.463 | 1.463 | 1.463 | 1.463 | 1.463 | 1.463 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +---------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.462819 Full duration: 24.091716 -------------------------------------------------------------------------------- test scenario NeutronNetworks.create_and_list_routers args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "network_create_args": {}, "router_create_args": {}, "subnet_cidr_start": "1.1.0.0/30", "subnet_create_args": {}, "subnets_per_network": 1 }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 }, 
"network": {}, "quotas": { "neutron": { "network": -1, "subnet": -1, "router": -1 } } } } -------------------------------------------------------------------------------- Task a12d2510-995a-4e2b-9496-124b64314d64 has 0 error(s) -------------------------------------------------------------------------------- +---------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | neutron.create_network | 0.735 | 0.735 | 0.735 | 0.735 | 0.735 | 0.735 | 100.0% | 1 | | neutron.create_subnet | 0.57 | 0.57 | 0.57 | 0.57 | 0.57 | 0.57 | 100.0% | 1 | | neutron.create_router | 0.14 | 0.14 | 0.14 | 0.14 | 0.14 | 0.14 | 100.0% | 1 | | neutron.add_interface_router | 1.459 | 1.459 | 1.459 | 1.459 | 1.459 | 1.459 | 100.0% | 1 | | neutron.list_routers | 0.065 | 0.065 | 0.065 | 0.065 | 0.065 | 0.065 | 100.0% | 1 | | total | 2.968 | 2.968 | 2.968 | 2.968 | 2.968 | 2.968 | 100.0% | 1 | | -> duration | 2.968 | 2.968 | 2.968 | 2.968 | 2.968 | 2.968 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 2.968298 Full duration: 31.795957 -------------------------------------------------------------------------------- test scenario NeutronNetworks.create_and_list_subnets args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "network_create_args": {}, "subnet_cidr_start": "1.1.0.0/30", "subnet_create_args": {}, "subnets_per_network": 1 }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 }, "network": {}, "quotas": { "neutron": { "network": -1, "subnet": -1 } } } } -------------------------------------------------------------------------------- Task a12d2510-995a-4e2b-9496-124b64314d64 has 0 error(s) -------------------------------------------------------------------------------- +---------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | neutron.create_network | 0.895 | 0.895 | 0.895 | 0.895 | 0.895 | 0.895 | 100.0% | 1 | | neutron.create_subnet | 0.449 | 0.449 | 0.449 | 0.449 | 0.449 | 0.449 | 100.0% | 1 | | neutron.list_subnets | 0.09 | 0.09 | 0.09 | 0.09 | 0.09 | 0.09 | 100.0% | 1 | | total | 1.433 | 1.433 | 1.433 | 1.433 | 1.433 | 1.433 | 100.0% | 1 | | -> duration | 1.433 | 1.433 | 1.433 | 1.433 | 1.433 | 1.433 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.4334 
Full duration: 25.845486 -------------------------------------------------------------------------------- test scenario NeutronSecurityGroup.create_and_delete_security_groups args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": {}, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 }, "quotas": { "neutron": { "security_group": -1 } } } } -------------------------------------------------------------------------------- Task a12d2510-995a-4e2b-9496-124b64314d64 has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | neutron.create_security_group | 0.881 | 0.881 | 0.881 | 0.881 | 0.881 | 0.881 | 100.0% | 1 | | neutron.delete_security_group | 0.109 | 0.109 | 0.109 | 0.109 | 0.109 | 0.109 | 100.0% | 1 | | total | 0.99 | 0.99 | 0.99 | 0.99 | 0.99 | 0.99 | 100.0% | 1 | | -> duration | 0.99 | 0.99 | 0.99 | 0.99 | 0.99 | 0.99 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.989857 Full duration: 12.779277 -------------------------------------------------------------------------------- test scenario NeutronSecurityGroup.create_and_delete_security_group_rule args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": {}, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 }, "quotas": { "neutron": { "security_group": -1 } } } } -------------------------------------------------------------------------------- Task a12d2510-995a-4e2b-9496-124b64314d64 has 0 error(s) -------------------------------------------------------------------------------- +---------------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +------------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +------------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | neutron.create_security_group | 0.815 | 0.815 | 0.815 | 0.815 | 0.815 | 0.815 | 100.0% | 1 | | neutron.create_security_group_rule | 0.137 | 0.137 | 0.137 | 0.137 | 0.137 | 0.137 | 100.0% | 1 | | neutron.delete_security_group_rule | 0.236 | 0.236 | 0.236 | 0.236 | 0.236 | 0.236 | 100.0% | 1 | | neutron.delete_security_group | 0.11 | 0.11 | 0.11 | 0.11 | 0.11 | 0.11 | 100.0% | 1 | | total | 1.298 | 1.298 | 1.298 | 1.298 | 1.298 | 1.298 | 100.0% | 1 | | -> duration | 1.298 | 1.298 | 1.298 | 1.298 | 1.298 | 1.298 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | 
+------------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 1.297572 Full duration: 13.313848 -------------------------------------------------------------------------------- test scenario NeutronNetworks.set_and_clear_router_gateway args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "network_create_args": { "router:external": true } }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 }, "quotas": { "neutron": { "network": -1, "router": -1 } }, "roles": [ "admin" ] } } -------------------------------------------------------------------------------- Task a12d2510-995a-4e2b-9496-124b64314d64 has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | neutron.create_network | 0.893 | 0.893 | 0.893 | 0.893 | 0.893 | 0.893 | 100.0% | 1 | | neutron.create_router | 0.133 | 0.133 | 0.133 | 0.133 | 0.133 | 0.133 | 100.0% | 1 | | neutron.add_gateway_router | 1.127 | 1.127 | 1.127 | 1.127 | 1.127 | 1.127 | 100.0% | 1 | | neutron.remove_gateway_router | 0.859 | 0.859 | 0.859 | 0.859 | 0.859 | 0.859 | 100.0% | 1 | | total | 3.012 | 3.012 | 3.012 | 3.012 | 3.012 | 3.012 | 100.0% | 1 | | -> duration | 3.012 | 3.012 | 3.012 | 3.012 | 3.012 | 3.012 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +-------------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 3.011868 Full duration: 22.035809 HINTS: * To plot HTML graphics with this data, run: rally task report a12d2510-995a-4e2b-9496-124b64314d64 --out output.html * To generate a JUnit report, run: rally task export a12d2510-995a-4e2b-9496-124b64314d64 --type junit --to output.xml * To get raw JSON output of task results, run: rally task report a12d2510-995a-4e2b-9496-124b64314d64 --json --out output.json 2018-05-25 07:02:26,664 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'results', 'a12d2510-995a-4e2b-9496-124b64314d64'] 2018-05-25 07:02:27,686 - functest.opnfv_tests.openstack.rally.rally - DEBUG - saving json file 2018-05-25 07:02:27,687 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'report', 'a12d2510-995a-4e2b-9496-124b64314d64', '--out', u'/home/opnfv/functest/results/rally/opnfv-neutron.html'] 2018-05-25 07:02:27,694 - functest.opnfv_tests.openstack.rally.rally - INFO - Test scenario: "neutron" OK. 2018-05-25 07:02:27,694 - functest.opnfv_tests.openstack.rally.rally - INFO - Starting test scenario "nova" ... 
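Each scenario block above (keystone, neutron, and the nova run announced next) follows the same driver pattern visible in the "running command" DEBUG lines: start the task with --abort-on-sla-failure and a generated --task-args payload, print the detailed results, then write an HTML report next to the JSON one. A minimal sketch of that loop under those assumptions — the helper name run_rally_scenario and the regex used to recover the task UUID are illustrative, not functest's actual code:

    import json
    import re
    import subprocess

    RALLY_TASK = '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/task.yaml'

    def run_rally_scenario(task_args, report_path):
        # Start the task; --abort-on-sla-failure stops the run on the first SLA
        # breach, matching the 'running command' DEBUG lines above.
        output = subprocess.check_output(
            ['rally', 'task', 'start', '--abort-on-sla-failure',
             '--task', RALLY_TASK, '--task-args', json.dumps(task_args)],
            universal_newlines=True)
        # Recover the task UUID from the CLI output (functest logs it as
        # 'task_id : ...'); the regex approach here is only an assumption.
        task_id = re.findall(
            r'[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}',
            output)[-1]
        subprocess.check_call(['rally', 'task', 'detailed', task_id])
        subprocess.check_call(['rally', 'task', 'report', task_id, '--out', report_path])
        return task_id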
2018-05-25 07:02:27,695 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Scenario fetched from : /usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/sanity/opnfv-nova.yaml 2018-05-25 07:02:27,695 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Applying blacklist... 2018-05-25 07:02:27,712 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Blacklisted tests: [u'Quotas.nova_update_and_delete'] 2018-05-25 07:02:27,713 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'start', '--abort-on-sla-failure', '--task', '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/task.yaml', '--task-args', "{'smoke': True, 'tmpl_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/templates', 'floating_network': 'external', 'service_list': ['nova'], 'concurrency': 4, 'netid': '290ecdf3-eee5-491f-9897-262e8d1f460e', 'tenants_amount': 3, 'image_name': 'Cirros-0.4.0-072c6be1-d397-4e5f-8d5e-7eaca68dfe70', 'flavor_name': 'rally-tiny-072c6be1-d397-4e5f-8d5e-7eaca68dfe70', 'glance_image_location': '/home/opnfv/functest/images/cirros-0.4.0-x86_64-disk.img', 'use_existing_users': False, 'flavor_alt_name': 'rally-mini-072c6be1-d397-4e5f-8d5e-7eaca68dfe70', 'iterations': 10, 'users_amount': 2, 'sup_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/support', 'glance_image_format': 'qcow2'}"] 2018-05-25 07:08:29,074 - functest.opnfv_tests.openstack.rally.rally - DEBUG - task_id : a3106a39-5e0d-462b-b711-9567159a1bf9 2018-05-25 07:08:29,075 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'detailed', 'a3106a39-5e0d-462b-b711-9567159a1bf9'] 2018-05-25 07:08:30,118 - functest.opnfv_tests.openstack.rally.rally - INFO - -------------------------------------------------------------------------------- Task a3106a39-5e0d-462b-b711-9567159a1bf9: finished -------------------------------------------------------------------------------- test scenario NovaServers.boot_and_live_migrate_server args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "flavor": { "name": "rally-tiny-072c6be1-d397-4e5f-8d5e-7eaca68dfe70" }, "image": { "name": "Cirros-0.4.0-072c6be1-d397-4e5f-8d5e-7eaca68dfe70" }, "block_migration": false, "nics": [ { "net-id": "290ecdf3-eee5-491f-9897-262e8d1f460e" } ] }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 } } } -------------------------------------------------------------------------------- Task a3106a39-5e0d-462b-b711-9567159a1bf9 has 0 error(s) -------------------------------------------------------------------------------- +------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +---------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +---------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | nova.boot_server | 8.191 | 8.191 | 8.191 | 8.191 | 8.191 | 8.191 | 100.0% | 1 | | nova.find_host_to_migrate | 1.017 | 1.017 | 1.017 | 1.017 | 1.017 | 1.017 | 100.0% | 1 | | nova.live_migrate | 8.818 | 8.818 | 8.818 | 8.818 | 8.818 | 8.818 | 100.0% | 1 | | nova.delete_server 
| 2.367 | 2.367 | 2.367 | 2.367 | 2.367 | 2.367 | 100.0% | 1 | | total | 20.393 | 20.393 | 20.393 | 20.393 | 20.393 | 20.393 | 100.0% | 1 | | -> duration | 19.393 | 19.393 | 19.393 | 19.393 | 19.393 | 19.393 | 100.0% | 1 | | -> idle_duration | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 100.0% | 1 | +---------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 19.393188 Full duration: 29.131507 -------------------------------------------------------------------------------- test scenario NovaServers.boot_server_attach_created_volume_and_live_migrate args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "flavor": { "name": "rally-tiny-072c6be1-d397-4e5f-8d5e-7eaca68dfe70" }, "image": { "name": "Cirros-0.4.0-072c6be1-d397-4e5f-8d5e-7eaca68dfe70" }, "size": 10, "block_migration": false, "boot_server_kwargs": { "nics": [ { "net-id": "290ecdf3-eee5-491f-9897-262e8d1f460e" } ] } }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 } } } -------------------------------------------------------------------------------- Task a3106a39-5e0d-462b-b711-9567159a1bf9 has 0 error(s) -------------------------------------------------------------------------------- +------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +---------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +---------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | nova.boot_server | 9.821 | 9.821 | 9.821 | 9.821 | 9.821 | 9.821 | 100.0% | 1 | | cinder_v2.create_volume | 2.715 | 2.715 | 2.715 | 2.715 | 2.715 | 2.715 | 100.0% | 1 | | nova.attach_volume | 4.082 | 4.082 | 4.082 | 4.082 | 4.082 | 4.082 | 100.0% | 1 | | nova.find_host_to_migrate | 1.01 | 1.01 | 1.01 | 1.01 | 1.01 | 1.01 | 100.0% | 1 | | nova.live_migrate | 8.922 | 8.922 | 8.922 | 8.922 | 8.922 | 8.922 | 100.0% | 1 | | nova.detach_volume | 1.154 | 1.154 | 1.154 | 1.154 | 1.154 | 1.154 | 100.0% | 1 | | cinder_v2.delete_volume | 2.142 | 2.142 | 2.142 | 2.142 | 2.142 | 2.142 | 100.0% | 1 | | nova.delete_server | 2.669 | 2.669 | 2.669 | 2.669 | 2.669 | 2.669 | 100.0% | 1 | | total | 32.516 | 32.516 | 32.516 | 32.516 | 32.516 | 32.516 | 100.0% | 1 | | -> duration | 31.516 | 31.516 | 31.516 | 31.516 | 31.516 | 31.516 | 100.0% | 1 | | -> idle_duration | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 100.0% | 1 | +---------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 31.515806 Full duration: 43.080009 -------------------------------------------------------------------------------- test scenario NovaServers.boot_server_from_volume_and_live_migrate args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "flavor": { "name": "rally-tiny-072c6be1-d397-4e5f-8d5e-7eaca68dfe70" }, "image": { "name": "Cirros-0.4.0-072c6be1-d397-4e5f-8d5e-7eaca68dfe70" }, "block_migration": false, "volume_size": 10, "force_delete": false, "nics": [ { "net-id": "290ecdf3-eee5-491f-9897-262e8d1f460e" } ] }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, 
"users_per_tenant": 1 } } } -------------------------------------------------------------------------------- Task a3106a39-5e0d-462b-b711-9567159a1bf9 has 0 error(s) -------------------------------------------------------------------------------- +------------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +---------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +---------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | cinder_v2.create_volume | 7.681 | 7.681 | 7.681 | 7.681 | 7.681 | 7.681 | 100.0% | 1 | | nova.boot_server | 7.598 | 7.598 | 7.598 | 7.598 | 7.598 | 7.598 | 100.0% | 1 | | nova.find_host_to_migrate | 0.972 | 0.972 | 0.972 | 0.972 | 0.972 | 0.972 | 100.0% | 1 | | nova.live_migrate | 13.944 | 13.944 | 13.944 | 13.944 | 13.944 | 13.944 | 100.0% | 1 | | nova.delete_server | 7.137 | 7.137 | 7.137 | 7.137 | 7.137 | 7.137 | 100.0% | 1 | | total | 37.333 | 37.333 | 37.333 | 37.333 | 37.333 | 37.333 | 100.0% | 1 | | -> duration | 36.333 | 36.333 | 36.333 | 36.333 | 36.333 | 36.333 | 100.0% | 1 | | -> idle_duration | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 100.0% | 1 | +---------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 36.333035 Full duration: 47.718915 -------------------------------------------------------------------------------- test scenario NovaKeypair.boot_and_delete_server_with_keypair args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "flavor": { "name": "rally-tiny-072c6be1-d397-4e5f-8d5e-7eaca68dfe70" }, "image": { "name": "Cirros-0.4.0-072c6be1-d397-4e5f-8d5e-7eaca68dfe70" }, "server_kwargs": { "nics": [ { "net-id": "290ecdf3-eee5-491f-9897-262e8d1f460e" } ] } }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 }, "network": { "networks_per_tenant": 1, "start_cidr": "100.1.0.0/25" }, "quotas": { "neutron": { "network": -1, "port": -1, "subnet": -1 }, "nova": { "cores": -1, "floating_ips": -1, "instances": -1, "key_pairs": -1, "ram": -1, "security_group_rules": -1, "security_groups": -1 } } } } -------------------------------------------------------------------------------- Task a3106a39-5e0d-462b-b711-9567159a1bf9 has 0 error(s) -------------------------------------------------------------------------------- +------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +---------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +---------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | nova.create_keypair | 0.675 | 0.675 | 0.675 | 0.675 | 0.675 | 0.675 | 100.0% | 1 | | nova.boot_server | 7.102 | 7.102 | 7.102 | 7.102 | 7.102 | 7.102 | 100.0% | 1 | | nova.delete_server | 2.346 | 2.346 | 2.346 | 2.346 | 2.346 | 2.346 | 100.0% | 1 | | nova.delete_keypair | 0.043 | 0.043 | 0.043 | 0.043 | 0.043 | 0.043 | 100.0% | 1 | | total | 10.169 
| 10.169 | 10.169 | 10.169 | 10.169 | 10.169 | 100.0% | 1 | | -> duration | 9.169 | 9.169 | 9.169 | 9.169 | 9.169 | 9.169 | 100.0% | 1 | | -> idle_duration | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 100.0% | 1 | +---------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 9.16855 Full duration: 28.413427 -------------------------------------------------------------------------------- test scenario NovaServers.boot_server_from_volume_and_delete args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "flavor": { "name": "rally-tiny-072c6be1-d397-4e5f-8d5e-7eaca68dfe70" }, "image": { "name": "Cirros-0.4.0-072c6be1-d397-4e5f-8d5e-7eaca68dfe70" }, "volume_size": 5, "nics": [ { "net-id": "290ecdf3-eee5-491f-9897-262e8d1f460e" } ] }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 }, "network": { "networks_per_tenant": 1, "start_cidr": "100.1.0.0/25" }, "quotas": { "cinder": { "gigabytes": -1, "snapshots": -1, "volumes": -1 }, "neutron": { "network": -1, "port": -1, "subnet": -1 }, "nova": { "cores": -1, "floating_ips": -1, "instances": -1, "ram": -1, "security_group_rules": -1, "security_groups": -1 } } } } -------------------------------------------------------------------------------- Task a3106a39-5e0d-462b-b711-9567159a1bf9 has 0 error(s) -------------------------------------------------------------------------------- +----------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | cinder_v2.create_volume | 8.073 | 8.073 | 8.073 | 8.073 | 8.073 | 8.073 | 100.0% | 1 | | nova.boot_server | 8.243 | 8.243 | 8.243 | 8.243 | 8.243 | 8.243 | 100.0% | 1 | | nova.delete_server | 2.711 | 2.711 | 2.711 | 2.711 | 2.711 | 2.711 | 100.0% | 1 | | total | 19.028 | 19.028 | 19.028 | 19.028 | 19.028 | 19.028 | 100.0% | 1 | | -> duration | 18.028 | 18.028 | 18.028 | 18.028 | 18.028 | 18.028 | 100.0% | 1 | | -> idle_duration | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 100.0% | 1 | +-------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 18.027553 Full duration: 39.588181 -------------------------------------------------------------------------------- test scenario NovaServers.pause_and_unpause_server args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "flavor": { "name": "rally-tiny-072c6be1-d397-4e5f-8d5e-7eaca68dfe70" }, "image": { "name": "Cirros-0.4.0-072c6be1-d397-4e5f-8d5e-7eaca68dfe70" }, "force_delete": false, "nics": [ { "net-id": "290ecdf3-eee5-491f-9897-262e8d1f460e" } ] }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 }, "network": { "networks_per_tenant": 1, "start_cidr": "100.1.0.0/25" }, "quotas": { "neutron": { "network": -1, "port": -1, "subnet": -1 }, "nova": { "cores": -1, "floating_ips": -1, "instances": -1, "ram": -1, "security_group_rules": -1, "security_groups": -1 } } } } 
-------------------------------------------------------------------------------- Task a3106a39-5e0d-462b-b711-9567159a1bf9 has 0 error(s) -------------------------------------------------------------------------------- +------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +---------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +---------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | nova.boot_server | 7.391 | 7.391 | 7.391 | 7.391 | 7.391 | 7.391 | 100.0% | 1 | | nova.pause_server | 2.599 | 2.599 | 2.599 | 2.599 | 2.599 | 2.599 | 100.0% | 1 | | nova.unpause_server | 2.445 | 2.445 | 2.445 | 2.445 | 2.445 | 2.445 | 100.0% | 1 | | nova.delete_server | 7.138 | 7.138 | 7.138 | 7.138 | 7.138 | 7.138 | 100.0% | 1 | | total | 19.573 | 19.573 | 19.573 | 19.573 | 19.573 | 19.573 | 100.0% | 1 | | -> duration | 14.573 | 14.573 | 14.573 | 14.573 | 14.573 | 14.573 | 100.0% | 1 | | -> idle_duration | 5.0 | 5.0 | 5.0 | 5.0 | 5.0 | 5.0 | 100.0% | 1 | +---------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 14.572725 Full duration: 36.982205 -------------------------------------------------------------------------------- test scenario NovaServers.boot_and_migrate_server args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "flavor": { "name": "rally-tiny-072c6be1-d397-4e5f-8d5e-7eaca68dfe70" }, "image": { "name": "Cirros-0.4.0-072c6be1-d397-4e5f-8d5e-7eaca68dfe70" }, "nics": [ { "net-id": "290ecdf3-eee5-491f-9897-262e8d1f460e" } ] }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 } } } -------------------------------------------------------------------------------- Task a3106a39-5e0d-462b-b711-9567159a1bf9 has 0 error(s) -------------------------------------------------------------------------------- +------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +---------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +---------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | nova.boot_server | 7.661 | 7.661 | 7.661 | 7.661 | 7.661 | 7.661 | 100.0% | 1 | | nova.migrate | 42.54 | 42.54 | 42.54 | 42.54 | 42.54 | 42.54 | 100.0% | 1 | | nova.resize_confirm | 2.553 | 2.553 | 2.553 | 2.553 | 2.553 | 2.553 | 100.0% | 1 | | nova.delete_server | 4.507 | 4.507 | 4.507 | 4.507 | 4.507 | 4.507 | 100.0% | 1 | | total | 57.261 | 57.261 | 57.261 | 57.261 | 57.261 | 57.261 | 100.0% | 1 | | -> duration | 56.261 | 56.261 | 56.261 | 56.261 | 56.261 | 56.261 | 100.0% | 1 | | -> idle_duration | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 100.0% | 1 | +---------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 56.261016 Full duration: 65.949149 -------------------------------------------------------------------------------- test 
scenario NovaServers.boot_server_and_list_interfaces args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "flavor": { "name": "rally-tiny-072c6be1-d397-4e5f-8d5e-7eaca68dfe70" }, "image": { "name": "Cirros-0.4.0-072c6be1-d397-4e5f-8d5e-7eaca68dfe70" }, "auto_assign_nic": true }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 }, "network": {} } } -------------------------------------------------------------------------------- Task a3106a39-5e0d-462b-b711-9567159a1bf9 has 0 error(s) -------------------------------------------------------------------------------- +-------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | nova.boot_server | 7.949 | 7.949 | 7.949 | 7.949 | 7.949 | 7.949 | 100.0% | 1 | | nova.list_interfaces | 0.099 | 0.099 | 0.099 | 0.099 | 0.099 | 0.099 | 100.0% | 1 | | total | 8.048 | 8.048 | 8.048 | 8.048 | 8.048 | 8.048 | 100.0% | 1 | | -> duration | 7.048 | 7.048 | 7.048 | 7.048 | 7.048 | 7.048 | 100.0% | 1 | | -> idle_duration | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 100.0% | 1 | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 7.048241 Full duration: 32.149527 -------------------------------------------------------------------------------- test scenario NovaServerGroups.create_and_delete_server_group args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "policies": [ "affinity" ] }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 } } } -------------------------------------------------------------------------------- Task a3106a39-5e0d-462b-b711-9567159a1bf9 has 0 error(s) -------------------------------------------------------------------------------- +-----------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +--------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +--------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | nova.create_server_group | 0.595 | 0.595 | 0.595 | 0.595 | 0.595 | 0.595 | 100.0% | 1 | | nova.delete_server_group | 0.068 | 0.068 | 0.068 | 0.068 | 0.068 | 0.068 | 100.0% | 1 | | total | 0.663 | 0.663 | 0.663 | 0.663 | 0.663 | 0.663 | 100.0% | 1 | | -> duration | 0.663 | 0.663 | 0.663 | 0.663 | 0.663 | 0.663 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +--------------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.6631 Full duration: 8.480905 HINTS: * To plot HTML graphics with this data, run: rally task report a3106a39-5e0d-462b-b711-9567159a1bf9 --out output.html * To 
generate a JUnit report, run: rally task export a3106a39-5e0d-462b-b711-9567159a1bf9 --type junit --to output.xml * To get raw JSON output of task results, run: rally task report a3106a39-5e0d-462b-b711-9567159a1bf9 --json --out output.json 2018-05-25 07:08:30,118 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'results', 'a3106a39-5e0d-462b-b711-9567159a1bf9'] 2018-05-25 07:08:31,170 - functest.opnfv_tests.openstack.rally.rally - DEBUG - saving json file 2018-05-25 07:08:31,171 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'report', 'a3106a39-5e0d-462b-b711-9567159a1bf9', '--out', u'/home/opnfv/functest/results/rally/opnfv-nova.html'] 2018-05-25 07:08:31,179 - functest.opnfv_tests.openstack.rally.rally - INFO - Test scenario: "nova" OK. 2018-05-25 07:08:31,179 - functest.opnfv_tests.openstack.rally.rally - INFO - Starting test scenario "quotas" ... 2018-05-25 07:08:31,180 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Scenario fetched from : /usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/opnfv-quotas.yaml 2018-05-25 07:08:31,180 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Applying blacklist... 2018-05-25 07:08:31,196 - functest.opnfv_tests.openstack.rally.rally - DEBUG - Blacklisted tests: [u'Quotas.nova_update_and_delete'] 2018-05-25 07:08:31,197 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'start', '--abort-on-sla-failure', '--task', '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/task.yaml', '--task-args', "{'smoke': True, 'tmpl_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/templates', 'floating_network': 'external', 'service_list': ['quotas'], 'concurrency': 4, 'netid': '290ecdf3-eee5-491f-9897-262e8d1f460e', 'tenants_amount': 3, 'image_name': 'Cirros-0.4.0-072c6be1-d397-4e5f-8d5e-7eaca68dfe70', 'flavor_name': 'rally-tiny-072c6be1-d397-4e5f-8d5e-7eaca68dfe70', 'glance_image_location': '/home/opnfv/functest/images/cirros-0.4.0-x86_64-disk.img', 'use_existing_users': False, 'flavor_alt_name': 'rally-mini-072c6be1-d397-4e5f-8d5e-7eaca68dfe70', 'iterations': 10, 'users_amount': 2, 'sup_dir': '/usr/lib/python2.7/site-packages/functest/opnfv_tests/openstack/rally/scenario/support', 'glance_image_format': 'qcow2'}"] 2018-05-25 07:09:17,065 - functest.opnfv_tests.openstack.rally.rally - DEBUG - task_id : 3a111eb4-151f-4ca9-b1ea-c902c6926349 2018-05-25 07:09:17,066 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'detailed', '3a111eb4-151f-4ca9-b1ea-c902c6926349'] 2018-05-25 07:09:18,147 - functest.opnfv_tests.openstack.rally.rally - INFO - -------------------------------------------------------------------------------- Task 3a111eb4-151f-4ca9-b1ea-c902c6926349: finished -------------------------------------------------------------------------------- test scenario Quotas.cinder_update_and_delete args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "max_quota": 1024 }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 } } } -------------------------------------------------------------------------------- Task 3a111eb4-151f-4ca9-b1ea-c902c6926349 has 0 error(s) -------------------------------------------------------------------------------- 
+-------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | quotas.update_quotas | 0.738 | 0.738 | 0.738 | 0.738 | 0.738 | 0.738 | 100.0% | 1 | | quotas.delete_quotas | 0.161 | 0.161 | 0.161 | 0.161 | 0.161 | 0.161 | 100.0% | 1 | | total | 0.899 | 0.899 | 0.899 | 0.899 | 0.899 | 0.899 | 100.0% | 1 | | -> duration | 0.899 | 0.899 | 0.899 | 0.899 | 0.899 | 0.899 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.89878 Full duration: 7.695507 -------------------------------------------------------------------------------- test scenario Quotas.cinder_update args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "max_quota": 1024 }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 } } } -------------------------------------------------------------------------------- Task 3a111eb4-151f-4ca9-b1ea-c902c6926349 has 0 error(s) -------------------------------------------------------------------------------- +-------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | quotas.update_quotas | 0.676 | 0.676 | 0.676 | 0.676 | 0.676 | 0.676 | 100.0% | 1 | | total | 0.676 | 0.676 | 0.676 | 0.676 | 0.676 | 0.676 | 100.0% | 1 | | -> duration | 0.676 | 0.676 | 0.676 | 0.676 | 0.676 | 0.676 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.675766 Full duration: 7.538598 -------------------------------------------------------------------------------- test scenario Quotas.neutron_update args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "max_quota": 1024 }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 } } } -------------------------------------------------------------------------------- Task 3a111eb4-151f-4ca9-b1ea-c902c6926349 has 0 error(s) -------------------------------------------------------------------------------- +-------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count 
| +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | quotas.update_quotas | 0.279 | 0.279 | 0.279 | 0.279 | 0.279 | 0.279 | 100.0% | 1 | | total | 0.667 | 0.667 | 0.667 | 0.667 | 0.667 | 0.667 | 100.0% | 1 | | -> duration | 0.667 | 0.667 | 0.667 | 0.667 | 0.667 | 0.667 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.666718 Full duration: 7.607895 -------------------------------------------------------------------------------- test scenario Quotas.nova_update args position 0 args values: { "runner": { "concurrency": 1, "times": 1 }, "hooks": [], "args": { "max_quota": 1024 }, "sla": { "failure_rate": { "max": 0 } }, "context": { "users": { "tenants": 1, "users_per_tenant": 1 } } } -------------------------------------------------------------------------------- Task 3a111eb4-151f-4ca9-b1ea-c902c6926349 has 0 error(s) -------------------------------------------------------------------------------- +-------------------------------------------------------------------------------------------------------------------------+ | Response Times (sec) | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | Action | Min (sec) | Median (sec) | 90%ile (sec) | 95%ile (sec) | Max (sec) | Avg (sec) | Success | Count | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ | quotas.update_quotas | 0.961 | 0.961 | 0.961 | 0.961 | 0.961 | 0.961 | 100.0% | 1 | | total | 0.961 | 0.961 | 0.961 | 0.961 | 0.961 | 0.961 | 100.0% | 1 | | -> duration | 0.961 | 0.961 | 0.961 | 0.961 | 0.961 | 0.961 | 100.0% | 1 | | -> idle_duration | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 100.0% | 1 | +----------------------+-----------+--------------+--------------+--------------+-----------+-----------+---------+-------+ Load duration: 0.961254 Full duration: 8.122691 HINTS: * To plot HTML graphics with this data, run: rally task report 3a111eb4-151f-4ca9-b1ea-c902c6926349 --out output.html * To generate a JUnit report, run: rally task export 3a111eb4-151f-4ca9-b1ea-c902c6926349 --type junit --to output.xml * To get raw JSON output of task results, run: rally task report 3a111eb4-151f-4ca9-b1ea-c902c6926349 --json --out output.json 2018-05-25 07:09:18,147 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'results', '3a111eb4-151f-4ca9-b1ea-c902c6926349'] 2018-05-25 07:09:19,218 - functest.opnfv_tests.openstack.rally.rally - DEBUG - saving json file 2018-05-25 07:09:19,219 - functest.opnfv_tests.openstack.rally.rally - DEBUG - running command: ['rally', 'task', 'report', '3a111eb4-151f-4ca9-b1ea-c902c6926349', '--out', u'/home/opnfv/functest/results/rally/opnfv-quotas.html'] 2018-05-25 07:09:19,226 - functest.opnfv_tests.openstack.rally.rally - INFO - Test scenario: "quotas" OK. 2018-05-25 07:09:19,230 - functest.opnfv_tests.openstack.rally.rally - INFO - Rally Summary Report: +----------------+------------+----------------+-----------+ | Module | Duration | nb. 
Test Run | Success | +----------------+------------+----------------+-----------+ | authenticate | 00:39 | 6 | 100.00% | | glance | 01:11 | 4 | 100.00% | | cinder | 02:32 | 10 | 100.00% | | heat | 00:57 | 4 | 100.00% | | keystone | 02:35 | 11 | 100.00% | | neutron | 03:52 | 11 | 100.00% | | nova | 05:31 | 9 | 100.00% | | quotas | 00:30 | 4 | 100.00% | | | | | | | TOTAL: | 00:17:50 | 59 | 100.00% | +----------------+------------+----------------+-----------+ 2018-05-25 07:09:19,230 - functest.opnfv_tests.openstack.rally.rally - INFO - Rally 'rally_sanity' success_rate is 100.00% 2018-05-25 07:09:24,536 - xtesting.energy.energy - DEBUG - Restoring previous scenario (default/default) 2018-05-25 07:09:24,536 - xtesting.energy.energy - DEBUG - Submitting scenario (default/default) 2018-05-25 07:09:25,066 - xtesting.core.testcase - INFO - The results were successfully pushed to DB http://testresults.opnfv.org/test/api/v1/results 2018-05-25 07:09:25,067 - xtesting.ci.run_tests - INFO - Test result: +----------------------+------------------+------------------+----------------+ | TEST CASE | PROJECT | DURATION | RESULT | +----------------------+------------------+------------------+----------------+ | rally_sanity | functest | 21:16 | PASS | +----------------------+------------------+------------------+----------------+ 2018-05-25 07:09:25,071 - xtesting.ci.run_tests - INFO - Running test case 'refstack_defcore'... 2018-05-25 07:09:25,163 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - cloud: 2018-05-25 07:09:25,163 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - domain: Default 2018-05-25 07:09:25,163 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - Creating Rally environment... 2018-05-25 07:09:27,597 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally deployment destroy --deployment opnfv-rally 2018-05-25 07:09:27.273 11176 INFO rally.deployment.engine [-] Deployment 814d9c0f-dc1c-4d5e-acad-54e99d5557b6 | Starting: Destroy cloud and free allocated resources. 2018-05-25 07:09:27.351 11176 INFO rally.deployment.engine [-] Deployment 814d9c0f-dc1c-4d5e-acad-54e99d5557b6 | Completed: Destroy cloud and free allocated resources. 2018-05-25 07:09:27.376 11176 INFO rally.api [-] Deleting deployment-specific data for verifier 'opnfv-tempest' (UUID=958d68a1-b4b4-4151-b0d6-568036179a97). 2018-05-25 07:09:27.380 11176 INFO rally.api [-] Deployment-specific data has been successfully deleted! 2018-05-25 07:09:29,888 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally deployment create --fromenv --name opnfv-rally 2018-05-25 07:09:29.596 11179 INFO rally.deployment.engines.existing [-] Save deployment 'opnfv-rally' (uuid=542071c6-423d-485f-b4dd-829486c73d05) with 'openstack' platform. 
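For reference, the overall figures in the Rally Summary Report above can be reproduced from its per-module rows. A minimal sketch, assuming the total success rate is a test-count-weighted average of the module rates (functest's actual computation may differ); note that summing the rounded per-module durations gives 17:47, slightly under the reported 00:17:50, consistent with each module duration being rounded down to whole seconds:

    # (duration in seconds, number of tests, success rate %) from the summary table.
    modules = {
        'authenticate': (39, 6, 100.0),
        'glance': (71, 4, 100.0),
        'cinder': (152, 10, 100.0),
        'heat': (57, 4, 100.0),
        'keystone': (155, 11, 100.0),
        'neutron': (232, 11, 100.0),
        'nova': (331, 9, 100.0),
        'quotas': (30, 4, 100.0),
    }

    total_tests = sum(n for _, n, _ in modules.values())      # 59, as in the TOTAL row
    total_duration = sum(d for d, _, _ in modules.values())   # 1067 s ~= 17:47
    success_rate = sum(n * r for _, n, r in modules.values()) / total_tests  # 100.0
    print(total_tests, total_duration, round(success_rate, 2))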
+--------------------------------------+---------------------+-------------+------------------+--------+ | uuid | created_at | name | status | active | +--------------------------------------+---------------------+-------------+------------------+--------+ | 542071c6-423d-485f-b4dd-829486c73d05 | 2018-05-25T07:09:29 | opnfv-rally | deploy->finished | | +--------------------------------------+---------------------+-------------+------------------+--------+ Using deployment: 542071c6-423d-485f-b4dd-829486c73d05 ~/.rally/openrc was updated HINTS: * To use standard OpenStack clients, set up your env by running: source ~/.rally/openrc OpenStack clients are now configured, e.g run: openstack image list 2018-05-25 07:09:33,197 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally deployment check -------------------------------------------------------------------------------- Platform openstack: -------------------------------------------------------------------------------- Available services: +-------------+----------------+-----------+ | Service | Service Type | Status | +-------------+----------------+-----------+ | __unknown__ | alarming | Available | | __unknown__ | key-manager | Available | | __unknown__ | placement | Available | | __unknown__ | policy | Available | | __unknown__ | volumev2 | Available | | __unknown__ | volumev3 | Available | | ceilometer | metering | Available | | cinder | volume | Available | | cloud | cloudformation | Available | | glance | image | Available | | gnocchi | metric | Available | | heat | orchestration | Available | | keystone | identity | Available | | neutron | network | Available | | nova | compute | Available | +-------------+----------------+-----------+ 2018-05-25 07:09:33,197 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - Create verifier from existing repo... 2018-05-25 07:09:35,500 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally verify delete-verifier --id opnfv-tempest --force 2018-05-25 07:09:35.232 11185 INFO rally.api [-] Deleting verifier 'opnfv-tempest' (UUID=958d68a1-b4b4-4151-b0d6-568036179a97). 2018-05-25 07:09:35.351 11185 INFO rally.api [-] Verifier has been successfully deleted! 2018-05-25 07:09:38,880 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally verify create-verifier --source /src/tempest --name opnfv-tempest --type tempest --system-wide 2018-05-25 07:09:37.564 11188 INFO rally.api [-] Creating verifier 'opnfv-tempest'. 2018-05-25 07:09:37.693 11188 INFO rally.verification.manager [-] Cloning verifier repo from /src/tempest. 2018-05-25 07:09:38.698 11188 INFO rally.api [-] Verifier 'opnfv-tempest' (UUID=882c2df4-1b4d-474a-b012-4046bb1070b5) has been successfully created! Using verifier 'opnfv-tempest' (UUID=882c2df4-1b4d-474a-b012-4046bb1070b5) as the default verifier for the future CLI operations. 
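The refstack_defcore preparation above is a plain sequence of rally CLI calls: recreate the 'opnfv-rally' deployment from the sourced OS_* environment, sanity-check it, then rebuild the 'opnfv-tempest' verifier from the local /src/tempest checkout and reconfigure it (the configure step appears just below). A condensed sketch of that sequence, driven through subprocess like the rest of the run — the wrapper name is illustrative:

    import subprocess

    def recreate_rally_verifier():
        # Drop and recreate the deployment from the OS_* environment variables;
        # plain call() tolerates a missing deployment on a fresh environment.
        subprocess.call(['rally', 'deployment', 'destroy', '--deployment', 'opnfv-rally'])
        subprocess.check_call(['rally', 'deployment', 'create', '--fromenv',
                               '--name', 'opnfv-rally'])
        subprocess.check_call(['rally', 'deployment', 'check'])
        # Rebuild the Tempest verifier from the pre-cloned /src/tempest repository.
        subprocess.call(['rally', 'verify', 'delete-verifier', '--id', 'opnfv-tempest',
                         '--force'])
        subprocess.check_call(['rally', 'verify', 'create-verifier', '--source', '/src/tempest',
                               '--name', 'opnfv-tempest', '--type', 'tempest', '--system-wide'])
        subprocess.check_call(['rally', 'verify', 'configure-verifier', '--reconfigure',
                               '--id', 'opnfv-tempest'])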
2018-05-25 07:09:41,681 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Creating network with name: 'tempest-net-5cf071d7-ebc8-4146-b16b-ea87b7995600' 2018-05-25 07:09:42,458 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - network: Munch({u'status': u'ACTIVE', u'subnets': [], u'description': u'', u'provider:physical_network': None, u'tags': [], u'ipv6_address_scope': None, u'updated_at': u'2018-05-25T07:09:42Z', u'is_default': False, u'revision_number': 3, u'port_security_enabled': True, u'provider:network_type': u'geneve', u'id': u'3f5b421b-4bc8-4ced-9725-19e27689b25d', u'provider:segmentation_id': 74, u'router:external': False, u'availability_zone_hints': [], u'availability_zones': [], u'qos_policy_id': None, u'admin_state_up': True, u'name': u'tempest-net-5cf071d7-ebc8-4146-b16b-ea87b7995600', u'created_at': u'2018-05-25T07:09:42Z', u'mtu': 1442, u'tenant_id': u'51534bd63d854b6c878cd0603da66c99', u'ipv4_address_scope': None, u'shared': False, u'project_id': u'51534bd63d854b6c878cd0603da66c99'}) 2018-05-25 07:09:42,841 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - subnet: Munch({u'description': u'', u'tags': [], u'updated_at': u'2018-05-25T07:09:42Z', u'ipv6_ra_mode': None, u'allocation_pools': [{u'start': u'192.168.150.2', u'end': u'192.168.150.254'}], u'host_routes': [], u'revision_number': 0, u'ipv6_address_mode': None, u'cidr': u'192.168.150.0/24', u'id': u'1832a087-2e7d-4bb1-beb4-cf8598e52187', u'subnetpool_id': None, u'service_types': [], u'name': u'tempest-subnet-5cf071d7-ebc8-4146-b16b-ea87b7995600', u'enable_dhcp': True, u'network_id': u'3f5b421b-4bc8-4ced-9725-19e27689b25d', u'tenant_id': u'51534bd63d854b6c878cd0603da66c99', u'created_at': u'2018-05-25T07:09:42Z', u'dns_nameservers': [u'8.8.8.8'], u'gateway_ip': u'192.168.150.1', u'ip_version': 4, u'project_id': u'51534bd63d854b6c878cd0603da66c99'}) 2018-05-25 07:09:42,841 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - Creating two images for Tempest suite 2018-05-25 07:09:42,841 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Creating image with name: 'Cirros-0.4.0-5cf071d7-ebc8-4146-b16b-ea87b7995600' 2018-05-25 07:09:44,404 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - image: Munch({u'status': u'queued', u'owner_specified.shade.object': u'images/Cirros-0.4.0-5cf071d7-ebc8-4146-b16b-ea87b7995600', u'tags': [], u'container_format': u'bare', u'min_ram': 0, u'updated_at': u'2018-05-25T07:09:43Z', u'owner_specified.shade.sha256': u'a8dd75ecffd4cdd96072d60c2237b448e0c8b2bc94d57f10fdbc8c481d9005b8', u'locations': [], u'min_disk': 0, u'visibility': u'public', u'file': u'/v2/images/93ee8ebe-fb64-428d-9e6c-d26eb9f71a87/file', u'owner': u'51534bd63d854b6c878cd0603da66c99', u'virtual_size': None, u'owner_specified.shade.md5': u'443b7623e27ecf03dc9e01ee93f67afe', u'id': u'93ee8ebe-fb64-428d-9e6c-d26eb9f71a87', u'size': None, u'name': u'Cirros-0.4.0-5cf071d7-ebc8-4146-b16b-ea87b7995600', u'checksum': None, u'self': u'/v2/images/93ee8ebe-fb64-428d-9e6c-d26eb9f71a87', u'disk_format': u'qcow2', u'protected': False, u'created_at': u'2018-05-25T07:09:43Z', u'schema': u'/v2/schemas/image'}) 2018-05-25 07:09:44,404 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Creating image with name: 'Cirros-0.4.0-1-5cf071d7-ebc8-4146-b16b-ea87b7995600' 2018-05-25 07:09:45,381 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - image: Munch({u'status': u'queued', u'owner_specified.shade.object': u'images/Cirros-0.4.0-1-5cf071d7-ebc8-4146-b16b-ea87b7995600', u'tags': [], 
u'container_format': u'bare', u'min_ram': 0, u'updated_at': u'2018-05-25T07:09:44Z', u'owner_specified.shade.sha256': u'a8dd75ecffd4cdd96072d60c2237b448e0c8b2bc94d57f10fdbc8c481d9005b8', u'locations': [], u'min_disk': 0, u'visibility': u'public', u'file': u'/v2/images/cc6c92cc-b16f-4920-a23c-bfe9712a6bcf/file', u'owner': u'51534bd63d854b6c878cd0603da66c99', u'virtual_size': None, u'owner_specified.shade.md5': u'443b7623e27ecf03dc9e01ee93f67afe', u'id': u'cc6c92cc-b16f-4920-a23c-bfe9712a6bcf', u'size': None, u'name': u'Cirros-0.4.0-1-5cf071d7-ebc8-4146-b16b-ea87b7995600', u'checksum': None, u'self': u'/v2/images/cc6c92cc-b16f-4920-a23c-bfe9712a6bcf', u'disk_format': u'qcow2', u'protected': False, u'created_at': u'2018-05-25T07:09:44Z', u'schema': u'/v2/schemas/image'}) 2018-05-25 07:09:45,381 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Creating two flavors for Tempest suite 2018-05-25 07:09:45,601 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - flavor: Munch({'name': u'opnfv_flavor-5cf071d7-ebc8-4146-b16b-ea87b7995600', 'ephemeral': 0, 'ram': 512, 'is_disabled': False, 'properties': Munch({u'OS-FLV-DISABLED:disabled': False, u'OS-FLV-EXT-DATA:ephemeral': 0, u'os-flavor-access:is_public': True}), u'OS-FLV-DISABLED:disabled': False, 'vcpus': 1, 'extra_specs': Munch({}), 'location': Munch({'project': Munch({'domain_id': None, 'id': u'51534bd63d854b6c878cd0603da66c99', 'name': 'admin', 'domain_name': 'Default'}), 'zone': None, 'region_name': 'regionOne', 'cloud': 'envvars'}), u'os-flavor-access:is_public': True, 'rxtx_factor': 1.0, 'is_public': True, u'OS-FLV-EXT-DATA:ephemeral': 0, 'disk': 1, 'id': u'41c2f96d-ed3d-4871-8dd9-b0053466fffb', 'swap': 0}) 2018-05-25 07:09:45,670 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - flavor: Munch({'name': u'opnfv_flavor_1-5cf071d7-ebc8-4146-b16b-ea87b7995600', 'ephemeral': 0, 'ram': 512, 'is_disabled': False, 'properties': Munch({u'OS-FLV-DISABLED:disabled': False, u'OS-FLV-EXT-DATA:ephemeral': 0, u'os-flavor-access:is_public': True}), u'OS-FLV-DISABLED:disabled': False, 'vcpus': 1, 'extra_specs': Munch({}), 'location': Munch({'project': Munch({'domain_id': None, 'id': u'51534bd63d854b6c878cd0603da66c99', 'name': 'admin', 'domain_name': 'Default'}), 'zone': None, 'region_name': 'regionOne', 'cloud': 'envvars'}), u'os-flavor-access:is_public': True, 'rxtx_factor': 1.0, 'is_public': True, u'OS-FLV-EXT-DATA:ephemeral': 0, 'disk': 1, 'id': u'63342606-5a73-4dfa-960f-9571d8f0a628', 'swap': 0}) 2018-05-25 07:09:48,633 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally verify configure-verifier --reconfigure --id opnfv-tempest 2018-05-25 07:09:47.791 11207 INFO rally.api [-] Configuring verifier 'opnfv-tempest' (UUID=882c2df4-1b4d-474a-b012-4046bb1070b5) for deployment 'opnfv-rally' (UUID=542071c6-423d-485f-b4dd-829486c73d05). 2018-05-25 07:09:48.464 11207 INFO rally.api [-] Verifier 'opnfv-tempest' (UUID=882c2df4-1b4d-474a-b012-4046bb1070b5) has been successfully configured for deployment 'opnfv-rally' (UUID=542071c6-423d-485f-b4dd-829486c73d05)! 2018-05-25 07:09:48,633 - functest.opnfv_tests.openstack.tempest.conf_utils - DEBUG - Looking for tempest.conf file... 2018-05-25 07:09:48,634 - functest.opnfv_tests.openstack.tempest.conf_utils - DEBUG - Updating selected tempest.conf parameters... 
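The network, subnet, image and flavor dumps above come from Functest's own helpers; the Munch objects are the shade/openstacksdk return values. Below is a hedged sketch of roughly equivalent resources created with the standard openstack CLI. The CIDR, DNS server, image format and flavor sizing are taken from the logged values; the shortened resource names and the local CirrOS file name are assumptions for illustration only:

    # Guest network and subnet for Tempest (CIDR and DNS as logged)
    openstack network create tempest-net
    openstack subnet create tempest-subnet --network tempest-net \
        --subnet-range 192.168.150.0/24 --dns-nameserver 8.8.8.8
    # Two public CirrOS 0.4.0 images (assumed local file cirros-0.4.0-x86_64-disk.img)
    openstack image create Cirros-0.4.0 --disk-format qcow2 --container-format bare \
        --public --file cirros-0.4.0-x86_64-disk.img
    openstack image create Cirros-0.4.0-1 --disk-format qcow2 --container-format bare \
        --public --file cirros-0.4.0-x86_64-disk.img
    # Two small flavors matching the logged values (512 MB RAM, 1 vCPU, 1 GB disk)
    openstack flavor create opnfv_flavor --ram 512 --vcpus 1 --disk 1 --public
    openstack flavor create opnfv_flavor_1 --ram 512 --vcpus 1 --disk 1 --public

The verifier is then re-bound to the active deployment with 'rally verify configure-verifier --reconfigure --id opnfv-tempest', exactly as logged above.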
2018-05-25 07:09:48,636 - functest.opnfv_tests.openstack.tempest.conf_utils - DEBUG - Add/Update required params defined in tempest_conf.yaml into tempest.conf file 2018-05-25 07:09:51,853 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Starting Tempest test suite: '['rally', 'verify', 'start', '--load-list', u'/home/opnfv/functest/results/refstack/tempest-list.txt']'. 2018-05-25 07:09:53,993 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:09:53.991 11219 INFO rally.api [-] Starting verification (UUID=911a2543-4c7c-4634-81bd-633bb7be8618) for deployment 'opnfv-rally' (UUID=542071c6-423d-485f-b4dd-829486c73d05) by verifier 'opnfv-tempest' (UUID=882c2df4-1b4d-474a-b012-4046bb1070b5). 2018-05-25 07:09:53,993 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - Verification UUID: 911a2543-4c7c-4634-81bd-633bb7be8618 2018-05-25 07:10:03,022 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:03.021 11219 INFO opnfv-tempest [-] {8} tempest.api.compute.servers.test_availability_zone.AZV2TestJSON.test_get_availability_zone_list_with_non_admin_user ... success [0.466s] 2018-05-25 07:10:03,113 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:03.113 11219 INFO opnfv-tempest [-] {14} tempest.api.identity.v3.test_tokens.TokensV3Test.test_create_token ... success [0.371s] 2018-05-25 07:10:03,151 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:03.151 11219 INFO opnfv-tempest [-] {4} tempest.api.compute.flavors.test_flavors.FlavorsV2TestJSON.test_list_flavors ... success [0.455s] 2018-05-25 07:10:03,261 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:03.260 11219 INFO opnfv-tempest [-] {9} tempest.api.image.v2.test_images.BasicOperationsImagesTest.test_delete_image ... success [0.646s] 2018-05-25 07:10:04,180 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:04.179 11219 INFO opnfv-tempest [-] {6} tempest.api.image.v2.test_images_tags_negative.ImagesTagsNegativeTest.test_delete_non_existing_tag ... success [1.383s] 2018-05-25 07:10:04,199 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:04.198 11219 INFO opnfv-tempest [-] {6} tempest.api.image.v2.test_images_tags_negative.ImagesTagsNegativeTest.test_update_tags_for_non_existing_image ... success [0.020s] 2018-05-25 07:10:04,292 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:04.292 11219 INFO opnfv-tempest [-] {4} tempest.api.compute.flavors.test_flavors.FlavorsV2TestJSON.test_list_flavors_with_detail ... success [1.140s] 2018-05-25 07:10:05,235 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:05.235 11219 INFO opnfv-tempest [-] {9} tempest.api.image.v2.test_images.BasicOperationsImagesTest.test_update_image ... success [1.974s] 2018-05-25 07:10:07,497 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:07.496 11219 INFO opnfv-tempest [-] {2} tempest.api.compute.servers.test_servers.ServersTestJSON.test_create_server_with_admin_password ... success [4.990s] 2018-05-25 07:10:08,754 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:08.754 11219 INFO opnfv-tempest [-] {13} tempest.api.volume.test_volumes_actions.VolumesActionsTest.test_reserve_unreserve_volume ... 
success [3.418s] 2018-05-25 07:10:09,569 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:09.569 11219 INFO opnfv-tempest [-] {13} tempest.api.volume.test_volumes_actions.VolumesActionsTest.test_volume_bootable ... success [0.814s] 2018-05-25 07:10:10,443 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:10.442 11219 INFO opnfv-tempest [-] {13} tempest.api.volume.test_volumes_actions.VolumesActionsTest.test_volume_readonly_update ... success [0.873s] 2018-05-25 07:10:12,886 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:12.886 11219 INFO opnfv-tempest [-] {4} tempest.api.identity.v3.test_api_discovery.TestApiDiscovery.test_api_media_types ... success [0.188s] 2018-05-25 07:10:13,013 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:13.012 11219 INFO opnfv-tempest [-] {4} tempest.api.identity.v3.test_api_discovery.TestApiDiscovery.test_api_version_resources ... success [0.127s] 2018-05-25 07:10:13,199 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:13.198 11219 INFO opnfv-tempest [-] {4} tempest.api.identity.v3.test_api_discovery.TestApiDiscovery.test_api_version_statuses ... success [0.185s] 2018-05-25 07:10:13,296 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:13.295 11219 INFO opnfv-tempest [-] {14} tempest.api.network.test_subnetpools_extensions.SubnetPoolsTestJSON.test_create_list_show_update_delete_subnetpools ... success [2.547s] 2018-05-25 07:10:14,677 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:14.677 11219 INFO opnfv-tempest [-] {9} tempest.api.volume.test_volumes_snapshots_negative.VolumesSnapshotNegativeTestJSON.test_create_snapshot_with_nonexistent_volume_id ... success [0.696s] 2018-05-25 07:10:14,912 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:14.912 11219 INFO opnfv-tempest [-] {9} tempest.api.volume.test_volumes_snapshots_negative.VolumesSnapshotNegativeTestJSON.test_create_snapshot_without_passing_volume_id ... success [0.235s] 2018-05-25 07:10:15,529 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:15.529 11219 INFO opnfv-tempest [-] {8} tempest.api.volume.test_volume_metadata.VolumesMetadataTest.test_crud_volume_metadata ... success [2.090s] 2018-05-25 07:10:16,261 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:16.260 11219 INFO opnfv-tempest [-] {8} tempest.api.volume.test_volume_metadata.VolumesMetadataTest.test_update_show_volume_metadata_item ... success [0.731s] 2018-05-25 07:10:17,519 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:17.519 11219 INFO opnfv-tempest [-] {3} tempest.api.compute.servers.test_instance_actions.InstanceActionsTestJSON.test_get_instance_action ... success [0.232s] 2018-05-25 07:10:18,242 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:18.242 11219 INFO opnfv-tempest [-] {4} tempest.api.image.v2.test_images_negative.ImagesNegativeTest.test_delete_image_null_id ... success [0.154s] 2018-05-25 07:10:18,310 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:18.309 11219 INFO opnfv-tempest [-] {4} tempest.api.image.v2.test_images_negative.ImagesNegativeTest.test_delete_non_existing_image ... 
success [0.068s] 2018-05-25 07:10:18,899 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:18.898 11219 INFO opnfv-tempest [-] {4} tempest.api.image.v2.test_images_negative.ImagesNegativeTest.test_get_delete_deleted_image ... success [0.587s] 2018-05-25 07:10:19,183 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:19.182 11219 INFO opnfv-tempest [-] {4} tempest.api.image.v2.test_images_negative.ImagesNegativeTest.test_get_image_null_id ... success [0.285s] 2018-05-25 07:10:19,202 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:19.202 11219 INFO opnfv-tempest [-] {4} tempest.api.image.v2.test_images_negative.ImagesNegativeTest.test_get_non_existent_image ... success [0.020s] 2018-05-25 07:10:20,156 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:20.156 11219 INFO opnfv-tempest [-] {6} tempest.api.volume.test_snapshot_metadata.SnapshotMetadataTestJSON.test_crud_snapshot_metadata ... success [1.458s] 2018-05-25 07:10:21,020 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:21.020 11219 INFO opnfv-tempest [-] {6} tempest.api.volume.test_snapshot_metadata.SnapshotMetadataTestJSON.test_update_show_snapshot_metadata_item ... success [0.864s] 2018-05-25 07:10:27,188 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:27.188 11219 INFO opnfv-tempest [-] {11} tempest.api.compute.servers.test_delete_server.DeleteServersTestJSON.test_delete_active_server ... success [24.471s] 2018-05-25 07:10:34,055 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:34.055 11219 INFO opnfv-tempest [-] {2} tempest.api.compute.servers.test_servers.ServersTestJSON.test_create_specify_keypair ... success [26.552s] 2018-05-25 07:10:36,339 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:36.339 11219 INFO opnfv-tempest [-] {0} tempest.api.compute.servers.test_list_servers_negative.ListServersNegativeTestJSON.test_list_servers_by_changes_since_future_date ... success [0.067s] 2018-05-25 07:10:36,350 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:36.350 11219 INFO opnfv-tempest [-] {0} tempest.api.compute.servers.test_list_servers_negative.ListServersNegativeTestJSON.test_list_servers_by_changes_since_invalid_date ... success [0.011s] 2018-05-25 07:10:36,525 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:36.525 11219 INFO opnfv-tempest [-] {0} tempest.api.compute.servers.test_list_servers_negative.ListServersNegativeTestJSON.test_list_servers_by_limits_greater_than_actual_count ... success [0.174s] 2018-05-25 07:10:36,537 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:36.536 11219 INFO opnfv-tempest [-] {0} tempest.api.compute.servers.test_list_servers_negative.ListServersNegativeTestJSON.test_list_servers_by_limits_pass_negative_value ... success [0.011s] 2018-05-25 07:10:36,549 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:36.549 11219 INFO opnfv-tempest [-] {0} tempest.api.compute.servers.test_list_servers_negative.ListServersNegativeTestJSON.test_list_servers_by_limits_pass_string ... success [0.012s] 2018-05-25 07:10:36,586 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:36.586 11219 INFO opnfv-tempest [-] {0} tempest.api.compute.servers.test_list_servers_negative.ListServersNegativeTestJSON.test_list_servers_by_non_existing_flavor ... 
success [0.036s] 2018-05-25 07:10:36,652 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:36.652 11219 INFO opnfv-tempest [-] {0} tempest.api.compute.servers.test_list_servers_negative.ListServersNegativeTestJSON.test_list_servers_by_non_existing_image ... success [0.066s] 2018-05-25 07:10:36,715 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:36.715 11219 INFO opnfv-tempest [-] {0} tempest.api.compute.servers.test_list_servers_negative.ListServersNegativeTestJSON.test_list_servers_by_non_existing_server_name ... success [0.063s] 2018-05-25 07:10:37,150 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:37.150 11219 INFO opnfv-tempest [-] {6} tempest.api.volume.test_volumes_list.VolumesListTestJSON.test_volume_list ... success [0.046s] 2018-05-25 07:10:37,184 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:37.184 11219 INFO opnfv-tempest [-] {0} tempest.api.compute.servers.test_list_servers_negative.ListServersNegativeTestJSON.test_list_servers_detail_server_is_deleted ... success [0.468s] 2018-05-25 07:10:37,188 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:37.187 11219 INFO opnfv-tempest [-] {6} tempest.api.volume.test_volumes_list.VolumesListTestJSON.test_volume_list_by_name ... success [0.037s] 2018-05-25 07:10:37,199 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:37.199 11219 INFO opnfv-tempest [-] {0} tempest.api.compute.servers.test_list_servers_negative.ListServersNegativeTestJSON.test_list_servers_status_non_existing ... success [0.015s] 2018-05-25 07:10:37,232 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:37.232 11219 INFO opnfv-tempest [-] {6} tempest.api.volume.test_volumes_list.VolumesListTestJSON.test_volume_list_details_by_name ... success [0.044s] 2018-05-25 07:10:37,264 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:37.263 11219 INFO opnfv-tempest [-] {0} tempest.api.compute.servers.test_list_servers_negative.ListServersNegativeTestJSON.test_list_servers_with_a_deleted_server ... success [0.064s] 2018-05-25 07:10:37,526 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:37.525 11219 INFO opnfv-tempest [-] {6} tempest.api.volume.test_volumes_list.VolumesListTestJSON.test_volume_list_details_pagination ... success [0.292s] 2018-05-25 07:10:37,628 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:37.628 11219 INFO opnfv-tempest [-] {6} tempest.api.volume.test_volumes_list.VolumesListTestJSON.test_volume_list_details_with_multiple_params ... success [0.102s] 2018-05-25 07:10:37,990 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:37.990 11219 INFO opnfv-tempest [-] {6} tempest.api.volume.test_volumes_list.VolumesListTestJSON.test_volume_list_pagination ... success [0.361s] 2018-05-25 07:10:38,027 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:38.026 11219 INFO opnfv-tempest [-] {6} tempest.api.volume.test_volumes_list.VolumesListTestJSON.test_volume_list_param_display_name_and_status ... success [0.036s] 2018-05-25 07:10:38,072 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:38.072 11219 INFO opnfv-tempest [-] {6} tempest.api.volume.test_volumes_list.VolumesListTestJSON.test_volume_list_with_detail_param_display_name_and_status ... 
success [0.045s] 2018-05-25 07:10:38,124 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:38.124 11219 INFO opnfv-tempest [-] {6} tempest.api.volume.test_volumes_list.VolumesListTestJSON.test_volume_list_with_detail_param_metadata ... success [0.051s] 2018-05-25 07:10:38,170 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:38.170 11219 INFO opnfv-tempest [-] {6} tempest.api.volume.test_volumes_list.VolumesListTestJSON.test_volume_list_with_details ... success [0.045s] 2018-05-25 07:10:38,209 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:38.209 11219 INFO opnfv-tempest [-] {6} tempest.api.volume.test_volumes_list.VolumesListTestJSON.test_volume_list_with_param_metadata ... success [0.038s] 2018-05-25 07:10:38,281 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:38.281 11219 INFO opnfv-tempest [-] {6} tempest.api.volume.test_volumes_list.VolumesListTestJSON.test_volumes_list_by_availability_zone ... success [0.071s] 2018-05-25 07:10:38,485 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:38.485 11219 INFO opnfv-tempest [-] {6} tempest.api.volume.test_volumes_list.VolumesListTestJSON.test_volumes_list_by_status ... success [0.203s] 2018-05-25 07:10:38,538 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:38.538 11219 INFO opnfv-tempest [-] {6} tempest.api.volume.test_volumes_list.VolumesListTestJSON.test_volumes_list_details_by_availability_zone ... success [0.053s] 2018-05-25 07:10:38,584 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:38.583 11219 INFO opnfv-tempest [-] {6} tempest.api.volume.test_volumes_list.VolumesListTestJSON.test_volumes_list_details_by_status ... success [0.045s] 2018-05-25 07:10:45,053 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:45.053 11219 INFO opnfv-tempest [-] {1} tempest.api.compute.servers.test_create_server.ServersTestJSON.test_host_name_is_same_as_server_name ... success [15.149s] 2018-05-25 07:10:45,126 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:45.126 11219 INFO opnfv-tempest [-] {1} tempest.api.compute.servers.test_create_server.ServersTestJSON.test_list_servers ... success [0.073s] 2018-05-25 07:10:45,357 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:45.356 11219 INFO opnfv-tempest [-] {1} tempest.api.compute.servers.test_create_server.ServersTestJSON.test_list_servers_with_detail ... success [0.229s] 2018-05-25 07:10:45,456 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:45.455 11219 INFO opnfv-tempest [-] {1} tempest.api.compute.servers.test_create_server.ServersTestJSON.test_verify_created_server_vcpus ... success [0.097s] 2018-05-25 07:10:45,457 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:45.457 11219 INFO opnfv-tempest [-] {1} tempest.api.compute.servers.test_create_server.ServersTestJSON.test_verify_server_details ... success [0.001s] 2018-05-25 07:10:59,295 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:59.295 11219 INFO opnfv-tempest [-] {3} tempest.api.compute.servers.test_instance_actions.InstanceActionsTestJSON.test_list_instance_actions ... 
success [41.764s] 2018-05-25 07:10:59,364 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:10:59.363 11219 INFO opnfv-tempest [-] {11} tempest.api.compute.servers.test_servers_negative.ServersNegativeTestJSON.test_create_numeric_server_name ... success [1.016s] 2018-05-25 07:11:00,436 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:00.436 11219 INFO opnfv-tempest [-] {11} tempest.api.compute.servers.test_servers_negative.ServersNegativeTestJSON.test_create_server_metadata_exceeds_length_limit ... success [1.070s] 2018-05-25 07:11:01,449 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:01.448 11219 INFO opnfv-tempest [-] {11} tempest.api.compute.servers.test_servers_negative.ServersNegativeTestJSON.test_create_server_name_length_exceeds_256 ... success [1.011s] 2018-05-25 07:11:02,465 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:02.464 11219 INFO opnfv-tempest [-] {11} tempest.api.compute.servers.test_servers_negative.ServersNegativeTestJSON.test_create_with_invalid_flavor ... success [1.015s] 2018-05-25 07:11:03,400 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:03.399 11219 INFO opnfv-tempest [-] {11} tempest.api.compute.servers.test_servers_negative.ServersNegativeTestJSON.test_create_with_invalid_image ... success [0.934s] 2018-05-25 07:11:04,374 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:04.374 11219 INFO opnfv-tempest [-] {11} tempest.api.compute.servers.test_servers_negative.ServersNegativeTestJSON.test_create_with_invalid_network_uuid ... success [0.973s] 2018-05-25 07:11:04,621 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:04.621 11219 INFO opnfv-tempest [-] {11} tempest.api.compute.servers.test_servers_negative.ServersNegativeTestJSON.test_delete_server_pass_id_exceeding_length_limit ... success [0.248s] 2018-05-25 07:11:04,855 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:04.855 11219 INFO opnfv-tempest [-] {11} tempest.api.compute.servers.test_servers_negative.ServersNegativeTestJSON.test_delete_server_pass_negative_id ... success [0.233s] 2018-05-25 07:11:05,111 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:05.110 11219 INFO opnfv-tempest [-] {11} tempest.api.compute.servers.test_servers_negative.ServersNegativeTestJSON.test_get_non_existent_server ... success [0.255s] 2018-05-25 07:11:06,076 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:06.075 11219 INFO opnfv-tempest [-] {11} tempest.api.compute.servers.test_servers_negative.ServersNegativeTestJSON.test_invalid_ip_v6_address ... success [0.962s] 2018-05-25 07:11:06,315 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:06.314 11219 INFO opnfv-tempest [-] {11} tempest.api.compute.servers.test_servers_negative.ServersNegativeTestJSON.test_reboot_non_existent_server ... success [0.240s] 2018-05-25 07:11:06,573 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:06.572 11219 INFO opnfv-tempest [-] {11} tempest.api.compute.servers.test_servers_negative.ServersNegativeTestJSON.test_rebuild_deleted_server ... success [0.255s] 2018-05-25 07:11:06,823 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:06.823 11219 INFO opnfv-tempest [-] {11} tempest.api.compute.servers.test_servers_negative.ServersNegativeTestJSON.test_rebuild_non_existent_server ... 
success [0.249s] 2018-05-25 07:11:07,880 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:07.880 11219 INFO opnfv-tempest [-] {11} tempest.api.compute.servers.test_servers_negative.ServersNegativeTestJSON.test_server_name_blank ... success [1.055s] 2018-05-25 07:11:08,160 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:08.159 11219 INFO opnfv-tempest [-] {11} tempest.api.compute.servers.test_servers_negative.ServersNegativeTestJSON.test_stop_non_existent_server ... success [0.280s] 2018-05-25 07:11:08,451 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:08.450 11219 INFO opnfv-tempest [-] {11} tempest.api.compute.servers.test_servers_negative.ServersNegativeTestJSON.test_update_name_of_non_existent_server ... success [0.290s] 2018-05-25 07:11:08,692 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:08.691 11219 INFO opnfv-tempest [-] {11} tempest.api.compute.servers.test_servers_negative.ServersNegativeTestJSON.test_update_server_name_length_exceeds_256 ... success [0.240s] 2018-05-25 07:11:08,955 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:08.955 11219 INFO opnfv-tempest [-] {11} tempest.api.compute.servers.test_servers_negative.ServersNegativeTestJSON.test_update_server_set_empty_name ... success [0.263s] 2018-05-25 07:11:12,800 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:12.800 11219 INFO opnfv-tempest [-] {3} tempest.api.image.v2.test_images_tags.ImagesTagsTest.test_update_delete_tags_for_image ... success [1.808s] 2018-05-25 07:11:15,769 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:15.768 11219 INFO opnfv-tempest [-] {2} tempest.api.compute.servers.test_servers.ServersTestJSON.test_create_with_existing_server_name ... success [41.705s] 2018-05-25 07:11:16,172 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:16.172 11219 INFO opnfv-tempest [-] {15} tempest.api.compute.servers.test_server_actions.ServerActionsTestJSON.test_lock_unlock_server ... success [43.736s] 2018-05-25 07:11:18,777 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:18.777 11219 INFO opnfv-tempest [-] {5} tempest.api.compute.images.test_images_oneserver.ImagesOneServerTestJSON.test_create_delete_image ... success [60.350s] 2018-05-25 07:11:19,445 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:19.445 11219 INFO opnfv-tempest [-] {5} tempest.api.compute.images.test_images_oneserver.ImagesOneServerTestJSON.test_create_image_specify_multibyte_character_image_name ... success [0.673s] 2018-05-25 07:11:23,567 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:23.566 11219 INFO opnfv-tempest [-] {3} tempest.api.network.test_ports.PortsTestJSON.test_create_bulk_port ... success [4.237s] 2018-05-25 07:11:25,951 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:25.950 11219 INFO opnfv-tempest [-] {15} tempest.api.compute.servers.test_server_actions.ServerActionsTestJSON.test_reboot_server_hard ... success [9.779s] 2018-05-25 07:11:26,520 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:26.519 11219 INFO opnfv-tempest [-] {2} tempest.api.compute.servers.test_servers.ServersTestJSON.test_update_access_server_address ... 
success [10.751s] 2018-05-25 07:11:27,912 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:27.911 11219 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.NetworksTest.test_create_delete_subnet_all_attributes ... success [3.222s] 2018-05-25 07:11:27,955 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:27.955 11219 INFO opnfv-tempest [-] {13} tempest.api.volume.test_volumes_actions.VolumesActionsTest.test_volume_upload ... success [77.502s] 2018-05-25 07:11:29,127 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:29.127 11219 INFO opnfv-tempest [-] {3} tempest.api.network.test_ports.PortsTestJSON.test_create_port_in_allowed_allocation_pools ... success [5.560s] 2018-05-25 07:11:30,601 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:30.600 11219 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.NetworksTest.test_create_delete_subnet_with_allocation_pools ... success [2.688s] 2018-05-25 07:11:31,411 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:31.410 11219 INFO opnfv-tempest [-] {3} tempest.api.network.test_ports.PortsTestJSON.test_create_update_delete_port ... success [2.283s] 2018-05-25 07:11:31,621 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:31.621 11219 INFO opnfv-tempest [-] {3} tempest.api.network.test_ports.PortsTestJSON.test_list_ports ... success [0.210s] 2018-05-25 07:11:31,688 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:31.688 11219 INFO opnfv-tempest [-] {3} tempest.api.network.test_ports.PortsTestJSON.test_list_ports_fields ... success [0.067s] 2018-05-25 07:11:31,771 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:31.771 11219 INFO opnfv-tempest [-] {3} tempest.api.network.test_ports.PortsTestJSON.test_show_port ... success [0.082s] 2018-05-25 07:11:31,822 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:31.822 11219 INFO opnfv-tempest [-] {3} tempest.api.network.test_ports.PortsTestJSON.test_show_port_fields ... success [0.051s] 2018-05-25 07:11:34,874 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:34.873 11219 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.NetworksTest.test_create_delete_subnet_with_dhcp_enabled ... success [4.272s] 2018-05-25 07:11:36,860 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:36.860 11219 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.NetworksTest.test_create_delete_subnet_with_gw ... success [1.986s] 2018-05-25 07:11:39,247 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:39.246 11219 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.NetworksTest.test_create_delete_subnet_with_gw_and_allocation_pools ... success [2.385s] 2018-05-25 07:11:41,918 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:41.918 11219 INFO opnfv-tempest [-] {13} tempest.api.volume.test_volumes_snapshots_list.VolumesSnapshotListTestJSON.test_snapshots_list_details_with_params ... success [0.276s] 2018-05-25 07:11:41,922 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:41.922 11219 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.NetworksTest.test_create_delete_subnet_with_host_routes_and_dns_nameservers ... 
success [2.675s] 2018-05-25 07:11:42,115 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:42.115 11219 INFO opnfv-tempest [-] {13} tempest.api.volume.test_volumes_snapshots_list.VolumesSnapshotListTestJSON.test_snapshots_list_with_params ... success [0.198s] 2018-05-25 07:11:44,927 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:44.927 11219 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.NetworksTest.test_create_delete_subnet_without_gateway ... success [3.004s] 2018-05-25 07:11:46,676 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:46.676 11219 INFO opnfv-tempest [-] {1} tempest.api.compute.servers.test_server_metadata.ServerMetadataTestJSON.test_delete_server_metadata_item ... success [0.639s] 2018-05-25 07:11:46,887 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:46.887 11219 INFO opnfv-tempest [-] {1} tempest.api.compute.servers.test_server_metadata.ServerMetadataTestJSON.test_get_server_metadata_item ... success [0.210s] 2018-05-25 07:11:47,101 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:47.101 11219 INFO opnfv-tempest [-] {1} tempest.api.compute.servers.test_server_metadata.ServerMetadataTestJSON.test_list_server_metadata ... success [0.214s] 2018-05-25 07:11:47,529 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:47.529 11219 INFO opnfv-tempest [-] {1} tempest.api.compute.servers.test_server_metadata.ServerMetadataTestJSON.test_set_server_metadata ... success [0.428s] 2018-05-25 07:11:47,922 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:47.922 11219 INFO opnfv-tempest [-] {1} tempest.api.compute.servers.test_server_metadata.ServerMetadataTestJSON.test_set_server_metadata_item ... success [0.392s] 2018-05-25 07:11:48,056 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:48.056 11219 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.NetworksTest.test_create_update_delete_network_subnet ... success [3.128s] 2018-05-25 07:11:48,408 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:48.408 11219 INFO opnfv-tempest [-] {1} tempest.api.compute.servers.test_server_metadata.ServerMetadataTestJSON.test_update_server_metadata ... success [0.485s] 2018-05-25 07:11:50,864 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:50.863 11219 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.NetworksTest.test_delete_network_with_subnet ... success [2.806s] 2018-05-25 07:11:51,072 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:51.071 11219 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.NetworksTest.test_list_networks ... success [0.208s] 2018-05-25 07:11:51,211 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:51.211 11219 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.NetworksTest.test_list_networks_fields ... success [0.140s] 2018-05-25 07:11:51,283 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:51.283 11219 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.NetworksTest.test_list_subnets ... success [0.070s] 2018-05-25 07:11:51,332 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:51.332 11219 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.NetworksTest.test_list_subnets_fields ... 
success [0.049s] 2018-05-25 07:11:51,434 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:51.434 11219 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.NetworksTest.test_show_network ... success [0.101s] 2018-05-25 07:11:51,526 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:51.526 11219 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.NetworksTest.test_show_network_fields ... success [0.091s] 2018-05-25 07:11:51,706 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:51.706 11219 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.NetworksTest.test_show_subnet ... success [0.179s] 2018-05-25 07:11:51,762 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:51.762 11219 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.NetworksTest.test_show_subnet_fields ... success [0.056s] 2018-05-25 07:11:53,490 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:53.490 11219 INFO opnfv-tempest [-] {2} tempest.api.compute.servers.test_servers.ServersTestJSON.test_update_server_name ... success [26.962s] 2018-05-25 07:11:54,353 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:54.353 11219 INFO opnfv-tempest [-] {0} tempest.api.network.test_networks.NetworksTest.test_update_subnet_gw_dns_host_routes_dhcp ... success [2.590s] 2018-05-25 07:11:58,058 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:58.057 11219 INFO opnfv-tempest [-] {2} tempest.api.compute.test_quotas.QuotasTestJSON.test_get_default_quotas ... success [0.388s] 2018-05-25 07:11:58,469 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:58.469 11219 INFO opnfv-tempest [-] {2} tempest.api.compute.test_quotas.QuotasTestJSON.test_get_quotas ... success [0.411s] 2018-05-25 07:11:59,435 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:59.434 11219 INFO opnfv-tempest [-] {5} tempest.api.compute.servers.test_create_server.ServersTestManualDisk.test_host_name_is_same_as_server_name ... success [9.328s] 2018-05-25 07:11:59,521 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:59.521 11219 INFO opnfv-tempest [-] {5} tempest.api.compute.servers.test_create_server.ServersTestManualDisk.test_list_servers ... success [0.087s] 2018-05-25 07:11:59,775 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:59.775 11219 INFO opnfv-tempest [-] {5} tempest.api.compute.servers.test_create_server.ServersTestManualDisk.test_list_servers_with_detail ... success [0.253s] 2018-05-25 07:11:59,874 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:59.873 11219 INFO opnfv-tempest [-] {5} tempest.api.compute.servers.test_create_server.ServersTestManualDisk.test_verify_created_server_vcpus ... success [0.096s] 2018-05-25 07:11:59,875 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:11:59.875 11219 INFO opnfv-tempest [-] {5} tempest.api.compute.servers.test_create_server.ServersTestManualDisk.test_verify_server_details ... success [0.002s] 2018-05-25 07:12:03,589 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:12:03.589 11219 INFO opnfv-tempest [-] {1} tempest.api.network.test_security_groups.SecGroupTest.test_create_list_update_show_delete_security_group ... 
success [2.045s] 2018-05-25 07:12:04,872 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:12:04.871 11219 INFO opnfv-tempest [-] {1} tempest.api.network.test_security_groups.SecGroupTest.test_create_security_group_rule_with_additional_args ... success [1.282s] 2018-05-25 07:12:05,085 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:12:05.085 11219 INFO opnfv-tempest [-] {15} tempest.api.compute.servers.test_server_actions.ServerActionsTestJSON.test_rebuild_server ... success [39.129s] 2018-05-25 07:12:05,704 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:12:05.704 11219 INFO opnfv-tempest [-] {0} tempest.api.volume.test_volumes_snapshots.VolumesSnapshotTestJSON.test_snapshot_create_get_list_update_delete ... success [5.132s] 2018-05-25 07:12:08,181 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:12:08.181 11219 INFO opnfv-tempest [-] {1} tempest.api.network.test_security_groups.SecGroupTest.test_create_security_group_rule_with_icmp_type_code ... success [3.305s] 2018-05-25 07:12:09,068 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:12:09.068 11219 INFO opnfv-tempest [-] {1} tempest.api.network.test_security_groups.SecGroupTest.test_create_security_group_rule_with_protocol_integer_value ... success [0.889s] 2018-05-25 07:12:10,353 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:12:10.352 11219 INFO opnfv-tempest [-] {1} tempest.api.network.test_security_groups.SecGroupTest.test_create_security_group_rule_with_remote_group_id ... success [1.281s] 2018-05-25 07:12:10,846 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:12:10.846 11219 INFO opnfv-tempest [-] {2} tempest.api.image.v2.test_images.ListUserImagesTest.test_get_image_schema ... success [0.183s] 2018-05-25 07:12:10,855 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:12:10.855 11219 INFO opnfv-tempest [-] {2} tempest.api.image.v2.test_images.ListUserImagesTest.test_get_images_schema ... success [0.008s] 2018-05-25 07:12:10,893 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:12:10.893 11219 INFO opnfv-tempest [-] {2} tempest.api.image.v2.test_images.ListUserImagesTest.test_list_images_param_container_format ... success [0.038s] 2018-05-25 07:12:10,922 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:12:10.922 11219 INFO opnfv-tempest [-] {2} tempest.api.image.v2.test_images.ListUserImagesTest.test_list_images_param_disk_format ... success [0.029s] 2018-05-25 07:12:10,949 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:12:10.949 11219 INFO opnfv-tempest [-] {2} tempest.api.image.v2.test_images.ListUserImagesTest.test_list_images_param_limit ... success [0.027s] 2018-05-25 07:12:10,997 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:12:10.996 11219 INFO opnfv-tempest [-] {2} tempest.api.image.v2.test_images.ListUserImagesTest.test_list_images_param_min_max_size ... success [0.045s] 2018-05-25 07:12:11,039 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:12:11.039 11219 INFO opnfv-tempest [-] {2} tempest.api.image.v2.test_images.ListUserImagesTest.test_list_images_param_size ... success [0.043s] 2018-05-25 07:12:11,078 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:12:11.078 11219 INFO opnfv-tempest [-] {2} tempest.api.image.v2.test_images.ListUserImagesTest.test_list_images_param_status ... 
success [0.039s] 2018-05-25 07:12:11,115 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:12:11.115 11219 INFO opnfv-tempest [-] {2} tempest.api.image.v2.test_images.ListUserImagesTest.test_list_images_param_visibility ... success [0.036s] 2018-05-25 07:12:11,159 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:12:11.159 11219 INFO opnfv-tempest [-] {2} tempest.api.image.v2.test_images.ListUserImagesTest.test_list_no_params ... success [0.044s] 2018-05-25 07:12:11,525 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:12:11.524 11219 INFO opnfv-tempest [-] {1} tempest.api.network.test_security_groups.SecGroupTest.test_create_security_group_rule_with_remote_ip_prefix ... success [1.173s] 2018-05-25 07:12:11,920 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:12:11.920 11219 INFO opnfv-tempest [-] {5} tempest.api.volume.test_availability_zone.AvailabilityZoneTestJSON.test_get_availability_zone_list ... success [0.147s] 2018-05-25 07:12:12,427 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:12:12.426 11219 INFO opnfv-tempest [-] {0} tempest.api.volume.test_volumes_snapshots.VolumesSnapshotTestJSON.test_volume_from_snapshot ... success [6.721s] 2018-05-25 07:12:13,924 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:12:13.924 11219 INFO opnfv-tempest [-] {1} tempest.api.network.test_security_groups.SecGroupTest.test_create_show_delete_security_group_rule ... success [2.395s] 2018-05-25 07:12:13,962 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:12:13.962 11219 INFO opnfv-tempest [-] {1} tempest.api.network.test_security_groups.SecGroupTest.test_list_security_groups ... success [0.042s] 2018-05-25 07:12:21,034 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:12:21.034 11219 INFO opnfv-tempest [-] {15} tempest.api.compute.servers.test_server_actions.ServerActionsTestJSON.test_stop_start_server ... success [15.946s] 2018-05-25 07:12:36,541 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:12:36.540 11219 INFO opnfv-tempest [-] {15} tempest.api.volume.test_volumes_get.VolumesGetTest.test_volume_create_get_update_delete ... success [6.014s] 2018-05-25 07:12:43,458 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:12:43.457 11219 INFO opnfv-tempest [-] {15} tempest.api.volume.test_volumes_get.VolumesGetTest.test_volume_create_get_update_delete_as_clone ... success [6.914s] 2018-05-25 07:12:53,895 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:12:53.894 11219 INFO opnfv-tempest [-] {15} tempest.api.volume.test_volumes_get.VolumesGetTest.test_volume_create_get_update_delete_from_image ... success [10.437s] 2018-05-25 07:12:58,232 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Showing result for a verification: '['rally', 'verify', 'show', '--uuid', '911a2543-4c7c-4634-81bd-633bb7be8618']'. 
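The run above was launched with 'rally verify start --load-list .../refstack/tempest-list.txt'; the table that follows is the output of the 'rally verify show' call just logged. A short sketch of how the same verification could be inspected or exported afterwards; the HTML report step is an assumption about rally's 'verify report' subcommand, not something this job runs:

    UUID=911a2543-4c7c-4634-81bd-633bb7be8618
    # Summary table for the verification, as printed below in the log
    rally verify show --uuid "$UUID"
    # Optional: full per-test report (assumed 'rally verify report' usage)
    rally verify report --uuid "$UUID" --type html --to ./tempest-report.html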
2018-05-25 07:12:59,240 - functest.opnfv_tests.openstack.tempest.tempest - INFO - +-----------------------------------------------------------------------------------------+ 2018-05-25 07:12:59,241 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Verification | 2018-05-25 07:12:59,241 - functest.opnfv_tests.openstack.tempest.tempest - INFO - +---------------------+-------------------------------------------------------------------+ 2018-05-25 07:12:59,241 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | UUID | 911a2543-4c7c-4634-81bd-633bb7be8618 | 2018-05-25 07:12:59,241 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Status | finished | 2018-05-25 07:12:59,241 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Started at | 2018-05-25 07:09:53 | 2018-05-25 07:12:59,241 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Finished at | 2018-05-25 07:12:57 | 2018-05-25 07:12:59,241 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Duration | 0:03:04 | 2018-05-25 07:12:59,242 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Run arguments | load_list: (value is too long, use 'detailed' flag to display it) | 2018-05-25 07:12:59,242 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Tags | - | 2018-05-25 07:12:59,242 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Verifier name | opnfv-tempest (UUID: 882c2df4-1b4d-474a-b012-4046bb1070b5) | 2018-05-25 07:12:59,242 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Verifier type | tempest (platform: openstack) | 2018-05-25 07:12:59,242 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Deployment name | opnfv-rally (UUID: 542071c6-423d-485f-b4dd-829486c73d05) | 2018-05-25 07:12:59,242 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Tests count | 155 | 2018-05-25 07:12:59,242 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Tests duration, sec | 174.032 | 2018-05-25 07:12:59,242 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Success | 155 | 2018-05-25 07:12:59,242 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Skipped | 0 | 2018-05-25 07:12:59,242 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Expected failures | 0 | 2018-05-25 07:12:59,243 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Unexpected success | 0 | 2018-05-25 07:12:59,243 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Failures | 0 | 2018-05-25 07:12:59,243 - functest.opnfv_tests.openstack.tempest.tempest - INFO - +---------------------+-------------------------------------------------------------------+ 2018-05-25 07:12:59,243 - functest.opnfv_tests.openstack.tempest.tempest - INFO - +--------------------------------------------------------------------------------------------------------------------------------------------------------------------+ 2018-05-25 07:12:59,351 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Tempest refstack_defcore success_rate is 100.0% 2018-05-25 07:13:02,432 - xtesting.core.testcase - INFO - The results were successfully pushed to DB http://testresults.opnfv.org/test/api/v1/results 2018-05-25 07:13:02,432 - xtesting.ci.run_tests - INFO - Test result: +--------------------------+------------------+------------------+----------------+ | TEST CASE | PROJECT | DURATION | RESULT | +--------------------------+------------------+------------------+----------------+ | refstack_defcore | functest | 03:21 | PASS | 
+--------------------------+------------------+------------------+----------------+ 2018-05-25 07:13:02,437 - xtesting.ci.run_tests - INFO - Running test case 'patrole'... 2018-05-25 07:13:02,531 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - cloud: 2018-05-25 07:13:02,531 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - domain: Default 2018-05-25 07:13:02,531 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - Creating Rally environment... 2018-05-25 07:13:05,025 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally deployment destroy --deployment opnfv-rally 2018-05-25 07:13:04.620 11325 INFO rally.deployment.engine [-] Deployment 542071c6-423d-485f-b4dd-829486c73d05 | Starting: Destroy cloud and free allocated resources. 2018-05-25 07:13:04.700 11325 INFO rally.deployment.engine [-] Deployment 542071c6-423d-485f-b4dd-829486c73d05 | Completed: Destroy cloud and free allocated resources. 2018-05-25 07:13:04.746 11325 INFO rally.api [-] Deleting all verifications created by verifier 'opnfv-tempest' (UUID=882c2df4-1b4d-474a-b012-4046bb1070b5) for deployment 'opnfv-rally'. 2018-05-25 07:13:04.767 11325 INFO rally.api [-] Deleting verification (UUID=911a2543-4c7c-4634-81bd-633bb7be8618). 2018-05-25 07:13:04.798 11325 INFO rally.api [-] Verification has been successfully deleted! 2018-05-25 07:13:04.799 11325 INFO rally.api [-] Deleting deployment-specific data for verifier 'opnfv-tempest' (UUID=882c2df4-1b4d-474a-b012-4046bb1070b5). 2018-05-25 07:13:04.811 11325 INFO rally.api [-] Deployment-specific data has been successfully deleted! 2018-05-25 07:13:07,373 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally deployment create --fromenv --name opnfv-rally 2018-05-25 07:13:07.087 11328 INFO rally.deployment.engines.existing [-] Save deployment 'opnfv-rally' (uuid=3d3efed7-87c5-4baa-8dea-fa97532582cb) with 'openstack' platform. 
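Between test cases Functest tears the Rally deployment down and registers a fresh one, which also deletes the verifications and the deployment-specific verifier data, as the entries above show. The same cleanup done by hand, using the commands from the log:

    # Drop the previous deployment; Rally also removes the verifications tied to it
    rally deployment destroy --deployment opnfv-rally
    # Re-register from the current OS_* environment for the next test case (patrole)
    rally deployment create --fromenv --name opnfv-rally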
+--------------------------------------+---------------------+-------------+------------------+--------+ | uuid | created_at | name | status | active | +--------------------------------------+---------------------+-------------+------------------+--------+ | 3d3efed7-87c5-4baa-8dea-fa97532582cb | 2018-05-25T07:13:07 | opnfv-rally | deploy->finished | | +--------------------------------------+---------------------+-------------+------------------+--------+ Using deployment: 3d3efed7-87c5-4baa-8dea-fa97532582cb ~/.rally/openrc was updated HINTS: * To use standard OpenStack clients, set up your env by running: source ~/.rally/openrc OpenStack clients are now configured, e.g run: openstack image list 2018-05-25 07:13:10,748 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally deployment check -------------------------------------------------------------------------------- Platform openstack: -------------------------------------------------------------------------------- Available services: +-------------+----------------+-----------+ | Service | Service Type | Status | +-------------+----------------+-----------+ | __unknown__ | alarming | Available | | __unknown__ | key-manager | Available | | __unknown__ | placement | Available | | __unknown__ | policy | Available | | __unknown__ | volumev2 | Available | | __unknown__ | volumev3 | Available | | ceilometer | metering | Available | | cinder | volume | Available | | cloud | cloudformation | Available | | glance | image | Available | | gnocchi | metric | Available | | heat | orchestration | Available | | keystone | identity | Available | | neutron | network | Available | | nova | compute | Available | +-------------+----------------+-----------+ 2018-05-25 07:13:10,749 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - Create verifier from existing repo... 2018-05-25 07:13:13,265 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally verify delete-verifier --id opnfv-tempest --force 2018-05-25 07:13:12.946 11334 INFO rally.api [-] Deleting verifier 'opnfv-tempest' (UUID=882c2df4-1b4d-474a-b012-4046bb1070b5). 2018-05-25 07:13:13.071 11334 INFO rally.api [-] Verifier has been successfully deleted! 2018-05-25 07:13:16,619 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally verify create-verifier --source /src/tempest --name opnfv-tempest --type tempest --system-wide 2018-05-25 07:13:15.336 11337 INFO rally.api [-] Creating verifier 'opnfv-tempest'. 2018-05-25 07:13:15.479 11337 INFO rally.verification.manager [-] Cloning verifier repo from /src/tempest. 2018-05-25 07:13:16.478 11337 INFO rally.api [-] Verifier 'opnfv-tempest' (UUID=74be5511-2b5e-4aa5-9526-cca07253bf4d) has been successfully created! Using verifier 'opnfv-tempest' (UUID=74be5511-2b5e-4aa5-9526-cca07253bf4d) as the default verifier for the future CLI operations. 
2018-05-25 07:13:19,663 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Creating network with name: 'tempest-net-bc3683ca-bfb2-497c-8651-ee967665ebc2' 2018-05-25 07:13:20,421 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - network: Munch({u'status': u'ACTIVE', u'subnets': [], u'description': u'', u'provider:physical_network': None, u'tags': [], u'ipv6_address_scope': None, u'updated_at': u'2018-05-25T07:13:20Z', u'is_default': False, u'revision_number': 3, u'port_security_enabled': True, u'provider:network_type': u'geneve', u'id': u'df0949c7-0ebf-4e21-8185-485a9c1cc176', u'provider:segmentation_id': 21, u'router:external': False, u'availability_zone_hints': [], u'availability_zones': [], u'qos_policy_id': None, u'admin_state_up': True, u'name': u'tempest-net-bc3683ca-bfb2-497c-8651-ee967665ebc2', u'created_at': u'2018-05-25T07:13:20Z', u'mtu': 1442, u'tenant_id': u'51534bd63d854b6c878cd0603da66c99', u'ipv4_address_scope': None, u'shared': False, u'project_id': u'51534bd63d854b6c878cd0603da66c99'}) 2018-05-25 07:13:20,798 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - subnet: Munch({u'description': u'', u'tags': [], u'updated_at': u'2018-05-25T07:13:20Z', u'ipv6_ra_mode': None, u'allocation_pools': [{u'start': u'192.168.150.2', u'end': u'192.168.150.254'}], u'host_routes': [], u'revision_number': 0, u'ipv6_address_mode': None, u'cidr': u'192.168.150.0/24', u'id': u'bbddb479-85de-49a2-a2b5-62ddf8d88e9b', u'subnetpool_id': None, u'service_types': [], u'name': u'tempest-subnet-bc3683ca-bfb2-497c-8651-ee967665ebc2', u'enable_dhcp': True, u'network_id': u'df0949c7-0ebf-4e21-8185-485a9c1cc176', u'tenant_id': u'51534bd63d854b6c878cd0603da66c99', u'created_at': u'2018-05-25T07:13:20Z', u'dns_nameservers': [u'8.8.8.8'], u'gateway_ip': u'192.168.150.1', u'ip_version': 4, u'project_id': u'51534bd63d854b6c878cd0603da66c99'}) 2018-05-25 07:13:20,798 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - Creating two images for Tempest suite 2018-05-25 07:13:20,798 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Creating image with name: 'Cirros-0.4.0-bc3683ca-bfb2-497c-8651-ee967665ebc2' 2018-05-25 07:13:21,664 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - image: Munch({u'status': u'queued', u'owner_specified.shade.object': u'images/Cirros-0.4.0-bc3683ca-bfb2-497c-8651-ee967665ebc2', u'tags': [], u'container_format': u'bare', u'min_ram': 0, u'updated_at': u'2018-05-25T07:13:21Z', u'owner_specified.shade.sha256': u'a8dd75ecffd4cdd96072d60c2237b448e0c8b2bc94d57f10fdbc8c481d9005b8', u'locations': [], u'min_disk': 0, u'visibility': u'public', u'file': u'/v2/images/781ee7f9-19c2-41be-88f7-b2ec3533489b/file', u'owner': u'51534bd63d854b6c878cd0603da66c99', u'virtual_size': None, u'owner_specified.shade.md5': u'443b7623e27ecf03dc9e01ee93f67afe', u'id': u'781ee7f9-19c2-41be-88f7-b2ec3533489b', u'size': None, u'name': u'Cirros-0.4.0-bc3683ca-bfb2-497c-8651-ee967665ebc2', u'checksum': None, u'self': u'/v2/images/781ee7f9-19c2-41be-88f7-b2ec3533489b', u'disk_format': u'qcow2', u'protected': False, u'created_at': u'2018-05-25T07:13:21Z', u'schema': u'/v2/schemas/image'}) 2018-05-25 07:13:21,664 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Creating image with name: 'Cirros-0.4.0-1-bc3683ca-bfb2-497c-8651-ee967665ebc2' 2018-05-25 07:13:22,718 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - image: Munch({u'status': u'queued', u'owner_specified.shade.object': u'images/Cirros-0.4.0-1-bc3683ca-bfb2-497c-8651-ee967665ebc2', u'tags': [], 
u'container_format': u'bare', u'min_ram': 0, u'updated_at': u'2018-05-25T07:13:21Z', u'owner_specified.shade.sha256': u'a8dd75ecffd4cdd96072d60c2237b448e0c8b2bc94d57f10fdbc8c481d9005b8', u'locations': [], u'min_disk': 0, u'visibility': u'public', u'file': u'/v2/images/d57234de-4826-43c8-ba36-2b1308e9c996/file', u'owner': u'51534bd63d854b6c878cd0603da66c99', u'virtual_size': None, u'owner_specified.shade.md5': u'443b7623e27ecf03dc9e01ee93f67afe', u'id': u'd57234de-4826-43c8-ba36-2b1308e9c996', u'size': None, u'name': u'Cirros-0.4.0-1-bc3683ca-bfb2-497c-8651-ee967665ebc2', u'checksum': None, u'self': u'/v2/images/d57234de-4826-43c8-ba36-2b1308e9c996', u'disk_format': u'qcow2', u'protected': False, u'created_at': u'2018-05-25T07:13:21Z', u'schema': u'/v2/schemas/image'}) 2018-05-25 07:13:22,718 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Creating two flavors for Tempest suite 2018-05-25 07:13:22,918 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - flavor: Munch({'name': u'opnfv_flavor-bc3683ca-bfb2-497c-8651-ee967665ebc2', 'ephemeral': 0, 'ram': 512, 'is_disabled': False, 'properties': Munch({u'OS-FLV-DISABLED:disabled': False, u'OS-FLV-EXT-DATA:ephemeral': 0, u'os-flavor-access:is_public': True}), u'OS-FLV-DISABLED:disabled': False, 'vcpus': 1, 'extra_specs': Munch({}), 'location': Munch({'project': Munch({'domain_id': None, 'id': u'51534bd63d854b6c878cd0603da66c99', 'name': 'admin', 'domain_name': 'Default'}), 'zone': None, 'region_name': 'regionOne', 'cloud': 'envvars'}), u'os-flavor-access:is_public': True, 'rxtx_factor': 1.0, 'is_public': True, u'OS-FLV-EXT-DATA:ephemeral': 0, 'disk': 1, 'id': u'0398c9dc-a162-4e47-96d0-37211a67f746', 'swap': 0}) 2018-05-25 07:13:22,987 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - flavor: Munch({'name': u'opnfv_flavor_1-bc3683ca-bfb2-497c-8651-ee967665ebc2', 'ephemeral': 0, 'ram': 512, 'is_disabled': False, 'properties': Munch({u'OS-FLV-DISABLED:disabled': False, u'OS-FLV-EXT-DATA:ephemeral': 0, u'os-flavor-access:is_public': True}), u'OS-FLV-DISABLED:disabled': False, 'vcpus': 1, 'extra_specs': Munch({}), 'location': Munch({'project': Munch({'domain_id': None, 'id': u'51534bd63d854b6c878cd0603da66c99', 'name': 'admin', 'domain_name': 'Default'}), 'zone': None, 'region_name': 'regionOne', 'cloud': 'envvars'}), u'os-flavor-access:is_public': True, 'rxtx_factor': 1.0, 'is_public': True, u'OS-FLV-EXT-DATA:ephemeral': 0, 'disk': 1, 'id': u'114966e2-e4cf-43f6-a867-3758ad8a4b8a', 'swap': 0}) 2018-05-25 07:13:25,958 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally verify configure-verifier --reconfigure --id opnfv-tempest 2018-05-25 07:13:25.123 11356 INFO rally.api [-] Configuring verifier 'opnfv-tempest' (UUID=74be5511-2b5e-4aa5-9526-cca07253bf4d) for deployment 'opnfv-rally' (UUID=3d3efed7-87c5-4baa-8dea-fa97532582cb). 2018-05-25 07:13:25.799 11356 INFO rally.api [-] Verifier 'opnfv-tempest' (UUID=74be5511-2b5e-4aa5-9526-cca07253bf4d) has been successfully configured for deployment 'opnfv-rally' (UUID=3d3efed7-87c5-4baa-8dea-fa97532582cb)! 2018-05-25 07:13:25,958 - functest.opnfv_tests.openstack.tempest.conf_utils - DEBUG - Looking for tempest.conf file... 2018-05-25 07:13:25,959 - functest.opnfv_tests.openstack.tempest.conf_utils - DEBUG - Updating selected tempest.conf parameters... 
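The next entries finish updating tempest.conf and then regenerate the test list for Patrole: instead of the refstack list, a testr regex keeps only the Patrole image/network/volume RBAC tests (excluding test_networks_multiprovider_rbac), and the resulting file is fed to 'rally verify start --load-list'. A condensed sketch of those two steps; the verifier path is shortened to a placeholder here, the real path contains the verifier UUID as logged below:

    VERIFIER_REPO=/root/.rally/verification/verifier-<uuid>/repo   # placeholder for the logged path
    LIST=/home/opnfv/functest/results/patrole/tempest-list.txt
    # Build the load list from the Patrole plugin, excluding the multiprovider RBAC test
    (cd "$VERIFIER_REPO"; testr list-tests \
        '(?!.*test_networks_multiprovider_rbac)(?=patrole_tempest_plugin.tests.api.(image|network|volume))' \
        > "$LIST" 2>/dev/null)
    # Run only those tests with the freshly configured verifier
    rally verify start --load-list "$LIST"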
2018-05-25 07:13:25,961 - functest.opnfv_tests.openstack.tempest.conf_utils - DEBUG - Add/Update required params defined in tempest_conf.yaml into tempest.conf file 2018-05-25 07:13:25,972 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - Generating test case list... 2018-05-25 07:13:29,334 - functest.opnfv_tests.openstack.tempest.tempest - INFO - (cd /root/.rally/verification/verifier-74be5511-2b5e-4aa5-9526-cca07253bf4d/repo; testr list-tests '(?!.*test_networks_multiprovider_rbac)(?=patrole_tempest_plugin.tests.api.(image|network|volume))' >/home/opnfv/functest/results/patrole/tempest-list.txt 2>/dev/null) 2018-05-25 07:13:29,334 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Starting Tempest test suite: '['rally', 'verify', 'start', '--load-list', u'/home/opnfv/functest/results/patrole/tempest-list.txt']'. 2018-05-25 07:13:31,381 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:13:31.380 11365 INFO rally.api [-] Starting verification (UUID=ab095671-3324-4edd-97c0-35088a3a09f3) for deployment 'opnfv-rally' (UUID=3d3efed7-87c5-4baa-8dea-fa97532582cb) by verifier 'opnfv-tempest' (UUID=74be5511-2b5e-4aa5-9526-cca07253bf4d). 2018-05-25 07:13:31,381 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - Verification UUID: ab095671-3324-4edd-97c0-35088a3a09f3 2018-05-25 07:15:56,003 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Showing result for a verification: '['rally', 'verify', 'show', '--uuid', 'ab095671-3324-4edd-97c0-35088a3a09f3']'. 2018-05-25 07:15:57,038 - functest.opnfv_tests.openstack.tempest.tempest - INFO - +-----------------------------------------------------------------------------------------+ 2018-05-25 07:15:57,039 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Verification | 2018-05-25 07:15:57,039 - functest.opnfv_tests.openstack.tempest.tempest - INFO - +---------------------+-------------------------------------------------------------------+ 2018-05-25 07:15:57,039 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | UUID | ab095671-3324-4edd-97c0-35088a3a09f3 | 2018-05-25 07:15:57,039 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Status | failed | 2018-05-25 07:15:57,039 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Started at | 2018-05-25 07:13:31 | 2018-05-25 07:15:57,039 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Finished at | 2018-05-25 07:15:55 | 2018-05-25 07:15:57,039 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Duration | 0:02:24 | 2018-05-25 07:15:57,039 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Run arguments | load_list: (value is too long, use 'detailed' flag to display it) | 2018-05-25 07:15:57,039 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Tags | - | 2018-05-25 07:15:57,039 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Verifier name | opnfv-tempest (UUID: 74be5511-2b5e-4aa5-9526-cca07253bf4d) | 2018-05-25 07:15:57,039 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Verifier type | tempest (platform: openstack) | 2018-05-25 07:15:57,039 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Deployment name | opnfv-rally (UUID: 3d3efed7-87c5-4baa-8dea-fa97532582cb) | 2018-05-25 07:15:57,040 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Tests count | 287 | 2018-05-25 07:15:57,040 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Tests duration, sec | 136.281 | 2018-05-25 07:15:57,040 - functest.opnfv_tests.openstack.tempest.tempest 
- INFO - | Success | 114 | 2018-05-25 07:15:57,040 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Skipped | 166 | 2018-05-25 07:15:57,040 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Expected failures | 0 | 2018-05-25 07:15:57,040 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Unexpected success | 0 | 2018-05-25 07:15:57,040 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Failures | 7 | 2018-05-25 07:15:57,040 - functest.opnfv_tests.openstack.tempest.tempest - INFO - +---------------------+-------------------------------------------------------------------+ 2018-05-25 07:15:57,040 - functest.opnfv_tests.openstack.tempest.tempest - INFO - +---------------------------------------------------------------------------------------------------------------------------------------------------------------------------+ 2018-05-25 07:15:57,237 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Tempest patrole success_rate is 94.2148760331% 2018-05-25 07:16:00,587 - xtesting.core.testcase - INFO - The results were successfully pushed to DB http://testresults.opnfv.org/test/api/v1/results 2018-05-25 07:16:00,587 - xtesting.ci.run_tests - INFO - Test result: +-------------------+------------------+------------------+----------------+ | TEST CASE | PROJECT | DURATION | RESULT | +-------------------+------------------+------------------+----------------+ | patrole | functest | 02:41 | FAIL | +-------------------+------------------+------------------+----------------+ 2018-05-25 07:16:00,591 - xtesting.ci.run_tests - ERROR - The test case 'patrole' failed. 2018-05-25 07:16:00,591 - xtesting.ci.run_tests - INFO - Running test case 'snaps_smoke'... 2018-05-25 07:16:01,657 - functest.opnfv_tests.openstack.snaps.snaps_test_runner - INFO - Using flavor metadata 'None' 2018-05-25 07:57:46,510 - xtesting.core.unit - DEBUG - test_add_rule (snaps.openstack.tests.create_security_group_tests.CreateSecurityGroupTests) ... ok test_create_delete_group (snaps.openstack.tests.create_security_group_tests.CreateSecurityGroupTests) ... ok test_create_group_admin_user_to_new_project (snaps.openstack.tests.create_security_group_tests.CreateSecurityGroupTests) ... ok test_create_group_new_user_to_admin_project (snaps.openstack.tests.create_security_group_tests.CreateSecurityGroupTests) ... ok test_create_group_with_one_complex_rule (snaps.openstack.tests.create_security_group_tests.CreateSecurityGroupTests) ... ok test_create_group_with_one_simple_rule (snaps.openstack.tests.create_security_group_tests.CreateSecurityGroupTests) ... ok test_create_group_with_several_rules (snaps.openstack.tests.create_security_group_tests.CreateSecurityGroupTests) ... ok test_create_group_without_rules (snaps.openstack.tests.create_security_group_tests.CreateSecurityGroupTests) ... ok test_remove_rule_by_id (snaps.openstack.tests.create_security_group_tests.CreateSecurityGroupTests) ... ok test_remove_rule_by_setting (snaps.openstack.tests.create_security_group_tests.CreateSecurityGroupTests) ... ok test_create_delete_image (snaps.openstack.tests.create_image_tests.CreateImageSuccessTests) ... ok test_create_image_clean_file (snaps.openstack.tests.create_image_tests.CreateImageSuccessTests) ... ok test_create_image_clean_url (snaps.openstack.tests.create_image_tests.CreateImageSuccessTests) ... ok test_create_image_clean_url_properties (snaps.openstack.tests.create_image_tests.CreateImageSuccessTests) ... 
ok test_create_same_image (snaps.openstack.tests.create_image_tests.CreateImageSuccessTests) ... ok test_create_same_image_new_settings (snaps.openstack.tests.create_image_tests.CreateImageSuccessTests) ... ok test_bad_image_file (snaps.openstack.tests.create_image_tests.CreateImageNegativeTests) ... ok test_bad_image_image_type (snaps.openstack.tests.create_image_tests.CreateImageNegativeTests) ... ok test_bad_image_name (snaps.openstack.tests.create_image_tests.CreateImageNegativeTests) ... ok test_bad_image_url (snaps.openstack.tests.create_image_tests.CreateImageNegativeTests) ... ok test_create_three_part_image_from_file_3_creators (snaps.openstack.tests.create_image_tests.CreateMultiPartImageTests) ... ok test_create_three_part_image_from_url (snaps.openstack.tests.create_image_tests.CreateMultiPartImageTests) ... ok test_create_three_part_image_from_url_3_creators (snaps.openstack.tests.create_image_tests.CreateMultiPartImageTests) ... ok test_create_delete_keypair (snaps.openstack.tests.create_keypairs_tests.CreateKeypairsTests) ... ok test_create_keypair_from_file (snaps.openstack.tests.create_keypairs_tests.CreateKeypairsTests) ... ok test_create_keypair_large_key (snaps.openstack.tests.create_keypairs_tests.CreateKeypairsTests) ... ok test_create_keypair_only (snaps.openstack.tests.create_keypairs_tests.CreateKeypairsTests) ... ok test_create_keypair_save_both (snaps.openstack.tests.create_keypairs_tests.CreateKeypairsTests) ... ok test_create_keypair_save_pub_only (snaps.openstack.tests.create_keypairs_tests.CreateKeypairsTests) ... ok test_create_keypair_exist_files_delete (snaps.openstack.tests.create_keypairs_tests.CreateKeypairsCleanupTests) ... ok test_create_keypair_exist_files_keep (snaps.openstack.tests.create_keypairs_tests.CreateKeypairsCleanupTests) ... ok test_create_keypair_gen_files_delete_1 (snaps.openstack.tests.create_keypairs_tests.CreateKeypairsCleanupTests) ... ok test_create_keypair_gen_files_delete_2 (snaps.openstack.tests.create_keypairs_tests.CreateKeypairsCleanupTests) ... ok test_create_keypair_gen_files_keep (snaps.openstack.tests.create_keypairs_tests.CreateKeypairsCleanupTests) ... ok test_create_delete_network (snaps.openstack.tests.create_network_tests.CreateNetworkSuccessTests) ... ok test_create_network_router_admin_user_to_new_project (snaps.openstack.tests.create_network_tests.CreateNetworkSuccessTests) ... ok test_create_network_router_new_user_to_admin_project (snaps.openstack.tests.create_network_tests.CreateNetworkSuccessTests) ... ok test_create_network_with_router (snaps.openstack.tests.create_network_tests.CreateNetworkSuccessTests) ... ok test_create_network_without_router (snaps.openstack.tests.create_network_tests.CreateNetworkSuccessTests) ... ok test_create_networks_same_name (snaps.openstack.tests.create_network_tests.CreateNetworkSuccessTests) ... ok test_create_delete_router (snaps.openstack.tests.create_router_tests.CreateRouterSuccessTests) ... ok test_create_router_admin_state_True (snaps.openstack.tests.create_router_tests.CreateRouterSuccessTests) ... ok test_create_router_admin_state_false (snaps.openstack.tests.create_router_tests.CreateRouterSuccessTests) ... ok test_create_router_admin_user_to_new_project (snaps.openstack.tests.create_router_tests.CreateRouterSuccessTests) ... ok test_create_router_external_network (snaps.openstack.tests.create_router_tests.CreateRouterSuccessTests) ... ok test_create_router_new_user_as_admin_project (snaps.openstack.tests.create_router_tests.CreateRouterSuccessTests) ... 
ok test_create_router_private_network (snaps.openstack.tests.create_router_tests.CreateRouterSuccessTests) ... ok test_create_router_vanilla (snaps.openstack.tests.create_router_tests.CreateRouterSuccessTests) ... ok test_create_router_with_ext_port (snaps.openstack.tests.create_router_tests.CreateRouterSuccessTests) ... ok test_create_router_admin_ports (snaps.openstack.tests.create_router_tests.CreateRouterNegativeTests) ... ok test_create_router_invalid_gateway_name (snaps.openstack.tests.create_router_tests.CreateRouterNegativeTests) ... ok test_create_router_noname (snaps.openstack.tests.create_router_tests.CreateRouterNegativeTests) ... ok test_create_delete_qos (snaps.openstack.tests.create_qos_tests.CreateQoSTests) ... ok test_create_qos (snaps.openstack.tests.create_qos_tests.CreateQoSTests) ... ok test_create_same_qos (snaps.openstack.tests.create_qos_tests.CreateQoSTests) ... ok test_create_delete_volume_type (snaps.openstack.tests.create_volume_type_tests.CreateSimpleVolumeTypeSuccessTests) ... ok test_create_same_volume_type (snaps.openstack.tests.create_volume_type_tests.CreateSimpleVolumeTypeSuccessTests) ... ok test_create_volume_type (snaps.openstack.tests.create_volume_type_tests.CreateSimpleVolumeTypeSuccessTests) ... ok test_volume_type_with_encryption (snaps.openstack.tests.create_volume_type_tests.CreateVolumeTypeComplexTests) ... ok test_volume_type_with_qos (snaps.openstack.tests.create_volume_type_tests.CreateVolumeTypeComplexTests) ... ok test_volume_type_with_qos_and_encryption (snaps.openstack.tests.create_volume_type_tests.CreateVolumeTypeComplexTests) ... ok test_create_delete_volume (snaps.openstack.tests.create_volume_tests.CreateSimpleVolumeSuccessTests) ... ok test_create_same_volume (snaps.openstack.tests.create_volume_tests.CreateSimpleVolumeSuccessTests) ... ok test_create_volume_simple (snaps.openstack.tests.create_volume_tests.CreateSimpleVolumeSuccessTests) ... ok test_create_volume_bad_image (snaps.openstack.tests.create_volume_tests.CreateSimpleVolumeFailureTests) ... ok test_create_volume_bad_size (snaps.openstack.tests.create_volume_tests.CreateSimpleVolumeFailureTests) ... ok test_create_volume_bad_type (snaps.openstack.tests.create_volume_tests.CreateSimpleVolumeFailureTests) ... ok test_bad_volume_type (snaps.openstack.tests.create_volume_tests.CreateVolumeWithTypeTests) ... ok test_valid_volume_type (snaps.openstack.tests.create_volume_tests.CreateVolumeWithTypeTests) ... ok test_bad_image_name (snaps.openstack.tests.create_volume_tests.CreateVolumeWithImageTests) ... ok test_valid_volume_image (snaps.openstack.tests.create_volume_tests.CreateVolumeWithImageTests) ... ok test_check_vm_ip_dhcp (snaps.openstack.tests.create_instance_tests.SimpleHealthCheck) ... ok test_ping_via_router (snaps.openstack.tests.create_instance_tests.CreateInstanceTwoNetTests) ... ok test_create_admin_instance (snaps.openstack.tests.create_instance_tests.CreateInstanceSimpleTests) ... ok test_create_delete_instance (snaps.openstack.tests.create_instance_tests.CreateInstanceSimpleTests) ... ok test_set_allowed_address_pairs (snaps.openstack.tests.create_instance_tests.CreateInstancePortManipulationTests) ... ok test_set_allowed_address_pairs_bad_ip (snaps.openstack.tests.create_instance_tests.CreateInstancePortManipulationTests) ... ok test_set_allowed_address_pairs_bad_mac (snaps.openstack.tests.create_instance_tests.CreateInstancePortManipulationTests) ... 
ok test_set_custom_invalid_ip_one_subnet (snaps.openstack.tests.create_instance_tests.CreateInstancePortManipulationTests) ... ok test_set_custom_invalid_mac (snaps.openstack.tests.create_instance_tests.CreateInstancePortManipulationTests) ... ok test_set_custom_mac_and_ip (snaps.openstack.tests.create_instance_tests.CreateInstancePortManipulationTests) ... ok test_set_custom_valid_ip_one_subnet (snaps.openstack.tests.create_instance_tests.CreateInstancePortManipulationTests) ... ok test_set_custom_valid_mac (snaps.openstack.tests.create_instance_tests.CreateInstancePortManipulationTests) ... ok test_set_one_port_two_ip_one_subnet (snaps.openstack.tests.create_instance_tests.CreateInstancePortManipulationTests) ... ok test_set_one_port_two_ip_two_subnets (snaps.openstack.tests.create_instance_tests.CreateInstancePortManipulationTests) ... ok test_add_invalid_security_group (snaps.openstack.tests.create_instance_tests.InstanceSecurityGroupTests) ... ok test_add_same_security_group (snaps.openstack.tests.create_instance_tests.InstanceSecurityGroupTests) ... ok test_add_security_group (snaps.openstack.tests.create_instance_tests.InstanceSecurityGroupTests) ... ok test_remove_security_group (snaps.openstack.tests.create_instance_tests.InstanceSecurityGroupTests) ... ok test_remove_security_group_never_added (snaps.openstack.tests.create_instance_tests.InstanceSecurityGroupTests) ... ok test_deploy_vm_to_each_compute_node (snaps.openstack.tests.create_instance_tests.CreateInstanceOnComputeHost) ... ok test_create_instance_from_three_part_image (snaps.openstack.tests.create_instance_tests.CreateInstanceFromThreePartImage) ... ok test_create_instance_with_one_volume (snaps.openstack.tests.create_instance_tests.CreateInstanceVolumeTests) ... ok test_create_instance_with_two_volumes (snaps.openstack.tests.create_instance_tests.CreateInstanceVolumeTests) ... ok test_create_delete_stack (snaps.openstack.tests.create_stack_tests.CreateStackSuccessTests) ... ok test_create_same_stack (snaps.openstack.tests.create_stack_tests.CreateStackSuccessTests) ... ok test_create_stack_short_timeout (snaps.openstack.tests.create_stack_tests.CreateStackSuccessTests) ... ok test_create_stack_template_dict (snaps.openstack.tests.create_stack_tests.CreateStackSuccessTests) ... ok test_create_stack_template_file (snaps.openstack.tests.create_stack_tests.CreateStackSuccessTests) ... ok test_retrieve_network_creators (snaps.openstack.tests.create_stack_tests.CreateStackSuccessTests) ... ok test_retrieve_vm_inst_creators (snaps.openstack.tests.create_stack_tests.CreateStackSuccessTests) ... ok test_retrieve_volume_creator (snaps.openstack.tests.create_stack_tests.CreateStackVolumeTests) ... ok test_retrieve_volume_type_creator (snaps.openstack.tests.create_stack_tests.CreateStackVolumeTests) ... ok test_retrieve_flavor_creator (snaps.openstack.tests.create_stack_tests.CreateStackFlavorTests) ... ok test_retrieve_keypair_creator (snaps.openstack.tests.create_stack_tests.CreateStackKeypairTests) ... ok test_retrieve_security_group_creator (snaps.openstack.tests.create_stack_tests.CreateStackSecurityGroupTests) ... ok test_bad_stack_file (snaps.openstack.tests.create_stack_tests.CreateStackNegativeTests) ... ok test_missing_dependencies (snaps.openstack.tests.create_stack_tests.CreateStackNegativeTests) ... ok test_single_port_static (snaps.openstack.tests.create_instance_tests.CreateInstanceSingleNetworkTests) ... 
ok test_ssh_client_fip_after_active (snaps.openstack.tests.create_instance_tests.CreateInstanceSingleNetworkTests) ... ok test_ssh_client_fip_after_init (snaps.openstack.tests.create_instance_tests.CreateInstanceSingleNetworkTests) ... ok test_ssh_client_fip_after_reboot (snaps.openstack.tests.create_instance_tests.CreateInstanceSingleNetworkTests) ... ok test_ssh_client_fip_before_active (snaps.openstack.tests.create_instance_tests.CreateInstanceSingleNetworkTests) ... ok test_ssh_client_fip_reverse_engineer (snaps.openstack.tests.create_instance_tests.CreateInstanceSingleNetworkTests) ... ok test_ssh_client_fip_second_creator (snaps.openstack.tests.create_instance_tests.CreateInstanceSingleNetworkTests) ... ok test_connect_via_ssh_heat_vm (snaps.openstack.tests.create_stack_tests.CreateStackFloatingIpTests) ... ok test_connect_via_ssh_heat_vm_derived (snaps.openstack.tests.create_stack_tests.CreateStackFloatingIpTests) ... ok test_apply_simple_playbook (snaps.provisioning.tests.ansible_utils_tests.AnsibleProvisioningTests) ... ok test_apply_template_playbook (snaps.provisioning.tests.ansible_utils_tests.AnsibleProvisioningTests) ... ok ---------------------------------------------------------------------- Ran 119 tests in 2504.783s OK 2018-05-25 07:57:46,626 - xtesting.core.testcase - INFO - The results were successfully pushed to DB http://testresults.opnfv.org/test/api/v1/results 2018-05-25 07:57:46,626 - xtesting.ci.run_tests - INFO - Test result: +---------------------+------------------+------------------+----------------+ | TEST CASE | PROJECT | DURATION | RESULT | +---------------------+------------------+------------------+----------------+ | snaps_smoke | functest | 41:45 | PASS | +---------------------+------------------+------------------+----------------+ 2018-05-25 07:57:46,630 - xtesting.ci.run_tests - INFO - Running test case 'neutron_trunk'... 2018-05-25 07:57:46,719 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - cloud: 2018-05-25 07:57:46,720 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - domain: Default 2018-05-25 07:57:46,720 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - Creating Rally environment... 2018-05-25 07:57:49,289 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally deployment destroy --deployment opnfv-rally 2018-05-25 07:57:48.920 11594 INFO rally.deployment.engine [-] Deployment 3d3efed7-87c5-4baa-8dea-fa97532582cb | Starting: Destroy cloud and free allocated resources. 2018-05-25 07:57:49.002 11594 INFO rally.deployment.engine [-] Deployment 3d3efed7-87c5-4baa-8dea-fa97532582cb | Completed: Destroy cloud and free allocated resources. 2018-05-25 07:57:49.045 11594 INFO rally.api [-] Deleting all verifications created by verifier 'opnfv-tempest' (UUID=74be5511-2b5e-4aa5-9526-cca07253bf4d) for deployment 'opnfv-rally'. 2018-05-25 07:57:49.069 11594 INFO rally.api [-] Deleting verification (UUID=ab095671-3324-4edd-97c0-35088a3a09f3). 2018-05-25 07:57:49.105 11594 INFO rally.api [-] Verification has been successfully deleted! 2018-05-25 07:57:49.105 11594 INFO rally.api [-] Deleting deployment-specific data for verifier 'opnfv-tempest' (UUID=74be5511-2b5e-4aa5-9526-cca07253bf4d). 2018-05-25 07:57:49.113 11594 INFO rally.api [-] Deployment-specific data has been successfully deleted! 
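
[Editor's note] The conf_utils entries just above and below drive the Rally CLI directly. A rough way to reproduce the same deployment and verifier recycle outside Functest (a sketch only; it assumes the rally CLI is on PATH and the same OS_* variables are exported; the commands themselves are the ones logged here):

    import subprocess

    # Drop the previous Rally deployment, then re-register one from the
    # OS_* environment variables, exactly as conf_utils logs.
    subprocess.check_call(['rally', 'deployment', 'destroy', '--deployment', 'opnfv-rally'])
    subprocess.check_call(['rally', 'deployment', 'create', '--fromenv', '--name', 'opnfv-rally'])
    subprocess.check_call(['rally', 'deployment', 'check'])

    # Recreate the Tempest verifier from the local source checkout.
    subprocess.check_call(['rally', 'verify', 'delete-verifier', '--id', 'opnfv-tempest', '--force'])
    subprocess.check_call(['rally', 'verify', 'create-verifier', '--source', '/src/tempest',
                           '--name', 'opnfv-tempest', '--type', 'tempest', '--system-wide'])
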
2018-05-25 07:57:51,870 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally deployment create --fromenv --name opnfv-rally 2018-05-25 07:57:51.578 11597 INFO rally.deployment.engines.existing [-] Save deployment 'opnfv-rally' (uuid=c50651c3-da94-4d0d-8a81-977bf31bf872) with 'openstack' platform. +--------------------------------------+---------------------+-------------+------------------+--------+ | uuid | created_at | name | status | active | +--------------------------------------+---------------------+-------------+------------------+--------+ | c50651c3-da94-4d0d-8a81-977bf31bf872 | 2018-05-25T07:57:51 | opnfv-rally | deploy->finished | | +--------------------------------------+---------------------+-------------+------------------+--------+ Using deployment: c50651c3-da94-4d0d-8a81-977bf31bf872 ~/.rally/openrc was updated HINTS: * To use standard OpenStack clients, set up your env by running: source ~/.rally/openrc OpenStack clients are now configured, e.g run: openstack image list 2018-05-25 07:57:55,164 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally deployment check -------------------------------------------------------------------------------- Platform openstack: -------------------------------------------------------------------------------- Available services: +-------------+----------------+-----------+ | Service | Service Type | Status | +-------------+----------------+-----------+ | __unknown__ | alarming | Available | | __unknown__ | key-manager | Available | | __unknown__ | placement | Available | | __unknown__ | policy | Available | | __unknown__ | volumev2 | Available | | __unknown__ | volumev3 | Available | | ceilometer | metering | Available | | cinder | volume | Available | | cloud | cloudformation | Available | | glance | image | Available | | gnocchi | metric | Available | | heat | orchestration | Available | | keystone | identity | Available | | neutron | network | Available | | nova | compute | Available | +-------------+----------------+-----------+ 2018-05-25 07:57:55,164 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - Create verifier from existing repo... 2018-05-25 07:57:57,744 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally verify delete-verifier --id opnfv-tempest --force 2018-05-25 07:57:57.437 11603 INFO rally.api [-] Deleting verifier 'opnfv-tempest' (UUID=74be5511-2b5e-4aa5-9526-cca07253bf4d). 2018-05-25 07:57:57.565 11603 INFO rally.api [-] Verifier has been successfully deleted! 2018-05-25 07:58:01,002 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally verify create-verifier --source /src/tempest --name opnfv-tempest --type tempest --system-wide 2018-05-25 07:57:59.825 11606 INFO rally.api [-] Creating verifier 'opnfv-tempest'. 2018-05-25 07:57:59.954 11606 INFO rally.verification.manager [-] Cloning verifier repo from /src/tempest. 2018-05-25 07:58:00.864 11606 INFO rally.api [-] Verifier 'opnfv-tempest' (UUID=6ec18acc-973d-4cbb-a4f3-63b47c4670fb) has been successfully created! Using verifier 'opnfv-tempest' (UUID=6ec18acc-973d-4cbb-a4f3-63b47c4670fb) as the default verifier for the future CLI operations. 
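
[Editor's note] The 'rally deployment check' table above lists the service catalog the verifier will rely on. A quick sanity check of the same services from Python; this is a sketch assuming shade and the sourced credentials, and the generic service keys below are an assumption rather than Rally's display names:

    import shade

    cloud = shade.openstack_cloud()
    for service in ('identity', 'compute', 'network', 'image', 'volume', 'orchestration'):
        # has_service() reports whether this cloud is configured with the service.
        print('%-14s %s' % (service, 'Available' if cloud.has_service(service) else 'MISSING'))
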
2018-05-25 07:58:03,993 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Creating network with name: 'tempest-net-58b2e3da-7fb1-4c57-bca9-2a434c699382' 2018-05-25 07:58:04,820 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - network: Munch({u'status': u'ACTIVE', u'subnets': [], u'description': u'', u'provider:physical_network': None, u'tags': [], u'ipv6_address_scope': None, u'updated_at': u'2018-05-25T07:58:04Z', u'is_default': False, u'revision_number': 3, u'port_security_enabled': True, u'provider:network_type': u'geneve', u'id': u'1ea4db87-55bb-41b1-bb3f-3a29ed1ded99', u'provider:segmentation_id': 30, u'router:external': False, u'availability_zone_hints': [], u'availability_zones': [], u'qos_policy_id': None, u'admin_state_up': True, u'name': u'tempest-net-58b2e3da-7fb1-4c57-bca9-2a434c699382', u'created_at': u'2018-05-25T07:58:04Z', u'mtu': 1442, u'tenant_id': u'51534bd63d854b6c878cd0603da66c99', u'ipv4_address_scope': None, u'shared': False, u'project_id': u'51534bd63d854b6c878cd0603da66c99'}) 2018-05-25 07:58:05,461 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - subnet: Munch({u'description': u'', u'tags': [], u'updated_at': u'2018-05-25T07:58:05Z', u'ipv6_ra_mode': None, u'allocation_pools': [{u'start': u'192.168.150.2', u'end': u'192.168.150.254'}], u'host_routes': [], u'revision_number': 0, u'ipv6_address_mode': None, u'cidr': u'192.168.150.0/24', u'id': u'620fd9c3-609f-4fb2-87f8-7f85dd2c59dc', u'subnetpool_id': None, u'service_types': [], u'name': u'tempest-subnet-58b2e3da-7fb1-4c57-bca9-2a434c699382', u'enable_dhcp': True, u'network_id': u'1ea4db87-55bb-41b1-bb3f-3a29ed1ded99', u'tenant_id': u'51534bd63d854b6c878cd0603da66c99', u'created_at': u'2018-05-25T07:58:05Z', u'dns_nameservers': [u'8.8.8.8'], u'gateway_ip': u'192.168.150.1', u'ip_version': 4, u'project_id': u'51534bd63d854b6c878cd0603da66c99'}) 2018-05-25 07:58:05,461 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - Creating two images for Tempest suite 2018-05-25 07:58:05,461 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Creating image with name: 'Cirros-0.4.0-58b2e3da-7fb1-4c57-bca9-2a434c699382' 2018-05-25 07:58:06,384 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - image: Munch({u'status': u'queued', u'owner_specified.shade.object': u'images/Cirros-0.4.0-58b2e3da-7fb1-4c57-bca9-2a434c699382', u'tags': [], u'container_format': u'bare', u'min_ram': 0, u'updated_at': u'2018-05-25T07:58:05Z', u'owner_specified.shade.sha256': u'a8dd75ecffd4cdd96072d60c2237b448e0c8b2bc94d57f10fdbc8c481d9005b8', u'locations': [], u'min_disk': 0, u'visibility': u'public', u'file': u'/v2/images/ae93284e-a970-4ae5-888b-9e7e1167e9aa/file', u'owner': u'51534bd63d854b6c878cd0603da66c99', u'virtual_size': None, u'owner_specified.shade.md5': u'443b7623e27ecf03dc9e01ee93f67afe', u'id': u'ae93284e-a970-4ae5-888b-9e7e1167e9aa', u'size': None, u'name': u'Cirros-0.4.0-58b2e3da-7fb1-4c57-bca9-2a434c699382', u'checksum': None, u'self': u'/v2/images/ae93284e-a970-4ae5-888b-9e7e1167e9aa', u'disk_format': u'qcow2', u'protected': False, u'created_at': u'2018-05-25T07:58:05Z', u'schema': u'/v2/schemas/image'}) 2018-05-25 07:58:06,384 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Creating image with name: 'Cirros-0.4.0-1-58b2e3da-7fb1-4c57-bca9-2a434c699382' 2018-05-25 07:58:07,455 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - image: Munch({u'status': u'queued', u'owner_specified.shade.object': u'images/Cirros-0.4.0-1-58b2e3da-7fb1-4c57-bca9-2a434c699382', u'tags': [], 
u'container_format': u'bare', u'min_ram': 0, u'updated_at': u'2018-05-25T07:58:06Z', u'owner_specified.shade.sha256': u'a8dd75ecffd4cdd96072d60c2237b448e0c8b2bc94d57f10fdbc8c481d9005b8', u'locations': [], u'min_disk': 0, u'visibility': u'public', u'file': u'/v2/images/ec031c2e-e402-4b9d-aac4-d61eedfa14b6/file', u'owner': u'51534bd63d854b6c878cd0603da66c99', u'virtual_size': None, u'owner_specified.shade.md5': u'443b7623e27ecf03dc9e01ee93f67afe', u'id': u'ec031c2e-e402-4b9d-aac4-d61eedfa14b6', u'size': None, u'name': u'Cirros-0.4.0-1-58b2e3da-7fb1-4c57-bca9-2a434c699382', u'checksum': None, u'self': u'/v2/images/ec031c2e-e402-4b9d-aac4-d61eedfa14b6', u'disk_format': u'qcow2', u'protected': False, u'created_at': u'2018-05-25T07:58:06Z', u'schema': u'/v2/schemas/image'}) 2018-05-25 07:58:07,455 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Creating two flavors for Tempest suite 2018-05-25 07:58:07,680 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - flavor: Munch({'name': u'opnfv_flavor-58b2e3da-7fb1-4c57-bca9-2a434c699382', 'ephemeral': 0, 'ram': 512, 'is_disabled': False, 'properties': Munch({u'OS-FLV-DISABLED:disabled': False, u'OS-FLV-EXT-DATA:ephemeral': 0, u'os-flavor-access:is_public': True}), u'OS-FLV-DISABLED:disabled': False, 'vcpus': 1, 'extra_specs': Munch({}), 'location': Munch({'project': Munch({'domain_id': None, 'id': u'51534bd63d854b6c878cd0603da66c99', 'name': 'admin', 'domain_name': 'Default'}), 'zone': None, 'region_name': 'regionOne', 'cloud': 'envvars'}), u'os-flavor-access:is_public': True, 'rxtx_factor': 1.0, 'is_public': True, u'OS-FLV-EXT-DATA:ephemeral': 0, 'disk': 1, 'id': u'012ae384-2b2d-4681-832e-3ce464fe4550', 'swap': 0}) 2018-05-25 07:58:07,770 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - flavor: Munch({'name': u'opnfv_flavor_1-58b2e3da-7fb1-4c57-bca9-2a434c699382', 'ephemeral': 0, 'ram': 512, 'is_disabled': False, 'properties': Munch({u'OS-FLV-DISABLED:disabled': False, u'OS-FLV-EXT-DATA:ephemeral': 0, u'os-flavor-access:is_public': True}), u'OS-FLV-DISABLED:disabled': False, 'vcpus': 1, 'extra_specs': Munch({}), 'location': Munch({'project': Munch({'domain_id': None, 'id': u'51534bd63d854b6c878cd0603da66c99', 'name': 'admin', 'domain_name': 'Default'}), 'zone': None, 'region_name': 'regionOne', 'cloud': 'envvars'}), u'os-flavor-access:is_public': True, 'rxtx_factor': 1.0, 'is_public': True, u'OS-FLV-EXT-DATA:ephemeral': 0, 'disk': 1, 'id': u'1a224103-6b09-46a0-b259-a9d19ef15c25', 'swap': 0}) 2018-05-25 07:58:10,673 - functest.opnfv_tests.openstack.tempest.conf_utils - INFO - rally verify configure-verifier --reconfigure --id opnfv-tempest 2018-05-25 07:58:09.834 11625 INFO rally.api [-] Configuring verifier 'opnfv-tempest' (UUID=6ec18acc-973d-4cbb-a4f3-63b47c4670fb) for deployment 'opnfv-rally' (UUID=c50651c3-da94-4d0d-8a81-977bf31bf872). 2018-05-25 07:58:10.511 11625 INFO rally.api [-] Verifier 'opnfv-tempest' (UUID=6ec18acc-973d-4cbb-a4f3-63b47c4670fb) has been successfully configured for deployment 'opnfv-rally' (UUID=c50651c3-da94-4d0d-8a81-977bf31bf872)! 2018-05-25 07:58:10,674 - functest.opnfv_tests.openstack.tempest.conf_utils - DEBUG - Looking for tempest.conf file... 2018-05-25 07:58:10,674 - functest.opnfv_tests.openstack.tempest.conf_utils - DEBUG - Updating selected tempest.conf parameters... 
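
[Editor's note] Right after reconfiguring the verifier, conf_utils patches selected tempest.conf parameters from tempest_conf.yaml (the DEBUG entries just above and below). Mechanically this amounts to rewriting options with ConfigParser; a sketch under the assumption that the file path is the one found by the "Looking for tempest.conf file..." step (not shown in the log) and that the section/option names here are examples only:

    try:
        import configparser                    # Python 3
    except ImportError:
        import ConfigParser as configparser    # Python 2, as used by Fraser-era Functest

    conf_file = 'tempest.conf'  # placeholder; the real path is located at runtime

    config = configparser.RawConfigParser()
    config.read(conf_file)

    # Add/update an option the way a tempest_conf.yaml override would
    # (illustrative values; the guest network name is the one created above).
    if not config.has_section('compute'):
        config.add_section('compute')
    config.set('compute', 'fixed_network_name',
               'tempest-net-58b2e3da-7fb1-4c57-bca9-2a434c699382')

    with open(conf_file, 'w') as out:
        config.write(out)
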
2018-05-25 07:58:10,677 - functest.opnfv_tests.openstack.tempest.conf_utils - DEBUG - Add/Update required params defined in tempest_conf.yaml into tempest.conf file 2018-05-25 07:58:10,689 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - Generating test case list... 2018-05-25 07:58:13,787 - functest.opnfv_tests.openstack.tempest.tempest - INFO - (cd /root/.rally/verification/verifier-6ec18acc-973d-4cbb-a4f3-63b47c4670fb/repo; testr list-tests 'neutron.tests.tempest.(api|scenario).test_trunk' >/home/opnfv/functest/results/neutron_trunk/test_list.txt 2>/dev/null) 2018-05-25 07:58:13,788 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - Applying tempest blacklist... 2018-05-25 07:58:13,789 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - Tempest blacklist file does not exist. 2018-05-25 07:58:13,789 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Starting Tempest test suite: '['rally', 'verify', 'start', '--load-list', u'/home/opnfv/functest/results/neutron_trunk/test_list.txt']'. 2018-05-25 07:58:16,155 - functest.opnfv_tests.openstack.tempest.tempest - INFO - 2018-05-25 07:58:16.153 11634 INFO rally.api [-] Starting verification (UUID=d496171e-d827-4ea2-ae7c-801ed25791bb) for deployment 'opnfv-rally' (UUID=c50651c3-da94-4d0d-8a81-977bf31bf872) by verifier 'opnfv-tempest' (UUID=6ec18acc-973d-4cbb-a4f3-63b47c4670fb). 2018-05-25 07:58:16,155 - functest.opnfv_tests.openstack.tempest.tempest - DEBUG - Verification UUID: d496171e-d827-4ea2-ae7c-801ed25791bb 2018-05-25 07:59:26,191 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Showing result for a verification: '['rally', 'verify', 'show', '--uuid', 'd496171e-d827-4ea2-ae7c-801ed25791bb']'. 2018-05-25 07:59:27,174 - functest.opnfv_tests.openstack.tempest.tempest - INFO - +-----------------------------------------------------------------------------------------+ 2018-05-25 07:59:27,175 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Verification | 2018-05-25 07:59:27,175 - functest.opnfv_tests.openstack.tempest.tempest - INFO - +---------------------+-------------------------------------------------------------------+ 2018-05-25 07:59:27,175 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | UUID | d496171e-d827-4ea2-ae7c-801ed25791bb | 2018-05-25 07:59:27,175 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Status | failed | 2018-05-25 07:59:27,175 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Started at | 2018-05-25 07:58:16 | 2018-05-25 07:59:27,175 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Finished at | 2018-05-25 07:59:25 | 2018-05-25 07:59:27,175 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Duration | 0:01:09 | 2018-05-25 07:59:27,175 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Run arguments | load_list: (value is too long, use 'detailed' flag to display it) | 2018-05-25 07:59:27,175 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Tags | - | 2018-05-25 07:59:27,176 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Verifier name | opnfv-tempest (UUID: 6ec18acc-973d-4cbb-a4f3-63b47c4670fb) | 2018-05-25 07:59:27,176 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Verifier type | tempest (platform: openstack) | 2018-05-25 07:59:27,176 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Deployment name | opnfv-rally (UUID: c50651c3-da94-4d0d-8a81-977bf31bf872) | 2018-05-25 07:59:27,176 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Tests count | 52 
| 2018-05-25 07:59:27,176 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Tests duration, sec | 61.149 | 2018-05-25 07:59:27,176 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Success | 4 | 2018-05-25 07:59:27,176 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Skipped | 9 | 2018-05-25 07:59:27,176 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Expected failures | 0 | 2018-05-25 07:59:27,176 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Unexpected success | 0 | 2018-05-25 07:59:27,176 - functest.opnfv_tests.openstack.tempest.tempest - INFO - | Failures | 39 | 2018-05-25 07:59:27,177 - functest.opnfv_tests.openstack.tempest.tempest - INFO - +---------------------+-------------------------------------------------------------------+ 2018-05-25 07:59:27,177 - functest.opnfv_tests.openstack.tempest.tempest - INFO - +-----------------------------------------------------------------------------------------------------------------------------------------------------+ 2018-05-25 07:59:27,215 - functest.opnfv_tests.openstack.tempest.tempest - INFO - Tempest neutron_trunk success_rate is 9.3023255814% 2018-05-25 07:59:31,404 - xtesting.core.testcase - INFO - The results were successfully pushed to DB http://testresults.opnfv.org/test/api/v1/results 2018-05-25 07:59:31,404 - xtesting.ci.run_tests - INFO - Test result: +-----------------------+------------------+------------------+----------------+ | TEST CASE | PROJECT | DURATION | RESULT | +-----------------------+------------------+------------------+----------------+ | neutron_trunk | functest | 01:27 | FAIL | +-----------------------+------------------+------------------+----------------+ 2018-05-25 07:59:31,409 - xtesting.ci.run_tests - ERROR - The test case 'neutron_trunk' failed. 
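
[Editor's note] The success_rate figures Functest prints appear to be derived from the verification counters shown above, with skipped tests excluded from the denominator; the arithmetic below reproduces both logged values exactly, which supports that reading even though the log does not state the formula:

    def success_rate(tests, skipped, success):
        """Percentage of non-skipped tests that passed."""
        return 100.0 * success / (tests - skipped)

    # patrole:       287 tests, 166 skipped, 114 ok -> logged as 94.2148760331%
    print(success_rate(287, 166, 114))
    # neutron_trunk:  52 tests,   9 skipped,   4 ok -> logged as 9.3023255814%
    print(success_rate(52, 9, 4))
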
2018-05-25 07:59:31,409 - xtesting.ci.run_tests - INFO - Xtesting report:
+------------------------------+------------------+---------------+------------------+----------------+
|          TEST CASE           |     PROJECT      |     TIER      |     DURATION     |     RESULT     |
+------------------------------+------------------+---------------+------------------+----------------+
|          vping_ssh           |     functest     |     smoke     |      00:32       |      PASS      |
|        vping_userdata        |     functest     |     smoke     |      00:26       |      PASS      |
|     tempest_smoke_serial     |     functest     |     smoke     |      12:35       |      FAIL      |
|         rally_sanity         |     functest     |     smoke     |      21:16       |      PASS      |
|       refstack_defcore       |     functest     |     smoke     |      03:21       |      PASS      |
|           patrole            |     functest     |     smoke     |      02:41       |      FAIL      |
|         snaps_smoke          |     functest     |     smoke     |      41:45       |      PASS      |
|        neutron_trunk         |     functest     |     smoke     |      01:27       |      FAIL      |
|             odl              |     functest     |     smoke     |      00:00       |      SKIP      |
+------------------------------+------------------+---------------+------------------+----------------+
2018-05-25 07:59:31,416 - xtesting.ci.run_tests - INFO - Execution exit value: Result.EX_ERROR
2018-05-25 07:59:34,702 - xtesting.ci.run_tests - INFO - Deployment description:
+--------------------------------------+----------------------------------------------------------+
|               ENV VAR                |                          VALUE                           |
+--------------------------------------+----------------------------------------------------------+
| BUILD_TAG                            | jenkins-functest-apex-baremetal-daily-fraser-149         |
| ENERGY_RECORDER_API_URL              | http://energy.opnfv.fr/resources                         |
| ENERGY_RECORDER_API_PASSWORD         |                                                          |
| CI_LOOP                              | daily                                                    |
| TEST_DB_URL                          | http://testresults.opnfv.org/test/api/v1/results         |
| INSTALLER_TYPE                       | apex                                                     |
| DEPLOY_SCENARIO                      | os-ovn-nofeature-noha                                    |
| ENERGY_RECORDER_API_USER             |                                                          |
| NODE_NAME                            | lf-pod1                                                  |
+--------------------------------------+----------------------------------------------------------+
2018-05-25 07:59:34,706 - xtesting.ci.run_tests - INFO - Sourcing env file /var/lib/xtesting/conf/env_file # Clear any old environment that may conflict.
for key in $( set | awk '{FS="="} /^OS_/ {print $1}' ); do unset $key ; done export OS_USERNAME=admin export OS_USER_DOMAIN_NAME=Default export OS_PROJECT_DOMAIN_NAME=Default export OS_BAREMETAL_API_VERSION=1.34 export NOVA_VERSION=1.1 export OS_PROJECT_NAME=admin export OS_PASSWORD=EZhwZcgCD6CaJWGE7BDRjvxtq export OS_NO_CACHE=True export COMPUTE_API_VERSION=1.1 export no_proxy=,172.30.9.26,192.30.9.4 export OS_VOLUME_API_VERSION=3 export OS_CLOUDNAME=overcloud export OS_AUTH_URL=http://172.30.9.26:5000/v3 export IRONIC_API_VERSION=1.34 export OS_IDENTITY_API_VERSION=3 export OS_IMAGE_API_VERSION=2 export OS_AUTH_TYPE=password export PYTHONWARNINGS="ignore:Certificate has no, ignore:A true SSLContext object is not available" # Add OS_CLOUDNAME to PS1 if [ -z "${CLOUDPROMPT_ENABLED:-}" ]; then export PS1=${PS1:-""} export PS1=\${OS_CLOUDNAME:+"(\$OS_CLOUDNAME)"}\ $PS1 export CLOUDPROMPT_ENABLED=1 fi export SDN_CONTROLLER_IP=192.30.9.4 export OS_REGION_NAME=regionOne 2018-05-25 07:59:34,707 - xtesting.ci.run_tests - DEBUG - Test args: all 2018-05-25 07:59:34,707 - xtesting.ci.run_tests - INFO - TESTS TO BE EXECUTED: +---------------+---------------+-----------------+---------------------+-------------------+ | TIERS | ORDER | CI LOOP | DESCRIPTION | TESTCASES | +---------------+---------------+-----------------+---------------------+-------------------+ +---------------+---------------+-----------------+---------------------+-------------------+ 2018-05-25 07:59:34,708 - xtesting.ci.run_tests - INFO - Xtesting report: +-----------------------------+------------------------+------------------+------------------+----------------+ | TEST CASE | PROJECT | TIER | DURATION | RESULT | +-----------------------------+------------------------+------------------+------------------+----------------+ | doctor-notification | doctor | features | 00:00 | SKIP | | bgpvpn | sdnvpn | features | 00:00 | SKIP | | functest-odl-sfc | sfc | features | 00:00 | SKIP | | barometercollectd | barometer | features | 00:00 | SKIP | | fds | fastdatastacks | features | 00:00 | SKIP | +-----------------------------+------------------------+------------------+------------------+----------------+ 2018-05-25 07:59:34,710 - xtesting.ci.run_tests - INFO - Execution exit value: Result.EX_OK 2018-05-25 07:59:37,172 - xtesting.ci.run_tests - INFO - Deployment description: +--------------------------------------+----------------------------------------------------------+ | ENV VAR | VALUE | +--------------------------------------+----------------------------------------------------------+ | BUILD_TAG | jenkins-functest-apex-baremetal-daily-fraser-149 | | ENERGY_RECORDER_API_URL | http://energy.opnfv.fr/resources | | ENERGY_RECORDER_API_PASSWORD | | | CI_LOOP | daily | | TEST_DB_URL | http://testresults.opnfv.org/test/api/v1/results | | INSTALLER_TYPE | apex | | DEPLOY_SCENARIO | os-ovn-nofeature-noha | | ENERGY_RECORDER_API_USER | | | NODE_NAME | lf-pod1 | +--------------------------------------+----------------------------------------------------------+ 2018-05-25 07:59:37,174 - xtesting.ci.run_tests - INFO - Sourcing env file /var/lib/xtesting/conf/env_file # Clear any old environment that may conflict. 
for key in $( set | awk '{FS="="} /^OS_/ {print $1}' ); do unset $key ; done export OS_USERNAME=admin export OS_USER_DOMAIN_NAME=Default export OS_PROJECT_DOMAIN_NAME=Default export OS_BAREMETAL_API_VERSION=1.34 export NOVA_VERSION=1.1 export OS_PROJECT_NAME=admin export OS_PASSWORD=EZhwZcgCD6CaJWGE7BDRjvxtq export OS_NO_CACHE=True export COMPUTE_API_VERSION=1.1 export no_proxy=,172.30.9.26,192.30.9.4 export OS_VOLUME_API_VERSION=3 export OS_CLOUDNAME=overcloud export OS_AUTH_URL=http://172.30.9.26:5000/v3 export IRONIC_API_VERSION=1.34 export OS_IDENTITY_API_VERSION=3 export OS_IMAGE_API_VERSION=2 export OS_AUTH_TYPE=password export PYTHONWARNINGS="ignore:Certificate has no, ignore:A true SSLContext object is not available" # Add OS_CLOUDNAME to PS1 if [ -z "${CLOUDPROMPT_ENABLED:-}" ]; then export PS1=${PS1:-""} export PS1=\${OS_CLOUDNAME:+"(\$OS_CLOUDNAME)"}\ $PS1 export CLOUDPROMPT_ENABLED=1 fi export SDN_CONTROLLER_IP=192.30.9.4 export OS_REGION_NAME=regionOne 2018-05-25 07:59:37,174 - xtesting.ci.run_tests - DEBUG - Test args: all 2018-05-25 07:59:37,175 - xtesting.ci.run_tests - INFO - TESTS TO BE EXECUTED: +---------------+---------------+--------------------------+---------------------------------------+--------------------------------------------+ | TIERS | ORDER | CI LOOP | DESCRIPTION | TESTCASES | +---------------+---------------+--------------------------+---------------------------------------+--------------------------------------------+ | vnf | 4 | (daily)|(weekly) | Collection of VNF test cases. | cloudify_ims vyos_vrouter juju_epc | +---------------+---------------+--------------------------+---------------------------------------+--------------------------------------------+ 2018-05-25 07:59:37,176 - xtesting.ci.run_tests - INFO - Running tier 'vnf' 2018-05-25 07:59:37,176 - xtesting.ci.run_tests - INFO - Running test case 'cloudify_ims'... 
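
[Editor's note] The env_file sourced before each tier first unsets any stale OS_* variables and then exports fresh credentials (the shell loop above). The same reset can be done from Python before spawning OpenStack clients; a sketch showing only a few of the variables, with values taken from the env_file above:

    import os

    # Drop any OS_* variables left over from a previous run, mirroring the
    # "for key in $(set | awk ...); do unset $key; done" loop in the env_file.
    for key in [k for k in os.environ if k.startswith('OS_')]:
        del os.environ[key]

    # Re-export the credentials (abbreviated; see the env_file for the full set).
    os.environ.update({
        'OS_AUTH_URL': 'http://172.30.9.26:5000/v3',
        'OS_USERNAME': 'admin',
        'OS_PROJECT_NAME': 'admin',
        'OS_IDENTITY_API_VERSION': '3',
        'OS_REGION_NAME': 'regionOne',
    })
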
2018-05-25 07:59:37,594 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Orchestrator configuration {'requirements': {u'flavor': {u'ram_min': 4096, u'name': u'm1.medium'}, u'os_image': u'cloudify_manager_4.0'}} 2018-05-25 07:59:37,632 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - VNF configuration: {'inputs': {u'vellum_cluster_size': 1, u'agent_user': u'ubuntu', u'image_id': u'ubuntu_14.04', u'external_network_name': u'', u'dime_cluster_size': 1, u'key_pair_name': u'cloudify_ims_kp', u'bono_cluster_size': 1, u'flavor_id': u'm1.small', u'public_domain': u'clearwater.opnfv', u'homer_cluster_size': 1, u'release': u'repo122', u'private_key_path': u'/etc/cloudify/cloudify_ims.pem', u'sprout_cluster_size': 1}, 'requirements': {u'flavor': {u'ram_min': 2048, u'name': u'm1.small'}, u'network_quotas': {u'security_group': 20, u'security_group_rule': 100, u'port': 50}, u'compute_quotas': {u'cores': 50, u'instances': 15}}, 'descriptor': {u'file_name': u'/src/vims/openstack-blueprint.yaml', u'version': u'122', u'name': u'clearwater-opnfv'}} 2018-05-25 07:59:37,653 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Images needed for vIMS: {u'cloudify_manager_4.0': u'/home/opnfv/functest/images/cloudify-manager-premium-4.0.1.qcow2', u'ubuntu_14.04': u'/home/opnfv/functest/images/ubuntu-14.04-server-cloudimg-amd64-disk1.img'} 2018-05-25 07:59:38,048 - xtesting.energy.energy - INFO - API recorder available at : http://energy.opnfv.fr/resources/recorders/environment/lf-pod1 2018-05-25 07:59:38,049 - xtesting.energy.energy - DEBUG - Getting current scenario 2018-05-25 07:59:38,526 - xtesting.energy.energy - DEBUG - Starting recording 2018-05-25 07:59:38,526 - xtesting.energy.energy - DEBUG - Submitting scenario (cloudify_ims/running) 2018-05-25 07:59:38,945 - functest.core.vnf - INFO - Prepare VNF: cloudify_ims, description: Created by OPNFV Functest: cloudify_ims 2018-05-25 07:59:41,549 - functest.core.vnf - DEBUG - snaps creds: OSCreds - username=cloudify_ims-3064ac24-f4ce-4a67-b927-98de88abb58b, password=6652f3cf-f231-41f8-a263-86fc6d51ea3f, auth_url=http://172.30.9.26:5000/v3, project_name=cloudify_ims-3064ac24-f4ce-4a67-b927-98de88abb58b, identity_api_version=3.0, image_api_version=2.0, network_api_version=2.0, compute_api_version=2.0, heat_api_version=1, user_domain_id=default, user_domain_name=Default, project_domain_id=default, project_domain_name=Default, interface=public, region_name=regionOne, proxy_settings=None, cacert=False 2018-05-25 07:59:41,552 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Additional pre-configuration steps 2018-05-25 07:59:43,144 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Upload some OS images if it doesn't exist 2018-05-25 07:59:43,144 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - image: cloudify_manager_4.0, file: /home/opnfv/functest/images/cloudify-manager-premium-4.0.1.qcow2 2018-05-25 08:01:04,759 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - image: ubuntu_14.04, file: /home/opnfv/functest/images/ubuntu-14.04-server-cloudimg-amd64-disk1.img 2018-05-25 08:01:10,890 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Creating keypair ... 2018-05-25 08:01:11,570 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Creating full network ... 2018-05-25 08:01:18,338 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Creating security group for cloudify manager vm 2018-05-25 08:01:20,196 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Get or create flavor for cloudify manager vm ... 
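
[Editor's note] The cloudify_ims preparation above uploads the two images only when they are missing, then creates a keypair, network, security group and flavor for the manager VM. A sketch of just the upload-if-missing step with shade; the image names and file paths are the ones logged above, while the disk/container formats are assumptions:

    import shade

    cloud = shade.openstack_cloud()

    images = {
        'cloudify_manager_4.0': '/home/opnfv/functest/images/cloudify-manager-premium-4.0.1.qcow2',
        'ubuntu_14.04': '/home/opnfv/functest/images/ubuntu-14.04-server-cloudimg-amd64-disk1.img',
    }

    for name, path in images.items():
        # Upload only if an image with that name is not already in Glance.
        if cloud.get_image(name) is None:
            cloud.create_image(name, filename=path,
                               disk_format='qcow2', container_format='bare')
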
2018-05-25 08:01:20,935 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Creating cloudify manager VM 2018-05-25 08:04:14,305 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Set creds for cloudify manager {'keystone_password': '6652f3cf-f231-41f8-a263-86fc6d51ea3f', 'keystone_tenant_name': 'cloudify_ims-3064ac24-f4ce-4a67-b927-98de88abb58b', 'region': 'regionOne', 'keystone_url': u'http://172.30.9.26:5000', 'user_domain_name': 'Default', 'keystone_username': 'cloudify_ims-3064ac24-f4ce-4a67-b927-98de88abb58b', 'project_domain_name': 'Default'} 2018-05-25 08:04:14,306 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Attemps running status of the Manager 2018-05-25 08:04:18,502 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - status {u'status': u'running', u'services': [{u'instances': [{u'LoadState': u'loaded', u'Description': u'InfluxDB Service', u'MainPID': 813, u'state': u'running', u'Id': u'cloudify-influxdb.service', u'ActiveState': u'active', u'SubState': u'running'}], u'display_name': u'InfluxDB'}, {u'instances': [{u'LoadState': u'loaded', u'Description': u'Cloudify Management Worker Service', u'MainPID': 0, u'state': u'dead', u'Id': u'cloudify-mgmtworker.service', u'ActiveState': u'inactive', u'SubState': u'dead'}], u'display_name': u'Celery Management'}, {u'instances': [{u'LoadState': u'loaded', u'Description': u'LSB: Starts Logstash as a daemon.', u'MainPID': 0, u'state': u'running', u'Id': u'logstash.service', u'ActiveState': u'active', u'SubState': u'running'}], u'display_name': u'Logstash'}, {u'instances': [{u'LoadState': u'loaded', u'Description': u'RabbitMQ Service', u'MainPID': 0, u'state': u'start-pre', u'Id': u'cloudify-rabbitmq.service', u'ActiveState': u'activating', u'SubState': u'start-pre'}], u'display_name': u'RabbitMQ'}, {u'instances': [{u'LoadState': u'loaded', u'Description': u'Cloudify AMQP InfluxDB Broker Service', u'MainPID': 0, u'state': u'dead', u'Id': u'cloudify-amqpinflux.service', u'ActiveState': u'inactive', u'SubState': u'dead'}], u'display_name': u'AMQP InfluxDB'}, {u'instances': [{u'LoadState': u'loaded', u'Description': u'PostgreSQL 9.5 database server', u'MainPID': 873, u'state': u'running', u'Id': u'postgresql-9.5.service', u'ActiveState': u'active', u'SubState': u'running'}], u'display_name': u'PostgreSQL'}, {u'instances': [{u'LoadState': u'loaded', u'Description': u'Cloudify REST Service', u'MainPID': 799, u'state': u'running', u'Id': u'cloudify-restservice.service', u'ActiveState': u'active', u'SubState': u'running'}], u'display_name': u'Manager Rest-Service'}, {u'instances': [{u'LoadState': u'loaded', u'Description': u'Cloudify Stage Service', u'MainPID': 807, u'state': u'running', u'Id': u'cloudify-stage.service', u'ActiveState': u'active', u'SubState': u'running'}], u'display_name': u'Cloudify Stage'}, {u'instances': [{u'LoadState': u'loaded', u'Description': u'Riemann Service', u'MainPID': 0, u'state': u'dead', u'Id': u'cloudify-riemann.service', u'ActiveState': u'inactive', u'SubState': u'dead'}], u'display_name': u'Riemann'}, {u'instances': [{u'LoadState': u'loaded', u'Description': u'nginx - high performance web server', u'MainPID': 841, u'state': u'running', u'Id': u'nginx.service', u'ActiveState': u'active', u'SubState': u'running'}], u'display_name': u'Webserver'}]} 2018-05-25 08:04:18,616 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - The current manager status is running 2018-05-25 08:04:48,636 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Cloudify Manager is up and running 2018-05-25 08:04:48,637 - 
functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Put OpenStack creds in manager
2018-05-25 08:04:50,138 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Put private keypair in manager
2018-05-25 08:04:51,507 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SSH sudo cp ~/cloudify_ims.pem /etc/cloudify/ stdout:
2018-05-25 08:04:51,561 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SSH sudo chmod 444 /etc/cloudify/cloudify_ims.pem stdout:
2018-05-25 08:05:27,639 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SSH sudo yum install -y gcc python-devel stdout:
Loaded plugins: fastestmirror
Determining fastest mirrors
 * base: mirror.web-ster.com
 * extras: mirrors.cat.pdx.edu
 * updates: mirror.chpc.utah.edu
Resolving Dependencies
--> Running transaction check
---> Package gcc.x86_64 0:4.8.5-28.el7_5.1 will be installed
--> Processing Dependency: libgomp = 4.8.5-28.el7_5.1 for package: gcc-4.8.5-28.el7_5.1.x86_64
--> Processing Dependency: cpp = 4.8.5-28.el7_5.1 for package: gcc-4.8.5-28.el7_5.1.x86_64
--> Processing Dependency: libgcc >= 4.8.5-28.el7_5.1 for package: gcc-4.8.5-28.el7_5.1.x86_64
--> Processing Dependency: glibc-devel >= 2.2.90-12 for package: gcc-4.8.5-28.el7_5.1.x86_64
--> Processing Dependency: libmpfr.so.4()(64bit) for package: gcc-4.8.5-28.el7_5.1.x86_64
--> Processing Dependency: libmpc.so.3()(64bit) for package: gcc-4.8.5-28.el7_5.1.x86_64
---> Package python-devel.x86_64 0:2.7.5-68.el7 will be installed
--> Processing Dependency: python(x86-64) = 2.7.5-68.el7 for package: python-devel-2.7.5-68.el7.x86_64
--> Running transaction check
---> Package cpp.x86_64 0:4.8.5-28.el7_5.1 will be installed
---> Package glibc-devel.x86_64 0:2.17-222.el7 will be installed
--> Processing Dependency: glibc-headers = 2.17-222.el7 for package: glibc-devel-2.17-222.el7.x86_64
--> Processing Dependency: glibc = 2.17-222.el7 for package: glibc-devel-2.17-222.el7.x86_64
--> Processing Dependency: glibc-headers for package: glibc-devel-2.17-222.el7.x86_64
---> Package libgcc.x86_64 0:4.8.5-11.el7 will be updated
---> Package libgcc.x86_64 0:4.8.5-28.el7_5.1 will be an update
---> Package libgomp.x86_64 0:4.8.5-11.el7 will be updated
---> Package libgomp.x86_64 0:4.8.5-28.el7_5.1 will be an update
---> Package libmpc.x86_64 0:1.0.1-3.el7 will be installed
---> Package mpfr.x86_64 0:3.1.1-4.el7 will be installed
---> Package python.x86_64 0:2.7.5-48.el7 will be updated
---> Package python.x86_64 0:2.7.5-68.el7 will be an update
--> Processing Dependency: python-libs(x86-64) = 2.7.5-68.el7 for package: python-2.7.5-68.el7.x86_64
--> Running transaction check
---> Package glibc.x86_64 0:2.17-157.el7_3.1 will be updated
--> Processing Dependency: glibc = 2.17-157.el7_3.1 for package: glibc-common-2.17-157.el7_3.1.x86_64
---> Package glibc.x86_64 0:2.17-222.el7 will be an update
---> Package glibc-headers.x86_64 0:2.17-222.el7 will be installed
--> Processing Dependency: kernel-headers >= 2.2.1 for package: glibc-headers-2.17-222.el7.x86_64
--> Processing Dependency: kernel-headers for package: glibc-headers-2.17-222.el7.x86_64
---> Package python-libs.x86_64 0:2.7.5-48.el7 will be updated
---> Package python-libs.x86_64 0:2.7.5-68.el7 will be an update
--> Processing Dependency: libcrypto.so.10(OPENSSL_1.0.2)(64bit) for package: python-libs-2.7.5-68.el7.x86_64
--> Running transaction check
---> Package glibc-common.x86_64 0:2.17-157.el7_3.1 will be updated
---> Package glibc-common.x86_64 0:2.17-222.el7 will be an update
---> Package kernel-headers.x86_64 0:3.10.0-862.3.2.el7 will be installed
---> Package openssl-libs.x86_64 1:1.0.1e-60.el7_3.1 will be updated
--> Processing Dependency: openssl-libs(x86-64) = 1:1.0.1e-60.el7_3.1 for package: 1:openssl-1.0.1e-60.el7_3.1.x86_64
---> Package openssl-libs.x86_64 1:1.0.2k-12.el7 will be an update
--> Running transaction check
---> Package openssl.x86_64 1:1.0.1e-60.el7_3.1 will be updated
---> Package openssl.x86_64 1:1.0.2k-12.el7 will be an update
--> Finished Dependency Resolution

Dependencies Resolved

================================================================================
 Package              Arch       Version                    Repository     Size
================================================================================
Installing:
 gcc                  x86_64     4.8.5-28.el7_5.1           updates        16 M
 python-devel         x86_64     2.7.5-68.el7               base          397 k
Installing for dependencies:
 cpp                  x86_64     4.8.5-28.el7_5.1           updates       5.9 M
 glibc-devel          x86_64     2.17-222.el7               base          1.1 M
 glibc-headers        x86_64     2.17-222.el7               base          678 k
 kernel-headers       x86_64     3.10.0-862.3.2.el7         updates       7.1 M
 libmpc               x86_64     1.0.1-3.el7                base           51 k
 mpfr                 x86_64     3.1.1-4.el7                base          203 k
Updating for dependencies:
 glibc                x86_64     2.17-222.el7               base          3.6 M
 glibc-common         x86_64     2.17-222.el7               base           11 M
 libgcc               x86_64     4.8.5-28.el7_5.1           updates       101 k
 libgomp              x86_64     4.8.5-28.el7_5.1           updates       156 k
 openssl              x86_64     1:1.0.2k-12.el7            base          492 k
 openssl-libs         x86_64     1:1.0.2k-12.el7            base          1.2 M
 python               x86_64     2.7.5-68.el7               base           93 k
 python-libs          x86_64     2.7.5-68.el7               base          5.6 M

Transaction Summary
================================================================================
Install  2 Packages (+6 Dependent packages)
Upgrade             ( 8 Dependent packages)

Total download size: 54 M
Downloading packages:
Delta RPMs disabled because /usr/bin/applydeltarpm not installed.
--------------------------------------------------------------------------------
Total                                               20 MB/s |  54 MB  00:02
Running transaction check
Running transaction test
Transaction test succeeded
Running transaction
  Updating   : libgcc-4.8.5-28.el7_5.1.x86_64                              1/24
  Updating   : glibc-common-2.17-222.el7.x86_64                            2/24
  Updating   : glibc-2.17-222.el7.x86_64                                   3/24
warning: /etc/nsswitch.conf created as /etc/nsswitch.conf.rpmnew
  Installing : mpfr-3.1.1-4.el7.x86_64                                     4/24
  Installing : libmpc-1.0.1-3.el7.x86_64                                   5/24
  Updating   : 1:openssl-libs-1.0.2k-12.el7.x86_64                         6/24
  Updating   : python-libs-2.7.5-68.el7.x86_64                             7/24
  Updating   : python-2.7.5-68.el7.x86_64                                  8/24
  Installing : cpp-4.8.5-28.el7_5.1.x86_64                                 9/24
  Updating   : libgomp-4.8.5-28.el7_5.1.x86_64                            10/24
  Installing : kernel-headers-3.10.0-862.3.2.el7.x86_64                   11/24
  Installing : glibc-headers-2.17-222.el7.x86_64                          12/24
  Installing : glibc-devel-2.17-222.el7.x86_64                            13/24
  Installing : gcc-4.8.5-28.el7_5.1.x86_64                                14/24
  Installing : python-devel-2.7.5-68.el7.x86_64                           15/24
  Updating   : 1:openssl-1.0.2k-12.el7.x86_64                             16/24
  Cleanup    : 1:openssl-1.0.1e-60.el7_3.1.x86_64                         17/24
  Cleanup    : python-2.7.5-48.el7.x86_64                                 18/24
  Cleanup    : python-libs-2.7.5-48.el7.x86_64                            19/24
  Cleanup    : 1:openssl-libs-1.0.1e-60.el7_3.1.x86_64                    20/24
  Cleanup    : libgomp-4.8.5-11.el7.x86_64                                21/24
  Cleanup    : glibc-common-2.17-157.el7_3.1.x86_64                       22/24
  Cleanup    : glibc-2.17-157.el7_3.1.x86_64                              23/24
  Cleanup    : libgcc-4.8.5-11.el7.x86_64                                 24/24
  Verifying  : python-libs-2.7.5-68.el7.x86_64                             1/24
  Verifying  : glibc-devel-2.17-222.el7.x86_64                             2/24
  Verifying  : glibc-headers-2.17-222.el7.x86_64                           3/24
  Verifying  : 1:openssl-libs-1.0.2k-12.el7.x86_64                         4/24
  Verifying  : libgomp-4.8.5-28.el7_5.1.x86_64                             5/24
  Verifying  : gcc-4.8.5-28.el7_5.1.x86_64                                 6/24
  Verifying  : glibc-2.17-222.el7.x86_64                                   7/24
  Verifying  : libgcc-4.8.5-28.el7_5.1.x86_64                              8/24
  Verifying  : cpp-4.8.5-28.el7_5.1.x86_64                                 9/24
  Verifying  : python-devel-2.7.5-68.el7.x86_64                           10/24
  Verifying  : libmpc-1.0.1-3.el7.x86_64                                  11/24
  Verifying  : glibc-common-2.17-222.el7.x86_64                           12/24
  Verifying  : python-2.7.5-68.el7.x86_64                                 13/24
  Verifying  : mpfr-3.1.1-4.el7.x86_64                                    14/24
  Verifying  : 1:openssl-1.0.2k-12.el7.x86_64                             15/24
  Verifying  : kernel-headers-3.10.0-862.3.2.el7.x86_64                   16/24
  Verifying  : 1:openssl-1.0.1e-60.el7_3.1.x86_64                         17/24
  Verifying  : 1:openssl-libs-1.0.1e-60.el7_3.1.x86_64                    18/24
  Verifying  : glibc-common-2.17-157.el7_3.1.x86_64                       19/24
  Verifying  : glibc-2.17-157.el7_3.1.x86_64                              20/24
  Verifying  : python-libs-2.7.5-48.el7.x86_64                            21/24
  Verifying  : libgcc-4.8.5-11.el7.x86_64                                 22/24
  Verifying  : python-2.7.5-48.el7.x86_64                                 23/24
  Verifying  : libgomp-4.8.5-11.el7.x86_64                                24/24

Installed:
  gcc.x86_64 0:4.8.5-28.el7_5.1            python-devel.x86_64 0:2.7.5-68.el7

Dependency Installed:
  cpp.x86_64 0:4.8.5-28.el7_5.1            glibc-devel.x86_64 0:2.17-222.el7
  glibc-headers.x86_64 0:2.17-222.el7      kernel-headers.x86_64 0:3.10.0-862.3.2.el7
  libmpc.x86_64 0:1.0.1-3.el7              mpfr.x86_64 0:3.1.1-4.el7

Dependency Updated:
  glibc.x86_64 0:2.17-222.el7              glibc-common.x86_64 0:2.17-222.el7
  libgcc.x86_64 0:4.8.5-28.el7_5.1         libgomp.x86_64 0:4.8.5-28.el7_5.1
  openssl.x86_64 1:1.0.2k-12.el7           openssl-libs.x86_64 1:1.0.2k-12.el7
  python.x86_64 0:2.7.5-68.el7             python-libs.x86_64 0:2.7.5-68.el7

Complete!
2018-05-25 08:05:29,439 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - SSH cfy status stdout:
Retrieving manager services status... [ip=127.0.0.1]
Services:
+--------------------------------+---------+
| service                        | status  |
+--------------------------------+---------+
| InfluxDB                       | running |
| Celery Management              | running |
| Logstash                       | running |
| RabbitMQ                       | running |
| AMQP InfluxDB                  | running |
| PostgreSQL                     | running |
| Manager Rest-Service           | running |
| Cloudify Stage                 | running |
| Riemann                        | running |
| Webserver                      | running |
+--------------------------------+---------+
2018-05-25 08:05:29,439 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Upload VNFD
2018-05-25 08:05:32,252 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Get or create flavor for all clearwater vm
2018-05-25 08:05:32,935 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Create VNF Instance
2018-05-25 08:05:39,736 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting 'create_deployment_environment' workflow execution
2018-05-25 08:05:39,736 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.operations.install_plugins'
2018-05-25 08:05:39,736 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.operations.install_plugins'
2018-05-25 08:05:39,736 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing deployment plugins
2018-05-25 08:06:05,533 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.operations.install_plugins'
2018-05-25 08:06:05,604 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Start the VNF Instance deployment
2018-05-25 08:06:10,953 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting 'install' workflow execution
2018-05-25 08:06:10,953 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node
2018-05-25 08:06:10,953 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node
2018-05-25 08:06:10,953 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.security_group.create'
2018-05-25 08:06:10,953 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task
'neutron_plugin.security_group.create' 2018-05-25 08:06:10,953 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-25 08:06:10,953 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.floatingip.create' 2018-05-25 08:06:10,953 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-25 08:06:10,953 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.security_group.create' 2018-05-25 08:06:10,953 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.security_group.create' 2018-05-25 08:06:10,953 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-25 08:06:10,954 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.floatingip.create' 2018-05-25 08:06:16,257 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.floatingip.create' 2018-05-25 08:06:16,257 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-25 08:06:16,257 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.security_group.create' 2018-05-25 08:06:16,257 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.security_group.create' 2018-05-25 08:06:16,258 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-25 08:06:16,258 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.security_group.create' 2018-05-25 08:06:16,258 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.security_group.create' 2018-05-25 08:06:16,258 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-25 08:06:16,258 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.floatingip.create' 2018-05-25 08:06:16,258 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.security_group.create' 2018-05-25 08:06:16,258 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.floatingip.create' 2018-05-25 08:06:16,258 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-25 08:06:16,258 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-25 08:06:16,258 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.security_group.create' 2018-05-25 08:06:16,258 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-25 08:06:16,259 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.security_group.create' 2018-05-25 08:06:16,259 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-25 08:06:16,259 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.security_group.create' 2018-05-25 08:06:16,259 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-25 08:06:16,259 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.keypair.create' 2018-05-25 08:06:16,259 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.floatingip.create' 2018-05-25 08:06:16,259 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.security_group.create' 2018-05-25 08:06:16,259 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.floatingip.create' 2018-05-25 08:06:16,259 - functest.opnfv_tests.vnf.ims.cloudify_ims - 
DEBUG - Configuring node 2018-05-25 08:06:16,259 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-25 08:06:16,259 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.floatingip.create' 2018-05-25 08:06:21,463 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-25 08:06:21,464 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.security_group.create' 2018-05-25 08:06:21,464 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-25 08:06:21,464 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-25 08:06:21,464 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.security_group.create' 2018-05-25 08:06:21,464 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.security_group.create' 2018-05-25 08:06:21,464 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.security_group.create' 2018-05-25 08:06:21,464 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.security_group.create' 2018-05-25 08:06:21,464 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.keypair.create' 2018-05-25 08:06:21,464 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-25 08:06:21,464 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-25 08:06:21,465 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-25 08:06:21,465 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-25 08:06:21,465 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-25 08:06:21,465 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.floatingip.create' 2018-05-25 08:06:21,465 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.security_group.create' 2018-05-25 08:06:21,465 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.security_group.create' 2018-05-25 08:06:21,465 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.security_group.create' 2018-05-25 08:06:21,465 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-25 08:06:21,465 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.security_group.create' 2018-05-25 08:06:21,465 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.keypair.create' 2018-05-25 08:06:21,465 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-25 08:06:21,466 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-25 08:06:21,466 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-25 08:06:21,466 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-25 08:06:21,466 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-25 08:06:21,466 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.security_group.create' 2018-05-25 08:06:21,466 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-25 08:06:21,466 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.security_group.create' 2018-05-25 08:06:21,466 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 
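The entries above show the sequence functest drives on the Cloudify manager: the VNFD (blueprint) is uploaded, a deployment is created (which triggers the 'create_deployment_environment' workflow), and the 'install' workflow then creates, configures and starts each Clearwater node. A minimal sketch of driving the same sequence with the cloudify-rest-client Python API follows; the manager address, blueprint path, IDs and inputs are illustrative placeholders, not values taken from this run.

    # Sketch only: blueprint -> deployment -> 'install' workflow, mirroring the
    # steps reported in the log above. Host, paths, IDs and inputs are
    # placeholders; exact constructor arguments depend on the Cloudify version.
    import time

    from cloudify_rest_client import CloudifyClient

    client = CloudifyClient(host='10.67.79.2')  # assumed manager address

    # Upload the VNFD (TOSCA blueprint) to the manager.
    client.blueprints.upload('clearwater-ims-blueprint.yaml', 'clearwater-ims')

    # Creating the deployment triggers the 'create_deployment_environment'
    # workflow seen in the log; inputs (flavor, image, networks, ...) are
    # left as a placeholder here.
    client.deployments.create('clearwater-ims', 'clearwater-ims-dep',
                              inputs={'flavor_id': 'm1.small'})

    # Start the 'install' workflow and poll the execution until it ends.
    execution = client.executions.start('clearwater-ims-dep', 'install')
    while execution.status not in ('terminated', 'failed', 'cancelled'):
        time.sleep(30)
        execution = client.executions.get(execution.id)
    print('install workflow finished with status: %s' % execution.status)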
2018-05-25 08:06:21,466 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.create' 2018-05-25 08:06:21,466 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-25 08:06:21,466 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.create' 2018-05-25 08:06:21,466 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.security_group.create' 2018-05-25 08:06:26,762 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-25 08:06:26,762 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-25 08:06:26,762 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-25 08:06:26,762 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-25 08:06:26,762 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.security_group.create' 2018-05-25 08:06:26,762 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-25 08:06:26,762 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.security_group.create' 2018-05-25 08:06:26,762 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-25 08:06:26,763 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-25 08:06:26,763 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-25 08:06:26,763 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.create' 2018-05-25 08:06:26,763 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-25 08:06:26,763 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-25 08:06:26,763 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.create' 2018-05-25 08:06:26,763 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.create' 2018-05-25 08:06:26,763 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.create' 2018-05-25 08:06:26,763 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.create' 2018-05-25 08:06:26,763 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-25 08:06:26,764 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.create' 2018-05-25 08:06:32,106 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.create' 2018-05-25 08:06:32,106 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-25 08:06:32,106 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-25 08:06:32,106 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.create' 2018-05-25 08:06:32,106 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.create' 2018-05-25 08:06:32,106 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.create' 2018-05-25 08:06:32,106 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.create' 2018-05-25 08:06:32,106 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-25 08:06:32,106 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.create' 2018-05-25 08:06:32,107 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 
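From this point the log is dominated by 'nova_plugin.server.start' tasks being rescheduled roughly every 30 seconds ("Waiting for server to be in ACTIVE state but is in BUILD:spawning state", up to 60 retries per server) until each Clearwater VM leaves the BUILD state. A minimal sketch of that kind of bounded polling loop is given below; get_server_status() is a hypothetical helper standing in for the Nova query the plugin actually performs.

    # Sketch only: bounded wait-for-ACTIVE loop, mirroring the rescheduled
    # 'nova_plugin.server.start' tasks in the log (retry_after=30, at most 60
    # retries). get_server_status() is a hypothetical callable standing in
    # for the real Nova API query.
    import time

    def wait_for_active(server_id, get_server_status, retries=60, retry_after=30):
        """Poll a server until it is ACTIVE, or give up after `retries` attempts."""
        for attempt in range(1, retries + 1):
            status = get_server_status(server_id)
            if status == 'ACTIVE':
                return True
            if status == 'ERROR':
                raise RuntimeError('server %s went into ERROR state' % server_id)
            # Equivalent of "Task rescheduled ... Retrying... [retry_after=30]".
            print('retry %d/%d: server %s is %s, waiting %ds'
                  % (attempt, retries, server_id, status, retry_after))
            time.sleep(retry_after)
        return False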
2018-05-25 08:06:32,107 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-25 08:06:32,107 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.create' 2018-05-25 08:06:32,107 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-25 08:06:32,107 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' 2018-05-25 08:06:32,107 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.create' 2018-05-25 08:06:37,489 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.create' 2018-05-25 08:06:37,490 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' 2018-05-25 08:06:37,490 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.create' 2018-05-25 08:06:37,490 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-25 08:06:37,490 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.create' 2018-05-25 08:06:37,490 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.create' 2018-05-25 08:06:37,490 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.create' 2018-05-25 08:06:37,490 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.create' 2018-05-25 08:06:37,490 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-25 08:06:37,490 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-25 08:06:37,490 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-25 08:06:37,491 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' 2018-05-25 08:06:37,491 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' 2018-05-25 08:06:37,491 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-25 08:06:37,491 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' 2018-05-25 08:06:37,491 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-25 08:06:37,491 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' 2018-05-25 08:06:37,491 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-25 08:06:37,491 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' 2018-05-25 08:06:37,491 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-25 08:06:37,491 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-25 08:06:37,492 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' 2018-05-25 08:06:37,492 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... 
[retry_after=30] 2018-05-25 08:06:42,691 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' 2018-05-25 08:06:42,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.create' 2018-05-25 08:06:42,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] 2018-05-25 08:06:42,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.create' 2018-05-25 08:06:42,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' 2018-05-25 08:06:42,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-25 08:06:42,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' 2018-05-25 08:06:42,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' 2018-05-25 08:06:42,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-25 08:06:42,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-25 08:06:42,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] 2018-05-25 08:06:42,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' 2018-05-25 08:06:42,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-25 08:06:42,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' 2018-05-25 08:06:42,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] 2018-05-25 08:06:42,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] 2018-05-25 08:06:42,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] 2018-05-25 08:06:42,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' 2018-05-25 08:06:42,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' 2018-05-25 08:06:42,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-25 08:06:47,851 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] 2018-05-25 08:06:47,851 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... 
[retry_after=30] 2018-05-25 08:07:09,176 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 1/60] 2018-05-25 08:07:09,176 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 1/60] 2018-05-25 08:07:09,176 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 1/60] 2018-05-25 08:07:09,176 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 1/60] 2018-05-25 08:07:09,176 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] [retry 1/60] 2018-05-25 08:07:14,334 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] [retry 1/60] 2018-05-25 08:07:14,334 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 1/60] 2018-05-25 08:07:14,334 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 1/60] 2018-05-25 08:07:14,334 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 1/60] 2018-05-25 08:07:14,334 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 1/60] 2018-05-25 08:07:14,334 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 1/60] 2018-05-25 08:07:14,334 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 1/60] 2018-05-25 08:07:14,334 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 1/60] 2018-05-25 08:07:14,334 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 1/60] 2018-05-25 08:07:14,335 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 1/60] 2018-05-25 08:07:14,335 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 1/60] 2018-05-25 08:07:14,335 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] [retry 1/60] 2018-05-25 08:07:14,335 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] [retry 1/60] 2018-05-25 08:07:14,335 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] [retry 1/60] 2018-05-25 08:07:14,335 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... 
[retry_after=30] [retry 1/60] 2018-05-25 08:07:19,480 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 1/60] 2018-05-25 08:07:19,480 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 1/60] 2018-05-25 08:07:19,480 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] [retry 1/60] 2018-05-25 08:07:19,480 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] [retry 1/60] 2018-05-25 08:07:40,479 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 2/60] 2018-05-25 08:07:40,479 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 2/60] 2018-05-25 08:07:40,479 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 2/60] 2018-05-25 08:07:40,479 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 2/60] 2018-05-25 08:07:45,715 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] [retry 2/60] 2018-05-25 08:07:45,715 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] [retry 2/60] 2018-05-25 08:07:45,715 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 2/60] 2018-05-25 08:07:45,715 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 2/60] 2018-05-25 08:07:45,715 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 2/60] 2018-05-25 08:07:45,715 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 2/60] 2018-05-25 08:07:45,716 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 2/60] 2018-05-25 08:07:45,716 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 2/60] 2018-05-25 08:07:45,716 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 2/60] 2018-05-25 08:07:45,716 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 2/60] 2018-05-25 08:07:45,716 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 2/60] 2018-05-25 08:07:45,716 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 2/60] 2018-05-25 08:07:45,716 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... 
[retry_after=30] [retry 2/60] 2018-05-25 08:07:51,051 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 2/60] 2018-05-25 08:07:51,051 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 2/60] 2018-05-25 08:07:51,051 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] [retry 2/60] 2018-05-25 08:07:51,051 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] [retry 2/60] 2018-05-25 08:07:51,051 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] [retry 2/60] 2018-05-25 08:07:51,052 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] [retry 2/60] 2018-05-25 08:07:51,052 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] [retry 2/60] 2018-05-25 08:08:12,243 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 3/60] 2018-05-25 08:08:12,243 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 3/60] 2018-05-25 08:08:17,418 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 3/60] 2018-05-25 08:08:17,418 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 3/60] 2018-05-25 08:08:17,418 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] [retry 3/60] 2018-05-25 08:08:17,418 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... 
[retry_after=30] [retry 3/60] 2018-05-25 08:08:17,418 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 3/60] 2018-05-25 08:08:17,418 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 3/60] 2018-05-25 08:08:17,419 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 3/60] 2018-05-25 08:08:17,419 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 3/60] 2018-05-25 08:08:17,419 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 3/60] 2018-05-25 08:08:17,419 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 3/60] 2018-05-25 08:08:17,419 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 3/60] 2018-05-25 08:08:17,419 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 3/60] 2018-05-25 08:08:17,419 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 3/60] 2018-05-25 08:08:17,419 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 3/60] 2018-05-25 08:08:23,372 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] [retry 3/60] 2018-05-25 08:08:23,372 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 3/60] 2018-05-25 08:08:23,372 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 3/60] 2018-05-25 08:08:23,372 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] [retry 3/60] 2018-05-25 08:08:23,372 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] [retry 3/60] 2018-05-25 08:08:23,372 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] [retry 3/60] 2018-05-25 08:08:23,372 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] [retry 3/60] 2018-05-25 08:08:23,372 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... 
[retry_after=30] [retry 3/60] 2018-05-25 08:08:54,297 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 4/60] 2018-05-25 08:08:54,297 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 4/60] 2018-05-25 08:08:54,297 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 4/60] 2018-05-25 08:08:54,298 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 4/60] 2018-05-25 08:08:54,298 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 4/60] 2018-05-25 08:08:54,298 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 4/60] 2018-05-25 08:08:54,298 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 4/60] 2018-05-25 08:08:54,298 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 4/60] 2018-05-25 08:08:54,298 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] [retry 4/60] 2018-05-25 08:08:54,298 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 4/60] 2018-05-25 08:08:54,298 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 4/60] 2018-05-25 08:08:54,298 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 4/60] 2018-05-25 08:08:54,298 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 4/60] 2018-05-25 08:08:54,299 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 4/60] 2018-05-25 08:08:54,299 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 4/60] 2018-05-25 08:08:54,299 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] [retry 4/60] 2018-05-25 08:08:54,299 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 4/60] 2018-05-25 08:09:00,731 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 4/60] 2018-05-25 08:09:00,731 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] [retry 4/60] 2018-05-25 08:09:00,731 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] [retry 4/60] 2018-05-25 08:09:00,731 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] [retry 4/60] 2018-05-25 08:09:07,239 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... 
[retry_after=30] [retry 4/60] 2018-05-25 08:09:07,239 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] [retry 4/60] 2018-05-25 08:09:07,239 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] [retry 4/60] 2018-05-25 08:09:25,027 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 5/60] 2018-05-25 08:09:25,027 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 5/60] 2018-05-25 08:09:25,027 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 5/60] 2018-05-25 08:09:25,027 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 5/60] 2018-05-25 08:09:25,027 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 5/60] 2018-05-25 08:09:25,027 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] [retry 5/60] 2018-05-25 08:09:25,027 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 5/60] 2018-05-25 08:09:32,413 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.start' [retry 5/60] 2018-05-25 08:09:32,413 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating Agent 2018-05-25 08:09:32,413 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.create' 2018-05-25 08:09:32,413 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 5/60] 2018-05-25 08:09:32,413 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 5/60] 2018-05-25 08:09:32,413 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.create' 2018-05-25 08:09:32,413 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 5/60] 2018-05-25 08:09:32,413 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... 
[retry_after=30] [retry 5/60] 2018-05-25 08:09:42,024 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 5/60] 2018-05-25 08:09:42,024 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 5/60] 2018-05-25 08:09:42,024 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 5/60] 2018-05-25 08:09:42,025 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 5/60] 2018-05-25 08:09:42,025 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 5/60] 2018-05-25 08:09:42,025 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 5/60] 2018-05-25 08:09:42,025 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 5/60] 2018-05-25 08:09:42,025 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.start' [retry 5/60] 2018-05-25 08:09:42,025 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating Agent 2018-05-25 08:09:42,025 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.create' 2018-05-25 08:09:42,025 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.create' 2018-05-25 08:09:42,025 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'cloudify_agent.installer.operations.create' -> Low level socket error connecting to host 10.67.79.6 on port 22: Unable to connect to port 22 on 10.67.79.6 (tried 1 time) 2018-05-25 08:09:42,025 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] [retry 5/60] 2018-05-25 08:09:42,026 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... 
[retry_after=30] [retry 5/60] 2018-05-25 08:09:49,127 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.create' 2018-05-25 08:09:49,127 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating Agent 2018-05-25 08:09:49,127 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.start' [retry 5/60] 2018-05-25 08:09:56,390 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.start' [retry 5/60] 2018-05-25 08:09:56,390 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.create' 2018-05-25 08:09:56,390 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.create' 2018-05-25 08:09:56,390 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.create' 2018-05-25 08:09:56,390 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating Agent 2018-05-25 08:09:56,390 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.create' [retry 1/60] 2018-05-25 08:09:56,390 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 6/60] 2018-05-25 08:09:56,391 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 6/60] 2018-05-25 08:09:56,391 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.create' [retry 1/60] 2018-05-25 08:10:03,916 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 6/60] 2018-05-25 08:10:03,916 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 6/60] 2018-05-25 08:10:11,129 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... [retry_after=30] [retry 6/60] 2018-05-25 08:10:11,129 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.create' 2018-05-25 08:10:11,129 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring Agent 2018-05-25 08:10:11,129 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.configure' 2018-05-25 08:10:11,129 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.configure' 2018-05-25 08:10:11,129 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 6/60] 2018-05-25 08:10:16,559 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 6/60] 2018-05-25 08:10:16,559 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... 
[retry_after=30] [retry 6/60] 2018-05-25 08:10:16,559 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 6/60] 2018-05-25 08:10:16,559 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.configure' 2018-05-25 08:10:16,559 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 6/60] 2018-05-25 08:10:16,559 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting Agent 2018-05-25 08:10:16,560 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.create' 2018-05-25 08:10:16,560 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.create' 2018-05-25 08:10:16,560 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.start' 2018-05-25 08:10:16,560 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.start' 2018-05-25 08:10:16,560 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring Agent 2018-05-25 08:10:16,560 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring Agent 2018-05-25 08:10:16,560 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.configure' 2018-05-25 08:10:16,560 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.configure' 2018-05-25 08:10:16,560 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.configure' 2018-05-25 08:10:16,560 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.configure' 2018-05-25 08:10:16,561 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'nova_plugin.server.start' -> Waiting for server to be in ACTIVE state but is in BUILD:spawning state. Retrying... 
[retry_after=30] [retry 6/60] 2018-05-25 08:10:16,561 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.start' [retry 6/60] 2018-05-25 08:10:16,561 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating Agent 2018-05-25 08:10:16,561 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.create' 2018-05-25 08:10:16,561 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.create' 2018-05-25 08:10:16,561 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.configure' 2018-05-25 08:10:16,561 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting Agent 2018-05-25 08:10:16,561 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.start' 2018-05-25 08:10:16,561 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.start' 2018-05-25 08:10:16,561 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.create' [retry 1/60] 2018-05-25 08:10:16,562 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring Agent 2018-05-25 08:10:16,562 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.configure' 2018-05-25 08:10:16,562 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.configure' 2018-05-25 08:10:16,562 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting Agent 2018-05-25 08:10:16,562 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.configure' 2018-05-25 08:10:16,562 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.start' 2018-05-25 08:10:16,562 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.start' 2018-05-25 08:10:21,992 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.start' 2018-05-25 08:10:21,993 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugins 2018-05-25 08:10:21,993 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.operations.install_plugins' 2018-05-25 08:10:21,993 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.operations.install_plugins' 2018-05-25 08:10:21,993 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.configure' 2018-05-25 08:10:21,993 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting Agent 2018-05-25 08:10:21,993 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.start' 2018-05-25 08:10:21,993 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.start' 2018-05-25 08:10:21,993 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task rescheduled 'cloudify_agent.installer.operations.create' -> Low level socket error connecting to host 10.67.79.5 on port 22: Unable to connect to port 22 on 10.67.79.5 (tried 1 time) 2018-05-25 08:10:21,993 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.operations.install_plugins' 2018-05-25 08:10:21,993 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - 
Sending task 'diamond_agent.tasks.install' 2018-05-25 08:10:21,994 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.install' 2018-05-25 08:10:21,994 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.start' 2018-05-25 08:10:21,994 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugins 2018-05-25 08:10:21,994 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.operations.install_plugins' 2018-05-25 08:10:21,994 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.operations.install_plugins' 2018-05-25 08:10:21,994 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.install' 2018-05-25 08:10:21,994 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.start' 2018-05-25 08:10:21,994 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.start' 2018-05-25 08:10:21,994 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.start' 2018-05-25 08:10:21,994 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugins 2018-05-25 08:10:21,995 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.add_collectors' 2018-05-25 08:10:21,995 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.add_collectors' 2018-05-25 08:10:21,995 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.start' 2018-05-25 08:10:27,534 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.operations.install_plugins' 2018-05-25 08:10:27,534 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.operations.install_plugins' 2018-05-25 08:10:27,534 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.operations.install_plugins' 2018-05-25 08:10:27,534 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.install' 2018-05-25 08:10:27,534 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.install' 2018-05-25 08:10:27,534 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.start' 2018-05-25 08:10:27,534 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugins 2018-05-25 08:10:27,535 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.install' 2018-05-25 08:10:27,535 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.operations.install_plugins' 2018-05-25 08:10:27,535 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.operations.install_plugins' 2018-05-25 08:10:27,535 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.start' 2018-05-25 08:10:27,535 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.start' 2018-05-25 08:10:27,535 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.operations.install_plugins' 2018-05-25 08:10:27,535 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.install' 2018-05-25 08:10:27,535 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.install' 2018-05-25 
08:10:27,535 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.start' 2018-05-25 08:10:27,536 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.add_collectors' 2018-05-25 08:10:27,536 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.add_collectors' 2018-05-25 08:10:27,536 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.install' 2018-05-25 08:10:27,536 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.start' 2018-05-25 08:10:27,536 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.start' 2018-05-25 08:10:27,536 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.operations.install_plugins' 2018-05-25 08:10:27,536 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.install' 2018-05-25 08:10:27,536 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.install' 2018-05-25 08:10:27,536 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.install' 2018-05-25 08:10:27,536 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.start' 2018-05-25 08:10:27,536 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.start' 2018-05-25 08:10:27,537 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.connect_security_group' 2018-05-25 08:10:27,537 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.add_collectors' 2018-05-25 08:10:27,537 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.start' 2018-05-25 08:10:27,537 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.add_collectors' 2018-05-25 08:10:27,537 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.add_collectors' 2018-05-25 08:10:27,537 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.connect_security_group' 2018-05-25 08:10:32,891 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-25 08:10:32,891 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.start' 2018-05-25 08:10:32,891 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:10:32,891 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:10:32,891 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.connect_security_group' 2018-05-25 08:10:32,891 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.add_collectors' 2018-05-25 08:10:32,892 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.connect_security_group' 2018-05-25 08:10:32,892 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.connect_security_group' 2018-05-25 08:10:32,892 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.connect_security_group' 2018-05-25 08:10:32,892 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.connect_security_group' 2018-05-25 08:10:32,892 - 
functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.add_collectors' 2018-05-25 08:10:32,892 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.connect_security_group' 2018-05-25 08:10:32,892 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.connect_security_group' 2018-05-25 08:10:38,468 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 7/60] 2018-05-25 08:10:38,469 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 7/60] 2018-05-25 08:10:38,469 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.create' [retry 1/60] 2018-05-25 08:10:38,469 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.create' [retry 1/60] 2018-05-25 08:10:38,469 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.connect_security_group' 2018-05-25 08:10:38,469 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.connect_security_group' 2018-05-25 08:10:38,469 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.connect_security_group' 2018-05-25 08:10:38,469 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.connect_security_group' 2018-05-25 08:10:38,469 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.connect_security_group' 2018-05-25 08:10:38,469 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.connect_security_group' 2018-05-25 08:10:38,470 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.connect_security_group' 2018-05-25 08:10:38,470 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.start' [retry 7/60] 2018-05-25 08:10:38,470 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating Agent 2018-05-25 08:10:38,470 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.create' 2018-05-25 08:10:38,470 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.create' 2018-05-25 08:10:44,352 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.connect_security_group' 2018-05-25 08:10:44,352 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.connect_security_group' 2018-05-25 08:10:44,352 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.connect_security_group' 2018-05-25 08:10:44,352 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.connect_security_group' 2018-05-25 08:10:44,353 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.connect_security_group' 2018-05-25 08:10:44,353 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.connect_security_group' 2018-05-25 08:10:44,353 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 7/60] 2018-05-25 08:10:44,353 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 7/60] 2018-05-25 08:10:44,353 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 
'nova_plugin.server.connect_floatingip' 2018-05-25 08:10:44,353 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.connect_floatingip' 2018-05-25 08:10:44,353 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.connect_security_group' 2018-05-25 08:10:44,353 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.start' [retry 7/60] 2018-05-25 08:10:44,353 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.start' [retry 7/60] 2018-05-25 08:10:44,354 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.configure' 2018-05-25 08:10:44,354 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring Agent 2018-05-25 08:10:44,354 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.create' [retry 1/60] 2018-05-25 08:10:50,410 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.connect_security_group' 2018-05-25 08:10:50,410 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.configure' 2018-05-25 08:10:50,410 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.start' [retry 7/60] 2018-05-25 08:10:50,410 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating Agent 2018-05-25 08:10:50,410 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.create' 2018-05-25 08:10:50,411 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.create' 2018-05-25 08:10:50,411 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.start' [retry 7/60] 2018-05-25 08:10:50,411 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating Agent 2018-05-25 08:10:50,411 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.create' 2018-05-25 08:10:50,411 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.create' 2018-05-25 08:10:50,411 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.create' 2018-05-25 08:10:50,411 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring Agent 2018-05-25 08:10:50,411 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.configure' 2018-05-25 08:10:50,411 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.configure' 2018-05-25 08:10:50,411 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.configure' 2018-05-25 08:10:50,412 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting Agent 2018-05-25 08:10:50,412 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.start' 2018-05-25 08:10:50,412 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.connect_floatingip' 2018-05-25 08:10:50,412 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.start' 2018-05-25 08:10:50,412 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.configure' 2018-05-25 08:10:50,412 - 
functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting Agent 2018-05-25 08:10:50,412 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.start' 2018-05-25 08:10:50,412 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.start' 2018-05-25 08:10:55,747 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.start' 2018-05-25 08:10:55,747 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugins 2018-05-25 08:10:55,747 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.operations.install_plugins' 2018-05-25 08:10:55,747 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.operations.install_plugins' 2018-05-25 08:10:55,748 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.operations.install_plugins' 2018-05-25 08:10:55,748 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.create' 2018-05-25 08:10:55,748 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.install' 2018-05-25 08:10:55,748 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.install' 2018-05-25 08:10:55,748 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring Agent 2018-05-25 08:10:55,748 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.configure' 2018-05-25 08:10:55,748 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.configure' 2018-05-25 08:10:55,748 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.start' 2018-05-25 08:10:55,748 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugins 2018-05-25 08:10:55,749 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.operations.install_plugins' 2018-05-25 08:10:55,749 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.operations.install_plugins' 2018-05-25 08:10:55,749 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.create' 2018-05-25 08:11:01,152 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.add_collectors' 2018-05-25 08:11:01,153 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring Agent 2018-05-25 08:11:01,153 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.configure' 2018-05-25 08:11:01,153 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.configure' 2018-05-25 08:11:01,153 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.install' 2018-05-25 08:11:01,153 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.start' 2018-05-25 08:11:01,153 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.start' 2018-05-25 08:11:01,153 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.start' 2018-05-25 08:11:01,153 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.add_collectors' 2018-05-25 08:11:01,153 - functest.opnfv_tests.vnf.ims.cloudify_ims 
- DEBUG - Task succeeded 'cloudify_agent.installer.operations.configure' 2018-05-25 08:11:01,154 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.operations.install_plugins' 2018-05-25 08:11:01,154 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.install' 2018-05-25 08:11:01,154 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.install' 2018-05-25 08:11:01,154 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting Agent 2018-05-25 08:11:01,154 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.start' 2018-05-25 08:11:01,154 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.start' 2018-05-25 08:11:01,154 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting Agent 2018-05-25 08:11:01,154 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.configure' 2018-05-25 08:11:01,154 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.start' 2018-05-25 08:11:01,154 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.start' 2018-05-25 08:11:01,155 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.start' 2018-05-25 08:11:01,155 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.start' 2018-05-25 08:11:01,155 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.install' 2018-05-25 08:11:01,155 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.start' 2018-05-25 08:11:01,155 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.add_collectors' 2018-05-25 08:11:01,155 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.add_collectors' 2018-05-25 08:11:06,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.add_collectors' 2018-05-25 08:11:06,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.connect_security_group' 2018-05-25 08:11:06,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.connect_security_group' 2018-05-25 08:11:06,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.start' 2018-05-25 08:11:06,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugins 2018-05-25 08:11:06,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.operations.install_plugins' 2018-05-25 08:11:06,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.operations.install_plugins' 2018-05-25 08:11:06,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.start' 2018-05-25 08:11:06,692 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Installing plugins 2018-05-25 08:11:06,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.operations.install_plugins' 2018-05-25 08:11:06,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.operations.install_plugins' 2018-05-25 08:11:06,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task 
succeeded 'diamond_agent.tasks.add_collectors' 2018-05-25 08:11:06,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.operations.install_plugins' 2018-05-25 08:11:06,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.install' 2018-05-25 08:11:06,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.install' 2018-05-25 08:11:06,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.connect_security_group' 2018-05-25 08:11:06,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.connect_security_group' 2018-05-25 08:11:06,693 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.install' 2018-05-25 08:11:11,984 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.start' 2018-05-25 08:11:11,984 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.start' 2018-05-25 08:11:11,984 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.connect_security_group' 2018-05-25 08:11:11,984 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.connect_security_group' 2018-05-25 08:11:11,985 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.operations.install_plugins' 2018-05-25 08:11:11,985 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.connect_security_group' 2018-05-25 08:11:11,985 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.start' 2018-05-25 08:11:11,985 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.add_collectors' 2018-05-25 08:11:11,985 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.add_collectors' 2018-05-25 08:11:11,985 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.install' 2018-05-25 08:11:11,985 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.install' 2018-05-25 08:11:11,985 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.install' 2018-05-25 08:11:11,985 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.start' 2018-05-25 08:11:11,985 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.start' 2018-05-25 08:11:11,986 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.add_collectors' 2018-05-25 08:11:11,986 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.start' 2018-05-25 08:11:11,986 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.add_collectors' 2018-05-25 08:11:11,986 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.connect_security_group' 2018-05-25 08:11:11,986 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.connect_security_group' 2018-05-25 08:11:11,986 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.connect_security_group' 2018-05-25 08:11:11,986 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.connect_security_group' 2018-05-25 08:11:11,986 - 
functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.connect_floatingip' 2018-05-25 08:11:11,986 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.connect_floatingip' 2018-05-25 08:11:17,526 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.add_collectors' 2018-05-25 08:11:17,526 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.connect_security_group' 2018-05-25 08:11:17,526 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.connect_security_group' 2018-05-25 08:11:17,526 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.connect_security_group' 2018-05-25 08:11:17,526 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.connect_floatingip' 2018-05-25 08:11:17,526 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-25 08:11:17,526 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:11:17,527 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:11:17,527 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.add_collectors' 2018-05-25 08:11:17,527 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.connect_security_group' 2018-05-25 08:11:17,527 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.connect_security_group' 2018-05-25 08:11:17,527 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.connect_security_group' 2018-05-25 08:11:17,527 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.connect_security_group' 2018-05-25 08:11:17,527 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.connect_security_group' 2018-05-25 08:11:17,527 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'script_runner.tasks.run' 2018-05-25 08:11:17,527 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-25 08:11:22,761 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-25 08:11:22,761 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.connect_security_group' 2018-05-25 08:11:22,761 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.connect_security_group' 2018-05-25 08:11:22,761 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.connect_security_group' 2018-05-25 08:11:22,761 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.connect_security_group' 2018-05-25 08:11:22,762 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.connect_security_group' 2018-05-25 08:11:22,762 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.connect_floatingip' 2018-05-25 08:11:22,762 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.connect_floatingip' 2018-05-25 08:11:28,447 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.connect_floatingip' 2018-05-25 08:11:45,409 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'script_runner.tasks.run' 2018-05-25 
08:11:45,409 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-25 08:11:45,409 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:11:45,409 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:13:15,436 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'script_runner.tasks.run' 2018-05-25 08:13:15,437 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-25 08:13:15,437 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-25 08:13:15,437 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-25 08:13:15,437 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:13:15,437 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:13:15,437 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-25 08:13:15,437 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:13:15,437 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-25 08:13:15,438 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:13:15,438 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-25 08:13:15,438 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:13:15,438 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:13:15,438 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Creating node 2018-05-25 08:13:15,438 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:13:15,438 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:13:15,438 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:13:15,438 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:13:15,439 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:13:15,439 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:13:15,439 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'script_runner.tasks.run' 2018-05-25 08:13:15,439 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'script_runner.tasks.run' 2018-05-25 08:13:15,439 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'script_runner.tasks.run' 2018-05-25 08:13:15,439 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'script_runner.tasks.run' 2018-05-25 08:13:15,439 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:13:15,439 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:13:15,439 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-25 08:13:15,439 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:13:15,440 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:13:15,440 - 
functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-25 08:13:15,440 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:13:15,440 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:13:21,276 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:13:21,276 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-25 08:13:21,276 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:13:21,276 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:13:21,276 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'script_runner.tasks.run' 2018-05-25 08:13:21,276 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-25 08:13:21,276 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:13:21,276 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'script_runner.tasks.run' 2018-05-25 08:13:26,689 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:13:26,689 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:13:26,689 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'script_runner.tasks.run' 2018-05-25 08:13:26,689 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:13:26,690 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-25 08:13:26,690 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:14:07,329 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'script_runner.tasks.run' 2018-05-25 08:14:12,535 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:14:12,535 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Configuring node 2018-05-25 08:14:12,535 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run'
2018-05-25 08:14:44,144 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> sudo: unable to resolve host server-clearwater-opnfv-ellis-host-dmipqm [curl progress meter elided: ~19 s elapsed, 0 bytes transferred; the 'sudo: unable to resolve host server-clearwater-opnfv-ellis-host-dmipqm' warning repeats part-way through] curl: (6) Could not resolve host: repo.cw-ngv.com gpg: no valid OpenPGP data found.
2018-05-25 08:14:59,669 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' [retry 1/60] 2018-05-25 08:14:59,669 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' [retry 1/60] 2018-05-25 08:16:04,814 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> sudo: unable to resolve host server-clearwater-opnfv-ellis-host-dmipqm W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty/InRelease W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-updates/InRelease W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-backports/InRelease W: Failed to fetch http://security.ubuntu.com/ubuntu/dists/trusty-security/InRelease W: Failed to fetch http://repo.cw-ngv.com/archive/repo122/binary/InRelease W: Failed to fetch http://security.ubuntu.com/ubuntu/dists/trusty-security/Release.gpg Could not resolve 'security.ubuntu.com' W: Failed to fetch http://repo.cw-ngv.com/archive/repo122/binary/Release.gpg Could not resolve 'repo.cw-ngv.com' W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-updates/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-backports/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Some index files failed to download. They have been ignored, or old ones used instead. sudo: unable to resolve host server-clearwater-opnfv-ellis-host-dmipqm E: Unable to locate package ellis [retry 1/60]
2018-05-25 08:16:15,456 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' [retry 2/60] 2018-05-25 08:16:15,456 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' [retry 2/60] 2018-05-25 08:16:57,396 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'script_runner.tasks.run' 2018-05-25 08:16:57,396 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-25 08:17:02,746 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:17:02,746 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:17:02,746 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'script_runner.tasks.run' 2018-05-25 08:17:08,855 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'script_runner.tasks.run' 2018-05-25 08:17:08,855 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-25 08:17:14,483 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:17:14,483 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:17:14,483 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'script_runner.tasks.run' 2018-05-25 08:17:14,483 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-25 08:17:14,483 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:17:14,484 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - 
Task started 'script_runner.tasks.run' 2018-05-25 08:17:14,484 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:17:14,484 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'script_runner.tasks.run' 2018-05-25 08:17:14,484 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:17:14,484 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:17:14,484 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:17:14,484 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'script_runner.tasks.run' 2018-05-25 08:17:19,923 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'script_runner.tasks.run' 2018-05-25 08:17:19,923 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'script_runner.tasks.run' 2018-05-25 08:17:25,232 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> sudo: unable to resolve host server-clearwater-opnfv-ellis-host-dmipqm W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty/InRelease W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-updates/InRelease W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-backports/InRelease W: Failed to fetch http://security.ubuntu.com/ubuntu/dists/trusty-security/InRelease W: Failed to fetch http://repo.cw-ngv.com/archive/repo122/binary/InRelease W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-updates/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-backports/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Failed to fetch http://repo.cw-ngv.com/archive/repo122/binary/Release.gpg Could not resolve 'repo.cw-ngv.com' W: Failed to fetch http://security.ubuntu.com/ubuntu/dists/trusty-security/Release.gpg Could not resolve 'security.ubuntu.com' W: Some index files failed to download. They have been ignored, or old ones used instead. 
sudo: unable to resolve host server-clearwater-opnfv-ellis-host-dmipqm E: Unable to locate package ellis [retry 2/60] 2018-05-25 08:17:36,131 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' [retry 3/60] 2018-05-25 08:17:41,443 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' [retry 3/60] 2018-05-25 08:17:41,443 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'script_runner.tasks.run' 2018-05-25 08:17:41,443 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-25 08:17:41,443 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:17:41,443 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:17:41,443 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'script_runner.tasks.run' 2018-05-25 08:18:08,443 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'script_runner.tasks.run' 2018-05-25 08:18:08,443 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting node 2018-05-25 08:18:08,443 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:18:08,443 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:18:41,934 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> sudo: unable to resolve host server-clearwater-opnfv-ellis-host-dmipqm W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty/InRelease W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-updates/InRelease W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-backports/InRelease W: Failed to fetch http://security.ubuntu.com/ubuntu/dists/trusty-security/InRelease W: Failed to fetch http://repo.cw-ngv.com/archive/repo122/binary/InRelease W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-updates/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-backports/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Failed to fetch http://repo.cw-ngv.com/archive/repo122/binary/Release.gpg Could not resolve 'repo.cw-ngv.com' W: Failed to fetch http://security.ubuntu.com/ubuntu/dists/trusty-security/Release.gpg Could not resolve 'security.ubuntu.com' W: Some index files failed to download. They have been ignored, or old ones used instead. 
sudo: unable to resolve host server-clearwater-opnfv-ellis-host-dmipqm E: Unable to locate package ellis [retry 3/60] 2018-05-25 08:18:57,539 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' [retry 4/60] 2018-05-25 08:18:57,539 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' [retry 4/60] 2018-05-25 08:19:43,949 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'script_runner.tasks.run' 2018-05-25 08:19:43,949 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:19:43,949 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:19:43,949 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'script_runner.tasks.run' 2018-05-25 08:19:59,440 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> sudo: unable to resolve host server-clearwater-opnfv-ellis-host-dmipqm W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty/InRelease W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-updates/InRelease W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-backports/InRelease W: Failed to fetch http://security.ubuntu.com/ubuntu/dists/trusty-security/InRelease W: Failed to fetch http://repo.cw-ngv.com/archive/repo122/binary/InRelease W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-updates/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-backports/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Failed to fetch http://security.ubuntu.com/ubuntu/dists/trusty-security/Release.gpg Could not resolve 'security.ubuntu.com' W: Failed to fetch http://repo.cw-ngv.com/archive/repo122/binary/Release.gpg Could not resolve 'repo.cw-ngv.com' W: Some index files failed to download. They have been ignored, or old ones used instead. 
sudo: unable to resolve host server-clearwater-opnfv-ellis-host-dmipqm E: Unable to locate package ellis [retry 4/60] 2018-05-25 08:20:14,915 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' [retry 5/60] 2018-05-25 08:20:14,915 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' [retry 5/60] 2018-05-25 08:21:22,010 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> sudo: unable to resolve host server-clearwater-opnfv-ellis-host-dmipqm W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty/InRelease W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-updates/InRelease W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-backports/InRelease W: Failed to fetch http://security.ubuntu.com/ubuntu/dists/trusty-security/InRelease W: Failed to fetch http://repo.cw-ngv.com/archive/repo122/binary/InRelease W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-updates/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-backports/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Failed to fetch http://security.ubuntu.com/ubuntu/dists/trusty-security/Release.gpg Could not resolve 'security.ubuntu.com' W: Failed to fetch http://repo.cw-ngv.com/archive/repo122/binary/Release.gpg Could not resolve 'repo.cw-ngv.com' W: Some index files failed to download. They have been ignored, or old ones used instead. sudo: unable to resolve host server-clearwater-opnfv-ellis-host-dmipqm E: Unable to locate package ellis [retry 5/60] 2018-05-25 08:21:32,292 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' [retry 6/60] 2018-05-25 08:21:37,422 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' [retry 6/60] 2018-05-25 08:22:39,689 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> sudo: unable to resolve host server-clearwater-opnfv-ellis-host-dmipqm W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty/InRelease W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-updates/InRelease W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-backports/InRelease W: Failed to fetch http://security.ubuntu.com/ubuntu/dists/trusty-security/InRelease W: Failed to fetch http://repo.cw-ngv.com/archive/repo122/binary/InRelease W: Failed to fetch http://security.ubuntu.com/ubuntu/dists/trusty-security/Release.gpg Could not resolve 'security.ubuntu.com' W: Failed to fetch http://repo.cw-ngv.com/archive/repo122/binary/Release.gpg Could not resolve 'repo.cw-ngv.com' W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-updates/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-backports/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Some index files failed to download. They have been ignored, or old ones used instead. 
sudo: unable to resolve host server-clearwater-opnfv-ellis-host-dmipqm E: Unable to locate package ellis [retry 6/60] 2018-05-25 08:22:55,101 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' [retry 7/60] 2018-05-25 08:22:55,101 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' [retry 7/60] 2018-05-25 08:23:58,115 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> sudo: unable to resolve host server-clearwater-opnfv-ellis-host-dmipqm W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty/InRelease W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-updates/InRelease W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-backports/InRelease W: Failed to fetch http://security.ubuntu.com/ubuntu/dists/trusty-security/InRelease W: Failed to fetch http://repo.cw-ngv.com/archive/repo122/binary/InRelease W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-updates/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-backports/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Failed to fetch http://repo.cw-ngv.com/archive/repo122/binary/Release.gpg Could not resolve 'repo.cw-ngv.com' W: Failed to fetch http://security.ubuntu.com/ubuntu/dists/trusty-security/Release.gpg Could not resolve 'security.ubuntu.com' W: Some index files failed to download. They have been ignored, or old ones used instead. sudo: unable to resolve host server-clearwater-opnfv-ellis-host-dmipqm E: Unable to locate package ellis [retry 7/60] 2018-05-25 08:24:13,572 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' [retry 8/60] 2018-05-25 08:24:13,572 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' [retry 8/60] 2018-05-25 08:25:15,189 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> sudo: unable to resolve host server-clearwater-opnfv-ellis-host-dmipqm W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty/InRelease W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-updates/InRelease W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-backports/InRelease W: Failed to fetch http://security.ubuntu.com/ubuntu/dists/trusty-security/InRelease W: Failed to fetch http://repo.cw-ngv.com/archive/repo122/binary/InRelease W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-updates/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-backports/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Failed to fetch http://repo.cw-ngv.com/archive/repo122/binary/Release.gpg Could not resolve 'repo.cw-ngv.com' W: Failed to fetch http://security.ubuntu.com/ubuntu/dists/trusty-security/Release.gpg Could not resolve 'security.ubuntu.com' W: Some index files failed to download. They have been ignored, or old ones used instead. 
sudo: unable to resolve host server-clearwater-opnfv-ellis-host-dmipqm E: Unable to locate package ellis [retry 8/60] 2018-05-25 08:25:30,601 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' [retry 9/60] 2018-05-25 08:25:30,602 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' [retry 9/60] 2018-05-25 08:26:32,688 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> sudo: unable to resolve host server-clearwater-opnfv-ellis-host-dmipqm W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty/InRelease W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-updates/InRelease W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-backports/InRelease W: Failed to fetch http://security.ubuntu.com/ubuntu/dists/trusty-security/InRelease W: Failed to fetch http://repo.cw-ngv.com/archive/repo122/binary/InRelease W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-updates/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-backports/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Failed to fetch http://repo.cw-ngv.com/archive/repo122/binary/Release.gpg Could not resolve 'repo.cw-ngv.com' W: Failed to fetch http://security.ubuntu.com/ubuntu/dists/trusty-security/Release.gpg Could not resolve 'security.ubuntu.com' W: Some index files failed to download. They have been ignored, or old ones used instead. sudo: unable to resolve host server-clearwater-opnfv-ellis-host-dmipqm E: Unable to locate package ellis [retry 9/60] 2018-05-25 08:26:48,222 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' [retry 10/60] 2018-05-25 08:26:48,222 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' [retry 10/60] 2018-05-25 08:27:56,014 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> sudo: unable to resolve host server-clearwater-opnfv-ellis-host-dmipqm W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty/InRelease W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-updates/InRelease W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-backports/InRelease W: Failed to fetch http://security.ubuntu.com/ubuntu/dists/trusty-security/InRelease W: Failed to fetch http://repo.cw-ngv.com/archive/repo122/binary/InRelease W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-updates/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-backports/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Failed to fetch http://security.ubuntu.com/ubuntu/dists/trusty-security/Release.gpg Could not resolve 'security.ubuntu.com' W: Failed to fetch http://repo.cw-ngv.com/archive/repo122/binary/Release.gpg Could not resolve 'repo.cw-ngv.com' W: Some index files failed to download. They have been ignored, or old ones used instead. 
sudo: unable to resolve host server-clearwater-opnfv-ellis-host-dmipqm E: Unable to locate package ellis [retry 10/60] 2018-05-25 08:28:06,594 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' [retry 11/60] 2018-05-25 08:28:11,979 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' [retry 11/60] 2018-05-25 08:29:15,666 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> sudo: unable to resolve host server-clearwater-opnfv-ellis-host-dmipqm W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty/InRelease W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-updates/InRelease W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-backports/InRelease W: Failed to fetch http://security.ubuntu.com/ubuntu/dists/trusty-security/InRelease W: Failed to fetch http://repo.cw-ngv.com/archive/repo122/binary/InRelease W: Failed to fetch http://security.ubuntu.com/ubuntu/dists/trusty-security/Release.gpg Could not resolve 'security.ubuntu.com' W: Failed to fetch http://repo.cw-ngv.com/archive/repo122/binary/Release.gpg Could not resolve 'repo.cw-ngv.com' W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-updates/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-backports/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Some index files failed to download. They have been ignored, or old ones used instead. sudo: unable to resolve host server-clearwater-opnfv-ellis-host-dmipqm E: Unable to locate package ellis [retry 11/60] 2018-05-25 08:29:26,229 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' [retry 12/60] 2018-05-25 08:29:31,403 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' [retry 12/60] 2018-05-25 08:30:34,687 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> sudo: unable to resolve host server-clearwater-opnfv-ellis-host-dmipqm W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty/InRelease W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-updates/InRelease W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-backports/InRelease W: Failed to fetch http://security.ubuntu.com/ubuntu/dists/trusty-security/InRelease W: Failed to fetch http://repo.cw-ngv.com/archive/repo122/binary/InRelease W: Failed to fetch http://security.ubuntu.com/ubuntu/dists/trusty-security/Release.gpg Could not resolve 'security.ubuntu.com' W: Failed to fetch http://repo.cw-ngv.com/archive/repo122/binary/Release.gpg Could not resolve 'repo.cw-ngv.com' W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-updates/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Failed to fetch http://nova.clouds.archive.ubuntu.com/ubuntu/dists/trusty-backports/Release.gpg Could not resolve 'nova.clouds.archive.ubuntu.com' W: Some index files failed to download. 
They have been ignored, or old ones used instead. sudo: unable to resolve host server-clearwater-opnfv-ellis-host-dmipqm E: Unable to locate package ellis [retry 12/60] 2018-05-25 08:30:45,086 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' [retry 13/60] 2018-05-25 08:30:50,325 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' [retry 13/60] 2018-05-25 08:31:05,988 - functest.core.vnf - ERROR - Exception on VNF testing Traceback (most recent call last): File "/usr/lib/python2.7/site-packages/xtesting/core/vnf.py", line 75, in run self.deploy_vnf() and File "/usr/lib/python2.7/site-packages/functest/opnfv_tests/vnf/ims/cloudify_ims.py", line 341, in deploy_vnf execution = wait_for_execution(cfy_client, execution, self.__logger) File "/usr/lib/python2.7/site-packages/functest/opnfv_tests/vnf/ims/cloudify_ims.py", line 493, in wait_for_execution execution.deployment_id)) RuntimeError: execution of operation install for deployment clearwater-opnfv timed out 2018-05-25 08:31:05,988 - xtesting.energy.energy - DEBUG - Restoring previous scenario (default/default) 2018-05-25 08:31:05,988 - xtesting.energy.energy - DEBUG - Submitting scenario (default/default) 2018-05-25 08:31:06,532 - xtesting.core.testcase - INFO - The results were successfully pushed to DB http://testresults.opnfv.org/test/api/v1/results 2018-05-25 08:31:06,533 - xtesting.ci.run_tests - INFO - Test result: +----------------------+------------------+------------------+----------------+ | TEST CASE | PROJECT | DURATION | RESULT | +----------------------+------------------+------------------+----------------+ | cloudify_ims | functest | 31:27 | FAIL | +----------------------+------------------+------------------+----------------+ 2018-05-25 08:31:06,536 - functest.opnfv_tests.vnf.ims.cloudify_ims - INFO - Deleting the current deployment 2018-05-25 08:31:12,568 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Starting 'uninstall' workflow execution 2018-05-25 08:31:12,568 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-25 08:31:12,568 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:31:12,568 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-25 08:31:12,568 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-25 08:31:12,569 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-25 08:31:12,569 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-25 08:31:12,569 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-25 08:31:12,569 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:31:12,569 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:31:12,569 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:31:12,569 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:31:12,569 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:31:12,569 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:31:12,569 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:31:12,569 - 
functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:31:12,570 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:31:12,570 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:31:12,570 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:31:12,570 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> Group 'dime' not found 2018-05-25 08:31:12,570 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> Group 'vellum' not found 2018-05-25 08:31:12,570 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> There is no service named "clearwater_cluster_manager" 2018-05-25 08:31:12,570 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:31:12,570 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:31:12,570 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:31:12,570 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:31:12,571 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:31:12,571 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:31:12,571 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> There is no service named "clearwater_config_manager" 2018-05-25 08:31:12,571 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:31:12,571 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:31:17,715 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'script_runner.tasks.run' 2018-05-25 08:31:17,715 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> There is no service named "clearwater_cluster_manager" 2018-05-25 08:31:17,715 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'script_runner.tasks.run' 2018-05-25 08:31:17,715 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-25 08:31:17,715 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-25 08:31:17,715 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:31:17,715 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:31:17,715 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'script_runner.tasks.run' 2018-05-25 08:31:17,716 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:31:17,716 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:31:17,716 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:31:17,716 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:31:17,716 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'script_runner.tasks.run' 
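The repeated task failures above share one root cause: the ellis instance cannot resolve external hostnames (repo.cw-ngv.com, security.ubuntu.com, nova.clouds.archive.ubuntu.com), so apt-get never finds the ellis package and the install workflow keeps retrying until functest's wait_for_execution gives up with the RuntimeError shown in the traceback. The recurring "sudo: unable to resolve host server-clearwater-opnfv-ellis-host-dmipqm" warning is a separate, cosmetic symptom: the instance's own hostname is not listed in /etc/hosts. A minimal triage sketch for this kind of failure, assuming SSH access to the instance and overcloud admin credentials; the subnet name and resolver address below are placeholders, not values taken from this run:

  # On the ellis instance (e.g. via its floating IP):
  cat /etc/resolv.conf                 # which resolvers, if any, DHCP/cloud-init handed out
  getent hosts repo.cw-ngv.com         # prints nothing when external DNS resolution is broken
  grep "$(hostname)" /etc/hosts        # empty output explains the sudo warning only, not the apt failures

  # From the jumphost, with overcloud admin credentials loaded:
  openstack subnet show <clearwater-subnet> -c dns_nameservers        # placeholder subnet name
  openstack subnet set --dns-nameserver 8.8.8.8 <clearwater-subnet>   # example: advertise a reachable resolver

With a resolver the instances can actually reach, the apt-get steps should complete well within the 60-retry budget instead of timing out after roughly thirty minutes as above.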
2018-05-25 08:31:17,716 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-25 08:31:17,716 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:31:17,716 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:31:17,716 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-25 08:31:17,716 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:31:17,716 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:31:17,717 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'script_runner.tasks.run' 2018-05-25 08:31:17,717 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-25 08:31:17,717 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:31:17,717 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:31:17,717 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'script_runner.tasks.run' 2018-05-25 08:31:17,717 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-25 08:31:17,717 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'script_runner.tasks.run' 2018-05-25 08:31:17,717 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'script_runner.tasks.run' 2018-05-25 08:31:17,717 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.stop' 2018-05-25 08:31:17,717 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.stop' 2018-05-25 08:31:17,717 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-25 08:31:22,853 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.stop' 2018-05-25 08:31:22,853 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.stop' 2018-05-25 08:31:22,853 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-25 08:31:22,853 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.stop' 2018-05-25 08:31:22,853 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.stop' 2018-05-25 08:31:22,853 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-25 08:31:22,853 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'script_runner.tasks.run' 2018-05-25 08:31:22,853 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.uninstall' 2018-05-25 08:31:22,853 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-25 08:31:22,853 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-25 08:31:22,854 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.stop' 2018-05-25 08:31:22,854 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.stop' 2018-05-25 08:31:22,854 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.stop' 2018-05-25 08:31:22,854 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.uninstall' 2018-05-25 08:31:22,854 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-25 08:31:22,854 - 
functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.uninstall' 2018-05-25 08:31:22,854 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.uninstall' 2018-05-25 08:31:22,854 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.uninstall' 2018-05-25 08:31:22,854 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping agent 2018-05-25 08:31:22,854 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> sudo: unable to resolve host server-clearwater-opnfv-ellis-host-dmipqm sudo: monit: command not found 2018-05-25 08:31:22,854 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.stop' 2018-05-25 08:31:22,854 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.stop' 2018-05-25 08:31:22,854 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:31:22,855 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:31:22,855 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.uninstall' 2018-05-25 08:31:22,855 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.stop' 2018-05-25 08:31:22,855 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping agent 2018-05-25 08:31:22,855 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.uninstall' 2018-05-25 08:31:22,855 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.stop' 2018-05-25 08:31:22,855 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.uninstall' 2018-05-25 08:31:22,855 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.stop' 2018-05-25 08:31:22,855 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.stop' 2018-05-25 08:31:22,855 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.stop' 2018-05-25 08:31:22,855 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.stop' 2018-05-25 08:31:22,855 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.stop' 2018-05-25 08:31:22,856 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.uninstall' 2018-05-25 08:31:22,856 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.uninstall' 2018-05-25 08:31:22,856 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> u'ellis' 2018-05-25 08:31:22,856 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.uninstall' 2018-05-25 08:31:22,856 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping agent 2018-05-25 08:31:22,856 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.stop' 2018-05-25 08:31:22,856 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.stop' 2018-05-25 08:31:22,856 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.uninstall' 2018-05-25 08:31:22,856 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - 
Deleting node 2018-05-25 08:31:22,856 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.stop' 2018-05-25 08:31:22,856 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping agent 2018-05-25 08:31:22,856 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'script_runner.tasks.run' 2018-05-25 08:31:22,857 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.stop' 2018-05-25 08:31:22,857 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'script_runner.tasks.run' 2018-05-25 08:31:28,083 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.stop' 2018-05-25 08:31:28,083 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting agent 2018-05-25 08:31:28,083 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.delete' 2018-05-25 08:31:28,083 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.delete' 2018-05-25 08:31:28,083 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.stop' 2018-05-25 08:31:28,083 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting agent 2018-05-25 08:31:28,083 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.delete' 2018-05-25 08:31:28,083 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.delete' 2018-05-25 08:31:28,084 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.stop' 2018-05-25 08:31:28,084 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting agent 2018-05-25 08:31:28,084 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.delete' 2018-05-25 08:31:28,084 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.delete' 2018-05-25 08:31:28,084 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.delete' 2018-05-25 08:31:28,084 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.stop' 2018-05-25 08:31:28,084 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting agent 2018-05-25 08:31:33,319 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.delete' 2018-05-25 08:31:33,319 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.delete' 2018-05-25 08:31:33,319 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.stop' 2018-05-25 08:31:33,319 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.stop' 2018-05-25 08:31:33,319 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.delete' 2018-05-25 08:31:33,319 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.stop' 2018-05-25 08:31:33,320 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.stop' 2018-05-25 08:31:33,320 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.delete' 2018-05-25 08:31:33,320 - 
functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.stop' 2018-05-25 08:31:33,320 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.stop' 2018-05-25 08:31:33,320 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.delete' 2018-05-25 08:31:33,320 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'diamond_agent.tasks.stop' -> timeout after 10 seconds (pid=5880) 2018-05-25 08:31:33,320 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.stop' 2018-05-25 08:31:33,320 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.stop' 2018-05-25 08:31:38,658 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.uninstall' 2018-05-25 08:31:38,658 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.uninstall' 2018-05-25 08:31:38,658 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.stop' 2018-05-25 08:31:38,658 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.stop' 2018-05-25 08:31:38,658 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.stop' 2018-05-25 08:31:38,658 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' -> sudo: unable to resolve host server-clearwater-opnfv-ellis-host-dmipqm clearwater-etcd: unrecognized service 2018-05-25 08:31:38,658 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:31:38,658 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:31:38,659 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.disconnect_floatingip' 2018-05-25 08:31:38,659 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.uninstall' 2018-05-25 08:31:38,659 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.disconnect_floatingip' 2018-05-25 08:31:38,659 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:31:38,659 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:31:38,659 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping agent 2018-05-25 08:31:38,659 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.stop' 2018-05-25 08:31:38,659 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.stop' 2018-05-25 08:31:38,659 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-25 08:31:38,659 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.stop' 2018-05-25 08:31:38,659 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.stop' 2018-05-25 08:31:38,659 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.stop' 2018-05-25 08:31:38,660 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:31:38,660 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 
'nova_plugin.server.disconnect_security_group' 2018-05-25 08:31:38,660 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.stop' 2018-05-25 08:31:38,660 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.uninstall' 2018-05-25 08:31:43,826 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.uninstall' 2018-05-25 08:31:43,826 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:31:43,826 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:31:43,827 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:31:43,827 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.stop' 2018-05-25 08:31:43,827 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting agent 2018-05-25 08:31:43,827 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.delete' 2018-05-25 08:31:43,827 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.delete' 2018-05-25 08:31:43,827 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.disconnect_floatingip' 2018-05-25 08:31:43,827 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:31:43,827 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:31:43,827 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:31:43,828 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:31:43,828 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:31:43,828 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:31:43,828 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:31:43,828 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:31:43,828 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:31:43,828 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-25 08:31:43,828 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.delete' 2018-05-25 08:31:43,828 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.delete' 2018-05-25 08:31:49,337 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.delete' 2018-05-25 08:31:49,337 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.stop' 2018-05-25 08:31:49,337 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.stop' 2018-05-25 08:31:49,337 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 
'nova_plugin.server.disconnect_security_group' 2018-05-25 08:31:49,337 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:31:49,337 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:31:49,337 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:31:49,337 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-25 08:31:49,338 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.delete' 2018-05-25 08:31:49,338 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.delete' 2018-05-25 08:31:49,338 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:31:49,338 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:31:49,338 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:31:49,338 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.stop' 2018-05-25 08:31:49,338 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-25 08:31:49,338 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.delete' 2018-05-25 08:31:49,338 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.delete' 2018-05-25 08:31:49,338 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:31:49,339 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:31:49,339 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:31:54,557 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:31:54,557 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-25 08:31:54,557 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.delete' 2018-05-25 08:31:54,558 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.delete' 2018-05-25 08:31:54,558 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.delete' 2018-05-25 08:31:54,558 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-25 08:31:54,558 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-25 08:31:54,558 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.security_group.delete' 2018-05-25 08:31:54,558 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.security_group.delete' 2018-05-25 08:31:54,558 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:31:54,558 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-25 08:31:54,558 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.delete' 2018-05-25 08:31:54,559 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 
'nova_plugin.server.delete' 2018-05-25 08:31:54,559 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.delete' 2018-05-25 08:31:54,559 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-25 08:31:54,559 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.security_group.delete' 2018-05-25 08:31:54,559 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-25 08:31:54,559 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.security_group.delete' 2018-05-25 08:31:54,559 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.security_group.delete' 2018-05-25 08:31:59,986 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.security_group.delete' 2018-05-25 08:31:59,986 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.delete' 2018-05-25 08:31:59,987 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-25 08:31:59,987 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-25 08:31:59,987 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.security_group.delete' 2018-05-25 08:31:59,987 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.security_group.delete' 2018-05-25 08:31:59,987 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.uninstall' 2018-05-25 08:31:59,987 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping agent 2018-05-25 08:32:05,288 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.stop' 2018-05-25 08:32:05,288 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.stop' 2018-05-25 08:32:05,288 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.delete' 2018-05-25 08:32:05,288 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-25 08:32:05,288 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-25 08:32:05,288 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-25 08:32:05,288 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.security_group.delete' 2018-05-25 08:32:05,288 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-25 08:32:05,289 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-25 08:32:05,289 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.security_group.delete' 2018-05-25 08:32:05,289 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-25 08:32:05,289 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.security_group.delete' 2018-05-25 08:32:05,289 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.security_group.delete' 2018-05-25 08:32:05,289 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.security_group.delete' 2018-05-25 08:32:05,289 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.floatingip.delete' 2018-05-25 08:32:05,289 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.floatingip.delete' 2018-05-25 08:32:05,289 - functest.opnfv_tests.vnf.ims.cloudify_ims - 
DEBUG - Task succeeded 'neutron_plugin.security_group.delete' 2018-05-25 08:32:05,289 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.security_group.delete' 2018-05-25 08:32:05,290 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.floatingip.delete' 2018-05-25 08:32:05,290 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.delete' 2018-05-25 08:32:36,349 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.stop' 2018-05-25 08:32:36,349 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting agent 2018-05-25 08:32:36,349 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.delete' 2018-05-25 08:32:36,349 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.delete' 2018-05-25 08:33:22,730 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.delete' 2018-05-25 08:33:22,730 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.stop' 2018-05-25 08:33:22,730 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.stop' 2018-05-25 08:33:22,730 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.stop' 2018-05-25 08:33:22,730 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.disconnect_floatingip' 2018-05-25 08:33:22,730 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.disconnect_floatingip' 2018-05-25 08:33:33,173 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.disconnect_floatingip' 2018-05-25 08:33:33,174 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:33:33,174 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:33:33,174 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:33:33,174 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:33:33,174 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:33:38,648 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:33:38,648 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-25 08:33:38,648 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.delete' 2018-05-25 08:33:38,649 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.delete' 2018-05-25 08:33:43,832 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.delete' 2018-05-25 08:33:43,833 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-25 08:33:43,833 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-25 08:33:48,965 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-25 08:33:48,965 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.floatingip.delete' 2018-05-25 08:33:48,965 - 
functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-25 08:33:48,965 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.floatingip.delete' 2018-05-25 08:33:48,965 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.security_group.delete' 2018-05-25 08:33:48,965 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.security_group.delete' 2018-05-25 08:33:48,965 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.security_group.delete' 2018-05-25 08:33:48,966 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.floatingip.delete' 2018-05-25 08:34:14,626 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task failed 'script_runner.tasks.run' 2018-05-25 08:34:19,753 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-25 08:34:19,753 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-25 08:34:19,753 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.stop' 2018-05-25 08:34:19,753 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.stop' 2018-05-25 08:34:19,754 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-25 08:34:19,754 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-25 08:34:19,754 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.stop' 2018-05-25 08:34:19,754 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.stop' 2018-05-25 08:34:19,754 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.stop' 2018-05-25 08:34:19,754 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.uninstall' 2018-05-25 08:34:19,754 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.stop' 2018-05-25 08:34:19,754 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.uninstall' 2018-05-25 08:34:19,754 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'diamond_agent.tasks.uninstall' 2018-05-25 08:34:19,754 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'diamond_agent.tasks.uninstall' 2018-05-25 08:34:24,914 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.uninstall' 2018-05-25 08:34:24,914 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping agent 2018-05-25 08:34:24,914 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'diamond_agent.tasks.uninstall' 2018-05-25 08:34:24,914 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.stop' 2018-05-25 08:34:24,914 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.stop' 2018-05-25 08:34:24,914 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping agent 2018-05-25 08:34:24,914 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.stop' 2018-05-25 08:34:24,915 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.stop' 2018-05-25 08:34:24,915 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.stop' 2018-05-25 08:34:30,111 - 
functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting agent 2018-05-25 08:34:30,112 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.delete' 2018-05-25 08:34:30,112 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.stop' 2018-05-25 08:34:30,112 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.delete' 2018-05-25 08:34:30,112 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting agent 2018-05-25 08:34:30,112 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'cloudify_agent.installer.operations.delete' 2018-05-25 08:34:30,112 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'cloudify_agent.installer.operations.delete' 2018-05-25 08:34:30,112 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.delete' 2018-05-25 08:34:30,112 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'cloudify_agent.installer.operations.delete' 2018-05-25 08:34:30,112 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.stop' 2018-05-25 08:34:30,113 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.stop' 2018-05-25 08:34:30,113 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.stop' 2018-05-25 08:34:30,113 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.stop' 2018-05-25 08:34:30,113 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.stop' 2018-05-25 08:34:30,113 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.stop' 2018-05-25 08:34:30,113 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.disconnect_floatingip' 2018-05-25 08:34:30,113 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:34:30,113 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.disconnect_floatingip' 2018-05-25 08:34:30,113 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:34:35,339 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.disconnect_floatingip' 2018-05-25 08:34:35,339 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:34:35,339 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:34:35,340 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:34:35,340 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:34:35,340 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:34:40,471 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:34:40,471 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:34:40,471 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task 
started 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:34:40,471 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:34:40,471 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-25 08:34:40,471 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.delete' 2018-05-25 08:34:40,472 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.delete' 2018-05-25 08:34:40,472 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.disconnect_security_group' 2018-05-25 08:34:40,472 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-25 08:34:40,472 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.server.delete' 2018-05-25 08:34:40,472 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'nova_plugin.server.delete' 2018-05-25 08:34:50,945 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.delete' 2018-05-25 08:34:50,946 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-25 08:34:50,946 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-25 08:34:50,946 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.security_group.delete' 2018-05-25 08:34:50,946 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.security_group.delete' 2018-05-25 08:34:50,946 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.server.delete' 2018-05-25 08:34:50,946 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-25 08:34:50,946 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-25 08:34:50,946 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-25 08:34:50,946 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Stopping node 2018-05-25 08:34:50,947 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.security_group.delete' 2018-05-25 08:34:50,947 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-25 08:34:50,947 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-25 08:34:50,947 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-25 08:34:50,947 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Deleting node 2018-05-25 08:34:50,947 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.floatingip.delete' 2018-05-25 08:34:50,947 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.security_group.delete' 2018-05-25 08:34:50,947 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.floatingip.delete' 2018-05-25 08:34:50,947 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'neutron_plugin.security_group.delete' 2018-05-25 08:34:50,947 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.security_group.delete' 2018-05-25 08:34:50,948 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 'neutron_plugin.security_group.delete' 2018-05-25 08:34:50,948 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Sending task 'nova_plugin.keypair.delete' 2018-05-25 08:34:50,948 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task started 
'nova_plugin.keypair.delete' 2018-05-25 08:34:56,149 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'nova_plugin.keypair.delete' 2018-05-25 08:34:56,149 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.floatingip.delete' 2018-05-25 08:34:56,149 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.security_group.delete' 2018-05-25 08:34:56,149 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - Task succeeded 'neutron_plugin.security_group.delete' 2018-05-25 08:34:56,149 - functest.opnfv_tests.vnf.ims.cloudify_ims - DEBUG - 'uninstall' workflow execution succeeded 2018-05-25 08:34:57,158 - functest.core.vnf - INFO - Removing the VNF resources .. 2018-05-25 08:35:16,566 - xtesting.ci.run_tests - ERROR - The test case 'cloudify_ims' failed. 2018-05-25 08:35:16,567 - xtesting.ci.run_tests - INFO - Running test case 'vyos_vrouter'... 2018-05-25 08:35:16,685 - functest.opnfv_tests.vnf.router.utilvnf - DEBUG - Downloading the test data. 2018-05-25 08:35:17,085 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - Orchestrator configuration {'requirements': {u'flavor': {u'ram_min': 4096, u'name': u'm1.medium'}, u'os_image': u'cloudify_manager_4.0'}} 2018-05-25 08:35:17,085 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - name = functest.opnfv_tests.vnf.router.cloudify_vrouter 2018-05-25 08:35:17,113 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - VNF configuration: {'inputs': {u'region': u'RegionOne', u'external_network_name': u'admin_floating_net'}, 'requirements': {u'flavor': {u'ram_min': 2048, u'name': u'm1.medium'}}, 'descriptor': {u'file_name': u'/src/opnfv-vnf-vyos-blueprint/function-test-openstack-blueprint.yaml', u'version': u'fraser', u'name': u'vrouter-opnfv'}} 2018-05-25 08:35:17,119 - functest.opnfv_tests.vnf.router.utilvnf - DEBUG - Downloading the test data. 
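Both VNF failures in this tier happen inside the same short-circuited chain: the cloudify_ims traceback above points at xtesting/core/vnf.py line 75 ("self.deploy_vnf() and") and the vyos_vrouter traceback below points at line 74 ("if (self.deploy_orchestrator() and"), so whichever step returns False or raises first aborts the case, which is then reported as FAIL. A simplified, non-authoritative sketch of that control flow, reconstructed only from those frames:

    # Simplified sketch (not the actual xtesting source) of how the VNF base
    # class chains its steps; any step returning False or raising an
    # exception stops the chain and the case is marked FAIL.
    import logging
    import time

    LOGGER = logging.getLogger(__name__)

    class VnfOnBoardingSketch(object):
        EX_OK = 0
        EX_RUN_ERROR = 1

        def deploy_orchestrator(self):   # e.g. install the Cloudify manager
            raise NotImplementedError

        def deploy_vnf(self):            # e.g. run the 'install' workflow
            raise NotImplementedError

        def test_vnf(self):              # e.g. run the functional test suite
            raise NotImplementedError

        def run(self, **kwargs):
            self.start_time = time.time()
            try:
                if (self.deploy_orchestrator() and
                        self.deploy_vnf() and
                        self.test_vnf()):
                    self.stop_time = time.time()
                    return self.EX_OK
            except Exception:  # pylint: disable=broad-except
                LOGGER.exception("Exception on VNF testing")
            self.stop_time = time.time()
            return self.EX_RUN_ERROR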
2018-05-25 08:35:17,142 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - Images needed for vrouter: {u'cloudify_manager_4.0': u'/home/opnfv/functest/images/cloudify-manager-premium-4.0.1.qcow2', u'vyos1.1.7': u'/home/opnfv/functest/images/vyos-1.1.7.img'} 2018-05-25 08:35:17,142 - functest.core.vnf - INFO - Prepare VNF: vyos_vrouter, description: Created by OPNFV Functest: vyos_vrouter 2018-05-25 08:35:19,769 - functest.core.vnf - DEBUG - snaps creds: OSCreds - username=vyos_vrouter-b26f1312-41de-42d3-9eda-6452a86a8078, password=b088302d-1404-4154-9686-cf4f73218da2, auth_url=http://172.30.9.26:5000/v3, project_name=vyos_vrouter-b26f1312-41de-42d3-9eda-6452a86a8078, identity_api_version=3.0, image_api_version=2.0, network_api_version=2.0, compute_api_version=2.0, heat_api_version=1, user_domain_id=default, user_domain_name=Default, project_domain_id=default, project_domain_name=Default, interface=public, region_name=regionOne, proxy_settings=None, cacert=False 2018-05-25 08:35:19,769 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - Additional pre-configuration steps 2018-05-25 08:35:19,769 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - Upload some OS images if it doesn't exist 2018-05-25 08:35:19,769 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - image: cloudify_manager_4.0, file: /home/opnfv/functest/images/cloudify-manager-premium-4.0.1.qcow2 2018-05-25 08:36:42,048 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - image: vyos1.1.7, file: /home/opnfv/functest/images/vyos-1.1.7.img 2018-05-25 08:36:51,561 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - Creating keypair ... 2018-05-25 08:36:52,288 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - Creating full network ... 2018-05-25 08:36:59,668 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - Creating security group for cloudify manager vm 2018-05-25 08:37:01,602 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - Get or create flavor for cloudify manager vm ... 2018-05-25 08:37:02,259 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - Creating cloudify manager VM 2018-05-25 08:39:55,429 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - Attemps running status of the Manager 2018-05-25 08:40:01,279 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - The current manager status is running 2018-05-25 08:40:31,309 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - Cloudify Manager is up and running 2018-05-25 08:40:31,310 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - Put private keypair in manager 2018-05-25 08:40:52,528 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - SSH sudo cp ~/cloudify_vrouter.pem /etc/cloudify/ stdout: 2018-05-25 08:40:52,604 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - SSH sudo chmod 444 /etc/cloudify/cloudify_vrouter.pem stdout: 2018-05-25 08:41:13,512 - functest.opnfv_tests.vnf.router.cloudify_vrouter - DEBUG - SSH sudo yum install -y gcc python-devel stdout: Loaded plugins: fastestmirror Could not retrieve mirrorlist http://mirrorlist.centos.org/?release=7&arch=x86_64&repo=os&infra=genclo error was 14: curl#6 - "Could not resolve host: mirrorlist.centos.org; Unknown error" 2018-05-25 08:41:13,512 - functest.opnfv_tests.vnf.router.cloudify_vrouter - ERROR - SSH sudo yum install -y gcc python-devel stderr: One of the configured repositories failed (Unknown), and yum doesn't have enough cached data to continue. 
At this point the only safe thing yum can do is fail. There are a few ways to work "fix" this: 1. Contact the upstream for the repository and get them to fix the problem. 2. Reconfigure the baseurl/etc. for the repository, to point to a working upstream. This is most often useful if you are using a newer distribution release than is supported by the repository (and the packages for the previous distribution release still work). 3. Run the command with the repository temporarily disabled yum --disablerepo=<repoid> ... 4. Disable the repository permanently, so yum won't use it by default. Yum will then just ignore the repository until you permanently enable it again or use --enablerepo for temporary usage: yum-config-manager --disable <repoid> or subscription-manager repos --disable=<repoid> 5. Configure the failing repository to be skipped, if it is unavailable. Note that yum will try to contact the repo. when it runs most commands, so will have to try and fail each time (and thus. yum will be be much slower). If it is a very temporary problem though, this is often a nice compromise: yum-config-manager --save --setopt=<repoid>.skip_if_unavailable=true Cannot find a valid baseurl for repo: base/7/x86_64 2018-05-25 08:41:13,513 - functest.core.vnf - ERROR - Exception on VNF testing Traceback (most recent call last): File "/usr/lib/python2.7/site-packages/xtesting/core/vnf.py", line 74, in run if (self.deploy_orchestrator() and File "/usr/lib/python2.7/site-packages/functest/opnfv_tests/vnf/router/cloudify_vrouter.py", line 289, in deploy_orchestrator ssh, cmd, "Unable to install packages on manager") File "/usr/lib/python2.7/site-packages/functest/opnfv_tests/vnf/router/cloudify_vrouter.py", line 131, in run_blocking_ssh_command raise Exception(error_msg) Exception: Unable to install packages on manager 2018-05-25 08:41:13,635 - xtesting.core.testcase - INFO - The results were successfully pushed to DB http://testresults.opnfv.org/test/api/v1/results 2018-05-25 08:41:13,635 - xtesting.ci.run_tests - INFO - Test result: +----------------------+------------------+------------------+----------------+ | TEST CASE | PROJECT | DURATION | RESULT | +----------------------+------------------+------------------+----------------+ | vyos_vrouter | functest | 05:56 | FAIL | +----------------------+------------------+------------------+----------------+ 2018-05-25 08:41:13,639 - functest.opnfv_tests.vnf.router.cloudify_vrouter - INFO - Deleting the current deployment 2018-05-25 08:41:13,725 - functest.opnfv_tests.vnf.router.cloudify_vrouter - ERROR - Some issue during the undeployment .. 
Traceback (most recent call last): File "/usr/lib/python2.7/site-packages/functest/opnfv_tests/vnf/router/cloudify_vrouter.py", line 411, in clean exec_list = cfy_client.executions.list(dep_name) File "/usr/lib/python2.7/site-packages/cloudify_rest_client/executions.py", line 125, in list response = self.api.get(uri, params=params, _include=_include) File "/usr/lib/python2.7/site-packages/cloudify_rest_client/client.py", line 229, in get timeout=timeout) File "/usr/lib/python2.7/site-packages/cloudify_rest_client/client.py", line 211, in do_request verify=self.get_request_verify(), timeout=timeout) File "/usr/lib/python2.7/site-packages/cloudify_rest_client/client.py", line 154, in _do_request self._raise_client_error(response, request_url) File "/usr/lib/python2.7/site-packages/cloudify_rest_client/client.py", line 116, in _raise_client_error server_traceback=server_traceback) File "/usr/lib/python2.7/site-packages/cloudify_rest_client/client.py", line 127, in _prepare_and_raise_exception status_code, error_code=error_code) CloudifyClientError: 404: Requested `Deployment` with ID `vrouter-opnfv` was not found 2018-05-25 08:41:13,726 - functest.core.vnf - INFO - Removing the VNF resources .. 2018-05-25 08:41:33,016 - xtesting.ci.run_tests - ERROR - The test case 'vyos_vrouter' failed. 2018-05-25 08:41:33,016 - xtesting.ci.run_tests - INFO - Running test case 'juju_epc'... 2018-05-25 08:41:33,105 - functest.opnfv_tests.vnf.epc.juju_epc - DEBUG - VNF configuration: {'descriptor': {u'file_name': u'/src/epc-requirements/abot_charm/functest-abot-epc-bundle/bundle.yaml', u'version': u'1', u'name': u'abot-oai-epc'}, 'requirements': {u'flavor': {u'ram_min': 4096, u'name': u'm1.medium.juju'}}} 2018-05-25 08:41:33,123 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Additional pre-configuration steps 2018-05-25 08:41:33,124 - functest.core.vnf - INFO - Prepare VNF: juju_epc, description: Created by OPNFV Functest: juju_epc 2018-05-25 08:41:35,748 - functest.core.vnf - DEBUG - snaps creds: OSCreds - username=juju_epc-cd7b5f71-b6f5-4d52-b6d0-e1d43ce8c8f3, password=99c7e82f-167f-4ebb-b08c-d725a9fa9e6b, auth_url=http://172.30.9.26:5000/v3, project_name=juju_epc-cd7b5f71-b6f5-4d52-b6d0-e1d43ce8c8f3, identity_api_version=3.0, image_api_version=2.0, network_api_version=2.0, compute_api_version=2.0, heat_api_version=1, user_domain_id=default, user_domain_name=Default, project_domain_id=default, project_domain_name=Default, interface=public, region_name=regionOne, proxy_settings=None, cacert=False 2018-05-25 08:41:35,749 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - ENV: +--------------------------------------+----------------------------------------------------------+ | ENV VAR | VALUE | +--------------------------------------+----------------------------------------------------------+ | DEPLOY_SCENARIO | os-ovn-nofeature-noha | | BUILD_TAG | jenkins-functest-apex-baremetal-daily-fraser-149 | | SDN_CONTROLLER_IP | 192.30.9.4 | | ENERGY_RECORDER_API_PASSWORD | | | INSTALLER_TYPE | apex | | NAMESERVER | 8.8.8.8 | | POD_ARCH | | | CI_LOOP | daily | | TEST_DB_URL | http://testresults.opnfv.org/test/api/v1/results | | ENERGY_RECORDER_API_URL | http://energy.opnfv.fr/resources | | NODE_NAME | lf-pod1 | | VOLUME_DEVICE_NAME | vdb | | EXTERNAL_NETWORK | | | ENERGY_RECORDER_API_USER | | +--------------------------------------+----------------------------------------------------------+ 2018-05-25 08:41:36,116 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Creating Cloud for Abot-epc ..... 
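The undeployment traceback above (404: Requested `Deployment` with ID `vrouter-opnfv` was not found) is a knock-on effect rather than a separate fault: deploy_orchestrator() failed before any blueprint was deployed, so clean() has no deployment to list. A cleanup that tolerates this could swallow the 404 explicitly; the snippet below is only an illustrative pattern, with clean_deployment standing in for the real method and cfy_client/dep_name for the objects named in the traceback.

    # Illustrative pattern only: treat a missing deployment as "nothing to
    # clean" instead of logging a traceback, since a failed
    # deploy_orchestrator() means the deployment was never created.
    from cloudify_rest_client.exceptions import CloudifyClientError

    def clean_deployment(cfy_client, dep_name, logger):
        try:
            exec_list = cfy_client.executions.list(dep_name)
        except CloudifyClientError as exc:
            if exc.status_code == 404:
                logger.info("Deployment %s not found, nothing to clean",
                            dep_name)
                return
            raise
        logger.debug("Executions found for %s: %s", dep_name, exec_list)
        # ... continue with the normal uninstall / delete sequence here ...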
2018-05-25 08:41:37,105 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - juju add-cloud abot-epc -f /home/opnfv/functest/results/juju_epc/clouds.yaml --replace Since Juju 2 is being run for the first time, downloading latest cloud information. Fetching latest public cloud list... Updated your list of public clouds with 6 cloud regions added: added cloud region: - aws/eu-west-3 - google/asia-south1 - google/europe-west2 - google/europe-west3 - google/southamerica-east1 - google/us-east4 2018-05-25 08:41:37,106 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Creating Credentials for Abot-epc ..... 2018-05-25 08:41:39,013 - functest.opnfv_tests.vnf.epc.juju_epc - DEBUG - snaps creds: OSCreds - username=juju_network_discovery_bug, password=c498bbef-709a-4b0c-94f8-e6db45bd5b74, auth_url=http://172.30.9.26:5000/v3, project_name=juju_epc-cd7b5f71-b6f5-4d52-b6d0-e1d43ce8c8f3, identity_api_version=3.0, image_api_version=2.0, network_api_version=2.0, compute_api_version=2.0, heat_api_version=1, user_domain_id=default, user_domain_name=Default, project_domain_id=default, project_domain_name=Default, interface=public, region_name=regionOne, proxy_settings=None, cacert=False 2018-05-25 08:41:39,122 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - juju add-credential abot-epc -f /home/opnfv/functest/results/juju_epc/credentials.yaml --replace Credentials updated for cloud "abot-epc". 2018-05-25 08:41:39,122 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Deploying Juju Orchestrator 2018-05-25 08:41:39,122 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Creating full network with nameserver: 8.8.8.8 2018-05-25 08:41:41,613 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Creating network Router .... 2018-05-25 08:41:46,043 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Creating Flavor .... 2018-05-25 08:41:46,670 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Upload some OS images if it doesn't exist 2018-05-25 08:41:46,682 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Images needed for vEPC: {u'trusty': u'/home/opnfv/functest/images/ubuntu-14.04-server-cloudimg-amd64-disk1.img', u'xenial': u'/home/opnfv/functest/images/ubuntu-16.04-server-cloudimg-amd64-disk1.img'} 2018-05-25 08:41:46,682 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - image: trusty, file: /home/opnfv/functest/images/ubuntu-14.04-server-cloudimg-amd64-disk1.img 2018-05-25 08:41:52,724 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - juju metadata generate-image -d /root -i a926abc6-fa0a-42fc-bb29-383f507be1ad -s trusty -r regionOne -u http://172.30.9.26:5000/v3 WARNING model could not be opened: No controllers registered. Please either create a new controller using "juju bootstrap" or connect to another controller that you have been given access to using "juju register". Image metadata files have been written to: /root/images/streams/v1. For Juju to use this metadata, the files need to be put into the image metadata search path. There are 2 options: 1. Use the --metadata-source parameter when bootstrapping: juju bootstrap --metadata-source /root 2. Use image-metadata-url in $JUJU_DATA/environments.yaml (if $JUJU_DATA is not set it will try $XDG_DATA_HOME/juju and if not set either default to ~/.local/share/juju) Configure a http server to serve the contents of /root and set the value of image-metadata-url accordingly. 
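The "No controllers registered" warning from juju metadata generate-image is expected at this stage: the image metadata is written under /root before any controller exists, and the bootstrap that follows consumes it through --metadata-source (option 1 above). The snippet below is a rough reconstruction of that bootstrap invocation as a subprocess call; the argument list is copied from the failed command recorded further down in this log, and the helper name is illustrative.

    # Rough reconstruction (illustrative helper name) of the bootstrap call
    # recorded later in this log; the argument list is copied verbatim from
    # the logged command, including the external 'timeout -t 3600' wrapper.
    import subprocess

    def bootstrap_abot_controller(net_id):
        cmd = [
            'timeout', '-t', '3600',
            'juju', 'bootstrap', 'abot-epc', 'abot-controller',
            '--metadata-source', '/root',
            '--constraints', 'mem=2G',
            '--bootstrap-series', 'xenial',
            '--config', 'network={}'.format(net_id),
            '--config', 'ssl-hostname-verification=false',
            '--config', 'use-floating-ip=true',
            '--config', 'use-default-secgroup=true',
            '--debug',
        ]
        # check_output raises CalledProcessError on a non-zero exit, which
        # functest reports further down as "Exception with Juju Bootstrap".
        return subprocess.check_output(cmd, stderr=subprocess.STDOUT)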
2018-05-25 08:41:52,724 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - image: xenial, file: /home/opnfv/functest/images/ubuntu-16.04-server-cloudimg-amd64-disk1.img
2018-05-25 08:41:59,887 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - juju metadata generate-image -d /root -i 75eedffa-bed4-42a1-aa7f-9284f0552c6c -s xenial -r regionOne -u http://172.30.9.26:5000/v3
WARNING model could not be opened: No controllers registered.
Please either create a new controller using "juju bootstrap" or connect to another controller that you have been given access to using "juju register".
Image metadata files have been written to: /root/images/streams/v1.
For Juju to use this metadata, the files need to be put into the image metadata search path. There are 2 options:
1. Use the --metadata-source parameter when bootstrapping: juju bootstrap --metadata-source /root
2. Use image-metadata-url in $JUJU_DATA/environments.yaml (if $JUJU_DATA is not set it will try $XDG_DATA_HOME/juju and if not set either default to ~/.local/share/juju)
Configure a http server to serve the contents of /root and set the value of image-metadata-url accordingly.
2018-05-25 08:41:59,888 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Network ID : 7fbd4134-8a7c-46b3-8be5-448e6e7a188c
2018-05-25 08:41:59,888 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Starting Juju Bootstrap process...
2018-05-25 08:44:14,556 - functest.opnfv_tests.vnf.epc.juju_epc - ERROR - Exception with Juju Bootstrap: ['timeout', '-t', '3600', 'juju', 'bootstrap', 'abot-epc', 'abot-controller', '--metadata-source', '/root', '--constraints', 'mem=2G', '--bootstrap-series', 'xenial', '--config', 'network=7fbd4134-8a7c-46b3-8be5-448e6e7a188c', '--config', 'ssl-hostname-verification=false', '--config', 'use-floating-ip=true', '--config', 'use-default-secgroup=true', '--debug']
08:41:59 INFO juju.cmd supercommand.go:63 running juju [2.2.5 gc go1.9.4]
08:41:59 DEBUG juju.cmd supercommand.go:64 args: []string{"juju", "bootstrap", "abot-epc", "abot-controller", "--metadata-source", "/root", "--constraints", "mem=2G", "--bootstrap-series", "xenial", "--config", "network=7fbd4134-8a7c-46b3-8be5-448e6e7a188c", "--config", "ssl-hostname-verification=false", "--config", "use-floating-ip=true", "--config", "use-default-secgroup=true", "--debug"}
08:41:59 DEBUG juju.cmd.juju.commands bootstrap.go:804 authenticating with region "" and credential "abot-epc" ()
08:41:59 DEBUG juju.cmd.juju.commands bootstrap.go:932 provider attrs: map[use-floating-ip:true use-default-secgroup:true network:7fbd4134-8a7c-46b3-8be5-448e6e7a188c external-network:]
08:42:00 INFO cmd authkeys.go:114 Adding contents of "/root/.local/share/juju/ssh/juju_id_rsa.pub" to authorized-keys
08:42:00 DEBUG juju.cmd.juju.commands bootstrap.go:988 preparing controller with config: map[use-default-secgroup:true ssl-hostname-verification:false ftp-proxy: apt-http-proxy: max-status-history-age:336h http-proxy: type:openstack resource-tags: provisioner-harvest-mode:destroyed apt-no-proxy: logging-config: image-metadata-url: apt-mirror: image-stream:released https-proxy: update-status-hook-interval:5m agent-stream:released ignore-machine-addresses:false max-action-results-age:336h logforward-enabled:false firewall-mode:instance use-floating-ip:true automatically-retry-hooks:true apt-https-proxy: external-network: max-status-history-size:5G uuid:00a356a0-621b-42c3-8d62-48b2ef36e8f4 name:controller test-mode:false apt-ftp-proxy: default-series:xenial enable-os-upgrade:true proxy-ssh:false network:7fbd4134-8a7c-46b3-8be5-448e6e7a188c no-proxy:127.0.0.1,localhost,::1 enable-os-refresh-update:true development:false transmit-vendor-metrics:true max-action-results-size:5G authorized-keys:ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQCuEX7XaHTw2/NRfug0BtT5AXdU4UxBvFIEd+BG9nybYL6T5UfKL8AFLVPNTr8tBBkH03oSUwUYL12O5oZsY01OufDiPfXJtox2eXc0NaWWUkZekPfoZOXBBozf2Px9Pbb+nHKZ6SfHOHDMo9cs99bY1aGNyY2u+bEyrfKnIhShXpi/zIRnC+/2Z15c9r7NwuhqVdhQRvU7G8AwOflXwLHyajjpzwvsVWS92Gof7oWMSK3RGc5l5Eqbr+YiIkFWUu/aU49uM5DjeW9DE8DPUdczbhCfA2TGF5Xm3gc+TaGzczQRfU11DsVLk/tHpFMYFpbYp/nFMdekOGFUG6fmHteT juju-client-key disable-network-management:false net-bond-reconfigure-delay:17 agent-metadata-url:]
08:42:00 INFO juju.provider.openstack provider.go:144 opening model "controller"
08:42:00 DEBUG goose :1 auth details: &{Token:gAAAAABbB8xY2clIwJ5fCT4HzHhmnHZYXuP56lACgqPn0CRxo1fkqnwQ0TLc_5w8Vi1J_CbcSoIdg7nRMCBzDfGWE69uDSl6-ozQwCSSiQQ7xiAHhz7W73MKwteVm3FiVC0PtfOiRqCqeCHfGMEkptb6TZj3lGFe-a5gYm8MQNZGITS621V-7cU TenantId:2a2be3eeddc54804ac57eda85987d389 UserId:56cc843a75ef4d7b85a1bb3751d8132d Domain: RegionServiceURLs:map[regionOne:map[volumev3:http://172.30.9.26:8776/v3/2a2be3eeddc54804ac57eda85987d389 orchestration:http://172.30.9.26:8004/v1/2a2be3eeddc54804ac57eda85987d389 key-manager:http://172.30.9.26:9311 network:http://172.30.9.26:9696 metric:http://172.30.9.26:8041 volume:http://172.30.9.26:8776/v1/2a2be3eeddc54804ac57eda85987d389 placement:http://172.30.9.26:8778/placement volumev2:http://172.30.9.26:8776/v2/2a2be3eeddc54804ac57eda85987d389 image:http://172.30.9.26:9292 policy:http://172.30.9.26:1789 cloudformation:http://172.30.9.26:8000/v1 alarming:http://172.30.9.26:8042 compute:http://172.30.9.26:8774/v2.1 identity:http://172.30.9.26:5000]]}
08:42:01 INFO cmd bootstrap.go:482 Creating Juju controller "abot-controller" on abot-epc/regionOne
08:42:01 DEBUG goose :1 performing API version discovery for "http://172.30.9.26:8774/"
08:42:01 DEBUG goose :1 discovered API versions: [{Version:{major:2 minor:0} Links:[{Href:http://172.30.9.26:8774/v2/ Rel:self}] Status:SUPPORTED} {Version:{major:2 minor:1} Links:[{Href:http://172.30.9.26:8774/v2.1/ Rel:self}] Status:CURRENT}]
08:42:01 INFO juju.cmd.juju.commands bootstrap.go:540 combined bootstrap constraints: mem=2048M
08:42:01 DEBUG juju.environs.bootstrap bootstrap.go:199 model "controller" supports service/machine networks: true
08:42:01 DEBUG juju.environs.bootstrap bootstrap.go:201 network management by juju enabled: true
08:42:01 DEBUG juju.environs.bootstrap bootstrap.go:685 no agent directory found, using default agent binary metadata source: https://streams.canonical.com/juju/tools
08:42:01 DEBUG juju.environs.bootstrap bootstrap.go:710 setting default image metadata source: /root/images
08:42:01 DEBUG juju.environs imagemetadata.go:46 new user image datasource registered: bootstrap metadata
08:42:01 INFO juju.environs.bootstrap bootstrap.go:728 custom image metadata added to search path
08:42:01 INFO cmd bootstrap.go:233 Loading image metadata
08:42:01 DEBUG juju.environs imagemetadata.go:112 obtained image datasource "bootstrap metadata"
08:42:01 DEBUG juju.environs imagemetadata.go:112 obtained image datasource "default cloud images"
08:42:01 DEBUG juju.environs imagemetadata.go:112 obtained image datasource "default ubuntu cloud images"
08:42:01 DEBUG juju.environs.bootstrap bootstrap.go:576 constraints for image metadata lookup &{{{regionOne http://172.30.9.26:5000/v3} [win2016nano yakkety win7 raring genericlinux win8 win2016hv win10 win2008r2 quantal win81 precise trusty utopic zesty win2012hv win2016 saucy win2012r2 win2012 centos7 opensuseleap win2012hvr2 wily xenial vivid] [amd64 i386 armhf arm64 ppc64el s390x] released}}
08:42:01 DEBUG juju.environs.bootstrap bootstrap.go:588 found 2 image metadata in bootstrap metadata
08:42:03 DEBUG juju.environs.bootstrap bootstrap.go:588 found 0 image metadata in default cloud images
08:42:04 DEBUG juju.environs.simplestreams simplestreams.go:457 skipping index "http://cloud-images.ubuntu.com/releases/streams/v1/index.sjson" because of missing information: index file has no data for cloud {regionOne http://172.30.9.26:5000/v3} not found
08:42:04 DEBUG juju.environs.bootstrap bootstrap.go:584 ignoring image metadata in default ubuntu cloud images: index file has no data for cloud {regionOne http://172.30.9.26:5000/v3} not found
08:42:04 DEBUG juju.environs.bootstrap bootstrap.go:592 found 2 image metadata from all image data sources
08:42:04 INFO cmd bootstrap.go:296 Looking for packaged Juju agent version 2.2.5 for amd64
08:42:04 INFO juju.environs.bootstrap tools.go:72 looking for bootstrap agent binaries: version=2.2.5
08:42:04 DEBUG juju.environs.tools tools.go:101 finding agent binaries in stream "released"
08:42:04 DEBUG juju.environs.tools tools.go:103 reading agent binaries with major.minor version 2.2
08:42:04 DEBUG juju.environs.tools tools.go:111 filtering agent binaries by version: 2.2.5
08:42:04 DEBUG juju.environs.tools tools.go:114 filtering agent binaries by series: xenial
08:42:04 DEBUG juju.environs.tools tools.go:117 filtering agent binaries by architecture: amd64
08:42:04 DEBUG juju.environs.tools urls.go:109 trying datasource "keystone catalog"
08:42:05 DEBUG juju.environs.simplestreams simplestreams.go:683 using default candidate for content id "com.ubuntu.juju:released:tools" are {20161007 mirrors:1.0 content-download streams/v1/cpc-mirrors.sjson []}
08:42:08 INFO juju.environs.bootstrap tools.go:74 found 1 packaged agent binaries
08:42:08 INFO cmd bootstrap.go:357 Starting new instance for initial controller
Launching controller instance(s) on abot-epc/regionOne...
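At this point the bootstrap has resolved its image metadata from the /root simplestreams tree ("bootstrap metadata") and found a packaged 2.2.5 agent binary, and is about to launch the controller instance. For reference, the manual equivalent of the sequence Functest drives here is roughly the following; it is only a condensed restatement of the commands already visible in this log (option 1 from the generate-image output above), not an additional step the test performs.

# Generate simplestreams metadata for the xenial Glance image, then bootstrap against it.
juju metadata generate-image -d /root -i 75eedffa-bed4-42a1-aa7f-9284f0552c6c -s xenial -r regionOne -u http://172.30.9.26:5000/v3
juju bootstrap abot-epc abot-controller \
    --metadata-source /root \
    --constraints mem=2G \
    --bootstrap-series xenial \
    --config network=7fbd4134-8a7c-46b3-8be5-448e6e7a188c \
    --config ssl-hostname-verification=false \
    --config use-floating-ip=true \
    --config use-default-secgroup=true \
    --debug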
08:42:09 DEBUG juju.environs.instances image.go:64 instance constraints {region: regionOne, series: xenial, arches: [amd64], constraints: mem=2048M, storage: []}
08:42:09 DEBUG juju.environs.instances image.go:70 matching constraints {region: regionOne, series: xenial, arches: [amd64], constraints: mem=2048M, storage: []} against possible image metadata [{Id:75eedffa-bed4-42a1-aa7f-9284f0552c6c Arch:amd64 VirtType:}]
08:42:09 INFO juju.environs.instances image.go:106 find instance - using image with id: 75eedffa-bed4-42a1-aa7f-9284f0552c6c
08:42:09 DEBUG juju.cloudconfig.instancecfg instancecfg.go:832 Setting numa ctl preference to false
08:42:09 DEBUG juju.service discovery.go:63 discovered init system "systemd" from series "xenial"
08:42:09 DEBUG juju.provider.openstack provider.go:1010 openstack user data; 2485 bytes
08:42:09 DEBUG juju.provider.openstack provider.go:1022 using network id "7fbd4134-8a7c-46b3-8be5-448e6e7a188c"
08:42:09 DEBUG goose :1 performing API version discovery for "http://172.30.9.26:9696/"
08:42:09 DEBUG goose :1 discovered API versions: [{Version:{major:2 minor:0} Links:[{Href:http://172.30.9.26:9696/v2.0/ Rel:self}] Status:CURRENT}]
08:42:11 INFO juju.provider.openstack provider.go:1146 trying to build instance in availability zone "nova"
 - instance "e1719f3f-c9c8-428a-8557-e5aefd467978" has status BUILD, wait 10 seconds before retry, attempt 1
 - instance "e1719f3f-c9c8-428a-8557-e5aefd467978" has status BUILD, wait 10 seconds before retry, attempt 2
 - instance "e1719f3f-c9c8-428a-8557-e5aefd467978" has status BUILD, wait 10 seconds before retry, attempt 3
 - instance "e1719f3f-c9c8-428a-8557-e5aefd467978" has status BUILD, wait 10 seconds before retry, attempt 4
 - instance "e1719f3f-c9c8-428a-8557-e5aefd467978" has status BUILD, wait 10 seconds before retry, attempt 5
 - instance "e1719f3f-c9c8-428a-8557-e5aefd467978" has status BUILD, wait 10 seconds before retry, attempt 6
 - instance "e1719f3f-c9c8-428a-8557-e5aefd467978" has status BUILD, wait 10 seconds before retry, attempt 7
 - instance "e1719f3f-c9c8-428a-8557-e5aefd467978" has status BUILD, wait 10 seconds before retry, attempt 8
 - instance "e1719f3f-c9c8-428a-8557-e5aefd467978" has status BUILD, wait 10 seconds before retry, attempt 9
 - instance "e1719f3f-c9c8-428a-8557-e5aefd467978" has status BUILD, wait 10 seconds before retry, attempt 10
 - instance "e1719f3f-c9c8-428a-8557-e5aefd467978" has status BUILD, wait 10 seconds before retry, attempt 11
08:44:07 INFO juju.provider.openstack provider.go:1189 started instance "e1719f3f-c9c8-428a-8557-e5aefd467978"
08:44:07 DEBUG juju.provider.openstack provider.go:1193 allocating public IP address for openstack node
08:44:08 ERROR juju.cmd.juju.commands bootstrap.go:496 failed to bootstrap model: cannot start bootstrap instance: cannot allocate a public IP as needed: could not find an external network in availability zone []
08:44:08 DEBUG juju.cmd.juju.commands bootstrap.go:497 (error details: [{github.com/juju/juju/cmd/juju/commands/bootstrap.go:588: failed to bootstrap model} {github.com/juju/juju/provider/common/bootstrap.go:50: } {github.com/juju/juju/provider/common/bootstrap.go:185: cannot start bootstrap instance} {github.com/juju/juju/provider/openstack/provider.go:1195: cannot allocate a public IP as needed} {github.com/juju/juju/provider/openstack/networking.go:185: could not find an external network in availability zone []}])
08:44:08 DEBUG juju.cmd.juju.commands bootstrap.go:1095 cleaning up after failed bootstrap
08:44:08 INFO juju.provider.common destroy.go:20 destroying model "controller"
08:44:08 INFO juju.provider.common destroy.go:31 destroying instances
08:44:10 DEBUG juju.provider.openstack provider.go:1231 terminating instances [e1719f3f-c9c8-428a-8557-e5aefd467978]
08:44:10 DEBUG juju.provider.openstack firewaller.go:297 deleting security group "juju-b04edf44-d2d6-4302-8145-ca83871dc609-00a356a0-621b-42c3-8d62-48b2ef36e8f4-0"
08:44:13 INFO juju.provider.common destroy.go:51 destroying storage
08:44:13 DEBUG juju.provider.openstack cinder.go:81 volume URL: http://172.30.9.26:8776/v2/2a2be3eeddc54804ac57eda85987d389
08:44:13 DEBUG juju.provider.openstack firewaller.go:297 deleting security group "juju-b04edf44-d2d6-4302-8145-ca83871dc609-00a356a0-621b-42c3-8d62-48b2ef36e8f4"
08:44:14 INFO cmd supercommand.go:465 command finished
2018-05-25 08:44:14,668 - xtesting.core.testcase - INFO - The results were successfully pushed to DB http://testresults.opnfv.org/test/api/v1/results
2018-05-25 08:44:14,669 - xtesting.ci.run_tests - INFO - Test result:

+--------------------------+------------------+------------------+----------------+
| TEST CASE                | PROJECT          | DURATION         | RESULT         |
+--------------------------+------------------+------------------+----------------+
| juju_epc                 | functest         | 02:41            | FAIL           |
+--------------------------+------------------+------------------+----------------+

2018-05-25 08:44:14,750 - functest.opnfv_tests.vnf.epc.juju_epc - ERROR - Exception with Juju Cleanup: ['juju', 'debug-log', '--replay', '--no-tail']
ERROR No controllers registered.
Please either create a new controller using "juju bootstrap" or connect to another controller that you have been given access to using "juju register".
2018-05-25 08:44:14,750 - functest.opnfv_tests.vnf.epc.juju_epc - INFO - Remove the Abot_epc OS object ..
2018-05-25 08:44:14,750 - functest.core.vnf - INFO - Removing the VNF resources ..
2018-05-25 08:44:21,577 - xtesting.ci.run_tests - ERROR - The test case 'juju_epc' failed.
2018-05-25 08:44:21,578 - xtesting.ci.run_tests - INFO - Xtesting report:

+----------------------+------------------+--------------+------------------+----------------+
| TEST CASE            | PROJECT          | TIER         | DURATION         | RESULT         |
+----------------------+------------------+--------------+------------------+----------------+
| cloudify_ims         | functest         | vnf          | 31:27            | FAIL           |
| vyos_vrouter         | functest         | vnf          | 05:56            | FAIL           |
| juju_epc             | functest         | vnf          | 02:41            | FAIL           |
+----------------------+------------------+--------------+------------------+----------------+

2018-05-25 08:44:21,581 - xtesting.ci.run_tests - INFO - Execution exit value: Result.EX_ERROR
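For reference, the root cause of the juju_epc failure above ("could not find an external network in availability zone []", raised while allocating a floating IP for the controller) usually means that no Neutron network visible to the juju_epc project is flagged as external, or that Juju cannot tell which one to use; note that the provider attrs earlier in the log show external-network left empty. A hedged sketch of how an operator might check and work around this on such a deployment follows; the network name "external" is a placeholder, not something taken from this log.

# Is any network visible to the project flagged router:external?
openstack network list --external
# Mark an existing provider network as external (admin operation; "external" is a placeholder name) ...
openstack network set --external external
# ... or tell Juju explicitly which network to use for floating IPs when bootstrapping:
juju bootstrap abot-epc abot-controller --config external-network=external --config use-floating-ip=true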
2018-05-25 08:44:24,228 - xtesting.ci.run_tests - INFO - Deployment description:

+--------------------------------------+----------------------------------------------------------+
| ENV VAR                              | VALUE                                                    |
+--------------------------------------+----------------------------------------------------------+
| BUILD_TAG                            | jenkins-functest-apex-baremetal-daily-fraser-149         |
| ENERGY_RECORDER_API_URL              | http://energy.opnfv.fr/resources                         |
| ENERGY_RECORDER_API_PASSWORD         |                                                          |
| CI_LOOP                              | daily                                                    |
| TEST_DB_URL                          | http://testresults.opnfv.org/test/api/v1/results         |
| INSTALLER_TYPE                       | apex                                                     |
| DEPLOY_SCENARIO                      | os-ovn-nofeature-noha                                    |
| ENERGY_RECORDER_API_USER             |                                                          |
| NODE_NAME                            | lf-pod1                                                  |
+--------------------------------------+----------------------------------------------------------+

2018-05-25 08:44:24,233 - xtesting.ci.run_tests - INFO - Sourcing env file /var/lib/xtesting/conf/env_file

# Clear any old environment that may conflict.
for key in $( set | awk '{FS="="} /^OS_/ {print $1}' ); do unset $key ; done
export OS_USERNAME=admin
export OS_USER_DOMAIN_NAME=Default
export OS_PROJECT_DOMAIN_NAME=Default
export OS_BAREMETAL_API_VERSION=1.34
export NOVA_VERSION=1.1
export OS_PROJECT_NAME=admin
export OS_PASSWORD=EZhwZcgCD6CaJWGE7BDRjvxtq
export OS_NO_CACHE=True
export COMPUTE_API_VERSION=1.1
export no_proxy=,172.30.9.26,192.30.9.4
export OS_VOLUME_API_VERSION=3
export OS_CLOUDNAME=overcloud
export OS_AUTH_URL=http://172.30.9.26:5000/v3
export IRONIC_API_VERSION=1.34
export OS_IDENTITY_API_VERSION=3
export OS_IMAGE_API_VERSION=2
export OS_AUTH_TYPE=password
export PYTHONWARNINGS="ignore:Certificate has no, ignore:A true SSLContext object is not available"
# Add OS_CLOUDNAME to PS1
if [ -z "${CLOUDPROMPT_ENABLED:-}" ]; then
    export PS1=${PS1:-""}
    export PS1=\${OS_CLOUDNAME:+"(\$OS_CLOUDNAME)"}\ $PS1
    export CLOUDPROMPT_ENABLED=1
fi
export SDN_CONTROLLER_IP=192.30.9.4
export OS_REGION_NAME=regionOne

2018-05-25 08:44:24,233 - xtesting.ci.run_tests - DEBUG - Test args: all
2018-05-25 08:44:24,233 - xtesting.ci.run_tests - INFO - TESTS TO BE EXECUTED:

+---------------+---------------+-----------------+---------------------+-------------------+
| TIERS         | ORDER         | CI LOOP         | DESCRIPTION         | TESTCASES         |
+---------------+---------------+-----------------+---------------------+-------------------+
+---------------+---------------+-----------------+---------------------+-------------------+

2018-05-25 08:44:24,234 - xtesting.ci.run_tests - INFO - Xtesting report:

+-----------------------+-----------------+------------------+------------------+----------------+
| TEST CASE             | PROJECT         | TIER             | DURATION         | RESULT         |
+-----------------------+-----------------+------------------+------------------+----------------+
| parser-basics         | parser          | features         | 00:00            | SKIP           |
+-----------------------+-----------------+------------------+------------------+----------------+

2018-05-25 08:44:24,235 - xtesting.ci.run_tests - INFO - Execution exit value: Result.EX_OK