OPNFV VIM Operations Compliance Test Plan

1. OPNFV VIM Operations Compliance Test Plan

1.1. Introduction

The Virtualized Infrastructure Management (VIM) operations compliance test plan outlines the method for testing VIM operations feature compliance with the OPNFV platform. VIM operations are exercised directly on the exposed VIM interfaces and verified through a combination of interface operations and parallel validations from the toolchain.

1.1.1. Scope

The tests comprising the VIM operations suite are intended to validate that interactions with the VIM result in the realization of the key functions expected and required to provide an operational VIM operations management function. This includes validating that the VIM operations sufficiently exercise the NFVI and that NFVI events are accurately exposed at the VIM.

These test suites are not intended to validate the operation of any functions instantiated on the NFVI, the methods of interaction between the VIM and the NFVI, or the operation of the VIM outside the scope of the stated tests.

1.2. Test suite scope and procedures

This test suite is designed to be executed from a staging node, referred to as the jump server, with operational/management access to the VIM control nodes and network access to the NFVI compute nodes. All tests are designed to interact with the system under test (SUT) via the control plane interfaces exposed by the VIM. Interaction with the NFVI layer is purely to validate that operations requested through the VIM exercise the NFVI as expected.

The test suites assume that the operator, or automated test, has the necessary authority to interact with and operate the VIM as required to perform the tasks outlined in each specific test case.

Each test case is structured such that its steps should be exercised in order from the first to the last; any operation required to remove configuration or to build on previous operations is described within the test case as a further step or operation of the overall test. Each test case should begin with the SUT in a stable and ready state, and once the test case is complete the system should have been restored to its previous state.

All test cases provide a description of the expected outcomes of the test, the methods of validating those outcomes and the method of recording the results for compliance validation. Logs and traces should be kept and maintained for the duration of the test activity and provided along with the documented test results for evaluation. Where test cases require multiple operations, the failure of any one operation is not necessarily indicative of the test case failing; the test case should be completed as documented for the results to be evaluated.

1.2.1. Using the OPNFV automated self test compliance suite

The dovetail tool provides an automated test suite able to perform VIM operations compliance testing as described in this document. The dovetail test tool should be run from the “jump host” provided for the staging of the test cases. The test tool will automatically source all required test artifacts, maintain the log of events and provide a secure report of the results of the tests as described in the OPNFV compliance and verification program (CVP) documentation [1].
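
A minimal sketch of invoking the tool from the jump host is shown below. The “--testsuite” option and the suite name “vimops” are assumptions based on the naming used in this document rather than a confirmed command line; consult the CVP documentation for the authoritative invocation and required configuration.

  # Hypothetical invocation of the dovetail CLI from the jump host.
  # The suite name "vimops" and the "--testsuite" flag are assumptions; see the
  # CVP documentation for the exact command line and configuration.
  import subprocess

  subprocess.run(["dovetail", "run", "--testsuite", "vimops"], check=True)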

1.2.2. Third party labs for execution of the test suite

If you are unable to establish a suitable SUT environment for the execution of the tests, you may contact the OPNFV CVP to identify whether a third party lab may be able to assist in the execution of the tests.

1.3. Test suite execution

All tests are to be executed from the staging node, or jump host, as described in the test specification. Tests should be run as documented, if possible with the automated dovetail test suite, so that the results can be accurately evaluated.

Details of the execution of each test case are provided in the test case specification document for the tests. It is expected that all test specifications are executed as part of the compliance activity; if you are unable to execute some parts of the test suite, please provide an explanation and motivation along with the test logs for troubleshooting.

[1] http://www.opnfv.org

2. VIM operations test design specification

This document outlines the approach and method for testing VIM operations in the OPNFV compliance test suite. It aims to provide a brief outline of the features to be tested, the methodology for testing, schemas and criteria.

2.1. Features to be tested

The VIM operations compliance test plan outlines the method for testing VIM operations compliance with the OPNFV platform behaviours and features of VIM operations enabled NFVI platforms. The VIM operations compliance test cases are described as follows:

2.1.1. Test Case 1: XXX

XXX
XXX

2.1.2. Test Case 6: Images v2 update

tempest.api.image.v2.test_images.BasicOperationsImagesTest.test_update_image
tempest.api.image.v2.test_images_tags.ImagesTagsTest.test_update_delete_tags_for_image
tempest.api.image.v2.test_images_tags_negative.ImagesTagsNegativeTest.test_update_tags_for_non_existing_image

2.1.3. Test Case XXX: XXX

XXX
XXX

2.2. Test approach for VIM operations

The most common approach for testing VIM operations capabilities in the test suite is through interaction with the SUT. In this instance the test framework will exercise the NBI provided by the VIM to configure the VIM, operate related features in the platform, instantiate workloads, and invoke behaviours in the platform. The suite may also interact directly with the SUT to exercise platform capabilities and further invoke helper functions on the platform for the same purpose.

2.3. Test result analysis

All functional tests in the VIM operations test suite will provide a pass/fail result on completion of the test. In addition, test logs and relevant supporting information will be provided as part of the test output, available on test suite completion.

Some tests in the compliance suite measure metrics such as latency and performance. At this time these tests are intended to provide a feature-based pass/fail result rather than a measure of system performance. These tests may, however, provide detailed results of performance and latency in the ‘test report’ document.

2.4. Test identification

TBD: We need to identify the test naming scheme to be used in Dovetail so that we can cross-reference to the test projects and maintain our suite effectively. This naming scheme needs to be externally relevant to non-OPNFV consumers, and as such some consideration is required in its selection.

template:

<dovetail><vimops><Images v2 update>
<dovetail>: the project name
<vimops>: the target test suite
<Images v2 update>: the specific use cases being tested
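
As an illustration of the template above, the sketch below assembles an identifier from the three components, assuming a dot-separated form consistent with IDs such as dovetail.vimops.tc006 used later in this document; the helper function is hypothetical, since the final scheme is still TBD.

  # Hypothetical helper illustrating the <project>.<suite>.<use case> scheme;
  # the final naming scheme is still to be decided (see the TBD note above).
  def make_test_id(project: str, suite: str, case: str) -> str:
      return ".".join([project, suite, case])

  print(make_test_id("dovetail", "vimops", "tc006"))  # -> dovetail.vimops.tc006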

2.5. Pass/Fail Criteria

For each specific use case, if the normal response code 200 is returned the test passes; otherwise the test fails with the corresponding error code. Refer to the Images API v2.0 [1].

All the specific use cases can be run independently. If all the specific use cases in the test suite pass, the test suite passes; otherwise, it fails.
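
A minimal sketch of this aggregation rule is shown below, assuming a hypothetical result structure in which each use case reports an independent verdict.

  # Hypothetical per-use-case results; the names mirror the Tempest tests above.
  results = {
      "update_image": True,
      "update_delete_tags_for_image": True,
      "update_tags_for_non_existing_image": False,
  }

  # The use cases run independently; the suite passes only if all of them pass.
  suite_passed = all(results.values())
  print("test suite:", "PASS" if suite_passed else "FAIL")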

[1] http://developer.openstack.org/api-ref/image/v2/

3. Dovetail VIM Operations tc006 specification - Images v2 update

3.1. Test Case Name: Images v2 update

3.2. Test Case ID: dovetail.vimops.tc006

3.3. Objective

Verify the image update tests using the Glance v2 API

3.4. Functionality

Function description or high-level test process

3.5. Test Item

3.5.1. Test item 1

Update an image by image_id, reference: tempest.api.image.v2.test_images.BasicOperationsImagesTest.test_update_image: {idempotent_id('f66891a7-a35c-41a8-b590-a065c2a1caa6')}

3.5.2. Test item 2

Update an image tag, reference: tempest.api.image.v2.test_images_tags.ImagesTagsTest.test_update_delete_tags_for_image: {idempotent_id('10407036-6059-4f95-a2cd-cbbbee7ed329')}

3.5.3. Test item 3

Update tag with non existing image, reference: tempest.api.image.v2.test_images_tags_negative.ImagesTagsNegativeTest.test_update_tags_for_non_existing_image: {idempotent_id('8cd30f82-6f9a-4c6e-8034-c1b51fba43d9')}

3.6. Environmental requirements

The environment can be deployed on bare metal or virtualized infrastructure. The deployment can be HA or non-HA.

3.7. Scenario dependencies

NA

3.8. Preconditions and Procedural requirements

NA

3.9. Input Specifications

The parameters needed to execute Images APIs. Refer to Images API v2.0 [1]

3.10. Output Specifications

The responses after executing Images APIs. Refer to Images API v2.0 [1]

3.11. Pass/Fail criteria

If the normal response code 200 is returned, the test case passes; otherwise the test case fails with the corresponding error code. Refer to the Images API v2.0 [1].

3.12. Test Reporting

The test report for this test case will be generated with links to relevant data sources. This section can be updated once we have a template for the report in place.

[1] http://developer.openstack.org/api-ref/image/v2/

4. VIM operations test procedure

Based on the “design specification” document, the VIM operations test suite includes <the number> specific test cases. For each test case and its test items, please refer to the document named “vimops.<>.specification”. The test procedure for each test case is shown below.

4.1. Test Case 1: VIM operations XXX

XXX

XXX

4.2. Test Case 6: Images v2 update

4.2.1. Test item 1: Update an image by image_id

  1. Create image
  2. Upload an image file by image id
  3. Update image by image id
  4. Verify updating by image id
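
For illustration only, the steps above roughly correspond to the following raw HTTP calls against the Images v2 (Glance) API. This is a minimal sketch using the Python requests library rather than the referenced Tempest test; the endpoint, token and image attributes are placeholder values.

  import requests

  # Placeholder values: substitute the Glance endpoint and a valid Keystone token.
  GLANCE = "http://controller:9292"
  HEADERS = {"X-Auth-Token": "<token>"}

  # 1. Create an image record (201 Created).
  image = requests.post(
      GLANCE + "/v2/images",
      headers={**HEADERS, "Content-Type": "application/json"},
      json={"name": "vimops-tc006", "container_format": "bare",
            "disk_format": "raw"},
  ).json()
  image_id = image["id"]

  # 2. Upload an image file by image id (204 No Content).
  requests.put(
      GLANCE + "/v2/images/" + image_id + "/file",
      headers={**HEADERS, "Content-Type": "application/octet-stream"},
      data=b"\0" * 1024,
  )

  # 3. Update the image by image id with a JSON-patch document (200 OK on success).
  resp = requests.patch(
      GLANCE + "/v2/images/" + image_id,
      headers={**HEADERS,
               "Content-Type": "application/openstack-images-v2.1-json-patch"},
      json=[{"op": "replace", "path": "/name", "value": "vimops-tc006-updated"}],
  )
  assert resp.status_code == 200

  # 4. Verify the update by fetching the image again.
  updated = requests.get(GLANCE + "/v2/images/" + image_id, headers=HEADERS).json()
  assert updated["name"] == "vimops-tc006-updated"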

4.2.2. Test item 2: Update an image tag

  1. Create image
  2. Create image tag and verify it
  3. Delete image tag and verify it
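
A sketch of the tag operations above, assuming an image has already been created as in the previous example and reusing the same placeholder endpoint and token; note that the tag calls return 204 No Content rather than 200.

  import requests

  # Placeholder values: substitute the Glance endpoint, token and an existing image id.
  GLANCE = "http://controller:9292"
  HEADERS = {"X-Auth-Token": "<token>"}
  image_id = "<existing-image-id>"
  tag = "vimops-tag"

  # 2. Create an image tag (204 No Content) and verify it appears on the image.
  assert requests.put(GLANCE + "/v2/images/" + image_id + "/tags/" + tag,
                      headers=HEADERS).status_code == 204
  assert tag in requests.get(GLANCE + "/v2/images/" + image_id,
                             headers=HEADERS).json()["tags"]

  # 3. Delete the image tag (204 No Content) and verify it has been removed.
  assert requests.delete(GLANCE + "/v2/images/" + image_id + "/tags/" + tag,
                         headers=HEADERS).status_code == 204
  assert tag not in requests.get(GLANCE + "/v2/images/" + image_id,
                                 headers=HEADERS).json()["tags"]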

4.2.3. Test item 3: Update tag with non existing image

  1. Create tag
  2. Update tag with non existing image and verify it
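
A sketch of the negative case, collapsed into a single request and assuming the same placeholder endpoint and token; the Images v2 API is expected to reject a tag update for an image id that does not exist with 404 Not Found.

  import requests

  # Placeholder values: substitute the Glance endpoint and a valid token.
  GLANCE = "http://controller:9292"
  HEADERS = {"X-Auth-Token": "<token>"}

  # Attempt to add a tag to a non-existing image id; expect 404 Not Found.
  bogus_image_id = "00000000-0000-0000-0000-000000000000"
  resp = requests.put(GLANCE + "/v2/images/" + bogus_image_id + "/tags/vimops-tag",
                      headers=HEADERS)
  assert resp.status_code == 404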

4.3. Test Case XXX: VIM operations XXX

XXX

XXX