Aurore Penault
The document generation plugin offers four main features, split into two groups: on the one hand, the features for creating test reports, and on the other hand, the features for exchanging data between different projects.
The generated or exported data relate to the test plan, the campaigns and the requirements (if the requirements plugin is present).
The document generation plugin features are accessed through the « Tools » button, which can be found in the first three Salomé tabs: « Test management », « Campaign management » and « Data management ».
Then, a window with tabs pops up.
In this window, the only required field is the name of the target html file for the generation. In this first tab, the user can also choose to generate the test report as a multiframe document. By default, the report is produced as a basic html document, which is the preferred mode for printing. The multiframe mode makes navigation easier when browsing the html test report.
A second tab makes it possible to add some formatting to the test report. There are two options: either the formatting is produced automatically after filling in a form, or the user supplies her own formatting as html pages.
The form is the following:
The third tab of this window proposes several options. The first one selects the tests that will be included in the report. The next two options deal with graphics: the first one includes or excludes a summary graphic, and the second one selects a black-and-white or color mode with printing in mind. The last option includes the attached files.
The button "Test selection..." opens a window which enables to select tests. This window is made of two parts. The left part contains the whole set of tests in the test plan, and the right part contains the selected tests. For adding or suppressing tests in the selection, you select tests (or families, or suites) in one of those two parts, and you click on the corresponding arrow. "Shift" and "Ctrl" can be used for doing multiple selection.
As for the generation of the test report, the only required field is the name of the target file. In the main tab of this window, the user can choose whether to include the executions or the execution results in the report. An option generates the report in a multiframe html mode, which makes navigation easier.
The formatting tab is the same as the one for the test report generation. It defines a formatting either by filling in a form or by using the user's own html pages.
The last tab offers several options. A filtering option selects the campaigns and executions to integrate in the report. Another option selects the summary graphics to include in the report. Last, the user can choose to include the content of the attached files in the report, depending on the type of file.
It is then enough to name the XML file to which the data will be exported. The user has two options: either the whole set of data is exported, that is, the whole test plan and the campaigns included in the project; or the user chooses to export only the test plan, in which case she can select the tests to export.
The main tab defines the XML file from which data are imported. It also makes it possible to import only the test plan data; in that case, the user can also select the tests that will be imported.
The second tab proposes several options for the data import. The first two options deal with how to suppress the data originally in the project which are not present in the XML file. If the first option is checked, the data which are not present in the XML file are suppressed, except the actions of manual tests and the data related to those actions. If the second option is checked, all missing data are suppressed, including the actions of manual tests.
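As an illustration, this deletion rule can be sketched as follows. This is a minimal Python sketch with hypothetical names, not the plugin's actual code:

    def suppress_missing_data(project_items, imported_names, preserve_manual_actions):
        # project_items: list of (name, is_manual_action) pairs in the project.
        # imported_names: set of item names present in the XML file.
        kept = []
        for name, is_manual_action in project_items:
            if name in imported_names:
                kept.append((name, is_manual_action))  # present in the XML file: kept
            elif preserve_manual_actions and is_manual_action:
                kept.append((name, is_manual_action))  # first option: manual actions kept
            # otherwise the item is suppressed
            # (the second option suppresses everything missing)
        return kept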
When tests have been executed, some data can no longer be changed. In this case, the operation depends on the selected option: either the user keeps her project's data unchanged and the imported data are copied into the project under a name prefixed with "copy_"; or only the data which do not create a conflict are imported; or the user simply keeps her project's data unchanged.
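These options can be summarised with a similar sketch (again hypothetical names and modes, an illustration only; in this simplified view the last two options coincide in that the conflicting items are not imported):

    def apply_import(project, imported_items, mode):
        # project, imported_items: dicts mapping an item name to its data.
        # mode "copy": conflicting items are added under a "copy_" prefix.
        # mode "keep": conflicting items are simply not imported.
        for name, data in imported_items.items():
            conflict = name in project and project[name] != data
            if not conflict:
                project[name] = data            # no conflict: normal import
            elif mode == "copy":
                project["copy_" + name] = data  # the project's data stay unchanged
            # mode "keep": the conflicting item is ignored
        return project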
Before importing data, the user can select options that will be applied in case of conflict with the data already present in the current project:
For manual tests, if actions are deleted, the test parameters which are no longer used because of those deletions will also be deleted.
The deleted data are:
For the environments, the added or updated data are:
Data which need to be added or updated for test suites are their names, descriptions and, if they exist, their attachments.
For the tests themselves, we have two cases:
Before any data import, a check is made for conflicts. A conflict is detected when the campaign to be imported has the same name as a campaign of the project but contains new tests. If a conflict is detected, the import follows the options selected by the user.
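The conflict check itself could look like the following sketch (hypothetical names, assuming campaigns are represented as simple dicts):

    def campaign_conflicts(project_campaigns, imported_campaign):
        # A conflict: a campaign with the same name already exists in the
        # project, but the imported campaign contains tests it does not have.
        existing = project_campaigns.get(imported_campaign["name"])
        if existing is None:
            return False  # new campaign: no conflict
        new_tests = set(imported_campaign["tests"]) - set(existing["tests"])
        return len(new_tests) > 0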
For campaigns, the additions and updates concern the campaign description, its attachments, its datasets and the tests included in the campaign. For datasets, a check verifies that datasets bearing the same name are effectively the same: the valued parameters in the dataset must be the same in the project and in the XML file, and they must have the same values. If they differ, a new dataset is created with the same name as the original one, with the suffix "_Bis" added.
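The dataset check and the "_Bis" renaming can be sketched as follows (hypothetical names, an illustration only):

    def merge_dataset(project_datasets, imported_name, imported_params):
        # project_datasets: dict mapping a dataset name to its valued
        # parameters (themselves a dict mapping parameter name -> value).
        existing = project_datasets.get(imported_name)
        if existing is not None and existing != imported_params:
            # same name but different parameters or values: keep both
            project_datasets[imported_name + "_Bis"] = imported_params
        else:
            project_datasets[imported_name] = imported_params

The same "_Bis" pattern is applied to executions and execution results, as described in the next two paragraphs.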
When updating execution data, a check verifies that the executions which have the same name in the XML file and in the project are the same. A control is done on the dataset and on the values of each parameter. A check is also done on the environment and the execution scripts. If the executions differ, a new execution is created with the same name as the original execution, with the suffix "_Bis" added.
For execution results, since the tool proposes default names, it must be checked that the results are effectively the same. A check for differences is made on the test execution results, but also on the individual action results in the case of a manual test. As for datasets and executions, if the execution results differ, a new execution result is created with the suffix "_Bis" added.