The paper presents how an assessment system is implemented to evaluate the IT&C audit process quality. Theoretical and practical issues are presented, together with a brief presentation of the metrics and indicators developed in previous research. The implementation process of an indicator system is highlighted and linked to the specifications stated in international standards regarding the measurement process. Also, the effects of an assessment system on the IT&C audit process quality are emphasized to demonstrate the importance of such an assessment system. Improving the audit process quality is an iterative process consisting of repeated improvements based on objective measures derived from the analytical models of the indicators.
Keywords: Informatics Audit, Assessment, Indicator Implementation
1 Informatics audit assessment based on a metric system
In [1] - [9], theoretical and practical issues were presented for the following informatics audit topics:
* IT&C audit process;
* IT&C security audit;
* Distributed informatics system;
* Assessment framework development;
* Quantitative methods characteristics;
* Software development life cycle;
* Quality standards;
* Quality management;
* Project management.
In [1] and [10], quantitative methods for IT&C audit assessment are presented as analytical models, together with the interpretation of their output for a correct and reliable analysis. Briefly, the indicators and their characteristics are [1], [10]:
* UM - usability indicator;
* PM - portability indicator;
* FM - functionality indicator;
* CM - maintainability power;
* GCM - graph complexity indicator;
* MM - maintainability index;
* RM - reliability indicator;
* EM - efficiency indicator;
* ISG - indicator for assessing the meeting of goals; it takes values within [0; 1]; a value of 1 means that all goals are met, and a null value means that the audit team has not reached any goal (several of these ratio-style indicators are sketched in code after this list);
* IPM - indicator for evaluating milestone accomplishment; it takes values within [0; 1]; a value of 1 means that all milestones are accomplished on time, and a null value means that the audit team has not accomplished any milestone on time;
* ITMP - indicator for assessing the time overrun of the milestones; the calculated value expresses the degree to which the time allocated to the audit process must be increased to obtain the planned results; a null value of ITMP means that all milestones were accomplished on time and the audit process finished at the planned time;
* IWE - indicator for evaluating the audit team effort as working time to implement the activities planned in the audit program; a value of IWE less than 1 means a better use of working time or an overrating of the planned work volume; a value of IWE greater than 1 leads to an increase of the IPM indicator or indicates a poorly rated work volume within the audit team;
* IB - indicator for assessing the quality of funds spending; a value of IB less than 1 means an efficient use of the budget; if IB is greater than 1, then the costs are greater than the planned ones;
* ICS - indicator for assessing the satisfaction of the audit customer; it takes values within [0; 1]; when ICS takes the value 1, the audit team has the total support of the audit customer to accomplish the goals of the audit program;
* IRC - indicator for the coverage of standards requirements by the audit object: product, service, system or process; the possible values are within [0; 1]; a value of 1 means that the audit object meets all the requirements of the standards that the audit team included in the audit program;
* IAVE - indicator for assessing the added value average; it can take negative values or values greater than 0;
* ICAVE - a revised form of IAVE;
* ICAT - indicator for calculation of the added value average on each audit team member;
* IEAT - efficiency indicator of the audit team;
* IEMS - another form of the efficiency indicator of the audit team;
* WAT - productivity of the audit team.
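The full analytical models of these indicators are given in [1] and [10]. As an illustration only, the minimal sketch below shows how several of the ratio-style indicators above could be computed; the function names, inputs and example values are hypothetical, inferred from the verbal definitions rather than taken from the cited models.

```python
# Hedged sketch: plausible forms of four ratio-style indicators, inferred
# from their verbal definitions above. The exact analytical models are
# defined in [1] and [10]; names, inputs and values here are hypothetical.

def isg(goals_met: int, goals_total: int) -> float:
    """ISG in [0, 1]: share of audit goals met; 1 means all goals met."""
    return goals_met / goals_total if goals_total else 0.0

def ipm(milestones_on_time: int, milestones_total: int) -> float:
    """IPM in [0, 1]: share of milestones accomplished on time."""
    return milestones_on_time / milestones_total if milestones_total else 0.0

def iwe(actual_hours: float, planned_hours: float) -> float:
    """IWE: actual vs. planned working time; < 1 suggests better use of
    time (or an overrated plan), > 1 suggests a poorly rated work volume."""
    return actual_hours / planned_hours

def ib(actual_cost: float, planned_cost: float) -> float:
    """IB: actual vs. planned spending; > 1 means the budget was exceeded."""
    return actual_cost / planned_cost

# Example: 8 of 10 goals met, 5 of 6 milestones on time, a 5% effort
# overrun and a 10% cost overrun (all invented values).
print(isg(8, 10), ipm(5, 6), iwe(420, 400), ib(55_000, 50_000))
```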
The objectives of the assessment system include:
* Evaluating the effectiveness of the implemented controls in the IT&C audit process;
* Evaluating the effectiveness of the IT&C audit process;
* Increasing the performance of the IT&C audit process and audit teams;
* Providing input data for audit process management in decision making processes.
The above indicators compose an assessment system to evaluate the quality of the audit processes. The quality of the audit processes is evaluated from the following points of view [1]:
* Efficiency - evaluation of the results relative to the financial costs;
* Effectiveness - evaluation of the results relative to the planned goals.
The following issues are stated in [11] to highlight the requirements necessary to develop and implement an assessment system:
* Identifying the indicators based on a methodology;
* Specifying the goal of the assessment system;
* Indicator traceability back to the goals;
* Clear understanding of the type and purpose of each indicator;
* A small starting point for assessment;
* Indicators for detecting the trends and hidden tradeoffs;
* Customizing the indicator template;
* Use of definition checklist;
* Dissemination of unambiguous information;
* Privacy issues of the indicators;
* Respecting the needs of involved people;
* Identifying the adequate solutions available if there is no consensus;
* Using of pilot implementation;
* Planning some assessments in the short term;
* Maximizing the relevant information and minimizing the collection effort;
* Testing of the assumptions;
* Taking into account the unintended consequences and the perspectives of different stakeholders.
In accordance with the international standard ISO/IEC 27004, the measurement objectives of the audit process quality can take into consideration the following issues [12]:
* The role of audit quality in support of the organization's overall business activities and the risks it faces;
* Applicable legal, regulatory, and contractual requirements;
* Organizational structure;
* Costs and benefits of implementing information security measures;
* Risk acceptance criteria for the organization;
* A need to compare several audit processes within the same organization.
Also, the international standard ISO/IEC 27004 [12] establishes the need for a measurement programme. For audit process assessment, the evaluation of audit quality is systematized in an Audit Quality Measurement Programme - AQMP. The AQMP must include the following processes:
* Measures and measurement development;
* Measurement operation;
* Data analysis and measurement results reporting;
* Audit Quality Measurement Programme evaluation and improvement.
The AQMP is implemented on an audit quality measurement model that describes how attributes are quantified and converted into indicators. The indicators are the basis for decision making regarding the adjustments and improvements of the audit process.
2 Implementation process of the indicator system
In [13], a component of the measurement model is represented by the indicators plan, and this plan is depicted as an indicators process model.
The stages of the indicators process model are:
1. Establishing and validating the goal;
2. Creating the indicator plan;
3. Reviewing the indicator plan;
4. Implementation of the indicator plan.
Implementation of the indicator plan includes the following activities [13]:
* Collecting data - data are gathered from the available data sources: documents, primary metrics, data repositories, legal requirements and reports, and so forth;
* Validating data - establishing that the data are valid in terms of accuracy, availability, consistency, legality, reliability and timeliness;
* Analyzing data - the gathered data are analyzed, compiled and aggregated into indicators using the analytical models of the indicators; the results are interpreted to identify the causes of findings;
* Making decisions - deciding on resource allocation, on the improvements to be made and the order in which they are made, and on communicating the analysis conclusions to internal and/or external parties (a minimal pipeline sketch of these four activities follows this list).
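To make the four activities concrete, the sketch below strings them together as a minimal pipeline. The record layout, validation rules, IPM-like aggregation and decision threshold are illustrative assumptions, not the models from [13].

```python
# Minimal sketch of the four indicator plan activities (collect, validate,
# analyze, decide). All data, rules and thresholds are invented.

records = [  # collecting data: rows gathered from reports or repositories
    {"milestone": "fieldwork", "on_time": True},
    {"milestone": "reporting", "on_time": False},
    {"milestone": "follow-up", "on_time": True},
]

# validating data: keep only records with the expected fields and types
valid = [r for r in records
         if isinstance(r.get("on_time"), bool) and r.get("milestone")]

# analyzing data: aggregate validated records into an IPM-like indicator
ipm = sum(r["on_time"] for r in valid) / len(valid)

# making decisions: compare the indicator against a decision criterion
if ipm < 0.8:  # hypothetical threshold
    print(f"IPM = {ipm:.2f}: investigate milestone delays, adjust the plan")
else:
    print(f"IPM = {ipm:.2f}: milestones are on track")
```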
In [14], a selection process for a successful implementation of an assessment programme is depicted. The twelve steps of this selection process are grouped into three classes:
* Identifying the indicators' customers and selecting the indicators that meet the information needs;
* Designing and tailoring processes regarding definitions, models, counting criteria, benchmarks, reporting and additional qualifiers;
* Implementation issues as data collection and minimization of the impact of human factor on indicators.
The steps of the selection process are:
1. Identify indicators' customers - the indicator's customer is the person or people who will make decisions or take action based on the indicator value;
2. Target goals - the goals address strategic issues (audit as a process) and success factors (audit as a project);
3. Ask questions - defining questions whose answers ensure that each goal is accomplished;
4. Select indicators - the indicators are selected to provide the information needed to answer the questions;
5. Standardize definitions - use of standard definitions for the measured attributes;
6. Choose a measurement function - the way in which an indicator is calculated (a hypothetical indicator template covering these steps is sketched after this list);
7. Establish a measurement method - the measurement function is broken down into its lowest-level base measures;
8. Define decision criteria - addresses the indicator result and determines the need for action or further investigation;
9. Define reporting mechanisms - the way in which an indicator is reported;
10. Determine additional qualifiers - additional qualifiers provide various views of the indicator;
11. Collect data - a successful indicator program relies on a good data collection plan;
12. The people side of the indicator equation - successful implementation depends on the attitude of the people involved, since the assessment system alters human behavior.
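Steps 1-10 of this selection process essentially fill in an indicator specification. The sketch below shows one hypothetical way such a specification could be recorded; the field names and the example IB entry are assumptions for illustration, not a template taken from [14].

```python
# Hypothetical indicator specification covering steps 1-10 of the
# selection process. Field names and example values are illustrative only.
from dataclasses import dataclass

@dataclass
class IndicatorSpec:
    customer: str            # step 1: who acts on the indicator value
    goal: str                # step 2: strategic issue or success factor
    question: str            # step 3: what the goal needs answered
    name: str                # step 4: the selected indicator
    function: str            # step 6: how the indicator is calculated
    base_measures: tuple     # step 7: lowest-level base measures
    decision_criteria: str   # step 8: when to act or investigate
    reporting: str           # step 9: how and when it is reported

ib_spec = IndicatorSpec(
    customer="audit process manager",
    goal="keep the audit programme within budget",
    question="are actual costs in line with planned costs?",
    name="IB",
    function="actual cost / planned cost",
    base_measures=("actual cost", "planned cost"),
    decision_criteria="investigate if IB > 1.0",
    reporting="monthly, in the AQMP status report",
)
print(ib_spec.name, "-", ib_spec.decision_criteria)
```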
Another example of an indicator implementation process is depicted in [15]; it goes through the following stages:
1. Creating or updating the indicators;
2. Collecting data;
3. Storing data;
4. Analyzing and compiling data;
5. Reporting indicators;
6. Use of indicators.
Recommendations to fulfill the measurement requirements are presented in the international standard ISO/IEC 27004 [12], as follows:
* Developing measures as base measures, derived measures and indicators;
* Implementing and operating a Measurement Programme; for audit process quality measurement, an Audit Quality Measurement Programme is built;
* Collecting and analyzing data;
* Developing measurement results;
* Communicating results to the relevant stakeholders;
* Using results as contributing factors in decisions made by those responsible for audit quality;
* Using results to identify needs for improving the implemented audit quality process;
* Facilitating continual improvement of the Audit Quality Measurement Programme.
The above recommendations are taken from ISO/IEC 27004 and adapted to the audit quality process.
An Audit Quality Measurement Programme can fail. The following factors must be considered in order to achieve a successful implementation of a measurement programme [16]:
* Measurement must have a goal - indicators without a practical application for the measurement programme must be eliminated;
* Analysis of the obtained values for the indicators - indicators must be related to objectives and performance;
* Target setting process - history and experience must be considered in addition to numbers;
* Communication and collaboration - a separation can arise between analysis activities and engineering activities; this separation is not recommended.
In [17], the measurement is defined as "the process by which numbers or symbols are assigned to attributes of entities in the real world in such a way as to describe them according to clearly defined rules".
An entity is represented by a person, place, thing, event or time period.
An attribute is represented by a feature or property of the entity.
Measurement is based on an entity, so the entity must be identified first. For an audit quality assessment system, the entity is a time period, so the audit process is evaluated.
The audit quality measurement model emphasizes how the attributes are quantified and converted into indicators. Decisions are made based on these indicators.
The implementation process of an indicator system can be evaluated based on [12]:
* Implementation of Audit Quality Measurement Programme;
* Measurement specifications;
* Performing the measurement activities according to the calendar;
* Collected data and analysis records;
* Reporting the measurement results to the management or to the relevant stakeholders.
Techniques for implementing indicator systems address the ways in which the values of the measurement results are obtained from the input data.
The Audit Quality Measurement Programme succeeds when there are controls over the audit process and the audit staff. To verify that the audit process is of high quality, indicators are implemented to provide quantitative information on the audit process quality.
3 Quantitative methods as a basis for audit process improvement
Carnegie Mellon Software Engineering Institute - SEI [18] defines process performance models as "a description of relationships among attributes of a process and its work products that is developed from historical process-performance data and calibrated using collected process and product or service measures from the project and that are used to predict results by following a process."
The process overview for an integrated measurement programme to control and improve an audit quality process is depicted in Figure 1 [16].
The audit quality control and improvement process has several repetitive stages [16]:
* Set objectives for products and process;
* Forecast and develop plans both for projects and for departments;
* Compare actual metrics with original objectives;
* Communicate metrics and metrics analyses;
* Coordinate and implement plans;
* Understand and agree to commitments and their changes;
* Motivate people to accomplish plans;
* Measure achievement in projects and budget centers;
* Predict the development direction of process and product relative to goals and control limits;
* Identify and analyze potential risks;
* Evaluate project and process performance;
* Investigate significant deviations;
* Determine if the project is under control and whether the plan is still valid;
* Identify corrective actions and reward/penalize performance;
* Implement corrective actions.
The indicators included in a measurement programme must satisfy quality requirements as presented in [16]:
* Sustainability - it refers to the validity and availability, over a period of time, of the indicators included in the measurement programme;
* Timeliness - the considered indicators must be available when they are used; otherwise, the results are incomplete and the decisions are inconsistent;
* Meaning - the indicators should provide exactly the information asked for; also, they should be suitable for aggregation in decision-making processes;
* Goal-oriented - the selected indicators must address concrete objectives; they are developed and reported only if there is a specific need and a decision-making process based on them;
* Balancing - the goals and indicators must not be treated separately, because there are hidden spots that must be considered for a successful measurement programme implementation.
Development and implementation of an Audit Quality Measurement Programme over a period of time can lead to a large amount of data stored in specialized databases. The results of the indicators applied in previous audit processes can be used to improve the IT&C audit process, and thus to achieve a high level of quality in the organizations that implement audit programmes.
To achieve the above goal, statistical methods and other modern tools are used; data mining belongs to the modern tools category. Data mining is an extension of statistical analysis techniques, incorporating new techniques from statistics, artificial intelligence and database management, together with increased computing power. To apply data mining to audit process improvement, some success factors must be considered [19]:
* Knowledge of audit domain;
* Collection and preparation of good data;
* Data analysis;
* Right questions to ask.
Data mining as process improvement technique is used in the following forms [19]:
* Classification;
* Regression;
* Clustering;
* Association.
Classification - is made based on variables selected to form classes. The classes are grouped on levels, and the resulting structure has the form of a tree.
The values of the variables used to build the tree are criteria for classifying future audit processes. This classification is a prediction, and the goal of the analysis is to see whether there are key factors that can be used for such a classification.
In this way, decisions can be taken to improve characteristics of an IT&C audit process such as productivity, budget, audit plan coverage and so forth.
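As an illustration of classification applied to audit data, the sketch below trains a small decision tree that predicts whether an audit process stays within budget from two hypothetical attributes (team size and number of milestones). It assumes scikit-learn is available; the features and data are invented, not taken from [19].

```python
# Hedged sketch: a decision tree classifying audit processes, assuming
# scikit-learn is installed. Features and data are invented for illustration.
from sklearn.tree import DecisionTreeClassifier, export_text

# Each row: [team size, number of milestones]; label: 1 = within budget
X = [[3, 5], [4, 6], [8, 12], [9, 14], [5, 7], [10, 15]]
y = [1, 1, 0, 0, 1, 0]

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

# The learned split values become criteria for classifying future audits
print(export_text(tree, feature_names=["team_size", "milestones"]))
print(tree.predict([[6, 8]]))  # predict the class of a new audit process
```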
Regression - is used to find predictive factors. It is a good data mining model when the independent factors are assumed to be correlated. The variables are introduced one by one into the regression model, while the variables already included are tested for removal.
The best model is built after a certain number of steps in which different factor combinations are tested. Applying the model displays the factors in the equation, the standard errors and the confidence intervals.
In addition to regression, correlation is used to identify candidate variables. The following correlation types are used (sketched below):
* Pearson - ratio data;
* Kendall's Tau-B - ordinal data;
* Chi square test - categorical data.
Also, non-linear correlation can appear between the variables; in this case, the function is a nonlinear combination of the model parameters.
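The three correlation types map directly onto standard statistical tests; a minimal sketch using SciPy follows. The audit-process data are invented for illustration.

```python
# Hedged sketch of the three correlation types, assuming SciPy is installed.
# All audit-process data below are invented.
from scipy.stats import pearsonr, kendalltau, chi2_contingency

effort = [100, 120, 150, 170, 200]          # ratio data: working hours
cost = [50, 58, 75, 88, 100]                # ratio data: thousands of units
r, p = pearsonr(effort, cost)               # Pearson, for ratio data
print(f"Pearson r = {r:.2f} (p = {p:.3f})")

risk_rank = [1, 2, 3, 4, 5]                 # ordinal data: risk ranking
delay_rank = [1, 3, 2, 4, 5]                # ordinal data: delay ranking
tau, p = kendalltau(risk_rank, delay_rank)  # Kendall's tau-b, ordinal data
print(f"Kendall tau = {tau:.2f} (p = {p:.3f})")

# Chi-square, for categorical data: audit type vs. on-time completion
table = [[12, 8],    # internal audits: on time / late
         [5, 15]]    # external audits: on time / late
chi2, p, dof, _ = chi2_contingency(table)
print(f"Chi-square = {chi2:.2f} (p = {p:.3f}, dof = {dof})")
```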
Clustering - is used to detect groups in data. Data series are segmented for further analysis. There are the following methods (a K-Means sketch follows this list):
* K-Means - an iterative process from an initial set of cluster centers to a final set of centers; the nearest mean is the criterion for grouping an observation: each object is assigned to the cluster whose center is nearest to the object;
* Hierarchical clustering - starts from the pair of objects that most resemble each other and adds new objects until they are all in one cluster; it creates a hierarchy of clusters represented as a tree structure; the root of the tree represents all objects, and the leaves are individual objects.
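A minimal K-Means sketch follows, grouping hypothetical audit processes by effort and cost. The data and the choice of two clusters are assumptions for illustration, and scikit-learn is assumed available.

```python
# Hedged sketch: K-Means clustering of audit processes, assuming
# scikit-learn is installed. Data and k=2 are illustrative assumptions.
from sklearn.cluster import KMeans

# Each row: [audit effort in hours, audit cost in thousands of units]
X = [[100, 50], [110, 55], [105, 52],     # smaller audits
     [300, 160], [320, 170], [310, 165]]  # larger audits

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

# Each observation is assigned to the cluster whose center is nearest
print("labels:", km.labels_)
print("centers:", km.cluster_centers_)
```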
Association - considers relationships among large numbers of variables by grouping them; it is used for relationships that are not necessarily causal (a support/confidence sketch is given below).
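Association rules are usually summarized by support and confidence; the sketch below computes both for one hypothetical rule over invented audit findings, in plain Python.

```python
# Hedged sketch: support and confidence of one association rule over
# invented audit data. Rule: "weak access controls => audit finding".
audits = [
    {"weak_access_controls", "finding"},
    {"weak_access_controls", "finding"},
    {"weak_access_controls"},
    {"finding"},
    set(),
]

both = sum(1 for a in audits if {"weak_access_controls", "finding"} <= a)
antecedent = sum(1 for a in audits if "weak_access_controls" in a)

support = both / len(audits)    # how often both items co-occur overall
confidence = both / antecedent  # P(finding | weak access controls)
print(f"support = {support:.2f}, confidence = {confidence:.2f}")
```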
Data mining facilitates the extraction of patterns from the large data sets resulting from previous implementations of measurement programmes.
Valid judgments on measurement models are made if the historical data sets are complete, correct, consistent, timely and detailed, and if they were aggregated into key performance indicators. The quality of the collected and stored data is critical for the confidence and granularity of the estimations and for the success of the audit measurement model implementation.
Process improvement aims at the optimized use of techniques and tools for better execution of the procedures, resource management and the other elements that transform inputs into outputs. To improve a process means to learn the causes of, and solutions to, the problems that appear during the process. This continuous learning increases the process managers' experience in selecting the appropriate techniques and tools for a better process.
As an effect, the application of specific techniques and tools becomes a rule, and rules then become standards within the process field. In [20], a basic process improvement model is presented. The model is structured in two parts:
* Selecting the process and establishing the improvement objective - the first 7 steps;
* The PDCA cycle - the last 7 steps of the model; PDCA means Plan-Do-Check-Act.
The steps of the model are [20]:
Step 1. Selecting the process and establishing the process improvement objective - the following activities are included: establishing what is important for the customer, starting with a simple process that can be observed and documented, establishing the starting and stopping points, identifying the factors or problems to be investigated, selecting processes that perform poorly or offer a high payback, eliminating processes that are primarily controlled or constrained, assigning a single team to each process improvement, listening to internal and external customers, identifying the problems associated with the process, and defining the process clearly;
Step 2. Organizing the "right" team - the following activities must be approached: establishing who works inside the boundaries of the process, selecting team members whose knowledge covers all the steps, choosing the team leader, and writing the charter, which includes the process to be improved, the process improvement objective, the team leader, the team members, the team constraints, the team's decision-making authority, the resources to be provided and the reporting requirements; this step also includes the ground rules required for the team, guidelines for team meetings and team training;
Step 3. Developing the process flowchart - aims at observing the work flow, depicting what happens inside the process and talking to people from other departments or organizational levels;
Step 4. Simplifying the process and making changes - activities like identifying redundant steps or decision points, unnecessary inspections and old procedures are carried out to identify the resource-wasting spots; the changes aim at eliminating these spots and updating the flowchart;
Step 5. Developing the data collection plan and collecting the baseline data - requires a more scientific approach, relying on statistical data; the team must identify the characteristic of the product or service to be changed in order to accomplish the process improvement objective; also, establishing measurement points on the flowchart, creating a data collection form and training the data collectors must be carried out in this stage of the process improvement;
Step 6. Establishing the process stability - the baseline data are analyzed using tools such as the control chart and the run chart; the control chart is used to establish whether a process is stable and whether its future performance is predictable; both kinds of charts are used to identify special cause variation in the process (a minimal control-limit sketch is given after this list);
Step 7. Establishing the process capability - based on the same data as step 5; capability is determined on a bar graph called a histogram; the team establishes the capability using the histogram built from the collected data and the target value from the process improvement objective;
Step 8. Identifying the root causes for the lack of capability - the PDCA cycle is started; the root causes are identified on a Cause-and-Effect Diagram and verified against the collected data; the relative importance of the root causes is highlighted by a Pareto chart;
Step 9. Planning to implement the process change - the Plan phase of the PDCA cycle starts; for the possible root causes identified in step 8, a plan is made to change the process in order to remove or reduce their effects; the simplified flowchart is changed; changes are made after obtaining permission from the authorizing authority;
Step 10. Modifying the data collection plan - reviewing the data collection plan developed in step 5 to establish whether it is valid for measuring the changed process; if necessary, the data collection plan is modified to assess the performance of the changed process;
Step 11. Testing the change and collecting data - represents the Do stage of the PDCA cycle; before testing, some activities must be carried out: planning the test of the changed process, training the participants and standardizing the process, and distributing the data collection sheets; after testing the changed process, the data collection sheets are retained and collated;
Step 12. Establishing the stability of the changed process - the team checks whether the expected results were achieved; the analysis is made on the data collection sheets resulting from step 11; the procedures are identical to those in step 6; a control chart or run chart is used to establish the stability of the new process; if any stability rule is broken, the process improvement returns to the earlier state in step 9;
Step 13. Establishing whether the process was improved - completes the Check phase of the PDCA cycle; the procedures are similar to those in step 7; the team can identify any difference between the planned process improvement and what was obtained; if the expected results were not achieved, the team must investigate the data to find what did not work and to see what other changes can be implemented in the initial process; also, questions regarding the planned improvements and data collection are asked within the process improvement team;
Step 14. Standardization and a lower frequency of data collection - represents the Act stage of the PDCA cycle; standardization can be made if the process is stable and capable, the customers are satisfied and the team has authorization; there are two approaches: standardizing the changed process or identifying further process changes.
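Steps 6 and 12 both rely on a control chart to judge stability. A minimal sketch of a simplified stability check follows (control limits at the mean plus or minus three standard deviations of the baseline data). The data are invented, and real control charts estimate limits from moving ranges and apply additional run rules beyond this single out-of-limits test.

```python
# Hedged sketch: simplified control limits for the stability checks in
# steps 6 and 12. Baseline and new data are invented; real control charts
# use moving-range limit estimates and additional run rules.
from statistics import mean, stdev

# Baseline data from step 5 (invented): task cycle times in days
baseline = [12.0, 11.5, 12.3, 11.8, 12.1, 12.0, 11.7, 12.2]
center = mean(baseline)
sigma = stdev(baseline)
ucl, lcl = center + 3 * sigma, center - 3 * sigma  # control limits

# New observations from the changed process (step 12, invented)
new_points = [11.9, 12.2, 13.4, 11.8]
print(f"center = {center:.2f}, LCL = {lcl:.2f}, UCL = {ucl:.2f}")
for i, x in enumerate(new_points, start=1):
    status = "stable" if lcl <= x <= ucl else "rule broken -> return to step 9"
    print(f"point {i}: {x:5.1f}  {status}")
```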
The audit process quality depends on the indicator system implementation and on historical information. Also, the audit process quality has a cost, which can be reduced by using the historical data from previous audit process quality measurements.
The audit process is improved and its quality increases as the indicator system is widely used and relies on high-quality data.
4 Conclusions
Quantitative process management in IT&C audit processes is built on audit process customization, audit measurement, historical data and indicators, together with their statistical control.
Quantitative process management in informatics audit processes consists of:
* Establishing the goals for audit process performance;
* Analyzing the result indicators;
* Implementing process adjustments to keep the process performance within acceptable limits.
To achieve the above quantitative management activities, the management staff must obtain process performance data from the audit processes developed within or by the organization. After that, the management has to use the data to establish the process capability. Future processes are variable, but the management will understand the variation and can predict the future audit processes.
Selecting the appropriate techniques and methods to implement indicator systems for audit process assessment is very important for obtaining high-quality data about the audit process. The data are stored in databases and computer systems, become historical data for the next audit process assessments, and are critical for the statistical analysis of future audit processes.
Acknowledgment
This work was supported by CNCSIS - UEFISCSU, project number PNII - IDEI 1838/2008, contract no. 923/2009, titled Implementation of the Quantitative Methods in Distributed Informatics System Audit, financed by the National University Research Council - Ministry of Education, Research, Youth and Sports, Romania.
References
[1] M. Popa, C. Toma and S. Capisizu, "Quantitative Methods Development for IT&C Audit Processes", Economic Computation and Economic Cybernetics Studies and Research, paper submitted to be published, 2010.
[2] M. Popa, "Audit Process during Projects for Development of New Mobile IT Applications", Informatica Economica, vol. 14, no. 3(55), 2010, pp. 34 - 46.
[3] M. Popa and S. Capisizu, "Using Quantitative Methods as Support for Audit of the Distributed Informatics Systems", Informatica Economica, vol. 14, no. 1(53), 2010, pp. 103 - 112.
[4] M. Popa, "Quality Management of the IT Security Audit", in Proc. The 3rd International Conference on Security for Information Technology and Communications, 2010, ASE Publishing House, Bucharest, pp. 223 - 232.
[5] M. Popa and C. Toma, "Stages for Development the Audit Processes of the Distributed Informatics Systems", Journal of Applied Quantitative Methods, vol. 4, no. 3, 2009, pp. 359 - 371.
[6] M. Popa, "Characteristics for Development of an Assessment System for Security Audit Processes", Economy Informatics, vol. 9, no. 1, 2009, pp. 55 - 62.
[7] M. Popa, C. Toma and C. Amancei, "Characteristics of the Audit Processes for Distributed Informatics Systems", Informatica Economica, vol. 13, no. 3(51), 2009, pp. 165 - 178.
[8] M. Popa, "Requirements for Development of an Assessment System for IT&C Security Audit", Journal of Information Technology & Communication Security, 2009, ASE Publishing House, Bucharest, pp. 221 - 230.
[9] M. Popa and A. Paraschiv, "Premises for Development of an Assessment System for Security Audit of Distributed Information Systems", in Proc. The Ninth International Conference on Informatics in Economy - Section 7: Informatics Security, 2009, ASE Publishing House, Bucharest, pp. 827 - 832.
[10] M. Doinea, "Life Cycle Based Audit Process for Distributed Applications", Journal of Information Systems & Operations Management, vol. 4, no. 2, 2010, pp. 137 - 146.
[11] W. Goethert and W. Hayes, Experiences in Implementing Measurement Programs, Software Engineering Measurement and Analysis Initiative, Carnegie Mellon University, Technical Note, 2001.
[12] International Standard ISO/IEC 27004, Information technology - Security techniques - Information security management - Measurement, 2009.
[13] T. Augustine and C. Schroeder, "An Effective Metrics Process Model", The Journal of Defense Software Engineering, vol. 12, no. 6, 1999, pp. 4 - 7.
[14] L. Westfall, 12 Steps to Useful Software Metrics, The Westfall Team, 2005.
[15] N. Bartol and B. A. Hamilton, Practical Measurement Framework for Software Assurance and Information Security, National Institute of Standards and Technology Software Assurance (SwA), 2008.
[16] C. Ebert, R. Dumke, M. Bundschuh and A. Schmietendorf, Best Practices in Software Measurement, Springer-Verlag, 2005.
[17] N. Fenton, Software Metrics, A Rigorous Approach, Chapman & Hall, London, 1991.
[18] http://www.sei.cmu.edu
[19] P. Below, "Data Mining for Process Improvement", CrossTalk - The Journal of Defense Software Engineering, vol. 24, no. 1, 2011, pp. 10 - 14.
[20] Balanced Scorecard Institute, Handbook for Basic Process Improvement, May 1996.
Marius POPA, PhD
Department of Computer Science in Economics
Academy of Economic Studies, Bucharest, Romania
Marius POPA graduated from the Faculty of Cybernetics, Statistics and Economic Informatics in 2002. He holds a PhD in Economic Cybernetics and Statistics. He joined the staff of the Academy of Economic Studies as a teaching assistant in 2002 and became a university lecturer in 2006. Currently, he is a university lecturer in the Economic Informatics field within the Department of Computer Science in Economics, Faculty of Cybernetics, Statistics and Economic Informatics, Academy of Economic Studies. He is the author and co-author of 6 books and over 100 articles in journals and in the proceedings of national and international conferences, symposiums and workshops in the fields of data quality, software quality, informatics security, collaborative information systems, IT project management and software engineering. He was involved in 14 national research projects as a team member and in 1 national research project as project manager. Currently, he is project manager in a national research project supported by the National University Research Council of Romania and a team member in a European research project. Since 2009, he has been a member of the editorial team of the Informatica Economica Journal, and between 2003 and 2008 he was a member of the editorial team of the journal Economic Computation and Economic Cybernetics Studies and Research.