A Dynamic Integrated Framework for Software Process Improvement

Software Quality Journal, 10, 181–194, 2002
© 2002 Kluwer Academic Publishers. Manufactured in The Netherlands.

MERCEDES RUIZ [email protected]
Department of Computer Languages and Systems, Escuela Superior de Ingeniería, University of Cádiz, Spain

ISABEL RAMOS AND MIGUEL TORO {isabel.ramos, mtoro}@lsi.us.es
Department of Computer Languages and Systems, Escuela Técnica Superior de Ingeniería Informática, University of Seville, Spain

Abstract. Current software process models (CMM, SPICE, etc.) strongly recommend the application of statistical control and measure guides to define, implement, and evaluate the effects of different process improvements. However, whilst quantitative modeling has been widely used in other fields, it has not been considered enough in the field of software process improvement. During the last decade software process simulation has been used to address a wide diversity of management problems. Some of these problems are related to strategic management, technology adoption, understanding, training and learning, and risk management, among others. In this work a dynamic integrated framework for software process improvement is presented. This framework combines traditional estimation models with an intensive utilization of dynamic simulation models of the software process. The aim of this framework is to support a qualitative and quantitative assessment for software process improvement and decision making to achieve a higher software development process capability according to the Capability Maturity Model. The concepts underlying this framework have been implemented in a software process improvement tool that has been used in a local software organization. The results obtained and the lessons learned are also presented in this paper.

Keywords: software process modelling and simulation, process improvement, process maturity, dynamic models
1. Introduction

Over the past few decades software complexity has significantly increased in such a way that software has replaced hardware as having the principal responsibility for much of the functionality provided by current systems. This increasing role of software, the problems related to cost and schedule overruns, and the customer perception of low product quality have changed the focus of attention towards the maturity of software development practices. Although the software industry has received significant help by means of Computer Aided Software Engineering (CASE) tools, new programming languages and approaches, and more advanced and complex machines, there is a lack of process analysis tools for organizations interested in improving their process performance.

Dynamic modeling and simulation as process improvement tools have been intensively used in the manufacturing area. Currently, software process modeling and simulation are gaining an increasing interest among researchers and practitioners as an approach to analyze complex business and solve policy questions.

In previous work (Ruiz, Ramos, and Toro, 2001) we attempted to apply a complex dynamic model to support process improvement in a local software development organization. The non-existence of a historical database and of measurement practices inside this organization made it impossible, as there were no numerical drivers to supply the model parameters and functions. Then, we decided to apply Eberlein's work (Eberlein, 1989) about understanding and simplification of models to obtain a reduced dynamic model capable of reproducing the software process dynamics, yet with less initial information required. But simulation is only effective if both the model and the data used to drive it accurately reflect the real world. Thus, the construction of the model itself points to what metric data must be collected and helps as a clear guideline on what to collect.
In this paper an approach is proposed that combines traditional estimation techniques with System Dynamics modeling. The aim of this combination is to build a framework to support a qualitative and quantitative assessment for software process improvement and decision making. The purpose of this dynamic framework is to help organizations to achieve a higher software development process capability according to the Capability Maturity Model (Paulk et al., 1993). The dynamic models built inside this framework provide the capability of gaining insight over the whole life cycle at different levels of abstraction. The level of abstraction used in a certain organization will depend on its maturity level. For instance, in a level 1 organization the simulator can establish a baseline according to traditional estimation models from an initial estimate of the size of the project. With this baseline, the software manager can analyze the results obtained with the simulation of different process improvements and study the outcomes of over- or underestimating cost or schedule. During the simulation metric data are saved. These data conform to the SEI core measures recommendation (Carleton et al., 1992), and are mainly related to cost, schedule and quality.

The structure of the paper is as follows. Section 2 provides a brief overview of the work conducted in the field of software process simulation. In Section 3, the justification found to develop the integrated framework is presented. Section 4 describes in detail the fundamental basis and structure of this framework. The implementation and results obtained when applying it inside a local organization are discussed in Section 5. Finally, Section 6 summarizes the paper and draws the conclusions and lessons learnt.

2. Software process simulation

Simulation can be applied in many critical areas in support of software engineering.
It enables one to address issues before these issues become problems. Simulation is more than just a technology, as it forces one to think in global terms about system behavior, and about the fact that systems are more than the sum of their components (Christie, 1999). A simulation model is a computational model that represents an abstraction or a simplified representation of a complex dynamic system. Simulation models offer, as a main advantage, the possibility of experimenting with different management decisions. Thus, it becomes possible to analyze the effect of those decisions in systems where the cost or risks of experimentation make it unfeasible. Another important factor is that simulation provides insights into complex process behavior which is not possible to analyze by means of stochastic models. Like many processes, software processes can contain multiple feedback loops, such as those associated with the correction of defects. Delays resulting from these defects may range from minutes to years. The resulting complexity makes it almost impossible for mental analysis to predict the consequences. The most frequent sources of complexity in real software processes are:

• Uncertainty. Some real processes are characterized by a high degree of uncertainty. Simulation models make it possible to deal with this uncertainty as they can represent it flexibly by means of parameters and functions.
• Dynamic behavior. Some processes may have a time dependent behavior. There is no doubt that some software process variables vary their behavior as the time cycle progresses. With a simulation model it is possible to represent and formalize the structures and causal relationships that dictate the dynamic behavior of the system.
• Feedback. In some systems the result of a decision made in a certain moment can affect their future behavior. For example, in software projects the decision of reducing the effort assigned to quality assurance activities has different effects over the whole progress of these projects.

Thus, the common objectives of simulation models consist of supplying mechanisms to experiment, predict, learn, and answer questions such as: What if?

A software process simulation model can be focussed on certain aspects of the software process or the organization. It is important to bear in mind that a simulation model constitutes an abstraction of the real system, and so it represents only the parts of the system that have been intended to be modeled. Furthermore, currently available modeling tools such as ithink® (High Performance Systems, 2001), POWERSIM® (PowerSim Corporation, 2001), and Vensim® (Ventana Systems, 2002) help to represent the software development process as a system of differential equations. This is a remarkable characteristic, as it makes it possible to formalize and develop a scientific basis for software process modeling and improvement. Some noticeable applications of this dynamic approach to model software processes can be found in (Kellner, Madachy, and Raffo, 1999).
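To illustrate what such a representation amounts to in practice, the following is a minimal, hypothetical sketch of a defect-correction feedback loop expressed as rate equations and integrated numerically; all variable names and constants are our own illustrative assumptions and are not taken from the tools or models cited above.

// Hypothetical sketch: a single feedback loop in which development injects defects and
// their correction diverts part of the workforce, integrated with a simple Euler scheme
// (conceptually what System Dynamics tools do when they solve the model equations).
public final class DefectFeedbackSketch {
    public static void main(String[] args) {
        double dt = 0.25;                   // time step in days
        double remainingTasks = 1000.0;     // stock: tasks still to be developed
        double undetectedDefects = 0.0;     // stock: defects awaiting detection and rework
        final double workforce = 10.0;      // technicians (kept constant in this sketch)
        final double productivity = 1.0;    // tasks per technician-day
        final double defectsPerTask = 0.3;  // defects injected per developed task
        final double detectionDelay = 20.0; // average days until a defect surfaces

        double t = 0.0;
        while (remainingTasks > 1.0 && t < 2000.0) {
            double correctionRate = undetectedDefects / detectionDelay;         // defects/day
            double reworkEffort = Math.min(workforce, 0.5 * correctionRate);    // technicians on rework
            double developmentRate = (workforce - reworkEffort) * productivity; // tasks/day

            remainingTasks    -= developmentRate * dt;                          // Euler integration
            undetectedDefects += (developmentRate * defectsPerTask - correctionRate) * dt;
            t += dt;
        }
        System.out.printf("Approximate completion time: %.1f days%n", t);
    }
}

Even this toy structure already shows the qualitative effect that matters to a manager: the longer defects remain undetected, the more effort is silently diverted from development later in the project.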
3. Justification

Although traditional methods for software project management have revealed their weaknesses during the last decades, there is common agreement that they are still useful. We think that it is important to integrate traditional methods and process simulation under a common approach, in order to obtain a valuable tool to design process improvements and evaluate their effects. Rodrigues and Bowers (1996) draw the conclusions obtained after having compared both approaches, and point out the necessity of integration. Traditional and dynamic approaches have similar objectives, yet the perspective under which they work is completely different. Traditional methods are normally based on a top-down decomposition of the software project, while the dynamic method can be characterized by the aggregation process it is focussed on, according to which some features of a project are joined together under a simulation model. In accordance with what has been said, it can be deduced that dynamic models are suitable to deal with problems placed at the strategic level, while traditional methods are useful at the operational level of software projects.

With the development of a Dynamic Integrated Framework for Software Process Improvement (DIFSPI) we offer a methodology and working environment to join both approaches and to allow project managers and members of the Software Engineering Improvement Group (SEIG) to design and evaluate new process improvements. One of the main objectives of DIFSPI is to support the evolution of the maturity level of an organization according to the Capability Maturity Model (Paulk et al., 1993). The process of design and development of both the framework and the dynamic models that integrate it allows one to define a metrics collection program. These metrics are necessary to both initialize and validate the dynamic models. This metrics collection is not only useful for the dynamic models; it also serves as an invaluable opportunity to obtain a real knowledge of the state of the software processes inside an organization. This knowledge is essential before tackling any process improvement.

4. DIFSPI development

4.1. Conceptual approach

Using simulation for process improvement in conjunction with CMM is not a new
idea. As a matter of fact, Christie (1999) suggests that CMM is an excellent incremental framework to gain experience through process simulation. Nevertheless, there is a lack of a dynamic framework capable of providing assessment in the achievement of higher process maturity. One of the main features of DIFSPI is that this assessment is provided not only by using the associated final tool, but during the development of the whole dynamic framework. The reason for this is that the benefits that can be obtained with the utilization of dynamic models inside an organization are directly related to the knowledge and the empirical information the organization has about its processes. Figure 1 illustrates this idea. It shows the existing causal relationships among the maturity level of the organization, the utilization of dynamic models and the benefits obtained.

Figure 1. Causal relationships derived from the development and utilization of dynamic models.

The positive feedback loop illustrates the causal relationship that reinforces the metrics collection inside the organization. The metrics collected will be used to calibrate and initialize the dynamic models. Lower maturity organizations are characterized by the absence of metric programs and historical databases. In this case, it is necessary to begin by identifying the general processes and the information that has to be collected about them. The questions of what to collect, at what frequency, and with what accuracy have to be answered at this moment. The design process of dynamic models helps to come to a solution to these questions. When developing a dynamic model it is required to know: (a) what is intended to be modeled, (b) the scope of the model, and (c) what behaviors need to be analyzed. Once the model is developed, it needs to be initialized with a set of initial conditions in order to execute the runs and obtain the simulated behaviors. These initial conditions customize the model to the project and to the organization to be simulated and they are effectively implemented by a set of initial parameters. These parameters that rule the evolution of the model runs answer precisely the former question of what data to collect: those data required to initialize and validate the model will be the main components of the metrics collection program.

Once the components of the metrics collection program have been defined it can be implemented inside the organization. This process will lead to the achievement of a historical database. The data gathered can then be used to simulate and empirically validate the dynamic model. When the dynamic model has been validated, the results of its runs can be used to generate a simulated database; with this database it is possible to perform process improvement analyses. An increase in the complexity of the actions intended to be analyzed will directly lead to an increase in the complexity of the dynamic model required and, therefore, to a new metrics collection program for the new simulation modules.

The bottom half of Figure 1 illustrates the effects derived from the utilization of dynamic models in the context of process improvement. Using dynamic models which have been designed and calibrated according to an organization's data provides three important benefits. Firstly, the data of the simulation runs can be used to predict the future evolution of the project. The graphical representations of these data show the evolution of the project from a set of initial conditions (which have been established by the initialization parameters). By analyzing these graphics, organizations with a low level of maturity can obtain a useful qualitative knowledge about the evolution of the project. As the maturity level of the organization increases, the knowledge about its processes is also higher and the simulation runs can be used as real quantitative estimates. These estimates help to predict the future evolution of the project with an accuracy that is intimately related to the uncertainty of the initial parameters. Secondly, it becomes possible to define and experiment with different process improvements by analyzing the different simulation runs. This capability helps in the decision-making process, as only the improvements which gave the best results will be implemented. Moreover, one of the most remarkable things here is that these experiments are performed with no cost and risk to the organization as they use the simulation of scenarios. Thirdly, the simulation model can also be used to predict the cost of the project; this cost can be referred to as the overall cost, or to a hierarchical decomposition of the total cost, as for instance the cost of quality or revision activities. These three benefits are the main factors that lead to the achievement of a higher maturity level inside an organization according to CMM.

4.2. DIFSPI structure

Project management is composed of activities which are intimately interrelated in
the sense that a certain action performed over a determined area will possibly affect other areas. For instance, a time delay will always affect the cost of the project but it may or may not affect the morale of the development team, or the quality of the product. The interactions among the different areas of project management are so strong that on some occasions the throughput of one of them can only be achieved by reducing the throughput of another. A clear example of this behavior can be found in the frequent practice of reducing the quality, or the number of requirements to be implemented in a certain version of the product, with the aim of meeting the time or cost estimates.

Dynamic models help to understand the integrated nature of project management, as they describe it through different processes, structures, and main interrelationships. In the framework proposed here, project management is considered as a set of dynamic interrelated processes. Projects are composed of processes. Each process is composed of a series of activities designed for the achievement of an objective (Paulk et al., 1993). From a general point of view, it could be said that projects are composed of processes which fall into one of the following categories:

• Management process. This category collects all those processes related to the description, organization, and control of the project.
• Engineering process. All those processes related to the specification and development activities of the software product are collected in this category.

Both categories interact during the time cycle of the project as Figure 2 shows. From an initial plan performed by the project management processes, engineering processes begin to be executed. Using the information gathered about the progress of this second group of processes, project management processes determine the modifications that need to be made to the plan in order to achieve the project objectives. The DIFSPI proposed follows this same classification and is structured to attend to project management and engineering processes. At both levels, the utilization of dynamic models to simulate real processes and to define and develop a historical database will be the main feature.

Figure 2. Classification of processes for software development.

4.2.1. Engineering processes in the DIFSPI

On this level the dynamic models simulate the life cycle of the software product. The benefits that simulation provides at this level are the following:

• To build a model it is necessary to improve the knowledge one has about the software development process, as it is required to establish the limits and scope of those real behaviors to be modeled and simulated.
• The parameters required by the model and the tables which determine its time behavior will constitute the main elements of a metrics collection program to define a historical database.
• The effective application of this metrics program will feed the database. The historical data gathered will help in the validation and calibration of the model.
• The dynamic model will finally simulate the software processes with the knowledge and the maturity that the organization has at the moment.
• The utilization of the dynamic model allows the establishment of a baseline for the project, the investigation of possible improvements, and the development of a historical database which can be fed either by real or simulated data.

The dynamic models of this level of the DIFSPI should follow the levels of visibility and knowledge of the engineering processes that organizations have at each maturity level. It is obvious that the complexity of the dynamic model used in level 1 organizations cannot be the same as that of the models capable of simulating the engineering processes of, for instance, level 4 organizations.

4.2.2. Project management processes in the DIFSPI

Management processes are divided into two main categories:

• Plan. This groups the processes devoted to the design of the initial plan and the required modifications when the progress reports indicate the appearance of problems. The models of this group integrate traditional estimation and planning techniques together with dynamic ones.
• Control. In this group all the models designed for the monitoring and tracking activities are gathered. These models will also have the responsibility of determining the corrective actions to the project plan. Therefore, the simulation of process improvements will be of enormous importance.

4.3. Elaboration of the dynamic models

The approach followed in the construction of the dynamic models is based on two
fundamental principles:

• The principle of extensibility of dynamic models. According to this principle, different dynamic modules are joined to an initial and basic dynamic model. This initial model captures the fundamental behavior of a software project. Each one of the dynamic modules models one of the key process areas which make up the step to evolve to the next level of maturity. These modules can be either enabled or disabled according to the objectives of the project manager or the members of the SEIG (a sketch of such pluggable modules is given after this list).
• The principle of aggregation/decomposition of tasks according to the level of abstraction required for the model. Two levels of aggregation/decomposition are used:
  – Horizontal aggregation/decomposition, according to which different sequential tasks are aggregated into a unique task with a unique schedule.
  – Vertical aggregation/decomposition, according to which different and individual, but interrelated and parallel, tasks are considered as a unique task with a unique schedule.
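The following sketch illustrates one possible shape for such pluggable modules around a shared model state; the interface, class names, and toy rates are assumptions made for illustration and do not reproduce the structure of the actual DIFSPI tool.

import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the extensibility principle: key-process-area modules are
// plugged into a basic project model and can be enabled or disabled per simulation run.
interface DynamicModule {
    void step(ModelState state, double dt);   // contributes its rates to the shared state
}

final class ModelState {
    double pendingTasks, accomplishedTasks, revisionPendingTasks;
}

final class ExtensibleModel {
    private final List<DynamicModule> enabledModules = new ArrayList<>();

    void enable(DynamicModule module) { enabledModules.add(module); }

    void simulate(ModelState state, double dt, double horizon) {
        for (double t = 0; t < horizon; t += dt) {
            for (DynamicModule module : enabledModules) {
                module.step(state, dt);        // each enabled module updates the shared stocks
            }
        }
    }

    public static void main(String[] args) {
        ExtensibleModel model = new ExtensibleModel();
        ModelState state = new ModelState();
        state.pendingTasks = 500.0;
        // A toy "base development" module; real key-process-area modules would be richer.
        model.enable((s, dt) -> {
            double rate = Math.min(s.pendingTasks / dt, 5.0);   // tasks per day (placeholder)
            s.pendingTasks -= rate * dt;
            s.accomplishedTasks += rate * dt;
        });
        model.simulate(state, 1.0, 200.0);
        System.out.printf("Accomplished tasks after the run: %.0f%n", state.accomplishedTasks);
    }
}

The design point this is meant to convey is simply that enabling or disabling a module changes the behavior of the run without touching the basic model.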
The definition of the right level of aggregation and/or decomposition for the tasks mainly affects the modeling of the engineering activities and principally depends on the maturity level of the process intended to be simulated.

To define the initial dynamic model the common feedback loops among software projects were taken into account. The objective of this approach was to achieve a generic model and avoid modeling certain behaviors of concrete organizations which might limit the flexibility of the DIFSPI. To initialize the functions and parameters of the initial model, data originating from historical databases collected in the available literature were used (Putnam, 1992). By replicating some of the equations of the initial model it is possible to model the progress to higher maturity levels.

Figures 3 and 4 illustrate the former idea of basic modeling and structure replication. The initial model can be used to simulate software projects developed in organizations progressing to level 2. Figure 3 uses System Dynamics notation to illustrate the components developed to model the software development activity.

Figure 3. Basic dynamic module for software development modeling.

The number of tasks to be developed is determined from an initial estimate of the size of the project. These pending tasks become accomplished tasks according
to the development rate. During this process, errors can be committed. Thus, in accordance with the desired quality objective for the project, the quality rate and the revision rate are determined. These two rates govern the number of tasks that are revised. To model the progress to level 3, the model will make use of a horizontal decomposition, creating as many substructures as phases or activities are present in the task breakdown structure of the project (analysis, design, code and test, in our case). According to this approach, each time a complete model or some part of it is replicated, it will be necessary to define the new fixing mechanisms (dynamic modules) for the new structures. These mechanisms effectively implement the principle of aggregation/decomposition previously mentioned. The replication of structures also provides the possibility of replicating the modules related to the project management processes. This replication is especially useful for high maturity level organizations, which will be able to establish process improvement practices for each
certain activity of the life cycle.

In Figure 4, the components of the dynamic model for level 3 organizations are shown. Each one of the four boxes labelled with the name of one phase of the project is, in fact, a complete dynamic module identical to that shown in Figure 3. The new features added to the replicated structure define the coupling structure that joins together the four dynamic modules. This coupling structure is mainly composed of a matrix that keeps the order of precedence among the phases and the percentage of a task that has to be accomplished before starting the following one. The accomplished task fraction of each phase is the value that each dynamic module shares with the others, as well as the common knowledge of the matrix of precedence.

Figure 4. Replication of the basic dynamic module to model higher maturity processes.

Table 1. Main differential equations for the basic dynamic module for software development modeling (maturity level 1–2)

Pending tasks_t = Initial size + ∫ (revision rate_t − development rate_t) dt
Accomplished tasks_t = ∫ (development rate_t − revision rate_t) dt
Revision pending tasks_t = ∫ (quality rate_t − revision rate_t) dt
quality rate_t = development rate_t · (1 − QUALITY)
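A minimal numerical reading of these stock equations, assuming a simple Euler integration and purely illustrative values for Initial size, QUALITY, and the two rates (the actual rate formulations of the DIFSPI models are not reproduced here), could look as follows:

// Hedged sketch: Euler integration of the Table 1 stocks. The development and revision
// rate expressions are placeholders; only the stock/flow structure follows Table 1.
public final class BasicModuleSketch {
    public static void main(String[] args) {
        final double INITIAL_SIZE = 800.0;   // tasks (illustrative)
        final double QUALITY = 0.80;         // quality objective parameter (illustrative value)
        double pending = INITIAL_SIZE, accomplished = 0.0, revisionPending = 0.0;
        final double dt = 1.0;               // one simulated day per step

        for (double t = 0.0; pending > 0.5 && t < 2000.0; t += dt) {
            double developmentRate = Math.min(pending / dt, 4.0);      // tasks/day (placeholder)
            double qualityRate = developmentRate * (1.0 - QUALITY);    // tasks/day sent to revision
            double revisionRate = Math.min(revisionPending / dt, 2.0); // tasks/day reworked (placeholder)

            pending         += (revisionRate - developmentRate) * dt;  // Pending tasks equation
            accomplished    += (developmentRate - revisionRate) * dt;  // Accomplished tasks equation
            revisionPending += (qualityRate - revisionRate) * dt;      // Revision pending tasks equation
        }
        System.out.printf("Accomplished: %.1f tasks, awaiting revision: %.1f tasks%n",
                accomplished, revisionPending);
    }
}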
Table 2 shows the new form of the equations for this level. It is important to notice here the appearance of the double index to identify the phase of the project, in contrast with the equations shown in Table 1, and also the fact that these equations are only calculated when the conditions are favourable according to the matrix of precedence.

Table 2. Main differential equations of the replicated module for software development modeling (maturity level 2–3)

Pending tasks_{i,t} = Initial size_i + ∫ (revision rate_{i,t} − development rate_{i,t}) dt
Accomplished tasks_{i,t} = ∫ (development rate_{i,t} − revision rate_{i,t}) dt
Revision pending tasks_{i,t} = ∫ (quality rate_{i,t} − revision rate_{i,t}) dt
quality rate_{i,t} = development rate_{i,t} · (1 − QUALITY)
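To make the role of the double index and the precedence condition concrete, here is a small hedged sketch; the phase names, the 75% threshold, and the per-phase rates are illustrative assumptions, not values taken from the paper, and the revision flows of Table 2 are omitted for brevity.

// Hypothetical sketch of the coupling structure behind Table 2: each phase i has its own
// stocks, and its equations are only evaluated once the preceding phase has accomplished
// a given fraction of its tasks (the precedence matrix condition).
public final class PhasedModuleSketch {
    static final String[] PHASES = {"analysis", "design", "code", "test"};
    static final double[] START_THRESHOLD = {0.0, 0.75, 0.75, 0.75}; // fraction of predecessor done (illustrative)

    public static void main(String[] args) {
        double[] pending = {200, 300, 400, 300};   // Initial size_i (illustrative task counts)
        double[] accomplished = new double[PHASES.length];
        final double dt = 1.0;

        for (double t = 0; t < 1500.0; t += dt) {
            for (int i = 0; i < PHASES.length; i++) {
                // Precedence check: phase 0 is always active; phase i waits for phase i-1.
                boolean active = (i == 0)
                        || accomplished[i - 1] / (accomplished[i - 1] + pending[i - 1]) >= START_THRESHOLD[i];
                if (!active || pending[i] <= 0.0) continue;

                double developmentRate = Math.min(pending[i] / dt, 3.0); // tasks/day per phase (placeholder)
                pending[i]      -= developmentRate * dt;                  // revision flows omitted in this sketch
                accomplished[i] += developmentRate * dt;
            }
        }
        for (int i = 0; i < PHASES.length; i++) {
            System.out.printf("%s: %.0f tasks accomplished%n", PHASES[i], accomplished[i]);
        }
    }
}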
5. DIFSPI implementation

The former conceptual ideas have been implemented to develop a tool using Java™ technology (Java 2 SDK v. 1.2 and 1.3). Three groups of initial parameters are required to drive a simulation run: parameters related to the organization (delays, average accepted overwork, etc.), parameters related to the project (size, quality objective, initial staff, etc.), and parameters to drive the simulation run itself. With these initial data, it is possible to run a first simulation to establish a baseline for the project. The results obtained can be graphically displayed in order to merge in a single view the static data offered by the traditional models with the dynamic data provided by the simulation runs. After this, it is possible to experiment with different process improvements and alternative plans by changing the values of the parameter(s) required and running the new simulations. All the results obtained are saved in the database. This database may then be used to feed some machine learning algorithms in order to automatically obtain management and process improvement rules (Ramos et al., 2000).
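As a purely illustrative view of how such a run might be parameterized, the sketch below groups the three kinds of inputs just described; every field name and default value is an assumption for the example (the project figures are loosely based on the case study of Section 5.1), not the actual parameter set of the tool.

// Hypothetical grouping of the three kinds of inputs that drive a simulation run.
// None of these field names or values are taken from the DIFSPI tool itself.
final class OrganizationParameters {
    double averageHiringDelayDays = 30.0;   // example of an organizational delay
    double averageAcceptedOverwork = 0.10;  // fraction of extra daily effort tolerated
}

final class ProjectParameters {
    double estimatedSizeLoc = 80_000;       // initial size estimate (as in the case study)
    double qualityObjective = 0.80;         // quality objective (as in the case study)
    int initialStaff = 8;                   // technicians at project start
}

final class SimulationParameters {
    double timeStepDays = 0.5;
    double horizonDays = 400.0;
}

public final class BaselineRunSketch {
    public static void main(String[] args) {
        OrganizationParameters org = new OrganizationParameters();
        ProjectParameters project = new ProjectParameters();
        SimulationParameters sim = new SimulationParameters();
        // A real run would hand these three groups to the dynamic model and store the resulting
        // time series (cost, schedule, quality) in the database as the project baseline.
        System.out.printf("Baseline run: %.0f LOC, %d technicians, horizon %.0f days, overwork %.0f%%%n",
                project.estimatedSizeLoc, project.initialStaff, sim.horizonDays,
                org.averageAcceptedOverwork * 100);
    }
}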
5.1. DIFSPI utilization

The potential applications of the DIFSPI have already been mentioned in the former sections. In this section some of the data obtained when DIFSPI was applied inside a local software development organization are provided. This local organization could be placed at level 1. At first the software process capability of this organization was unpredictable because it was constantly changed or modified as the work progressed. Performance depended on both the capabilities of the project manager and the technical team. Moreover, there were few stable software processes in evidence. As is typical of level 1 organizations, the software process here was perceived as an amorphous entity, a black box, and visibility into the projects' processes was very limited. Requirements flowed into the software process in an uncontrolled manner, giving a product as a result. The purpose of this application was to ensure that the framework could reproduce the behavior observed in a real project and, therefore, could trigger a metrics collection program and help in decision making, predicting, and cost estimating. Table 3 shows the characteristics of the project that was simulated for this case study together with the data of the baseline reported by the simulation. It should be noted here that the data reported by the simulation conform to the core measures recommended by the Software Engineering Institute (SEI) (Carleton et al., 1992).

Table 3. Real and simulated data for the case study (size of the project = 80,000 LOC)

Real data: Time 250 days; Initial workforce 8 technicians; Effort 4,780 technician-days
Simulated data: Time 263 days; Workforce 9 technicians; Effort 4,361 technician-days; Quality 80% (tasks revised)

The scenario shown in Table 4 helps to analyze the impact of the size of the technical staff over the four main variables (time, effort, quality, and overall workforce).
Two different cases were simulated. The first one (CASE 1) had a schedule of 250 days and 16 part-time technicians. The second case (CASE 2) had a schedule of 150 days and 16 full-time technicians.

Table 4. Simulated data for scenario analysis

CASE 1: Time 135 days; Effort 1,396 technician-days; Quality 91%; Workforce 18 technicians
CASE 2: Time 140 days; Effort 3,596 technician-days; Quality 91%; Workforce 16 technicians

The expected behavior for projects with a high level of personnel is that the average productivity per technician achieved will be lower. The average productivity per technician in the baseline was 0.8926 tasks/(technician-day). CASE 1 and CASE 2 both had double the initial workforce of the baseline, although schedules and resource allocation were different between them. The average productivity obtained for CASE 1 and CASE 2 was, respectively, 0.8277 and 0.8142 tasks/(technician-day).

6. Conclusions

Motivated by lessons learnt from another System Dynamics application in an industrial environment, the development of a framework to combine the traditional estimation tools with the dynamic approach has been initiated. The main objective of
this dynamic framework is to assist project managers and members of the SEIG to define, evaluate, and implement process improvements to achieve higher levels of maturity. The whole process of development of the framework also helps to design a specific metrics collection program which, once implemented, contributes to building and feeding a historical database inside an organization.

With the application of DIFSPI in a level 1 organization, important benefits were obtained. First, it must be mentioned that during the process of model building, the project manager gained much new insight into those aspects of the development process that most influence the success of the project (time, cost, and quality). Second, having the possibility of gaming with the DIFSPI allowed him to better understand the underlying dynamics of the software process. As a consequence, several process improvement suggestions were easily designed and, most importantly, analyzed using the simulation of scenarios. Finally, templates and guidelines for a metrics collection program were almost automatically derived from the requirements of the dynamic modules.

Our future work will mainly concentrate on research towards a full development of the dynamic modules that implement the key process areas of the higher maturity levels. Once this development has been accomplished, it is intended to validate the complete DIFSPI in real industrial environments.

Acknowledgements

The authors wish to thank the Comisión Interministerial de Ciencia y Tecnología,
Spain (under grant TIC2001-1143-C03-02) for supporting this research effort.

References

Carleton, A., Park, R.E., Goethert, W.B., Florac, W.A., Bailey, E.K., and Pfleeger, S.L. 1992. Software measurement for DoD systems: Recommendations for initial core measures, Technical Report CMU/SEI-92-TR-19. Pittsburgh, Pennsylvania: Software Engineering Institute, Carnegie Mellon University.
Christie, A.M. 1999. Simulation: An enabling technology in software engineering. http://www.sei.cmu.edu/publications/articles/christie-apr1999/christie-apr1999.html.
Christie, A.M. 1999. Simulation in support of CMM-based process improvement, The Journal of Systems and Software, 46(2/3): 107–112.
Eberlein, R.L. 1989. Simplification and understanding of models, System Dynamics Review, 1(5): 51–68.
High Performance Systems, Inc. 2001. ithink 7.0. Hanover, New Hampshire.
Kellner, M.I., Madachy, R., and Raffo, D. 1999. Software process simulation modeling: Why? What? How? The Journal of Systems and Software, 46(2/3): 91–105.
Paulk, M., Garcia, S.M., Chrissis, M.B., and Bush, M. 1993. Key practices of the Capability Maturity Model, Version 1.1, Technical Report CMU/SEI-93-TR-25. Pittsburgh, Pennsylvania: Software Engineering Institute, Carnegie Mellon University.
PowerSim Corporation. 2001. Powersim Studio 2001. Herndon, Virginia.
Putnam, L.H. 1992. Measures for Excellence: Reliable Software, On Time, within Budget. New York: Prentice-Hall.
Ramos, I., Aguilar, J., Riquelme, J.C., and Toro, M. 2000. A new method for obtaining software project management rules, SQM 2000, Greenwich, United Kingdom, pp. 149–160.
Rodrigues, A. and Bowers, B. 1996. System dynamics in project management: A comparative analysis with traditional methods, System Dynamics Review, 12(2): 121–139.
Ruiz, M., Ramos, I., and Toro, M. 2001. A simplified model of software project dynamics, The Journal of Systems and Software, 59: 299–309.
Ventana Systems, Inc. 2002. Vensim Version 5. Harvard, Massachusetts.

Mercedes Ruiz received the B.Sc. degree in Computer Science from the University of Seville, Seville,
Spain, in 1995. She is currently working towards her Ph.D. at this university. She is an Assistant Professor of Information Systems in the Department of Computer Languages and Systems, University of Cádiz. Her works have appeared in several refereed journals and she is the author of several papers, books and chapters of books. Her research interests are in software process simulation models, software metrics and quality, and decision support for dynamic tasks.

Isabel Ramos was born in Seville in 1957. She has a Ph.D. in Industrial Engineering from the University of Seville, where she teaches Software Engineering in the Department of Computer Languages and Systems. She has specialized in the management of software development projects, and currently is the manager of FIDETIA (Foundation for Research and Development on Information Technology in Andalucía).

Miguel Toro obtained an Industrial Engineer's title and a Doctor's Industrial Engineer title from the University of Seville. At the present time, he is a Professor in the Department of Computer Languages and Systems, University of Seville, and is the manager of the Office of Transfer of Research Results of the University of Seville (OTRI). His research activity is mainly in two lines: Qualitative Reasoning and Learning, and Software Engineering. He is the author or joint author of a large number of works, including journal articles (national and international), conference papers, books and chapters of books. He is a member of ACM (special interest groups SIGSOFT, SIGART), the IEEE Computer Society and other scientific societies.