Abstract. Customer satisfaction and customer feedback are essential to software manufacturers for a better understanding of customer needs.
This paper focuses on one Croatian software development company that relies on its employees to communicate with customers and propose necessary actions for improving customer satisfaction.
A questionnaire was used to collect the data, aiming to determine how customers are involved in developing new features and what information can be extracted from customer feedback.
The results show that senior employees can make decisions based on customer feedback. Additionally, these employees involve customers in the development process.
Keywords. Customer satisfaction, customer feedback, user experience, user involvement, data collection
1 Introduction
When creating a new product or improving an existing one, developers and designers must consider the end users, their needs, and how they will or want to use a product. End users care about doing their tasks and achieving their goals, not how the product was made or the company that created it (Saffer, 2010).
For that purpose, user experience metrics are essential. From these metrics, the company and its employees can gain deeper insight into the acceptance of their product. Metrics add structure to the development and evaluation process and inform decision-makers (Tullis & Albert, 2013).
When creating a product, the focus must be on the end users. The user-centered design process emphasizes the needs and perspectives of end users: it is a process in which the needs, wants, and limitations of a product's end users are given attention, and in which users are involved in the design process (Allen & Chudley, 2012), similar to the involvement of end users in Agile software development. Agile focuses on lightweight working practices, frequent deliveries, and customer collaboration (Kupiainen et al., 2015).
The selected information system within the scope of this research is a private information system for state administration bodies and, as such, is not available to everyone. Depending on the organization, there may be restrictions on data collection and on the ability to communicate with users about their feedback at scale. As a result, the software development company has limited options for gathering user feedback and opinions about the information system.
This research paper aims to examine user experience through a proxy, i.e., by asking the company's employees who communicate with users about their users' opinions. Additionally, this paper examines employees' positions and their views on their work and work environment to identify potential improvements for enhanced involvement in development and increased customer participation in development.
For this purpose, the following research questions were specified:
* Research Question 1 (RQ1): Are employees in direct communication with users? If so, what positions or functions do they hold within the company?
* Research Question 2 (RQ2): Are users involved in the development process?
* Research Question 3 (RQ3): How is the feedback about user experience being managed?
This paper is structured as follows. Section 2 provides background information about user experience and work related to this study. Section 3 describes the research methodology, including when and how the research was conducted. Section 4 presents the study's results. Section 5 contains a discussion. Section 6 describes limitations, and Section 7 describes threats to validity. Section 8 concludes this paper.
2 Background and Related Work
In the past, computers were used mainly by a core of technically oriented users who were not only willing to accept the challenge of overcoming poor usability but sometimes welcomed it as a barrier to protect the craft from uninitiated "outsiders". Poor usability was good for the mystique, not to mention job security (Hartson & Pyla, 2012).
As more people began to use computers, the public was generally slow to realize it could demand a better user experience. Poor designs can indeed look so bad that users assume they could not be that bad unless they were deliberate. Today, end users want to use technology to learn, be entertained, and connect with others, looking beyond sheer functionality or usability to achieve emotional satisfaction (Hartson & Pyla, 2012).
In modern software development, user involvement throughout the development process is considered best practice. It leads to increased development productivity and user satisfaction with the product. Users' involvement relates to their participation in activities associated with specifying, elaborating, prioritizing, reviewing, and verifying the requirements, as well as testing and verifying developed features. The effectiveness of user involvement can vary considerably across projects, and it can be challenging to achieve (Buchan et al., 2017).
Evaluating the usability based on users' reports is a challenging task. Several factors, including an increase in the number of users (clients) or the context of the user task, influence the final report (Alomari et al., 2020).
To track progress and monitor user involvement, metrics need to be chosen. These metrics facilitate a deeper understanding of the project's current state and inform future decision-making.
Many papers address user experience, and several systematic literature reviews have been conducted based on them.
Amirova et al. (2019) examined what factors affect customer satisfaction and how it can be measured. The most significant results were that it is vital to identify customer needs and roles, incorporate customer feedback into planning and development, and foster long-lasting relationships with customers. The most used techniques for measuring customer satisfaction are interviews, feedback reports, questionnaires, and surveys. The authors stated that it is essential to communicate with the customer.
Ntoa et al. (2021) proposed a framework for evaluating user experience in an intelligent environment. The proposed framework contains 103 metrics for measuring the acceptance and usability of information systems and user experience. The research paper (Majumder, 2025) states that some users do not know how to express their concerns about the product or interface design and proposes a solution for tracking users while they use the product in order to collect feedback.
The research paper (Hinderks et al., 2022) aimed to investigate user experience management in Agile software development. The conclusion of that paper stated that no approaches directly address user experience management.
The conclusion from published papers on user experience and user involvement in the projects is that user-centered design works well with Agile software development.
3 Research Method
This study used a questionnaire to collect data, to determine which employees communicate with customers based on their role and what conclusions employees draw from that communication.
Challenges in creating the information system for state administration bodies and collecting user feedback include limited access to end users, legal regulations, and security restrictions. This means that initiating communication from the manufacturer of the information system to end users is not allowed; communication is permitted only with selected persons from state administration bodies (customer representatives). For these reasons, a questionnaire was created to establish what can be learned about user experience given this limited access.
The questionnaire consisted of twenty-eight questions divided into four groups (see Appendix A for the complete questionnaire). Before creating the questionnaire, existing literature and published research papers were examined.
The first group consisted of six questions about employees' work experience in the company and their views on their roles. Four questions were closed (multiple-choice), and two were open (employees could write anything). The last two questions in this group were based on the research paper (Buchan et al., 2017).
The second group consisted of seven questions regarding communication with customers and their involvement in the development process. Four questions were multiple-choice, and three were open-ended. This group's third and fourth questions were selected based on the conclusions drawn from the research paper (Amirova et al., 2019), and the fifth and sixth questions were based on the research paper (Buchan et al., 2017).
The third group had twelve questions about employees' conclusions regarding user experience with the information system, based on customer feedback and user experience evaluation. Nine questions were multiple-choice, and three were open-ended. This group of questions is the most important because it contains questions that determine user opinions and experiences regarding the user interface and integration with other services, such as automated data collection and notification via email or other communication channels.
The first three questions in the third group are based on (Saffer, 2010). These questions focused on examining user satisfaction with the general usage of the information system, the user interface, and its integration with other services.
The fourth question in the third group is based on the research paper (Alomari et al., 2020) to determine user opinions on the information system immediately after the education for new users. The third group's sixth, seventh, and last three questions are based on (Tullis & Albert, 2013), and the eighth and ninth questions were chosen according to (Allen & Chudley, 2012).
The fourth group consisted of three questions about customer requests for new features or improvements to existing ones. All three questions were multiple-choice questions. The first question in this last group was chosen according to (Hartson & Pyla, 2012). The second question is the result of the analysis of the research paper (Vanhanen et al., 2018).
In total, twenty questions were closed (multiple-choice) and eight were open. Eight of the closed questions asked respondents to provide a grade on a scale of one to five, where grade one represents the worst grade and grade five the best. Each of these graded questions also offered an alternative answer for respondents who were unable to provide a grade.
The questionnaire was available from October 22 to November 5, 2023. Employees who had worked on the project for at least six months were contacted via email with an explanation of the research and a link to the questionnaire. Based on this criterion, thirty employees were chosen. The questionnaire was anonymous, allowing employees to provide honest answers. However, anonymity is also a limitation because it does not guarantee that all employees complete the questionnaire or answer every question. Ultimately, fourteen employees (46.7 %) completed the questionnaire.
4 Results
This section presents the results of the questionnaire, including answers to the research questions. It is divided into seven subsections. The first four subsections present the results of the questionnaire, while the last three subsections contain answers to the research questions.
4.1 Employees' Work Experience
The first group of questions was designed to gather basic information about employees and their work experience on the project. Most of the employees who completed the questionnaire were male (71.43 %).
Most employees in the company have multiple roles. Of all employees who filled out the questionnaire, 50 % have more than one role. Table 1 lists all roles and the number of employees (respondents) assigned to each role. The most frequently assigned role is Implementer, whose responsibility is to configure and implement new features on clients' servers after deployment. The Product Manager, Product Owner, and Scrum master roles have only one assigned employee, and all three roles are assigned to the same employee.
Fig. 1 shows the number of employees by years spent on the project. Six respondents have been working on the project for less than a year, and three respondents have worked on it for between five and ten years. The majority of the employees on the project are new to the company.
In the questionnaire, respondents were asked to self-evaluate their knowledge of the information system. Fig. 2 shows the number of employees per grade. The lowest grade represents no knowledge of the information system, while the highest grade represents that the employee is thoroughly knowledgeable about the information system. New employees (employed between six and twelve months) self-evaluated their knowledge with grades three and four.
This group contained two open questions regarding employees' work. The first asked what could improve their understanding of their assigned roles and work assignments. Most answers focused on improving the onboarding process, which is unsurprising given the high number of new employees. Other answers suggested better definitions of assignments for employees with multiple roles and improved communication between employees.
The second open question asked how their work contribution and results are evaluated by their supervisors. The goal of this question was to determine which metrics respondents focus on regarding their contribution: for example, whether they focus solely on completing their tasks, on communicating with users, or on working as a team. The majority of answers cited time spent solving assignments. Two respondents stated that they do not know how their work is evaluated.
Employees' answers to those two questions suggest that the company does not invest much time in onboarding and documentation for new employees. The suggested improvement therefore includes enhancing internal documentation and work-evaluation procedures and communicating them to employees.
4.2 Communication with Users
This subsection presents the results for the second group of questions, which aim to determine whether employees have direct contact with clients and whether users influence the development process.
Regarding direct communication with clients, 85.71% of respondents answered positively. Respondents who answered negatively are developers.
Regarding whether users provide feedback, most answers were positive (78.57%), but 14.29% of employees reported not having information or being unaware of users' feedback. Only 7.14 % of answers were negative.
Most answers were negative (42.86 %) regarding whether users actively participate in the development. Only 21.43 % of answers were positive. About a third of employees (35.71 %) were unsure whether users participate in development or could not answer the question. There is no direct connection between employees who responded positively to this question and their job roles.
To assess employees' knowledge of the chosen information system, a question was asked about its years of use. The results are shown in Fig. 3. The current major version has been in use for seven years (at the time this paper was written), but the information system has been operational for over ten years.
Only one respondent answered that the information system has been in service for less than a year. That respondent also worked on the project for less than a year. The questionnaire does not determine why the respondent chose the answer "less than a year", but it can be assumed that it is due to an inadequate or incomplete onboarding process. Other respondents chose the answer "more than five years", which is in accordance with the actual situation.
This group contained three open questions. The first open question was about the users' role in the information system's development. This question was optional, and only ten respondents answered. All answers can be divided into two groups.
One group contains six answers in which users provide feedback on new features and report defects. The other group includes four answers in which respondents stated that users suggest improvements to the information system or request new functionalities. Four respondents did not answer. The conclusion for this question is that 42.86 % of respondents consider users to be testers rather than customer representatives who have purchased the information system and pay for its maintenance and upgrades.
The second open question was how interaction with the users contributed to the development of the information system. Eleven respondents provided answers to this question. Like the previous question, all answers can be divided into two groups.
The first group, consisting of eight responses, stated that interaction with users improved their understanding of what users want and how to improve the information system. The second group, comprising three responses, stated that interaction with users is poor, that users try to find ways to work around restrictions, or that user feedback is not considered.
Ten respondents answered the third open-ended question, which asked how employees differentiate between good and bad user contributions. Five responses were positive, in which respondents stated that when users understand how the information system works and what can be achieved, their suggestions and feedback are a good contribution. Three responses were negative because users do not want to be involved in the development or lack the necessary knowledge in computer science. Two responses indicated that respondents were unable to answer because they lacked access to the information.
4.3 Employees' Observations and Conclusions
This subsection presents results for employees' conclusions about users' experiences with the information system. All conclusions are based on user feedback and communication with them.
Respondents were asked to grade the user experience for the selected information system in six categories. Table 2 contains the results (all values are displayed as percentages). The first column in Table 2 contains categories. The following five columns show the percentage of answers per grade. The lowest grade is one, and the highest grade is five. The last column in Table 2 shows the percentage of answers without a grade.
Employees (respondents) assigned grades to each category based on user feedback and their interactions with users. Depending on the employee's role, they communicate with the user for various purposes. For that reason, it is interesting to see what conclusions can be extracted.
In the "Ease of use" category, none of the respondents stated that the information system is challenging, e.g., none chose the lowest grade. Most answers (57.14 %) were in the middle grade, meaning that the information system is neither too complicated to use nor too easy to use, leaving many possibilities for improvement
The category "User interface" got better grades than the previous category. All grades range from middle to high grade. This category has more respondents who were unable to answer.
Regarding user satisfaction with the notification methods in the information system, 21.43% of respondents were unable to answer. The average grade of respondents who answered this question falls in the middle range, leaving a strong possibility for improvement.
The "Education for new users" category received the highest grades, but also had the most unanswered questions, as respondents were unable to answer. Not all employees are responsible for educating new users. It depends on their role. It is expected that there will be more answers without the grade. Employees who hold an education graded user satisfaction with the education with high grades, resulting in the highest average grade.
The category "User support" has the most answers without a grade after the category "Education for new users" because not all employees support the end users. The same applies to the average grade. It got the second-highest average grade after the "Education for new users" category.
The last category was "Total user satisfaction", which includes employees' opinions on user satisfaction with the entire information system, its components, and support activities. Only 14.29 % of respondents did not answer this question. Respondents who provided answers gave it a high grade.
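As a rough illustration of how the average grade for a category can be derived from a grade distribution such as the rows of Table 2, the following Python sketch weights each grade by its share of answers and excludes answers without a grade. It is not the authors' analysis script, and the percentages in the example are hypothetical placeholders rather than values reported in this paper.

def average_grade(distribution):
    # `distribution` maps grade (1-5) -> percentage of respondents who chose it;
    # answers without a grade are simply omitted from the mapping.
    total = sum(distribution.values())
    if total == 0:
        raise ValueError("No graded answers for this category")
    return sum(grade * share for grade, share in distribution.items()) / total

# Hypothetical example: 57.14 % gave grade 3, 21.43 % grade 4, 7.14 % grade 5,
# and the remaining 14.29 % gave no grade (excluded from the calculation).
example = {3: 57.14, 4: 21.43, 5: 7.14}
print(round(average_grade(example), 2))  # prints 3.42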
This group of questions contained three more multiple-choice questions. Regarding the question of whether users are especially satisfied with certain functionalities of the information system, 35.71 % of respondents answered positively, and 7.14 % of respondents answered negatively. The majority of respondents (57.14 %) did not provide answers because they do not communicate with the users.
In the following question, respondents were asked to name functionalities with which users are especially satisfied if the previous question was answered positively. All answers were about integration with other government services and services from the same company, enabling more accessible work for users and automating business processes.
The question of whether users notice differences between versions of the information system had three possible answers. None of the respondents chose a negative answer. A majority of answers (71.43 %) were positive. Additionally, some respondents (28.57%) did not answer because they either did not communicate with the users or were not in communication with the users long enough to answer the question.
The following question was an open question in which respondents were asked to identify how users learn about new features or improvements in recent versions of the information system. All answers can be summarized into two responses. The first is release notes, which are part of every version and list new features and improvements. The second is by using the information system. Users who regularly use the information system often notice differences without reading the release notes.
The last multiple-choice question in this group was how often users report errors to the company. Fig. 4 presents the results for this question. Most answers were for the middle grade, meaning that users report errors only sometimes, which can result in the same errors persisting across multiple versions of the information system before they are found and resolved.
The company has dedicated software testers who test new versions of the information system before they are delivered to the users. Due to the complexity of the information system, it is challenging to simulate all possible combinations and user actions. As a result, errors are sometimes carried over to a new version and deployed on the users' servers. For this reason, it is important to have users who report errors; otherwise, the same error remains present across multiple versions until a tester or developer discovers it, or until a user eventually reports it.
The last question in this group was to list the most significant problems faced by users and provide a brief description. Respondents answered as follows:
* User habits,
* Amount of data,
* Users with poor computer knowledge,
* Users' fear of using the information system,
* Users who forget how the information system works,
* Users who do not understand error messages when working with the information system.
User habits can be challenging if users are accustomed to specific processes for solving tasks in the information system. Some users prefer routine and repetitive tasks, and introducing changes to existing functionalities (e.g., altering the user interface) can lead to user complaints, regardless of whether the change simplifies their job or streamlines business processes. For this group of users, introducing changes to the information system affects their user experience and productivity within the system (Guo, 2022).
The amount of data generated every month is enormous, as every request, response, document, and piece of citizen information is recorded. This can be an issue for end users with high-level permissions in the information system, who have access to more modules than regular end users. Regular end users have limited access to the information system and stored data. Searching and sorting the data is supported in the information system. Still, the users with elevated permissions who use it for the first time can get overwhelmed by the amount of data if they are unfamiliar with the built-in tools.
Another problem is users with poor computer knowledge. These users are unfamiliar with the technology, making it difficult to explain the specific features of the information system. To help solve this problem, the company can simplify its most frequently used features (if possible).
A combination of limited computer knowledge and an overwhelming amount of data can produce fear in users when using an information system. This leads to many inquiries to customer support and even between end users about the correct way to solve a task. The solution to this problem is constant education for all end users (new and existing users).
End users also forget how to perform specific tasks in the information system because they do not perform them often or because they are new to the system. The solution is constant education and open communication with the customer support team.
The information system has many validations for every action, and sometimes, end users are overwhelmed with messages. This can produce fear in the users and prevent them from solving their tasks efficiently.
Most user problems can be resolved through ongoing education on information system usage. The responses suggest that the education process requires improvement, with a focus on maintaining customer satisfaction (Vanhanen et al., 2018).
4.4 User Requests
This subsection presents the results for the last group of questions in the questionnaire, which consists of three questions. The first question aimed to determine whether user requests contain functionalities already present in the information system. No respondent chose the answer "No", while 64.29 % selected "Yes". Some respondents (35.71 %) could not answer. This suggests that users are not aware of all the functionalities already available in the information system.
The second question was about respondents' opinions on user requests. Table 3 shows the results. The question had predefined answers, but also allowed input of additional answers. Only one respondent provided an additional answer stating that the user had valuable input for improvement. The remaining respondents selected answers from the predefined list. Respondents were allowed to choose more than one answer.
The last question was about converting ideas into tasks for the development team, specifically, which ideas are more straightforward to convert into tasks. Half of the respondents stated that ideas generated within the company are more easily converted into development tasks. This means that discussing the idea and defining requirements and acceptance tests is easier internally than holding the same discussion with clients about their requests. The answer that user requests are more easily converted into tasks was selected by 28.57 % of respondents. Some respondents (21.43 %) could not provide an answer.
4.5 RQ1: Communication with Users
Employees communicate with clients, but not all employees do so. The only employees who do not correspond with the clients are those with only technical roles, for example, "Developer". All roles can be divided into two groups:
* Business role (Product Owner, Product Manager, Business analyst, Management, etc.),
* Technical role (Developer, Tester, etc.).
Employees with technical roles do not need to communicate with the users. The company utilizes the Scrum framework to organize its work and development. According to Scrum, the development team needs to be protected from unnecessary disruptions that could derail their focus during a Sprint. Employees with technical roles focus on delivering the Sprint Goal, but meaningful, structured communication with users, when it supports the Sprint Goal, is important. Employees with business roles serve as the interface between the development team and users, and their primary responsibility is to communicate with users (Andrei et al., 2019).
Respondents stated in the questionnaire that they have multiple roles in the company and the development process. Only a few employees have a single role, and these employees have been with the company for a short time, having yet to learn about the development process. This would not be a problem if employees with multiple roles had only business or technical roles, rather than a mixture of conflicting roles.
In this company, it is common for employees to have roles from both groups. The combination of roles can create tensions among employees. The suggestion is to hire new employees and eliminate additional roles for current employees. Employees should have only one role to perform their jobs efficiently.
4.6 RQ2: User Involvement
According to the results, users are not directly involved in the development process. However, when employee roles are considered, employees with more critical business roles have answered positively. The remaining respondents hold technical roles, are not in direct contact with users, or have not responded.
This leads us to conclude that users' involvement in the development process depends on the employee. Employees with longer tenure in the company have greater responsibility (and multiple business roles) and strive to involve users directly in the development process. The rest of the respondents do not directly involve users in the development process.
Company policy on communicating with users should be revised, and employees should undergo additional training to effectively incorporate users into the development process, including testing and discussing new ideas.
4.7 RQ3: User Experience Feedback
As shown in Table 2, it is possible to form an opinion on a specific information system area from user feedback. However, the information that can be extracted depends on the roles of employees and the target users.
End users only see and care about the user interface. They are generally unaware of the inner workings of the information system and typically do not want to know how it functions or what its architecture is. That means user feedback is limited to what they see on the computer or smartphone screen. This also presents a challenge in responding to user requests for new functionalities or enhancements to existing ones: user requests state the end goal and what users want to see on the screen, but sometimes lack crucial information about how specific information should be obtained or which operations need to be performed.
Employee opinions, therefore, are limited to user interface and graphical design. In the questionnaire, respondents were asked to grade specific information system areas. These areas do not include information system architecture or performance.
The first three areas aimed to determine whether it is possible to form an opinion about information system usage and user interface. The following two areas focused on education and customer support. The last area is overall user satisfaction with the information system.
Some respondents did not provide answers to these questions, depending on their role. These respondents could not give a grade because they do not communicate directly with the users or cannot form an opinion based on indirect communication (written reports or development tasks).
As a conclusion of this section, it is possible to extract information from user feedback and form an opinion based on it. However, user feedback is limited to the graphical design and user interface, and therefore, employees' views are also limited. These employees' views and provided grades can be used to further improve the information system.
5 Discussion
The questionnaire was designed for one specific project but can be applied to other projects in the same company. Additionally, it can be applied to any project, regardless of who the manufacturer is.
The questionnaire aims to determine whether it is possible to obtain helpful information from user feedback and what conclusions employees draw based on it. The results confirm that user feedback can provide insight into the project's current state and be used for future decision-making. However, the questionnaire revealed problems within the company regarding employees' work obligations and internal communication.
The primary challenge in understanding user feedback or requests is the inexperience of younger employees. Their limited knowledge and experience lead to internal miscommunication, which prevents helpful information from reaching all relevant parties and fosters an unproductive work environment.
Additionally, a significant issue is the overlap of conflicting roles for senior employees, which hinders the successful transfer of knowledge to new employees.
Regarding measuring user experience, the questionnaire revealed that it is possible to quantify user experience solely based on user feedback. However, this approach is limited to tasks that users need to perform within the information system. In other words, users focus only on the part of the information system that they need and use, discarding everything else.
The company relies on employees' interpretation of user feedback and requests to assess the usefulness of functionalities and inform decision-making for future development. To improve this practice, it would be recommended to implement a system for recording and assessing users' feedback and reports. The benefit would be the accessibility of information to every involved employee and a better understanding of what the user (customer) wants or needs.
The company needs to estimate user experience before development. This way, it can be determined whether the desired outcome was achieved (Hinderks et al., 2022).
The third group of questions in the questionnaire is the most important, as it contains questions that allow employees to evaluate specific parts of the information system based on user feedback.
User experience is not only a beautiful user interface. User experience includes the whole package, i.e., the user interface, integration services, additional plugins, and support. The questions in this third group aim to evaluate user experience for all previously mentioned components, but depending on the project, it may be unclear which components are considered. End users for this project were unaware of what lies beneath the surface, and in general, they provided feedback on the result, i.e., whether they completed their task or not.
It is the task of employees to separate user feedback by component when possible and deliver that information to other employees.
Everything written in this paper measures user experience indirectly. No method for continuous, direct measurement of user experience is implemented in the company. If the company chooses to continue using only communication between employees and customers (indirect measurement of user experience), it is suggested that employees be better educated in communication to improve their data gathering and extraction of relevant information.
Additionally, in such cases, it would be beneficial for the company to encourage junior employees with business roles to contact customers more frequently via phone, email, or any other available communication channel, rather than waiting for customers to initiate conversations. The results showed that senior employees with business roles incorporate users in the development process.
Employees would need to be more proactive and formulate their questions for users in a way that gathers as much relevant information as possible. They should document the answers received and forward their conclusions to all other relevant employees and management. For this purpose, it is recommended that the company either develop its own solution for recording and documenting communication with users or use an existing solution.
The researchers' opinion is that the questions are adequate for the selected project. Additionally, the questions were selected after reviewing existing research and are grounded in it.
While examining existing research papers on this topic, it was found that most research papers focus on customer satisfaction and user interface, i.e., what the user sees on the screen. These papers typically examine user experience through surveys and interviews with users.
The questionnaire included employees and their opinions on user satisfaction with the information system. What is not covered by the questionnaire and what cannot be covered subsequently are the employees' opinions about each user and the organization with which the employees are in contact. It is suggested that a form be created to allow employees to record the reason for and success of each communication after contact with users.
In this way, all other employees would have insight into the course of communication with the individual user and organization and could draw their own conclusions, which may differ from those of the employee who communicated with the user. When only one employee communicates with the same user, it is possible that their prejudices may influence the final opinion, and not all necessary information may be conveyed to other employees. The introduction of the form enables a broader discussion, thereby facilitating better decision-making.
The source of customer feedback is interaction with the users. In the research paper (Fabijan et al., 2015), which contains the results of a literature review in data collection, it is stated that most often, the initial source of customer feedback originates from direct interaction with the customer by using techniques based on active user involvement.
Interaction with customers, such as interviews or face-to-face interactions, is time-consuming and, therefore, challenging to manage in a fast-paced business environment where process efficiency is crucial (Fabijan et al., 2015). To counter this challenge, it is suggested that employees' efforts in completing the form for reporting communication with users be recognized, and, if possible, they should be rewarded for filling out the reports correctly and on time.
The suggested form would enable employees to report all relevant information from users in a unified way. Using the given form, all reports would contain the same structure and relevant information for decision-making.
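A minimal sketch of what such a unified report could capture is shown below, written in Python with hypothetical field names, since the paper does not prescribe a concrete schema. Storing every report in one structured format would make it easy to share with other employees and to aggregate for decision-making.

from dataclasses import dataclass, field
from datetime import date

@dataclass
class CommunicationReport:
    employee: str                 # who communicated with the user
    organization: str             # state administration body or customer representative
    contact_date: date
    channel: str                  # e.g., phone, email, face-to-face
    reason: str                   # why the communication took place
    summary: str                  # key points of the conversation
    outcome: str                  # whether the communication was successful
    follow_up_needed: bool = False
    related_features: list = field(default_factory=list)

# Hypothetical usage:
report = CommunicationReport(
    employee="Business analyst",
    organization="Customer representative A",
    contact_date=date(2023, 11, 2),
    channel="email",
    reason="Clarification of a request for a new notification channel",
    summary="The user asked for email notifications in addition to the existing ones.",
    outcome="Request documented and forwarded to the Product Owner",
    follow_up_needed=True,
    related_features=["notifications"],
)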
6 Limitations
As previously mentioned, access to end users is restricted, except for customer representatives and end-user-initiated communication for any reason. This limitation reduces the possibilities for collecting user experience feedback. For this reason, the company needs to collect and analyze every communication with users, whether it be face-to-face, email, or phone.
Limitations in this study include the number of employees working on a project and the number of respondents. At the time of conducting the research, thirty employees were working on the selected project that fulfilled the inclusion criteria. However, not all employees completed the questionnaire.
Fourteen of the thirty employees (46.7 %) filled out the questionnaire. Different results could have been obtained if all intended employees had accessed the questionnaire and answered the questions; their answers could have confirmed or disproved the obtained results.
7 Threats to Validity
The main threat to validity is the possibility of author bias. The questionnaire author is affiliated with the company chosen for this study, which presents a potential influence on the results. To eliminate bias, the questionnaire was reviewed by another researcher who is not affiliated with the company.
Existing research papers and published books were consulted to minimize bias when creating the questionnaire. In that way, questions would not be suggestive toward a specific conclusion.
The researcher excluded himself from any discussion regarding the questionnaire with other employees.
Analysis of questionnaire results was conducted with only employees' answers included. Everything mentioned in this paper is contained in the employees' answers. The researcher did not add anything outside the obtained answers.
8 Conclusion
This paper presents the results of a questionnaire conducted in a Croatian software development company. The questionnaire aimed to determine user involvement in the development process and to identify the information that can be obtained from user feedback, which influences employees' perceptions regarding future improvements to the information system.
Employees who have direct contact with the users or have access to meeting reports can form an opinion. Employee conclusions can be used to further improve the information system.
Additionally, the results of the questionnaire indicate that some employees interact with users. Employees with specific roles communicate with users and shield the development team from external influences.
Furthermore, this paper offers suggestions for enhancing business processes within the company, which can improve employee efficiency and collaboration.
The conclusion of this paper is that user feedback can be utilized by employees to assist management in making informed decisions about the further development of the information system.
Future research will involve collaborating with other teams within the company that face similar challenges in working with state administration bodies. It will incorporate additional methods, such as employee interviews, for data collection. With the expansion of data collection, a larger amount of data will be obtained. The goal of future research is to propose mechanisms and tools for automating the collection of data, data analysis, and tracing the results of decisions based on the obtained data.
References
Allen, J. & Chudley, J. (2012.). Smashing UX Design: Foundations for Designing Online User Experiences. John Wiley & Sons Ltd.
Alomari, H. W., Ramasamy, V., Kiper, J. D, & Potvin, G. (2020.). A User Interface (UI) and User eXperience (UX) evaluation framework for cyberlearning environments in computer science and software engineering education. Heliyon, 6(5)
Amirova R., Khomyakov, L, Mirgalimova R., & Sillitti, A. (2019.). Software Development and Customer Satisfaction: A Systematic Literature Review. International Conference on Objects, Components, Models and Patterns TOOLS 2019, 136-149
Andrei, B.-A., Casu-Pop, A.-C., Gheorghe, S.-C., & Boianggiu, C.-A. (2019.). A study on using waterfall and agile methods in software project management. Journal of Information Systems & Operations Management, 13(1), 125-135
Buchan, J., Bano, M., Zowghi, D., MacDonell, S., & Shinde, A. (2017.). Alignment of Stakeholder Expectations about User Involvement in Agile Software Development. 2lst International Conference on Evaluation and Assessment in Software Engineering EASE2017, 334-343
Fabijan, A., Olsson, H. H., & Bosch, J. (2015.). Customer Feedback and Data Collection Techniques in Software R&D: A literature review. 6th International Conference on Software Business ICSOB 2015, 139-153
Guo, Y. (2022.). Does User Preference Matter? A Comparative Study on Influencing Factors of User Activity Between Government-Provided and Business-Provided Apps. Frontiers in Psychology, 13
Hartson, R., & Pyla, P. S. (2012.). The UX Book: Process and Guidelines for Ensuring a Quality User Experience. Elsevier Inc.
Hinderks, A., Mayo, F. J. D., Thomaschewski, J., & Escalona, M. J. (2022.). Approaches to manage the user experience process in Agile software development: A systematic literature review. Information and Software Technology, 150
Kupiainen, E., Mántylá, M. V., & Itkonen, J. (2015.). Using metrics in Agile and Lean Software Development - A systematic literature review of industrial studies. Information and Software Technology, 62, 143-163
Majumder, A., S. (2025.). Eye-Tracking and Biometric Feedback in UX Research: Measuring User Engagement and Cognitive Load. arXiv.2505.21982
Ntoa, S., Margetis, G., Antona, M., & Stephanidis, C. (2021.).User Experience Evaluation in Intelligent Environments: A Comprehensive Framework Technologies, 9(2)
Saffer, D. (2010.). Designing for Interaction: Creating Innovative Applications and Devices, Second Edition. New Riders.
Tullis, T., & Albert, B. (2013.). Measuring the User Experience: Collecting, Analyzing, and Presenting Usability Metrics, Second Edition. Elsevier Inc.
Vanhanen, J., Lehtinen, T. O. A., & Lassenius, C. (2018.). Software engineering problems and their relationship to perceived learning and customer satisfaction on a software capstone project. The Journal of Systems and Software, 137, 50-60