A Longitudinal Analysis of the Effects of Instructional Strategies on Student Performance in Traditional and E-Learning Formats
Improving Information Security Risk Analysis Practices for Small- and Medium-Sized Enterprises: A Research Agenda
Novel Phonetic Name Matching Algorithm with a Statistical Ontology for Analysing Names Given in Accordance with Thai Astrology
Analysing Socio-Demographic Differences in Access and Use of ICTs in Nigeria Using the Capability Approach
A Data Driven Conceptual Analysis of Globalization — Cultural Affects and Hofstedian Organizational Frames: The Slovak Republic Example
Meaningful Learning in Discussion Forums: Towards Discourse Analysis
A Framework for Using Cost-Benefit Analysis in Making the Case for Software Upgrade
Distributed Collaborative Learning in Online LIS Education: A Curricular Analysis
Applying a Modified Technology Acceptance Model to Qualitatively Analyse the Factors Affecting E-Portfolio Implementation for Student Teachers in Field Experience Placements
Web-based Tutorials and Traditional Face-to-Face Lectures: A Comparative Analysis of Student Performance
Analysis of Student Attitudes towards E-learning: The Case of Engineering Students in Libya
Requirements Elicitation Problems: A Literature Analysis (published 2015-06-03)
Requirements elicitation is the process through which analysts determine the software requirements of stakeholders. Requirements elicitation is seldom done well, and an inaccurate or incomplete understanding of user requirements has led to the downfall of many software projects. This paper proposes a classification of the problem types that occur in requirements elicitation, derived from a literature analysis: papers reporting on techniques for improving requirements elicitation practice were examined for the problem each technique was designed to address. For each class of problem, the most recent or prominent techniques for ameliorating it are presented. The classification allows the requirements engineer to be sensitive to problems as they arise and the educator to structure the delivery of requirements elicitation training.
Benefits of Employing a Personal Response System in a Decision Analysis Course (published 2016-05-21)
This paper describes the use of a Personal Response System (PRS) in a Decision Analysis course for Management Information Systems (MIS) students. It shows how carefully designed PRS-based questions, their delivery, and the follow-up discussions provided a context for eliciting and exercising central concepts of the course topics as well as central skills required of MIS majors. A sample of PRS-based questions is presented, along with a description of each question's purpose, the way it was delivered, the response rate, the responses and their frequencies, and the respective in-class discussion. Lessons from these findings are discussed.
Executive Higher Education Doctoral Programs in the United States: A Demographic Market-Based Analysis (published 2017-04-22)
Aim/Purpose: Executive doctoral programs in higher education are under-researched. Scholars, administrators, and students should be aware of all common delivery methods for higher education graduate programs.
Background: This paper provides a review and analysis of executive doctoral higher education programs in the United States.
Methodology: Executive higher education doctoral programs were analyzed using a qualitative, demographic, market-based analysis approach.
Contribution: This review of executive higher education doctoral programs provides one of the first investigations of this segment of the higher education degree market.
Findings: Twelve programs in the United States offer executive higher education degrees, though there are also less aggressively marketed programs, described as executive-style higher education doctoral programs, that could serve students with similar needs.
Recommendations for Practitioners: Successful executive higher education doctoral programs require faculty who have both theoretical knowledge and practical experience in higher education. As appropriate, these programs should include tenure-line, clinical-track, and adjunct faculty who have cabinet-level experience in higher education.
Recommendation for Researchers: Researchers should begin to investigate more closely the small but growing population of executive doctoral degree programs in higher education.
Impact on Society: Institutions willing to offer executive degrees in higher education will provide training specifically for those faculty who are one step from an executive position within the higher education sector. Society will benefit from practitioners who are trained in the area and also have real-world experience.
Future Research: Case studies of students enrolled in executive higher education programs and research documenting university-employer goals for these programs would enhance our understanding of this branch of the higher education degree market.
An Analytical Investigation of the Characteristics of the Dropout Students in Higher Education (published 2018-05-18)
Aim/Purpose: Student dropout in higher education institutions is a universal problem. This study identifies the characteristics of dropouts and develops a mathematical model to predict students who may drop out.
Methodology: The sample includes 555 freshmen at a non-profit private university. The study uses both descriptive statistics, such as cross-tabulation, and a binary regression model to predict student dropout.
Contribution: The paper makes two major contributions. First, it identifies the dropout rate of each group, a finding that may be used to better allocate resources at higher education institutions. Second, it develops a predictive model that may be used to estimate the probability of a student dropping out and to take preventive action.
Findings: This study compared dropout rates over one and a half years of enrollment among traditional undergraduate students. Two major findings are: (1) some of the resources designed to assist students are misallocated, and (2) predictive models can be used to calculate the probability of a student dropping out.
Recommendations for Practitioners: The study recommends that institutions create initiatives to assist freshman students and conduct annual assessments to measure the success of these initiatives.
Recommendation for Researchers: Mathematical models may be used to predict dropout rates; the paper includes a model that predicted with 66.6% accuracy which students would drop out.
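The binary (logistic) regression approach described above can be sketched in miniature. The predictors (`gpa`, `credits_attempted`) and the coefficient values below are hypothetical, invented purely for illustration; the study's actual features and fitted coefficients are not reproduced here.

```python
import math

def dropout_probability(gpa, credits_attempted, coefficients):
    """Logistic (binary) regression sketch: estimate the probability that
    a freshman drops out from a linear combination of predictors passed
    through the sigmoid function."""
    b0, b_gpa, b_credits = coefficients
    z = b0 + b_gpa * gpa + b_credits * credits_attempted
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients for illustration only (not from the paper):
# a higher GPA and more attempted credits both lower the dropout odds.
coefs = (2.0, -1.2, -0.05)
p = dropout_probability(gpa=3.5, credits_attempted=15, coefficients=coefs)
print(round(p, 3))  # a low-risk student: probability near 0.05
```

In practice an institution could flag students whose predicted probability exceeds a chosen threshold and direct advising resources to them, which is the kind of targeted allocation the paper recommends.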
The Competencies Required for the BPA Role: An Analysis of the Kenyan Context (published 2019-05-03)
Aim/Purpose: This study answers the research question "What are the competencies required for the Business Process Analyst (BPA) role in organizations with ERP systems in Kenya?" Through four hypotheses, the study focuses on two specific aspects: (1) enhancing BPM maturity and (2) ERP implementation.
Background: The emergence of complex systems and complex processes in organizations in Kenya has created a need to understand the BPM domain, as well as a need to analyze the new roles within organizational environments that drive BPM initiatives. The most notable role in this domain is the BPA. Furthermore, many organizations in Kenya and across Africa are making significant investments in ERP systems, and organizations therefore need to understand the BPA role for ERP systems implementation projects.
Methodology: This study uses a sequential mixed-methods approach, analyzing quantitative survey data followed by qualitative interview data.
Contribution: The main contribution of this study is a description of the competencies that are critical for the BPA in Kenya, both for enhancing BPM maturity and for driving ERP systems implementations. In addition, this study sheds light on critical BPA competencies that are perceived to be undervalued in the Kenyan context.
Findings: Business process orchestration competencies are important for driving BPM maturity and for ERP systems implementations. The study found that business process elicitation, business analysis, business process improvement, and a holistic overview of business thinking are often overlooked as critical competencies for BPAs but are nevertheless critical for building the BPA practitioner.
Recommendations for Practitioners: Practitioners such as top managers and BPAs can learn from this study which specific competencies require focus when carrying out BPM initiatives and when implementing ERP systems projects.
Future Research: The next step is to investigate the interventions that organizations implement to build their BPA competencies, with the aim of describing those interventions that affect the requisite BPA competencies, especially those seen as undervalued within the Kenyan context.
Agile Requirements Engineering: An Empirical Analysis and Evidence from a Tertiary Education Context (published 2019-04-16)
Aim/Purpose: The study describes empirical research into agile Requirements Engineering (RE) practices based on an analysis of data collected in a large higher education organization.
Background: RE in agile development contexts differs considerably from traditional software development. The field of agile RE is still nascent, and its impact needs to be evaluated in real-world settings.
Methodology: Using a case study methodology, the study involved interviewing nine experienced software practitioners who reflected on the use and implementation of various agile RE practices in two software development projects of a student management system.
Contribution: The primary contribution of the paper is the evaluation of agile RE practices in a large tertiary educational organization. Based on the analysis of the data, it provides valuable insights into the practice of agile RE in a specific context (i.e., education), including, just as importantly, the practices that were omitted or replaced with others, and why.
Findings: While the evolutionary and iterative approach to defining requirements was followed in general, not all agile practices could be fully adhered to in the case organization. Although face-to-face communication with customers is recognized as one of the most important agile RE practices, it was one of the most difficult to achieve with a large and diverse customer base. Addressing people issues (e.g., resistance to change, thinking, and mindset) was found to be a key driver of following the iterative RE process effectively. Contrary to the value-based approach advocated in the literature, requirements prioritization did not strictly adhere to a value-based approach. Continuous integration was perceived to be a more beneficial practice than prototyping, as it allows frequent integration of code and facilitates delivering working software when necessary.
Recommendations for Practitioners: Based on our empirical analysis, we provide specific recommendations for the effective implementation of agile RE practices. For example, our findings suggest that practitioners could address the challenges associated with limited face-to-face communication by producing flexible, accessible, electronic documentation to enable communication.
Recommendations for Researchers: Researchers can use the identified agile RE practices and their variants to perform in-depth investigations of agile requirements engineering in other educational contexts.
Impact on Society: A number of new technologies offer exciting opportunities that can be explored to maximize the benefits of agile and other requirements techniques.
Future Research: Future research could conduct case studies in different contexts and thus contribute to developing bundles or collections of practices to improve software development processes in specific contexts.
Egocentric Database Operations for Social and Economic Network Analysis
Factors Determining the Balance between Online and Face-to-Face Teaching: An Analysis using Actor-Network Theory
Analysis of Explanatory and Predictive Architectures and the Relevance in Explaining the Adoption of IT in SMEs
A Qualitative Descriptive Analysis of Collaboration Technology in the Navy (published 2015-10-27)
Collaboration technologies enable people to communicate and use information to make organizational decisions. The United States Navy refers to this concept as information dominance, and various collaboration technologies are used by the Navy to achieve this mission. This qualitative descriptive study objectively examined how a matrix-oriented Navy activity perceived an implemented collaboration technology, and these insights were used to determine whether the technology achieved a mission of information dominance. The study used six collaboration themes as a foundation: (a) cultural intelligence, (b) communication, (c) capability, (d) coordination, (e) cooperation, and (f) convergence. It was concluded that the collaboration technology was mostly perceived well and helped to achieve some level of information dominance. Improvement areas included bringing greater awareness to the collaboration technology, revamping the look and feel of the user interface, centrally paying for user and storage fees, incorporating more process management tools, strategically considering a Continuity of Operations plan, and incorporating additional industry best practices for data structures. Emerging themes of collaboration were collected to examine common patterns in the data; these included acceptance, awareness, search, scope, content, value, tools, system performance, implementation, training, support, usage, structure, complexity, approach, governance/configuration management/policy, and resourcing.
Does Usability Matter? An Analysis of the Impact of Usability on Technology Acceptance in ERP Settings (published 2016-11-08)
Although the field of management information systems, as both a sector and a discipline, has produced many guidelines and models, it has been slow to address the practical implications of interface usability, which can influence end users' attitude toward and intention to use IT. The purpose of this paper was to examine the interface usability of a popular Enterprise Resource Planning (ERP) software system, SAP, and to identify related issues and implications for the Technology Acceptance Model (TAM). A survey was conducted of 112 SAP ERP users from an organization in the heavy metal industry in Bangladesh, and the partial least squares technique was used to analyze the survey data. The findings empirically confirmed that interface usability has a significant impact on users' perceptions of usefulness and ease of use, which ultimately affect attitudes toward and intention to use the ERP software. The research model extends the TAM by incorporating three criteria of interface usability, and it is the first known study to investigate usability criteria as an extension of TAM.
Analogical Thinking for Generation of Innovative Ideas: An Exploratory Study of Influential Factors (published 2016-07-25)
Analogical thinking is one of the most effective tools for generating innovative ideas. It enables us to develop new ideas by transferring information from well-known domains and utilizing it in a novel domain. However, analogical thinking does not always yield appropriate ideas, and there is a lack of consensus among researchers regarding evaluation methods for assessing new ideas. Here, we define the appropriateness of generated ideas as having high structural and low superficial similarity with their source ideas. This study investigates the relationship between the thinking process and the appropriateness of ideas generated through analogical thinking. We conducted four workshops with 22 students to collect the data, and all generated ideas were assessed against this definition of appropriateness. The results show that participants who deliberate more before reaching the creative-leap stage, and those who engage in more trial and error when deciding the final domain of a new idea, have a greater chance of generating appropriate ideas. The findings suggest new strategies for designing workshops to enhance the appropriateness of new ideas.
Facilitating mCommerce Growth in Nigeria through mMoney Usage: A Preliminary Analysis (published 2016-05-29)
A general belief is that Mobile Money (mMoney) has the catalytic effect of spurring mCommerce growth and driving financial inclusion in developing nations like Nigeria. In Nigeria, mMoney is a new financial service innovation, and as a result critical issues surrounding its early critical-mass adoption, including its perceived usefulness, remain largely opaque. In this paper, our aim was to explore factors influencing the perceived usefulness of mMoney, using the extended technology acceptance model (TAM) as the theoretical underpinning of our work. The work is based on a usable sample of 127 respondents from two major cities in Nigeria. Overall, the results indicate that perceived regulator assurance, service affordability, convenience, proximity to the nearest bank branch, and worry over ease of use are significant predictors of mMoney's perceived usefulness. The work sheds new light on the significant factors closely related to consumers' perception of the relevance of mMoney services to their financial needs. In sum, the study is an initial step toward addressing the perceived usefulness of mMoney service, including its pivotal importance in laying a solid foundation for mCommerce growth in Nigeria and similar sub-Saharan African (SSA) countries.
A Thematic Analysis of Interdisciplinary Journal of Information, Knowledge, and Management (IJIKM) (published 2018-08-02)
Aim/Purpose: This study investigates the research profile of the papers published in the Interdisciplinary Journal of Information, Knowledge, and Management (IJIKM) to provide an outline of the journal for the editorial team, researchers, and the journal's audience.
Background: Information and knowledge management is an interdisciplinary subject, and IJIKM sits at the intersection of multiple disciplinary research communities.
Methodology: A quantitative categorical content analysis was used for the thematic analysis; 159 papers published since the journal's inauguration in 2006 were coded and analyzed.
Contribution: The study provides synopsized information about the interdisciplinary research profile of IJIKM and adds to the literature of information and knowledge management.
Findings: The analysis reveals that IJIKM disseminates research papers with a wide range of research themes. Among them, organizational issues of knowledge/information management, knowledge management systems/tools, information/knowledge sharing, technology for knowledge/information management, and information/knowledge application represent the five main research streams of IJIKM. The share of papers on organizational issues of knowledge/information management increased from 16% to 28% during the past six years. Statistical methods were the most common research methodology, and summarization was the most common research design applied in IJIKM papers. The paper also presents patterns of participant countries, keyword frequencies, and reference citations.
Recommendations for Practitioners: Innovation is the key to information and knowledge management; practitioners can share best practices with external sectors.
Recommendation for Researchers: Researchers can identify opportunities for cross-disciplinary research projects involving experts in business, education, government, healthcare, technology, and psychology to advance knowledge in information and knowledge management.
Impact on Society: Information and knowledge management is still a developing field, and readers of this paper can gain a better understanding of how its literature is disseminated across the relevant disciplines.
Future Research: A longitudinal study could follow up to provide updated, comparative information on the journal's research profile.
IDCUP Algorithm to Classifying Arbitrary Shapes and Densities for Center-based Clustering Performance Analysis (published 2020-05-04)
Aim/Purpose: Clustering techniques are normally used to determine significant and meaningful subclasses in datasets. Clustering is an unsupervised type of machine learning (ML) in which the objective is to form groups of objects based on their similarity, and it is used to determine the implicit relationships between different features of the data. Cluster analysis is a significant problem area in data exploration when dealing with arbitrary-shape problems in different datasets. Clustering on large data sets faces the following challenges: (1) clusters with arbitrary shapes; (2) limited knowledge discovery for deciding the possible input features; (3) scalability to large data sizes. Density-based clustering is known as a dominant method for determining arbitrary-shape clusters.
Background: Existing density-based clustering methods commonly cited in the literature were examined in terms of their behavior on data sets that contain nested clusters of varying density. The existing methods are not ideal for such data sets, because they typically partition the data into clusters that cannot be nested.
Methodology: A density-based approach to traditional center-based clustering is introduced that assigns a weight to each cluster. The weights are then utilized in calculating the distances from data vectors to centroids by multiplying the distance by the centroid's weight.
Contribution: In this paper, we examined different density-based clustering methods for data sets with nested clusters of varying density. Two such data sets were used to evaluate some of the commonly cited algorithms found in the literature. Nested clusters were found to be challenging for the existing algorithms: in most cases, they either did not detect the largest clusters or simply divided large clusters into non-overlapping regions. However, it may be possible to detect all clusters by performing multiple runs of an algorithm with different inputs and then combining the results. This work considered three challenges of clustering methods.
Findings: With the weighted distance, a center with a low weight attracts objects from further away than a centroid with a higher weight, which allows dense clusters inside larger clusters to be recognized. The methods were tested experimentally using the K-means, DBSCAN, TURN*, and IDCUP algorithms. The experimental results on different data sets showed that IDCUP is more robust and produces better clusters than DBSCAN, TURN*, and K-means on arbitrary-shape problems, and that IDCUP shows better scalability than TURN*.
Future Research: Future work will explore further challenges of the knowledge discovery process in clustering, along with more complex data sets. A hybrid approach based on density-based and model-based clustering algorithms should be compared to achieve maximum accuracy and avoid arbitrary-shape-related problems, including optimization. It is anticipated that such a process will attain improved performance with comparable precision in identifying cluster shapes.
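The weighted-distance idea in the methodology above can be sketched as follows. The centroid coordinates and weights are invented for illustration; IDCUP's actual procedure for deriving cluster weights is not reproduced here.

```python
import math

def weighted_distance(point, centroid, weight):
    """Multiply the Euclidean distance to a centroid by that centroid's
    weight: a low-weight (sparse) center then attracts points from
    further away than a high-weight (dense) one."""
    d = math.sqrt(sum((p - c) ** 2 for p, c in zip(point, centroid)))
    return d * weight

def assign_cluster(point, centroids, weights):
    # Assign the point to the centroid with the smallest weighted distance.
    scores = [weighted_distance(point, c, w) for c, w in zip(centroids, weights)]
    return scores.index(min(scores))

# Hypothetical setup: a dense cluster (high weight) near the origin nested
# inside a sparse cluster (low weight) centered at (5, 5).
centroids = [(0.0, 0.0), (5.0, 5.0)]
weights = [1.5, 0.5]
print(assign_cluster((1.0, 1.0), centroids, weights))  # 0: inside the dense core
print(assign_cluster((2.0, 2.0), centroids, weights))  # 1: raw distance favors centroid 0,
                                                       # but the low-weight sparse center wins
```

This is the mechanism that lets a dense inner cluster be recognized inside a larger, sparser one.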
A Novel Telecom Customer Churn Analysis System Based on RFM Model and Feature Importance Ranking (published 2023-10-03)
Aim/Purpose: This paper presents an RFM model-based telecom customer churn system for better predicting and analyzing customer churn.
Background: In the highly competitive telecom industry, customer churn is an important research topic in customer relationship management (CRM) for telecom companies that want to improve customer retention. Many researchers focus on telecom customer churn analysis systems that identify churn factors in order to improve prediction accuracy.
Methodology: The telecom customer churn analysis system consists of three main parts: customer segmentation, churn prediction, and churn factor identification. To segment the original dataset, we use the RFM model and the K-means algorithm with the elbow method. We then use RFM-based feature construction for customer churn prediction, and the XGBoost algorithm with the SHAP method to obtain a feature importance ranking. We chose an open-source customer churn dataset that contains 7,043 instances and 21 features.
Contribution: We present a novel system for churn analysis in telecom companies, encompassing customer churn prediction, customer segmentation, and churn factor analysis to enhance business strategies and services. The system leverages customer segmentation techniques for feature construction, which enables the new features to improve model performance significantly. Our experiments demonstrate that the proposed system outperforms current advanced customer churn prediction methods on the same dataset, with higher prediction accuracy. The results further demonstrate that the system can help telecom companies mine customer value from the features in a dataset, identify the primary factors contributing to customer churn, and propose suitable solution strategies.
Findings: Simulation results show that the K-means algorithm performs best when the original dataset is divided into four groups, so K is set to 4. The XGBoost algorithm achieves 79.3% and 81.05% accuracy on the original dataset and on the new RFM-based data, respectively. Additionally, each cluster has a unique feature importance ranking, allowing specialized strategies to be provided for each cluster. Overall, the system can help telecom companies implement effective CRM and marketing strategies to reduce customer churn.
Recommendations for Practitioners: More accurate churn prediction reduces the misjudgment of customer churn, and identifying churn factors makes it easier for companies to analyze the reasons for churn and formulate relevant retention strategies.
Recommendation for Researchers: The research achieves 81.05% accuracy for customer churn prediction with the XGBoost and RFM algorithms. We believe further enhancement of data preprocessing could be attempted for better prediction.
Impact on Society: This study proposes a more accurate and competitive customer churn system to help telecom companies retain local markets and reduce capital outflows.
Future Research: The approach is also applicable to other fields, such as education and banking; we will make further attempts based on this system.
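The RFM feature-construction step described above can be sketched as below. The transaction-log layout and field names are assumptions for illustration; the open-source churn dataset used in the paper has its own schema.

```python
from datetime import date

def rfm_features(transactions, today):
    """RFM sketch: derive Recency (days since last purchase), Frequency
    (number of purchases), and Monetary (total spend) per customer from
    (customer_id, transaction_date, amount) records."""
    state = {}
    for cust_id, tx_date, amount in transactions:
        rec = state.setdefault(cust_id, {"last": tx_date, "freq": 0, "mon": 0.0})
        rec["last"] = max(rec["last"], tx_date)  # keep the most recent purchase date
        rec["freq"] += 1
        rec["mon"] += amount
    return {
        cid: ((today - rec["last"]).days, rec["freq"], rec["mon"])
        for cid, rec in state.items()
    }

# Hypothetical transaction log for two customers.
txns = [
    ("c1", date(2023, 9, 1), 30.0),
    ("c1", date(2023, 9, 20), 45.0),
    ("c2", date(2023, 6, 5), 120.0),
]
print(rfm_features(txns, today=date(2023, 10, 1)))
```

Vectors like these would then be standardized and passed to K-means, with the elbow method choosing the number of segments (four, in the paper's experiments).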
Determinants of the Intention to Use Big Data Analytics in Banks and Insurance Companies: The Moderating Role of Managerial Support (published 2023-10-03)
Aim/Purpose: This paper proposes a comprehensive model that integrates the technology acceptance model (TAM) with the task-technology fit (TTF) model, information quality, security, trust, and managerial support to investigate the intended use of big data analytics (BDA) in banks and insurance companies.
Background: The emergence of "big data," prompted by the widespread use of connected devices and social media, has drawn the attention of many professionals and of financial institutions in particular, making it necessary to assess the determinants of behavioral intention to use big data analytics in banks and insurance companies.
Methodology: The integrated model was empirically assessed using self-administered questionnaires from 181 prospective big data analytics users in Moroccan banks and insurance firms, examined using partial least squares (PLS) structural equation modeling. The results cover sample characteristics, an analysis of the validity and reliability of the measurement models' variables, an evaluation of the proposed hypotheses, and a discussion of the findings.
Contribution: The paper contributes to the BDA adoption literature in the finance sector by combining the Technology Acceptance Model (TAM) with Task-Technology Fit (TTF) while underscoring the significance of information quality, trust, and managerial support, given their relevance in the finance domain, and by suggesting that BDA has potential applications beyond the finance sector.
Findings: TTF and trust have a considerable impact on the intention to use. Information quality positively affected perceived usefulness and ease of use, which in turn affected the intention to use. Moreover, managerial support moderates the relationship between perceived usefulness and the intention to use, whereas security did not affect the intention to use and managerial support did not moderate the influence of perceived ease of use.
Recommendations for Practitioners: The results suggest that financial institutions can improve their adoption decisions for big data analytics by understanding how users perceive it. Users are predisposed to use BDA if they presume it fits well with their tasks and is easy to use. The research also emphasizes the importance of information quality, managerial support, and collaboration across departments to fully leverage the potential of BDA.
Recommendation for Researchers: Further study may be done in other business sectors to confirm generalizability, and the same research design can be employed to assess BDA adoption in organizations at an advanced stage of big data utilization.
Impact on Society: The findings can help stakeholders of financial institutions at an early stage of big data exploitation understand how users perceive BDA technologies and how that perception influences their intention to use them.
Future Research: Future research could compare the moderating effect of managerial support on users with and without technical expertise; in addition, international studies across developed countries are needed to build a solid understanding of users' perceptions of BDA.
anal Content-Rating Consistency of Online Product Review and Its Impact on Helpfulness: A Fine-Grained Level Sentiment Analysis By Published On :: 2023-09-22 Aim/Purpose: The objective of this research is to investigate the effect of review consistency between textual content and rating on review helpfulness. A measure of review consistency is introduced to determine the degree to which the review sentiment of textual content conforms with the review rating score. A theoretical model grounded in signaling theory is adopted to explore how different variables (review sentiment, review rating, review length, and review rating variance) affect review consistency and the relationship between review consistency and review helpfulness. Background: Online reviews vary in their characteristics and hence in their quality and helpfulness. High-quality online reviews offer consumers the ability to make informed purchase decisions and improve trust in e-commerce websites. The helpfulness of online reviews continues to be a focal research issue, whether the effects of different factors are examined independently or jointly. This research posits that the consistency between review content and review rating is an important quality indicator affecting the helpfulness of online reviews. Review consistency is another important requirement for maintaining the significance and perceived value of online reviews. However, this parameter is inadequately discussed in the literature. A possible reason is that review consistency is not a review feature that can be readily monitored on e-commerce websites. Methodology: More than 100,000 product reviews were collected from Amazon.com and preprocessed using natural language processing tools. Then, the quality reviews were identified, and relevant features were extracted for model training. 
Machine learning and sentiment analysis techniques were implemented, and each review was assigned a consistency score between 0 (not consistent) and 1 (fully consistent). Finally, signaling theory was employed, and the derived data were analyzed to determine the effect of review consistency on review helpfulness, the effect of several factors on review consistency, and their relationship with review helpfulness. Contribution: This research contributes to the literature by introducing a mathematical measure to determine the consistency between the textual content of online reviews and their associated ratings. Furthermore, a theoretical model grounded in signaling theory was developed to investigate the effect on review helpfulness. This work can considerably extend the body of knowledge on the helpfulness of online reviews, with notable implications for research and practice. Findings: Empirical results have shown that review consistency significantly affects the perceived helpfulness of online reviews. The study similarly finds that review rating is an important factor affecting review consistency; it also confirms a moderating effect of review sentiment, review rating, review length, and review rating variance on the relationship between review consistency and review helpfulness. Overall, the findings reveal the following: (1) online reviews with textual content that correctly explains the associated rating tend to be more helpful; (2) reviews with extreme ratings are more likely to be consistent with their textual content; and (3) comparatively, review consistency more strongly affects the helpfulness of reviews with short textual content, positive polarity textual content, and lower rating scores and variance. Recommendations for Practitioners: E-commerce systems should incorporate a review consistency measure to rank consumer reviews and provide customers with quick and accurate access to the most helpful reviews. 
Impact on Society: Incorporating a review consistency score for online reviews can help consumers access the best reviews and make better purchase decisions, and help e-commerce systems improve their business, ultimately leading to more effective e-commerce. Future Research: Additional research should be conducted to test the impact of review consistency on helpfulness across different datasets, product types, and moderating variables. Full Article
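The study's central idea is a consistency score in [0, 1] between a review's textual sentiment and its star rating. A minimal sketch of that idea is shown below; the tiny word lexicon and the linear mapping from stars to [0, 1] are illustrative assumptions, not the authors' actual sentiment-analysis model:

```python
# Hypothetical lexicon standing in for a real sentiment model.
POSITIVE = {"great", "excellent", "love", "good", "helpful", "perfect"}
NEGATIVE = {"bad", "terrible", "broken", "poor", "awful", "disappointing"}

def text_sentiment(text: str) -> float:
    """Lexicon-based polarity in [0, 1]; 0.5 means neutral."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    if pos + neg == 0:
        return 0.5
    return pos / (pos + neg)

def consistency(text: str, rating: int) -> float:
    """1 = sentiment fully matches the 1-5 star rating, 0 = contradicts it."""
    rating_norm = (rating - 1) / 4          # map 1..5 stars onto [0, 1]
    return 1.0 - abs(text_sentiment(text) - rating_norm)

print(consistency("Excellent product, love it!", 5))   # fully consistent -> 1.0
print(consistency("Terrible, broken on arrival.", 5))  # text contradicts rating -> 0.0
```

A score like this could be used to rank reviews, as the paper recommends, surfacing reviews whose text actually explains their rating.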
anal Antecedents of Business Analytics Adoption and Impacts on Banks’ Performance: The Perspective of the TOE Framework and Resource-Based View By Published On :: 2023-09-18 Aim/Purpose: This study utilized a comprehensive framework to investigate the adoption of Business Analytics (BA) and its effects on performance in commercial banks in Jordan. The framework integrated the Technological-Organizational-Environmental (TOE) model, the Diffusion of Innovation (DOI) theory, and the Resource-Based View (RBV). Background: The recent trend of utilizing data for business operations and decision-making has positively impacted organizations. Business analytics (BA) is a leading technique that generates valuable insights from data. It has gained considerable attention from scholars and practitioners across various industries. However, organizations lack guidance on implementing BA effectively in their specific business contexts. This research aims to evaluate factors influencing BA adoption by Jordanian commercial banks and examine how its implementation impacts bank performance. The goal is to provide needed empirical evidence surrounding BA adoption and outcomes in the Jordanian banking sector. Methodology: The study gathered empirical data by conducting an online questionnaire survey with senior and middle managers from 13 commercial banks in Jordan. The participants were purposefully selected, and the questionnaire was designed based on relevant and well-established literature. A total of 307 valid questionnaires were collected and considered for data analysis. Contribution: This study makes a dual contribution to the BA domain. Firstly, it introduces a research model that comprehensively examines the factors that influence the adoption of BA. The proposed model integrates the TOE framework, DOI theory, and RBV theory. Combining these frameworks allows for a comprehensive examination of BA adoption in the banking industry. 
By analyzing the technological, organizational, and environmental factors through the TOE framework, understanding the diffusion process through the DOI theory, and assessing the role of resources and capabilities through the RBV theory, researchers and practitioners can better understand the complex dynamics involved. This integrated approach enables a more nuanced assessment of the factors that shape BA adoption and its subsequent impact on business performance within the banking industry. Secondly, it uncovers the effects of BA adoption on business performance. These noteworthy findings stem from a rigorous analysis of primary data collected from commercial banks in Jordan. By presenting a holistic model and delving into the implications for business performance, this research offers valuable insights to researchers and practitioners alike in the field of BA. Findings: The findings revealed that various technological (data quality, complexity, compatibility, relative advantage), organizational (top management support, organizational readiness), and environmental (external support) factors are crucial in shaping the decision to adopt BA. Furthermore, the study findings demonstrated a positive relationship between BA adoption and performance outcomes in Jordanian commercial banks. Recommendations for Practitioners: The findings suggest that Jordanian commercial banks should enforce data quality practices, provide clear standards, invest in data quality tools and technologies, and conduct regular data audits. Top management support is crucial for fostering a data-driven decision-making culture. Organizational readiness involves having the necessary resources and skilled personnel, as well as promoting continuous learning and improvement. Highlighting the benefits of BA helps overcome resistance to technological innovation and encourages adoption by demonstrating improved decision-making processes and operational efficiency. 
Furthermore, external support is crucial for banks to adopt BA. Banks should partner with experienced vendors to gain expertise and incorporate best practices. Vendors also provide training and technical support to overcome technological barriers. Compatibility is essential for optimal performance, requiring managers to modify workflows and IT infrastructure. Complexity, including data, organizational, and technical complexities, is a major obstacle to BA adoption. Banks should take a holistic approach, focusing on people, processes, and technology, and prioritize data quality and governance. Building a skilled team, fostering a data-driven culture, and investing in technology and infrastructure are essential. Recommendation for Researchers: The integration of the TOE framework, the DOI theory, and the RBV theory can prove to be a powerful approach for comprehensively analyzing the various factors that influence BA adoption within the dynamic banking industry. Furthermore, this combined framework enables us to gain deeper insights into the subsequent impact of BA adoption on overall business performance. Impact on Society: Examining the factors influencing BA adoption in the banking industry and its subsequent impact on business performance can have wide-ranging societal implications. It can promote data-driven decision-making, enhance customer experiences, strengthen fraud detection, foster financial inclusion, contribute to economic growth, and trigger discussions on ethical considerations. Future Research: To further advance future research, there are several avenues to consider. One option is to broaden the scope by including a larger sample size, allowing for a more comprehensive analysis. Another possibility is to investigate the impact of BA adoption on various performance indicators beyond the ones already examined. 
Additionally, incorporating qualitative research methods would provide a more holistic understanding of the organizational dynamics and challenges associated with the adoption of BA in Jordanian commercial banks. Full Article
anal Analysis of the Scale Types and Measurement Units in Enterprise Architecture (EA) Measurement By Published On :: 2023-05-21 Aim/Purpose: This study identifies the scale types and measurement units used in the measurement of enterprise architecture (EA) and analyzes the admissibility of the mathematical operations used. Background: The majority of measurement solutions proposed in the EA literature are based on researchers’ opinions, many with limited empirical validation and weak metrological properties. This means that the results generated by these solutions may not be reliable, trustworthy, or comparable, and may even lead to wrong investment decisions. While the literature proposes a number of EA measurement solutions, the designs of the mathematical operations used to measure EA have not yet been independently analyzed. It is imperative that the EA community works towards developing robust, reliable, and widely accepted measurement solutions. Only then can senior management make informed decisions about the allocation of resources for EA initiatives and ensure that their investment yields optimal results. Methodology: In previous research, we identified, through a systematic literature review, the EA measurement solutions proposed in the literature and classified them by EA entity types. In a subsequent study, we evaluated their metrology coverage from both a theoretical and empirical perspective. The metrology coverage was designed using a combination of the evaluation theory, best practices from the software measurement literature including the measurement context model, and representational theory of measurement to evaluate whether EA measurement solutions satisfy the metrology criteria. 
The research study reported here presents a more in-depth analysis of the mathematical operations within the proposed EA measurement solutions, and for each EA entity type, each mathematical operation used to measure EA was examined in terms of the scale types and measurement units of the inputs, their transformations through mathematical operations, the impact in terms of scale types, and measurement units of the proposed outputs. Contribution: This study adds to the body of knowledge on EA measurement by offering a metrology-based approach to analyze and design better EA measurement solutions that satisfy the validity of scale type transformations in mathematical operations and the use of explicit measurement units to allow measurement consistency for their usage in decision-making models. Findings: The findings from this study reveal that some important metrology and quantification issues have been overlooked in the design of EA measurement solutions proposed in the literature: a number of proposed EA mathematical operations produce numbers with unknown units and scale types, often the result of an aggregation of undetermined assumptions rather than explicit quantitative knowledge. The significance of such aggregation is uncertain, leading to numbers that have suffered information loss and lack clear meaning. It is also unclear if it is appropriate to add or multiply these numbers together. Such EA numbers are deemed to have low metrological quality and could potentially lead to incorrect decisions with serious and costly consequences. Recommendations for Practitioners: The results of the study provide valuable insights for professionals in the field of EA. Identifying the metrology limitations and weaknesses of existing EA measurement solutions may indicate, for instance, that practitioners should wait before using them until their design has been strengthened. 
In addition, practitioners can make informed choices and select solutions with a more robust metrology design. This, in turn, will benefit enterprise architects, software engineers, and other EA professionals in decision making, by enabling them to take factors such as cost, quality, risk, and value into consideration more adequately when assessing EA features. The study’s findings thus contribute to the development of more reliable and effective EA measurement solutions. Recommendation for Researchers: Researchers can more confidently use the EA measurement solutions with admissible mathematical operations and measurement units to develop new decision-making models. Other researchers can carry out research to address the weaknesses identified in this study and propose improved solutions. Impact on Society: Developers, architects, and managers may be making inappropriate decisions based on seriously flawed EA measurement solutions proposed in the literature, which provide undue confidence and waste resources through bad measurement design. Better quantitative tools will ultimately lead to better decision making in the EA domain, as in domains with a long history of rigor in the design of measurement tools. Such advancements will benefit enterprise architects, software engineers, and other practitioners, by providing them with more meaningful measurements for informed decision making. Future Research: While the analysis described in this study has been explicitly applied to evaluating EA measurement solutions, researchers and practitioners in other domains can also examine measurement solutions proposed in their respective domains and design new ones. Full Article
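The admissibility question at the heart of this study can be illustrated with a small check: the mean is meaningless for nominal data and debatable for ordinal data, while ratios require a true zero. The sketch below follows Stevens' classic scale typology; the rule table is a deliberate simplification of the paper's analysis:

```python
# Admissible statistics/operations per scale type (simplified rule table).
ADMISSIBLE = {
    "nominal":  {"mode"},
    "ordinal":  {"mode", "median"},
    "interval": {"mode", "median", "mean"},           # differences are meaningful
    "ratio":    {"mode", "median", "mean", "ratio"},  # true zero: ratios allowed
}

def check_operation(scale_type: str, operation: str) -> bool:
    """Return True when `operation` is admissible for data on `scale_type`."""
    return operation in ADMISSIBLE[scale_type]

# Averaging EA maturity levels (an ordinal scale) is not admissible:
print(check_operation("ordinal", "mean"))   # False
# Averaging costs in euros (a ratio scale) is:
print(check_operation("ratio", "mean"))     # True
```

A guard like this, applied to each input of a proposed EA measurement formula, flags exactly the kind of scale-type violations the study found in the literature.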
anal Employing Artificial Neural Networks and Multiple Discriminant Analysis to Evaluate the Impact of the COVID-19 Pandemic on the Financial Status of Jordanian Companies By Published On :: 2023-05-08 Aim/Purpose: This paper aims to empirically quantify the financial distress caused by the COVID-19 pandemic on companies listed on Amman Stock Exchange (ASE). The paper also aims to identify the most important predictors of financial distress pre- and mid-pandemic. Background: The COVID-19 pandemic has taken a huge toll, not only on human lives but also on many businesses. This provided the impetus to assess the impact of the pandemic on the financial status of Jordanian companies. Methodology: The initial sample comprised 165 companies, which was cleansed and reduced to 84 companies based on data availability. Financial data pertaining to the 84 companies were collected over a two-year period, 2019 and 2020, to empirically quantify the impact of the pandemic on companies in the dataset. Two approaches were employed. The first approach involved using Multiple Discriminant Analysis (MDA) based on Altman’s (1968) model to obtain the Z-score of each company over the investigation period. The second approach involved developing models using Artificial Neural Networks (ANNs) with 15 standard financial ratios to find out the most important variables in predicting financial distress and create an accurate Financial Distress Prediction (FDP) model. Contribution: This research contributes by providing a better understanding of how financial distress predictors perform during dynamic and risky times. The research confirmed that in spite of the negative impact of COVID-19 on the financial health of companies, the main predictors of financial distress remained relatively steadfast. This indicates that standard financial distress predictors can be regarded as being impervious to extraneous financial and/or health calamities. 
Findings: Results using MDA indicated that more than 63% of companies in the dataset have a lower Z-score in 2020 when compared to 2019. There was also an 8% increase in distressed companies in 2020, and around 6% of companies were no longer healthy. As for the models built using ANNs, results show that the most important variable in predicting financial distress is the Return on Capital. The predictive accuracy for the 2019 and 2020 models measured using the area under the Receiver Operating Characteristic (ROC) graph was 87.5% and 97.6%, respectively. Recommendations for Practitioners: Decision makers and top management are encouraged to focus on the most important ratios identified to make thoughtful decisions and initiate preemptive actions to avoid organizational failure. Recommendation for Researchers: This research can be considered a stepping stone to investigating the impact of COVID-19 on the financial status of companies. Researchers are recommended to replicate the methods used in this research across various business sectors to understand the financial dynamics of companies during uncertain times. Impact on Society: Stakeholders in Jordanian-listed companies should concentrate on the list of most important predictors of financial distress as presented in this study. Future Research: Future research may focus on expanding the scope of this study by including other geographical locations to check for the generalisability of the results. Future research may also include post-COVID-19 data to check for changes in results. Full Article
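The MDA step rests on Altman's (1968) Z-score, whose five ratios and coefficients for listed manufacturing firms are well known. A minimal sketch follows; the sample balance-sheet figures are invented for illustration:

```python
def altman_z(working_capital, retained_earnings, ebit,
             market_value_equity, sales, total_assets, total_liabilities):
    """Altman's (1968) Z-score for listed manufacturing firms."""
    x1 = working_capital / total_assets
    x2 = retained_earnings / total_assets
    x3 = ebit / total_assets
    x4 = market_value_equity / total_liabilities
    x5 = sales / total_assets
    return 1.2 * x1 + 1.4 * x2 + 3.3 * x3 + 0.6 * x4 + 1.0 * x5

def zone(z):
    """Classic cut-offs: distressed below 1.81, safe above 2.99."""
    if z < 1.81:
        return "distress"
    if z > 2.99:
        return "safe"
    return "grey"

# Invented balance-sheet figures for one hypothetical company:
z = altman_z(working_capital=50, retained_earnings=120, ebit=40,
             market_value_equity=300, sales=500, total_assets=400,
             total_liabilities=150)
print(round(z, 2), zone(z))  # 3.35 safe
```

Computing the score for each listed company in 2019 and 2020 and comparing the two, as the study does, shows how many firms slipped into the grey or distress zones during the pandemic.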
anal A New Model for Collecting, Storing, and Analyzing Big Data on Customer Feedback in the Tourism Industry By Published On :: 2023-05-07 Aim/Purpose: In this study, the research proposes and experiments with a new model of collecting, storing, and analyzing big data on customer feedback in the tourism industry. The research focused on the Vietnam market. Background: Big Data describes large databases that have been “silently” built by businesses, which include product information, customer information, customer feedback, etc. This information is valuable, and its volume increases rapidly over time, but businesses often pay little attention to it or store it in a scattered rather than centralized manner, thereby wasting an extremely large resource and limiting both business analysis and data analysis. Methodology: The study conducted an experiment by collecting customer feedback data in the field of tourism, especially tourism in Vietnam, from 2007 to 2022. After that, the research stored the collected data and mined latent topics from it using a topic model. The study applied cloud computing technology to build a collection and storage model to solve difficulties, including scalability, system stability, and system cost optimization, as well as ease of access to technology. Contribution: The research has four main contributions: (1) Building a model for Big Data collection, storage, and analysis; (2) Experimenting with the solution by collecting customer feedback data from huge platforms such as Booking.com, Agoda.com, and Phuot.vn based on cloud computing, focusing mainly on tourism in Vietnam; (3) A Data Lake that stores customer feedback and discussion in the field of tourism was built, supporting researchers in the field of natural language processing; (4) Experimenting with mining latent topics from the collected Big Data based on the topic model. 
Findings: Experimental results show that the Data Lake has helped users easily extract information, thereby supporting administrators in making quick and timely decisions. Next, PySpark big data processing technology and cloud computing help speed up processing, save costs, and make model building easier when moving to SaaS. Finally, the topic model helps identify customer discussion trends and identify latent topics that customers are interested in so business owners have a better picture of their potential customers and business. Recommendations for Practitioners: Empirical results show that facilities are the factor that customers in the Vietnamese market complain about the most in the tourism/hospitality sector. This information also recommends that practitioners reduce their expectations about facilities because the overall level of physical facilities in the Vietnamese market is still weak and cannot be compared with other countries in the world. However, this is also information to support administrators in planning to upgrade facilities in the long term. Recommendation for Researchers: The value of Data Lake has been proven by research. The study also formed a model for big data collection, storage, and analysis. Researchers can use the same model for other fields or use the model and algorithm proposed by this study to collect and store big data in other platforms and areas. Impact on Society: Collecting, storing, and analyzing big data in the tourism sector helps government strategists to identify tourism trends and communication crises. Based on that information, government managers will be able to make decisions and strategies to develop regional tourism, propose price levels, and support innovative programs. That is the great social value that this research brings. 
Future Research: For each different platform or website, the study had to build a query scenario and choose a different technology approach, which limits the solution’s scalability across multiple platforms. Future work will continue to build and standardize query scenarios and processing technologies to make scaling to other platforms easier. Full Article
anal Feature analytics of asthma severity levels for bioinformatics improvement using Gini importance By www.inderscience.com Published On :: 2024-11-08T23:20:50-05:00 In the context of asthma severity prediction, this study delves into the feature importance of various symptoms and demographic attributes. Leveraging a comprehensive dataset encompassing symptom occurrences across varying severity levels, this investigation employs visualisation techniques, such as stacked bar plots, to illustrate the distribution of symptomatology within different severity categories. Additionally, correlation coefficient analysis is applied to quantify the relationships between individual attributes and severity levels. Moreover, the study harnesses the power of random forest and the Gini importance methodology, essential tools in feature importance analytics, to discern the most influential predictors in asthma severity prediction. The experimental results bring to light compelling associations between certain symptoms, notably 'runny-nose' and 'nasal-congestion', and specific severity levels, elucidating their potential significance as pivotal predictive indicators. Conversely, demographic factors, encompassing age groups and gender, exhibit comparatively weaker correlations with symptomatology. These findings underscore the pivotal role of individual symptoms in characterising asthma severity, reinforcing the potential for feature importance analysis to enhance predictive models in the realm of asthma management and bioinformatics. Full Article
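The Gini-importance methodology named in this abstract is exposed by scikit-learn's random forest as `feature_importances_` (mean impurity decrease). A minimal sketch follows; the symptom data are synthetic stand-ins for the asthma dataset, constructed so that the two symptoms drive severity while the demographic feature is noise, mirroring the reported pattern:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 400
runny_nose = rng.integers(0, 2, n)
nasal_congestion = rng.integers(0, 2, n)
age_group = rng.integers(0, 4, n)        # weakly related demographic feature
# Severity (0-2) driven mainly by the two symptoms, plus a little noise.
severity = (runny_nose + nasal_congestion + (rng.random(n) < 0.1)).clip(0, 2)

X = np.column_stack([runny_nose, nasal_congestion, age_group])
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, severity)

# Gini importance: mean decrease in impurity across all trees.
for name, imp in zip(["runny-nose", "nasal-congestion", "age-group"],
                     model.feature_importances_):
    print(f"{name}: {imp:.3f}")
```

With data built this way, the symptom features receive much higher importance than the demographic one, which is the shape of result the paper reports for its real dataset.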
anal A Systems Engineering Analysis Method for the Development of Reusable Computer-Supported Learning Systems By Published On :: Full Article
anal Enterprise E-Learning Success Factors: An Analysis of Practitioners’ Perspective (with a Downturn Addendum) By Published On :: Full Article
anal Comparison of Online Learning Behaviors in School vs. at Home in Terms of Age and Gender Based on Log File Analysis By Published On :: Full Article
anal Drills, Games or Tests? Evaluating Students' Motivation in Different Online Learning Activities, Using Log File Analysis By Published On :: Full Article
anal Analyzing Associations between the Different Ratings Dimensions of the MERLOT Repository By Published On :: Full Article
anal A Study of Online Exams Procrastination Using Data Analytics Techniques By Published On :: Full Article
anal Design and Development of an E-Learning Environment for the Course of Electrical Circuit Analysis By Published On :: Full Article
anal Faculty Usage of Social Media and Mobile Devices: Analysis of Advantages and Concerns By Published On :: Full Article
anal Analyzing the Quality of Students Interaction in a Distance Learning Object-Oriented Programming Discipline By Published On :: 2015-07-29 Teaching object-oriented programming to students in a classroom environment demands well-thought-out didactic and pedagogical strategies in order to guarantee a good level of learning. Teaching it in a completely distance-learning environment (e-learning) may demand yet other strategies, besides those that the e-learning model of the Open University of Portugal dictates. This article analyses the behavior of students of the 1st cycle in Computer Science while interacting with the object-oriented programming (OOP) discipline available to them on the Moodle platform. By evaluating the level of interaction achieved in a selected group of relevant student actions, it is possible to identify their relevance to the success of the programming learning process. Data were extracted from Moodle, analyzed numerically, and, with the use of charts, behavior patterns of students were identified. This paper points out potential new approaches to be considered in e-learning in order to enhance programming learning results. It also confirms a high level of drop-out and a low level of interaction, and finds no clear correlation between students’ success and the number of online actions (especially in forums), which reveals a possible failure of the main pillar on which the e-learning model relies. Full Article
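The correlation check at the core of this paper, between a student's number of online actions (e.g., forum posts extracted from Moodle logs) and their final result, can be sketched in a few lines. The data below are invented; the paper found no clear correlation, which a coefficient near zero would reflect:

```python
import numpy as np

# Invented per-student counts of Moodle actions and final grades (0-20 scale).
actions = np.array([3, 42, 17, 0, 55, 8, 23, 11, 60, 5])
grades  = np.array([14, 9, 16, 0, 12, 15, 10, 18, 11, 13])

# Pearson correlation between activity and success.
r = np.corrcoef(actions, grades)[0, 1]
print(f"Pearson r = {r:.2f}")
```

On this toy data the coefficient is close to zero, illustrating the paper's finding that raw interaction counts are not a reliable proxy for learning success.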
anal Analyzing the Discourse of Chais Conferences for the Study of Innovation and Learning Technologies via a Data-Driven Approach By Published On :: 2016-12-26 Rapid technological change confronts researchers of learning technologies with the challenge of evaluating these technologies, predicting trends, and improving their adoption and diffusion. This study utilizes a data-driven discourse analysis approach, namely culturomics, to investigate changes over time in the research of learning technologies. The patterns and changes were examined on a corpus of articles published over the past decade (2006-2014) in the proceedings of Chais Conference for the Study of Innovation and Learning Technologies – the leading research conference on learning technologies in Israel. The exhaustive analysis of all the words in the corpus yielded an interesting finding: the most commonly used terms (e.g., pupil, teacher, student) and the most commonly used phrases (e.g., face-to-face) in the field of learning technologies reflect a pedagogical rather than a technological aspect of learning technologies. The study also demonstrates two cases of change over time in prominent themes, such as “Facebook” and “the National Information and Communication Technology (ICT) program”. Methodologically, this research demonstrates the effectiveness of a data-driven approach for identifying discourse trends over time. Full Article
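The culturomics-style trend analysis described here boils down to tracking a term's relative frequency per year across the corpus. A minimal sketch follows; the tiny per-year corpus is invented, standing in for the full Chais proceedings from 2006-2014:

```python
from collections import Counter

# Invented stand-in for per-year abstract text from the proceedings.
corpus = {
    2012: "learning with facebook groups facebook pedagogy",
    2013: "mobile learning and the national ict program",
    2014: "ict program evaluation teacher student ict",
}

def relative_frequency(term: str, text: str) -> float:
    """Share of tokens in `text` equal to `term`."""
    words = text.split()
    return Counter(words)[term] / len(words)

# Rising relative frequency of "ict" over the years signals a growing theme.
for year, text in sorted(corpus.items()):
    print(year, round(relative_frequency("ict", text), 3))
```

Plotting such per-year frequencies for terms like "Facebook" or "ICT program" is how the study identifies the two cases of thematic change it reports.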