Implementation of total quality management based on soft factor, In the 1980s, among practitioners, scholars and consultants of quality management, a new concept evolved under the name of Total Quality Management (TQM). The concept was strongly influenced by the experience of high-quality products from Japanese manufacturers and by what had been learned about Japanese approaches to quality management (Zandin, 2001). TQM is an approach for continuously improving the quality of goods and services delivered through the participation of individuals at all levels of an organization (Pfau, 1989). TQM is more than a program; it is a way of managing business for the whole organization. It is a holistic corporate philosophy comprising three fundamental principles: ‘Total’ as the participation of every person and every department; ‘Quality’ as meeting customer needs and expectations; and ‘Management’ as enabling conditions for total quality (Whyte & Witcher, 1992). Thus, TQM is defined as a comprehensive management philosophy which provides continuous improvement to all functions of an organization, and it is achieved when the subject of total quality is utilized from the acquisition of resources to customer service (Kaynak, 2003). TQM practices have been published extensively in measurement studies as well as in studies investigating the relationship of TQM practices with various dependent variables. TQM emphasizes that customer requirements and business goals are inseparable. It affirms an integrated management approach based on a set of techniques to achieve this objective. It requires cooperation among all parts of the organization and demands fundamental changes in every aspect of it. It also requires continuous improvement not only in product/service quality but also in all operations, creating an organizational quality culture (Yusuf et al., 2007). It is important to establish a positive TQM environment throughout the whole organization in order to implement TQM.
If every department and individual understands the needs, process and benefits of TQM, employees will accept the TQM philosophy and as a result will do their jobs more effectively (Yusuf et al., 2007). During the past 20 years, many authors have focused on TQM factors and dimensions. Ross (1993) explained TQM as a set of practices: continuous improvement, meeting customers’ requirements, reducing rework, increased employee involvement and teamwork, process redesign, competitive benchmarking, team-based problem-solving, constant measurement of results, and closer relationships with suppliers. In this respect, several researchers have suggested different soft and hard factors for TQM. Hard factors are related to techniques and tools such as statistical process control and problem-solving methods, while soft factors refer to the “management” part of TQM, which involves people, culture and improvement. Powell (1995) suggested 12 factors for TQM programs, as shown in Appendix A. Rahman and Bullock (2005) proposed a logical approach to study the relationships among soft TQM, hard TQM and organizational performance. Al-Marri et al. (2007) found 16 factors to be critical in the successful implementation of TQM in a service industry. They include top management support, strategy, continuous improvement, benchmarking, customer focus, quality department, quality system, human resource management, recognition and reward, problem analysis, quality service technologies, service design, employees, servicescapes, service culture and social responsibility. As mentioned earlier, the soft dimensions of TQM refer to the management perspective of TQM practices. In Powell’s framework, flexible manufacturing and measurement are related to the hard aspects, and the other factors refer to the soft aspects.
Barriers to implement green supply chain management in automobile industry using interpretive structural modeling technique, Along with the rapid change in the global manufacturing scenario, environmental and social issues are becoming more important in managing any business. Green Supply Chain Management (GSCM) is an approach to improve the performance of processes and products according to the requirements of environmental regulations (Hsu & Hu, 2008). GSCM has emerged in the last few years and covers all phases of a product’s life cycle, from the design, production and distribution phases to the use of products by end users and their disposal at the end of the product’s life cycle (Borade & Bansod, 2007). GSCM integrates environmental thinking (Gilbert, 2000) into Supply Chain Management (SCM). Customer awareness of the green practices adopted by organizations has also risen in India, so organizations need to focus on the utilization of energy and resources to make their supply chains environmentally sound. The objectives of this paper are to identify various barriers to implementing GSCM in the Indian automobile industry, to further identify the contextual relationships among the identified barriers, to classify these barriers according to their driving and dependence power, and finally to develop an ISM-based model of these barriers. ISM is a well-established methodology for identifying relationships among specific items which define a problem or an issue (Sage, 1977). Section 2 consists of a literature review of GSCM. Barriers to implementing GSCM relevant to the Indian automobile industry are identified and described in Section 3. A step-wise elaborated procedure of interpretive structural modeling of these barriers follows in Section 4. Section 5 helps readers deal with these barriers efficiently and effectively.
Conclusions, limitations of the study and the scope for future work are discussed in subsequent sections.
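The core ISM computations mentioned above can be illustrated with a small sketch. The barrier names and the influence relationships below are hypothetical placeholders, not taken from the paper: the sketch builds the final reachability matrix by transitive closure and then derives each barrier's driving power (row sum) and dependence power (column sum), the quantities used for MICMAC-style classification.

```python
# Illustrative ISM sketch with hypothetical barriers and relationships.
# reach[i][j] = 1 means barrier i influences barrier j; the diagonal is 1
# by convention (every barrier reaches itself).

barriers = ["Lack of top management commitment",
            "High cost of implementation",
            "Lack of green awareness",
            "Resistance to change"]

reach = [
    [1, 1, 1, 1],
    [0, 1, 0, 0],
    [0, 0, 1, 1],
    [0, 0, 0, 1],
]

n = len(reach)

# Transitive closure (Warshall's algorithm): if i reaches k and k reaches j,
# then i reaches j in the final reachability matrix.
for k in range(n):
    for i in range(n):
        for j in range(n):
            if reach[i][k] and reach[k][j]:
                reach[i][j] = 1

for i, name in enumerate(barriers):
    driving = sum(reach[i])                          # row sum
    dependence = sum(reach[j][i] for j in range(n))  # column sum
    print(f"{name}: driving={driving}, dependence={dependence}")
```

Barriers with high driving power and low dependence would sit at the bottom of the ISM hierarchy (root causes), while highly dependent barriers appear at the top.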
The costs of poor data quality, Data are used in almost all the activities of companies and constitute the basis for decisions at operational and strategic levels. Poor quality data can, therefore, have significantly negative impacts on the efficiency of an organization, while high quality data are often crucial to a company’s success (Madnick et al., 2004; Haug et al., 2009; Batini et al., 2009; Even & Shankaranarayanan, 2009). However, several industry expert surveys indicate that data quality is an area to which many companies seem not to give sufficient attention or do not know how to deal with efficiently (Marsh, 2005; Piprani & Ernst, 2008; Jing-hua et al., 2009). Vayghan et al. (2007) classify the data that most enterprises deal with into three categories: master data, transactional data, and historical data. Master data are defined as the basic characteristics of business entities, i.e. customers, products, employees, suppliers, etc. Thus, typically, master data are created once, used many times and do not change frequently (Knolmayer & Röthlin, 2006). Transaction data describe the relevant events in a company, i.e. orders, invoices, payments, deliveries, storage records, etc. Since transactions are based on master data, erroneous master data can have significant costs; e.g. an incorrectly priced item may imply that money is lost. In this context, Knolmayer and Röthlin (2006) argue that capturing and processing master data are error-prone activities where inappropriate information system architectures, insufficient coordination with business processes, inadequate software implementations or inattentive user behaviour may lead to disparate master data.
In spite of the importance of having correct and adequate data in a company, there seems to be a general agreement in the literature that poor quality data is a problem in many companies. In fact, much academic literature claims that poor quality business data constitute a significant cost factor for many companies, which is supported by findings from several surveys by industry experts (Marsh, 2005). On the other hand, Eppler and Helfert (2004) argue that although much literature claims that the costs of poor data quality are significant in many companies, only very few studies demonstrate how to identify, categorize and measure such costs (i.e. how to establish the causal links between poor data quality and monetary effects). This is supported by Kim and Choi (2003), who state: “There have been limited efforts to systematically understand the effects of low quality data. The efforts have been directed to investigating the effects of data errors on computer-based models such as neural networks, linear regression models, rule-based systems, etc.” and “In practice, low quality data can bring monetary damages to an organization in a variety of ways”. According to Kim (2002), the types of damage that low quality data can cause depend on the nature of the data, the nature of the use of the data, the types of responses (by the customers or citizens) to the damages, etc. As such, companies typically incur costs from two sides when speaking of master data quality. Firstly, companies incur costs when cleaning and ensuring high master data quality. Secondly, companies also incur costs for data that are not cleaned, as poor master data quality might lead to faulty managerial decision-making. The purpose of this paper is to provide a better understanding of the relationship between such costs.
To help determine the optimal data quality maintenance effort, the paper provides: (1) a definition of the optimal data maintenance effort; and (2) a classification of costs inflicted by poor quality data. In this context, the paper argues that there is a clear trade-off relationship between these two cost types and that the task facing companies is therefore to balance this trade-off. The remainder of the paper is organized as follows: First, literature on data quality is discussed in Section 2. Next, Section 3 proposes a model to determine the optimal data maintenance effort and a classification of the different types of costs inflicted by poor quality data. Section 4 presents a case study to illustrate the application of the proposition. The paper ends with a conclusion in Section 5.
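The trade-off described above can be made concrete with a minimal numerical sketch. The two cost functions below are assumed, illustrative forms (not taken from the paper): maintenance cost rises with the data quality effort e, the cost inflicted by poor quality data falls with e, and the optimal effort is the one that minimises their sum.

```python
# Minimal sketch of the maintenance-cost vs. poor-quality-cost trade-off.
# Both cost functions are hypothetical, chosen only to illustrate that
# total cost has an interior minimum between "no effort" and "maximum effort".

def maintenance_cost(e):
    # Cost of cleaning and ensuring master data quality: grows with effort.
    return 100.0 * e

def poor_quality_cost(e):
    # Cost inflicted by poor quality data (e.g. faulty decisions): shrinks
    # with effort, but is never fully eliminated.
    return 400.0 / (1.0 + e)

def total_cost(e):
    return maintenance_cost(e) + poor_quality_cost(e)

# Grid search over effort levels in [0, 5] for the minimum total cost.
efforts = [i / 100 for i in range(0, 501)]
optimal = min(efforts, key=total_cost)
print(f"optimal effort = {optimal:.2f}, total cost = {total_cost(optimal):.1f}")
```

With these assumed functions the minimum lies strictly between the extremes: spending nothing on maintenance leaves the full poor-quality cost, while maximal cleaning effort overshoots the savings it buys, which is exactly the balance the paper's model is meant to capture.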
Active learning in Operations Management: interactive multimedia software for teaching JIT/Lean Production, European higher education is increasingly becoming immersed in two realities that will without doubt have a bearing on its future over the coming decades. On the one hand, the impact that Information & Communication Technologies (ICT) are having on teaching methodology, the way teaching processes are conducted, and the fashion in which teachers relate to students. On the other hand, the legal framework that regulates university education in Europe is in the process of adapting to the European Higher Education Area (EHEA), instigated by the European Union. The Sorbonne (1998), Bologna (1999), Prague (2001), Berlin (2003), Bergen (2005), London (2007) and Leuven (2009) declarations all agree on the need to harmonise higher education in Europe. This harmonisation requires significant changes to be made to the university system, including the need to adapt teaching methodologies to the new challenges that are being faced. For example, the European Credit Transfer System (ECTS) is bringing about a decrease in the number of on-site class hours, which is affecting the teaching methodology to be used. This teaching philosophy is aimed at greater student autonomy and means a change from a teaching-based to a learning-based focus facilitated by the new technologies (European Commission, 2002). In other respects, there is also a patent need to respond to a growing demand for lifelong learning and distance learning. Going hand-in-hand with this is the need to cater for new student profiles, with a wide range of motivations, ages, interests and spaces. In these circumstances, we consider the development of new learning tools which incorporate the advantages provided by ICT to be an appropriate response to this need.
ICT are having a clear effect on teaching-learning models, considerably broadening the possibilities of further education to the extent that it becomes, in practical terms, permanent education. They are thus also having a strong impact on the teaching methodology, the teaching/learning process linkage and on the relationship between instructors and students. ICT-based learning tools are becoming very powerful tools for conveying information, as they are significantly changing communication between the actors in the teaching-learning process. Institutions responsible for further education are well aware of this phenomenon (NCET (National Council for Educational Technologies of UK) (1994), European Commission (1996), AACSB (American Assembly of Collegiate Schools of Business) (1997) and UNESCO (Andresen & Brink, 2002)). The interest shown by the European Union in the latest Framework Programmes is a faithful reflection of the importance that its member States are affording the study, promotion and incorporation of the new technologies in Higher Education. Therefore, continuing along the lines of earlier initiatives, the Fifth European Community Framework Programme covering Research, Technological Development and Demonstration Activities (1998-2002) proposed Information Society Technologies (IST) and a range of lines of action related to the use of ICT in teaching environments among its priority topics. These lines included the “development of tools, open platforms, advanced personalised-teaching systems and large-scale experiments to achieve the flexible university of the future”. Meanwhile, IST continued to be a priority topic in the 6th Framework Programme, with one of its prime objectives being the eEurope Action Plan. This initiative gave great importance to the “effective application of ICT in education and permanent learning”. For this reason the eLearning Programme (2004-2006) was also adopted, with one of its basic lines being “to develop virtual university campuses”.
The Seventh Framework Programme (2007-2013) also provides for research initiatives along these same lines in its “ICT Work Programme 2011-2012”. This is divided into eight Challenges that are considered to be of strategic interest for European society, one of which focuses on the use of ICT in teaching both in educational institutions, including universities, and in the workplace (Challenge 8 - ICT for Learning and Access to Cultural Resources). One of the aspects included is the promotion of projects linked to developing tools that enable creative, non-linear learning, and the use of ICT for continuous training and the creation of new learning models on the basis of these technologies.