Software Industry (software + industry)
Selected Abstracts

Bringing High Technology to Market: Successful Strategies Employed in the Worldwide Software Industry
THE JOURNAL OF PRODUCT INNOVATION MANAGEMENT, Issue 6 2006
Chris Easingwood
The launch stage can be critical for many new products, but particularly so for technology-intensive ones. This study examines this key stage in a high-tech sector: the worldwide computer software industry. Using a research instrument developed across a number of high-tech sectors, but adapted to the targeted sector, it describes a worldwide telephone-based survey of 300 organizations, resulting in 190 interviews, a response rate of 63%. It shows that five distinct and interpretable strategies are employed: (1) alliance strategy involves forming early strategic alliances as well as tactical alliances at the execution stage, together with the development of unique distribution channels; (2) targeted low risk attempts to reduce the risk of adoption among identified segments by producing versions of the product specifically customized to those segments; (3) low-price original equipment manufacturer (OEM) is the only price-driven strategy and combines low price with channel building to OEMs who are looking for attractive price-to-performance ratios; (4) broadly based market preparation is an early-stage strategy that concentrates on educating the market vis-à-vis the technology and developing channels; and (5) niche-based technological superiority uses a technologically superior product to dominate a niche and corresponds closely to the chasm-crossing strategy expounded by Moore and others. Regarding superior product performance, successful software companies first engage in a broadly based preparation of the market but switch to a targeted strategy at the subsequent positioning and execution stages, built around superior technological performance and reduced risk. A somewhat different mix of strategies is adopted when the objective is superior market development, namely opening up new markets, reaching new customers, and developing new product platforms. Again the mix includes broadly based market preparation, this time along with alliances; this strategy is very much about working with partners. The broadly based market preparation strategy is key for both objectives, is long term in nature, and avoids narrowly defined niches. It seems that starting broad and narrowing down, perhaps to a niche, only at a later stage when this is clearly the appropriate thing to do, pays dividends. [source]

Serious Games: Broadening Games Impact Beyond Entertainment
COMPUTER GRAPHICS FORUM, Issue 3 2007
Ben Sawyer
Computer and video games have for many years been an island of technology and design innovation, largely left to itself as the field morphed from a cottage business into a global media and software industry. While there have been pockets of derivative activity related to games and game technology, only in the last half-dozen years has there been a real movement toward exploiting this industry in many new and exciting ways. Today the general use of games and game technologies for purposes beyond entertainment is collectively referred to as serious games. The Serious Games Initiative was formed in 2002 and since its inception has been among a number of critical efforts that have helped open up the world and many disciplines to the ideas and innovations that may be sourced from the commercial, independent, and academic game fields.
This has been a person-by-person, project-by-project effort that has not only informed us about the potential of games but also about how to merge innovation and innovators from one discipline with those from another. In this talk we will explore the full gamut of the serious games field, identifying, beyond the obvious, how games and game technologies are being applied to problems in a wide array of areas, including healthcare, productivity, visualization, science, and of course training and education. Once a proper definition of serious games is established, the talk will focus on the current state of the field as it relates to the research and infrastructure issues that will determine whether serious games take hold as a major new practice or devolve into another trend of the moment, lost to history. [source]

Defining Expertise in Software Development While Doing Gender
GENDER, WORK & ORGANISATION, Issue 4 2007
Esther Ruiz Ben
The optimism regarding opportunities for women to enter the professionalization process in software development during the past years has not been fully realized, and the gender gap in Germany's information technology (IT) sector still persists. Women are almost completely unrepresented in the technical fields of the German software industry, particularly in small enterprises. In this article, I firstly offer an overview of the German IT sector's development and current status. Secondly, I discuss the construction of expertise and gendered meanings in the practice of software development and related implications for the enrolment of women in this field. Gender-stereotypical assumptions about expertise in the practice of software development, together with structural factors related to the lack of life–work balance programmes and the lack of internal training in most IT companies, contribute to organizational segregation. [source]

Competition and the Quality of Standard Form Contracts: The Case of Software License Agreements
JOURNAL OF EMPIRICAL LEGAL STUDIES, Issue 3 2008
Florencia Marotta-Wurgler
Standard form contracts are pervasive. Many legal academics believe that they are unfair. Some scholars and some courts have argued that sellers with market power or facing little competitive pressure may impose one-sided standard form terms that limit their obligations to consumers. This article uses a sample of 647 software license agreements drawn from many distinct segments of the software industry to empirically investigate the relationship between competitive conditions and the quality of standard form contracts. I find little evidence for the concern that firms with market power, as measured by market concentration or firm market share, require consumers to accept particularly one-sided terms; that is, firms in both concentrated and unconcentrated software market segments, and firms with high and low market share, offer similar terms to consumers. The results have implications for the judicial analysis of standard form contract enforceability. [source]

Relative Value Relevance of R&D Reporting: An International Comparison
JOURNAL OF INTERNATIONAL FINANCIAL MANAGEMENT & ACCOUNTING, Issue 2 2002
Ronald Zhao
This study examines the relative value relevance of R&D reporting in France, Germany, the UK and the USA. France and the UK allow conditional capitalization of R&D costs, whereas Germany and the USA (except for the software industry) require the full and immediate expensing of all R&D costs.
The relative value relevance of R&D reporting under these different R&D accounting standards is compared while controlling for the reporting environment. Test results suggest that the level of R&D reporting has a significant effect on the association of equity price with accounting earnings and book value. The reporting of total R&D costs provides additional information to accounting earnings and book value in Germany and the USA (expensing countries), and the allocation of R&D costs between capitalization and expense further increases the value relevance of R&D reporting in France and the UK (capitalizing countries), including firms in the US software industry. [source]

The Dynamic Influence of Social Capital on the International Growth of New Ventures
JOURNAL OF MANAGEMENT STUDIES, Issue 6 2010
Shameen Prashantham
This paper explores the origin, evolution, and appropriation of social capital by new ventures seeking international growth. Using longitudinal case studies in the software industry, we model the dynamic influence of social capital on new venture internationalization. We theorize that new ventures whose founders come from a globally connected environment, for instance through return migration or MNC experience, have higher stocks of initial social capital than others. We provide a nuanced analysis of the dynamic processes involved in the evolution of social capital, and highlight the mechanisms of decay and replenishment over time. Network learning plays a critical role in new ventures' ability to realize the potential contribution of social capital to international growth. [source]

Industry Event Participation and Network Brokerage among Entrepreneurial Ventures
JOURNAL OF MANAGEMENT STUDIES, Issue 4 2010
Wouter Stam
Despite the recognition that network brokerage is beneficial for entrepreneurial ventures, little is known about its antecedents. This study examines how participation in industry events (e.g. conferences) relates to entrepreneurs' brokerage positions in informal industry networks and how these positions, in turn, impact new venture performance. Using a unique dataset of 45 events and subsequent network relations among entrepreneurs from 90 firms in the open source software industry, results indicate that: (1) entrepreneurs who participated in heterogeneous events, or who bridged between events with few common participants, were more likely to be brokers; (2) the relationship between event bridging and brokerage was stronger for entrepreneurs with broader prior career experiences; and (3) network brokerage mediated the event participation–performance link. It appears that events may limit structural opportunities for brokerage and that individual differences matter for exploiting these opportunities. Overall, this study increases understanding of how and when particular networking behaviours are beneficial for entrepreneurs. [source]

Half a Century of Public Software Institutions: Open Source as a Solution to Hold-Up Problem
JOURNAL OF PUBLIC ECONOMIC THEORY, Issue 4 2010
Michael Schwarz
We argue that the intrinsic inefficiency of proprietary software has historically created a space for alternative institutions that provide software as a public good. We discuss several sources of such inefficiency, focusing on one that has not been described in the literature: the underinvestment due to fear of hold-up.
An inefficient hold-up occurs when a user of software must make complementary investments, when the return on such investments depends on the future cooperation of the software vendor, and when contracting about a future relationship with the software vendor is not feasible. We also consider how the nature of the production function of software makes software cheaper to develop when the code is open to the end users. Our framework explains why open source dominates certain sectors of the software industry (e.g., programming languages) while being almost non-existent in some other sectors (e.g., computer games). We then use our discussion of efficiency to examine the history of institutions for the provision of public software, from the early collaborative projects of the 1950s to the modern "open source" software institutions. We look at how such institutions have created a sustainable coalition for the provision of software as a public good by organizing diverse individual incentives, both altruistic and profit-seeking, providing open source products of tremendous commercial importance, which have come to dominate certain segments of the software industry. [source]

Test processes in software product evolution – a qualitative survey on the state of practice
JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 1 2003
Per Runeson
In order to understand the state of test process practices in the software industry, we have conducted a qualitative survey covering software development departments at 11 companies in Sweden of different sizes and application domains. The companies develop products in an evolutionary manner, which means that either new versions are released regularly or new product variants are released under new names. The survey was conducted through workshop and interview sessions, loosely guided by a questionnaire scheme. The main conclusions of the survey are that the documented development process is emphasized by larger organizations as a key asset, while smaller organizations tend to lean more on experienced people. Further, product evolution is performed primarily as new product variants for embedded systems, and as new versions for packaged software. Development is structured using incremental development or a daily-build approach; increments are used among more process-focused organizations, and daily builds are more frequently utilized in less process-focused organizations. Test automation is performed using scripts for products with a focus on functionality, and recorded data for products with a focus on non-functional properties. Test automation is an issue that most organizations want to improve; handling the legacy parts of the product and related documentation presents a common problem in improvement efforts for product evolution. Copyright © 2003 John Wiley & Sons, Ltd. [source]

Predicting the Number of Defects Remaining in Operational Software
NAVAL ENGINEERS JOURNAL, Issue 1 2001
P. J. Hartman, Ph.D.
Software is becoming increasingly critical to the Fleet as more and more commercial off-the-shelf (COTS) programs are being introduced in operating systems and applications. Program managers need to specify, contract, and manage the development and integration of software for warfare systems, condition-based monitoring, propulsion control, parts requisitions, and shipboard administration. The intention here is to describe the state of the art in Software Reliability Engineering (SRE) and defect prediction for commercial and military programs.
The information presented here is based on data from the commercial software industry and shipboard program development. The strengths and weaknesses of four failure models are compared using these cases. The Logarithmic Poisson Execution Time (LPET) model best fits the data and satisfies the fundamental principles of reliability theory. The paper presents the procedures for defining software failures, tracking defects, and making spreadsheet predictions of the defects still remaining in the software after it has been deployed. Rules of thumb for the number of defects in commercial software and the relative expense required to fix these errors are provided for perspective. [source]
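The Hartman abstract above names the Logarithmic Poisson Execution Time (LPET) model as the best fit for its failure data. As a hedged illustration of the kind of spreadsheet-style prediction the abstract describes, the sketch below uses the standard Musa-Okumoto parameterization of the LPET mean value function; the paper's actual procedure is not reproduced here, and the function names and parameter values are purely hypothetical.

```python
import math

def lpet_expected_failures(lam0: float, theta: float, tau: float) -> float:
    """Musa-Okumoto LPET mean value function: expected cumulative failures
    after tau units of execution time, mu(tau) = ln(lam0*theta*tau + 1) / theta."""
    return math.log(lam0 * theta * tau + 1.0) / theta

def expected_defects_remaining(lam0: float, theta: float,
                               tau_now: float, tau_horizon: float) -> float:
    """Expected failures still to surface between the current cumulative
    execution time (tau_now) and a chosen operational horizon (tau_horizon)."""
    return (lpet_expected_failures(lam0, theta, tau_horizon)
            - lpet_expected_failures(lam0, theta, tau_now))

# Hypothetical parameters: initial failure intensity lam0 = 0.05 failures per
# CPU-hour and decay parameter theta = 0.01, as if fitted from test-phase data.
print(expected_defects_remaining(lam0=0.05, theta=0.01,
                                 tau_now=1_000.0, tau_horizon=10_000.0))
```

Because the LPET mean value function grows without bound, "defects remaining" is expressed here relative to a finite planning horizon rather than as an absolute total.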
Skills shortages are not always what they seem: migration and the Irish software industry
NEW TECHNOLOGY, WORK AND EMPLOYMENT, Issue 1-2 2008
James Wickham
This paper argues that the skills shortage in the Irish software industry is socially produced by a range of domestic factors, especially the education and training system. It also contends that immigration reinforces rather than resolves skills shortages. [source]

Intellectual Property, Architecture, and the Management of Technological Transitions: Evidence from Microsoft Corporation
THE JOURNAL OF PRODUCT INNOVATION MANAGEMENT, Issue 3 2009
Alan MacCormack
Many studies highlight the challenges facing incumbent firms in responding effectively to major technological transitions. Though some authors argue that these challenges can be overcome by firms possessing what have been called dynamic capabilities, little work has described in detail the critical resources that these capabilities leverage or the processes through which these resources accumulate and evolve. This paper explores these issues through an in-depth exploratory case study of one firm that has demonstrated consistently strong performance in an industry that is highly dynamic and uncertain. The focus of the present study is Microsoft, the leading firm in the software industry. The focus on Microsoft is motivated by evidence that the firm's product performance has been consistently strong over a period in which there have been several major technological transitions – one indicator that a firm possesses dynamic capabilities. This argument is supported by showing that Microsoft's performance when developing new products in response to one of these transitions, the growth of the World Wide Web, was superior to that of a sample of both incumbents and new entrants. Qualitative data are presented on the roots of Microsoft's dynamic capabilities, focusing on the way that the firm develops, stores, and evolves its intellectual property. Specifically, Microsoft codifies knowledge in the form of software "components," which can be leveraged across multiple product lines over time and accessed by firms developing complementary products. The present paper argues that the process of componentization, the component "libraries" that result, the architectural frameworks that define how these components interact, and the processes through which these components are evolved to address environmental changes represent critical resources that enable the firm to respond to major technological transitions. These arguments are illustrated by describing Microsoft's response to two major technological transitions. [source]

Interoperability and Other Issues at the IP–Anti-trust Interface: The EU Microsoft Case
THE JOURNAL OF WORLD INTELLECTUAL PROPERTY, Issue 4 2008
Dr Duncan Curley
The judgment in 2007 of the Court of First Instance in Microsoft Corporation v European Commission was the culmination of one of the biggest anti-trust battles ever to have taken place in the European Union. Although most aspects of the European Commission's original decision of 2004 were upheld, the Microsoft case remains interesting at several levels. The judgment deals with the question of when it is permissible, in the public interest, to encroach upon the exclusivity of intellectual property rights-holders by requiring the grant of licences to third parties seeking to enter or remain on the market.
The case provides an illustration of Community policy objectives being implemented through the medium of the competition rules, namely the opening up of the software industry to more competition and the encouragement of innovation in information technology. It also illustrates the differing attitudes to the anti-trust regulation of unilateral conduct by companies with a dominant market position in Europe and the United States. [source]

Sequential innovation, patents, and imitation
THE RAND JOURNAL OF ECONOMICS, Issue 4 2009
James Bessen
We argue that when innovation is "sequential" (so that each successive invention builds in an essential way on its predecessors) and "complementary" (so that each potential innovator takes a different research line), patent protection is not as useful for encouraging innovation as it is in a static setting. Indeed, society, and even inventors themselves, may be better off without such protection. Furthermore, an inventor's prospective profit may actually be enhanced by competition and imitation. Our sequential model of innovation appears to explain evidence from a natural experiment in the software industry. [source]

Sequential Mergers with Differing Differentiation Levels
AUSTRALIAN ECONOMIC PAPERS, Issue 3 2009
Takeshi Ebina
We study sequential merger incentives in the presence of product differentiation. Two sets of firms each produce closely related goods within the set, whereas goods are more differentiated across the two sets. Merger incentives under product differentiation are found to be stronger for two firms producing closely related goods than for firms producing more differentiated goods. Also, after one merger, other firms are willing to follow with their own mergers, resulting in sequential mergers. This result is consistent with recent mergers in the video game software industry in Japan. [source]

Learning and Organization in the Knowledge-Based Information Economy: Initial Findings from a Participatory Action Research Case Study
BRITISH JOURNAL OF MANAGEMENT, Issue 2 2000
Richard T. Harrison
This paper reports on an ongoing, multiphase, project-based action learning and research project. In particular, it summarizes some aspects of the learning climate and outcomes for a case-study company in the software industry. Using a participatory action research approach, the learning company framework developed by Pedler et al. (1997) is used to initiate critical reflection in the company at three levels: managing director, senior management team, and technical and professional staff. As such, this is one of the first systematic attempts to apply this framework to the entire organization and to a company in the knowledge-based learning economy. Two sets of issues are of general concern to the company: internal issues surrounding the company's reward and recognition policies and practices, and the provision of accounting and control information in a business-relevant way to all levels of staff; and external issues concerning the extent to which the company and its members actively learn from other companies and effectively capture, disseminate and use information accessed by staff in boundary-spanning roles. The paper concludes with some illustrations of changes being introduced by the company as a result of the feedback on and discussion of these issues. [source]