Textual Data


Selected Abstracts


The Textual Criticism of Middle English Manuscript Traditions: A Survey of Critical Issues in the Interpretation of Textual Data

LITERATURE COMPASS (ELECTRONIC), Issue 6 2009
Gavin Cole
This essay surveys two broad issues that determine the use of textual data. The first is the underlying orientation towards the use of textual data and how this relates to critical evaluations of agency, authority and materiality. The essay surveys two broad orientations: (i) an essentially retrospective genetic orientation and (ii) an orientation which focuses on the phenomenon of change. Both approaches depend on the ability to distinguish original readings from scribal readings, identify genetic relationships and account for acts of horizontal transmission. With this in mind, the second issue with which this essay is concerned is the importance of critical interpretation in the categorisation of textual data. The essay argues that textual criticism is a practical demonstration of the difficulties of interpretation and that no textual data 'has any real evidential value until it has been interpreted' (Patterson 90). [source]


An evaluation of nursing practice models in the context of the severe acute respiratory syndrome epidemic in Hong Kong: a preliminary study

JOURNAL OF CLINICAL NURSING, Issue 6 2006
Engle Angela Chan PhD
Aim and objective. Like other health-care workers, Hong Kong nurses had their professional knowledge and skills seriously challenged during the SARS outbreak. Could current nursing practices support the care of SARS or SARS-like patients in the future? If not, alternative practices would be needed. To provide a preliminary understanding, this paper compares the conventional nursing delivery model with an alternative model in a simulated SARS ward, focusing on nurses' efficiency, infection control practices and views of the two models. Design and methods. This study was conducted in three phases. First, a baseline understanding of nursing practices was achieved through four workflow observations: in an eight-hour day, four research assistants observed nursing activities in the medical and fever wards. These data were used in the second phase to construct two sets of clinical vignettes pertaining to SARS patient care under the conventional and alternative practice models. The scripts were discussed with nine nurses of various ranks from the hospital under study for expert validation and input. In the third phase, nurse participants and patient actors enacted the vignettes in a simulated setting. Video-taped observations and four nurse participant interviews were employed. Observational data were analysed through descriptive statistics and independent t-tests. Textual data were coded and categorized for common meanings. Results. Conventional practice, according to the findings, consisted of cubicle nursing and named-nurse nursing. While the former reflected modified team and functional nursing, it did not confine patient care within a cubicle as its name suggests. The latter depicted a modified primary nursing approach in a team, with delegation of care. Preliminary findings concerning infection control and nurse satisfaction revealed that the alternative model had an advantage over the conventional one.
Relevance to clinical practice. This study's findings lay the foundation for clinical trials evaluating the gains in patient-care quality, cost-effectiveness and human resource management achievable by restructuring current nursing practices. [source]
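The abstract reports that observational data were analysed with independent t-tests but gives no figures. As a rough sketch of that analysis step, the following computes Welch's independent two-sample t statistic by hand; the per-shift counts and the `welch_t` helper are invented for illustration, not taken from the study.

```python
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's independent two-sample t statistic (unequal variances)."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)  # sample variances
    se = (va / na + vb / nb) ** 0.5                  # standard error of the difference
    return (mean(sample_a) - mean(sample_b)) / se

# Hypothetical per-shift counts of infection-control events under the two models
conventional = [12, 15, 11, 14, 13]
alternative = [18, 17, 20, 16, 19]
print(welch_t(conventional, alternative))  # negative t: alternative model scores higher
```

In practice the t statistic would be compared against a t distribution with Welch-Satterthwaite degrees of freedom to obtain a p-value, as a statistics package does automatically.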


A new data model for XML databases

INTELLIGENT SYSTEMS IN ACCOUNTING, FINANCE & MANAGEMENT, Issue 3 2002
Richard Ho
The widespread activity involving the Internet and the Web causes large amounts of electronic data to be generated every day. This includes, in particular, semi-structured textual data such as electronic documents, computer programs, log files, transaction records, literature citations, and emails. Storing and manipulating the data thus produced has proven difficult. As conventional DBMSs are not suitable for handling semi-structured data, there is a strong demand for systems capable of handling large volumes of complex data in an efficient and reliable way. The Extensible Markup Language (XML) provides such a solution. In this paper, we present the concept of a 'vertical view model' and its uses both as a mapping mechanism for converting complex XML data to relational database tables and as a standalone data model for storing complex XML data. Copyright © 2003 John Wiley & Sons, Ltd. [source]
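The abstract does not spell out the vertical view model's details. As a hedged illustration of the general idea behind such mappings, the sketch below flattens an XML document into (path, position, value) rows suitable for a single relational table; the `vertical_rows` function and its row layout are assumptions for illustration, not the paper's actual model.

```python
import xml.etree.ElementTree as ET

def vertical_rows(xml_text):
    """Flatten an XML document into (path, position, value) rows,
    one row per element -- a simplified path/value decomposition."""
    root = ET.fromstring(xml_text)
    rows = []

    def walk(elem, path, pos):
        current = f"{path}/{elem.tag}"
        text = (elem.text or "").strip()
        rows.append((current, pos, text))
        for i, child in enumerate(elem):
            walk(child, current, i)

    walk(root, "", 0)
    return rows

doc = "<order><item>book</item><item>pen</item></order>"
for row in vertical_rows(doc):
    print(row)
```

Each row could then be inserted into a table with columns (path, position, value), letting arbitrarily nested documents share one fixed relational schema.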


Using Multimedia to Introduce Young People to Public Art in Glasgow

INTERNATIONAL JOURNAL OF ART & DESIGN EDUCATION, Issue 3 2000
Glen Coutts
This paper is based on a presentation at the NSEAD/AAIAD Millennium Conference in Bristol, April 2000 and takes as its focus a recent multimedia publication, a CD-ROM entitled 'Scanning the City', commissioned by Glasgow 1999. The commission was to find effective ways for students in schools to interrogate the diverse urban fabric of Glasgow. The electronic revolution has shifted the paradigms of teaching and learning by creating the opportunity to engage interactively with visual and textual data in ways that permit investigation of the built environment at a number of levels of intensity. The paper explains the background to the CD-ROM, describes the design, content and theoretical underpinning of 'Scanning the City' and discusses ways it might be used in a variety of educational contexts. It concludes by looking forward to the next stages of the research, including a study of how young people and teachers are using the CD-ROM and other related multimedia publications. [source]


Patient subjective experience and satisfaction during the perioperative period in the day surgery setting: A systematic review

INTERNATIONAL JOURNAL OF NURSING PRACTICE, Issue 4 2006
Lenore Rhodes RN, BN (Hons)
This systematic review used the Joanna Briggs Institute Qualitative Assessment and Review Instrument to manage, appraise, analyse and synthesize textual data in order to present the best available information on how patients experience nursing interventions and care during the perioperative period in the day surgery setting. Significant findings that emerged from the review include the importance of pre-admission contact, the provision of relevant and specific education and information, improved communication skills, and the maintenance of patient privacy throughout the continuum of care. [source]


Qualitative Data Collection and Analysis Methods: The INSTINCT Trial

ACADEMIC EMERGENCY MEDICINE, Issue 11 2007
William J. Meurer MD
Patient care practices often lag behind current scientific evidence and professional guidelines. The failure of such knowledge translation (KT) efforts may reflect inadequate assessment and management of specific barriers confronting both physicians and patients at the point of treatment. Effective KT in this setting may benefit from the use of qualitative methods to identify and overcome these barriers. Qualitative methodology allows in-depth exploration of the barriers involved in adopting practice change and has been infrequently used in emergency medicine research. The authors describe the methodology for qualitative analysis within the INcreasing Stroke Treatment through INteractive behavioral Change Tactics (INSTINCT) trial, including processes for valid data collection and reliable analysis of the textual data from focus group and interview transcripts. INSTINCT is a 24-hospital, randomized, controlled study designed to evaluate a system-based barrier assessment and interactive educational intervention to increase appropriate tissue plasminogen activator (tPA) use in ischemic stroke. Intervention hospitals undergo baseline barrier assessment using both qualitative and quantitative (survey) techniques. Investigators obtain data on local barriers to tPA use, as well as information on local attitudes, knowledge, and beliefs regarding acute stroke treatment. Targeted groups at each site include emergency physicians, emergency nurses, neurologists, radiologists, and hospital administrators. Transcript analysis using NVivo7 with a predefined barrier taxonomy is described. This will provide both qualitative insight into thrombolytic use and the relative importance of specific barrier types at each site. The qualitative findings subsequently direct the form of professional education efforts and system interventions at treatment sites. [source]
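The trial codes transcripts in NVivo7 against a predefined barrier taxonomy; the taxonomy itself is not given in the abstract. As a minimal sketch of the general idea, the snippet below tags a transcript segment with every taxonomy category whose indicative keywords it contains; the categories, keywords, and the example segment are all invented for illustration, and real qualitative coding relies on researcher judgement rather than keyword matching.

```python
# Hypothetical barrier taxonomy: category -> indicative keywords
TAXONOMY = {
    "knowledge": ["guideline", "evidence", "dose"],
    "resources": ["staffing", "ct scanner", "time pressure"],
    "attitudes": ["risk", "liability", "hemorrhage"],
}

def code_segment(segment, taxonomy):
    """Return the taxonomy categories whose keywords appear in a transcript segment."""
    text = segment.lower()
    return sorted(cat for cat, words in taxonomy.items()
                  if any(w in text for w in words))

segment = "We worry about hemorrhage risk, and staffing at night is thin."
print(code_segment(segment, TAXONOMY))
```

Each coded segment can then be tallied per site to show which barrier types dominate, which is the kind of per-site weighting the abstract describes.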


The Needs and Benefits of Applying Textual Data Mining within the Product Development Process

QUALITY AND RELIABILITY ENGINEERING INTERNATIONAL, Issue 1 2004
Rakesh Menon
Abstract As a result of growing competition in recent years, new trends such as increased product complexity, changing customer requirements and shortening development time have emerged within the product development process (PDP). These trends have added more challenges to the already-difficult task of quality and reliability prediction and improvement, and have given rise to an increase in the number of unexpected events in the PDP. Traditional tools are only partially adequate to cover these unexpected events, so new tools are being sought to complement them. This paper investigates the use of one such tool, textual data mining, for the purpose of quality and reliability improvement. The motivation for this paper stems from the need to handle 'loosely structured textual data' within the product development process. Thus far, most studies on data mining within the PDP have focused on numerical databases. In this paper, the need for the study of textual databases is established. Possible areas within a generic PDP for consumer and professional products where textual data mining could be employed are highlighted. In addition, successful implementations of textual data mining within two large multi-national companies are presented. Copyright © 2003 John Wiley & Sons, Ltd. [source]
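The abstract does not describe a concrete mining technique. One common starting point for loosely structured field-failure text is simple term-frequency analysis over service records to surface recurring failure themes; the records, stopword list, and `term_counts` helper below are invented for illustration and are not drawn from the paper's case studies.

```python
import re
from collections import Counter

# Hypothetical free-text service records from a product's field-failure log
records = [
    "Unit fails to power on after firmware update",
    "Display flickers; power supply replaced",
    "Power supply overheats under sustained load",
    "Firmware update interrupted, unit bricked",
]

# Words too generic to signal a failure theme (illustrative list)
STOPWORDS = {"to", "on", "after", "under", "a", "the", "unit", "fails"}

def term_counts(texts, stopwords):
    """Count content words across records to surface recurring failure themes."""
    words = (w for t in texts for w in re.findall(r"[a-z]+", t.lower()))
    return Counter(w for w in words if w not in stopwords)

print(term_counts(records, STOPWORDS).most_common(3))
```

Even this crude count points an engineer at "power supply" and "firmware update" as candidate problem areas; real deployments would add phrase detection, domain thesauri and clustering on top of the same idea.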