Event Data

Selected Abstracts


Semiparametric Analysis for Recurrent Event Data with Time-Dependent Covariates and Informative Censoring

BIOMETRICS, Issue 1 2010
C.-Y. Huang
Summary Recurrent event data analyses are usually conducted under the assumption that the censoring time is independent of the recurrent event process. In many applications the censoring time can be informative about the underlying recurrent event process, especially in situations where a correlated failure event could potentially terminate the observation of recurrent events. In this article, we consider a semiparametric model of recurrent event data that allows correlation between the censoring time and the recurrent event process via a shared frailty. This flexible framework incorporates both time-dependent and time-independent covariates in the formulation, while leaving the distributions of the frailty and censoring times unspecified. We propose a novel semiparametric inference procedure that depends on neither the frailty nor the censoring time distribution. Large sample properties of the regression parameter estimates and the estimated baseline cumulative intensity functions are studied. Numerical studies demonstrate that the proposed methodology performs well for realistic sample sizes. An analysis of hospitalization data for patients in an AIDS cohort study is presented to illustrate the proposed method. [source]
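
Editor's note: as an illustration of the data structure this abstract describes, below is a minimal Python sketch of recurrent events with informative censoring induced by a shared gamma frailty. The gamma/exponential forms and all parameter values are assumptions for illustration, not the paper's model or estimator.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_subject(beta=0.5, max_follow_up=5.0):
    """Simulate one subject: a shared gamma frailty z inflates both the
    recurrent-event rate and the hazard of censoring, so the censoring time
    is informative about the event process (illustrative assumption)."""
    z = rng.gamma(shape=2.0, scale=0.5)            # frailty, E[z] = 1
    x = rng.binomial(1, 0.5)                       # a time-independent covariate
    rate = z * np.exp(beta * x)                    # recurrent-event intensity
    censor = min(rng.exponential(scale=2.0 / z), max_follow_up)
    # given z and x, events follow a homogeneous Poisson process on [0, censor]
    n = rng.poisson(rate * censor)
    times = np.sort(rng.uniform(0.0, censor, size=n))
    return x, censor, times

x, c, t = simulate_subject()
print(f"covariate={x}, censored at {c:.2f}, event times={np.round(t, 2)}")
```

Subjects with a large frailty both accumulate events faster and drop out earlier, which is exactly the dependence that biases analyses assuming independent censoring.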


Spatial Multistate Transitional Models for Longitudinal Event Data

BIOMETRICS, Issue 1 2008
F. S. Nathoo
Summary Follow-up medical studies often collect longitudinal data on patients. Multistate transitional models are useful for analysis in such studies where, at any point in time, individuals may be said to occupy one of a discrete set of states and interest centers on the transition process between states. For example, states may refer to the number of recurrences of an event, or the stage of a disease. We develop a hierarchical modeling framework for the analysis of such longitudinal data when the processes corresponding to different subjects may be correlated spatially over a region. Continuous-time Markov chains incorporating spatially correlated random effects are introduced. Here, joint modeling of both spatial dependence and dependence among the different transition rates is required, and a multivariate spatial approach is employed. A proportional intensities frailty model is developed where baseline intensity functions are modeled using parametric Weibull forms, piecewise-exponential formulations, and flexible representations based on cubic B-splines. The methodology is developed within the context of a study examining invasive cardiac procedures in Quebec. We consider patients admitted for acute coronary syndrome throughout the 139 local health units of the province and examine readmission and mortality rates over a 4-year period. [source]


A reference model for grid architectures and its validation

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 11 2010
Wil van der Aalst
Abstract Compute- and data-intensive applications in physics, medicine, biology, graphics, and business intelligence require large and distributed infrastructures to address the challenges of the present and the future. For example, process mining applications are faced with terabytes of event data and computationally expensive algorithms. Computer grids are increasingly being used to deal with such challenges. However, grid computing is often approached in an ad hoc and engineering-like manner. Despite the availability of many software packages for grid applications, a good conceptual model of the grid is missing. This paper provides a formal description of the grid in terms of a colored Petri net (CPN). This CPN can be seen as a reference model for grids as it clarifies the basic concepts at the conceptual level. Moreover, the CPN allows for various kinds of analyses, ranging from verification to performance analysis. We validate our model with real-life experiments using a testbed grid architecture available in our group, and we show how the model can be used to estimate throughput times for scientific workflows. Copyright © 2009 John Wiley & Sons, Ltd. [source]


Comparison of three expert elicitation methods for logistic regression on predicting the presence of the threatened brush-tailed rock-wallaby Petrogale penicillata

ENVIRONMETRICS, Issue 4 2009
Rebecca A. O'Leary
Abstract Numerous expert elicitation methods have been suggested for generalised linear models (GLMs). This paper compares three relatively new approaches to eliciting expert knowledge in a form suitable for Bayesian logistic regression. These methods were trialled on two experts in order to model the habitat suitability of the threatened Australian brush-tailed rock-wallaby (Petrogale penicillata). The first elicitation approach is a geographically assisted indirect predictive method with a geographic information system (GIS) interface. The second approach is a predictive indirect method which uses an interactive graphical tool. The third method uses a questionnaire to elicit expert knowledge directly about the impact of a habitat variable on the response. Two variables (slope and aspect) are used to examine the prior and posterior distributions of the three methods. The results indicate that there are some similarities and dissimilarities between the expert-informed priors of the two experts formulated from the different approaches. The choice of elicitation method depends on the statistical knowledge of the expert, their mapping skills, time constraints, access to experts and the funding available. This trial reveals that expert knowledge can be important when modelling rare event data, such as threatened species, because experts can provide additional information that may not be represented in the dataset. However, care must be taken with the way in which this information is elicited and formulated. Copyright © 2008 John Wiley & Sons, Ltd. [source]


Evaluation of putative hazard tests under background risk heterogeneity

ENVIRONMETRICS, Issue 3 2009
Yuan Liu
Abstract In this paper, an evaluation of some tests for adverse health effects around fixed locations (putative hazard sites) is made. We address the situation where a heterogeneous background is included, both in terms of correlated heterogeneity (CH) and uncorrelated heterogeneity (UH). In addition, we examine a set of tests for case event data (residential case addresses) including both distance and directional effects. The tests include distance and directional score tests (DIR), integrated intensity tests, the Besag & Newell (BES) test, and Kulldorff's focused-clustering scan test. The Monte Carlo power of the tests is evaluated and power curves are presented. A notable result is the lack of power found when CH is present; no such loss is found with UH. The BES and scan tests have lower power than the integrated intensity tests and score tests in general. The integrated intensity test demonstrates its omnibus ability to detect either distance or directional effects. Copyright © 2008 John Wiley & Sons, Ltd. [source]
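
Editor's note: the Monte Carlo power evaluation mentioned here follows a generic recipe: simulate data under the alternative, apply the test, record the rejection rate. A hedged sketch follows, with a toy distance-based test standing in for the DIR, integrated-intensity, BES, and scan tests; all distributions and sample sizes are invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def monte_carlo_power(test, simulate_alternative, n_sims=2000, alpha=0.05):
    """Estimate power: fraction of simulated datasets where the test rejects."""
    rejections = sum(test(simulate_alternative()) < alpha for _ in range(n_sims))
    return rejections / n_sims

def simulate_alternative():
    # cases cluster nearer the putative source than controls (toy alternative)
    case_dist = rng.exponential(scale=1.0, size=50)
    ctrl_dist = rng.exponential(scale=1.5, size=50)
    return case_dist, ctrl_dist

def distance_test(data):
    # one-sided rank test on distances: a simple stand-in for a distance test
    cases, controls = data
    return stats.mannwhitneyu(cases, controls, alternative="less").pvalue

print(f"estimated power: {monte_carlo_power(distance_test, simulate_alternative):.2f}")
```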


Analysis of rain quality data from the South African interior

ENVIRONMETRICS, Issue 4 2002
Jacky Galpin
Abstract Rain acidity may be ascribed to emissions from power station stacks, as well as emissions from other industry, biomass burning, maritime influences, agricultural influences, etc. Rain quality data are available for 30 sites in the South African interior, some from as early as 1985 for up to 14 rainfall seasons, while others only have relatively short records. The article examines trends over time in the raw and volume-weighted concentrations of the parameters measured, separately for each of the sites for which sufficient data are available. The main thrust, however, is to examine the inter-relationship structure between the concentrations within each rain event (unweighted data), separately for each site, and to examine whether these inter-relationships have changed over time. The rain events at individual sites can be characterized by approximately eight combinations of rainfall parameters (or rain composition signatures), and these are common to all sites. Some sites will have more events from one signature than another, but there appear to be no signatures unique to a single site. Analysis via factor and cluster analysis, with a correspondence analysis of the results, also aids interpretation of the patterns. This spatio-temporal analysis, performed by pooling all rain event data, irrespective of site or time period, results in nine combinations of rainfall parameters being sufficient to characterize the rain events. The sites and rainfall seasons show patterns in these combinations of parameters, with some combinations appearing more frequently during certain rainfall seasons. In particular, the presence of the combination of low acetate and formate with high magnesium appears to be increasing in the later rainfall seasons, as does this combination together with calcium, sodium, chloride, potassium and fluoride. As expected, sites close together exhibit similar signatures. Copyright © 2002 John Wiley & Sons, Ltd. [source]


Problem Representation and Conflict Dynamics in the Middle East and Northern Ireland

FOREIGN POLICY ANALYSIS, Issue 3 2005
Donald A. Sylvan
In an effort to explain conflictual and cooperative actions in the Middle East and Northern Ireland at a time (1995–1999) when international structural factors were relatively constant, this article focuses on cognitive factors. Specifically, statements of leaders representing multiple actors in the Israeli–Palestinian and Northern Ireland relationships are examined. Texts from these leaders serve as data for the independent variable, problem representation. Hypotheses argue that the existence and centrality of actor problem representations can help explain behavior, in a manner that adds to more widely used scholarly explanations. More specifically, the article explores the relationship between leaders' "problem representations" and conflict between the entities they lead. Problem representation is measured in three dimensions using Foreign Broadcast Information Service (World News Connection) texts: (1) centrality of enemy image, (2) how inclusive/exclusive the leader is in describing the in-group and principal out-group, and (3) the key definition of the problem (coded in terms of three strategies: justice, governance, or threat). Conflict, the dependent variable, is measured in terms of KEDS-TABARI event data on deeds (not words) in the Israeli–Palestinian and Northern Ireland conflicts. Strikingly, the analysis finds that Israeli actions are strongly related to prior and current Palestinian leaders' problem representations, much more than to Israeli leaders' problem representations. Similarly, Palestinian actions are found to have a clear relationship with prior and current Israeli leaders' problem representations, much more than with Palestinian leaders' problem representations. These results are particularly strong when the problem representation is that of the overall political leadership on the "other" side. Additionally, in-group and out-group inclusivity are the most significant predictors of actions. For Northern Ireland, the same two themes prevail, although not as strongly: there is a clear statistical relationship between each side's problem representations and the other side's actions, stronger in fact than the relationship between their own side's representations and actions. Also, in-group and out-group inclusivity produce a strong statistical relationship with conflict and cooperation. Finally, results are compared with a "tit-for-tat" hypothesis, and found to embellish that hypothesis. [source]


Fair and Just Culture, Team Behavior, and Leadership Engagement: The Tools to Achieve High Reliability

HEALTH SERVICES RESEARCH, Issue 4p2 2006
Allan S. Frankel
Background. Disparate health care provider attitudes about autonomy, teamwork, and administrative operations have added to the complexity of health care delivery and are a central factor in medicine's unacceptably high rate of errors. Other industries have improved their reliability by applying innovative concepts to interpersonal relationships and administrative hierarchical structures (Chandler 1962). In the last 10 years the science of patient safety has become more sophisticated, with practical concepts identified and tested to improve the safety and reliability of care. Objective. Three initiatives stand out with regard to interpersonal relationships and the application of provider concerns to shape operational change: the development and implementation of Fair and Just Culture principles, the broad use of Teamwork Training and Communication, and tools like WalkRounds that promote the alignment of leadership and frontline provider perspectives through effective use of adverse event data and provider comments. Methods. Fair and Just Culture, Teamwork Training, and WalkRounds are described, and implementation examples provided. The argument is made that they must be systematically and consistently implemented in an integrated fashion. Conclusions. There are excellent examples of institutions applying Just Culture principles, Teamwork Training, and Leadership WalkRounds, but to date, they have not been comprehensively instituted in health care organizations in a cohesive and interdependent manner. To achieve reliability, organizations need to begin thinking about the relationship between these efforts and linking them conceptually. [source]


Preoperative chemoradiation versus radiation alone for stage II and III resectable rectal cancer: A systematic review and meta-analysis

INTERNATIONAL JOURNAL OF CANCER, Issue 12 2009
Wim Ceelen
Abstract Combining chemotherapy with preoperative radiotherapy (RT) has a sound radiobiological rationale. We performed a systematic review and meta-analysis of trials comparing preoperative RT with preoperative chemoradiation (CRT) in rectal cancer patients. The Cochrane Central Register of Controlled Trials, Web of Science, Embase and Medline (PubMed) were searched from 1975 until June 2007. Dichotomous parameters were summarized using the odds ratio, while time-to-event data were analyzed using the pooled hazard ratio for death. From the primary search result of 324 trials, 4 relevant randomized trials were identified. The addition of chemotherapy significantly increased grade III and IV acute toxicity (p = 0.002), while no differences were observed in postoperative morbidity or mortality. Preoperative CRT significantly increased the rate of pathological complete response (p < 0.001), although this did not translate into a higher sphincter preservation rate (p = 0.29). The local recurrence rate was significantly lower in the CRT group (p < 0.001). No statistically significant differences were observed in disease-free survival (p = 0.89) or overall survival (p = 0.79). Compared to preoperative RT alone, preoperative CRT improves local control in rectal cancer but is associated with more pronounced treatment-related toxicity. The addition of chemotherapy does not improve the sphincter preservation rate or long-term survival. Future trials should address improvements in the rate of distant metastasis and overall survival by incorporating more active chemotherapy. © 2008 UICC [source]
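
Editor's note: for the dichotomous endpoints, a pooled odds ratio of the kind reported here can be computed by inverse-variance (fixed-effect) weighting of the per-trial log odds ratios. A sketch with made-up 2×2 counts, not the review's data:

```python
import math

def pooled_odds_ratio(tables):
    """Fixed-effect pooling of odds ratios from 2x2 tables
    (a, b, c, d) = (events/CRT, no-events/CRT, events/RT, no-events/RT)."""
    num = den = 0.0
    for a, b, c, d in tables:
        log_or = math.log((a * d) / (b * c))
        var = 1/a + 1/b + 1/c + 1/d          # variance of the log odds ratio
        num += log_or / var                  # inverse-variance weighting
        den += 1 / var
    pooled = num / den
    se = math.sqrt(1 / den)
    ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))
    return math.exp(pooled), ci

# illustrative counts for two hypothetical trials
or_, (lo, hi) = pooled_odds_ratio([(12, 88, 25, 75), (8, 92, 20, 80)])
print(f"pooled OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```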


Follow-up of serious offender patients in the community: multiple methods of tracing

INTERNATIONAL JOURNAL OF METHODS IN PSYCHIATRIC RESEARCH, Issue 3 2002
Elizabeth Jamieson Lecturer
Abstract Longitudinal studies of people with mental disorder are important in understanding outcome and intervention effects, but attrition rates can be high. This study aimed to evaluate the use of multiple record sources to trace, over 12 years, a one-year discharge cohort of high-security hospital patients. Everyone leaving such a hospital in 1984 was traced until a census date of 31 December 1995. Data were collected from several national databases (Office for National Statistics (ONS), Home Office (HO) Offenders' Index, Police National Computer Records, the Electoral Roll) and by hand-searching responsible agency records (HO, National Health Service). Using all methods, only three of the 204 patients had no follow-up information. Home Office Mental Health Unit data were an excellent source, but only for people still under discharge restrictions (<50% after eight years). Sequential tracing of hospital placements for people never or no longer under such restrictions was laborious and also produced only group-specific yield. The best indicator of community residence was ONS information on general practitioner (GP/primary care) registration. The electoral roll was useful when other sources were exhausted. Follow-up of offenders/offender-patients has generally focused on event data, such as re-offending. People untraced by that method alone, however, are unlikely to be lost to follow-up on casting a wider records net. Using multiple records, attrition at the census was 38%, but, after certain assumptions, this was reduced to 5%. Copyright © 2002 Whurr Publishers Ltd. [source]


Nonstationary fault detection and diagnosis for multimode processes

AICHE JOURNAL, Issue 1 2010
Jialin Liu
Abstract Fault isolation based on data-driven approaches usually assumes that abnormal event data will form a new operating region, and measures the differences between normal and faulty states to identify the faulty variables. In practice, operators intervene in processes when they are aware of abnormalities occurring. The process behavior is then nonstationary while the operators are trying to bring it back to normal states. Therefore, the faulty variables have to be located in the first place, when the process leaves its normal operating regions. For an industrial process, multiple normal operating modes are common. On the basis of the assumption that the operating data follow a Gaussian distribution within an operating region, a Gaussian mixture model is employed to extract a series of operating modes from the historical process data. In this article, the local T² statistic and its normalized contribution chart are derived for detecting abnormalities early and isolating faulty variables. © 2009 American Institute of Chemical Engineers AIChE J, 2010 [source]
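
Editor's note: a minimal sketch of the two ingredients named in this abstract, using scikit-learn. A Gaussian mixture extracts operating modes from historical data, and a local T² statistic flags samples far from their most likely mode. The synthetic data, the Mahalanobis-distance form of T², and the chi-square control limit are common choices assumed here; the paper's exact statistic and contribution chart may differ.

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from scipy import stats

rng = np.random.default_rng(1)

# two synthetic operating modes standing in for historical process data
mode1 = rng.multivariate_normal([0, 0], [[1.0, 0.3], [0.3, 1.0]], size=500)
mode2 = rng.multivariate_normal([5, 5], [[0.5, 0.0], [0.0, 0.5]], size=500)
history = np.vstack([mode1, mode2])

gmm = GaussianMixture(n_components=2, covariance_type="full",
                      random_state=0).fit(history)

def local_t2(x):
    """Squared Mahalanobis distance from x to its most likely mode."""
    k = gmm.predict(x.reshape(1, -1))[0]
    diff = x - gmm.means_[k]
    return diff @ np.linalg.inv(gmm.covariances_[k]) @ diff

limit = stats.chi2.ppf(0.99, df=2)    # 99% control limit (chi-square approx.)
sample = np.array([2.5, 2.5])         # a point between the two modes
print(f"T2 = {local_t2(sample):.1f}, limit = {limit:.1f}, "
      f"fault = {local_t2(sample) > limit}")
```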


PRECIPITATION CHANGES FROM 1956 TO 1996 ON THE WALNUT GULCH EXPERIMENTAL WATERSHED

JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 1 2002
Mary H. Nichols
ABSTRACT: The climate of Southern Arizona is dominated by summer precipitation, which accounts for over 60 percent of the annual total. Summer and non-summer precipitation data from the USDA-ARS Walnut Gulch Experimental Watershed are analyzed to identify trends in precipitation characteristics from 1956 to 1996. During this period, annual precipitation increased. The annual precipitation increase can be attributed to an increase in precipitation during non-summer months, and is paralleled by an increase in the proportion of annual precipitation contributed during non-summer months. This finding is consistent with previously reported increases in non-summer precipitation in the southwestern United States. Detailed event data were analyzed to provide insight into the characteristics of precipitation events during this time period. Precipitation event data were characterized based on the number of events, event precipitation amount, 30-minute event intensity, and event duration. The trend in non-summer precipitation appears to be a result of increased event frequency, since the number of events increased during non-summer months although the average amount per event, average event intensity, and average event duration did not. During the summer "monsoon" season, the frequency of recorded precipitation events increased but the average precipitation amount per event decreased. Knowledge of precipitation trends and the characteristics of events that make up a precipitation time series is a critical first step in understanding and managing water resources in semiarid ecosystems. [source]
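
Editor's note: one simple way to screen an event characteristic for trend, in the spirit of the analysis described (the study's actual test may differ), is to regress a seasonal summary on year. The counts below are simulated:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
years = np.arange(1956, 1997)
# invented non-summer event counts per season, with an upward trend built in
counts = rng.poisson(10 + 0.08 * (years - 1956))

res = stats.linregress(years, counts)
print(f"slope = {res.slope:.3f} events/year, p = {res.pvalue:.3f}")
```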


Developing tools for the safety specification in risk management plans: lessons learned from a pilot project,

PHARMACOEPIDEMIOLOGY AND DRUG SAFETY, Issue 5 2008
Andrew J. P. Cooper BSc
Abstract Purpose. Following the adoption of the ICH E2E guideline, risk management plans (RMPs) defining the cumulative safety experience and identifying limitations in safety information are now required for marketing authorisation applications (MAAs). A collaborative research project was conducted to gain experience with tools for presenting and evaluating data in the safety specification. This paper presents those tools found to be useful and the lessons learned from their use. Methods. Archive data from a successful MAA were utilised. Methods were assessed for demonstrating the extent of clinical safety experience, evaluating the sensitivity of the clinical trial data to detect treatment differences and identifying safety signals from adverse event and laboratory data to define the extent of safety knowledge with the drug. Results. The extent of clinical safety experience was demonstrated by plots of patient exposure over time. Adverse event data were presented using dot plots, which display the percentages of patients with the events of interest, the odds ratio, and 95% confidence interval. Power and confidence interval plots were utilised for evaluating the sensitivity of the clinical database to detect treatment differences. Box and whisker plots were used to display laboratory data. Conclusions. This project enabled us to identify new evidence-based methods for presenting and evaluating clinical safety data. These methods represent an advance in the way safety data from clinical trials can be analysed and presented. This project emphasises the importance of early and comprehensive planning of the safety package, including evaluation of the use of epidemiology data. Copyright © 2008 John Wiley & Sons, Ltd. [source]
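
Editor's note: the power plots described can be produced with standard tools. As one hedged example, a power calculation for detecting a difference in adverse-event rates between arms, using statsmodels; the rates and exposure below are assumptions, not the project's data:

```python
from statsmodels.stats.proportion import proportion_effectsize
from statsmodels.stats.power import NormalIndPower

solver = NormalIndPower()
n_per_arm = 400                      # assumed exposure per treatment arm
p_placebo = 0.05                     # assumed background adverse-event rate
for p_drug in (0.07, 0.09, 0.11):    # assumed elevated rates on the drug
    es = proportion_effectsize(p_drug, p_placebo)
    power = solver.power(effect_size=es, nobs1=n_per_arm, alpha=0.05, ratio=1.0)
    print(f"true rate {p_drug:.0%}: power = {power:.2f}")
```

Plotting such values over a grid of assumed rates gives the power curve used to judge the sensitivity of the safety database.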


Conditional Lifetime Data Analysis Using the Limited Expected Value Function

QUALITY AND RELIABILITY ENGINEERING INTERNATIONAL, Issue 3 2004
John Quigley
Abstract Failure data, and event data more generally, are commonly highly censored, which limits the efficacy of many statistical analysis techniques. The limited expected value (LEV) function presents an alternative way of characterizing lifetime distributions. In essence the LEV provides a means of calculating a truncated mean time to failure (MTTF) (or mean time before failure (MTBF) if appropriate) that is adjusted at each of the censoring times, and so appears potentially suitable for dealing with censored data structures. In theory, the LEV has been defined for many standard distributions; however, its practical use is not well developed. This paper aims to extend the theory of the LEV for typical censoring structures to develop procedures that will assist in model identification as well as parameter estimation. Applications to typical event data are presented, and the use of the LEV is compared with a selection of existing lifetime distributional analyses based on some preliminary research. Copyright © 2004 John Wiley & Sons, Ltd. [source]
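
Editor's note: for a nonnegative lifetime X, the LEV at truncation time t is E[min(X, t)], which equals the integral of the survival function S(x) from 0 to t. A short sketch for an assumed Weibull lifetime, comparing the integral with an empirical estimate:

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

shape, scale = 1.5, 100.0                    # assumed Weibull parameters
dist = stats.weibull_min(shape, scale=scale)

def lev(t):
    """LEV(t) = E[min(X, t)] = integral of S(x) = 1 - F(x) over [0, t]."""
    value, _ = quad(dist.sf, 0.0, t)
    return value

rng = np.random.default_rng(7)
sample = dist.rvs(size=100_000, random_state=rng)
t = 80.0
print(f"LEV({t}) by integration: {lev(t):.2f}")
print(f"LEV({t}) empirical:      {np.minimum(sample, t).mean():.2f}")
```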


NHPP models for categorized software defects

APPLIED STOCHASTIC MODELS IN BUSINESS AND INDUSTRY, Issue 6 2005
Zhaohui Liu
Abstract We develop NHPP models to characterize categorized event data, with application to modelling the discovery process for categorized software defects. Conditioning on the total number of defects, multivariate models are proposed for modelling the defects by type. A latent vector autoregressive structure is used to characterize dependencies among the different types. We show how Bayesian inference can be achieved via MCMC procedures, with a posterior prediction-based L-measure used for model selection. The results are illustrated for defects of different types found during the System Test phase of a large operating system software development project. Copyright © 2005 John Wiley & Sons, Ltd. [source]
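
Editor's note: a sketch of the conditioning idea in this abstract: given the total NHPP count by time T, allocate defects across categories with a multinomial. The Goel-Okumoto mean value function m(t) = a(1 - e^(-bt)) and all parameter values are assumptions for illustration; the paper's multivariate latent-AR structure is richer.

```python
import numpy as np

rng = np.random.default_rng(3)

a, b, T = 120.0, 0.05, 40.0          # assumed: eventual defects, rate, test weeks
m_T = a * (1 - np.exp(-b * T))       # expected defects discovered by time T

n_total = rng.poisson(m_T)           # NHPP count on [0, T]
type_probs = [0.5, 0.3, 0.2]         # assumed mix of defect categories
by_type = rng.multinomial(n_total, type_probs)
print(f"defects by week {T:.0f}: total={n_total}, by type={by_type}")

# Given N(T) = n, NHPP arrival times are i.i.d. with CDF m(t)/m(T);
# invert that CDF to sample discovery times.
u = rng.uniform(0.0, 1.0, size=n_total)
times = -np.log(1 - u * (1 - np.exp(-b * T))) / b
print("first discoveries:", np.round(np.sort(times)[:5], 1))
```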


An overview of the heterogeneous telescope network system: Concept, scalability and operation

ASTRONOMISCHE NACHRICHTEN, Issue 3 2008
R.R. White
Abstract In the coming decade there will be an avalanche of data streams devoted to astronomical exploration, opening new windows of scientific discovery. The sheer volume of data and the diversity of event types (Kantor 2006; Kaiser 2004; Vestrand, Theiler & Wozniak 2004) will necessitate a move to a common language for the communication of event data, and telescope systems with the ability not just to respond, but to act independently in order to take full advantage of available resources in a timely manner. Developed over the past three years, the Virtual Observatory Event (VOEvent) provides the best format for carrying these diverse event messages (White et al. 2006a; Seaman & Warner 2006). However, in order for the telescopes to be able to act independently, a system of interoperable network nodes must be in place that will allow the astronomical assets not only to issue event notifications, but to coordinate and request specific observations. The Heterogeneous Telescope Network (HTN) is a network architecture that can achieve these goals and provide a scalable design to match both fully autonomous and manual telescope system needs (Allan et al. 2006a; White et al. 2006b; Hessman 2006b). In this paper we show the design concept of this meta-network and its nodes, their scalable architecture and complexity, and how this concept can meet the needs of institutions in the near future. (© 2008 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]
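
Editor's note: for readers unfamiliar with the message format, here is a minimal VOEvent-style packet built with Python's standard library. Element and attribute names follow the VOEvent 2.0 schema as commonly documented, but the identifiers and values are invented and the packet is illustrative, not validated:

```python
import xml.etree.ElementTree as ET

NS = "http://www.ivoa.net/xml/VOEvent/v2.0"   # assumed schema namespace
event = ET.Element("{%s}VOEvent" % NS,
                   {"role": "observation", "version": "2.0",
                    "ivorn": "ivo://example.org/htn#0001"})  # invented IVORN
who = ET.SubElement(event, "Who")              # publisher metadata
ET.SubElement(who, "Date").text = "2008-03-01T12:00:00"
what = ET.SubElement(event, "What")            # event-specific parameters
ET.SubElement(what, "Param", {"name": "magnitude", "value": "17.3"})
print(ET.tostring(event, encoding="unicode"))
```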


Flexible Estimation of Differences in Treatment-Specific Recurrent Event Means in the Presence of a Terminating Event

BIOMETRICS, Issue 3 2009
Qing Pan
Summary In this article, we consider the setting where the event of interest can occur repeatedly for the same subject (i.e., a recurrent event; e.g., hospitalization) and may be stopped permanently by a terminating event (e.g., death). Among the different ways to model recurrent/terminal event data, the marginal mean (i.e., averaging over the survival distribution) is of primary interest from a public health or health economics perspective. Often, the difference between treatment-specific recurrent event means will not be constant over time, particularly when treatment-specific differences in survival exist. In such cases, it makes more sense to quantify treatment effect based on the cumulative difference in the recurrent event means, as opposed to the instantaneous difference in the rates. We propose a method that compares treatments by separately estimating the survival probabilities and recurrent event rates given survival, then integrating to get the mean number of events. The proposed method combines an additive model for the conditional recurrent event rate and a proportional hazards model for the terminating event hazard. The treatment effects on survival and on recurrent event rate among survivors are estimated in constructing our measure and explain the mechanism generating the difference under study. The example that motivates this research is the repeated occurrence of hospitalization among kidney transplant recipients, where the effect of expanded criteria donor (ECD) compared to non-ECD kidney transplantation on the mean number of hospitalizations is of interest. [source]
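
Editor's note: the measure being compared can be written as mu(t) = integral from 0 to t of S(u)·r(u) du, the survival-weighted cumulative event rate. A numeric sketch with constant hazard and rate (made-up values; the paper fits an additive rate model and a proportional hazards model rather than constants):

```python
import numpy as np

def marginal_mean(hazard, rate, t_max=5.0, n_grid=5000):
    """mu(t_max) = integral of S(u) * r(u) du, with S(u) = exp(-hazard * u)
    and a constant recurrent-event rate among survivors (toy assumption)."""
    u = np.linspace(0.0, t_max, n_grid)
    surv = np.exp(-hazard * u)
    return np.trapz(surv * rate, u)

# ECD vs non-ECD style comparison with invented numbers
mu_ecd = marginal_mean(hazard=0.15, rate=1.2)
mu_non = marginal_mean(hazard=0.10, rate=1.0)
print(f"mean hospitalizations by year 5: ECD={mu_ecd:.2f}, non-ECD={mu_non:.2f}")
print(f"cumulative treatment difference: {mu_ecd - mu_non:.2f}")
```

Because worse survival shrinks the integrand, a treatment can raise the event rate among survivors yet still show a modest difference in mean event counts, which is why the abstract argues for the cumulative rather than instantaneous comparison.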


A survival analysis for recurrent events in psychiatric research

BIPOLAR DISORDERS, Issue 2 2004
Christopher Baethge
Objectives: Time to first recurrence, as analyzed by the Kaplan–Meier (KM) survival analysis, is a commonly applied statistical method in psychiatric research. However, many psychiatric disorders are characterized not by a single event, but rather by recurrent events, such as multiple affective episodes. This study aims to demonstrate a method of survival analysis that takes multiple recurrences into account. Methods: We examined data on sex differences in a sample of 181 patients undergoing prophylactic treatment with lithium or carbamazepine (serum level assayed) for bipolar disorder (ICD-10). The classical KM method was compared with an approach developed by Peña, Strawderman and Hollander (PSH) that uses recurrent event data to estimate the survival function. Results: The results obtained with the multiple events method differed considerably from those acquired using the standard KM analysis. When taking recurrent event data into account, the probability of remaining well was lower and survival times were longer. In addition, whereas the standard KM analysis indicated that male patients had a higher likelihood of remaining well, the alternative method revealed that both sexes were similarly likely to remain well. Conclusions: Survival analysis techniques that take recurrent events into account are potentially important instruments for the study of psychiatric conditions characterized by multiple recurrences. In many cases, the standard KM analysis appears to provide only a rough approximation of the course of illness. [source]
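
Editor's note: for reference, the classical KM product-limit estimator discussed here can be computed from scratch in a few lines (toy data; the PSH recurrent-event estimator is not implemented):

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate from possibly censored data.
    times: follow-up times; events: 1 if recurrence observed, 0 if censored."""
    order = np.argsort(times)
    times, events = np.asarray(times)[order], np.asarray(events)[order]
    surv, s = [], 1.0
    n_at_risk = len(times)
    for t in np.unique(times):
        at_t = times == t
        d = events[at_t].sum()              # events at time t
        if d > 0:
            s *= 1.0 - d / n_at_risk        # product-limit step
        n_at_risk -= at_t.sum()             # events and censorings leave risk set
        surv.append((t, s))
    return surv

# invented follow-up data: 1 = episode recurrence, 0 = censored
for t, s in kaplan_meier([2, 3, 3, 5, 8, 9], [1, 1, 0, 1, 0, 1]):
    print(f"t={t}: S(t)={s:.2f}")
```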


Multilevel models for longitudinal data

JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES A (STATISTICS IN SOCIETY), Issue 1 2008
Fiona Steele
Summary. Repeated measures and repeated events data have a hierarchical structure which can be analysed by using multilevel models. A growth curve model is an example of a multilevel random-coefficients model, whereas a discrete time event history model for recurrent events can be fitted as a multilevel logistic regression model. The paper describes extensions to the basic growth curve model to handle auto-correlated residuals, multiple-indicator latent variables and correlated growth processes, and event history models for correlated event processes. The multilevel approach to the analysis of repeated measures data is contrasted with structural equation modelling. The methods are illustrated in analyses of children's growth, changes in social and political attitudes, and the interrelationship between partnership transitions and childbearing. [source]
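
Editor's note: a discrete-time event history model of the kind described is fitted by expanding each episode into person-period records and running a logistic regression. A sketch with invented data (the multilevel version would add random effects for repeated episodes within subjects):

```python
import pandas as pd
import statsmodels.api as sm

# invented episodes: intervals survived, whether the transition occurred
# at the end of follow-up, and one covariate
episodes = pd.DataFrame({
    "id":       [1, 2, 3, 4, 5],
    "duration": [3, 2, 4, 4, 2],
    "event":    [1, 0, 1, 0, 1],
    "x":        [0.2, -1.0, 0.5, 0.0, 1.0],
})

# expand to person-period format: one row per interval at risk, with the
# event indicator equal to 1 only in the interval where the event occurs
rows = []
for _, r in episodes.iterrows():
    for t in range(1, int(r["duration"]) + 1):
        rows.append({"id": r["id"], "t": t, "x": r["x"],
                     "y": int(r["event"] == 1 and t == r["duration"])})
pp = pd.DataFrame(rows)

# discrete-time hazard via logistic regression on time and covariates
X = sm.add_constant(pp[["t", "x"]])
fit = sm.GLM(pp["y"], X, family=sm.families.Binomial()).fit()
print(fit.params)
```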


Privacy issues and the monitoring of sumatriptan in the New Zealand Intensive Medicines Monitoring Programme

PHARMACOEPIDEMIOLOGY AND DRUG SAFETY, Issue 7 2001
David M. Coulter MB, DTM&H
Abstract Purpose. The purpose of this paper is to describe how the New Zealand (NZ) Intensive Medicines Monitoring Programme (IMMP) functions in relation to NZ privacy laws and to describe the attitudes of patients to drug safety monitoring and the privacy of their personal and health information. Methods. The IMMP undertakes prospective observational event monitoring cohort studies on new drugs. The cohorts are established from prescription data and the events are obtained using prescription event monitoring and spontaneous reporting. Personal details, prescribing history of the monitored drugs and adverse event data are stored in databases long term. The NZ Health Information Privacy Code is outlined and the monitoring of sumatriptan is used to illustrate how the IMMP functions in relation to the Code. Patient responses to the programme are described. Results. Sumatriptan was monitored in 14,964 patients and 107,646 prescriptions were recorded. There were 2344 reports received describing 3987 adverse events. A majority of the patients were involved in the recording of event data either personally or by telephone interview. There were no objections to the monitoring process on privacy grounds. Conclusion. Given that all reasonable precautions are taken to ensure privacy, patients perceive drug safety to have greater priority than any slight risk of breach of confidentiality concerning their personal details and health information. Copyright © 2001 John Wiley & Sons, Ltd. [source]