Toxic Levels (toxic + level)


Selected Abstracts


Intravesical alkalinized lidocaine (PSD597) offers sustained relief from symptoms of interstitial cystitis and painful bladder syndrome

BJU INTERNATIONAL, Issue 7 2009
J. Curtis Nickel
OBJECTIVE To assess the immediate and sustained relief of the symptoms of interstitial cystitis/painful bladder syndrome (IC/PBS) after a consecutive 5-day course of treatment with intravesical alkalinized lidocaine (PSD597), and to characterize the pharmacokinetics of single and multiple doses of intravesical PSD597 in a subgroup of patients.

PATIENTS AND METHODS In all, 102 adult patients (99 women) with a clinical diagnosis of IC/PBS were randomized from 19 centres in the USA and Canada to receive a daily intravesical instillation of PSD597 (200 mg lidocaine, alkalinized with a sequential instillation of 8.4% sodium bicarbonate solution, to a final volume of 10 mL) or placebo (double-blind), for 5 consecutive days. Patients were followed at intervals up to 29 days after the first instillation. Efficacy was assessed by changes in the Global Response Assessment (GRA), Likert scales for bladder pain, urgency and frequency, and the validated O'Leary-Sant IC symptom and problem indices.

RESULTS Significantly more patients treated with PSD597 than with placebo rated their overall bladder symptoms as moderately or markedly improved on the GRA scale 3 days after completing the 5-day course of treatment (30% vs 9.6%; P = 0.012). The treatment effect was maintained beyond the end of treatment and was further supported by the secondary endpoints, including the symptom and problem indices. The peak serum lidocaine concentration during the study was <2 µg/mL, well below the toxic level (>5 µg/mL).

CONCLUSION This preliminary study showed that PSD597 provided sustained amelioration of the symptoms of IC/PBS beyond the acute treatment phase. The drug was safe, well tolerated and devoid of the systemic side-effects often experienced with oral drug administration. Long-term studies are needed to determine the optimum regimen to maintain this favourable treatment effect. [source]
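
For readers who want to see where a P value like 0.012 can come from, a minimal two-proportion comparison follows. The abstract reports only the total enrolment (102), so the per-arm counts below (15/50 for PSD597, 5/52 for placebo) are an assumption reconstructed from the reported percentages, and Fisher's exact test stands in for whatever test the authors actually used:

```python
# Hedged re-check of the reported responder analysis (GRA moderately or
# markedly improved: 30% PSD597 vs 9.6% placebo). Arm sizes are NOT given
# in the abstract; 50 and 52 are assumed because 15/50 = 30% and
# 5/52 = 9.6% reproduce the reported percentages.
from scipy.stats import fisher_exact

# 2x2 table: rows = treatment arm, columns = (responders, non-responders)
table = [[15, 35],   # PSD597: 15 responders of an assumed n = 50
         [5, 47]]    # placebo: 5 responders of an assumed n = 52

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, two-sided Fisher P = {p_value:.3f}")
```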


Effects of intravenous lidocaine overdose on cardiac electrical activity and blood pressure in the horse

EQUINE VETERINARY JOURNAL, Issue 5 2001
G. A. MEYER
Summary This study aimed to identify the blood serum lidocaine concentrations in the horse that result in clinical signs of intoxication, and to document the effects of toxic levels on the cardiovascular and cardiopulmonary systems. Nineteen clinically normal mature horses of mixed breed, age and sex were observed. Lidocaine administration was initiated in each subject with an i.v. loading dose of 1.5 mg/kg bwt, followed by continuous infusion of 0.3 mg/kg bwt/min until clinical signs of intoxication were observed. Intoxication was defined as the development of skeletal muscle tremors. Before administration of lidocaine, blood samples for lidocaine analysis, heart rate, mean arterial blood pressure, systolic blood pressure, diastolic blood pressure, respiratory rate and electrocardiographic (ECG) data were collected. After recording baseline data, repeat data were collected at 5 min intervals until signs of intoxication were observed. The range of serum lidocaine concentrations at which clinical signs of intoxication were observed was 1.85-4.53 µg/ml (mean ± s.d. 3.24 ± 0.74 µg/ml). Statistically significant changes in P wave duration, P-R interval, R-R interval and Q-T interval were observed in comparison with control values as a result of lidocaine administration. These changes in ECG values did not fall outside published normal values and were not clinically significant. Heart rate, blood pressures and respiratory rates were unchanged from control values. This study establishes toxic serum lidocaine levels in the horse, and demonstrates that there were no clinically significant cardiovascular effects at serum lidocaine concentrations below those required to produce signs of toxicity. [source]
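
The dosing protocol above reduces to simple arithmetic: a 1.5 mg/kg i.v. loading dose plus a 0.3 mg/kg bwt/min constant-rate infusion accumulates linearly with time. A minimal sketch, assuming a hypothetical 500 kg horse (body weights are not given in the abstract):

```python
# Sketch of the cumulative lidocaine dose under the study protocol:
# 1.5 mg/kg loading dose, then 0.3 mg/kg/min continuous infusion.
def cumulative_dose_mg(bwt_kg: float, minutes: float,
                       loading_mg_per_kg: float = 1.5,
                       cri_mg_per_kg_min: float = 0.3) -> float:
    """Total lidocaine administered after `minutes` of infusion."""
    return bwt_kg * (loading_mg_per_kg + cri_mg_per_kg_min * minutes)

# The study recorded data at 5 min intervals; 500 kg is an assumed weight.
for t in (0, 5, 10, 15, 20):
    print(f"t = {t:2d} min: {cumulative_dose_mg(500.0, t):6.0f} mg total")
```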


Clinical benefit of interventions driven by therapeutic drug monitoring

HIV MEDICINE, Issue 5 2005
AL Rendón
Background Adequate plasma concentrations of antiretroviral drugs are key to achieving and maintaining long-term suppression of HIV replication. Multiple factors may influence drug levels, causing increases or reductions that may, respectively, result in toxicity or virological failure. Therapeutic drug monitoring (TDM) might help to detect and correct such abnormalities.

Objective To evaluate the usefulness of TDM in the care of HIV-infected patients in an out-patient clinical setting.

Methods All requests for TDM of protease inhibitors (PIs) and nonnucleoside reverse transcriptase inhibitors (NNRTIs) for patients attending our HIV out-patient clinic from October 2000 to August 2003 were analysed. Blood samples were collected before the morning dose. Drug concentrations were measured by high-performance liquid chromatography with ultraviolet detection (HPLC-UV).

Results A total of 151 requests from 137 patients were assessed. The reasons for requesting TDM were drug toxicity (59%), virological failure (39%) and possible drug interactions (2%). NNRTI levels were more often requested because of toxicity, while PI levels were more often requested because of virological failure. Elevated drug levels were confirmed in 36% of patients with suspected drug toxicity, while subtherapeutic levels were found in 37% of patients failing virologically. Based on the results of TDM, dose modifications were made in 37% of patients, allowing correction of such abnormalities in 80% of cases. Moreover, adequate plasma concentrations were confirmed in 79% of patients whose levels were reassessed.

Conclusions Therapeutic drug monitoring may be a useful tool for identifying toxic NNRTI levels and subtherapeutic PI concentrations. Dose adjustments following TDM may ameliorate drug-related toxicities or improve virological response rates. [source]
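
The TDM workflow described above amounts to a triage rule: compare a trough level against a drug-specific therapeutic window and flag out-of-range results for dose adjustment. A minimal sketch, with window values that are illustrative placeholders rather than the clinic's actual reference ranges (which are drug-specific and not given in the abstract):

```python
# Hypothetical TDM triage logic; window values are placeholders, not
# validated clinical reference ranges.
from typing import NamedTuple

class TroughResult(NamedTuple):
    drug: str
    level_ng_ml: float  # pre-dose (trough) plasma concentration

# Assumed therapeutic windows in ng/mL: (subtherapeutic below, toxic above)
WINDOWS = {"efavirenz": (1000.0, 4000.0), "lopinavir": (1000.0, 8000.0)}

def classify(result: TroughResult) -> str:
    low, high = WINDOWS[result.drug]
    if result.level_ng_ml < low:
        return "subtherapeutic: consider dose increase / adherence check"
    if result.level_ng_ml > high:
        return "elevated: consider dose reduction (toxicity risk)"
    return "within window: no change"

print(classify(TroughResult("efavirenz", 5200.0)))  # -> elevated
```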


Chronic arsenic toxicity from Ayurvedic medicines

INTERNATIONAL JOURNAL OF DERMATOLOGY, Issue 6 2008
Sujay Khandpur MD (DERMAT), DNB (DERMAT), MNAMS
Background: Ayurvedic medicines are known to contain arsenic, and concentrations up to toxic levels have been reported in certain formulations. However, clinical disease due to arsenic-containing ayurvedic medicines has rarely been reported. We seek to highlight the existence of toxic levels of arsenic in certain ayurvedic preparations that can produce serious systemic manifestations.

Methods: An 11-year-old girl developed manifestations of arsenical keratosis (punctate palmoplantar keratoderma and leucomelanoderma) and non-cirrhotic portal hypertension, 6 months and 18 months, respectively, after intake of ayurvedic medications prescribed for epilepsy. The eight ayurvedic preparations consumed by the patient, and her serum, were analyzed for arsenic content.

Results: The arsenic content of the ayurvedic medicines ranged from 5 mg/L to 248 mg/L. The serum arsenic level was 202.20 µg/L (normal < 60 µg/L). Skin manifestations improved after the discontinuation of the ayurvedic medications.

Conclusions: Ayurvedic medications should be consumed under the strict guidance and supervision of qualified practitioners to prevent such catastrophes. [source]


Systems biology approaches for toxicology

JOURNAL OF APPLIED TOXICOLOGY, Issue 3 2007
William Slikker Jr
Abstract Systems biology/toxicology involves the iterative and integrative study of perturbations by chemicals and other stressors of gene and protein expression that are linked firmly to toxicological outcome. In this review, the value of systems biology in enhancing the understanding of complex biological processes such as neurodegeneration in the developing brain is explored. Exposure of the developing mammal to NMDA (N-methyl-D-aspartate) receptor antagonists perturbs the endogenous NMDA receptor system and results in enhanced neuronal cell death. It is proposed that continuous blockade of NMDA receptors in the developing brain by NMDA antagonists such as ketamine (a dissociative anesthetic) causes a compensatory up-regulation of NMDA receptors, which makes the neurons bearing these receptors subsequently more vulnerable (e.g. after ketamine washout) to the excitotoxic effects of endogenous glutamate: the up-regulation of NMDA receptors allows for the accumulation of toxic levels of intracellular Ca2+ under normal physiological conditions. Systems biology, as applied to toxicology, provides a framework in which information can be arranged in the form of a biological model. In our ketamine model, for example, blockade of NMDA receptor up-regulation by the co-administration of antisense oligonucleotides that specifically target NMDA receptor NR1 subunit mRNA dramatically diminishes ketamine-induced cell death. Preliminary gene expression data support the role of apoptosis as a mode of action of ketamine-induced neurotoxicity. In addition, ketamine-induced cell death is also prevented by the inhibition of NF-κB translocation into the nucleus. This process is known to respond to changes in the redox state of the cytoplasm and has been shown to respond to NMDA-induced cellular stress. Although comprehensive gene expression/proteomic studies and mathematical modeling remain to be carried out, biological models have been established in an iterative manner to allow for the confirmation of biological pathways underlying NMDA antagonist-induced cell death in the developing nonhuman primate and rodent. Published in 2007 by John Wiley & Sons, Ltd. [source]


Biomarkers as biological indicators of xenobiotic exposure

JOURNAL OF APPLIED TOXICOLOGY, Issue 4 2001
Fernando Gil
Abstract The presence of a xenobiotic in the environment always represents a risk for living organisms. However, to speak of impregnation, the xenobiotic must be detected within the organism, and the concept of intoxication implies specific organ alterations and clinical symptoms. Moreover, the relationship between toxic levels within the organism and the toxic response is complex and difficult to predict, because it depends on several factors, notably toxicokinetic and genetic factors. One method of quantifying the interaction with xenobiotics and its potential impact on living organisms, including human beings, is monitoring with so-called biomarkers. These can provide measures of exposure, toxic effect and individual susceptibility to environmental chemical compounds, and may be very useful for assessing and controlling the risk of long-term outcomes associated with exposure to xenobiotics (e.g. heavy metals, halogenated hydrocarbons, pesticides). Copyright © 2001 John Wiley & Sons, Ltd. [source]


Chronic arsenic poisoning: a global health issue – a report of multiple primary cancers

JOURNAL OF CUTANEOUS PATHOLOGY, Issue 2 2007
R. R. Walvekar
The effects of prolonged exposure to high levels of As in drinking water have been observed and documented in epidemiological studies from all over the world. The non-malignant cutaneous effects of chronic exposure to inorganic As are well known. A case presenting with multiple cutaneous cancers as well as a primary lung cancer, in a patient exposed to toxic levels of As in drinking water, is discussed along with a review of the literature. [source]


Novel polymerase chain reaction primers for the specific detection of bacterial copper P-type ATPase gene sequences in environmental isolates and metagenomic DNA

LETTERS IN APPLIED MICROBIOLOGY, Issue 6 2010
R. De la Iglesia
Abstract

Aims: In recent decades, the worldwide increase in copper waste released by industrial activities such as mining has driven environmental metal contents to toxic levels. For this reason, the study of biological copper-resistance mechanisms in natural environments is important, and an appropriate molecular tool for the detection and tracking of copper-resistance genes was therefore developed.

Methods and Results: In this work, we designed a PCR primer pair to specifically detect copper P-type ATPase gene sequences. These primers were tested on bacterial isolates and metagenomic DNA from intertidal marine environments impacted by copper pollution. In addition, T-RFLP fingerprinting of these gene sequences was used to compare the genetic composition of such genes in microbial communities from normal and copper-polluted coastal environments. New copper P-type ATPase gene sequences were found, and a high degree of change in genetic composition due to copper exposure was also determined.

Conclusions: This PCR-based method is useful for tracking bacterial copper-resistance gene sequences in the environment.

Significance and Impact of the Study: This study is the first to report the design and use of a PCR primer pair as a molecular marker to track bacterial copper-resistance determinants, providing an excellent tool for long-term analysis of environmental communities exposed to metal pollution. [source]
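
The abstract does not give the primer sequences, but the detection step it relies on is easy to illustrate: a primer written in IUPAC degenerate codes is matched base-by-base against candidate gene sequences to find potential annealing sites. A minimal sketch with invented primer and target sequences (both hypothetical, not from the paper):

```python
# Degenerate-primer site search; the primer and the "copA-like" fragment
# below are invented placeholders, not the paper's actual sequences.
IUPAC = {
    "A": "A", "C": "C", "G": "G", "T": "T",
    "R": "AG", "Y": "CT", "S": "CG", "W": "AT", "K": "GT", "M": "AC",
    "B": "CGT", "D": "AGT", "H": "ACT", "V": "ACG", "N": "ACGT",
}

def primer_matches(primer: str, site: str) -> bool:
    """True if `site` is compatible with the degenerate `primer`, base by base."""
    return len(primer) == len(site) and all(
        base in IUPAC[code] for code, base in zip(primer.upper(), site.upper())
    )

def find_binding_sites(primer: str, sequence: str) -> list[int]:
    """0-based start positions where the primer could anneal (forward strand)."""
    n = len(primer)
    return [i for i in range(len(sequence) - n + 1)
            if primer_matches(primer, sequence[i:i + n])]

primer = "GGTATGWSNTGYGC"          # hypothetical degenerate primer
target = "AACGGTATGAGCTGCGCTTAA"   # hypothetical copA-like gene fragment
print(find_binding_sites(primer, target))  # -> [3]
```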


Neurotoxicity of immunosuppressive drugs

LIVER TRANSPLANTATION, Issue 11 2001
Eelco F.M. Wijdicks MD
The clinical profile of neurotoxicity caused by immunosuppression has changed. When toxic levels are reached, both cyclosporine and tacrolimus may produce a clinical spectrum that varies from tremor and acute confusional state to status epilepticus and major speech or language abnormalities. Coma has become an unusual manifestation. The magnetic resonance imaging profile has been better defined, and abnormalities may be more widespread than those in the posterior lobes. These white matter lesions are caused by vasogenic edema, but may lead to apoptosis and cytotoxic edema if exposure is prolonged. Recent evidence suggests inhibition of a drug-efflux pump and dysfunction of the blood-brain barrier by enhanced nitric oxide production. [source]


Mechanisms of iron regulation in mycobacteria: role in physiology and virulence

MOLECULAR MICROBIOLOGY, Issue 6 2003
G. Marcela Rodriguez
Summary The role of iron in mycobacteria, as in other bacteria, goes beyond the need for this essential cofactor. Limitation of this metal triggers an extensive response aimed at increasing iron acquisition while coping with iron deficiency. In contrast, iron-rich environments prompt these prokaryotes to induce synthesis of iron storage molecules and to increase mechanisms of protection against iron-mediated oxidative damage. The response to changes in iron availability is strictly regulated in order to maintain sufficient but not excessive and potentially toxic levels of iron in the cell. This response is also linked to other important processes such as protection against oxidative stress and virulence. In bacteria, iron metabolism is regulated by controlling transcription of genes involved in iron uptake, transport and storage. In mycobacteria, this role is fulfilled by the iron-dependent regulator IdeR. IdeR is an essential protein in Mycobacterium tuberculosis, the causative agent of human tuberculosis. It functions as a repressor of iron acquisition genes, but is also an activator of iron storage genes and a positive regulator of oxidative stress responses. [source]


Comparative physiology of salt and water stress

PLANT CELL & ENVIRONMENT, Issue 2 2002
R. Munns
Abstract Plant responses to salt and water stress have much in common. Salinity reduces the ability of plants to take up water, and this quickly causes reductions in growth rate, along with a suite of metabolic changes identical to those caused by water stress. The initial reduction in shoot growth is probably due to hormonal signals generated by the roots. There may be salt-specific effects that later have an impact on growth; if excessive amounts of salt enter the plant, salt will eventually rise to toxic levels in the older transpiring leaves, causing premature senescence, and reduce the photosynthetic leaf area of the plant to a level that cannot sustain growth. These effects take time to develop. Salt-tolerant plants differ from salt-sensitive ones in having a low rate of Na+ and Cl- transport to leaves, and the ability to compartmentalize these ions in vacuoles to prevent their build-up in cytoplasm or cell walls and thus avoid salt toxicity. In order to understand the processes that give rise to tolerance of salt, as distinct from tolerance of osmotic stress, and to identify genes that control the transport of salt across membranes, it is important to avoid treatments that induce cell plasmolysis, and to design experiments that distinguish between tolerance of salt and tolerance of water stress. [source]


Response of potted grapevines to increasing soil copper concentration

AUSTRALIAN JOURNAL OF GRAPE AND WINE RESEARCH, Issue 1 2009
M. TOSELLI
Abstract

Background and Aims: Copper accumulation in soil may promote phytotoxicity in grapevines. The nutritional implications of exposing potted vines to increasing concentrations of copper (Cu), in either clay loam soil or clay loam soil mixed with 85% sand, were tested on Vitis vinifera (L.) cv Sangiovese, and the crop toxicity threshold and symptoms were determined.

Methods and Results: Soils were mixed at planting with Cu at rates (mg Cu/kg) of 0 (control, native soil Cu only), 50, 100, 200, 400, 600, 800 and 1000, and non-bearing vines were grown in these for two seasons. Reduction of root growth was observed after addition of ≥400 mg Cu/kg to both soils; reduction of shoot growth and leaf number, and chlorosis of leaf edges, were detected only in sand-enriched soil. Root Cu concentration increased in response to soil Cu addition. Unlike leaf Cu and N, leaf P and Fe (in both soils) and Mg and Ca (in sand-enriched soil only) were reduced by soil Cu.

Conclusion: Vines grown in sand-enriched soil tolerated lower concentrations of Cu than in clay loam soil, probably because of their lower nutritional status and higher root Cu concentration.

Significance of the Study: The results provide information on the concentration of soil Cu that grapevine can tolerate, and on the nutrients involved in the response to toxic levels of soil Cu in clay loam and sandy clay loam soils. [source]


Degradation of xenobiotics in a partitioning bioreactor in which the partitioning phase is a polymer

BIOTECHNOLOGY & BIOENGINEERING, Issue 4 2003
Brian G. Amsden
Abstract Two-phase partitioning bioreactors (TPPBs) are characterized by a cell-containing aqueous phase and a second immiscible phase that contains toxic and/or hydrophobic substrates, which partition to the cells at subinhibitory levels in response to the metabolic demand of the organisms. To date, the delivery phase in TPPBs has been a hydrophobic solvent that traditionally needed to possess a variety of important properties, including biocompatibility, nonbioavailability, low volatility and low cost, among others. In the present work we have shown that the organic solvent phase can be replaced by inexpensive polymer beads that function in a similar fashion to organic solvents, delivering a toxic substrate to cells based on equilibrium considerations. Specifically, 3.4 mm diameter beads of poly(ethylene-co-vinyl acetate) (EVA) were used to reduce the aqueous concentration of phenol in a bioreactor from toxic levels (~2000 mg/L) to subinhibitory levels (~750 mg/L), after which Pseudomonas putida ATCC 11172 was added to the system and allowed to consume the total phenol loading. Thus, the beads absorbed the toxic substrate and released it to the cells on demand. The EVA beads, which could be reused, were able to absorb 14 mg phenol/g EVA. This work has opened the possibility of widely using mixed cultures in TPPB systems without concern for degradation of the delivery material and without concern of contamination. © 2003 Wiley Periodicals, Inc. Biotechnol Bioeng 84: 399-405, 2003. [source]
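
The reported figures allow a quick mass-balance check of the bead loading implied by this experiment: lowering phenol from ~2000 to ~750 mg/L with beads that absorb 14 mg phenol per gram. A short sketch, assuming a 1 L working volume (the actual reactor volume is not stated in the abstract):

```python
# Back-of-the-envelope mass balance using only figures from the abstract.
c_initial_mg_per_l = 2000.0   # toxic starting phenol concentration (~2000 mg/L)
c_target_mg_per_l = 750.0     # subinhibitory level reached (~750 mg/L)
capacity_mg_per_g = 14.0      # measured EVA uptake: 14 mg phenol / g EVA
volume_l = 1.0                # hypothetical working volume (not reported)

phenol_to_absorb_mg = (c_initial_mg_per_l - c_target_mg_per_l) * volume_l
eva_needed_g = phenol_to_absorb_mg / capacity_mg_per_g
print(f"~{eva_needed_g:.0f} g EVA beads per litre")  # -> ~89 g/L
```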


Learning and behavioural difficulties but not microcephaly in three brothers resulting from undiagnosed maternal phenylketonuria

CHILD: CARE, HEALTH AND DEVELOPMENT, Issue 5 2004
C. Shaw-Smith
Abstract Universal screening introduced in the 1960s has reduced the incidence of learning disability resulting from phenylketonuria (PKU), which is a treatable condition. Nonetheless, PKU may still be having an impact on the paediatric-age population. We report a woman with previously undiagnosed PKU who was born before the onset of universal screening. She is of normal intelligence, and so the diagnosis was not suspected until after the birth of her three children. Her serum phenylalanine concentration was found to be in excess of 1 mmol/L, well into the toxic range. She has had three sons, all of whom have a significant degree of learning disability resulting from intrauterine exposure to toxic levels of phenylalanine. None of the sons had microcephaly, a physical sign that, if present, might have helped to point towards the correct diagnosis. We suggest that maternal PKU should be suspected where there is sibling recurrence of cognitive impairment, particularly where the mother was born before the initiation of the neonatal screening programme for PKU. [source]