The toxicology of mercury and its compounds
Tore Syversen a, Parvinder Kaur b
a Norwegian University of Science and Technology, Department of Neuroscience, Postboks 8905 MTFS, N-7491 Trondheim, Norway
b Norwegian University of Science and Technology, Department of Circulation and Medical Imaging, Postboks 8905 MTFS, N-7491 Trondheim, Norway
A concentrated review of the toxicology of inorganic mercury is presented, together with an extensive review of the neurotoxicology of methylmercury. The challenges of using inorganic mercury in dental amalgam are reviewed with regard to both occupational exposure and possible health problems for dental patients. The two remaining “mysteries” of methylmercury neurotoxicology, the cellular selectivity and the delayed onset of symptoms, are also reviewed; the relevant literature on these aspects is discussed and some suggestions towards explaining these observations are presented.
Nickel species: analysis and toxic effects.
This review gives an overview of the analysis of inorganic nickel species and their toxic effects. Based on the analytical procedure applied, inorganic nickel species are usually classified into soluble, sulfidic, metallic and oxidic fractions. Only a few studies have attempted a chemical characterization of the different nickel compounds in each fraction. This general classification into four nickel species groups is widely used in toxicological studies dealing with nickel particulate matter in workplace air. Compared to the general population, occupationally exposed people have a higher risk of respiratory tract cancer due to inhalation of nickel at their workplace in the nickel-producing or nickel-using industries. The high cancer risk is related to the less soluble oxidic and especially the sulfidic nickel species in refinery dust. In contrast, within the general population the most harmful health effect related to nickel exposure is allergic contact dermatitis due to prolonged skin contact with nickel. Absorption processes of nickel species and molecular mechanisms of nickel toxicity are briefly outlined.
Role of manganese in neurodegenerative diseases.
Bowman AB, Kwakye GF, Herrero Hernández E, Aschner M.
Department of Neurology, Vanderbilt Kennedy Center, Center for Molecular Toxicology, Vanderbilt University Medical Center, Nashville, TN 37232-8552, United States.
Manganese (Mn) is an essential ubiquitous trace element that is required for normal growth, development and cellular homeostasis. Exposure to high Mn levels causes a clinical disease characterized by extrapyramidal symptoms resembling idiopathic Parkinson's disease (IPD). The present review focuses on the role of various transporters in maintaining brain Mn homeostasis, along with recent methodological advances in real-time measurements of intracellular Mn levels. We also provide an overview of the role of Mn in IPD, discussing the similarities (and differences) between manganism and IPD, the relationship between α-synuclein and Mn-related protein aggregation, and the links between mitochondrial dysfunction, Mn and PD. Additional sections of the review discuss the link between Mn and Huntington's disease (HD), with emphasis on huntingtin function and the potential role of altered Mn homeostasis and toxicity in HD. We conclude with a brief survey of the potential role of Mn in the etiologies of Alzheimer's disease (AD), amyotrophic lateral sclerosis (ALS) and prion disease. Where possible, we discuss the mechanistic commonalities inherent to Mn-induced neurotoxicity and neurodegenerative disorders.
Risks and benefits of copper in light of new insights of copper homeostasis.
de Romaña DL, Olivares M, Uauy R, Araya M.
Instituto de Nutrición y Tecnología de los Alimentos, Universidad de Chile, Avenida El Líbano 5524, Macul 5540, Macul, Santiago, Chile.
Copper is an essential micronutrient involved in a variety of biological processes indispensable to sustain life. At the same time, it can be toxic when present in excess, the most noticeable chronic effect being liver damage. Potent, efficient regulatory mechanisms control copper absorption in the digestive tract and copper biliary excretion; absorption ranges between 12 and 60% in humans, depending on Cu intake, presence of other factors in the diet that may promote or inhibit its absorption and on the copper status of the individual. Current evidence suggests that copper deficiency may be more prevalent than previously thought, while copper toxicity is uncommon under customary daily life conditions. Menkes syndrome and Wilson disease are genetic conditions associated with severe copper deficiency and severe copper toxicity, respectively. Effects of milder degrees of copper deficiency and excess copper exposure are not well described, mainly due to lack of sensitive and specific indicators; serum copper concentration and ceruloplasmin are the most frequently used indicators, but they only detect rather intense changes of copper status. Of the many proteins assessed as potential markers of copper status the chaperone of Zn-Cu superoxide dismutase (CCS1) has yielded promising results; data on its performance under different conditions are needed to confirm its use as an indicator of early copper deficiency. Defining copper requirements and upper safe limits of consumption (UL) is a complex process since there are adverse health consequences from both copper deficiency and copper excess (U shape curve). The regulatory framework for risk assessment of essential trace elements introduced by the International Programme on Chemical Safety (IPCS) has proposed a homeostatic model to determine the Adequate Range of Oral Intake (AROI) of essential trace elements; the nadir of the resulting U shape curve serves to define the AROI. 
At this range of intake, physiological mechanisms maintain normal homeostasis and there are essentially no detectable adverse effects. At present, Dietary Reference Intakes (DRIs) and Adequate Intakes (AIs) are used to recommend copper intakes at different ages and life situations. Evidence obtained in humans and non-human primates presented here suggests that the current copper UL should be re-evaluated. Developing the scientific basis for a copper UL and evaluating the relevance of copper deficiency globally are future key challenges for copper researchers.
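The U-shape curve and AROI concept described above can be illustrated with a toy numerical model. All functional forms and parameter values below are hypothetical, chosen only to reproduce the qualitative U shape; they are not derived from the IPCS framework or from any copper dose–response data:

```python
import numpy as np

# Hypothetical risk components as functions of daily copper intake (mg/day).
# Shapes and parameters are illustrative only, not IPCS-derived.
def deficiency_risk(intake):
    return np.exp(-2.0 * intake)          # risk falls as intake rises

def excess_risk(intake):
    return np.exp(0.8 * (intake - 10.0))  # risk rises at high intakes

intake = np.linspace(0.1, 12.0, 1000)
total_risk = deficiency_risk(intake) + excess_risk(intake)

# The nadir of the U-shaped total-risk curve marks the center of the AROI.
nadir = intake[np.argmin(total_risk)]
print(f"Nadir of the toy U curve: {nadir:.2f} mg/day")
```

The point of the sketch is structural: because both arms of the curve are monotone in opposite directions, the combined risk has a single minimum, and the AROI is defined around that nadir rather than by a single threshold.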
Platinum speciation used for elucidating activation or inhibition of Pt-containing anti-cancer drugs
Helmholtz-Zentrum München – German Research Center for Environmental Health, Institute of Ecological Chemistry, Ingolstädter Landstr. 1, 85764 Neuherberg, Germany
This article reviews approaches to platinum speciation with respect to Pt drugs in anti-cancer therapies. The paper starts with an introduction to the available platinum-based drugs and describes their assumed principle of action. It is now generally accepted that these Pt complexes exert their therapeutic action by coordination to DNA, which leads to bending of the DNA structure and to an inhibition of DNA polymerase progression. However, dose-limiting side effects, including nephrotoxicity, as well as resistance to some of these Pt compounds, are still a major problem. Platinum speciation moved increasingly into the focus of interest when it became clear (1) that the active drugs are the hydrolysis products rather than the originally administered compounds and (2) that the parallel formation of inactive Pt–protein complexes, which additionally reduces the efficacy of Pt anti-tumor agents, competes with the formation of the cytotoxic Pt–DNA lesions. Speciation analysis methods were employed based on chromatography or capillary electrophoresis, each coupled to inductively coupled plasma (ICP)-mass spectrometry (MS) or electrospray ionization (ESI)-MS.
The paper describes these Pt-speciation investigations, which started with exploring hydrolysis kinetics in aqueous solutions. These were followed by speciation investigations in model solutions containing proteins or other sulphur-containing ligands, which could also be responsible for deactivation of the Pt agent in vivo. These experiments improved the understanding of the metabolite form in which the metal complex enters tumor cells, and of whether and how this metabolite is already inactivated at that stage. As an example, the reaction kinetics of cisplatin (cis-[diamminedichloroplatinum(II)]) with albumin, transferrin, myoglobin, ubiquitin, and metallothionein were investigated and the reaction products were speciated.
Pt-speciation in serum of medicated cancer patients has been conducted by several research groups, whose work is outlined in the section “Investigations in serum”.
The section “Investigations in urine of cancer treated patients” deals with speciation experiments on the Pt metabolites excreted by the organism. By these means, an assessment of the in vivo metabolism of Pt drugs may be possible. Finally, the development of new anti-cancer metallodrugs requires suitable analytical techniques, which are reported in the last section of the paper.
Iodine requirements and the risks and benefits of correcting iodine deficiency in populations
Michael B. Zimmermann
The Human Nutrition Laboratory, ETH Zürich, Switzerland
Division of Human Nutrition, Wageningen University, Wageningen, The Netherlands
Iodine deficiency has multiple adverse effects on growth and development due to inadequate thyroid hormone production that are termed the iodine deficiency disorders (IDD). IDD remains the most common cause of preventable mental impairment worldwide. IDD assessment methods include urinary iodine concentration, goiter, thyroglobulin and newborn thyrotropin. In nearly all iodine-deficient countries, the best strategy to control IDD is salt iodization, one of the most cost-effective ways to contribute to economic and social development. When salt iodization is not possible, iodine supplements can be targeted to vulnerable groups. Introduction of iodized salt to regions of chronic IDD may transiently increase the incidence of thyroid disorders, and programs should include monitoring for both iodine deficiency and excess. Although more data on the epidemiology of thyroid disorders caused by differences in iodine intake are needed, overall, the relatively small risks of iodine excess are far outweighed by the substantial risks of iodine deficiency.
On risks and benefits of iron supplementation: recommendations for iron intake revisited
Klaus Schümann, Thomas Ettle, Bernadett Szegner, Bernd Elsenhans, Noel W. Solomons
Science Center Weihenstephan, Technical University Munich, Am Forum 5, D-85350 Freising, Germany
Department for Food of Animal Origin, Animal Nutrition, and Physiology of Nutrition, University for Soil Culture Wien, Gregor Mendel Strasse 33, A-1180 Wien, Austria
Walther-Straub-Institute for Pharmacology and Toxicology, Ludwig-Maximilians University, Munich Goethestrasse 33, D-80336 München, Germany
Center for Studies of Sensory Impairment, Aging and Metabolism, 17a Avenida #16-89, Zona 11, Guatemala City 01011, Guatemala
Iron is an essential trace element with a high prevalence of deficiency in infants and in women of reproductive age in developing countries. Iron deficiency is frequently associated with anaemia and, thus, with reduced working capacity and impaired intellectual development. Moreover, the risk of premature delivery, stillbirth and impaired host defence is increased in iron deficiency. Iron absorption and distribution are homeostatically regulated to reduce the risk of deficiency and overload. These mechanisms interact, in part, with the mechanisms of oxidative stress and inflammation and with iron availability to pathogens. In plasma, a fraction of iron may not be bound to transferrin and is hypothesised to participate in atherogenesis. Replete iron stores and preceding high iron intakes reduce intestinal iron absorption, which, however, offers no reliable protection against oral iron overload.
Recommendations for dietary iron intake at different life stages are given by the US Food and Nutrition Board (FNB), by FAO/WHO and by the EU Scientific Committee, among others. They are based on estimates of iron losses, iron bioavailability from the diet, and iron requirements for metabolism and growth. Differences in the choice and interpretation of these estimates lead to different recommendations by the different panels, which are discussed in detail.
Assessment of iron-related risks is based on reports of adverse health effects, which have been used in attempts to derive an upper safe level for dietary iron intake. Iron-related harm can be due to direct intestinal damage, to oxidative stress, or to stimulated growth of pathogens. Unfortunately, it is problematic to derive a reproducible cause–effect and dose–response relationship for adverse health effects that suggest a relationship to iron intake, be they based on mechanistic or epidemiological observations. Corresponding data and interpretations are discussed for the intestinal lumen, the vascular system, and the intracellular and interstitial space, considering interference with the mechanisms of iron homeostasis as a likely explanation for differences in epidemiological observations.
Zinc requirements and the risks and benefits of zinc supplementation
Wolfgang Maret, Harold H. Sandstead
Department of Preventive Medicine and Community Health, Division of Human Nutrition, University of Texas Medical Branch, 700 Harborside Drive, Galveston, TX 77555, USA
Department of Anesthesiology, University of Texas Medical Branch, 700 Harborside Drive, Galveston, TX 77555, USA
The adult human body contains 2–3 g of zinc, about 0.1% of which is replenished daily. On this basis, and based on estimates of the bioavailability of zinc, dietary recommendations are made for apparently healthy individuals. The absence of chemical, functional, and/or physical signs of zinc deficiency is assumed to indicate adequacy. More specific data are seldom available. Changing food preferences and availability, and new food preparation, preservation, and processing technologies, may require re-evaluation of past data. Conservative estimates suggest that ≥25% of the world's population is at risk of zinc deficiency. Most of the affected are poor and rarely consume foods rich in highly bioavailable zinc, while subsisting on foods that are rich in inhibitors of zinc absorption and/or contain relatively small amounts of bioavailable zinc. In contrast, among the relatively affluent, food choice is a major factor affecting the risk of zinc deficiency. An additional problem, especially among the relatively affluent, is the risk of chronic zinc toxicity caused by excessive consumption of zinc supplements. High intakes of zinc relative to copper can cause copper deficiency. A major unresolved challenge for maximum health benefit is the proximity of the recommended dietary allowance (RDA) and the reference dose (RfD) for safe intake of zinc. Present recommendations do not consider the numerous dietary factors that influence the bioavailability of zinc and copper, or the likelihood of toxicity from zinc supplements. Thus the currently assumed range between safe and unsafe intakes of zinc is relatively narrow. At present, assessment of zinc nutriture is complex, involving a number of chemical and functional measurements that have limitations in sensitivity and specificity. This approach needs to be enhanced so that zinc deficiency or excess can be detected early.
An increasing number of associations between zinc status and disease, and apparently normal states of health in which additional zinc might be efficacious in preventing certain conditions, point to the pharmacology of zinc compounds as a promising area. For example, the relationship between zinc and diabetes mellitus is an area where research might prove fruitful. In our opinion, a multidisciplinary approach will most likely bring success in this fertile area for translational research.
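The turnover figures quoted in the zinc abstract (2–3 g body pool, ~0.1% replenished daily) imply a daily replenishment of only a few milligrams, which simple arithmetic confirms. Note that the comparison to the US adult RDA in the final comment is our addition for context, not part of the abstract:

```python
# Daily zinc turnover implied by the abstract: 0.1% of a 2-3 g body pool.
body_zinc_mg = (2000, 3000)     # total body zinc, in mg
daily_fraction = 0.001          # ~0.1% of the pool replenished per day

turnover_mg = [pool * daily_fraction for pool in body_zinc_mg]
print(f"Implied daily turnover: {turnover_mg[0]:.1f}-{turnover_mg[1]:.1f} mg/day")
# -> 2.0-3.0 mg/day, well below the US adult RDA of 8-11 mg/day,
#    consistent with the incomplete bioavailability of dietary zinc.
```

The gap between the obligatory turnover and the recommended intake is exactly why the abstract stresses bioavailability estimates in setting dietary recommendations.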