Books by Subject
- 2011 ProQuest Safari. Paul Teetor.
- 2013 Springer. M. Alma Rodriguez, Ronald S. Walters, and Thomas W. Burke, editors. Introduction / M. Alma Rodriguez -- History of MD Anderson's tumor registry / Sarah H. Taylor -- Statistical methods / Geoffrey G. Giacco, Sarah H. Taylor and Kenneth R. Hess -- Breast cancer / Aman U. Buzdar ... [et al.] -- Prostate cancer / Deborah A. Kuban ... [et al.] -- Non-small cell lung cancer / Ritsuko Komaki, Anne S. Tsao, and Reza J. Mehran -- Small cell lung cancer / Frank V. Fossella -- Colon cancer / Cathy Eng, Patrick Lynch, and John Skibber -- Ovarian cancer / Robert L. Coleman and David M. Gershenson -- Cervical cancer / Patricia J. Eifel and Charles Levenback -- Endometrial cancer / Thomas Burke ... [et al.] -- Pancreatic cancer (exocrine) / Jason Fleming ... [et al.] -- Kidney cancer / Scott E. Delacroix Jr. ... [et al.] -- Bladder cancer / Robert S. Svatek ... [et al.] -- Cutaneous melanoma / Jeffrey E. Gershenwald, Geoffrey G. Giacco, and Jeffrey E. Lee -- Liver cancer / Evan S. Glazer and Steven A. Curley -- Esophageal cancer / Linus Ho ... [et al.] -- Gastric cancer / Alexandria T. Phan and Paul F. Mansfield -- Acute myeloid leukemia / Emil J. Freireich -- Chronic lymphocytic leukemia/small lymphocytic lymphoma / Apostolia-Maria Tsimberidou and Michael J. Keating -- Hodgkin lymphoma / Michelle Fanale ... [et al.] -- Non-Hodgkin indolent B-cell lymphoma / Sattva S. Neelapu -- Non-Hodgkin aggressive B-cell lymphoma / M. Alma Rodriguez -- Multiple myeloma / Donna Weber and Raymond Alexanian -- Head and neck cancer / Ehab Hanna ... [et al.] -- Thyroid cancer / Steven I. Sherman, Nancy Perrier, and Gary L. Clayman -- Soft tissue sarcomas / Vinod Ravi, Raphael Pollock, and Shreyaskumar R. Patel -- Sarcomas of bone / Valerae Lewis.
- 2014 ScienceDirect. Jung W. Suh, Youngmin Kim. "Beyond simulation and algorithm development, many developers increasingly use MATLAB even for product deployment in computationally heavy fields. This often demands that MATLAB codes run faster by leveraging the distributed parallelism of Graphics Processing Units (GPUs). While MATLAB successfully provides high-level functions as a simulation tool for rapid prototyping, the underlying details and knowledge needed for utilizing GPUs make MATLAB users hesitate to step into it. Accelerating MATLAB with GPUs offers a primer on bridging this gap. Starting with the basics, setting up MATLAB for CUDA (in Windows, Linux and Mac OS X) and profiling, it then guides users through advanced topics such as CUDA libraries. The authors share their experience developing algorithms using MATLAB, C++ and GPUs for huge datasets, modifying MATLAB codes to better utilize the computational power of GPUs, and integrating them into commercial software products. Throughout the book, they demonstrate many example codes that can be used as templates of C-MEX and CUDA codes for readers' projects. Download example codes from the publisher's website: http://booksite.elsevier.com/9780124080805/ Shows how to accelerate MATLAB codes through the GPU for parallel processing, with minimal hardware knowledge -- Explains the related background on hardware, architecture and programming for ease of use -- Provides simple worked examples of MATLAB and CUDA C codes as well as templates that can be reused in real-world projects."--Provided by publisher.
- 2007 ProQuest Safari. Ken Bluttman and Wayne S. Freeze.
- 2012 CRCnetBASE. Shein-Chung Chow, Mark Chang. Protocol amendment -- Adaptive randomization -- Adaptive hypotheses -- Adaptive dose-escalation trials -- Adaptive group sequential design -- Statistical tests for adaptive seamless designs -- Adaptive sample size adjustment -- Two-stage adaptive design -- Adaptive treatment switching -- Bayesian approach -- Biomarker adaptive trials -- Target clinical trials -- Sample size and power estimation -- Clinical trial simulation -- Regulatory perspectives : a review of FDA draft guidance.
- 2012. Olivia Yueh-Wen Liao. When developing new drugs, Phase I and II trials are commonly conducted to determine the dose of the new treatment in preparation for the subsequent confirmatory Phase III trial. However, because these early-phase trials usually do not have large enough sample sizes to decide which dosage level or treatment regimen is the best, several of them may arise as candidates for the confirmatory Phase III trial. Conventional fixed sample size designs that carry out all the treatment arms of interest are obviously expensive. Therefore, the pharmaceutical industry is increasingly interested in adaptive designs that can use information acquired during the course of the trial to update certain design features. In this thesis we explore several existing designs and discuss their pros and cons. We then propose one that shares the flexibility of Bayesian adaptive designs, while still being able to maintain the frequentist type I error probability. We develop an asymptotic theory for efficient outcome-adaptive randomization schemes and optimal stopping rules. Our approach consists of developing asymptotic lower bounds for the expected sample sizes from the treatment arms and the control arm, and using generalized sequential likelihood ratio procedures to achieve these bounds. These allow us to allocate patients and study resources efficiently by using outcome-adaptive randomization schemes, or by arm suspension/selection if fixed randomization is used. We also derive an adaptive test with a p-value that can be evaluated by Monte Carlo simulation based on an ordering scheme of the sample space. We then show that the approach can also be applied to the closely related problem of multi-stage testing of multiple hypotheses.
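The outcome-adaptive randomization described in the abstract above can be illustrated, very loosely, with the classic randomized play-the-winner urn. This is a generic sketch, not the thesis's actual procedure; the arm success probabilities, patient count, and seed below are invented for the demonstration.

```python
import random

def play_the_winner(success_probs, n_patients, seed=0):
    """Randomized play-the-winner urn: start with one ball per arm;
    each success on an arm adds a ball for that arm, shifting future
    allocation toward better-performing arms. `success_probs[arm]` is
    the (simulated) true success probability of that arm."""
    rng = random.Random(seed)
    urn = [1] * len(success_probs)  # one ball per arm initially
    assignments = []
    for _ in range(n_patients):
        arm = rng.choices(range(len(success_probs)), weights=urn)[0]
        assignments.append(arm)
        if rng.random() < success_probs[arm]:  # success reinforces the arm
            urn[arm] += 1
    return assignments

assignments = play_the_winner([0.2, 0.8], 500)
print([assignments.count(a) for a in (0, 1)])
# the better arm (index 1) ends up treating the majority of patients
```

The urn mechanism is the simplest member of the family; the thesis's contribution is choosing such allocation rules (and stopping rules) to meet asymptotic efficiency bounds while controlling type I error.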
- 2012 CRCnetBASE. Lyle D. Broemeling. 1. Introduction -- 2. Medical tests and preliminary information -- 3. Preview of the book -- 4. Fundamentals of diagnostic accuracy -- 5. Regression and medical test accuracy -- 6. Agreement and test accuracy -- 7. Estimating test accuracy with an imperfect reference standard -- 8. Verification bias and test accuracy -- 9. Test accuracy and medical practice -- 10. Accuracy of combined tests -- 11. Bayesian methods of meta-analysis.
- 2012 Thieme Book. Edited by Mohit Bhandari, Bernd Robioneck ; associate editor, Emil Schemitsch ; managing editor, Sheila Sprague ; with contributions by Volker Alt ... [et al.]. 1. Factorial Randomized Trials -- Summary -- Introduction -- Randomized clinical trial design strategies with multiple interventions -- Selecting a factorial randomized design -- Conclusion -- 2. Expertise-based randomized trials -- Summary -- Introduction -- Conventional RCT design -- Expertise-based RCT design -- Challenges of expertise-based RCTs -- Independent assessor -- Balanced and consecutive or random contributions to screening pool -- Perceived equivalence between practices -- Conclusion -- 3. Randomization systems and technology -- Summary -- Introduction -- Methods of patient allocation -- Other considerations -- Conclusion -- 4. Blinding and concealment -- Summary -- Introduction -- Concealment -- Blinding -- Conclusion -- 5. Composite outcome in orthopaedics: understanding the concepts -- Summary -- Introduction -- Rationale for use of a composite outcome -- Limitations of using composite outcomes -- Guidelines for creating a "good" composite outcome -- Reporting and interpreting composite outcomes -- Conclusion -- 6. Adjudication of outcomes-systems and approaches -- Summary -- Introduction -- Importance of adjudication -- Process of adjudication -- Existing methods of adjudication -- Web-based adjudication -- Conclusion -- 7. Subgroup analyses -- Summary -- Introduction -- Subgroup analysis defined -- Design of a subgroup analysis -- Reporting -- Interpretation -- Conclusion -- 8. Trial management-advanced concepts and systems -- Summary -- Introduction -- Phases of clinical trials -- Common considerations in conducting a clinical trial -- Trial committees -- The search for funding -- Conclusion -- 9. Case-control studies -- Summary -- Introduction -- Definition of a case-control study -- Conclusion -- 10. 
Cohort studies -- Summary -- Introduction -- Cohort studies in the hierarchy of evidence -- Types of cohort study designs -- Methods for reducing confounding and assessing causality -- A checklist to evaluate or improve the strength of evidence -- Conclusion -- 11. Survey Design -- Summary -- Introduction -- Identifying a research question -- Survey development -- Survey design -- Using existing surveys -- Survey validation -- Pilot testing -- Survey administration -- Ethical considerations -- Financial considerations -- Conclusion -- 12. Qualitative studies -- Summary -- Introduction -- What is qualitative research? -- How is qualitative research done? -- Conclusion -- 13. Economic analysis -- Summary -- Introduction -- Theoretical background -- Conducting health economics studies -- Conclusion -- 14. Literature searches -- Summary -- Introduction -- When to conduct a literature search -- How to conduct a literature search -- Study selection -- Assessing methodological quality of studies -- Data extraction and analysis -- Conclusion -- 15. Summary -- Introduction -- The P value does not assess the magnitude of a treatment effect -- The challenge of comparing results across studies -- Types of effect sizes -- Confidence intervals for effect sizes -- Use of effect size in meta-analysis -- Conclusion -- 16. Fixed effects versus random effects -- Summary -- Introduction -- The data pool: fixed effects versus random effects -- Deciding which model to use -- Conclusion -- 17. Heterogeneity -- Summary -- Introduction -- Heterogeneity defined -- Identifying heterogeneity -- Dealing with heterogeneity -- Conclusion -- 18. Uncovering publication bias -- Summary -- Introduction -- Authors and publication bias -- Detecting and adjusting for publication bias -- Minimizing the effect of publication bias -- A possible solution: trial registers -- Conclusion -- 19. 
Statistical pooling-programs and systems -- Summary -- Introduction -- Selecting the appropriate programs and systems: factors to consider -- Review of programs and systems -- Types of programs and systems -- Recommendations -- Conclusion -- 20. Meta-analysis of observational studies -- Summary -- Introduction -- The observational data dilemma -- Conclusion -- 21. Meta-regression -- Summary -- Introduction -- Heterogeneity and meta-regression -- Meta-regression mechanics -- Interpreting a meta-regression -- Limitations of meta-regression -- Conclusion -- 22. Preparing a statistical analysis plan -- Summary -- Introduction -- Data management -- Statistical procedures -- Data safety and monitoring board -- Sample size -- Interim analysis -- Reports to investigators -- Sensitivity analysis -- Tables of presentation and publication -- Software -- Reporting guidelines -- General policies -- Privacy considerations -- Appendices -- What is in the literature about SAP? -- Conclusion -- 24. Survival analysis -- Summary -- Introduction -- Documenting time-to-event (survival) data -- The rationale for time-to-event (survival) analysis -- Methods for survival analysis -- Comparing groups -- Practical considerations in the design of time-to-event studies -- Conclusion -- 25. Interim analyses in randomized trials -- Summary -- Introduction -- Interim analysis defined -- Design of an interim analysis -- Data monitoring committees -- Consequences of stopping early for benefit -- Conclusion -- 26. Conflicts of interest reporting -- Summary -- Introduction -- Legal implications -- Types of conflict of interest -- Who should disclose conflicts of interest? -- When must conflicts of interest be disclosed? -- Guidelines and recommendations for disclosure -- Why must personal financial interests be disclosed? -- Alternatives to disclosure policies -- Conclusion -- 27. 
Authorship-modern approaches and reporting -- Summary -- Introduction -- Academic stream -- Educational stream -- Industrial stream -- Popular media stream -- Conclusion -- 28. Randomized trials reporting checklists -- Summary -- Introduction -- CONSORT statement -- Nonpharmacological trials -- NPT extension to CONSORT statement -- CLEAR NPT -- Conclusion -- 29. Observational studies reporting checklists -- Summary -- Introduction -- Checklist for observational studies: the STROBE statement -- Guide to investigators: how to use the STROBE statement -- The STROBE statement: explanations of checklist items -- Conclusion -- 30. Meta-analysis reporting checklists -- Summary -- Introduction -- Meta-analysis reporting checklists -- Conclusion -- Resources and contacts for surgical research -- Summary -- Introduction -- Mentorship -- Graduate programs -- Online courses -- Textbooks and journals -- Courses and workshops -- Contract research organizations -- Conclusion -- Glossary of terms.
- 2007 Springer. Peter Lucas, José A. Gámez, Antonio Salmerón, eds.
- 2010. Fernando Amat Gil. Cryo-electron tomography (CET) is the only imaging technology capable of visualizing the 3D organization of intact bacterial whole cells at nanometer resolution in situ. However, quantitative image analysis of CET datasets is extremely challenging due to very low signal-to-noise ratio (well below 0 dB), missing data and heterogeneity of biological structures. In this thesis, we present a probabilistic framework to align CET images in order to improve resolution and create structural models of different biological structures. The alignment problem of 2D and 3D CET images is cast as a Markov Random Field (MRF), where each node in the graph represents a landmark in the image. We connect pairs of nodes based on local spatial correlations and we find the "best" correspondence between the two graphs. In this correspondence problem, the "best" solution maximizes the probability score in the MRF. This probability is the product of singleton potentials that measure image similarity between nodes and the pairwise potentials that measure deformations between edges. Well-known approximate inference algorithms such as Loopy Belief Propagation (LBP) are used to obtain the "best" solution. We present results in two specific applications: automatic alignment of tilt series using fiducial markers and subtomogram alignment. In the first case we present RAPTOR, which is being used in several labs to enable real high-throughput tomography. In the second case our approach is able to reach the contrast transfer function limit in low SNR samples from whole cells as well as revealing atomic resolution details invisible to the naked eye through nanogold labeling.
- 2006 Springer. Rainer Flindt ; translated by Neil Solomon.
- 2007 Springer. Alain F. Zuur, Elena N. Ieno, Graham M. Smith.
- 2014 CRCnetBASE. Guanyu Wang. Food intake and energy metabolism -- Glucose homeostasis -- Optimal glucose homeostasis -- Bistability as a fundamental phenomenon -- Biomolecular network -- PI3K-AKT-TOR pathway -- Diseases related to metabolism -- Mathematical modeling of the PI3K-AKT-TOR pathway -- Fundamental decomposition -- Normal phenotype -- Disease phenotypes -- Tao of diseases.
- 2007. Mohamed M. Shoukri, Mohammad A. Chaudhary.
- 2015 Wiley. [Edited by] Dongmei Chen, Bernard Moulin, Jianhong Wu. Introduction to analyzing and modeling spatial and temporal dynamics of infectious diseases / Dongmei Chen, Bernard Moulin, Jianhong Wu -- Modeling the spread of infectious diseases : a review / Dongmei Chen -- West Nile virus : a narrative from bioinformatics and mathematical modeling studies / U.S.N. Murty, Amit Kumar Banerjee and Jianhong Wu -- West Nile virus risk assessment and forecasting using statistical and dynamical models / Ahmed Abdelrazec, Yurong Cao, Xin Gao, Paul Proctor, Hui Zheng, and Huaiping Zhu -- Using mathematical modeling to integrate disease surveillance and global air transportation data / Julien Arino and Kamran Khan -- Mathematical modeling of malaria models with spatial effects / Daozhou Gao and Shigui Ruan -- Avian influenza spread and transmission dynamics / Lydia Bourouiba, Stephen Gourley, Rongsong Liu, John Takekawa, and Jianhong Wu -- Analyzing the potential impact of bird migration on the global spread of H5N1 avian influenza (2007-2011) using spatio-temporal mapping methods / Heather Richardson and Dongmei Chen -- Cloud computing-enabled cluster detection using a flexibly shaped scan statistic for real-time syndromic surveillance / P. Belanger and K. Moore -- Mapping the distribution of malaria : current approaches and future directions / L.R. Johnson, K.D. Lafferty, A. McNally, E. Mordecai, K. Paaijmans, S. Pawar, S.J. 
Ryan -- Statistical modeling of spatio-temporal infectious disease transmission / Rob Deardon, Xuan Fang and Grace Pui Sze Kwong -- Spatio-temporal dynamics of schistosomiasis in China : Bayesian-based geostatistical analysis / Zhi-Jie Zhang -- Spatial analysis and statistical modeling of 2009 H1N1 pandemic in the greater Toronto area / Frank Wen, Dongmei Chen, Anna Majury -- West Nile virus mosquito abundance modeling using a non-stationary spatio-temporal geostatistics / Eun-Hye Yoo, Dongmei Chen, Curtis Russel -- Spatial pattern analysis of multivariate disease data / Cindy X. Feng and Charmaine Dean -- The ZoonosisMAGS project (part 1) : population-based geosimulation of zoonoses in an informed virtual geographic environment / Bernard Moulin, Mondher Bouden, Daniel Navarro -- ZoonosisMAGS project (part 2) : complementarity of a rapid-prototyping tool and of a full-scale geosimulator for population-based geosimulation of zoonoses / Bernard Moulin, Daniel Navarro, Dominic Marcotte, Said Sedrati -- Web-mapping and behaviour pattern extraction tools to assess lyme disease risk for humans in peri-urban forests / Hedi Haddad, Bernard Moulin, Franck Manirakiza, Christelle Maha, Vincent Godard and Samuel Mermet -- An integrated approach for communicable disease geosimulation based on epidemiological, human mobility and public intervention models / Hedi Haddad, Bernard Moulin, Marius Thariault -- Smartphone trajectories as data sources for agent-based infection spread modeling / M.R. Friesen and R.D. McLeod.
- 2013 Cambridge. Jos W.R. Twisk, Department of Epidemiology and Biostatistics, Medical Center and the Department of Health Sciences of the Vrije Universiteit, Amsterdam. 1. Introduction -- 2. Study design -- 3. Continuous outcome variables -- 4. Continuous outcome variables, relationships with other variables -- 5. The modelling of time -- 6. Other possibilities for modelling longitudinal data -- 7. Dichotomous outcome variables -- 8. Categorical and 'count' outcome variables -- 9. Analysis of data from experimental studies -- 10. Missing data in longitudinal studies -- 11. Sample size calculations -- 12. Software for longitudinal data analysis -- 13. One step further.
- 2013 CRCnetBASE. Geoff Der, Brian S. Everitt. "Adding topics useful to medical statisticians, this new edition of a popular intermediate-level reference explores the use of SAS for analyzing medical data. A new chapter on visualizing data includes a detailed account of graphics for investigating data and smoothing techniques. The book also includes new chapters on measurement in medicine, epidemiology/observational studies, meta-analysis, Bayesian methods, and handling missing data. The book maintains its example-based approach, with SAS code and output included throughout and available online"--Provided by publisher.
- 2013 CRCnetBASE. Ding-Geng (Din) Chen, Karl E. Peace. "Preface In Chapter 8 of our previous book (Chen and Peace, 2010), we briefly introduced meta-analysis using R. Since then, we have been encouraged to develop an entire book on meta-analyses using R that would include a wide variety of applications - which is the theme of this book. In this book we provide a thorough presentation of meta-analysis with detailed step-by-step illustrations on their implementation using R. In each chapter, examples of real studies compiled from the literature and scientific publications are presented. After presenting the data and sufficient background to permit understanding the application, various meta-analysis methods appropriate for analyzing data are identified. Then analysis code is developed using appropriate R packages and functions to meta-analyze the data. Analysis code development and results are presented in a stepwise fashion. This stepwise approach should enable readers to follow the logic and gain an understanding of the analysis methods and the R implementation so that they may use R and the steps in this book to analyze their own meta-analyses. Based on their experience in biostatistical research and teaching biostatistical meta-analysis, the authors understand that there are gaps between developed statistical methods and applications of statistical methods by students and practitioners. This book is intended to fill this gap by illustrating the implementation of statistical meta-analysis methods using R applied to real data following a step-by-step presentation style. With this style, the book is suitable as a text for a course in meta-analysis at the graduate level (Master's or Doctorate), particularly for students seeking degrees in statistics or biostatistics"-- Provided by publisher.
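To give a flavor of the kind of computation books like this one implement (in R packages such as those it covers), here is a minimal Python sketch, not the authors' code, of inverse-variance fixed-effect pooling, the simplest meta-analytic estimator; the three study effects and variances are invented for illustration.

```python
import math

def fixed_effect_meta(effects, variances):
    """Inverse-variance fixed-effect pooling: each study's effect
    estimate is weighted by 1/variance; returns the pooled effect
    and its standard error sqrt(1 / sum of weights)."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se

# three hypothetical studies reporting log odds ratios with variances
effects = [0.30, 0.10, 0.25]
variances = [0.04, 0.02, 0.05]
pooled, se = fixed_effect_meta(effects, variances)
print(round(pooled, 3), round(se, 3))  # → 0.184 0.103
```

Precise studies (small variance) dominate the pooled estimate, which is why the result here sits closest to the second study's effect.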
- 2014 ebrary. Xiao-Hua Zhou, Chuan Zhou, Danping Liu, Xiaobo Ding. Missing data concepts and motivating examples -- Overview of methods for dealing with missing data -- Design considerations in the presence of missing data -- Cross-sectional data methods -- Longitudinal data methods -- Survival analysis under ignorable missingness -- Nonignorable missingness -- Analysis of randomized clinical trials with noncompliance.
- Peter Sprent and Nigel C. Smeeton.
- 2009 Springer. Andrea S. Foulkes.
- 2007 Springer. Joaquim P. Marques de Sá.
- by Themistocles L. Assimes ... [et al.].
- 2004 AccessMedicine. Beth Dawson, Robert G. Trapp. Also available: Print – 2004
- 2011 CRCnetBASE. Scott M. Berry, Bradley P. Carlin, J. Jack Lee, and Peter Müller. Chapter 1. Statistical approaches for clinical trials -- Chapter 2. Basics of Bayesian inference -- Chapter 3. Phase I studies -- Chapter 4. Phase II studies -- Chapter 5. Phase III studies -- Chapter 6. Special topics.
- 2012. Sergio Bacallado de Lara. A host of sequential models in probability and statistics are characterized by time reversibility, from Markov chain Monte Carlo samplers to queueing networks. In physics, this property arises naturally from Hamiltonian mechanics. Molecular dynamics simulations are computer experiments which approximate classical mechanics in a system of interacting particles; in consequence, they are frequently reversible. Recent technical progress has made it possible to investigate the dynamics of biological macromolecules in silico using molecular dynamics simulations. An active area of research within this field is concerned with modeling the output of a simulation stochastically. This dissertation deals with the problem of incorporating knowledge of reversibility into the estimation and testing of stochastic models. We define a range of Bayesian inference algorithms, which are motivated by specific problems in the analysis of molecular dynamics simulations.
- 2012 CRCnetBASE. Phil Woodward. "Although the popularity of the Bayesian approach to statistics has been growing for years, many still think of it as somewhat esoteric, not focused on practical issues, or generally too difficult to understand. Bayesian Analysis Made Simple is aimed at those who wish to apply Bayesian methods but either are not experts or do not have the time to create WinBUGS code and ancillary files for every analysis they undertake. Accessible to even those who would not routinely use Excel, this book provides a custom-made Excel GUI, immediately useful to those users who want to be able to quickly apply Bayesian methods without being distracted by computing or mathematical issues. From simple NLMs to complex GLMMs and beyond, Bayesian Analysis Made Simple describes how to use Excel for a vast range of Bayesian models in an intuitive manner accessible to the statistically savvy user. Packed with relevant case studies, this book is for any data analyst wishing to apply Bayesian methods to analyze their data, from professional statisticians to statistically aware scientists"-- Provided by publisher.
- 2007 CRCnetBASE. Lyle D. Broemeling. Introduction -- Diagnostic medicine -- Other diagnostic procedures -- Bayesian statistics -- Bayesian methods for diagnostic accuracy -- Regression and test accuracy -- Agreement -- Diagnostic imaging and clinical trials -- Other topics.
- 2007 Springer. Jim Albert. "Bayesian Computation with R introduces Bayesian modeling by the use of computation using the R language. The early chapters present the basic tenets of Bayesian thinking by use of familiar one and two-parameter inferential problems. Bayesian computational methods such as Laplace's method, rejection sampling, and the SIR algorithm are illustrated in the context of a random effects model. The construction and implementation of Markov Chain Monte Carlo (MCMC) methods is introduced. These simulation-based algorithms are implemented for a variety of Bayesian applications such as normal and binary response regression, hierarchical modeling, order-restricted inference, and robust modeling. Algorithms written in R are used to develop Bayesian tests and assess Bayesian models by use of the posterior predictive distribution. The use of R to interface with WinBUGS, a popular MCMC computing language, is described with several illustrative examples."--Jacket.
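Rejection sampling, one of the computational methods this blurb names, fits in a few lines. The sketch below is in Python rather than the book's R, and the target (a Beta(2,2) density) and uniform proposal are chosen here purely for illustration.

```python
import random

def rejection_sample(target_pdf, draw_proposal, proposal_pdf, M, n, seed=0):
    """Draw n samples from target_pdf by rejection: propose x from the
    proposal distribution and accept it with probability
    target_pdf(x) / (M * proposal_pdf(x)), where M bounds the ratio."""
    rng = random.Random(seed)
    samples = []
    while len(samples) < n:
        x = draw_proposal(rng)
        if rng.random() < target_pdf(x) / (M * proposal_pdf(x)):
            samples.append(x)
    return samples

# target: Beta(2,2) on [0,1], density 6x(1-x), peak 1.5 at x = 0.5;
# proposal: Uniform(0,1), so the envelope constant M = 1.5 is exact
beta22 = lambda x: 6.0 * x * (1.0 - x)
samples = rejection_sample(beta22, lambda r: r.random(), lambda x: 1.0, 1.5, 5000)
print(round(sum(samples) / len(samples), 2))  # sample mean, near 0.5
```

The efficiency of the method is 1/M (here two thirds of proposals are accepted); a tighter envelope means fewer wasted draws.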
- 2009 CRCnetBASE. Andrew B. Lawson. Bayesian inference and modeling -- Computational issues -- Residuals and goodness-of-fit -- Disease map reconstruction and relative risk estimation -- Disease cluster detection -- Ecological analysis -- Multiple scale analysis -- Multivariate disease analysis -- Spatial survival and longitudinal analysis -- Spatiotemporal disease mapping.
- 2013 CRCnetBASE. Andrew B. Lawson. Bayesian inference and modeling -- Computational issues -- Residuals and goodness-of-fit -- Disease map reconstruction and relative risk estimation -- Disease cluster detection -- Regression and ecological analysis -- Putative hazard modeling -- Multiple scale analysis -- Multivariate disease analysis -- Spatial survival and longitudinal analysis -- Spatiotemporal disease mapping -- Disease map surveillance.
- 2009 CRCnetBASE. Lyle D. Broemeling. Chapter 1. Introduction to Agreement -- Chapter 2. Bayesian Methods of Agreement for Two Raters -- Chapter 3. More than Two Raters -- Chapter 4. Agreement and Correlated Observations -- Chapter 5. Modeling Patterns of Agreement -- Chapter 6. Agreement with Quantitative Scores -- Chapter 7. Sample Sizes for Agreement Studies.
- 2014 CRCnetBASE. Lyle D. Broemeling, Broemeling and Associates, Medical Lake, Washington, USA. Written by a biostatistics expert with over 20 years of experience in the field, Bayesian Methods in Epidemiology presents statistical methods used in epidemiology from a Bayesian viewpoint. It employs the software package WinBUGS to carry out the analyses and offers the code in the text and for download online. The book examines study designs that investigate the association between exposure to risk factors and the occurrence of disease. It covers introductory adjustment techniques to compare mortality between states and regression methods to study the association between various risk factors.
- 2010 CRCnetBASE. Ming T. Tan, Guo-Liang Tian, Kai Wang Ng. Optimization, Monte Carlo simulation, and numerical integration -- Exact solutions -- Discrete missing data problems -- Computing posteriors in the EM-type structures -- Constrained parameter problems -- Checking compatibility and uniqueness -- Basic statistical distributions and stochastic processes.
- 2010 CRCnetBASE. Tomohiro Ando. "Along with many practical applications, Bayesian Model Selection and Statistical Modeling presents an array of Bayesian inference and model selection procedures. It thoroughly explains the concepts, illustrates the derivations of various Bayesian model selection criteria through examples, and provides R code for implementation. The author shows how to implement a variety of Bayesian inference using R and sampling methods, such as Markov chain Monte Carlo. He covers the different types of simulation-based Bayesian model selection criteria, including the numerical calculation of Bayes factors, the Bayesian predictive information criterion, and the deviance information criterion. He also provides a theoretical basis for the analysis of these criteria. In addition, the author discusses how Bayesian model averaging can simultaneously treat both model and parameter uncertainties. Selecting and constructing the appropriate statistical model significantly affect the quality of results in decision making, forecasting, stochastic structure explorations, and other problems. Helping you choose the right Bayesian model, this book focuses on the framework for Bayesian model selection and includes practical examples of model selection criteria."--Publisher's description.
- 2011 CRCnetBASE. Edited by Dipak K. Dey, Samiran Ghosh, Bani K. Mallick. Chapter 1. Estimation and Testing in Time-Course Microarray Experiments -- Chapter 2. Classification for Differential Gene Expression Using Bayesian Hierarchical Models -- Chapter 3. Applications of MOSS for Discrete Multi-Way Data -- Chapter 4. Nonparametric Bayesian Bioinformatics -- Chapter 5. Measurement Error and Survival Model for cDNA Microarrays -- Chapter 6. Bayesian Robust Inference for Differential Gene Expression -- Chapter 7. Bayesian Hidden Markov Modeling of Array CGH Data -- Chapter 8. Bayesian Approaches to Phylogenetic Analysis -- Chapter 9. Gene Selection for the Identification of Biomarkers in High-Throughput Data -- Chapter 10. Sparsity Priors for Protein-Protein Interaction Predictions -- Chapter 11. Learning Bayesian Networks for Gene Expression Data -- Chapter 12. In-Vitro to In-Vivo Factor Profiling in Expression Genomics -- Chapter 13. In-Vitro to In-Vivo Factor Profiling in Expression Genomics Machines -- Chapter 14. A Bayesian Mixture Model for Protein Biomarker Discovery -- Chapter 15. Bayesian Methods for Detecting Differentially Expressed Genes -- Chapter 16. Bayes and Empirical Bayes Methods for Spotted Microarray Data Analysis -- Chapter 17. Bayesian Classification Method for QTL Mapping.
- Edited by S.K. Upadhyay, Umesh Singh, Dipak K. Dey.
- 2014 ebrary. Kevin Bretonnel Cohen, Dina Demner-Fushman. "Biomedical Natural Language Processing" is a comprehensive tour through the classic and current work in the field. It discusses all subjects from both a rule-based and a machine learning approach, and also describes each subject from the perspective of both biological science and clinical medicine. The intended audience is readers who already have a background in natural language processing, but a clear introduction makes it accessible to readers from the fields of bioinformatics and computational biology, as well. The book is suitable as a reference, as well as a text for advanced courses in biomedical natural language processing and text mining.
- 2014 Atypon. Joseph V. Tranquillo. Biomedical Signals and Systems is meant to accompany a one-semester undergraduate signals and systems course. It may also serve as a quick-start for graduate students or faculty interested in how signals and systems techniques can be applied to living systems. The biological nature of the examples allows for systems thinking to be applied to electrical, mechanical, fluid, chemical, thermal and even optical systems. Each chapter focuses on a topic from classic signals and systems theory: system block diagrams, mathematical models, transforms, stability, feedback, system response, control, time and frequency analysis and filters. Embedded within each chapter are examples from the biological world, ranging from medical devices to cell and molecular biology. While the focus of the book is on the theory of analog signals and systems, many chapters also introduce the corresponding topics in the digital realm. Although some derivations appear, the focus is on the concepts and how to apply them. Throughout the text, systems vocabulary is introduced which will allow the reader to read more advanced literature and communicate with scientists and engineers. Homework and MATLAB simulation exercises are presented at the end of each chapter and challenge readers to not only perform calculations and simulations but also to recognize the real-world signals and systems around them.
- 2009Wayne W. Daniel.
- 2008Geoffrey R. Norman, David L. Streiner.The nature of data and statistics -- Analysis of variance -- Regression and correlation -- Nonparametric statistics -- Reprise.
- 2008 SpringerDaryl S. Paulson.
- 2009 SpringerFrancesco Sardanelli, Giovanni Di Leo.
- Calculations for molecular biology and biotechnology : a guide to mathematics in the laboratory. 2nd ed.2010 ScienceDirectFrank H. Stephenson.Chapter 1. Scientific notation and metric prefixes -- Chapter 2. Solutions, mixtures, and media -- Chapter 3. Cell growth -- Chapter 4. Working with bacteriophages -- Chapter 5. Nucleic acid quantification -- Chapter 6. Labeling nucleic acids with radioisotopes -- Chapter 7. Oligonucleotide synthesis -- Chapter 8. The polymerase chain reaction (PCR) -- Chapter 9. The real-time polymerase chain reaction (RT-PCR) -- Chapter 10. Recombinant DNA -- Chapter 11. Protein -- Chapter 12. Centrifugation -- Chapter 13. Forensics and paternity.
- 2013 CRCnetBASEMarco Scianna and Luigi Preziosi."All biological phenomena emerge from an intricate interconnection of multiple processes occurring at different levels of organization: namely, at the molecular, the cellular and the tissue level, see Figure 1. These natural levels can approximately be connected to a microscopic, mesoscopic, and macroscopic scale, respectively. The microscopic scale refers to those processes that occur at the subcellular level, such as DNA synthesis and duplication, gene dynamics, activation of receptors, transduction of chemical signals, diffusion of ions and transport of proteins. The mesoscopic scale, on the other hand, can refer to cell-level phenomena, such as adhesive interactions between cells or between cells and ECM components, cell duplication and death and cell motion. The macroscopic scale finally corresponds to those processes that are typical of multicellular behavior, such as population dynamics, tissue mechanics and organ growth and development. It is evident that research in biology and medicine needs to work in a multiscale fashion. This brings many challenging questions and a complexity that cannot be addressed in the classical way, but can take advantage of the increasing collaboration between natural and exact sciences (for more detailed comments the reader is referred to [90, 262]). On the other hand, the recent literature provides evidence of the increasing attention of the mathematical, statistical, computational and physical communities toward biological and biomedical modeling, a consequence of the successful results obtained by a multidisciplinary approach to Life Sciences problems"-- Provided by publisher.
- 2011Jeremy J. Shen.In this thesis, we present advancements in some change-point problems and their applications to genomic problems that arise from massively parallel sequencing. Change-point problems are concerned with abrupt changes in the generating distribution of a stochastic process evolving over time, space, or any ordered set. This thesis focuses on a number of change-point models and inference problems on point processes. We provide a change-point model and efficient algorithms to detect change-points in the relative intensity of non-homogeneous Poisson processes. A model selection approach is constructed in the spirit of the classical Bayesian Information Criterion, but tailored to the irregularity of change-point problems. We review an array of inference problems surrounding the change-point construct and propose a point-wise Bayesian credible interval for the parameter of the generating distribution for the exponential family. An asymptotic result on the relationship between frequentist and Bayesian change-point estimators is shown. We investigate how data characteristics, such as sample size, signal strength, and change-point location, influence the inference procedures through a simulation study. On the application front, modern massively parallel sequencing generates enormous and rich data with much systematic and random noise. We provide a survey of the sequencing technologies and some statistical challenges in various steps of sequencing data analysis. A recent application of sequencing in population and tumor genomics is the profiling of genome copy number and detection of copy number variations across samples. We demonstrate in this thesis that sequencing reads can be viewed naturally as a stochastic process along the genome. Copy number variants are modeled as abrupt jumps in the read intensity function. This modeling assumption resembles the biological reality of mutations that lead to copy number change.
We demonstrate the application of change-point methods on actual sequencing data. Our method is found to compare favorably against a commonly used existing method in a spike-in simulation study. We lastly discuss a direction in which our change-point methods can be extended. It is often of interest to find recurrent copy number variants among a collection of biological samples. We review existing array-based multi-sample copy number profiling methods. Estimation and model selection procedures for the multi-sample sequencing setting are derived as extensions of our two-sample methods. A key challenge is the treatment of carrier status, which is whether a sample carries the recurrent variant in question. We present two sets of methods, one based on the assumption that all samples are carriers and the other based on a known carrier set. The statistical characteristics of the two methods are compared in a number of simulation scenarios.
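The abrupt-jump model this abstract describes can be illustrated with a minimal sketch (not the thesis's actual method, which works with Poisson intensities and a tailored model selection criterion): a single change-point in a piecewise-constant signal is estimated by minimizing within-segment squared error. The function name and the toy signal below are hypothetical.

```python
def detect_single_changepoint(y):
    """Return the split index k (1 <= k < len(y)) that minimizes the
    sum of squared deviations from each segment's mean."""
    n = len(y)
    best_k, best_cost = None, float("inf")
    for k in range(1, n):
        left, right = y[:k], y[k:]
        mean_left = sum(left) / len(left)
        mean_right = sum(right) / len(right)
        cost = (sum((v - mean_left) ** 2 for v in left)
                + sum((v - mean_right) ** 2 for v in right))
        if cost < best_cost:
            best_k, best_cost = k, cost
    return best_k

# Simulated "read intensity" with an abrupt jump (2 -> 8) at position 5
signal = [2, 2, 2, 2, 2, 8, 8, 8, 8, 8]
print(detect_single_changepoint(signal))  # 5
```

A full method along the lines of the abstract would replace the squared-error cost with a Poisson likelihood and search over multiple change-points with a penalized criterion.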
- 2009 OvidGloria P. Craig.
- 2009 Thieme Bookedited by Mohit Bhandari, Anders Joensson.Why we need clinical research -- Historical perspectives of clinical research -- Evidence-based surgery defined -- Myths and misconceptions about evidence-based medicine -- Becoming an evidence-based surgeon -- Principles of clinical research -- Various research design classifications -- Hierarchy of research studies: from case series to meta-analyses -- Randomized and nonrandomized studies -- Understanding research study design -- Clinical case series -- Case-control study -- Prospective cohort study -- Randomized trial -- Meta-analysis -- Economic analysis -- Diagnostic study -- Reliability study -- Understanding outcomes measurement -- Classification of outcomes -- What makes an outcome measure useful -- Common generic outcome scales for surgeons -- Common disease-specific outcome scales for surgeons -- Understanding treatment effects -- Common ways to present treatment effects -- Confidence interval defined -- P value defined -- Errors in hypothesis testing -- Clinical versus statistical significance -- Practice of clinical research -- Planning a research study -- Requirements of a clinical research proposal -- Identification of a good research question -- How to conduct a comprehensive literature search -- Guide to planning a randomized trial -- Guide to planning a nonrandomized study -- Study sample size -- How to budget for a research study -- Research ethics, review boards, and consent forms -- Regulatory issues in the evaluation of a new device or drug -- Strategies in research funding -- Conducting a research study -- Roles of the research team -- Role of a central methods center -- Role of a data monitoring committee -- Need for separate adjudication of outcomes -- Data management -- Study case report forms -- Study manual of operations -- Review of basic statistical principles -- Statistical means and proportions -- Regression analysis -- Analysis of variance -- Correlation defined.
- 2011 CRCnetBASEDing-Geng (Din) Chen, Karl E. Peace."With examples based on the authors' 30 years of real-world experience in many areas of clinical drug development, this book provides a thorough presentation of clinical trial methodology. It presents detailed step-by-step illustrations on the implementation of the open-source software R. Case studies demonstrate how to select the appropriate clinical trial data. The authors introduce the corresponding biostatistical analysis methods, followed by the step-by-step data analysis using R. They also offer the R program for download, along with other essential data, on their website"--Provided by publisher.
- 2014 WileyMichael O'Kelly, Bohdana Ratitch."This book provides practical guidance for statisticians, clinicians, and researchers involved in clinical trials in the biopharmaceutical industry, medical and public health organisations. Academics and students needing an introduction to handling missing data will also find this book invaluable. The authors describe how missing data can affect the outcome and credibility of a clinical trial, show by examples how a clinical team can work to prevent missing data, and present the reader with approaches to address missing data effectively. The book is illustrated throughout with realistic case studies and worked examples, and presents clear and concise guidelines to enable good planning for missing data. The authors show how to handle missing data in a way that is transparent and easy to understand for clinicians, regulators and patients. New developments are presented to improve the choice and implementation of primary and sensitivity analyses for missing data. Many SAS code examples are included - the reader is given a toolbox for implementing analyses under a variety of assumptions"--Provided by publisher.
- 2009 CRCnetBASERichard J. Hayes, Lawrence H. Moulton.Variability between clusters -- Choosing whether to randomise by cluster -- Choice of clusters -- Matching and stratification -- Randomisation procedures -- Sample size -- Alternative study designs -- Basic principles of analysis -- Analysis based on cluster-level summaries -- Regression analysis based on individual level data -- Analysis of trials with more complex designs -- Ethical considerations -- Data monitoring -- Reporting and interpretation.
- 2009 CRCnetBASEGabriel Valiente.
- 2010 ProQuest SafariGlenn A. Walker, Jack Shostak.Introduction & basics -- Topics in hypothesis testing -- The data set TRIAL -- The one-sample t-test -- The two-sample t-test -- One-way ANOVA -- Two-way ANOVA -- Repeated measures analysis -- The crossover design -- Linear regression -- Analysis of covariance -- The Wilcoxon signed-rank test -- The Wilcoxon rank-sum test -- The Kruskal-Wallis test -- The binomial test -- The Chi-square test -- Fisher's exact test -- McNemar's test -- The Cochran-Mantel-Haenszel test -- Logistic regression -- The log-rank test -- The Cox proportional hazards model -- Exercises.Also available: Print – 2010
- Comparative effectiveness and efficacy research and analysis for practice (CEERAP) : applications in health care2012 SpringerFrancesco Chiappelli, editor ; Xenia Maria Caldeira Brant, Corazon B. Cajulis, co-editors.Recent trends in health care across the United States and internationally have emphasized a novel approach that consists in comparing the effectiveness and efficacy of treatment interventions with a patient-centered emphasis (i.e., evidence-based health care), while ensuring cost constraints, maximizing benefits, and minimizing risks. In this book, experts in comparative effectiveness and efficacy research and analysis for practice (CEERAP) in health care in general address a range of topical issues. The emphasis is on implications for endodontics and nursing, both of which are considered in a series of detailed chapters. Commonalities and differences among CEERAP, utility-based and logic-based analysis and decision-making, and evidence-based and patient-centered practice are defined and discussed. The book concludes by examining applications for CEERAP in developing patient-centered optimal treatment interventions for the next decade.
- 2008 CRCnetBASEedited by Ravindra Khattree and Dayanand Naik.
- 2008Wendy L. Martinez, Angel R. Martinez.
- 2008 SpringerYadolah Dodge.
- 2009 CRCnetBASEEsa I. Uusipaikka.
- 2011 CRCnetBASEShein-Chung Chow."Preface In the pharmaceutical/clinical development of a test drug or treatment, relevant clinical data are usually collected from subjects with the diseases under study in order to evaluate the safety and efficacy of the test drug or treatment under investigation. To provide an accurate and reliable assessment, well-controlled clinical trials under a valid study design are necessarily conducted. The clinical trial process is lengthy and costly, but necessary to ensure a fair and reliable assessment of the test treatment under investigation. It consists of protocol development, trial conduct, data collection, statistical analysis/interpretation, and reporting. In practice, controversial issues inevitably occur regardless of compliance with good statistical practice (GSP) and good clinical practice (GCP). Controversial issues in clinical trials are debatable issues that are commonly encountered during the conduct of clinical trials. In practice, controversial issues could arise from, but are not limited to, (1) compromises between theoretical and real/common practices, (2) miscommunication and/or misunderstanding in perception/interpretation among regulatory agencies, clinical scientists, and biostatisticians, and (3) disagreement, inconsistency, miscommunication/misunderstanding, and errors in clinical practice"--Provided by publisher.
- 2007 CRCnetBASEMichael Greenacre.
- John Maindonald and W. John Braun.
- 2008 SpringerPhil Spector.
- 2011 CRCnetBASELuís Torgo."This hands-on book uses practical examples to illustrate the power of R and data mining. Assuming no prior knowledge of R or data mining/statistical techniques, it covers a diverse set of problems that pose different challenges in terms of size, type of data, goals of analysis, and analytical tools. The main data mining processes and techniques are presented through detailed, real-world case studies. With these case studies, the author supplies all necessary steps, code, and data. Mirroring the do-it-yourself approach of the text, the supporting website provides data sets and R code" -- Provided by publisher.
- 2011 SpringerGraham Williams.Part 1. Explorations -- Introduction -- Getting Started -- Working with Data -- Loading Data -- Exploring Data -- Interactive Graphics -- Transforming Data -- Part 2. Building Models -- Descriptive and Predictive Analytics -- Cluster Analysis -- Association Analysis -- Decision Trees -- Random Forests -- Boosting -- Support Vector Machines -- Part 3. Delivering Performance -- Model Performance Evaluation -- Deployment -- Part 4. Appendices -- Installing Rattle -- Sample Datasets.
- 2007Mamdouh Refaat.
- Reva Berman Brown and Mark Saunders.Why you need to use statistics in your research -- Understanding statistical language -- Linking data collection with analysis -- Presenting data -- Describing data -- Inferring differences and relationships from data -- What next? -- Appendix 1: Asking questions and types of data -- Appendix 2: Useful statistical software -- Appendix 3: An alphabet of statistics.
- 2009 CRCnetBASEShein-Chung Chow, Jen-pei Liu.Design of bioavailability studies -- Statistical inferences for effects from a standard 2x2 crossover design -- Statistical methods for average bioequivalence -- Power and sample size determination -- Transformation and analysis of individual subject ratios -- Assessment of inter- and intra-subject variabilities -- Assumptions of outlier detection for average bioequivalence -- Optimal crossover designs for two formulations for average bioequivalence -- Assessment of bioequivalence for more than two formulations -- Population and individual bioequivalence -- Statistical procedures for assessment of population and individual bioequivalence -- Assessment of bioequivalence for drugs with negligible plasma levels -- In vitro bioequivalence testing -- In vitro dissolution profiles comparison -- Meta-analysis for bioequivalence review -- Population pharmacokinetics -- Other pharmacokinetic studies -- Review of regulatory guidances on bioequivalence -- Frequently asked questions and future challenges.
- 2009 CRCnetBASEedited by Karl E. Peace.Overview of time-to-event endpoint methodology / Karl E. Peace -- Design (and monitoring) of clinical trials with time-to-event endpoints / Michael W. Sill and Larry Rubinstein -- Overview of time-to-event parametric methods / Karl E. Peace and Kao-Tai Tsai -- Overview of semiparametric inferential methods for time-to-event endpoints / Jianween Cai and Donglin Zeng -- Overview of inferential methods for categorical time-to-event data / Eric V. Slud -- Overview of Bayesian inferential methods including time-to-event endpoints / Laura H. Gunn -- An efficient alternative to the Cox model for small time-to-event trials / Devan V. Mehrotra and Arthur J. Roth -- Estimation and testing for change in hazard for time-to-event endpoints / Rafia Bhore and Mohammad Huque -- Overview of descriptive and graphical methods for time-to-event data / Michael O'Connell and Bob Treder -- Design and analysis of analgesic trials / Akiko Okamoto, Julia Wang, and Surya Mohanty -- Design and analysis of analgesic trials with paired time-to-event endpoints / Zhu Wang and Hon Keung Tony Ng -- Time-to-event endpoint methods in antibiotic trials / Karl E. Peace -- Design and analysis of cardiovascular prevention trials / Michelle McNabb and Andreas Sashegyi -- Design and analysis of antiviral trials / Anthony C. Segreti and Lynn P. Dix -- Cure rate models with applications to melanoma and prostate cancer data / Ming-Hui Chen and Sungduk Kim -- Parametric likelihoods for multiple nonfatal competing risks and death, with application to cancer data / Peter F. Thall and Xuemei Wang -- Design, summarization, analysis, and interpretation of cancer prevention trials / Matthew C. Somerville, Jennifer B. Shannon, and Timothy H. 
Wilson -- LASSO method in variable selection for right-censored time-to-event data with application to astrocytoma brain tumor and chronic myelogenous leukemia / Lili Yu and Dennis Pearl -- Selecting optimal treatments based on predictive factors / Eric C. Polley and Mark J. van der Laan -- Application of time-to-event methods in the assessment of safety in clinical trials / Kelly L. Moore and Mark J. van der Laan -- Design and analysis of chronic carcinogenicity studies of pharmaceuticals in rodents / Mohammad Atiar Rahman and Karl K. Lin -- Design and analysis of time-to-tumor response in animal studies : a Bayesian perspective / Steve Thomson and Karl K. Lin.
- 2012 WileyGerald van Belle, Kathleen F. Kerr.Design and Analysis of Experiments in the Health Sciences; Contents; Preface; 1 The Basics; 1.1 Four Basic Questions; 1.2 Variation; 1.3 Principles of Design and Analysis; 1.4 Experiments and Observational Studies; 1.5 Illustrative Applications of Principles; 1.6 Experiments in the Health Sciences; 1.7 Adaptive Allocation; 1.7.1 Equidistribution; 1.7.2 Adaptive Allocation Techniques; 1.8 Sample Size Calculations; 1.9 Statistical Models for the Data; 1.10 Analysis and Presentation; 1.10.1 Graph the Data in Several Ways; 1.10.2 Assess Assumptions of the Statistical Model.
- 2010 CRCnetBASEDiane L. Fairclough.Using SAS, SPSS, and R, this book addresses design and analysis aspects in enough detail so that readers can apply statistical methods to their own longitudinal studies. This edition includes a new chapter on testing models that involve moderation and mediation, a new chapter on QALYs and QTWiST specific to clinical trials, and recent methodological developments for the analysis of trials with missing data. It also presents revised discussions of multiple comparisons procedures that focus on the integration of HRQoL outcomes with other study outcomes using gatekeeper strategies.
- 2008Patrick Dattalo.Basic terms and concepts -- Statistical power analysis -- Confidence intervals: measures of precision -- Computer-intensive methods -- Additional considerations, recommendations, and conclusions -- Worked examples.
- Developing a national registry of pharmacologic and biologic clinical trials : workshop report (2006)2006 NAPCommittee on Clinical Trial Registries, Board on Health Sciences Policy.
- 2002 NAPCommittee on Dietary Risk Assessment in the WIC Program, Food and Nutrition Board, Institute of Medicine.Also available: Print – 2002
- Also available: Print – 1979-
- 2006 NCBI Bookshelfeditors, Dean T. Jamison, Richard G. Feachem, Malegapuru W. Makgoba ... [et al.].Changing patterns of disease and mortality in Sub-Saharan Africa: an overview / Florence K. Baingana and Eduard R. Bos -- Levels and trends in mortality in Sub-Saharan Africa: an overview / Jacob Adetunji and Eduard R. Bos -- Trends in child mortality, 1960 to 2000 / Kenneth Hill and Agbessi Amouzou -- Levels and trends of adult mortality / Debbie Bradshaw and Ian M. Timaeus -- Causes of death / Chalapati Rao, Alan D. Lopez, and Yusuf Hemed -- Population and mortality after AIDS / Rodolfo A. Bulatao -- Levels and patterns of mortality at INDEPTH demographic surveillance systems / Osman A. Sankoh ... [et al.] -- Trends and issues in child undernutrition / Todd Benson and Meera Shekar -- Diarrheal diseases / Cynthia Boschi-Pinto, Claudio F. Lanata, Walter Mendoza, and Demissie Habte -- Developmental disabilities / Geoff Solarsh and Karen J. Hofman -- Acute respiratory infections / Shabir A. Mahdi and Keith P. Klugman -- Vaccine-preventable diseases / Mark A. Miller and John T. Sentz -- Tuberculosis / Christopher Dye ... [et al.] -- Malaria / Robert W. Snow and Judy A. Omumbo -- Onchocerciasis / Uche Amazigo ... [et al.] -- Maternal mortality / Khama O. Rogo, John Oucho, and Philip Mwalali -- HIV/AIDS / Souleymane Mboup ... [et al.] -- Lifestyle and related risk factors for chronic diseases / Krisela Steyn and Albertino Damasceno -- Diabetes mellitus / Jean-Claude Mbanya and Kaushik Ramiaya -- Cancers / Freddy Sitas ... [et al.] -- Cardiovascular disease / Anthony Mbewu and Jean-Claude Mbanya -- Mental health and the abuse of alcohol and controlled substances / Florence K. Baingana, Atalay Alem, and Rachel Jenkins -- Neurological disorders / Donald Silberberg and Elly Katabira -- Violence and injuries / Brett Bowman ... [et al.].
- 2011 ProQuest SafariJohn K. Kruschke.This book's organization : read me first! -- Introduction : models we believe in -- What is this stuff called probability? -- Bayes' rule -- Inferring a binomial proportion via exact mathematical analysis -- Inferring a binomial proportion via grid approximation -- Inferring a binomial proportion via the Metropolis algorithm -- Inferring two binomial proportions via Gibbs sampling -- Bernoulli likelihood with hierarchical prior -- Hierarchical modeling and model comparison -- Null hypothesis significance testing -- Bayesian approaches to testing a point ("null") hypothesis -- Goals, power, and sample size -- Overview of the generalized linear model -- Metric predicted variable on a single group -- Metric predicted variable with one metric predictor -- Metric predicted variable with multiple metric predictors -- Metric predicted variable with one nominal predictor -- Metric predicted variable with multiple nominal predictors -- Dichotomous predicted variable -- Ordinal predicted variable -- Contingency table analysis -- Tools in the trunk.
- 2011 CRCnetBASEYing Kuen Cheung.Part I. Fundamentals -- Chapter 1. Introduction -- Chapter 2. Dose Finding in Clinical Trials -- Chapter 3. The Continual Reassessment Method -- Chapter 4. One-Parameter Dose-Toxicity Models -- Chapter 5. Theoretical Properties -- Chapter 6. Empirical Properties -- Part II. Design Calibration -- Chapter 7. Specifications of a CRM Design -- Chapter 8. Initial Guesses of Toxicity Probabilities -- Chapter 9. Least Informative Normal Prior -- Chapter 10. Initial Design -- Part III. CRM and Beyond -- Chapter 11. The Time-to-Event CRM -- Chapter 12. CRM with Multiparameter Models -- Chapter 13. When the CRM Fails -- Chapter 14. Stochastic Approximation.
- 2012 CRCnetBASEHans van Houwelingen, Hein Putter."In the last twenty years, dynamic prediction models have been extensively used to monitor patient prognosis in survival analysis. Written by one of the pioneers in the area, this book synthesizes these developments in a unified framework. It covers a range of models, including prognostic and dynamic prediction of survival using genomic data and time-dependent information. The text includes numerous examples using real data taken from the authors' collaborative research. R programs are provided for implementing the methods"--Provided by publisher.
- 2010Nicholas Johnson.We present models and algorithms that can be applied to common problems in the analysis of genomic data. These include CNV (Copy Number Variation) detection, local ancestry inference in admixed populations, and haplotype inference in panels of unrelated individuals. Chapter 2 proposes a new algorithm for the Fused Lasso Signal Approximator, which has recently been proposed as an alternative to HMMs for CNV detection. Chapter 3 describes new models for local ancestry inference when high density genotype data is available; our focus is on a higher order Autoregressive Hidden Markov Model (ARHMM). We give solutions to problems that have thus far prevented the use of higher order ARHMMs for this task, and we demonstrate the model on real and simulated data. Finally, in chapter 4 we give an approach for inferring haplotypes from unphased genotype data. We optimize a likelihood closely related to the PHASE model (which is considered one of the most accurate), and we show that the proposed approach is substantially more accurate than recent alternatives. The work in these chapters contributes to common and important tasks in the analysis of genomic data for association studies.
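For orientation, the Fused Lasso Signal Approximator named in this abstract solves an optimization problem of the following standard form (generic notation, not the thesis's own; the first penalty shrinks estimates toward zero, the second penalizes jumps between neighboring positions, producing the piecewise-constant fits useful for CNV segmentation):

```latex
\hat{\beta} = \arg\min_{\beta \in \mathbb{R}^n} \;
  \frac{1}{2}\sum_{i=1}^{n}(y_i - \beta_i)^2
  \;+\; \lambda_1 \sum_{i=1}^{n} |\beta_i|
  \;+\; \lambda_2 \sum_{i=2}^{n} |\beta_i - \beta_{i-1}|
```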
- 2010 Kais Fam FoundThe Kaiser Family Foundation, Health Research and Educational Trust, [and the National Opinion Research Center]."This annual survey of employers provides a detailed look at trends in employer-sponsored health coverage, including premiums, employee contributions, cost-sharing provisions, and other relevant information. The survey continued to document the prevalence of high-deductible health plans associated with a savings option and included questions on wellness benefits and health risk assessments. The 2010 survey included 3,143 randomly selected public and private firms with three or more employees (2,046 of which responded to the full survey and 1,097 of which responded to an additional question about offering coverage). Researchers at the Kaiser Family Foundation, the National Opinion Research Center at the University of Chicago, and Health Research & Educational Trust designed and analyzed the survey"--Website.Also available: Print – 2010
- 2005- Wiley[editors-in-chief], Peter Armitage, Theodore Colton.The Encyclopedia of Biostatistics, 2nd Edition offers a reference to support the development and use of statistical methods for addressing the problems and critical issues that confront scientists, practitioners and policy makers engaged in the life and medical sciences.
- Reference tool covering statistics, probability theory, biostatistics, quality control, and economics with emphasis in applications of statistical methods in sociology, engineering, computer science, biomedicine, psychology, survey methodology, and other client disciplines.
- 2009 SpringerBryan Kestenbaum ; editors, Kathryn L. Adeney, Noel S. Weiss ; contributing author, Abigail B. Shoben.
- 2007 Springeredited by Agota Szende, Mark Oppe, and Nancy Devlin.
- 2014 SpringerStephen P. Glasser, editor.In its extensively revised and updated Second Edition, this book provides a solid foundation for readers interested in clinical research. Discussion encompasses genetic, pharmacoepidemiologic and implementation research. All chapters have been updated with new information and many new tables have been added to elucidate key points. The book now offers discussion on how to handle missing data when analyzing results, and coverage of Adaptive Designs and Effectiveness Designs and new sections on Comparative Effectiveness Research and Pragmatic Trials. Chapter 6 includes new material on Phase 0 Trials, expanded coverage of Futility Trials, a discussion of Medical Device approval, Off Label Drug use and the role of the FDA in regulating advertising. Additional new information includes the role of pill color and shape in association with the placebo effect and an examination of issues surrounding minority recruitment. The final chapter offers a new section on manuscript preparation along with a discussion of various guidelines being adopted by journals: CONSORT, STROBE, PRISMA, MOOSE and others; and coverage of Conflicts of Interest, Authorship, Coercive Citation, and Disclosures in Industry-Related Associations. Building on the strengths of its predecessor in its comprehensive approach and authoritative advice, the new edition offers more of what has made this book a popular, trusted resource for students and working researchers alike.
- 2008 Springeredited by Stephen P. Glasser.I. pt. I. Clinical Research: Definitions, "Anatomy and Physiology," and the Quest for "Universal Truth" / Stephen P. Glasser, p. 3-11 -- Introduction to Clinical Research and Study Designs / Stephen P. Glasser, p. 13-27 -- Clinical Trials / Stephen P. Glasser, p. 29-62 -- Alternative Interventional Study Designs / Stephen P. Glasser, p. 63-71 -- Postmarketing Research / Stephen P. Glasser, Elizabeth Delzell and Maribel Salas, p. 73-91 -- The United States Federal Drug Administration (FDA) and Clinical Research / Stephen P. Glasser, Carol M. Ashton and Nelda P. Wray, p. 73-110 -- The Placebo and Nocebo Effect / Stephen P. Glasser and William Frishman, p. 111-140 -- Recruitment and Retention / Stephen P. Glasser, p. 141-149 -- Data Safety and Monitoring Boards (DSMBs) / Stephen P. Glasser and O. Dale Williams, p. 151-158 -- Meta-Analysis / Stephen P. Glasser and Sue Duval, p. 159-177 -- II. pt. II. -- Research Methods for Genetic Studies / Sadeep Shrestha and Donna K. Arnett, p. 181-199 -- Research Methods for Pharmacoepidemiology Studies / Maribel Salas and Bruno Stricker, p. 201-216 -- Implementation Research: Beyond the Traditional Randomized Controlled Trial / Amanda H. Salanitro, Carlos A. Estrada and Jeroan J. Allison, p. 217-244 -- Research Methodology for Studies of Diagnostic Tests / Stephen P. Glasser, p. 245-257 -- III. pt. III. Statistical Power and Sample Size: Some Fundamentals for Clinician Researchers / J. Michael Oakes, p. 261-278 -- Association, Cause, and Correlation / Stephen P. Glasser and Gary Cutter, p. 279-294 -- Bias, Confounding, and Effect Modification / Stephen P. Glasser , p. 295-302 -- It's All About Uncertainty / Stephen P. Glasser and George Howard, p. 303-316 -- Grant Writing / Donna K. Arnett and Stephen P. Glasser, p. 317-325 -- IV. pt. IV. The Media and Clinical Research / Stephen P. Glasser, p. 329-333 -- Mentoring and Advising / Stephen P. Glasser and Edward W. Hook, p. 
335-340 -- Presentation Skills: How to Present Research Results / Stephen P. Glasser, p. 341-349.
- 2012 WileyNicky J. Welton, Alexander J. Sutton, Nicola J. Cooper, Keith R. Abrams, A.E. Ades.Introduction -- Bayesian methods and winBUGS -- Introduction to decision models -- Meta-analysis using Bayesian methods -- Exploring between study heterogeneity -- Model critique and evidence consistency in random effects meta-analysis -- Evidence synthesis in a decision modelling framework -- Multi-parameter evidence synthesis in epidemiological models -- Mixed treatment comparisons -- Markov models -- Generalised evidence synthesis -- Expected value of information for research prioritisation and study design.
- 2012 SpringerPedro J. Gutiérrez Diez, Irma H. Russo, Jose Russo.Historical Introduction -- Descriptive Biostatistics -- Inferential Biostatistics (I): Estimating Values of Biomedical Magnitudes -- Inferential Biostatistics (II): Estimating Biomedical Behaviors -- Equations: Formulating Biomedical Laws and Biomedical Magnitudes -- Systems of Equations: The Explanation of Biomedical Phenomena (I). Basic Questions -- Systems of Equations: The Explanation of Biomedical Phenomena (II). Dynamic Interdependencies -- Optimal Control Theory: From Knowledge to Control (I). Basic Concepts -- Optimal Control Theory: From Knowledge to Control (II). Biomedical Applications -- Game Theory.
- [v.6], 2014. Jonathan Orsay.
- 2007 ProQuest Safari. David and Raina Hawley. Reducing workbook and worksheet frustration -- Hacking Excel's built-in features -- Naming hacks -- Hacking PivotTables -- Hacking formulas and functions -- Macro hacks -- Cross-application hacks.
- 2007. W. John Braun, Duncan J. Murdoch.
- 2009. Leslie G. Portney, Mary P. Watkins.
- 2008 Springer. Luc Duchateau and Paul Janssen.
- 2011 CRCnetBASE. Andreas Wienke. The concept of frailty offers a convenient way to introduce unobserved heterogeneity and associations into models for survival data. In its simplest form, frailty is an unobserved random proportionality factor that modifies the hazard function of an individual or a group of related individuals. Frailty Models in Survival Analysis presents a comprehensive overview of the fundamental approaches in the area of frailty models. The book extensively explores how univariate frailty models can represent unobserved heterogeneity. It also emphasizes correlated frailty models as extensions of univariate and shared frailty models. The author analyzes similarities and differences between frailty and copula models; discusses problems related to frailty models, such as tests for homogeneity; and describes parametric and semiparametric models using both frequentist and Bayesian approaches. He also shows how to apply the models to real data using the statistical packages of R, SAS, and Stata. The appendix provides the technical mathematical results used throughout. Written in nontechnical terms accessible to nonspecialists, this book explains the basic ideas in frailty modeling and statistical techniques, with a focus on real-world data application and interpretation of the results. By applying several models to the same data, it allows for the comparison of their advantages and limitations under varying model assumptions. The book also employs simulations to analyze the finite sample size performance of the models.--From the publisher's website.
- 2011 CRCnetBASE. Jian Qing Shi, Taeryon Choi. "Gaussian Process Regression Analysis for Functional Data presents nonparametric statistical methods for functional regression analysis, specifically the methods based on a Gaussian process prior in a functional space. The authors focus on problems involving functional response variables and mixed covariates of functional and scalar variables. Covering the basics of Gaussian process regression, the first several chapters discuss functional data analysis, theoretical aspects based on the asymptotic properties of Gaussian process regression models, and new methodological developments for high dimensional data and variable selection. The remainder of the text explores advanced topics of functional regression analysis, including novel nonparametric statistical methods for curve prediction, curve clustering, functional ANOVA, and functional regression analysis of batch data, repeated curves, and non-Gaussian data. Many flexible models based on Gaussian processes provide efficient ways of model learning, interpreting model structure, and carrying out inference, particularly when dealing with large dimensional functional data. This book shows how to use these Gaussian process regression models in the analysis of functional data. Some MATLAB® and C codes are available on the first author's website"--Publisher's website.
- 2009 Springer. Hadley Wickham. Describes ggplot2, a data visualization package for R and a powerful and flexible system for creating data graphics.
- 2006 NCBI Bookshelf. Alan D. Lopez ... [et al.], editors.
- 2011 WHO. "This report sets out the statistics, evidence and experiences needed to launch a more forceful response to the growing threat posed by noncommunicable diseases. While advice and recommendations are universally relevant, the report gives particular attention to conditions in low- and middle-income countries, which now bear nearly 80% of the burden from diseases like cardiovascular disease, diabetes, cancer and chronic respiratory diseases. The health consequences of the worldwide epidemic of obesity are also addressed. The report takes an analytical approach, using global, regional and country-specific data to document the magnitude of the problem, project future trends, and assess the factors contributing to these trends. As noted, the epidemic of these diseases is being driven by forces now touching every region of the world: demographic aging, rapid unplanned urbanization, and the globalization of unhealthy lifestyles"--Publisher's description. Also available: Print – 2011
- 2014 WHO. This global status report on prevention and control of NCDs (2014) is framed around the nine voluntary global targets. The report provides data on the current situation, identifying bottlenecks as well as opportunities and priority actions for attaining the targets. The 2010 baseline estimates on NCD mortality and risk factors are provided so that countries can report on progress, starting in 2015. In addition, the report provides the latest available estimates on NCD mortality (2012) and risk factors, 2010-2012. All ministries of health need to set national NCD targets and lead the development and implementation of policies and interventions to attain them. There is no single pathway to attain NCD targets that fits all countries, as they are at different points in their progress in the prevention and control of NCDs and at different levels of socioeconomic development. However, all countries can benefit from the comprehensive response to attaining the voluntary global targets presented in this report.--Publisher description. Also available: Print – 2014
- 2014 WHO. World Health Organization ; UNODC, United Nations Office on Drugs and Crime ; UNDP, United Nations Development Programme ; [Alexander Butchart and Christopher Mikton coordinated and wrote the report]. The Global status report on violence prevention 2014, which reflects data from 133 countries, is the first report of its kind to assess national efforts to address interpersonal violence, namely child maltreatment, youth violence, intimate partner and sexual violence, and elder abuse. Jointly published by WHO, the United Nations Development Programme, and the United Nations Office on Drugs and Crime, the report reviews the current status of violence prevention efforts in countries, and calls for a scaling up of violence prevention programmes; stronger legislation and enforcement of laws relevant for violence prevention; and enhanced services for victims of violence.--Publisher description. Also available: Print – 2014
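The frailty-model entry above describes an individual's hazard as a baseline hazard scaled by an unobserved random factor. As a minimal illustration of that idea (not code from the book, which works in R, SAS, and Stata), the following Python sketch simulates a shared gamma frailty model; all names and parameter values here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Shared gamma frailty model: within a group, each subject's hazard is
# h(t | Z) = Z * h0(t), where Z ~ Gamma(shape=1/theta, scale=theta), so E[Z] = 1.
theta = 0.5                    # frailty variance (hypothetical value)
lam = 0.1                      # constant baseline hazard h0(t) = lam
n_groups, group_size = 2000, 2

# One frailty value per group, shared by all of that group's members.
Z = rng.gamma(shape=1 / theta, scale=theta, size=n_groups)

# Given Z, survival times are exponential with rate Z * lam.
times = rng.exponential(scale=1.0 / (Z[:, None] * lam),
                        size=(n_groups, group_size))

# The shared, unobserved Z induces positive association within groups.
corr = np.corrcoef(times[:, 0], times[:, 1])[0, 1]
print(f"within-group correlation of survival times: {corr:.2f}")
```

Because members of a group share the same unobserved Z, their survival times are positively correlated even though each time is exponential given Z; that induced association is the unobserved heterogeneity the blurb refers to.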
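Similarly, the Gaussian process regression entry above rests on standard GP conditioning: given noisy observations, the posterior mean and variance at a new input follow in closed form from the kernel matrix. A minimal Python sketch, assuming a squared-exponential kernel and invented data (this is not the authors' MATLAB/C code, and the hyperparameter values are assumptions):

```python
import numpy as np

def sq_exp_kernel(a, b, length=1.0, var=1.0):
    """Squared-exponential covariance k(x, x') = var * exp(-(x-x')^2 / (2*length^2))."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

# Hypothetical training data: noisy observations of a smooth function.
rng = np.random.default_rng(1)
x = np.linspace(0, 5, 20)
y = np.sin(x) + 0.1 * rng.standard_normal(x.size)

noise = 0.1 ** 2
K = sq_exp_kernel(x, x) + noise * np.eye(x.size)

# Standard GP posterior at a new input x*:
#   mean = k*^T (K + sigma^2 I)^{-1} y
#   var  = k(x*, x*) - k*^T (K + sigma^2 I)^{-1} k*
x_new = np.array([2.5])
k_star = sq_exp_kernel(x, x_new)
alpha = np.linalg.solve(K, y)
mean = k_star.T @ alpha
var = sq_exp_kernel(x_new, x_new) - k_star.T @ np.linalg.solve(K, k_star)
print(mean, var)
```

The kernel choice encodes the prior over functions; everything else is linear algebra on the joint Gaussian, which is why GP models lend themselves to the model interpretation and inference the blurb highlights.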
- Harrison's Principles of Internal Medicine
- AAP Red Book Online
- Robbins & Cotran Pathologic Basis of Disease
- Sabiston Textbook of Surgery
- Nelson's Textbook of Pediatrics
- Surgical Exposures in Orthopaedics
- Mandell, Douglas, & Bennett's Principles & Practice of Infectious Diseases
- ICU Book
- Primary Care Medicine
- Campbell-Walsh Urology
Access restricted to Stanford community
Shortcut to Licensed Content
TO INSTALL, DRAG THIS BUTTON to your browser's Bookmarks or Tools bar.
Bookmark on Other Websites
- TO INSTALL, RIGHT-CLICK this button.
- Select "Add to Favorites" (click "Continue" if you see a security alert)
- From the "Create in" menu, select "Favorites Bar" (IE8, IE9) to install
- Once installed, it will look like this
- Click "Bookmark on Lane" to bookmark any webpage
- Your saved bookmark will appear on this page
Can't find it?
See if we have it in print
- Springer Protocols
- Lange Series
- National Academy Press
- NCBI Bookshelf
- Thieme Atlases
A repository of medical knowledge covering internal medicine, cardiology, genetics, pharmacy, diagnosis and management, basic sciences, patient care, and more.
Continuously expanding, all databases in the repository contain the latest editions of selected medical titles.
- MicroMedex: Premier pharmaceutical information source containing multiple databases and drug reference tools. Of particular value is DRUGDEX Evaluations, one of the most comprehensive drug sources available.
- DynaMed Plus: A clinical information resource used by physicians to answer clinical questions quickly and easily at the point of care. Topics are updated daily as new evidence becomes available.
- Scopus: The largest abstract and citation database of peer-reviewed literature: scientific journals, books and conference proceedings.
- A drug information resource containing: American Hospital Formulary System (AHFS), the drug formulary for Lucile Packard Children's Hospital (LPCH) and Stanford Hospital & Clinics (SHC), Lexi-Drugs (adverse reactions, dosage and administration, mechanism of action, storage, use, and administration information), Lexi-Calc, Lexi-ID, Lexi-I.V. Compatibility (King Guide), Lexi-Interact, and Lexi-PALS.
- Cumulative Index to Nursing and Allied Health Literature (CINAHL): Coverage of nursing and allied health literature.
- A knowledge database that provides access to topic reviews based on over 6,000 clinically relevant articles. The evidence-based content, updated regularly, provides the latest practice guidelines in 59 medical specialties.
- Provides critical assessments of systematic reviews compiled from a variety of medical journals.
- Selects from the biomedical literature original studies and systematic reviews that are immediately clinically relevant and then summarizes these articles in an enhanced abstract with expert commentary.
Multidisciplinary coverage of over 10,000 high-impact journals in the sciences, social sciences, and arts and humanities, as well as international proceedings coverage for over 120,000 conferences.
Includes cited reference searching, citation maps, and an analyze tool.
Features systematic reviews that summarize the effects of interventions and make a determination as to whether the intervention is efficacious.
Cochrane reviews are created through a strict process of compiling and analyzing data from multiple randomized controlled trials to ensure comprehensiveness and reliability.
- Provides systematic coverage of the psychological literature from the 1800s to the present through articles, book chapters and dissertations.
- BMJ Clinical Evidence: A clinical information tool built around systematic reviews summarizing the current state of knowledge about prevention and treatment of clinical conditions.
- PIER (Physicians' Information and Education Resource): A web-based decision-support tool designed for rapid point-of-care delivery of up-to-date, evidence-based guidance for primary care physicians.
- Cochrane Central Register of Controlled Trials (CENTRAL): Provides access to 300,000 controlled trials that have been identified by the Cochrane Collaboration.
- Provides drug information targeted for patients.
- A continually updated drug monograph.
- The National Guideline Clearinghouse (NGC): A comprehensive database of evidence-based clinical practice guidelines and related documents.
- MedlinePlus: A repository of health information from the National Library of Medicine. Links are from trusted sites; no advertising, no endorsement of commercial companies or products.
- LPCH CareNotes via MicroMedex: Patient education handouts customized by LPCH clinical staff.
- Micromedex Lab Advisor: Evidence-based laboratory test information.
- A drug database organized by generic name, trade name and drug class.
- LPCH / Stanford Hospital Formulary.
- A goldmine of trusted consumer health information from the world's largest medical library.
- A trusted source of expert advice for and about kids, providing the information necessary to help patients and parents understand their unique needs.
- Provides patient handouts from the American Academy of Family Physicians.
- Access to the Stanford Health Library for patients.
Lane provides access to over 5,000 eBooks, many of which provide helpful background material that will prepare you to better tackle primary literature.
Largest, broadest eBook package; covers all sciences, as well as technology (including software), medicine, and humanities.
In addition to covering Wiley and Springer, MyiLibrary is also the only provider for Oxford and Cambridge University Press titles. No seat restrictions.
A collection of biomedical books that can be searched directly by concept, and linked to terms in PubMed abstracts.
A web-based, decision support system for infectious diseases, epidemiology, microbiology and antimicrobial chemotherapy. The database, updated weekly, currently includes 337 diseases, 224 countries, 1,147 microbial taxa and 306 antibacterial (-fungal, -parasitic, -viral) agents and vaccines.
Over 10,000 notes outline the status of specific infections within each country.
Provides online, full-text access to Springer's journal titles as well as journals from other publishers.
Subjects include: life sciences, chemical sciences, environmental sciences, geosciences, computer science, mathematics, medicine, physics and astronomy, engineering and economics. Also includes eBooks.
A collection of over 8,000 full-text titles in engineering, math, and basic and applied biomedical research. Coverage is from 1967 to the present.
A library of eBooks on a wide array of topics, digitized and made available online in conjunction with the original publishers.