Showing 25–48 of 75 results
A complete and comprehensive classic in probability and measure theory. Probability and Measure, Anniversary Edition, by Patrick Billingsley, celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years.
The past decade has seen a dramatic increase in the use of Bayesian methods in marketing due, in part, to computational and modelling breakthroughs, making its implementation ideal for many marketing problems. Bayesian analyses can now be conducted over a wide range of marketing problems, from new product introduction to pricing, and with a wide variety of different data sources. Bayesian Statistics and Marketing describes the basic advantages of the Bayesian approach, detailing the nature of the computational revolution.
Mastering R has never been easier. Picking up R can be tough, even for seasoned statisticians and data analysts. R For Dummies, 2nd Edition provides a quick and painless way to master all the R you’ll ever need. Requiring no prior programming experience and packed with practical examples, step-by-step exercises, and sample code, this friendly and accessible guide shows you how to find your way around lists, data frames, and other R data structures, while learning to interact with other programs, such as Microsoft Excel.
Suitable for self-study. Uses real examples and real data sets that will be familiar to the audience. An introduction to the bootstrap is included, a modern method missing from many other books.
Two fundamental theories are commonly debated in the study of random processes: the Bachelier-Wiener model of Brownian motion, which has been the subject of many books, and the Poisson process. While nearly every book mentions the Poisson process, most hurry past it to more general point processes or to Markov chains. This comparative neglect is ill-judged, and stems from a failure to appreciate the real importance of the Poisson process. The distortion comes about partly from a restriction to one dimension, whereas the theory becomes more natural in more general contexts.
Clustering remains a vibrant area of research in statistics. Although there are many books on this topic, relatively few are well grounded in the theoretical aspects. In Robust Cluster Analysis and Variable Selection, Gunter Ritter presents an overview of the theory and applications of probabilistic clustering and variable selection, synthesizing the key research results of the last 50 years. The author focuses on the robust clustering methods he found to be the most useful on simulated data and in real-time applications.
Praise for the Third Edition: "Future mathematicians, scientists, and engineers should find the book to be an excellent introductory text for coursework or self-study as well as worth its shelf space for reference." –MAA Reviews. Applied Mathematics, Fourth Edition is a thoroughly updated and revised edition on the applications of modeling and analyzing natural, social, and technological processes. The book covers a wide range of key topics in mathematical methods and modeling and highlights the connections between mathematics and the applied and natural sciences.
This book provides an undergraduate introduction to discrete and continuous-time Markov chains and their applications. A large focus is placed on the first step analysis technique and its applications to average hitting times and ruin probabilities. Classical topics such as recurrence and transience, stationary and limiting distributions, as well as branching processes, are also covered. Two major examples (gambling processes and random walks) are treated in detail from the beginning, before the general theory itself is presented in the subsequent chapters.
This book developed from classes in mathematical biology taught by the authors over several years at the Technische Universität München. The main themes are modeling principles, mathematical principles for the analysis of these models and model-based analysis of data. The key topics of modern biomathematics are covered: ecology, epidemiology, biochemistry, regulatory networks, neuronal networks and population genetics. A variety of mathematical methods are introduced, ranging from ordinary and partial differential equations to stochastic graph theory and branching processes.
This book covers the theory of signals and (linear) systems and their applications. After an introduction based on examples from various application areas, it discusses the basic techniques for describing continuous-time linear time-invariant systems and their effect on signals. The author presents numerous examples with real data and makes the material and the accompanying MATLAB programs available on the Internet. The book contains over 150 exercises, many of them MATLAB/Simulink-based.
"In this splendid new book, Jorma Rissanen, the originator of the minimum description length (MDL) principle, puts forward a comprehensive theory of estimation which differs in several ways from the standard Bayesian and frequentist approaches. During the development of MDL over the last 30 years, it gradually emerged that MDL could be viewed, informally, as a maximum probability principle that directly extends Fisher’s classical maximum likelihood method to allow for estimation of a model’s structural properties."
Modern computer-intensive statistical methods play a key role in solving many problems across a wide range of scientific disciplines. This new edition of the bestselling Randomization, Bootstrap and Monte Carlo Methods in Biology illustrates the value of a number of these methods with an emphasis on biological applications.
This textbook focuses on three related areas in computational statistics: randomization, bootstrapping, and Monte Carlo methods of inference. The author emphasizes the sampling approach within randomization testing and confidence intervals.
The Seventh Symposium in Applied Mathematics, sponsored by the American Mathematical Society and the Office of Ordnance Research, and devoted to Mathematical Probability and Its Applications, was held at the Polytechnic Institute of Brooklyn on April 14 and 15, 1955. This volume contains the papers (one in abstract form) which were presented at the Symposium. Prolonged consideration by the members of the Program Committee, under the chairmanship of Dr. H. W. Bode, resulted in the decision that the Symposium should be concerned with three principal themes.
Multidimensional scaling covers a variety of statistical techniques in the area of multivariate data analysis. Geared toward dimensional reduction and graphical representation of data, it arose within the field of the behavioral sciences, but its techniques are now widely used in many disciplines.
The model investigated in this work, a particular cellular automaton with stochastic evolution, was introduced as the simplest case of self-organized criticality, that is, a dynamical system which shows algebraic long-range correlations without any tuning of parameters. The author derives exact results which are potentially also interesting outside the area of critical phenomena; here, "exact" means site-by-site results, not merely ensemble averages or coarse graining. The dynamics often generate very complex and amazingly beautiful periodic patterns, especially in deterministic protocols in which the sand is added at chosen sites.
With the development of new fitting methods, their increased use in applications, and improved computer languages, the fitting of statistical distributions to data has come a long way since the introduction of the generalized lambda distribution (GLD) in 1969. Handbook of Fitting Statistical Distributions with R presents the latest and best methods, algorithms, and computations for fitting distributions to data. It also provides in-depth coverage of cutting-edge applications. The book begins with commentary by three GLD pioneers: John S.
Updated to reflect SAS 9.2, A Handbook of Statistical Analyses using SAS, Third Edition continues to provide a straightforward description of how to conduct various statistical analyses using SAS.
Each chapter shows how to use SAS for a particular type of analysis. The authors cover inference, analysis of variance, regression, generalized linear models, longitudinal data, survival analysis, principal components analysis, factor analysis, cluster analysis, discriminant function analysis, and correspondence analysis.
This is a history of the use of Bayes' theorem, from its discovery by Thomas Bayes to the rise of its statistical competitors in the first part of the twentieth century. The book focuses particularly on the development of one of the fundamental aspects of Bayesian statistics, and in this new edition readers will find new sections on contributors to the theory. In addition, this edition includes amplified discussion of relevant work.
“Elementary Statistics: A Step By Step Approach” is for introductory statistics courses with a basic algebra prerequisite. The book is non-theoretical, explaining concepts intuitively and teaching problem solving through worked examples and step-by-step instructions. In recent editions, Al Bluman has placed more emphasis on conceptual understanding and interpreting results, along with increased focus on Excel, MINITAB, and the TI-83 Plus and TI-84 Plus graphing calculators, computing technologies commonly used in such courses.
Statistical Rethinking: A Bayesian Course with Examples in R and Stan builds readers’ knowledge of and confidence in statistical modeling. Reflecting the need for even minor programming in today’s model-based statistics, the book pushes readers to perform step-by-step calculations that are usually automated. This unique computational approach ensures that readers understand enough of the details to make reasonable choices and interpretations in their own modeling work. The text presents generalized linear multilevel models from a Bayesian perspective, relying on a simple logical interpretation of Bayesian probability and maximum entropy.
Give Your Students the Proper Groundwork for Future Studies in Optimization
A First Course in Optimization is designed for a one-semester course in optimization taken by advanced undergraduate and beginning graduate students in the mathematical sciences and engineering. It teaches students the basics of continuous optimization and helps them better understand the mathematics from previous courses.
The book focuses on general problems and the underlying theory. It introduces all the necessary mathematical tools and results.
Wolfgang Doeblin, one of the greatest probabilists of this century, died in action during World War II at the age of twenty-five. He left behind several seminal contributions which have profoundly influenced the field and continue to provide inspiration for current research. This book is based on papers presented at the conference, ‘Fifty Years after Doeblin: Developments in the Theory of Markov Chains, Markov Processes, and Sums of Random Variables’, held at Blaubeuren, Germany, in November 1991.
Of the two primary approaches to the classic source separation problem, only one does not impose potentially unreasonable model and likelihood constraints: the Bayesian statistical approach. Bayesian methods incorporate the available information regarding the model parameters and not only allow estimation of the sources and mixing coefficients, but also allow inferences to be drawn from them.
This volume presents a detailed description of the statistical distributions that are commonly applied to such fields as engineering, business, economics and the behavioural, biological and environmental sciences. The authors cover specific distributions, including logistic, slash, bathtub, F, non-central Chi-square, quadratic form, non-central F, non-central t, and other miscellaneous distributions.