Stochastic Processes: General Theory starts with the fundamental existence theorem of Kolmogorov, together with several of its extensions to stochastic processes.
The interdisciplinary subject of random heterogeneous materials has experienced remarkable growth since the publication of the well-known monograph Statistical Continuum Theories by Beran (1968).
Probabilistic and Statistical Methods in Computer Science presents a large variety of applications of probability theory and statistics in computer science and more precisely in algorithm analysis, speech recognition and robotics.
Statistical models and methods for lifetime and other time-to-event data are widely used in many fields, including medicine, the environmental sciences, actuarial science, engineering, economics, management, and the social sciences.
Providing the first comprehensive treatment of the subject, this groundbreaking work is solidly founded on a decade of concentrated research, some of which is published here for the first time, as well as practical, 'hands on' classroom experience.
The objective of this book is to collect in a single volume the essentials of stochastic networks, from the classical product-form theory to the more recent developments such as diffusion and fluid limits, stochastic comparisons, stability, control (dynamic scheduling) and optimization.
In the past half-century the theory of probability has grown from a minor isolated theme into a broad and intensive discipline interacting with many other branches of mathematics.
Advanced Statistics provides a rigorous development of statistics that emphasizes the definition and study of numerical measures that describe population variables.
In this edition a large number of errors have been corrected, an occasional proof has been streamlined, and a number of references are made to recent progress.
This book is an attempt to present a unified theory of rare event simulation and the variance reduction technique known as importance sampling from the point of view of the probabilistic theory of large deviations.
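The idea behind that blurb can be made concrete with a minimal sketch (illustrative only, not material from the book): estimating the rare-event probability P(Z > 4) for a standard normal Z. Naive Monte Carlo almost never scores a hit at this level, but importance sampling with a proposal shifted into the rare region, reweighted by the likelihood ratio, yields a usable estimate. The function name and parameter choices here are assumptions made for the example.

```python
import math
import random

def rare_event_prob_is(threshold=4.0, n=100_000, seed=0):
    # Estimate P(Z > threshold) for Z ~ N(0, 1) by importance sampling:
    # sample from the shifted proposal N(threshold, 1), which concentrates
    # mass on the rare region, and reweight by the likelihood ratio f/g.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(threshold, 1.0)  # draw from the proposal
        if x > threshold:
            # likelihood ratio: N(0,1) density over N(threshold,1) density
            w = math.exp(-0.5 * x * x + 0.5 * (x - threshold) ** 2)
            total += w
    return total / n

estimate = rare_event_prob_is()
exact = 0.5 * math.erfc(4.0 / math.sqrt(2))  # closed form, about 3.17e-5
```

With 100,000 samples the importance-sampling estimate lands within a few percent of the exact value, whereas a naive estimator of the same size would typically see only a handful of exceedances.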
Students seeking master's degrees in applied statistics in the late 1960s and 1970s typically took a year-long sequence in statistical methods.
This book was originally compiled for a course I taught at the University of Rochester in the fall of 1991, and is intended to give advanced graduate students in statistics an introduction to Edgeworth and saddlepoint approximations, and related techniques.
Quite apart from the fact that percolation theory had its origin in an honest applied problem, it is a source of fascinating problems of the best kind for which a mathematician can wish: problems which are easy to state with a minimum of preparation, but whose solutions are apparently difficult and require new methods.
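How easy such problems are to state can be shown in a short sketch (a hypothetical illustration, not taken from the text): in site percolation on an n-by-n grid, each site is open independently with probability p, and one asks whether an open cluster spans from the top row to the bottom row. A few lines of simulation already exhibit the sharp threshold near p ≈ 0.593 on the square lattice; the function names and parameters below are assumptions made for the example.

```python
import random

def spans_top_to_bottom(n, p, rng):
    # Site percolation on an n-by-n grid: each site is open with probability p.
    open_site = [[rng.random() < p for _ in range(n)] for _ in range(n)]
    # Flood-fill from every open site in the top row; the configuration
    # spans if the fill reaches the bottom row.
    stack = [(0, c) for c in range(n) if open_site[0][c]]
    seen = set(stack)
    while stack:
        r, c = stack.pop()
        if r == n - 1:
            return True
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < n and 0 <= nc < n and open_site[nr][nc] \
                    and (nr, nc) not in seen:
                seen.add((nr, nc))
                stack.append((nr, nc))
    return False

def spanning_probability(n=20, p=0.6, trials=200, seed=0):
    # Monte Carlo estimate of the probability of a spanning cluster.
    rng = random.Random(seed)
    return sum(spans_top_to_bottom(n, p, rng) for _ in range(trials)) / trials
```

Running `spanning_probability` for p well below and well above 0.593 shows the spanning probability jumping from near 0 to near 1, the phenomenon that makes the subject so hard to analyze rigorously.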
This book presents a collection of hands-on activities for students taking introductory statistics, and is designed to engage the student as a participant in the learning process.
The vast majority of important applications in science, engineering and applied science are characterized by the existence of multiple minima and maxima, as well as first, second and higher order saddle points.
Stochastic portfolio theory is a mathematical methodology for constructing stock portfolios and for analyzing the effects induced on the behavior of these portfolios by changes in the distribution of capital in the market.
Books on time series models deal mainly with models based on Box-Jenkins methodology which is generally represented by autoregressive integrated moving average models or some nonlinear extensions of these models, such as generalized autoregressive conditional heteroscedasticity models.
On October 16 and 17, 2000, we hosted an international workshop entitled "Statistical Design, Measurement, and Analysis of Health Related Quality of Life."
Conventional statistical methods have a very serious flaw: They routinely miss differences among groups or associations among variables that are detected by more modern techniques - even under very small departures from normality.
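The flaw described there can be seen in a toy example (a hypothetical sketch, not drawn from the book): a couple of extreme observations drag the sample mean well away from the bulk of the data, while a 20% trimmed mean, one of the modern robust measures, is barely affected. The data values and helper function below are invented for illustration.

```python
import statistics

def trimmed_mean(xs, prop=0.2):
    # Discard the lowest and highest `prop` fraction, then average the rest.
    xs = sorted(xs)
    k = int(len(xs) * prop)
    core = xs[k:len(xs) - k]
    return sum(core) / len(core)

# Eight observations near 5, contaminated by two extreme values.
sample = [5.1, 4.9, 5.0, 5.2, 4.8, 5.1, 4.9, 5.0, 45.0, -15.0]

plain = statistics.mean(sample)   # dragged to 7.0 by the two outliers
robust = trimmed_mean(sample)     # stays at 5.0, the centre of the bulk
```

The ordinary mean reports 7.0 for data whose bulk sits at 5.0; the trimmed mean recovers 5.0, which is exactly the kind of small-departure sensitivity the blurb is warning about.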
It was none other than Henri Poincaré who, at the turn of the last century, recognised that initial-value sensitivity is a fundamental source of randomness.
Many texts are excellent sources of knowledge about individual statistical tools, but the art of data analysis is about choosing and using multiple tools.
In the fall of 1999, I was asked to teach a course on computer intrusion detection for the Department of Mathematical Sciences of The Johns Hopkins University.
Since our first edition of this book, many developments in statistical modelling based on generalized linear models have been published, and our primary aim is to bring the book up to date.
In the period since the first edition was published, I have appreciated the correspondence from all parts of the world expressing thanks for the presentation of statistics from a user's perspective.
Semi-infinite programming (SIP) deals with optimization problems in which either the number of decision variables or the number of constraints is infinite, while the other remains finite.
Queueing network models have been widely applied as a powerful tool for modelling, performance evaluation, and prediction of discrete flow systems, such as computer systems, communication networks, production lines, and manufacturing systems.
Extending the Cox Model is aimed at researchers, practitioners, and graduate students who have some exposure to traditional methods of survival analysis.
Combinatorial (or discrete) optimization is one of the most active fields in the interface of operations research, computer science, and applied mathematics.