Authors: Peters, Linda

- Discusses well-known Real Options Approaches and their applications step-by-step without the use of complex mathematics
- Enables readers to reproduce these models and apply them to their own field
- Contributes to one of the key challenges of Real Options, which is to reduce the gap between theory and practice

This book explains the standard Real Options Analysis (ROA) literature in a straightforward, step-by-step manner without the use of complex mathematics. Much of the ROA literature is expressed through partial differential equations, probability density functions and simulation techniques, all of which may leave readers unconvinced of the practical, applicable qualities ROA possesses.
Statistik von Null auf Hundert approaches statistics through recipes and simple examples. Readers quickly acquire the skills needed to produce statistics themselves, to visualize even large amounts of data clearly, and to determine the key statistical measures. This includes "statistical literacy": statistical claims in magazines and books become transparent, and manipulations with statistics can be recognized. Calculations and solutions for combinatorics problems are explained simply.
Multidimensional poverty measurement and analysis is evolving rapidly. Notably, it has informed the publication of the Multidimensional Poverty Index (MPI) estimates in the Human Development Reports of the United Nations Development Programme since 2010, and the release of national poverty measures in Mexico, Colombia, Bhutan, the Philippines and Chile. The academic response has been similarly swift, with related articles published in both theoretical and applied journals.
The high and insistent demand for in-depth and precise accounts of multidimensional poverty measurement motivates this book, which is aimed at graduate students in quantitative social sciences, researchers of poverty measurement, and technical staff in governments and international agencies who create multidimensional poverty measures.
The book is organized into four parts. The first introduces the framework for multidimensional measurement and provides a lucid overview of a range of multidimensional techniques and the problems each can address. The second gives a synthetic introduction to 'counting' approaches to multidimensional poverty measurement and an in-depth account of the counting methodology developed by Alkire and Foster, a straightforward extension of the well-known Foster-Greer-Thorbecke (FGT) poverty measures that had a significant and lasting impact on income poverty measurement. The final two parts deal with pre-estimation issues, such as normative choices and the distinctive empirical techniques used in measure design, and post-estimation issues, such as robustness tests, statistical inference, comparisons over time, and assessments of inequality among the poor.
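For readers meeting the Foster-Greer-Thorbecke family for the first time, the standard income-poverty formulation (stated here in common textbook notation, not necessarily the book's own) is:

\[
P_\alpha = \frac{1}{n} \sum_{i=1}^{q} \left( \frac{z - y_i}{z} \right)^{\alpha}, \qquad \alpha \ge 0,
\]

where n is the population size, q the number of poor, z the poverty line, and y_i the income of poor person i; α = 0 gives the headcount ratio and α = 1 the poverty gap. The Alkire-Foster method extends this logic to multiple dimensions by counting each person's deprivations and censoring the data of those who fall below a cross-dimensional cutoff before aggregating.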
In the 1970s, at a time of shock, controversy and uncertainty over the direction of monetary and fiscal policy, Wynne Godley and the Cambridge Department of Applied Economics rose to prominence, challenging the accepted Keynesian wisdom of the time. This collection of essays brings together eminent scholars who have been influenced by Godley’s enormous contribution to the field of monetary economics and macroeconomic modeling.
Godley’s theoretical, applied and policy work is explored in detail, including an analysis of the insightful New Cambridge ‘three balances’ model, and its use in showing the progression of real capitalist economies over time.
Analyzing Event Statistics in Corporate Finance provides new alternative methodologies to increase accuracy when performing statistical tests for event studies within corporate finance. In contrast to conventional surveys or literature reviews, Jeng focuses on various methodological defects or deficiencies that lead to inaccurate empirical results, which ultimately produce bad corporate policies. This work discusses the issues of data collection and structure, the recursive smoothing for systematic components in excess returns, the choices of event windows, different time horizons for the events, and the consequences of applications of different methodologies.
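As background to the excess returns discussed here, the conventional event-study benchmark is the market model; a standard textbook formulation (not Jeng's specific alternative methodology) is:

\[
AR_{it} = R_{it} - \left( \hat{\alpha}_i + \hat{\beta}_i R_{mt} \right),
\]

where R_{it} is the return of firm i at time t, R_{mt} the market return, and the coefficients are estimated over a pre-event window; abnormal returns are then cumulated over the chosen event window and tested for significance, which is where the choices of event windows and time horizons mentioned above become critical.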
Authors: Madsen, Birger Stjernholm

- Aimed at practitioners
- The presentation is as non-mathematical as possible
- Includes many examples of the use of statistical functions in spreadsheets
- Employs a realistic sample survey as an exemplar throughout the book
- Fills a gap in the existing literature on statistics

About this Textbook
This book was written for those who need to know how to collect, analyze and present data. It is meant to be a first course for practitioners, a book for private study or brush-up on statistics, and supplementary reading for general statistics classes.
Economists generally accept as given the old adage that there’s no accounting for tastes. Gary Becker disagrees, and in this collection he confronts the problem of preferences and values: how they are formed and how they affect our behaviour. In the process he explores puzzles of social life as well as some of society’s problems. He observes, for example, that adjacent restaurants, which have roughly the same quality of food and similar prices, may differ greatly in the number of customers they are able to attract.
Computable general equilibrium (CGE) models are widely used by governmental organizations and academic institutions to analyze the economy-wide effects of events such as climate change, tax policies, and immigration. This book provides a practical, how-to guide to CGE models suitable for use at the undergraduate college level. Its introductory level distinguishes it from other available books and articles on CGE models. The book provides intuitive and graphical explanations of the economic theory that underlies a CGE model and includes many examples and hands-on modeling exercises.
The conduct of most of social science occurs outside the laboratory. Such studies in field science explore phenomena that cannot, for practical, technical, or ethical reasons, be explored under controlled conditions. These phenomena cannot be fully isolated from their environment or investigated by manipulation or intervention. Yet measurement, including rigorous or clinical measurement, does provide analysts with a sound basis for discerning what occurs under field conditions, and why. In Science Outside the Laboratory, Marcel Boumans explores the state of measurement theory, its reliability, and the role expert judgment plays in field investigations from the perspective of the philosophy of science.
The Oxford Handbook of Panel Data examines new developments in the theory and applications of panel data. It includes basic topics like non-stationary panels, co-integration in panels, multifactor panel models, panel unit roots, measurement error in panels, incidental parameters and dynamic panels, spatial panels, nonparametric panel data, random coefficients, treatment effects, sample selection, count panel data, limited dependent variable panel models, unbalanced panel models with interactive effects and influential observations in panel data.
The aim of this book is to give the reader a detailed introduction to the different approaches to generating multiply imputed synthetic datasets. It describes all approaches that have been developed so far, provides a brief history of synthetic datasets, and gives useful hints on how to deal with real data problems like nonresponse, skip patterns, or logical constraints. Each chapter is dedicated to one approach, first describing the general concept followed by a detailed application to a real dataset providing useful guidelines on how to implement the theory in practice.
Among the symmetrical distributions with unbounded support, the most popular alternatives to the normal distribution are the logistic distribution and the Laplace or double exponential distribution, which was first introduced in 1774. Occasionally, the Cauchy distribution is also used. Surprisingly, the hyperbolic secant distribution has led a shadowy existence, although Manoukian and Nadeau had already stated in 1988 that "… the hyperbolic-secant distribution … has not received sufficient attention in the published literature and may be useful for students and practitioners."
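For reference, the standard hyperbolic secant density, in its common zero-mean, unit-variance parameterization (which may differ from the book's notation), is:

\[
f(x) = \frac{1}{2} \operatorname{sech}\!\left( \frac{\pi x}{2} \right), \qquad x \in \mathbb{R},
\]

which, like the logistic and Laplace densities, is symmetric and bell-shaped but has heavier tails than the normal.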
This book is aimed at Security and IT practitioners (especially architects) in end-user organisations who are responsible for implementing an enterprise-wide Identity and Access Management (IAM) system. It is neither a conceptual treatment of Identity (for which we would refer the reader to Kim Cameron’s excellent work on the Laws of Identity) nor a detailed technical manual on a particular product. It describes a pragmatic and cost-effective architectural approach to implementing IAM within an organisation, based on the experience of the authors.
This book is intended for use in a rigorous introductory PhD-level course in econometrics, or in a field course in econometric theory. It covers the measure-theoretical foundation of probability theory, the multivariate normal distribution with its application to classical linear regression analysis, and various laws of large numbers, central limit theorems and related results for independent random variables as well as for stationary time series, with applications to the asymptotic inference of M-estimators and maximum likelihood theory.
This highly accessible and innovative text uses Excel® workbooks powered by Visual Basic macros to teach the core concepts of econometrics without advanced mathematics. It enables students to run Monte Carlo simulations in which they repeatedly sample from artificial data sets in order to understand the data generating process and sampling distribution. Coverage includes omitted variables, binary response models, basic time series, and simultaneous equations. The authors teach students how to construct their own real-world data sets drawn from the internet, which they can analyze with Excel® or with other econometric software.
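As a concrete illustration of this kind of exercise, here is a minimal sketch in Python rather than the book's Excel®/VBA workbooks, assuming a simple linear data generating process; the parameter values and sample sizes are illustrative, not taken from the text:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_slope(n=50, beta0=1.0, beta1=2.0, sigma=3.0):
    """Draw one artificial sample from y = beta0 + beta1*x + eps and return the OLS slope."""
    x = rng.uniform(0, 10, size=n)
    y = beta0 + beta1 * x + rng.normal(0, sigma, size=n)
    # OLS slope estimate: sample cov(x, y) / sample var(x)
    return np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)

# Monte Carlo: repeat the experiment many times to trace out the sampling distribution
slopes = np.array([simulate_slope() for _ in range(10_000)])
print(f"mean of slope estimates: {slopes.mean():.3f}")      # close to the true beta1 = 2.0
print(f"std of slope estimates:  {slopes.std(ddof=1):.3f}")  # Monte Carlo standard error
```

The point of the exercise is the same as in the book: because the student controls the data generating process, the spread of the repeated estimates can be compared directly to the theoretical sampling distribution.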
Here at last is the fourth edition of the textbook that is required reading for economics students as well as those practising applied economics. Not only does it teach some of the basic econometric methods and the underlying assumptions behind them, but it also includes a simple and concise treatment of more advanced topics from spatial correlation to time series analysis. This book’s strength lies in its ability to present complex material in a simple, yet rigorous manner. This superb fourth edition updates identification and estimation methods in the simultaneous equation model.
The relentless decline in the prices of information technology (IT) has steadily enhanced the role of IT investment as a source of economic growth in the United States. Productivity growth in IT-producing industries has gradually risen in importance, and a productivity revival has taken place in the rest of the economy. In this book Dale Jorgenson shows that IT provides the foundation for the resurgence of American economic growth. Information technology rests in turn on the development and deployment of semiconductors: transistors, storage devices, and microprocessors.
Practice makes perfect. Therefore the best method of mastering models is working with them. This book contains a large collection of exercises and solutions which will help explain the statistics of financial markets. These practical examples are carefully presented and provide computational solutions to specific problems, all of which are calculated using R and Matlab. The book additionally introduces the corresponding Quantlets, the name given to these program codes, which follow the naming scheme SFSxyz123.
When John Nash won the Nobel prize in economics in 1994, many people were surprised to learn that he was alive and well. Since then, Sylvia Nasar’s celebrated biography A Beautiful Mind, the basis of a new major motion picture, has revealed the man.
Empirical Studies on Volatility in International Stock Markets describes the existing techniques for the measurement and estimation of volatility in international stock markets with emphasis on the SV model and its empirical application. Eugenie Hol develops various extensions of the SV model, which allow for additional variables in both the mean and the variance equation. In addition, the forecasting performance of SV models is compared not only to that of the well-established GARCH model but also to implied volatility and so-called realised volatility models which are based on intraday volatility measures.
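For orientation, the basic discrete-time stochastic volatility (SV) model referred to here is commonly written (in standard notation, which may differ from Hol's) as:

\[
y_t = \sigma_t \varepsilon_t, \qquad \varepsilon_t \sim \mathcal{N}(0, 1),
\]
\[
\log \sigma_t^2 = \gamma + \phi \log \sigma_{t-1}^2 + \sigma_\eta \eta_t, \qquad \eta_t \sim \mathcal{N}(0, 1),
\]

so that, unlike in GARCH, the log-variance follows its own latent autoregressive process rather than being a deterministic function of past observations; the extensions described in the book add explanatory variables to the mean and variance equations.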
One cannot exaggerate the importance of estimating how international trade responds to changes in income and prices. But there is a tension between whether one should use models that fit the data but contradict certain aspects of the underlying theory, or models that fit the theory but contradict certain aspects of the data. The essays in Estimating Trade Elasticities offer one practical approach to dealing with this tension. The analysis starts with the practical implications of optimising behaviour for estimation and follows with a re-examination of the puzzling income elasticity for US imports that three decades of studies have not resolved.
Over the past 25 years, applied econometrics has undergone tremendous changes, with active developments in fields of research such as time series, labor econometrics, financial econometrics and simulation-based methods. Time series analysis has been an active field of research since the seminal work by Box and Jenkins (1976), who introduced a general framework in which time series can be analyzed. In the world of financial econometrics and the application of time series techniques, the ARCH model of Engle (1982) has shifted the focus from the modelling of the process itself to the modelling of the volatility of the process.
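For context, Engle's original ARCH specification, in its simplest ARCH(1) form (standard textbook notation, not tied to this volume), is:

\[
\varepsilon_t = \sigma_t z_t, \qquad z_t \sim \text{i.i.d. } \mathcal{N}(0, 1), \qquad \sigma_t^2 = \alpha_0 + \alpha_1 \varepsilon_{t-1}^2,
\]

with α₀ > 0 and α₁ ≥ 0, so that the conditional variance responds to the magnitude of past shocks; this is precisely the shift from modelling the process itself to modelling its volatility.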
Are foreign exchange markets efficient? Are fundamentals important for predicting exchange rate movements? What is the signal-to-noise ratio of high-frequency exchange rate changes? Is it possible to define a measure of the equilibrium exchange rate that is useful from an assessment perspective? The book is a selective survey of current thinking on key topics in exchange rate economics, supplemented throughout by new empirical evidence. The focus is on the use of advanced econometric tools to find answers to these and other questions which are important to practitioners, policy-makers and academic economists.
"The purpose of models is not to fit the data but to sharpen the questions." (S. Karlin, 11th R. A. Fisher Memorial Lecture, Royal Society, 20 April 1983)

We are proud to offer this volume in honour of the remarkable career of the Father of Spatial Econometrics, Professor Jean Paelinck, presently of the Tinbergen Institute, Rotterdam. Never one to model solely for the sake of modelling, Professor Paelinck is nicely captured by the above quotation, which reflects his unceasing quest for the best question for which an answer is needed.