Showing all 18 results
THIRTY FIVE YEARS OF AUTOMATING MATHEMATICS: DEDICATED TO 35 YEARS OF DE BRUIJN’S AUTOMATH
N. G. de Bruijn was a well-established mathematician when, in 1967 at the age of 49, he decided to work in a new direction related to automating mathematics. By then his contributions to mathematics were numerous and extremely influential. His 1958 book on advanced asymptotic methods (North-Holland) was a classic and was subsequently reissued in the well-known Dover book series. His work on combinatorics yielded influential notions and theorems, of which we mention the de Bruijn sequences of 1946 and the de Bruijn–Erdős theorem of 1948.
Volume IV continues the author’s odyssey on 1-D cellular automata as chronicled in Volumes I, II and III, by uncovering a novel quasi-ergodicity phenomenon involving orbits meandering among omega-limit orbits of complex (group 5) and hyper (group 6) Bernoulli rules. This discovery is embellished with analytical formulas characterizing the fractal properties of characteristic functions, as well as explicit formulas for generating colorful and pedagogically revealing isomorphic basin tree diagrams.
The field of approximation theory has become so vast that it intersects with every other branch of analysis and plays an increasingly important role in applications across the applied sciences and engineering. Fundamentals of Approximation Theory presents a systematic, in-depth treatment of some basic topics in approximation theory, designed to emphasize the rich connections of the subject with other areas of study. With an approach that moves smoothly from the very concrete to progressively more abstract levels, this text provides an outstanding blend of classical and abstract topics.
Since the discovery of neutrino oscillations, neutrino physics has become an active field of research. The oscillations imply that neutrinos must have a small mass and that the neutrinos coupled to the charged leptons are mixtures of the mass eigenstates, analogous to the flavor mixing of the quarks. The mixing angles for the quarks are small, but for the leptons two of the mixing angles are large. The masses of the three neutrinos must be very small, less than 1 eV, but from the oscillation experiments we know only the mass differences; the absolute masses are still unknown.
Defining a new development life-cycle methodology, together with a set of associated techniques and tools for developing highly critical systems using formal techniques, this book adopts a rigorous safety-assessment approach explored via several layers (from requirements analysis to automatic source code generation).
This is assessed and evaluated via a standard case study: the cardiac pacemaker. Additionally, a formalisation of an electrocardiogram (ECG) is used to identify anomalies in order to improve existing medical protocols.
This book constitutes the refereed proceedings of the 12th International Conference on Formal Concept Analysis, ICFCA 2014, held in Cluj-Napoca, Romania, in June 2014. The 16 regular papers presented together with 3 invited talks were carefully reviewed and selected from 39 submissions. The papers in this volume cover a rich range of FCA aspects.
Turing’s famous 1936 paper introduced a formal definition of a computing machine, a Turing machine. This model led to both the development of actual computers and to computability theory, the study of what machines can and cannot compute. This book presents classical computability theory from Turing and Post to current results and methods, and their use in studying the information content of algebraic structures, models, and their relation to Peano arithmetic. The author presents the subject as an art to be practiced, and an art in the aesthetic sense of inherent beauty which all mathematicians recognize in their subject.
While all of us regularly use basic math symbols such as those for plus, minus, and equals, few of us know that many of these symbols weren’t available before the sixteenth century. What did mathematicians rely on for their work before then? And how did mathematical notations evolve into what we know today? In Enlightening Symbols, popular math writer Joseph Mazur explains the fascinating history behind the development of our mathematical notation system. He shows how symbols were used initially, how one symbol replaced another over time, and how written math was conveyed before and after symbols became widely adopted.
Traversing mathematical history and the foundations of numerals in different cultures, Mazur looks at how historians have disagreed over the origins of the numerical system for the past two centuries. He follows the transfigurations of algebra from a rhetorical style to a symbolic one, demonstrating that most algebra before the sixteenth century was written in prose or in verse employing the written names of numerals. Mazur also investigates the subconscious and psychological effects that mathematical symbols have had on mathematical thought, moods, meaning, communication, and comprehension. He considers how these symbols influence us (through similarity, association, identity, resemblance, and repeated imagery), how they lead to new ideas by subconscious associations, how they make connections between experience and the unknown, and how they contribute to the communication of basic mathematics.
From words to abbreviations to symbols, this book shows how math evolved to the familiar forms we use today.
Authors: Kane, Jonathan M.
• Teaches how to write proofs by describing what students should be thinking about when faced with writing a proof
• Provides proof templates for proofs that follow the same general structure
• Blends topics of logic into discussions of proofs in the context where they are needed
• Thoroughly covers the concepts and theorems of introductory real analysis, including limits, continuity, differentiation, integration, infinite series, sequences of functions, topology of the real line, and metric spaces
This is a textbook on proof writing in the area of analysis, balancing a survey of the core concepts of mathematical proof with a tight, rigorous examination of the specific tools needed for an understanding of analysis.
Nonmonotonic reasoning provides formal methods that enable intelligent systems to operate adequately when faced with incomplete or changing information. In particular, it provides rigorous mechanisms for taking back conclusions that, in the presence of new information, turn out to be wrong and for deriving new, alternative conclusions instead. Nonmonotonic reasoning methods provide rigor similar to that of classical reasoning; they form a basis for validation and verification and therefore increase confidence in intelligent systems that work with incomplete and changing information.
Authors: Tkachuk, Vladimir V.
• Contains a wide variety of top-notch methods and results of Cp-theory and general topology presented with detailed proofs
• Presents and classifies 100 open problems in Cp-theory, explaining their relationship with previous research
• Introduces the reader to the theories of u-equivalent spaces and l-equivalent spaces
About this Textbook
This fourth volume in Vladimir Tkachuk’s series on Cp-theory gives reasonably complete coverage of the theory of functional equivalencies through 500 carefully selected problems and exercises.
This textbook gives an introduction to axiomatic set theory and examines the prominent questions that are relevant in current research in a manner that is accessible to students. Its main theme is the interplay of large cardinals, inner models, forcing and descriptive set theory. The following topics are covered:
• Forcing and constructibility
• The Solovay–Shelah Theorem, i.e. the equiconsistency of ‘every set of reals is Lebesgue measurable’ with one inaccessible cardinal
• Fine structure theory and a modern approach to sharps
• Jensen’s Covering Lemma
• The equivalence of analytic determinacy with sharps
• The theory of extenders and iteration trees
• A proof of projective determinacy from Woodin cardinals.
The aim of this handbook is to create, for the first time, a systematic account of the field of spatial logic. The book comprises a general introduction, followed by fourteen chapters by invited authors. Each chapter provides a self-contained overview of its topic, describing the principal results obtained to date, explaining the methods used to obtain them, and listing the most important open problems. Jointly, these contributions constitute a comprehensive survey of this rapidly expanding subject.
The prize-winning essays in this book address the fascinating but sometimes uncomfortable relationship between physics and mathematics. Is mathematics merely another natural science? Or is it the result of human creativity? Does physics simply wear mathematics like a costume, or is math the lifeblood of physical reality? The nineteen wide-ranging, highly imaginative and often entertaining essays are enhanced versions of the prize-winning entries to the FQXi essay competition “Trick or Truth”, which attracted over 200 submissions.
This classic text addresses the conceptual problem posed by the continuum: the set of all real numbers. Chapter 1 deals with the logic and mathematics of set and function, while Chapter 2 focuses on the concept of number and the continuum.
The articles in this book are based on talks given at the North Texas Logic Conference in October of 2004. The main goal of the editors was to collect articles representing diverse fields within logic that would both contain significant new results and be accessible to readers with a general background in logic. Included in the book is a problem list, jointly compiled by the speakers, that reflects some of the most important questions in various areas of logic. This book should be useful to graduate students and researchers alike across the spectrum of mathematical logic.
Arising from a special session held at the 2010 North American Annual Meeting of the Association for Symbolic Logic, this volume is an international cross-disciplinary collaboration with contributions from leading experts exploring connections across their respective fields. Themes range from philosophical examination of the foundations of physics and quantum logic, to exploitations of the methods and structures of operator theory, category theory, and knot theory in an effort to gain insight into the fundamental questions in quantum theory and logic.
The Semantic Web is characterized by the existence of a very large number of distributed semantic resources, which together define a network of ontologies. These ontologies in turn are interlinked through a variety of different meta-relationships such as versioning, inclusion, and many more. This scenario is radically different from the relatively narrow contexts in which ontologies have been traditionally developed and applied, and thus calls for new methods and tools to effectively support the development of novel network-oriented semantic applications.
This book by Suárez-Figueroa et al. provides the necessary methodological and technological support for the development and use of ontology networks, which ontology developers need in this distributed environment. After an introduction, in its second part the authors describe the NeOn Methodology framework. The book’s third part details the key activities relevant to the ontology engineering life cycle. For each activity, a general introduction, methodological guidelines, and practical examples are provided. The fourth part then presents a detailed overview of the NeOn Toolkit and its plug-ins. Lastly, case studies from the pharmaceutical and the fishery domain round out the work.
The book primarily addresses two main audiences: students (and their lecturers) who need a textbook for advanced undergraduate or graduate courses on ontology engineering, and practitioners who need to develop ontologies in particular or Semantic Web-based applications in general. Its educational value is maximized by its structured approach to explaining guidelines and combining them with case studies and numerous examples. The description of the open source NeOn Toolkit provides an additional asset, as it allows readers to easily evaluate and apply the ideas presented.