This textbook covers topics of undergraduate mathematics in abstract algebra, geometry, topology and analysis, with the aim of connecting the key ideas that underpin them.
This monograph offers a new foundation for information theory based on the notion of information-as-distinctions, directly measured by logical entropy, and on its re-quantification as Shannon entropy, the fundamental concept of the theory of coding and communications.
This monograph presents a general theory of weakly implicative logics, a family covering a vast number of non-classical logics studied in the literature, concentrating mainly on the abstract study of the relationship between logics and their algebraic semantics.
This book centers on a dialogue between Roger Penrose and Emanuele Severino about one of the most intriguing topics of our times, the comparison of artificial intelligence and natural intelligence, as well as its extension to the notions of human and machine consciousness.
This edited collection casts light on central issues within contemporary philosophy of mathematics such as the realism/anti-realism dispute; the relationship between logic and metaphysics; and the question of whether mathematics is a science of objects or structures.
This textbook covers key topics of Elementary Calculus through selected exercises, in a sequence that facilitates development of problem-solving abilities and techniques.
This survey of computability theory offers the techniques and tools that computer scientists (as well as mathematicians and philosophers studying the mathematical foundations of computing) need to mathematically analyze computational processes and investigate the theoretical limitations of computing.
This book describes some basic principles that allow developers of computer programs (computer scientists, software engineers, programmers) to clearly think about the artifacts they deal with in their daily work: data types, programming languages, programs written in these languages that compute from given inputs wanted outputs, and programs that describe continuously executing systems.
In this two-volume compilation of articles, leading researchers reevaluate the success of Hilbert's axiomatic method, which not only laid the foundations for our understanding of modern mathematics, but also found applications in physics, computer science and elsewhere.
Model Validation and Uncertainty Quantification, Volume 3: Proceedings of the 39th IMAC, A Conference and Exposition on Structural Dynamics, 2021, the third of nine volumes from the Conference, brings together contributions to this important area of research and engineering.
For a brief time in history, it was possible to imagine that a sufficiently advanced intellect could, given sufficient time and resources, in principle understand how to mathematically prove everything that was true.
This textbook explores the foundations of real analysis using the framework of general ordered fields, demonstrating the multifaceted nature of the area.
This textbook can serve as a comprehensive manual of discrete mathematics and graph theory for non-Computer Science majors, or as a reference and study aid for professionals and researchers who have not previously taken a discrete mathematics course.
This third volume continues Richard Routley's explorations of an improved Meinongian account of non-referring and intensional discourse (including joint work with Val Routley, later Val Plumwood).
This book presents peer-reviewed papers from the 4th International Conference on Applications of Mathematics and Informatics in Natural Sciences and Engineering (AMINSE2019), held in Tbilisi, Georgia, in September 2019.
The contributions in this book survey results on combinations of probabilistic and various other classical, temporal and justification logical systems.
Kurt Gödel (1906-1978) shook the mathematical world in 1931 with a result that has become an icon of 20th-century science: the search for rigour in proving mathematical theorems had led to the formalization of mathematical proofs, to the extent that such proving could be reduced to the application of a few mechanical rules.