Books in Mathematics

The Mathematics collection presents a range of foundational and advanced research content across applied and discrete mathematics, including fields such as Computational Mathematics, Differential Equations, Linear Algebra, Modelling & Simulation, Numerical Analysis, and Probability & Statistics.

    • Viability, Invariance and Applications

      • 1st Edition
      • Volume 207
      • June 4, 2007
      • Ovidiu Carja + 2 more
      • English
      • Paperback: 9780444550781
      • Hardback: 9780444527615
      • eBook: 9780080521664
      The book is an almost self-contained presentation of the most important concepts and results in viability and invariance. The viability of a set K with respect to a given function (or multi-function) F, defined on it, is the property that, for each initial datum in K, the differential equation (or inclusion) driven by that function (or multi-function) has at least one solution. The invariance of a set K with respect to a function (or multi-function) F, defined on a larger set D, is the property that each solution of the differential equation (or inclusion) driven by F and issuing in K remains in K, at least for a short time. The book includes the most important necessary and sufficient conditions for viability, starting with Nagumo’s Viability Theorem for ordinary differential equations with continuous right-hand sides and continuing with the corresponding extensions either to differential inclusions or to semilinear or even fully nonlinear evolution equations, systems and inclusions. In the latter (i.e. multi-valued) cases, the results (based on two completely new tangency concepts), all due to the authors, are original and extend significantly, in several directions, their well-known classical counterparts.
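      For readers new to these notions, the sketch below restates in standard textbook form the two definitions described above, using the blurb's notation (K, D, F), together with Nagumo's classical tangency condition; these are standard formulations, not quotations of the authors' precise statements.

```latex
% Standard formulations of viability, invariance, and Nagumo's condition,
% in the notation of the description above (a sketch, not the book's exact text).

Viability of $K$ under $F$:
\[
  \forall\, \xi \in K \ \exists\, x(\cdot):\quad
  x'(t) \in F(x(t)),\quad x(0) = \xi,\quad x(t) \in K \ \text{for all } t \in [0,T).
\]

Invariance of $K \subseteq D$ under $F$ (defined on $D$): every solution with
$x(0) \in K$ satisfies $x(t) \in K$, at least for small $t \ge 0$.

Nagumo's Viability Theorem (closed $K$, single-valued continuous $f$): $K$ is
viable for $x' = f(x)$ if and only if
\[
  f(\xi) \in T_K(\xi) \quad \text{for every } \xi \in K,
\]
where $T_K(\xi)$ denotes the Bouligand (contingent) tangent cone to $K$ at $\xi$.
```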
    • Stochastic Differential Equations and Applications

      • 2nd Edition
      • December 30, 2007
      • X Mao
      • English
      • Paperback: 9781904275343
      • eBook: 9780857099402
      This advanced undergraduate and graduate text has now been revised and updated to cover the basic principles and applications of various types of stochastic systems, with much on theory and applications not previously available in book form. The text is also useful as a reference source for pure and applied mathematicians, statisticians and probabilists, engineers in control and communications, and information scientists, physicists and economists.
    • Stochastic Modelling in Process Technology

      • 1st Edition
      • Volume 211
      • July 3, 2007
      • Herold G. Dehling + 2 more
      • English
      • Paperback: 9780444550583
      • Hardback: 9780444520265
      • eBook: 9780080548975
      There is an ever-increasing need for modelling complex processes reliably. Computational modelling techniques such as CFD and MD may be used as tools to study specific systems, but their emergence has not decreased the need for generic, analytical process models. Multiphase and multicomponent systems, and high-intensity processes displaying highly complex behaviour, are becoming omnipresent in the processing industry. This book discusses an elegant but little-known technique for formulating process models in process technology: stochastic process modelling. The technique is based on computing the probability distribution for a single particle's position in the process vessel, and/or the particle's properties, as a function of time, rather than, as is traditionally done, basing the model on the formulation and solution of differential conservation equations. Using this technique can greatly simplify the formulation of a model, and can even make modelling possible for processes so complex that the traditional method is impracticable. Stochastic modelling has sporadically been used in various branches of process technology under various names and guises. This book is the first to give an overview of this work, showing how these techniques are similar in nature and make use of the same basic mathematical tools. The book also demonstrates how stochastic modelling may be implemented by describing example cases, and shows how a stochastic model may be formulated for a case that cannot be described by formulating and solving differential balance equations.
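      As a rough illustration of the single-particle viewpoint described above, the sketch below tracks one particle hopping between well-mixed compartments of a vessel as a continuous-time Markov chain and estimates its position distribution over time by Monte Carlo; the three-compartment layout and the transition rates are invented for the example and are not taken from the book.

```python
import numpy as np

rng = np.random.default_rng(0)

# Jump rates (1/s) between three hypothetical well-mixed compartments:
# Q[i, j] is the rate of moving from compartment i to compartment j.
Q = np.array([[0.0, 0.4, 0.1],
              [0.2, 0.0, 0.3],
              [0.1, 0.2, 0.0]])

def simulate_particle(t_end, start=0):
    """Follow one particle's compartment index up to time t_end (Gillespie-style)."""
    t, state = 0.0, start
    while True:
        rates = Q[state]
        total = rates.sum()
        t += rng.exponential(1.0 / total)                # waiting time until the next jump
        if t >= t_end:
            return state
        state = rng.choice(len(rates), p=rates / total)  # choose the destination compartment

# Estimate the particle-position distribution at t = 5 s from many independent runs;
# this plays the role of the concentration profile a balance-equation model would give.
samples = [simulate_particle(5.0) for _ in range(20_000)]
probs = np.bincount(samples, minlength=Q.shape[0]) / len(samples)
print("P(compartment) at t = 5 s:", probs.round(3))
```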
    • Modeling and Simulation-Based Data Engineering

      • 1st Edition
      • August 7, 2007
      • Bernard P. Zeigler + 1 more
      • English
      • Hardback: 9780123725158
      • Paperback: 9780323281829
      • eBook: 9780080550541
      Data Engineering has become a necessary and critical activity for business, engineering, and scientific organizations as the move to service-oriented architecture and web services moves into full swing. Notably, the US Department of Defense is mandating that all of its agencies and contractors assume a defining presence on the Net-centric Global Information Grid. This book provides the first practical approach to data engineering and modeling that supports interoperability with consumers of the data in service-oriented architectures (SOAs). Although XML (eXtensible Markup Language) is the lingua franca for such interoperability, it is not sufficient on its own. The approach in this book addresses critical objectives such as creating a single representation for multiple applications, designing models capable of supporting dynamic processes, and harmonizing legacy data models for web-based co-existence. The approach is based on the System Entity Structure (SES), a well-defined structure, methodology, and practical tool with all of the functionality of UML (Unified Modeling Language) and few of the drawbacks. The SES originated in the formal representation of hierarchical simulation models, so it provides an axiomatic formalism that enables automating the development of XML DTDs and schemas, composition and decomposition of large data models, and analysis of commonality among structures. Zeigler and Hammond include a range of features to benefit their readers. Natural language, graphical and XML forms of SES specification are employed to allow mapping of legacy metadata. Real-world examples and case studies provide insight into data engineering and test evaluation in various application domains. Comparative information on concepts of ontologies, modeling and simulation, introductory linguistic background, and support options enables programmers to work with advanced tools in the area. The website of the Arizona Center for Integrative Modeling and Simulation, co-founded by Zeigler in 2001, provides links to downloadable software to accompany the book.
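      To make the hierarchical flavour of the SES concrete, the toy sketch below builds a small entity/aspect tree and renders it mechanically as XML; the helper names and the shipment example are hypothetical illustrations of the general idea, not the book's tooling or its formal SES-to-DTD/schema mapping.

```python
import xml.etree.ElementTree as ET

def entity(name, *children):
    """Create an entity node; any children form one decomposition aspect."""
    node = ET.Element("entity", name=name)
    if children:
        aspect = ET.SubElement(node, "aspect")
        aspect.extend(children)
    return node

# Hypothetical decomposition of a shipment data model into sub-entities.
ses_tree = entity("Shipment",
                  entity("Origin"),
                  entity("Destination"),
                  entity("Cargo",
                         entity("Item")))

# The tree serializes directly to XML, hinting at how a hierarchical entity
# structure can drive schema-like artifacts.
print(ET.tostring(ses_tree, encoding="unicode"))
```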
    • Sarbanes-Oxley IT Compliance Using Open Source Tools

      • 2nd Edition
      • December 14, 2007
      • Christian B Lahti + 1 more
      • English
      • eBook: 9780080557274
      The Sarbanes-Oxley Act (officially titled the Public Company Accounting Reform and Investor Protection Act of 2002), signed into law on 30 July 2002 by President Bush, is considered the most significant change to federal securities laws in the United States since the New Deal. It came in the wake of a series of corporate financial scandals, including those affecting Enron, Arthur Andersen, and WorldCom. The law is named after Senator Paul Sarbanes and Representative Michael G. Oxley. It was approved by the House by a vote of 423-3 and by the Senate 99-0. This book illustrates the many Open Source cost-saving opportunities that public companies can explore in their IT enterprise to meet the mandatory compliance requirements of the Sarbanes-Oxley Act. It also demonstrates, by example and technical reference, both the Open Source infrastructure components that can be made compliant and the Open Source tools that can aid in the journey to compliance. Although many books and reference materials have been authored on the financial and business side of SOX compliance, very little material is available that directly addresses the information technology considerations, and even less on how Open Source fits into that discussion. Each chapter begins with the IT business and executive considerations of Open Source and SOX compliance; the remainder of each chapter examines the specific Open Source applications and tools that relate to the given subject matter.
    • Residuated Lattices: An Algebraic Glimpse at Substructural Logics

      • 1st Edition
      • Volume 151
      • April 25, 2007
      • Nikolaos Galatos + 3 more
      • English
      • Paperback: 9780444550668
      • Hardback: 9780444521415
      • eBook: 9780080489643
      The book is meant to serve two purposes. The first and more obvious one is to present state-of-the-art results in algebraic research into residuated structures related to substructural logics. The second, less obvious but equally important, is to provide a reasonably gentle introduction to algebraic logic. At the beginning, the second objective is predominant. Thus, in the first few chapters the reader will find a primer of universal algebra for logicians, a crash course in nonclassical logics for algebraists, an introduction to residuated structures, an outline of Gentzen-style calculi, as well as some titbits of proof theory (the celebrated Hauptsatz, or cut elimination theorem, among them). These lead naturally to a discussion of interconnections between logic and algebra, where we try to demonstrate how they form two sides of the same coin. We envisage that the initial chapters could be used as a textbook for a graduate course, perhaps entitled Algebra and Substructural Logics.

      As the book progresses, the first objective gains predominance over the second. Although the precise point of equilibrium would be difficult to specify, it is safe to say that we enter the technical part with the discussion of various completions of residuated structures. These include Dedekind-MacNeille completions and canonical extensions. Completions are used later in investigating several finiteness properties, such as the finite model property, generation of varieties by their finite members, and finite embeddability. The algebraic analysis of cut elimination that follows also takes recourse to completions. Decidability of logics and of equational and quasi-equational theories comes next, where we show how proof-theoretical methods like cut elimination are preferable for small logics/theories, but semantic tools like Rabin's theorem work better for big ones. Then we turn to Glivenko's theorem, which says that a formula is an intuitionistic tautology if and only if its double negation is a classical one. We generalise it to the substructural setting, identifying for each substructural logic its Glivenko equivalence class with its smallest and largest elements. This is also where we begin investigating lattices of logics and varieties, rather than particular examples. We continue in this vein by presenting a number of results concerning minimal varieties/maximal logics. A typical theorem there says that for some given well-known variety its subvariety lattice has precisely such-and-such number of minimal members (where values for such-and-such include, but are not limited to, continuum, countably many and two). In the last two chapters we focus on the lattice of varieties corresponding to logics without contraction. In one we prove a negative result: that there are no nontrivial splittings in that variety. In the other, we prove a positive one: that semisimple varieties coincide with discriminator ones.

      Within the second, more technical part of the book another transition process may be traced. Namely, we begin with logically inclined technicalities and end with algebraically inclined ones. Here, perhaps, the algebraic rendering of Glivenko theorems marks the equilibrium point, at least in the sense that finiteness properties, decidability and Glivenko theorems are of clear interest to logicians, whereas semisimplicity and discriminator varieties are universal algebra par excellence. It is for the reader to judge whether we succeeded in weaving these threads into a seamless fabric.
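      For orientation, the display below records the classical statement of Glivenko's theorem referred to above, together with the residuation law that defines residuated lattices; the book's substructural generalisations are not reproduced here.

```latex
% Glivenko's theorem in its classical form
% (CL = classical logic, IL = intuitionistic logic):
\[
  \vdash_{\mathrm{CL}} \varphi \quad \Longleftrightarrow \quad
  \vdash_{\mathrm{IL}} \neg\neg\varphi .
\]

% The residuation law underlying residuated lattices:
\[
  x \cdot y \le z \quad \Longleftrightarrow \quad y \le x \backslash z
  \quad \Longleftrightarrow \quad x \le z / y .
\]
```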
    • Epidemiology and Medical Statistics

      • 1st Edition
      • Volume 27
      • November 6, 2007
      • English
      • Hardback: 9780444528018
      • eBook: 9780080554211
      This volume, representing a compilation of authoritative reviews on a multitude of uses of statistics in epidemiology and medical statistics written by internationally renowned experts, is addressed to statisticians working in biomedical and epidemiological fields who use statistical and quantitative methods in their work. While the use of statistics in these fields has a long and rich history, the explosive growth of science in general, and of the clinical and epidemiological sciences in particular, has brought about a sea change, spawning the development of new methods and innovative adaptations of standard methods. Since the literature is highly scattered, the Editors have undertaken this humble exercise to document a representative collection of topics of broad interest to diverse users. The volume spans a cross-section of standard topics oriented toward users in the current evolving field, as well as much-needed special topics of more recent origin. This volume was prepared especially with applied statisticians in mind, emphasizing applications-oriented methods and techniques and including references to appropriate software when relevant.
    • Stability of Dynamical Systems

      • 1st Edition
      • Volume 5
      • August 1, 2007
      • Xiaoxin Liao + 2 more
      • English
      • Paperback: 9780444603463
      • Hardback: 9780444531100
      • eBook: 9780080550619
      The main purpose of developing stability theory is to examine the dynamic responses of a system to disturbances as time approaches infinity. It has been, and still is, the object of intense investigation due to its intrinsic interest and its relevance to all practical systems in engineering, finance, natural science and social science. This monograph provides some state-of-the-art expositions of major advances in fundamental stability theories and methods for dynamic systems of ODE and DDE types, and in limit cycle, normal form and Hopf bifurcation control of nonlinear dynamic systems.
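      The standard Lyapunov notions behind the blurb's phrase about responses to disturbances as time approaches infinity are sketched below for an equilibrium of an ordinary differential equation; the book's DDE and bifurcation-control settings refine these basic definitions.

```latex
% Stability of an equilibrium x* of x' = f(x), in the sense of Lyapunov:
\[
  \forall\, \varepsilon > 0 \ \exists\, \delta > 0:\quad
  \|x(0) - x^{*}\| < \delta \ \Longrightarrow\ \|x(t) - x^{*}\| < \varepsilon
  \ \text{ for all } t \ge 0 .
\]

% Asymptotic stability: stable, and in addition
\[
  \|x(0) - x^{*}\| < \delta \ \Longrightarrow\ \lim_{t \to \infty} x(t) = x^{*} .
\]
```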
    • Mathematical Modelling

      • 1st Edition
      • August 1, 2007
      • C Haines + 3 more
      • English
      • Paperback: 9781904275206
      • eBook: 9780857099419
      This book continues the ICTMA tradition of influencing teaching and learning in the application of mathematical modelling. Each chapter shows how real-life problems can be discussed during university lectures, in school classrooms and in industrial research. International experts contribute their knowledge and experience by providing analysis, insight and comment whilst tackling large and complex problems by applying mathematical modelling. This book contains the proceedings of the Twelfth International Conference on the Teaching of Mathematical Modelling and Applications.
    • Reconfigurable Computing

      • 1st Edition
      • Volume 1
      • November 2, 2007
      • Scott Hauck + 1 more
      • English
      • Hardback: 9780123705228
      • eBook: 9780080556017
      Reconfigurable Computing marks a revolutionary and hot topic that bridges the gap between the separate worlds of hardware and software design: the key feature of reconfigurable computing is its groundbreaking ability to perform computations in hardware to increase performance while retaining the flexibility of a software solution. Reconfigurable computers serve as affordable, fast, and accurate tools for developing designs ranging from single chip architectures to multi-chip and embedded systems. Scott Hauck and Andre DeHon have assembled a group of the key experts in the fields of both hardware and software computing to provide an introduction to the entire range of issues relating to reconfigurable computing. FPGAs (field programmable gate arrays) act as the “computing vehicles” to implement this powerful technology. Readers will be guided into adopting a completely new way of handling existing design concerns and be able to make use of the vast opportunities possible with reconfigurable logic in this rapidly evolving field.