Books in Computer Science

The Computing collection presents a range of foundational and applied content across computer and data science, including fields such as Artificial Intelligence; Computational Modelling; Computer Networks; Computer Organization & Architecture; Computer Vision & Pattern Recognition; Data Management; Embedded Systems & Computer Engineering; HCI/User Interface Design; Information Security; Machine Learning; Network Security; and Software Engineering.

    • Quantum Information Processing and Quantum Error Correction

      • 1st Edition
      • April 16, 2012
      • Ivan B. Djordjevic
      • English
      • Hardback
        9780123854919
      • eBook
        9780123854926
      Quantum Information Processing and Quantum Error Correction is a self-contained, tutorial-based introduction to quantum information, quantum computation, and quantum error correction. Assuming no knowledge of quantum mechanics and written at an intuitive level suitable for the engineer, the book gives all the essential principles needed to design and implement quantum electronic and photonic circuits. Numerous examples from a wide range of applications show how the principles can be implemented in practice. This book is ideal for the electronics, photonics, or computer engineer who requires an easy-to-understand foundation in the principles of quantum information processing and quantum error correction, together with insight into how to develop quantum electronic and photonic circuits. Readers of this book will be ready for further study in this area and prepared to perform independent research. Readers who complete the book will be able to design information processing circuits, stabilizer codes, Calderbank-Shor-Steane (CSS) codes, subsystem codes, topological codes, and entanglement-assisted quantum error-correction codes, and to propose corresponding physical implementations; they will also be proficient in quantum fault-tolerant design. Unique features: covers both quantum information processing and quantum error correction, so everything an engineer needs to understand and implement quantum-level circuits is in one book; gives an intuitive understanding by not assuming prior knowledge of quantum mechanics, thereby avoiding heavy mathematics; provides in-depth coverage of the design and implementation of quantum information processing and quantum error correction circuits; and strikes the right balance among quantum mechanics, quantum error correction, quantum computing, and quantum communication.
      Dr. Djordjevic is an Assistant Professor in the Department of Electrical and Computer Engineering of the College of Engineering, University of Arizona, with a joint appointment in the College of Optical Sciences. Prior to this appointment in August 2006, he was with the University of Arizona, Tucson, USA (as a Research Assistant Professor); the University of the West of England, Bristol, UK; the University of Bristol, Bristol, UK; Tyco Telecommunications, Eatontown, USA; and the National Technical University of Athens, Athens, Greece. His current research interests include optical networks, error control coding, constrained coding, coded modulation, turbo equalization, OFDM applications, and quantum error correction. He presently directs the Optical Communications Systems Laboratory (OCSL) within the ECE Department at the University of Arizona.
    • Writing Effective Business Rules

      • 1st Edition
      • January 27, 2012
      • Graham Witt
      • English
      • Paperback
        9780123850515
      • eBook
        9780123850522
      Writing Effective Business Rules moves beyond the fundamental dilemma of system design: defining business rules either in natural language, intelligible but often ambiguous, or program code (or rule engine instructions), unambiguous but unintelligible to stakeholders. Designed to meet the needs of business analysts, this book provides an exhaustive analysis of rule types and a set of syntactic templates from which unambiguous natural language rule statements of each type can be generated. A user guide to the SBVR specification, it explains how to develop an appropriate business vocabulary and generate quality rule statements using the appropriate templates and terms from the vocabulary. The resulting rule statements can be reviewed by business stakeholders for relevance and correctness, providing for a high level of confidence in their successful implementation.
    • Logical Foundations of Artificial Intelligence

      • 1st Edition
      • July 5, 2012
      • Michael R. Genesereth + 1 more
      • English
      • Paperback
        9781493305988
      • eBook
        9780128015544
      Intended both as a text for advanced undergraduates and graduate students and as a key reference work for AI researchers and developers, Logical Foundations of Artificial Intelligence is a lucid, rigorous, and comprehensive account of the fundamentals of artificial intelligence from the standpoint of logic. The first section of the book introduces the logicist approach to AI, discussing the representation of declarative knowledge and featuring an introduction to the process of conceptualization, the syntax and semantics of predicate calculus, and the basics of other declarative representations such as frames and semantic nets. This section also provides a simple but powerful inference procedure, resolution, and shows how it can be used in a reasoning system. The next several chapters discuss nonmonotonic reasoning, induction, and reasoning under uncertainty, broadening the logical approach to deal with the inadequacies of strict logical deduction. The third section introduces modal operators that facilitate representing and reasoning about knowledge. This section also develops the process of writing predicate calculus sentences to the metalevel, to permit sentences about sentences and about reasoning processes. The final three chapters discuss the representation of knowledge about states and actions, planning, and intelligent system architecture. End-of-chapter bibliographic and historical comments provide background and point to other works of interest and research. Each chapter also contains numerous student exercises (with solutions provided in an appendix) to reinforce concepts and challenge the learner. A bibliography and index complete this comprehensive work.
    • Client-Side Attacks and Defense

      • 1st Edition
      • September 28, 2012
      • Sean-Philip Oriyano + 1 more
      • English
      • Paperback
        9781597495905
      • eBook
        9781597495912
      Client-Side Attacks and Defense offers the background needed to defend networks against client-side attacks. The book examines the forms of client-side attacks and discusses different kinds of attacks along with delivery methods including, but not limited to, browser exploitation, use of rich internet applications, and file format vulnerabilities. It also covers defenses, such as antivirus and anti-spyware software, intrusion detection systems, and end-user education. The book explains how to secure Web browsers, such as Microsoft Internet Explorer, Mozilla Firefox, Google Chrome, Apple Safari, and Opera, and discusses advanced Web attacks and advanced defenses against them. Moreover, it explores attacks on messaging, Web applications, and mobile devices. The book concludes with a discussion of security measures against client-side attacks, starting from the planning of security. It will be of great value to penetration testers, security consultants, system and network administrators, and IT auditors.
    • MATLAB® by Example

      • 1st Edition
      • December 31, 2012
      • Munther Gdeisat + 1 more
      • English
      • Hardback
        9780124052123
      • Paperback
        9780323282895
      • eBook
        9780124058538
      MATLAB By Example guides the reader through each step of writing MATLAB programs. The book assumes no previous programming experience on the part of the reader and uses multiple examples in clear language to introduce concepts and practical tools. Straightforward and detailed instructions allow beginners to learn and develop their MATLAB skills quickly. The book consists of ten chapters, discussing in detail the integrated development environment (IDE), scalars, vectors, arrays, adopting a structured programming style using functions and recursive functions, control flow, debugging, profiling, and structures. A chapter also describes the Symbolic Math Toolbox, teaching readers how to solve algebraic and differential equations and how to perform differentiation, integration, and Laplace and Fourier transforms. Containing hundreds of examples illustrated using screenshots, hundreds of exercises, and three projects, this book can complement coursework or serve as a self-study text, and can be used as a textbook in universities, colleges, and high schools.
    • CUDA Programming

      • 1st Edition
      • November 13, 2012
      • Shane Cook
      • English
      • Paperback
        9780124159334
      • eBook
        9780124159884
      If you need to learn CUDA but don't have experience with parallel computing, CUDA Programming: A Developer's Introduction offers a detailed guide to CUDA with a grounding in parallel fundamentals. It starts by introducing CUDA and bringing you up to speed on GPU parallelism and hardware, then delves into CUDA installation. Chapters on core concepts including threads, blocks, grids, and memory focus on both parallel and CUDA-specific issues. Later, the book demonstrates CUDA in practice for optimizing applications, adjusting to new hardware, and solving common problems.
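      The thread, block, and grid concepts mentioned in this blurb can be sketched in plain Python. This is an illustration of the indexing scheme only, not actual CUDA code: each simulated thread computes a global index the same way a real kernel does with blockIdx.x * blockDim.x + threadIdx.x.

```python
# Illustrative sketch (not real CUDA): simulates how a grid of thread
# blocks maps onto the elements of an array for a vector addition.
# Each (block, thread) pair computes one global index, exactly as a
# kernel would via blockIdx.x * blockDim.x + threadIdx.x.

def simulated_vector_add(a, b, threads_per_block=4):
    n = len(a)
    out = [0.0] * n
    # Round up so the grid covers every element (a common CUDA idiom).
    blocks = (n + threads_per_block - 1) // threads_per_block
    for block_idx in range(blocks):                  # the grid of blocks
        for thread_idx in range(threads_per_block):  # threads in a block
            i = block_idx * threads_per_block + thread_idx
            if i < n:                                # guard against overrun
                out[i] = a[i] + b[i]
    return out

print(simulated_vector_add([1, 2, 3, 4, 5], [10, 20, 30, 40, 50]))
# → [11, 22, 33, 44, 55]
```

      The bounds check `if i < n` mirrors the guard every real kernel needs when the array length is not a multiple of the block size.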
    • An Analysis of the Information Technology Standardization Process

      • 1st Edition
      • December 2, 2012
      • J.L. Berg + 1 more
      • English
      • Paperback
        9780444567703
      • eBook
        9780444597229
      A number of important issues form the basis of this book: How can the Information Technology (IT) standardization process, leading to unified products which are needed on the market, be made more efficient? Which current IT standards are of high quality, what factors have led to that high quality, and can those factors be re-created for other IT standards? What improvements to the quality of IT standards are needed? Which organizations should be involved? What permanent changes in the IT standardization scene are necessary? At what point in the evolution of a technology is it appropriate to produce standards? Is strategic planning feasible in the current standardization approach? Diverse disciplines contributed to the findings in this book: computer scientists, standardization leaders and professionals, users and vendors, economists, auditors, software implementors, and communication specialists.
    • Measuring Data Quality for Ongoing Improvement

      • 1st Edition
      • December 31, 2012
      • Laura Sebastian-Coleman
      • English
      • Paperback
        9780123970336
      • eBook
        9780123977540
      The Data Quality Assessment Framework (DQAF) shows you how to measure and monitor data quality, ensuring quality over time. You’ll start with general concepts of measurement and work your way through a detailed framework of more than three dozen measurement types related to five objective dimensions of quality: completeness, timeliness, consistency, validity, and integrity. Ongoing measurement, rather than one-time activities, will help your organization reach a new level of data quality. This plain-language approach to measuring data can be understood by both business and IT, and the book provides practical guidance on how to apply the DQAF within any organization, enabling you to prioritize measurements and effectively report on results. Strategies for using data measurement to govern and improve the quality of data, and guidelines for applying the framework within a data asset, are included. You’ll come away able to prioritize which measurement types to implement, knowing where to place them in a data flow and how frequently to measure. Common conceptual models for defining and storing data quality results for purposes of trend analysis are also included, as are generic business requirements for ongoing measuring and monitoring, including the calculations and comparisons that make the measurements meaningful, help you understand trends, and detect anomalies.
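      One of the objective dimensions named in this blurb, completeness, can be illustrated with a short Python sketch. The record layout and field names ("customer_id", "email") are invented for illustration; the DQAF itself defines many more measurement types than this.

```python
# Hypothetical sketch of a single completeness measurement: the
# fraction of records in which a field is present and non-empty.

def completeness(records, field):
    """Share of records where `field` is populated (not None or "")."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

rows = [
    {"customer_id": 1, "email": "a@example.com"},
    {"customer_id": 2, "email": ""},   # empty value counts as missing
    {"customer_id": 3},                # field absent entirely
]
print(completeness(rows, "customer_id"))  # → 1.0
print(completeness(rows, "email"))        # → 0.3333333333333333
```

      Tracking a metric like this on each data load, rather than once, is the kind of ongoing measurement the book argues for.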
    • Introduction to Data Compression

      • 4th Edition
      • October 4, 2012
      • Khalid Sayood
      • English
      • eBook
        9780124160002
      Introduction to Data Compression, Fourth Edition, is a concise and comprehensive guide to the art and science of data compression. This new edition includes all the cutting-edge updates the reader will need during the work day and in class. It provides an extensive introduction to the theory underlying today’s compression techniques, with detailed instruction for their applications, using several examples to explain the concepts. Encompassing the entire field of data compression, this book covers lossless and lossy compression, Huffman coding, arithmetic coding, dictionary techniques, context-based compression, and scalar and vector quantization. New to this fourth edition is a more detailed description of the JPEG 2000 standard as well as speech coding for internet applications. Source code is also provided via a companion web site, giving readers the opportunity to build their own algorithms and to choose and implement techniques in their own applications. This text will appeal to professionals, software and hardware engineers, students, and anyone interested in digital libraries and multimedia.
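      Huffman coding, one of the lossless techniques listed in this blurb, fits in a few lines of Python. This is a minimal sketch for intuition, not the book's implementation: the two least frequent subtrees are repeatedly merged, so frequent symbols end up with short codewords.

```python
# Minimal Huffman-coding sketch: build a prefix code from symbol
# frequencies by repeatedly merging the two least frequent subtrees.
import heapq
from collections import Counter

def huffman_codes(text):
    """Return {symbol: bitstring}; frequent symbols get shorter codes."""
    freq = Counter(text)
    # Heap entries: (frequency, unique tiebreaker, {symbol: code-so-far}).
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)  # two least frequent subtrees
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (f1 + f2, count, merged))
        count += 1
    return heap[0][2]

codes = huffman_codes("abracadabra")
# 'a' occurs 5 of 11 times, so it receives the shortest codeword.
assert all(len(codes["a"]) <= len(codes[s]) for s in codes)
```

      Because no codeword is a prefix of another, the encoded bitstream can be decoded unambiguously without separators.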
    • Computation and Storage in the Cloud

      • 1st Edition
      • December 31, 2012
      • Dong Yuan + 2 more
      • English
      • Paperback
        9780124077676
      • eBook
        9780124078796
      Computation and Storage in the Cloud is the first comprehensive and systematic work investigating the issue of the computation and storage trade-off in the cloud in order to reduce the overall application cost. Scientific applications are usually computation and data intensive, where complex computation tasks take a long time to execute and the generated datasets are often terabytes or petabytes in size. Storing valuable generated application datasets can save their regeneration cost when they are reused, not to mention the waiting time caused by regeneration. However, the large size of the scientific datasets is a big challenge for their storage. By proposing innovative concepts, theorems, and algorithms, this book will help bring the cost down dramatically for both cloud users and service providers to run computation and data intensive scientific applications in the cloud.
      • Covers cost models and benchmarking that explain the necessary trade-offs for both cloud providers and users
      • Describes several novel strategies for storing application datasets in the cloud
      • Includes real-world case studies of scientific research applications
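      The storage-versus-regeneration trade-off this blurb describes can be illustrated with a toy cost model. The numbers and cost structure below are invented for illustration; the book develops far more sophisticated cost models and storage strategies.

```python
# Toy break-even model: store a derived dataset when its expected
# storage cost over a retention period is lower than the expected
# cost of regenerating it on demand each time it is reused.

def should_store(size_gb, months, storage_per_gb_month,
                 regen_compute_cost, expected_reuses):
    storage_cost = size_gb * storage_per_gb_month * months
    regeneration_cost = regen_compute_cost * expected_reuses
    return storage_cost < regeneration_cost

# A 500 GB dataset kept 12 months at $0.02/GB-month costs $120 to
# store; regenerating it twice at $100 a run costs $200, so store it.
print(should_store(500, 12, 0.02, 100, 2))  # → True
# Expected to be reused only once: regeneration ($100) is cheaper.
print(should_store(500, 12, 0.02, 100, 1))  # → False
```

      Even this crude model shows why the decision depends jointly on dataset size, reuse frequency, and compute price, which is the trade-off space the book explores.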