Books in Data

51–60 of 75 results

Cyber Security and IT Infrastructure Protection

  • 1st Edition
  • August 22, 2013
  • John Vacca
  • English
  • Paperback
    978-0-12-416681-3
  • eBook
    978-0-12-420047-0
This book serves as a security practitioner’s guide to today’s most crucial issues in cyber security and IT infrastructure protection. It offers in-depth coverage of theory, technology, and practice for established technologies as well as recent advancements, and explores practical solutions to a wide range of cyber-physical and IT infrastructure protection issues. Composed of 11 chapters contributed by leading experts in their fields, the book covers disaster recovery, biometrics, homeland security, cyber warfare, cyber security, national infrastructure security, access controls, vulnerability assessments and audits, cryptography, and operational and organizational security, and includes an extensive glossary of security terms and acronyms. Written with instructors and students in mind, it teaches methods of analysis and problem-solving through hands-on exercises, worked examples, questions and answers, and real-life case studies. The format includes the following pedagogical elements:

  • Checklists throughout each chapter to gauge understanding
  • Chapter review questions/exercises and case studies
  • Ancillaries: solutions manual, slide package, and figure files

This format will appeal to universities and career schools as well as federal and state agencies, corporate security training programs, and ASIS certification candidates.

Principles of Big Data

  • 1st Edition
  • May 20, 2013
  • Jules J. Berman
  • English
  • Paperback
    978-0-12-404576-7
  • eBook
    978-0-12-404724-2
Principles of Big Data helps readers avoid the common mistakes that endanger all Big Data projects. By stressing simple, fundamental concepts, the book teaches readers how to organize large volumes of complex data and how to achieve data permanence when the content of the data is constantly changing. General methods for data verification and validation, as specifically applied to Big Data resources, are emphasized throughout. The book demonstrates how adept analysts can find relationships among data objects held in disparate Big Data resources when the data objects are endowed with semantic support (i.e., organized in classes of uniquely identified data objects). Readers will learn how their data can be integrated with data from other resources, and how data extracted from Big Data resources can be used for purposes beyond those imagined by the data creators.
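
To make that parenthetical concrete, here is a minimal sketch of a data object with "semantic support" in the blurb's sense: a permanent unique identifier plus membership in a named class. The class name and fields are hypothetical illustrations, not examples from the book:

```python
# A minimal sketch of a "semantically supported" data object: every object
# carries a permanent unique identifier and belongs to a named class, so
# records from disparate resources can later be related to one another.
# The class name and fields below are hypothetical illustrations.
import uuid

class DataObject:
    def __init__(self, class_name: str, **attributes):
        self.uid = str(uuid.uuid4())   # permanent, globally unique identifier
        self.class_name = class_name   # membership in a class of data objects
        self.attributes = attributes   # the payload may change over time

    def __repr__(self):
        return f"<{self.class_name} {self.uid} {self.attributes}>"

sample = DataObject("BloodSpecimen", donor="anonymous-421", volume_ml=5)
print(sample)  # the uid stays fixed even if the attributes are later updated
```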

Simple Steps to Data Encryption

  • 1st Edition
  • April 25, 2013
  • Peter Loshin
  • English
  • Paperback
    978-0-12-411483-8
  • eBook
    978-0-12-407882-6
Everyone wants privacy and security online, yet most computer users have more or less given up on both where their personal data is concerned. There is no shortage of good encryption software, and no shortage of books, articles, and essays that purport to explain how to use it. Yet there is precious little for ordinary users who want just enough information about encryption to use it safely, securely, and appropriately, without having to become experts in cryptography. Data encryption is a powerful tool if used properly: it turns ordinary, readable data into what looks like gibberish, but gibberish that only the intended user can turn back into readable data, as sketched in the code example after this description. The difficulty lies in deciding what kinds of threats one needs to protect against and then using the proper tool in the correct way. It is much like a manual transmission in a car: learning to drive with one is easy; learning to build one is hard. The goal of this title is to present just enough for an average reader to begin protecting his or her data immediately. Books and articles currently available about encryption start out with statistics and reports on the costs of data loss, quickly get bogged down in cryptographic theory and jargon, and then attempt to comprehensively list all the latest tools and techniques. After step-by-step walkthroughs of the download and install process, there is little room left for what most readers really want: how to encrypt a thumb drive or email message, or digitally sign a data file. There are terabytes of content that explain how cryptography works and why it is important, but precious little that couples concrete threats to data with explicit responses to those threats. This title fills that niche with a step-by-step, hands-on guide that includes:

  • Simple descriptions of actual threat scenarios
  • Simple, step-by-step instructions for securing data
  • How to use open source, time-proven, and peer-reviewed cryptographic software
  • Easy-to-follow tips for safer computing
  • Unbiased and platform-independent coverage of encryption tools and techniques
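
The round trip described above (readable data in, gibberish out, readable data back only for the key holder) can be sketched in a few lines of Python. This is a minimal illustration using the open-source cryptography package's Fernet recipe; the package choice and message are assumptions for this listing, not the book's own examples:

```python
# A minimal sketch of symmetric encryption: readable data in, "gibberish"
# out, and back to readable data only for whoever holds the key.
# Requires the third-party package:  pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # secret key; anyone holding it can decrypt
cipher = Fernet(key)

token = cipher.encrypt(b"meet me at noon")  # looks like gibberish on the wire
print(token)

plaintext = cipher.decrypt(token)           # only possible with the same key
print(plaintext)                            # b'meet me at noon'
```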

Measuring Data Quality for Ongoing Improvement

  • 1st Edition
  • December 31, 2012
  • Laura Sebastian-Coleman
  • English
  • Paperback
    978-0-12-397033-6
  • eBook
    978-0-12-397754-0
The Data Quality Assessment Framework (DQAF) shows you how to measure and monitor data quality, ensuring quality over time. You’ll start with general concepts of measurement and work your way through a detailed framework of more than three dozen measurement types related to five objective dimensions of quality: completeness, timeliness, consistency, validity, and integrity. Ongoing measurement, rather than one-time activities, will help your organization reach a new level of data quality. This plain-language approach to measuring data can be understood by both business and IT, and it provides practical guidance on applying the DQAF within any organization, enabling you to prioritize measurements and report effectively on results. Strategies for using data measurement to govern and improve the quality of data, and guidelines for applying the framework within a data asset, are included. You’ll come away able to prioritize which measurement types to implement, knowing where to place them in a data flow and how frequently to measure. Also included are common conceptual models for defining and storing data quality results for trend analysis, along with generic business requirements for ongoing measuring and monitoring, including the calculations and comparisons that make measurements meaningful and help you understand trends and detect anomalies.
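
As a flavor of what one such measurement type might look like in code, here is a minimal sketch of a column-completeness check against a threshold. The records, field name, and 98% threshold are hypothetical; the DQAF defines the measurement types themselves, not this code:

```python
# A minimal sketch of one data-quality measurement: column completeness,
# i.e., the share of records where a required field is actually populated.
# The field names, records, and 98% threshold are hypothetical examples.

def completeness(records, field, threshold=0.98):
    populated = sum(1 for r in records if r.get(field) not in (None, ""))
    ratio = populated / len(records)
    return {"field": field, "ratio": ratio, "pass": ratio >= threshold}

customers = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": ""},
    {"id": 3, "email": "c@example.com"},
]

print(completeness(customers, "email"))
# {'field': 'email', 'ratio': 0.666..., 'pass': False}
```

Measured on a schedule rather than once, the ratio becomes a trend line, which is what makes anomalies in it detectable.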

Architecture and Patterns for IT Service Management, Resource Planning, and Governance

  • 2nd Edition
  • September 23, 2011
  • Charles T. Betz
  • English
  • Paperback
    978-0-12-385017-1
  • eBook
    978-0-12-385018-8
Information technology supports efficient operations, enterprise integration, and seamless value delivery, yet IT itself is too often inefficient, un-integrated, and of unclear value. This completely rewritten edition of the bestselling Architecture and Patterns for IT Service Management, Resource Planning and Governance retains the original (and still unique) approach: apply the discipline of enterprise architecture to the business of large-scale IT management itself. Author Charles Betz applies his deep practitioner experience to a critical reading of ITIL 2011, COBIT version 4, the CMMI suite, the IT portfolio management literature, and the Agile/Lean IT convergence, and derives a value stream analysis, an IT semantic model, and an enabling systems architecture (covering current topics such as CMDB/CMS, service catalog, and IT portfolio management). Using the concept of design patterns, the book then presents dozens of visual models documenting challenging problems in integrating IT management, showing how process, data, and IT management systems must work together to enable IT and its business partners. This edition retains the fundamental discipline of traceable process, data, and system analysis that made the first edition a favored desk reference for IT process analysts around the world. It is a must-read for anyone charged with enterprise architecture, IT planning, or IT governance and management.

Digital Forensics for Legal Professionals

  • 1st Edition
  • September 2, 2011
  • Larry Daniel + 1 more
  • English
  • Paperback
    978-1-59749-643-8
  • eBook
    978-1-59749-644-5
Digital Forensics for Legal Professionals is a complete non-technical guide that helps legal professionals and students understand digital forensics. In the authors’ years of experience working with attorneys as digital forensics experts, common questions arise again and again: "What do I ask for?" "Is the evidence relevant?" "What does this item in the forensic report mean?" "What should I ask the other expert?" "What should I ask you?" "Can you explain that to a jury?" This book answers many of those questions in clear language understandable by non-technical people. With many illustrations and diagrams usable in court, it explains technical concepts such as unallocated space, forensic copies, timeline artifacts, and metadata in simple terms that make them accessible to both attorneys and juries. The book also explains how to determine what evidence to ask for and what might be discoverable, and it provides an overview of the current state of digital forensics, how to select a qualified expert, what to expect from that expert, and how to properly use experts before and during trial. Readers will come away with a clear understanding of the different types of digital evidence, along with examples of direct and cross-examination questions. A reference section includes definitions of digital forensic terms, relevant case law, and resources. This book will be a valuable resource for attorneys, judges, paralegals, and digital forensic professionals.

Windows Registry Forensics

  • 1st Edition
  • January 3, 2011
  • Harlan Carvey
  • English
  • eBook
    978-1-59749-581-3
Windows Registry Forensics provides the background of the Windows Registry to help develop an understanding of the binary structure of Registry hive files. Approaches to live response and analysis are included, and tools and techniques for postmortem analysis are discussed at length. These take the student and analyst beyond the current use of viewers and into real analysis of data contained in the Registry, demonstrating its forensic value. Named a 2011 Best Digital Forensics Book by InfoSec Reviews, this book is packed with real-world examples using freely available open source tools. It also includes case studies and a CD containing code and author-created tools discussed in the book. It will appeal to computer forensic and incident response professionals, including federal government and commercial/private-sector contractors and consultants.
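
As a taste of the kind of postmortem hive analysis described above, the following is a minimal sketch using the open-source python-registry library. The library is an illustrative substitute chosen for this sketch, not one of the author-created tools on the book's CD, and the hive file and key path are hypothetical:

```python
# A minimal sketch of offline (postmortem) Registry hive parsing with the
# open-source python-registry library:  pip install python-registry
# The hive file and key path below are hypothetical examples.
from Registry import Registry

reg = Registry.Registry("NTUSER.DAT")  # a hive copied from a forensic image

# Programs launched at logon are a classic artifact of forensic interest.
key = reg.open("Software\\Microsoft\\Windows\\CurrentVersion\\Run")

print("Key last written:", key.timestamp())  # the key's LastWrite time
for value in key.values():
    print(value.name(), "=>", value.value())
```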

Handbook of Blind Source Separation

  • 1st Edition
  • February 17, 2010
  • Pierre Comon + 1 more
  • English
  • Hardback
    978-0-12-374726-6
  • eBook
    978-0-08-088494-3
Edited by the people who were forerunners in creating the field, together with contributions from 34 leading international experts, this handbook provides the definitive reference on Blind Source Separation, giving a broad and comprehensive description of all the core principles and methods, numerical algorithms, and major applications in the fields of telecommunications, biomedical engineering, and audio, acoustic, and speech processing. Going beyond a machine learning perspective, the book reflects recent results in signal processing and numerical analysis, and includes topics such as optimization criteria, mathematical tools, the design of numerical algorithms, convolutive mixtures, and time-frequency approaches. This handbook is an ideal reference for university researchers, R&D engineers, and graduates wishing to learn the core principles, methods, algorithms, and applications of Blind Source Separation.
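
For a first hands-on taste of the topic, the sketch below applies FastICA (independent component analysis, one of the classic BSS methods) to two toy mixtures using scikit-learn. The signals and mixing matrix are invented for illustration and are not examples from the handbook:

```python
# A toy blind source separation example: recover two source signals from
# two observed linear mixtures using FastICA.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)

# Two hypothetical sources: a sinusoid and a square wave.
s1 = np.sin(2 * t)
s2 = np.sign(np.sin(3 * t))
S = np.c_[s1, s2]

A = np.array([[1.0, 0.5],            # unknown mixing matrix (toy values)
              [0.4, 1.0]])
X = S @ A.T                          # observed mixtures, one row per sample
X += 0.02 * rng.standard_normal(X.shape)  # a little sensor noise

ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)         # estimated sources

# ICA recovers sources only up to permutation, sign, and scale, so check
# the |correlation| of each estimate against both true sources: one entry
# per row should be close to 1.
for i in range(2):
    print([round(abs(np.corrcoef(S_est[:, i], s)[0, 1]), 2) for s in (s1, s2)])
```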

Joint Source-Channel Decoding

  • 1st Edition
  • November 26, 2009
  • Pierre Duhamel + 1 more
  • English
  • eBook
    978-0-08-092244-7
  • Treats joint source and channel decoding in an integrated way
  • Gives a clear description of the problems in the field, together with the mathematical tools for their solution
  • Contains many detailed examples useful for practical applications of the theory to video broadcasting over mobile and wireless networks

Traditionally, cross-layer and joint source-channel coding were seen as incompatible with classically structured networks, but recent advances in theory have changed this situation. Joint source-channel decoding is now seen as a viable alternative to separate decoding of source and channel codes, provided the protocol layers are taken into account. This book therefore addresses a joint source/protocol/channel approach: all levels of the protocol stack are considered, showing how the information in each layer influences the others. The book provides the tools to show how cross-layer and joint source-channel coding and decoding are now compatible with present-day mobile and wireless networks, with a particular application to the key area of video transmission to mobiles. Typical applications are broadcasting or point-to-point delivery of multimedia content, which are very timely in the context of the current development of mobile services such as audio (MPEG-4 AAC) or video (H.263, H.264) transmission using recent wireless transmission standards (DVB-H, DVB-SH, WiMAX, LTE). This cross-disciplinary book is ideal for graduate students, researchers, and, more generally, professionals working in signal processing for communications or in networking applications who are interested in reliable multimedia transmission. It is also of interest to people involved in cross-layer optimization of mobile networks; its content may provide them with other points of view on their optimization problems, enlarging the set of tools they can use.

Pierre Duhamel is director of research at CNRS/LSS and previously held research positions at Thomson-CSF, CNET, and ENST, where he was head of the Signal and Image Processing Department. He has served as chairman of the DSP committee, as an associate editor of the IEEE Transactions on Signal Processing and Signal Processing Letters, and as a co-chair of the MMSP and ICASSP conferences. He was awarded the Grand Prix France Telecom by the French Academy of Sciences in 2000. He is co-author of more than 80 papers in international journals, 250 conference proceedings papers, and 28 patents. Michel Kieffer is an assistant professor in signal processing for communications at the Université Paris-Sud and a researcher at the Laboratoire des Signaux et Systèmes, Gif-sur-Yvette, France. His research interests are in joint source-channel coding and decoding techniques for the reliable transmission of multimedia content. He serves as an associate editor of Signal Processing (Elsevier). He is co-author of more than 90 contributions to journals, conference proceedings, and book chapters.

Pricing, Risk, and Performance Measurement in Practice

  • 1st Edition
  • October 22, 2009
  • Wolfgang Schwerdt + 1 more
  • English
  • eBook
    978-0-08-092304-8
How can managers increase their ability to calculate price and risk data for financial instruments while decreasing their dependence on a myriad of specific instrument variants? Wolfgang Schwerdt and Marcelle von Wendland created a simple and consistent way to handle and process large amounts of complex financial data. By means of a practical framework, their approach analyzes the market and credit risk exposure of financial instruments and portfolios and calculates risk-adjusted performance measures. Its emphasis on standardization yields significant improvements in speed and accuracy. Schwerdt and von Wendland's focus on practical implementation directly addresses the limitations imposed by the complex and costly processing time required for advanced risk management models and for pricing hundreds of thousands of securities each day. Their many examples and programming code demonstrate how to use standards to build financial instruments, how to price them, and how to measure the risk and performance of the portfolios that include them.
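
As a flavor of what pricing a standardized instrument description involves, here is a minimal sketch that values a fixed-coupon bond by discounting its cash flows at a flat rate. The instrument fields and numbers are hypothetical and do not reproduce Schwerdt and von Wendland's framework:

```python
# A minimal sketch of pricing one standardized instrument description:
# a fixed-coupon bond valued by discounting its cash flows at a flat rate.
# All fields and numbers are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class FixedCouponBond:
    face: float          # redemption amount
    coupon_rate: float   # annual coupon as a fraction of face
    years: int           # whole years to maturity, annual coupons

def present_value(bond: FixedCouponBond, discount_rate: float) -> float:
    coupon = bond.face * bond.coupon_rate
    pv = sum(coupon / (1 + discount_rate) ** t
             for t in range(1, bond.years + 1))
    pv += bond.face / (1 + discount_rate) ** bond.years
    return pv

bond = FixedCouponBond(face=100.0, coupon_rate=0.05, years=10)
print(round(present_value(bond, 0.04), 2))  # ~108.11: priced above par
```

The standardization point is that once many instrument variants map onto one such description, a single pricing routine can be run over all of them.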