Morgan Kaufmann

    • Agile Systems Engineering

      • 1st Edition
      • September 24, 2015
      • Bruce Powel Douglass
      • English
      • Paperback
        978-0-12-802120-0
      • eBook
        978-0-12-802349-5
      Agile Systems Engineering presents a vision of systems engineering where precise specification of requirements, structure, and behavior meets larger concerns such as safety, security, reliability, and performance in an agile engineering context. World-renowned author and speaker Dr. Bruce Powel Douglass incorporates agile methods and model-based systems engineering (MBSE) to define the properties of entire systems while avoiding errors that can occur when using traditional textual specifications. Dr. Douglass covers the lifecycle of systems development, including requirements, analysis, design, and the handoff to specific engineering disciplines. Throughout, Dr. Douglass couples agile methods with SysML and MBSE to arm systems engineers with the conceptual and methodological tools they need to avoid specification defects and improve system quality while simultaneously reducing the effort and cost of systems engineering.
    • Agile Data Warehousing for the Enterprise

      • 1st Edition
      • September 19, 2015
      • Ralph Hughes
      • English
      • Paperback
        978-0-12-396464-9
      • eBook
        978-0-12-396518-9
      Building upon his earlier book that detailed agile data warehousing programming techniques for the Scrum master, Ralph Hughes's latest work illustrates the agile interpretations of the remaining software engineering disciplines: Requirements management benefits from streamlined templates that not only define projects quickly, but also ensure nothing essential is overlooked. Data engineering receives two new "hyper modeling" techniques, yielding data warehouses that can be easily adapted when requirements change, without having to invest in ruinously expensive data-conversion programs. Quality assurance advances with not only a stereoscopic top-down and bottom-up planning method, but also the incorporation of the latest automated test engines. Use this step-by-step guide to deepen your own application development skills through self-study, show your teammates the world's fastest and most reliable techniques for creating business intelligence systems, or ensure that the IT department working for you is building your next decision support system the right way.
    • Building a Scalable Data Warehouse with Data Vault 2.0

      • 1st Edition
      • September 15, 2015
      • Daniel Linstedt + 1 more
      • English
      • Paperback
        978-0-12-802510-9
      • eBook
        978-0-12-802648-9
      The Data Vault was invented by Dan Linstedt at the U.S. Department of Defense, and the standard has been successfully applied to data warehousing projects at organizations of all sizes, from small companies to large corporations. Due to its simplified design, which is adapted from nature, the Data Vault 2.0 standard helps prevent typical data warehousing failures. "Building a Scalable Data Warehouse" covers everything one needs to know to create a scalable data warehouse end to end, including a presentation of the Data Vault modeling technique, which provides the foundation for a technical data warehouse layer. The book discusses how to build the data warehouse incrementally using the agile Data Vault 2.0 methodology. In addition, readers will learn how to create the input layer (the stage layer) and the presentation layer (data mart) of the Data Vault 2.0 architecture, including implementation best practices. Drawing upon years of practical experience and using numerous examples and an easy-to-understand framework, Dan Linstedt and Michael Olschimke discuss how to load each layer using SQL Server Integration Services (SSIS), including automation of the Data Vault loading processes; important data warehouse technologies and practices; and Data Quality Services (DQS) and Master Data Services (MDS) in the context of the Data Vault architecture.
    • Presumptive Design

      • 1st Edition
      • September 10, 2015
      • Leo Frishberg + 1 more
      • English
      • Paperback
        978-0-12-803086-8
      • eBook
        978-0-12-803087-5
      Everything you know about the future is wrong. Presumptive Design: Design Provocations for Innovation is for people “inventing” the future: future products, services, companies, strategies, and policies. It introduces a design-research method that shortens time to insights from months to days. Presumptive Design is a fundamentally agile approach to identifying your audiences’ key needs. Offering rapidly crafted artifacts, your teams collaborate with your customers to identify preferred and profitable elements of your desired outcome. Presumptive Design focuses on your users’ problem space, informing your business strategy, your project’s early-stage definition, and your innovation pipeline. Combining discussions of design theory with case studies and how-to’s, the book offers business leadership, management, and innovators the benefits of design thinking and user experience in the context of early-stage problem definition. Presumptive Design is an advanced technique and quick to use: within days of reading this book, your research and design teams can apply the approach to capture a risk-reduced view of your future.
    • Embedded Systems

      • 1st Edition
      • September 3, 2015
      • Jason D. Bakos
      • English
      • Paperback
        978-0-12-800342-8
      • eBook
        978-0-12-800412-8
      Embedded Systems: ARM Programming and Optimization combines an exploration of the ARM architecture with an examination of the facilities offered by the Linux operating system to explain how various features of program design can influence processor performance. It demonstrates methods by which a programmer can optimize program code in a way that does not impact its behavior but improves its performance. Several applications, including image transformations, fractal generation, image convolution, and computer vision tasks, are used to describe and demonstrate these methods. From this, the reader will gain insight into computer architecture and application design, as well as gain practical knowledge in the area of embedded software design for modern embedded systems.
    • Problem-solving in High Performance Computing

      • 1st Edition
      • September 1, 2015
      • Igor Ljubuncic
      • English
      • Paperback
        978-0-12-801019-8
      • eBook
        978-0-12-801064-8
      Problem-Solving in High Performance Computing: A Situational Awareness Approach with Linux focuses on understanding giant computing grids as cohesive systems. Unlike other titles on general problem-solving or system administration, this book offers a cohesive approach to complex, layered environments, highlighting the difference between standalone system troubleshooting and complex problem-solving in large, mission-critical environments. It addresses the pitfalls of information overload and of micro and macro symptoms, and includes methods for managing problems in large computing ecosystems. The author offers perspective gained from years of developing Intel-based systems that lead the industry in the number of hosts, software tools, and licenses used in chip design. The book offers unique, real-life examples that emphasize the magnitude and operational complexity of high performance computer systems.
    • The Art and Science of Analyzing Software Data

      • 1st Edition
      • August 27, 2015
      • Christian Bird + 2 more
      • English
      • Paperback
        978-0-12-411519-4
      • eBook
        978-0-12-411543-9
      The Art and Science of Analyzing Software Data provides valuable information on analysis techniques often used to derive insight from software data. This book shares best practices generated by leading data scientists, collected from their experience training software engineering students and practitioners to master data science. The book covers topics such as the analysis of security data, code reviews, app stores, log files, and user telemetry, among others. It covers a wide variety of techniques such as co-change analysis, text analysis, topic analysis, and concept analysis, as well as advanced topics such as release planning and generation of source code comments. It includes stories from the trenches by expert data scientists, illustrating how to apply data analysis in industry and open source, present results to stakeholders, and drive decisions.
    • Structured Search for Big Data

      • 1st Edition
      • August 26, 2015
      • Mikhail Gilula
      • English
      • Paperback
        978-0-12-804631-9
      • eBook
        978-0-12-804652-4
      The WWW era made billions of people dramatically dependent on the progress of data technologies, of which Internet search and Big Data are arguably the most notable. The Structured Search paradigm connects them via a fundamental concept of key-objects, which evolve out of keywords as the units of search. The key-object data model and KeySQL revamp the data independence principle, making it applicable to Big Data, and complement NoSQL with full-blown structured querying functionality. The ultimate goal is extracting Big Information from Big Data. As a Big Data consultant, Mikhail Gilula combines an academic background with 20 years of industry experience in database and data warehousing technologies, having worked as a Sr. Data Architect for Teradata, Alcatel-Lucent, and PayPal, among others. He has authored three books, including The Set Model for Database and Information Systems, and holds four US patents in structured search and data integration.
    • Applied Computing in Medicine and Health

      • 1st Edition
      • August 21, 2015
      • Dhiya Al-Jumeily + 3 more
      • English
      • Paperback
        978-0-12-803468-2
      • eBook
        978-0-12-803498-9
      Applied Computing in Medicine and Health is a comprehensive presentation of ongoing investigations into current applied computing challenges and advances, with a focus on a particular class of applications, primarily artificial intelligence methods and techniques in medicine and health. Applied computing is the use of practical computer science knowledge to enable the use of the latest technology and techniques in a variety of fields ranging from business to scientific research. One of the most important and relevant areas in applied computing is the use of artificial intelligence (AI) in health and medicine. Artificial intelligence in health and medicine (AIHM) is taking on the challenge of creating and distributing tools that can support medical doctors and specialists in new endeavors. The material included covers a wide variety of interdisciplinary perspectives concerning the theory and practice of applied computing in medicine, human biology, and health care. Particular attention is given to AI-based clinical decision-making, medical knowledge engineering, knowledge-based systems in medical education and research, intelligent medical information systems, intelligent databases, intelligent devices and instruments, medical AI tools, reasoning and metareasoning in medicine, and methodological, philosophical, ethical, and intelligent medical data analysis.
    • Topics in Parallel and Distributed Computing

      • 1st Edition
      • August 21, 2015
      • Sushil K Prasad + 4 more
      • English
      • Paperback
        978-0-12-803899-4
      • eBook
        978-0-12-803938-0
      Topics in Parallel and Distributed Computing provides resources and guidance for those learning PDC as well as those teaching students new to the discipline. The pervasiveness of computing devices containing multicore CPUs and GPUs, including home and office PCs, laptops, and mobile devices, is making even common users dependent on parallel processing. Certainly, it is no longer sufficient for even basic programmers to acquire only the traditional sequential programming skills. The preceding trends point to the need for imparting a broad-based skill set in PDC technology. However, the rapid changes in computing hardware platforms and devices, languages, supporting programming environments, and research advances pose a challenge both for newcomers and seasoned computer scientists. This edited collection has been developed over the past several years in conjunction with the IEEE Technical Committee on Parallel Processing (TCPP), which held several workshops and discussions on learning parallel computing and integrating parallel concepts into courses throughout computer science curricula.