This book constitutes the proceedings of the 18th International Workshop on Computer Algebra in Scientific Computing, CASC 2016, held in Bucharest, Romania, in September 2016. The 32 papers presented in this volume were carefully reviewed and selected from 39 submissions. They deal with cutting-edge research in all major disciplines of Computer Algebra.
This third edition presents an expanded and updated treatment of convex analysis methods, incorporating many new results that have emerged in recent years. These additions are essential for grasping the practical applications of convex function theory in solving contemporary real-world problems.
To reflect these advancements, the material has been meticulously reorganized, with a greater emphasis on topics relevant to current research. Additionally, great care has been taken to ensure that the text remains accessible to a broad audience, including both students and researchers focused on the application of mathematics.
Ideal for undergraduate courses, graduate seminars, or as a comprehensive reference, this book is an indispensable resource for those seeking to understand the extensive potential of convex function theory.
In addition, this book:
This book looks at how Australia's migrant population composition is likely to change over coming decades. The book divides Australia's population into 48 country-of-birth groupings and projects the birthplace populations out to 2066 according to a range of scenarios. These projections indicate a massive shift in Australia's migrant composition from a European- to an Asian-dominated population over the coming decades - a change which can be interpreted as a third demographic transition. By providing detailed consideration of the implications of the changing population composition, this book is a great resource for academics, government, and private sector services.
This book provides a comprehensive introduction to embedded flash memory, describing the history, current status, and future projections for technology, circuits, and systems applications. The authors describe current mainstream embedded flash technologies - floating-gate 1Tr, floating-gate with split-gate (1.5Tr), and 1Tr/1.5Tr SONOS flash technologies - and their successful use in various applications. Comparisons of these embedded flash technologies and future projections are also provided. The authors demonstrate a variety of embedded applications for automotive, smart-IC card, and low-power uses, representing the leading-edge technology developments for eFlash. The discussion also includes insights into the future prospects of application-driven non-volatile memory technology in the era of smart advanced automotive systems, such as ADAS (Advanced Driver Assistance Systems) and IoE (Internet of Everything). Trials on technology convergence and future prospects of embedded non-volatile memory in the new memory hierarchy are also described.
This book is the culmination of the authors' industry-academic collaboration in the past several years. The investigation is largely motivated by bank balance sheet management problems. The main difference between a bank balance sheet management problem and a typical portfolio optimization problem is that the former involves multiple risks. The related theoretical investigation leads to a significant extension of the scope of portfolio theories. The book combines practitioners' perspectives and mathematical rigor. For example, to guide the bank managers to trade off different Pareto efficient points, the topological structure of the Pareto efficient set is carefully analyzed. Moreover, on top of computing solutions, the authors focus the investigation on the qualitative properties of those solutions and their financial meanings. These relations, such as the role of duality, are most useful in helping bank managers to communicate their decisions to the different stakeholders. Finally, bank balance sheet management problems of varying levels of complexity are discussed to illustrate how to apply the central mathematical results. Although the primary motivation and application examples in this book are focused in the area of bank balance sheet management problems, the range of applications of the general portfolio theory is much wider. As a matter of fact, most financial problems involve multiple types of risks. Thus, the book is a good reference for financial practitioners in general and students who are interested in financial applications. This book can also serve as a nice example of a case study for applied mathematicians who are interested in engaging in industry-academic collaboration.
This book addresses Birkhoff and Mal'cev's problem of describing subquasivariety lattices. The text begins by developing the basics of atomic theories and implicational theories in languages that may, or may not, contain equality. Subquasivariety lattices are represented as lattices of closed algebraic subsets of a lattice with operators, which yields new restrictions on the equaclosure operator. As an application of this new approach, it is shown that completely distributive lattices with a dually compact least element are subquasivariety lattices. The book contains many examples to illustrate these principles, as well as open problems. Ultimately this new approach gives readers a set of tools to investigate classes of lattices that can be represented as subquasivariety lattices.
This book is devoted to Killing vector fields and the one-parameter isometry groups of Riemannian manifolds generated by them. It also provides a detailed introduction to homogeneous geodesics, that is, geodesics that are integral curves of Killing vector fields, presenting both classical and modern results, some very recent, many of which are due to the authors. The main focus is on the class of Riemannian manifolds with homogeneous geodesics and on some of its important subclasses.
To keep the exposition self-contained the book also includes useful general results not only on geodesic orbit manifolds, but also on smooth and Riemannian manifolds, Lie groups and Lie algebras, homogeneous Riemannian manifolds, and compact homogeneous Riemannian spaces.
The intended audience is graduate students and researchers whose work involves differential geometry and transformation groups.
This book presents medical image watermarking techniques and algorithms for telemedicine and other emerging applications, emphasizing watermarking as a means to ensure the authenticity of transmitted medical information. It begins with an introduction to digital watermarking, its important characteristics, novel applications, different watermarking attacks, and standard benchmark tools. It also covers spatial- and transform-domain medical image watermarking techniques, along with their merits and limitations.
The authors have developed improved and novel watermarking techniques for telemedicine applications that offer higher robustness, better perceptual quality, increased embedding capacity, and secure watermarking. The suggested methods may find application in preventing patient identity theft and in health data management, growing concerns in telemedicine.
This book provides a sound platform for understanding the medical image watermarking paradigm for researchers in the field and advanced-level students. Industry professionals working in this field, as well as those in other emerging applications demanding robust and secure watermarking, will find this book useful as a reference.

This book provides developers, engineers, researchers and students with detailed knowledge about the High Efficiency Video Coding (HEVC) standard. HEVC is the successor to the widely successful H.264/AVC video compression standard, and it provides around twice as much compression as H.264/AVC for the same level of quality. The applications for HEVC will not only cover the space of the well-known current uses and capabilities of digital video - they will also include the deployment of new services and the delivery of enhanced video quality, such as ultra-high-definition television (UHDTV) and video with higher dynamic range, wider range of representable color, and greater representation precision than what is typically found today. HEVC is the next major generation of video coding design - a flexible, reliable and robust solution that will support the next decade of video applications and ease the burden of video on world-wide network traffic. This book provides a detailed explanation of the various parts of the standard, insight into how it was developed, and in-depth discussion of algorithms and architectures for its implementation. The book serves the video engineering community by:
- Providing video application developers with an invaluable reference to the latest video standard, High Efficiency Video Coding (HEVC);
- Serving as a companion reference that is complementary to the HEVC standards document produced by the JCT-VC - a joint team of ITU-T VCEG and ISO/IEC MPEG;
- Including in-depth discussion of algorithms and architectures for HEVC by some of the key video experts who have been directly involved in developing and deploying the standard;
- Giving insight into the reasoning behind the development of the HEVC feature set, which will aid in understanding the standard and how to use it.
This second edition provides a thorough introduction to contemporary convex function theory with many new results. A large variety of subjects are covered, from the one real variable case to some of the most advanced topics. The new edition includes considerably more material emphasizing the rich applicability of convex analysis to concrete examples. Chapters 4, 5, and 6 are entirely new, covering important topics such as the Hardy-Littlewood-Pólya-Schur theory of majorization, matrix convexity, and the Legendre-Fenchel-Moreau duality theory.
This book can serve as a reference and source of inspiration to researchers in several branches of mathematics and engineering, and it can also be used as a reference text for graduate courses on convex functions and applications.
This book provides a concise yet rigorous introduction to probability theory. Among the possible approaches to the subject, the most modern approach based on measure theory has been chosen: although it requires a higher degree of mathematical abstraction and sophistication, it is essential to provide the foundations for the study of more advanced topics such as stochastic processes, stochastic differential calculus and statistical inference. The text originated from the teaching experience in probability and applied mathematics courses within the mathematics degree program at the University of Bologna; it is suitable for second- or third-year students in mathematics, physics, or other natural sciences, assuming multidimensional differential and integral calculus as a prerequisite. The four chapters cover the following topics: measures and probability spaces; random variables; sequences of random variables and limit theorems; and expectation and conditional distribution. The text includes a collection of solved exercises.
This is a book on nonlinear dynamical systems and their bifurcations under parameter variation. It provides a reader with a solid basis in dynamical systems theory, as well as explicit procedures for application of general mathematical results to particular problems. Special attention is given to efficient numerical implementations of the developed techniques. Several examples from recent research papers are used as illustrations.
The book is designed for advanced undergraduate or graduate students in applied mathematics, as well as for Ph.D. students and researchers in physics, biology, engineering, and economics who use dynamical systems as model tools in their studies. A moderate mathematical background is assumed, and, whenever possible, only elementary mathematical tools are used.
This new edition preserves the structure of the previous editions, while updating the context to incorporate recent theoretical and software developments and modern techniques for bifurcation analysis.
From reviews of earlier editions:
"I know of no other book that so clearly explains the basic phenomena of bifurcation theory." - Math Reviews
"The book is a fine addition to the dynamical systems literature. It is good to see, in our modern rush to quick publication, that we, as a mathematical community, still have time to bring together, and in such a readable and considered form, the important results on our subject." - Bulletin of the AMS
"It is both a toolkit and a primer" - UK Nonlinear News
"The material is presented in a systematic and very readable form. It covers recent developments in bifurcation theory, with special attention to efficient numerical implementations." - Bulletin of the Belgian Mathematical Society
This book presents the first in-depth academic investigation published in English about one of the most radical incarnations of the current global wave of new right-wing movements and governments: the movement that brought to power the current Brazilian president, Jair Bolsonaro. The rise of this new right-wing movement in Brazil came as a surprise to many analysts who used to see the country as a successful example of the implementation of progressive social policies in the first decade of the 21st century, and posed many questions to those seeking to understand the role Brazil now plays in the development of this international far-right wave. The authors of this volume try to answer some of these questions by presenting the results of an extensive field research conducted over the years with Bolsonaro supporters and members of the new Brazilian right-wing movements. They have analyzed quantitative and especially qualitative data to accompany the accelerated transformations of the Brazilian public sphere, starting from small liberal and conservative groups on social media towards larger audiences via book publishing, the education system, the mainstream media, and the political-party system. By framing the Brazilian case in the wider international political scenario, The Bolsonaro Paradox: The Public Sphere and Right-Wing Counterpublicity in Contemporary Brazil will be an invaluable resource for sociologists, political scientists, international relations scholars and other social scientists - as well as journalists and political analysts - interested in better understanding the role Brazil plays in the global rise of new far-right movements and governments.
This book takes the small rural town of La Calera, in the outskirts of the Colombian capital of Bogotá, as a case study to analyze how residents from different social classes - wealthier ex-urban newcomers arriving in traditionally peasant and rural areas - interact to decide how nature will be used in the face of further urban expansion. Contrary to the conflicts in other gentrification cases, including those of "green" gentrification, this book shows how newcomers and longtimers in La Calera use environmental concerns to bridge social class rifts and push the state to provide water, public space, and decision-making power.
Residents see abundant ecological resources like water and land around them, but they do not have access to aqueducts, green public space, or power over planning decisions affecting the distribution of these resources. In response, and to challenge the state more effectively, newcomers and longtimers create inter-class alliances through what the author calls third nature: the way residents try to both protect and keep using existing ecological goods. Despite high levels of class inequality, residents share the goal of protecting the ecological resources around them, intervening in the physical and political landscapes against a state that induces scarcity by selectively enforcing environmental policies to the detriment of Calerunos.
As cities all around the Global South continue to grow, urban expansion poses a threat to the environment by transforming agricultural and protected areas into denser residential or touristic spaces. Moreover, as natural resources become scarcer in the face of climate change, inequality may deepen existing environmental privileges and vulnerabilities. By examining closely how Calerunos bridge class inequalities for environmental reasons, this case highlights processes that inform other gentrifying rural spaces around the world.
In the last few years, power dissipation has become an important design constraint, on par with performance, in the design of new computer systems. Whereas in the past, the primary job of the computer architect was to translate improvements in operating frequency and transistor count into performance, now power efficiency must be taken into account at every step of the design process. While for some time architects were successful in delivering 40% to 50% annual improvement in processor performance, costs that were previously brushed aside eventually caught up. The most critical of these costs is the inexorable increase in power dissipation and power density in processors. Power dissipation issues have catalyzed new topic areas in computer architecture, resulting in a substantial body of work on more power-efficient architectures. Power dissipation, coupled with diminishing performance gains, was also the main cause for the switch from single-core to multi-core architectures and a slowdown in frequency increase. This book aims to document some of the most important architectural techniques that were invented, proposed, and applied to reduce both dynamic power and static power dissipation in processors and memory hierarchies. A significant number of techniques have been proposed for a wide range of situations and this book synthesizes those techniques by focusing on their common characteristics. Table of Contents: Introduction / Modeling, Simulation, and Measurement / Using Voltage and Frequency Adjustments to Manage Dynamic Power / Optimizing Capacitance and Switching Activity to Reduce Dynamic Power / Managing Static (Leakage) Power / Conclusions
This book adopts a multidimensional approach to analyze both the historical and emerging factors that contribute to make Latin America and the Caribbean the most unequal region in the world. Social inequality is a historical characteristic of the region, but at the beginning of the 21st century, a handful of progressive governments seemed to be adopting policies that could reduce this historical trend. Many of these efforts, however, were blocked or reversed by the COVID-19 pandemic, which both exposed the persistence of historical trends and contributed to the emergence of new forms of inequality in the region. The different chapters in this contributed volume adopt a multidimensional, intersectional perspective to analyze both the persistence and the emergence of social devices of production and reproduction of inequalities in the diverse Latin American and Caribbean temporal spatialities. The issues analyzed in the different chapters revolve around four main axes: a) persistence of generational and intergenerational inequalities; b) structural gender inequality; c) intertwined social inequalities: race, class and social structure; and d) historical and economic dimensions of inequality. Persistence and Emergencies of Inequalities in Latin America: A Multidimensional Approach will be of interest to researchers interested in the study of social inequality and social justice in different fields of the human and social sciences, such as sociology, political science, history, economics, anthropology and education. It will also be a valuable tool for policy makers and social activists engaged in the discussion, advocacy and implementation of public policies aimed at reducing social inequalities.
This book provides computer engineers, academic researchers, new graduate students, and seasoned practitioners an end-to-end overview of virtual memory. We begin with a recap of foundational concepts and discuss not only state-of-the-art virtual memory hardware and software support available today, but also emerging research trends in this space. The span of topics covers processor microarchitecture, memory systems, operating system design, and memory allocation. We show how efficient virtual memory implementations hinge on careful hardware and software cooperation, and we discuss new research directions aimed at addressing emerging problems in this space.
Virtual memory is a classic computer science abstraction and one of the pillars of the computing revolution. It has long enabled hardware flexibility, software portability, and overall better security, to name just a few of its powerful benefits. Nearly all user-level programs today take for granted that they have been freed from the burden of physical memory management by the hardware, the operating system, device drivers, and system libraries.
However, despite its ubiquity in systems ranging from warehouse-scale datacenters to embedded Internet of Things (IoT) devices, the overheads of virtual memory are becoming a critical performance bottleneck today. Virtual memory architectures designed for individual CPUs or even individual cores are in many cases struggling to scale up and scale out to today's systems which now increasingly include exotic hardware accelerators (such as GPUs, FPGAs, or DSPs) and emerging memory technologies (such as non-volatile memory), and which run increasingly intensive workloads (such as virtualized and/or "big data" applications). As such, many of the fundamental abstractions and implementation approaches for virtual memory are being augmented, extended, or entirely rebuilt in order to ensure that virtual memory remains viable and performant in the years to come.
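The address-translation mechanism at the heart of the virtual memory abstraction described above can be illustrated with a minimal sketch. The page table contents, TLB, and addresses below are purely illustrative assumptions, not taken from the book:

```python
PAGE_SIZE = 4096  # 4 KiB pages, a common choice on x86-64 systems

# Hypothetical single-level page table: virtual page number -> physical frame number.
page_table = {0: 7, 1: 3, 2: 9}
tlb = {}  # translation lookaside buffer: a small cache of recent translations

def translate(vaddr):
    """Translate a virtual address to a physical address."""
    vpn, offset = divmod(vaddr, PAGE_SIZE)
    if vpn in tlb:                      # TLB hit: no page-table walk needed
        pfn = tlb[vpn]
    elif vpn in page_table:             # TLB miss: walk the page table
        pfn = page_table[vpn]
        tlb[vpn] = pfn                  # cache the translation for next time
    else:
        raise MemoryError(f"page fault: no mapping for VPN {vpn}")
    return pfn * PAGE_SIZE + offset

print(translate(4100))  # VPN 1, offset 4 -> frame 3 -> 3*4096 + 4 = 12292
```

The TLB is exactly the structure whose reach struggles to scale in the systems the next paragraph describes: every miss costs a page-table walk on the critical path of a memory access.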
This book covers technologies, applications, tools, languages, procedures, advantages, and disadvantages of reconfigurable supercomputing using Field Programmable Gate Arrays (FPGAs). The target audience is the community of users of High Performance Computers (HPC) who may benefit from porting their applications into a reconfigurable environment. As such, this book is intended to guide the HPC user through the many algorithmic considerations, hardware alternatives, usability issues, programming languages, and design tools that need to be understood before embarking on the creation of reconfigurable parallel codes. We hope to show that FPGA acceleration, based on the exploitation of data parallelism, pipelining, and concurrency, remains promising in view of the diminishing improvements in traditional processor and system design. Table of Contents: FPGA Technology / Reconfigurable Supercomputing / Algorithmic Considerations / FPGA Programming Languages / Case Study: Sorting / Alternative Technologies and Concluding Remarks
This synthesis lecture presents the current state-of-the-art in applying low-latency, lossless hardware compression algorithms to cache, memory, and the memory/cache link. There are many non-trivial challenges that must be addressed to make data compression work well in this context. First, since compressed data must be decompressed before it can be accessed, decompression latency ends up on the critical memory access path. This imposes a significant constraint on the choice of compression algorithms. Second, while conventional memory systems store fixed-size entities like data types, cache blocks, and memory pages, these entities will suddenly vary in size in a memory system that employs compression. Dealing with variable-size entities in a memory system using compression has a significant impact on the way caches are organized and how resources in main memory are managed. We systematically discuss solutions in the open literature to these problems. Chapter 2 provides the foundations of data compression by first introducing the fundamental concept of value locality. We then introduce a taxonomy of compression algorithms and show how previously proposed algorithms fit within that logical framework. Chapter 3 discusses the different ways that cache memory systems can employ compression, focusing on the trade-offs between latency, capacity, and complexity of alternative ways to compact compressed cache blocks. Chapter 4 discusses issues in applying data compression to main memory and Chapter 5 covers techniques for compressing data on the cache-to-memory links. This book should help a skilled memory system designer understand the fundamental challenges in applying compression to the memory hierarchy and introduces the state-of-the-art techniques for addressing them.
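As a toy illustration of the value locality the lecture builds on, the sketch below compresses a cache block by exploiting one common pattern, zero-valued words: store a presence bitmask plus only the nonzero words. This is an illustrative simplification, not one of the specific algorithms from the lecture:

```python
def compress(block):
    """Zero-content compression of a cache block (a list of 32-bit words):
    emit a bitmask of nonzero positions plus the nonzero words themselves."""
    mask, nonzero = 0, []
    for i, word in enumerate(block):
        if word != 0:
            mask |= 1 << i
            nonzero.append(word)
    return mask, nonzero

def decompress(mask, nonzero, n_words=16):
    """Reconstruct the original fixed-size block from the compressed form."""
    block, it = [], iter(nonzero)
    for i in range(n_words):
        block.append(next(it) if mask & (1 << i) else 0)
    return block

block = [0, 0, 0xDEADBEEF, 0] * 4            # a sparse 16-word (64-byte) block
mask, nz = compress(block)
assert decompress(mask, nz) == block          # lossless round trip
# Compressed: a 16-bit mask + 4 words, versus 16 words uncompressed.
```

Note that the compressed block's size now depends on its contents, which is exactly the variable-size-entity problem the lecture highlights for cache and memory organization.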
This book provides a structured treatment of the key principles and techniques for enabling efficient processing of deep neural networks (DNNs). DNNs are currently widely used for many artificial intelligence (AI) applications, including computer vision, speech recognition, and robotics. While DNNs deliver state-of-the-art accuracy on many AI tasks, this accuracy comes at the cost of high computational complexity. Therefore, techniques that enable efficient processing of deep neural networks to improve key metrics - such as energy efficiency, throughput, and latency - without sacrificing accuracy or increasing hardware costs are critical to enabling the wide deployment of DNNs in AI systems.
The book includes background on DNN processing; a description and taxonomy of hardware architectural approaches for designing DNN accelerators; key metrics for evaluating and comparing different designs; features of DNN processing that are amenable to hardware/algorithm co-design to improve energy efficiency and throughput; and opportunities for applying new technologies. Readers will find a structured introduction to the field as well as formalization and organization of key concepts from contemporary work that provide insights that may spark new ideas.
Shrinking feature size and diminishing supply voltage are making circuits sensitive to supply voltage fluctuations within the microprocessor, caused by normal workload activity changes. If left unattended, voltage fluctuations can lead to timing violations or even transistor lifetime issues that degrade processor robustness. Mechanisms that learn to tolerate, avoid, and eliminate voltage fluctuations based on program and microarchitectural events can help steer the processor clear of danger, thus enabling tighter voltage margins that improve performance or lower power consumption. We describe the problem of voltage variation and the factors that influence this variation during processor design and operation. We also describe a variety of runtime hardware and software mitigation techniques that either tolerate, avoid, and/or eliminate voltage violations. We hope processor architects will find the information useful since tolerance, avoidance, and elimination are generalizable constructs that can serve as a basis for addressing other reliability challenges as well. Table of Contents: Introduction / Modeling Voltage Variation / Understanding the Characteristics of Voltage Variation / Traditional Solutions and Emerging Solution Forecast / Allowing and Tolerating Voltage Emergencies / Predicting and Avoiding Voltage Emergencies / Eliminating Recurring Voltage Emergencies / Future Directions on Resiliency
This book introduces readers to emerging persistent memory (PM) technologies that promise the performance of dynamic random-access memory (DRAM) with the durability of traditional storage media, such as hard disks and solid-state drives (SSDs). Persistent memories (PMs), such as Intel's Optane DC persistent memories, are commercially available today. Unlike traditional storage devices, PMs can be accessed over a byte-addressable load-store interface with access latency that is comparable to DRAM. Unfortunately, existing hardware and software systems are ill-equipped to fully avail the potential of these byte-addressable memory technologies as they have been designed to access traditional storage media over a block-based interface. Several mechanisms have been explored in the research literature over the past decade to design hardware and software systems that provide high-performance access to PMs. Because PMs are durable, they can retain data across failures, such as power failures and program crashes. Upon a failure, recovery mechanisms may inspect PM data, reconstruct state and resume program execution. Correct recovery of data requires that operations to the PM are properly ordered during normal program execution. Memory persistency models define the order in which memory operations are performed at the PM. Much like memory consistency models, memory persistency models may be relaxed to improve application performance. Several proposals have emerged recently to design memory persistency models for hardware and software systems and for high-level programming languages.
These proposals differ in several key aspects; they relax PM ordering constraints, introduce varying programmability burden, and introduce differing granularity of failure atomicity for PM operations. This primer provides a detailed overview of the various classes of memory persistency models, their implementations in hardware, programming languages and software systems proposed in the recent research literature, and the PM ordering techniques employed by modern processors.
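The ordering requirement that persistency models formalize can be sketched with a toy failure-atomic update. Persistent memory is emulated here with a Python dict, and `persist_barrier` stands in for the cache-line flush and fence a real PM system would issue; all names and the undo-log scheme are illustrative assumptions, not the book's specific proposals:

```python
pm = {"balance": 100}   # emulated persistent memory
log = []                # emulated persistent undo log

def persist_barrier():
    # On real PM hardware this would flush dirty cache lines and fence,
    # forcing earlier PM writes into the persistence domain first.
    pass

def atomic_update(key, value):
    log.append((key, pm[key]))   # 1. record the old value in the undo log
    persist_barrier()            # 2. ordering: log durable BEFORE the update
    pm[key] = value              # 3. update the data in place
    persist_barrier()            # 4. ordering: update durable before commit
    log.pop()                    # 5. commit: discard the undo entry

def recover():
    # After a crash, roll back any update whose undo entry survived.
    while log:
        key, old = log.pop()
        pm[key] = old

atomic_update("balance", 150)
print(pm["balance"])  # 150
```

If the barriers were omitted or reordered, a crash between steps could leave the log and the data mutually inconsistent; that is precisely the class of bugs persistency models exist to rule out.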
The advent of multicore processors has renewed interest in the idea of incorporating transactions into the programming model used to write parallel programs. This approach, known as transactional memory, offers an alternative, and hopefully better, way to coordinate concurrent threads. The ACI (atomicity, consistency, isolation) properties of transactions provide a foundation to ensure that concurrent reads and writes of shared data do not produce inconsistent or incorrect results. At a higher level, a computation wrapped in a transaction executes atomically - either it completes successfully and commits its result in its entirety or it aborts. In addition, isolation ensures the transaction produces the same result as if no other transactions were executing concurrently. Although transactions are not a parallel programming panacea, they shift much of the burden of synchronizing and coordinating parallel computations from a programmer to a compiler, to a language runtime system, or to hardware. The challenge for the system implementers is to build an efficient transactional memory infrastructure. This book presents an overview of the state of the art in the design and implementation of transactional memory systems, as of early spring 2010. Table of Contents: Introduction / Basic Transactions / Building on Basic Transactions / Software Transactional Memory / Hardware-Supported Transactional Memory / Conclusions
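The atomic commit-or-abort behavior described above can be sketched as a toy software transactional memory that buffers writes, records the versions it read, and validates them at commit time, retrying on conflict. This is a minimal optimistic-concurrency illustration under assumed names, not an implementation from the book:

```python
import threading

class STM:
    """Toy software TM: buffer writes, validate read versions at commit,
    and re-execute the transaction on conflict (optimistic concurrency)."""
    def __init__(self):
        self.data, self.version = {}, {}
        self.lock = threading.Lock()

    def atomically(self, txn):
        while True:                      # abort -> retry loop
            reads, writes = {}, {}
            def read(key):
                if key in writes:
                    return writes[key]   # read your own buffered write
                reads[key] = self.version.get(key, 0)  # record version seen
                return self.data.get(key)
            def write(key, value):
                writes[key] = value      # buffered until commit
            result = txn(read, write)
            with self.lock:              # commit point: validate then publish
                if all(self.version.get(k, 0) == v for k, v in reads.items()):
                    for k, v in writes.items():
                        self.data[k] = v
                        self.version[k] = self.version.get(k, 0) + 1
                    return result        # committed in its entirety
            # validation failed: a concurrent commit intervened; retry

stm = STM()
stm.atomically(lambda read, write: write("x", 1))
stm.atomically(lambda read, write: write("x", read("x") + 1))
print(stm.data["x"])  # 2
```

The retry loop is what makes the transaction appear atomic and isolated: either every buffered write is published under the commit lock, or none is and the computation runs again.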
This book presents an original contribution to the study of care and care work by addressing pressing issues in the field from a Latin American and intersectional perspective. The expansion of professional care and its impacts on public policies related to care are global phenomena, but so far the international literature on the subject has focused mainly on the Global North. This volume aims to enrich this literature by presenting the results of research projects conducted in five Latin American countries (Argentina, Brazil, Chile, Colombia, and Uruguay) and comparing them with research conducted in other countries, such as France, Japan, and the USA.
Latin America is a social space where professional care has expanded dramatically over the past twenty years. However, unlike in Japan, the USA, and European countries, this expansion took place in a context of heterogeneous and poorly structured markets, in societies that stand out for their reliance on domestic workers, in both formal and informal arrangements, to provide paid care work in the household.
Care and Care Workers: A Latin American Perspective will be a useful tool for sociologists, anthropologists, social workers, gerontologists, and other social scientists dedicated to the study of the growing demand for care services worldwide, as well as for decision makers dealing with public policies related to care services.
"Society cannot function without the unpaid (and poorly and informally paid) work of caregivers. Having the data - and this book presents this data - allows public policy to be based on realities rather than on the prejudices, habits, or structural injustices of a previous time regarding gender roles, class, ethnicity, race, and migrant status. [...] This volume not only presents the data, then, but also shows how some countries have begun to innovate to provide solutions to the problem that some people are overburdened by care while others do little of it. [...] Scholars and activists in Latin American countries lead the way in showing both how resistance remains and how to innovate. So the rest of the world has much to learn from this volume." - Excerpt from the Foreword by Professor Joan C. Tronto
This book, divided into two volumes, originates from Techno-Societal 2020: the 3rd International Conference on Advanced Technologies for Societal Applications, Maharashtra, India, which brought together faculty members from various engineering colleges to solve regionally relevant Indian problems under the guidance of eminent researchers from various reputed organizations.
The focus of this volume is on technologies that help develop and improve society, in particular on issues such as sensor- and ICT-based technologies for the betterment of people, technologies for agriculture and healthcare, and micro- and nanotechnological applications.
This handbook is a practical guide covering pertinent topics in the specialty of Oral Medicine, which focuses on the diagnosis, prevention, and management of local and systemic conditions affecting the oral and maxillofacial region. Each topic covers a specific disease or disorder with overarching emphasis on diseases presenting with oral mucosal involvement and orofacial pain conditions. Each topic is presented as a summary of key clinical information for quick reference prepared by leading experts in the field. This clinical guide will serve as a valuable resource for students and clinicians in practice.
This book aims to fill a gap in research on women's political representation by developing a multidimensional assessment of female participation in subnational legislatures in a federal political system like Mexico's. The Mexican experience of women's political representation at the federal and subnational levels has been very successful, as reforms created an increasingly robust "gender electoral regime" that promoted a rise in the number of elected female legislators (1987-2021). Still, little is known about the impact of the growth in women's presence in Congresses on other dimensions of political representation, such as the symbolic or the substantive. Although previous studies on women's political representation in Mexico have yielded compelling conclusions based on empirical evidence and have strengthened a theory focused on the analysis of presence, this work is still insufficient to explain the other dimensions of representation and the relationships between them. Therefore, this book contributes to comparative scholarship from the perspective of feminist neo-institutionalism, expanding the understanding of the relationship between women's formal and descriptive representation, the content of legislative work in terms of preferences and interests (substantive representation), and its symbolic effects on women and politics in general (symbolic representation). Women in Mexican Subnational Legislatures: From Descriptive to Substantive Representation will be of interest to political scientists, sociologists, and jurists interested in gender and politics. The book fills a theoretical and empirical gap on the effects of gender parity in the programmatic and symbolic scope of power building.
The findings on good practices and challenges are discussed within a broader body of comparative research, providing knowledge to academia, policymakers, and international cooperation agencies about the remaining obstacles to strengthening Latin American democracies and the need to continue exploring the links between subnational politics and democratization of federal political systems.
This book frames a series of protests that occurred in Brazil from 2013 to 2016 as exemplary cases of global trends in contentious politics in order to analyze the tension between two forms of collective action: the militant (militante) and the prefigurative activist (ativista). Building on sociology, political science, and psychology, it explores the relationship between protestors' activities and conceptions of political participation and their subjectivity and agency. The protest cycle triggered by the June 2013 events in Brazil gave strength and popularity to uncommon and innovative repertoires and strategies of collective action. Those praxes defied political parties' conventions, highlighted the limitations of the militant unionist tradition, and brought prefigurative activism onto the Brazilian left-wing agenda. In this book, Andre Luis Leite de Figueirêdo Sales combines theoretical tools and traditions from South and North America to build an interdisciplinary approach to political psychology and answer the question: what psycho-political differences lie behind the disparate forms of political action adopted by militantes (militants) and ativistas (prefigurative activists) in Brazil? Inspired by books of short stories, the chapters discuss different aspects of the distinction between militancy and prefigurative activism. In them, the author deals with problems such as: How are the ongoing changes in Brazilian protest culture connected with the rising popularity of autonomist movements across the globe? What difference does it make to root protest strategies in principles like resistance or refusal? How does the culture informing militants' and prefigurative activists' conduct affect their political goals and horizons? How does militant and prefigurative activist culture relate to their forms of political consciousness?
A Political Psychology Approach to Militancy and Prefigurative Activism: The Case of Brazil will be a valuable tool for social movement researchers from different disciplines interested in understanding how subjectivity can be, at the same time, a determinant of the activities performed in collective action and determined by those same transformative deeds.
This book presents the major advances and technological updates in diagnostic ultrasound procedures, focusing on the principal technological aspects and multiple exam procedures for the pertinent anatomy, both under basal conditions and using Doppler techniques. It offers a comprehensive and precise evaluation of the ultrasound semiotics of the musculoskeletal apparatus, with descriptions of numerous rheumatic and orthopedic disease patterns. It also discusses in detail the vital role of ultrasound in monitoring chronic inflammatory joint disease during therapy, and brand-new highly sensitive Doppler techniques. In view of the tremendous impact of ultrasound-guided interventional procedures on the management of drug delivery in a musculoskeletal setting, the book also includes a chapter on the practical aspects of performing US-guided diagnostic and therapeutic procedures. Providing outstanding diagrams, dynamic images and videos to guide readers, it is a valuable resource for radiologists and clinicians (rheumatologists, orthopedists, physiatrists and anesthesiologists) with different levels of experience, ranging from physicians in training to those who already perform US examinations and US-guided procedures.
This is the first book in English to present a comprehensive analysis of the October 2019 social outbreak in Chile and its consequences for the country's political system. For almost 30 years (1990-2019), Chile was recognized as a model of political and economic stability in Latin America, but the 2019 protests put into question the whole structure of representation based on programmatic political parties. This contributed volume analyzes the causes of the social outbreak by examining the interaction between political parties and social movements in Chile since 2000, establishing bridges between the sociology of social movements and the political science of parties and traditional forms of political representation. The book is organized in three parts. The first part analyzes the collapse of the political party system in Chile. The second part shows how social movements introduced innovative forms of political mobilization that challenged the traditional forms of political representation. Finally, the third part presents case studies focusing on specific social movements and their contributions to the renewal of political representation in Chile. The Social Outburst and Political Representation in Chile will be a valuable resource for sociologists, political scientists and other social scientists interested in understanding the challenges posed to political parties and institutions by social movements formed by citizens who no longer see themselves represented by the traditional forms of political participation.
This book collects contributions which showcase the impact of new augmented reality (AR) and artificial intelligence (AI) technologies considered jointly in the fields of cultural heritage and innovative design. AR offers an alternative path of analysis and communication when applied to several fields of research, in particular those related to space and the artifacts in it. This happens because neural network developments strengthen the relationship between augmented reality and artificial intelligence, creating processes close to human thought in shorter times. In recent years, the expansion of AR/AI and its future scenarios have prompted deep trans-disciplinary reflection. The disciplines of representation (drawing, surveying, visual communication), as a place of convergence for multidisciplinary theoretical and applied studies related to architecture, the city, the environment, and tangible and intangible cultural heritage, are called to contribute to the international debate. The book chapters deal with augmented reality and artificial intelligence, analyzing their connections as research tools for knowing the environment. In particular, the topics focus on the intersection between the real and virtual worlds and on the heuristic role of drawing in the enhancement and management of cultural heritage and in planning and monitoring architecture, the environment, and infrastructures. Scientists involved in AR and AI research applied separately or together in the fields of cultural heritage, architectural design, urban planning, and infrastructure analysis, as well as members of public and private organizations, make up the interdisciplinary groups that fuel the discussion on the priorities and aims of research related to the disciplines of representation.
The updated 2nd edition of Healthcare Management Engineering In Action in the Business Guides on the Go series provides a comprehensive exploration of healthcare management operations. Through a systematic comparison of predictive and analytic decision-making methodologies with traditional management approaches, the book employs case studies derived from real-world hospital and clinic scenarios. It addresses a spectrum of problems encompassing patient flow, capacity management, resource allocation, staffing and scheduling, statistical data analytics, and cost distribution among cooperating providers.
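As a flavor of the analytic methods such patient-flow and staffing case studies rely on, the Erlang C formula from queueing theory estimates how often an arriving patient must wait given Poisson arrivals and a fixed number of servers. The sketch and its numbers below are illustrative, not taken from the book.

```python
from math import factorial

def erlang_c(arrival_rate, service_rate, servers):
    """P(an arrival must wait) in an M/M/c queue; requires load < capacity."""
    a = arrival_rate / service_rate          # offered load in Erlangs
    if a >= servers:
        raise ValueError("unstable system: load must be below capacity")
    top = a**servers / factorial(servers) * servers / (servers - a)
    bottom = sum(a**k / factorial(k) for k in range(servers)) + top
    return top / bottom

def mean_wait(arrival_rate, service_rate, servers):
    """Average wait in queue, in the same time unit as the rates."""
    pw = erlang_c(arrival_rate, service_rate, servers)
    return pw / (servers * service_rate - arrival_rate)

# Hypothetical clinic: 10 patients/hour, 15-minute visits (rate 4/hour),
# 3 clinicians on duty.
print(round(erlang_c(10, 4, 3), 3))               # P(wait) ~ 0.702
print(round(60 * mean_wait(10, 4, 3), 1), "min")  # ~ 21.1 min average wait
```

Re-running with `servers=4` shows how one extra clinician collapses the average wait, which is the kind of what-if analysis the book's capacity-management chapters formalize.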
Readers will particularly benefit from the self-paced online video-based learning modules provided with the innovative Book+Course format.
The book provides a comprehensive description of the basic ultrasound principles, normal anatomy of the lower limb muscles and classification of muscle strain injuries. Ultrasound images are coupled with anatomical schemes explaining probe positioning and scanning technique for the various muscles of the thigh and leg. For each muscle, a brief explanation of normal anatomy is also provided, together with a list of tricks and tips and advice on how to perform the ultrasound scan in clinical practice. This book is an excellent practical teaching guide for beginners and a useful reference for more experienced sonographers.
This book presents the first in-depth academic investigation published in English about one of the most radical incarnations of the current global wave of new right-wing movements and governments: the movement that brought to power the current Brazilian president, Jair Bolsonaro. The rise of this new right-wing movement in Brazil came as a surprise to many analysts who used to see the country as a successful example of the implementation of progressive social policies in the first decade of the 21st century, and posed many questions to those seeking to understand the role Brazil now plays in the development of this international far-right wave.
The authors of this volume try to answer some of these questions by presenting the results of extensive field research conducted over the years with Bolsonaro supporters and members of the new Brazilian right-wing movements. They have analyzed quantitative and especially qualitative data to follow the accelerated transformations of the Brazilian public sphere, starting from small liberal and conservative groups on social media and moving toward larger audiences via book publishing, the education system, the mainstream media, and the political-party system. By framing the Brazilian case in the wider international political scenario, The Bolsonaro Paradox: The Public Sphere and Right-Wing Counterpublicity in Contemporary Brazil will be an invaluable resource for sociologists, political scientists, international relations scholars and other social scientists - as well as journalists and political analysts - interested in better understanding the role Brazil plays in the global rise of new far-right movements and governments.

Hardware acceleration in the form of customized datapath and control circuitry tuned to specific applications has gained popularity for its promise to utilize transistors more efficiently. Historically, the computer architecture community has focused on general-purpose processors, and extensive research infrastructure has been developed to support research efforts in this domain. Envisioning future computing systems with a diverse set of general-purpose cores and accelerators, computer architects must add accelerator-related research infrastructures to their toolboxes to explore future heterogeneous systems. This book serves as a primer for the field, as an overview of the vast literature on accelerator architectures and their design flows, and as a resource guidebook for researchers working in related areas.
This book offers a comprehensive yet straightforward, practical handbook on ultrasound (US)-guided nerve blocks. It presents the normal US anatomy of peripheral nerves, clinical aspects of nerve entrapment, and the different procedures and techniques for each block. Axial or peripheral chronic radicular pain can be particularly severe and debilitating for the patient. The aim of treatment is to provide medium- to long-term pain relief and consequently to restore function. The therapeutic nerve block, performed with a perineural injection of anaesthetic, steroid or painkiller, is generally used once conservative treatments have proven unsuccessful and aims to avoid surgical options.
Ultrasound guidance, offering direct and real-time visualization of the needle and adjacent relevant anatomic structures, significantly increases the accuracy and safety of nerve blocks, reducing the risk of intraneural or intravascular injection and potential damage to the surrounding structures; it also enhances the efficacy of the block itself, reducing onset time and drug doses.
This practical volume addresses the needs of physicians dealing with pain management, e.g. anaesthesiologists, radiologists, orthopaedists and physiatrists, with various levels of experience, ranging from physicians in training to those who already perform peripheral nerve blocks with traditional techniques and want to familiarize themselves with US-guided procedures.
"The book is well written, concise yet comprehensive and will prove invaluable for anaesthetists, radiologists and pain physicians who are utilising or starting to perform peripheral nerve blocks under ultrasound guidance. Highly recommended." (A D Taylor, RAD Magazine, August, 2019)
"The purpose is to provide a better understanding of ultrasound physics, sonoanatomy of nerves, and injection techniques. [...] Pain physicians as well as primary care physicians, ER physicians, and orthopedic physicians will find it very helpful. [...] There are plenty of illustrations and ultrasound images of anatomy, the text is simple and to the point, and scanning techniques are clearly described." (Tariq M. Malik, Doody's Book Reviews, September, 2018)
This material is a tutorial for electrical and computer engineers on the topic of transient signals on transmission lines. Emphasis has been placed on aspects of the subject that apply to signal integrity and high-speed digital circuit design, including proper termination schemes to avoid impedance discontinuities, reactive and nonlinear loads, and an introduction to crosstalk.
The coverage focuses on the very important topic of transmission line transients, which has been de-emphasized in most current textbooks. This book is intended to supplement traditional texts for advanced students studying electromagnetics and for the vast array of practicing electrical engineers, computer engineers, and materials scientists with interests in signal integrity and high-speed digital design.
In this second edition, examples and new problems have been added throughout. A new chapter on differential transmission lines has also been incorporated.
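A worked transient of the kind such a tutorial covers: a step is launched onto a mismatched line by voltage division between the source and characteristic impedances, then bounces between the source and load reflection coefficients (a bounce, or lattice, diagram). The impedance values below are illustrative.

```python
# Bounce-diagram sketch: load-end voltage of a step on a mismatched line.

def gamma(z_term, z0):
    """Voltage reflection coefficient at a termination."""
    return (z_term - z0) / (z_term + z0)

def line_voltage_steps(vs, zs, z0, zl, bounces):
    """Load-end voltage after each round trip of the traveling wave."""
    gs, gl = gamma(zs, z0), gamma(zl, z0)
    wave = vs * z0 / (zs + z0)   # initial launched wave, by voltage division
    v_load, history = 0.0, []
    for _ in range(bounces):
        v_load += wave * (1 + gl)  # incident plus reflected wave at the load
        wave *= gl * gs            # one round trip: load then source bounce
        history.append(v_load)
    return history

# Illustrative values: 1 V step, 25-ohm source, 50-ohm line, 150-ohm load.
steps = line_voltage_steps(vs=1.0, zs=25, z0=50, zl=150, bounces=8)
print([round(v, 4) for v in steps])
# The sequence settles toward the DC value vs * zl/(zs + zl) = 150/175.
```

The overshoot on the first arrival (the load sees 1.0 V, above the 0.857 V steady state) is exactly the ringing behavior that proper termination schemes are designed to suppress.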
The first section of the book aims to summarize the debate over where the song comes from. It discusses undeveloped topics, methodological hints, and epistemological questions in the different areas of contemporary psychological science. The second section presents concrete examples of case studies and methodological issues (the new melodies in psychological research) to stimulate further exploration. The book aims to bring art back into psychology and to provide an understanding of the art of psychology.
An Old Melody in a New Song will be of interest to advanced students and researchers in the fields of educational and developmental psychology, cultural psychology, history of ideas, aesthetics, and art-based research.
This book contains selected papers presented at the Second International Conference on Progress in Digital and Physical Manufacturing (ProDPM'21), organized by the School of Technology and Management (ESTG) of the Polytechnic Institute of Leiria (IPL) from the 27th to the 29th of October 2021. It represents a significant contribution to current advances in digital and physical manufacturing, as it contains topical research in this field.
The book is essential reading for all those working on digital and physical manufacturing, promoting better links between academia and industry. The conference papers cover a wide range of important topics, including biomanufacturing, advanced rapid prototyping technologies, rapid tooling and manufacturing, micro-fabrication, 3D CAD and data acquisition, and collaborative design.
This volume collects cutting-edge expert reviews in the oxytocin field and will be of interest to a broad scientific audience, ranging from social neuroscience to clinical psychiatry. The role of the neuropeptide oxytocin in social behaviors is one of the earliest and most significant discoveries in social neuroscience. Influential studies in animal models have delineated many of the neural circuits and genetic components that underlie these behaviors. These discoveries have inspired researchers to investigate the effects of oxytocin on brain and behavior in humans and its potential relevance as a treatment for psychiatric disorders, including borderline personality disorder and autism and schizophrenia spectrum disorders. In fact, there is no established social psychopharmacology in psychiatry, and oxytocin can be seen as the first endogenous agent specifically addressing social-cognitive impairment in psychiatric disorders, with animal research suggesting that it could be especially effective in the early postnatal period. From a human perspective, it is crucial to understand more precisely who can benefit from potential oxytocin-related treatments, which outcome measures will best represent their effects, how they should be administered, and what brain mechanisms are likely involved in mediating their effects. This type of "precision medicine" approach is in line with the research domain criteria defined by the U.S. National Institute of Mental Health.
Technology is essential to the delivery of health care but it is still only a tool that needs to be deployed wisely to ensure beneficial outcomes at reasonable costs. Among various categories of health technology, medical equipment has the unique distinction of requiring both high initial investments and costly maintenance during its entire useful life. This characteristic does not, however, imply that medical equipment is more costly than other categories, provided that it is managed properly. The foundation of a sound technology management process is the planning and acquisition of equipment, collectively called technology incorporation. This lecture presents a rational, strategic process for technology incorporation based on experience, some successful and many unsuccessful, accumulated in industrialized and developing countries over the last three decades. The planning step is focused on establishing a Technology Incorporation Plan (TIP) using data collected from an audit of existing technology, evaluating needs, impacts, costs, and benefits, and consolidating the information collected for decision making. The acquisition step implements TIP by selecting equipment based on technical, regulatory, financial, and supplier considerations, and procuring it using one of the multiple forms of purchasing or agreements with suppliers. This incorporation process is generic enough to be used, with suitable adaptations, for a wide variety of health organizations with different sizes and acuity levels, ranging from health clinics to community hospitals to major teaching hospitals and even to entire health systems. Such a broadly applicable process is possible because it is based on a conceptual framework composed of in-depth analysis of the basic principles that govern each stage of technology lifecycle. 
Using this incorporation process, successful TIPs have been created and implemented, thereby contributing to the improvement of healthcare services and limiting the associated expenses. Table of Contents: Introduction / Conceptual Framework / The Incorporation Process / Discussion / Conclusions
This book presents a range of fundamentally new approaches to solving problems involving traditional molecular models. Fundamental molecular symmetry is shown to open new avenues for describing molecular dynamics beyond standard perturbation techniques. Traditional concepts used to describe molecular dynamics are based on a few fundamental assumptions, the ball-and-stick picture of molecular structure and the respective perturbative treatment of different kinds of couplings between otherwise separate motions. The book points out the conceptual limits of these models and, by focusing on the most essential idea of theoretical physics, namely symmetry, shows how to overcome those limits by introducing fundamentally new concepts.
Sleeve gastrectomy (SG) is the most common bariatric procedure, accounting for more than 55% of all such surgeries performed worldwide.
Obesity has become a major global problem that continues to spread in both developed and developing countries. While prevention of obesity is the best approach for the future, the current challenge is managing those who are already obese or morbidly obese, who constitute close to two thirds of the population in many countries, such as the US. Today, bariatric surgery is the only evidence-based treatment for morbid obesity with a low complication rate and acceptable results in the long-term for both weight loss and resolution of comorbidities.
This book details all the approaches used in sleeve gastrectomy (SG), offering readers the tools needed to perform the perfect SG. Each chapter focuses on the clinical problems and the indications for the sleeve, and describes the technique step-by-step (including videos), the staplers, the different sizes of bougies, reinforcement of the sutures after the sleeve and the metabolic effects of surgery.
The book also presents nontraditional SG techniques, such as the endoscopic approach, stapled sleeve and robotic technologies, discussing how to immediately identify complications and their treatment using endoscopy, laparoscopy and percutaneous image guided surgery.
Further, it includes a chapter on revision surgery and revision procedures, not only from sleeve to other procedures but also from other procedures to SG. The last section offers an overview of what the authors imagine the future holds for this bariatric procedure. The Perfect Sleeve Gastrectomy - A Clinical Guide to Evaluation, Treatment, and Techniques is an ideal reference resource for general surgeons, bariatric surgeons, endoscopists and gastroenterologists, as well as researchers with an interest in obesity and its management. It also appeals to residents and fellows, dietitians, diabetes specialists, psychotherapists, hospital administrators, and quality officers.
Considerable progress has been made in recent years in the development of dialogue systems that support robust and efficient human-machine interaction using spoken language. Spoken dialogue technology allows various interactive applications to be built and used for practical purposes, and research focuses on issues that aim to increase the system's communicative competence by including aspects of error correction, cooperation, multimodality, and adaptation in context. This book gives a comprehensive view of state-of-the-art techniques that are used to build spoken dialogue systems. It provides an overview of the basic issues such as system architectures, various dialogue management methods, system evaluation, and also surveys advanced topics concerning extensions of the basic model to more conversational setups. The goal of the book is to provide an introduction to the methods, problems, and solutions that are used in dialogue system development and evaluation. It presents dialogue modelling and system development issues relevant in both academic and industrial environments and also discusses requirements and challenges for advanced interaction management and future research. Table of Contents: Preface / Introduction to Spoken Dialogue Systems / Dialogue Management / Error Handling / Case Studies: Advanced Approaches to Dialogue Management / Advanced Issues / Methodologies and Practices of Evaluation / Future Directions / References / Author Biographies
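One baseline architecture such a survey covers, the finite-state dialogue manager, can be sketched in a few lines. The states, prompts, and slot names below are invented for illustration; real systems layer error handling, statistical dialogue management, and speech components on top of this skeleton.

```python
# Minimal finite-state dialogue manager for a toy flight-booking task.

class FSMDialogueManager:
    PROMPTS = {
        "ask_city": "Where would you like to fly?",
        "ask_date": "What day do you want to travel?",
        "confirm":  "Book a flight to {city} on {date}?",
        "done":     "Your booking is confirmed.",
    }

    def __init__(self):
        self.state = "ask_city"
        self.slots = {}

    def prompt(self):
        return self.PROMPTS[self.state].format(**self.slots)

    def handle(self, user_input):
        # Transition table: fill a slot, then move to the next state.
        if self.state == "ask_city":
            self.slots["city"] = user_input
            self.state = "ask_date"
        elif self.state == "ask_date":
            self.slots["date"] = user_input
            self.state = "confirm"
        elif self.state == "confirm":
            # Crude error handling: anything but "yes" restarts the form.
            if user_input.strip().lower() == "yes":
                self.state = "done"
            else:
                self.state, self.slots = "ask_city", {}
        return self.prompt()

dm = FSMDialogueManager()
print(dm.prompt())            # Where would you like to fly?
print(dm.handle("Helsinki"))  # What day do you want to travel?
print(dm.handle("Tuesday"))   # Book a flight to Helsinki on Tuesday?
print(dm.handle("yes"))       # Your booking is confirmed.
```

The rigidity of this design (one fixed question order, restart-on-error) is precisely what motivates the frame-based and statistical dialogue management methods the book surveys.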