
Books published by now publishers Inc

  • by Warren B. Powell
    1 336,-

  • by Chen-Ching Liu
    856,-

    Provides the basic concepts of cyber vulnerabilities of distribution systems and cyber-physical system security. Important ICT subjects for distribution systems covered include Supervisory Control and Data Acquisition (SCADA) and Distributed Energy Resources (DERs), such as renewable energy and smart meters.

  • by Ozlem Tugfe Demir
    1 366,-

    Considers the cell-free network architecture that is designed to reach the goal of uniformly high data rates everywhere. The authors introduce the concept of a cell-free network before laying out the foundations of what is required to design and build such a network.

  • by Pooya Hatami
    1 270,-

    In this comprehensive survey of unconditional pseudorandom generators (PRGs), the authors present the reader with an intuitive introduction to some of the most important frameworks and techniques for constructing unconditional PRGs for restricted models of computation. The authors discuss four major paradigms for designing PRGs: (1) PRGs based on k-wise uniform generators, small-bias generators, and simple combinations thereof; (2) PRGs based on "recycling" random bits to take advantage of communication bottlenecks; (3) connections between PRGs and computational hardness; and (4) PRG frameworks based on random restrictions. The authors explain how to use these paradigms to construct PRGs that work unconditionally, with no unproven mathematical assumptions. The PRG constructions use ingredients such as finite field arithmetic, expander graphs, and randomness extractors. The analyses use techniques such as Fourier analysis, sandwiching approximators, and simplification-under-restrictions lemmas. Paradigms for Unconditional Pseudorandom Generators offers the reader a grounding in an important topic widely used in theoretical computer science and cryptography.
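As an illustrative sketch of the k-wise uniformity paradigm mentioned in the blurb (this example is not taken from the book), a pairwise-uniform generator can be built from a random degree-1 polynomial over a prime field: a short seed (a, b) expands to n symbols, any two of which are jointly uniform.

```python
# Pairwise-independent (2-wise uniform) generator: seed (a, b) in Z_p^2
# expands to n outputs v_i = (a*i + b) mod p. For distinct positions
# i != j (with i - j nonzero mod p), the map (a, b) -> (v_i, v_j) is a
# bijection on Z_p^2, so the pair (v_i, v_j) is exactly uniform.
from collections import Counter
from itertools import product

def pairwise_generator(a: int, b: int, p: int, n: int) -> list[int]:
    """Expand the seed (a, b) into n pairwise-uniform symbols mod p."""
    return [(a * i + b) % p for i in range(n)]

def check_pairwise_uniform(p: int, n: int, i: int, j: int) -> bool:
    """Enumerate all p^2 seeds and verify positions i, j are jointly uniform."""
    counts = Counter()
    for a, b in product(range(p), repeat=2):
        v = pairwise_generator(a, b, p, n)
        counts[(v[i], v[j])] += 1
    # Each of the p^2 possible pairs must occur exactly once.
    return all(counts[pair] == 1 for pair in product(range(p), repeat=2))

if __name__ == "__main__":
    print(check_pairwise_uniform(p=7, n=5, i=1, j=4))  # True
```

The seed is 2 log p bits while the output is n log p bits, which is the stretch that makes such generators useful.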

  • by David Jacob Kedziora
    1 286,-

    Over the last decade, the long-running endeavour to automate high-level processes in machine learning (ML) has risen to mainstream prominence. Beyond this, an even loftier goal is the pursuit of autonomy, which describes the capability of the system to independently adjust an ML solution over a lifetime of changing contexts. This monograph provides an expansive perspective on what constitutes an automated/autonomous ML system. In doing so, the authors survey developments in hyperparameter optimisation, multicomponent models, neural architecture search, automated feature engineering, meta-learning, multi-level ensembling, dynamic adaptation, multi-objective evaluation, resource constraints, flexible user involvement, and the principles of generalisation. Furthermore, they develop a conceptual framework throughout to illustrate one possible way of fusing high-level mechanisms into an autonomous ML system. This monograph lays the groundwork for students and researchers to understand the factors limiting architectural integration, without which the field of automated ML risks stifling both its technical advantages and general uptake.

  • by Tom Engsted
    800,-

    Non-Experimental Data, Hypothesis Testing, and the Likelihood Principle: A Social Science Perspective argues that frequentist hypothesis testing - the dominant statistical evaluation paradigm in empirical research - is fundamentally unsuited for analysis of the non-experimental data prevalent in economics and other social sciences. Frequentist tests comprise incompatible repeated sampling frameworks that do not obey the Likelihood Principle (LP). For probabilistic inference, methods that are guided by the LP, that do not rely on repeated sampling, and that focus on model comparison instead of testing (e.g., subjectivist Bayesian methods) are better suited for passively observed social science data and are better able to accommodate the huge model uncertainty and highly approximative nature of structural models in the social sciences. In addition to formal probabilistic inference, informal model evaluation along relevant substantive and practical dimensions should play a leading role. The authors sketch the ideas of an alternative paradigm containing these elements.

  • by Graziano Chesi
    1 270,-

    The study of uncertain systems has played a significant role throughout the history of control engineering due to unknown quantities often being present in the mathematical model of a plant. In this monograph the author provides a unified framework for the fundamental and challenging area of robustness analysis of uncertain systems, where even the most basic problem, establishing robust stability, may still be unsolved. This framework uses linear matrix inequalities (LMIs) to exploit polynomials that can be expressed as sums of squares of polynomials (SOS). The author guides the reader through the motivations for using the framework, including considering various types of uncertainties; providing guarantees for robust stability and robust performance; requiring the solution of convex optimization problems; allowing for a trade-off between conservatism and complexity; and concluding with a number of special-case methods. This monograph can be used by researchers and students to understand the issues and use the numerical examples to identify the use of the framework in modern control systems.

  • by Shao-Lun Huang
    1 270,-

    In many contemporary and emerging applications of machine learning and statistical inference, the phenomena of interest are characterized by variables defined over large alphabets. This increasing size of both the data and the number of inferences, combined with the limited available training data, means there is a need to understand which inference tasks can be most effectively carried out, and, in turn, what features of the data are most relevant to them. In this monograph, the authors develop the idea of extracting "universally good" features, and establish that diverse notions of such universality lead to precisely the same features. The information-theoretic approach used results in a local information geometric analysis that facilitates their computation in a host of applications. The authors provide a comprehensive treatment that guides the reader through the basic principles to the advanced techniques, including many new results. They emphasize a development from first principles together with common, unifying terminology and notation, and pointers to the rich embodying literature, both historical and contemporary. Written for students and researchers, this monograph is a complete treatise on the information-theoretic treatment of a recognized and current problem in machine learning and statistical inference.

  • by Konstantinos Loupos
    1 610,-

    Infrastructures are ageing and have become a subject of profound concern, awakening our collective consciousness to the imperatives of civil and structural integrity. This concern, compounded by the unrelenting impact of climate change and the burgeoning demand on transport networks and infrastructures, underscores the critical necessity of preserving infrastructural functionality, safety, and alignment with their original design objectives. In doing so, we strive to mitigate health, financial, societal, and environmental risks that these infrastructures may pose. Presently, the imperative is to maintain a vigilant and continuous watch over these critical assets, employing factual, real-time data to support efficient maintenance strategies. This book embarks on a comprehensive exploration of the contemporary industrial challenges surrounding critical infrastructure inspection and maintenance. It offers a detailed examination of current inspection methodologies, manual intervention practices, and the intricate challenges involved in the sustained, systematic evaluation of structural integrity. Subsequently, it delivers an exhaustive analysis and technical elucidation of recent research breakthroughs across various infrastructures and industrial domains. These innovations, deeply rooted in robotics, automation, and digital technologies, augment, support, streamline, and often redefine established paradigms in the realms of inspection and maintenance. Among these innovations are the outcomes of research endeavours co-funded by the European Commission, addressing a spectrum of research priorities and thematic areas. The solutions delineated in this book encompass an array of cutting-edge technologies, from robotic ground vehicles to unmanned aerial systems, digital platforms, advanced sensing solutions, and state-of-the-art visualization and 3D imaging technologies. These technologies foster enhanced data acquisition, precise reporting, and ultimately, more efficient inspection and maintenance approaches. As we navigate through the years ahead, the inevitability of ageing infrastructures looms large, while the tangible value of modern technologies is already manifesting in numerous industrial applications. Though the journey may be arduous, the assimilation of digital technologies into existing infrastructure inspection and maintenance frameworks is gaining traction and acceptance, aligning with the overarching vision articulated within this book.

  • by Drago Plečko
    1 270,-

    The recent surge of interest in AI systems has raised concerns in moral quarters about their ethical use and whether they can demonstrate fair decision-making processes. Issues of unfairness and discrimination are pervasive when decisions are being made by humans, and are potentially amplified when decisions are made using machines with little transparency, accountability, and fairness. In this monograph, the authors introduce a framework for causal fairness analysis to understand, model, and possibly solve issues of fairness in AI decision-making settings. The authors link the quantification of the disparities present in the observed data with the underlying, often unobserved, collection of causal mechanisms that generate the disparity in the first place, a challenge they call the Fundamental Problem of Causal Fairness Analysis (FPCFA). In order to solve the FPCFA, they study the mapping of variations and empirical measures of fairness to structural mechanisms and different units of the population, culminating in the Fairness Map. This monograph presents the first systematic attempt to organize and explain the relationship between various criteria in fairness and studies which causal assumptions are needed for performing causal fairness analysis. The resulting Fairness Cookbook allows anyone to assess the existence of disparate impact and disparate treatment. It is a timely and important introduction to developing future AI systems incorporating inherent fairness, and as such will be of wide interest not only to AI system designers but to all who are interested in the wider impact AI will have on society.

  • by Joris van de Klundert
    1 610,-

    Human operations, whether in business, at home, or otherwise, cause a transgression of the boundaries of a safe and just operating space for planet Earth and humankind. Developments in operations that have steadily grown over the long course of history, and which have especially gained momentum since the uptake of fossil fuel powered machines in the recent and on-going industrial revolutions, now threaten to cause irreversible damage to ecosystems and society. The present situation calls for new perspectives and understanding of operations and operations management that make it possible to change the course of development and for operations to provide sustainable solutions for the planet and humankind. In pursuit of sustainable operations, this book analyses the past, present, and future of operations. It first examines the history of operations while explicitly reflecting on its environmental and social sustainability and on the corresponding development of operations management practices. Chapters 2 and 3 provide corresponding theoretical foundations and start studying the operations on planet Earth prior to the appearance of humankind. Chapters 4 to 9 cover the history of human operations until now, from stone tool manufacturing to lights-out manufacturing, and from clay tokens to service robots. Chapter 10 synthesizes extant unsustainable operations and operations management practices and the present 4th industrial revolution. Chapter 11 identifies the transition in operations and operations management needed towards a future safe and just operating space, and how the 4th industrial revolution can contribute to this transition. The book is written first and foremost for all practitioners, scientists, and students of operations management and operations research. It offers extensive and historical insight into the relationship between operations and sustainability that has not yet appeared in the operations management literature.
The final two chapters help the operations community to understand current problems, to find directions towards sustainable operations, and to contribute to the necessary transition. The book may also serve as a valuable resource for policy makers, business strategists, technology managers, and others devoting themselves to creating a sustainable future, as it may build the necessary understanding of the present operations that cause transgressions of the safe and just operating space, and of the transition towards sustainable operations.

  • by Emil Björnson
    1 850,-

    Wireless communication is the backbone of the digitized society, where everything is connected and intelligent. Access points and devices are nowadays equipped with multiple antennas to achieve higher data rates, better reliability, and support more users than in the past. This book gives a gentle introduction to multiple antenna communications with a focus on system modeling, channel capacity theory, algorithms, and practical implications. The basics of wireless localization, radar sensing, and controllable reflection through reconfigurable surfaces are also covered. The goal is to provide the reader with a solid understanding of this transformative technology that changes how wireless networks are designed and operated, today and in the future. The first three chapters cover the fundamentals of wireless channels, and the main benefits of using multiple antennas are identified: beamforming, diversity, and spatial multiplexing. The theory and signal processing algorithms for multiple-input multiple-output (MIMO) communications with antenna arrays at the transmitter and receiver are progressively developed. The next two chapters utilize these results to study point-to-point MIMO channels under line-of-sight (LOS) and non-LOS conditions, covering the shape of signal beams, impact of array geometry, polarization, and ways to achieve reliable communication over fading channels. The book then shifts focus to multi-user MIMO channels, where interference between devices is managed by spatial processing. The next chapter extends the theory to multicarrier channels and explains practical digital, analog, and hybrid hardware implementations. The last two chapters cover the role of multiple antennas in localization and sensing, and how reconfigurable surfaces can improve both communication and sensing systems. The text was developed as the textbook for a university course and builds on the reader's previous knowledge of signals and systems, linear algebra, probability theory, and digital communications. Each chapter contains numerous examples, exercises, and simulation results that can be reproduced using accompanying code. The accompanying code and material are available at https://github.com/emilbjornson/mimobook
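As a toy illustration of the beamforming benefit named in the blurb (a minimal sketch under simplified i.i.d. Rayleigh fading assumptions, not taken from the book or its accompanying code), maximum ratio transmission with M antennas increases the average received signal power by a factor of M relative to a single antenna:

```python
# Monte-Carlo sketch of the maximum-ratio-transmission (MRT) beamforming gain.
# With an i.i.d. Rayleigh channel h in C^M (E|h_m|^2 = 1) and unit-norm
# beamformer w = h/||h||, the received signal power is |h^H w|^2 = ||h||^2,
# whose mean is M: an M-fold array gain over a single antenna.
import math
import random

def rayleigh_channel(m: int) -> list[complex]:
    """i.i.d. CN(0, 1) coefficients: real and imaginary parts have variance 1/2."""
    s = math.sqrt(0.5)
    return [complex(random.gauss(0, s), random.gauss(0, s)) for _ in range(m)]

def mrt_power(h: list[complex]) -> float:
    """Received power |h^H w|^2 with w = h/||h||, which simplifies to ||h||^2."""
    return sum(abs(x) ** 2 for x in h)

if __name__ == "__main__":
    random.seed(0)
    m, trials = 8, 5000
    avg = sum(mrt_power(rayleigh_channel(m)) for _ in range(trials)) / trials
    print(f"average beamforming gain with {m} antennas: {avg:.2f}")  # close to 8
```

The simplification in `mrt_power` is the point: aligning the beamformer with the channel turns the array into a coherent power combiner.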

  • by Warren B. Powell
    920,-

    Optimization should be the science of making the best possible decisions. Making decisions is a virtually universal human activity encountered by professionals (in any field) or people in their everyday lives. You would think, then, that the study of making good decisions is a subject that should be taught broadly to students throughout engineering, the physical and social sciences, business, and policy. Yet today, "optimization" is widely taught as a mathematically sophisticated subject, often limited to graduate students in specialized fields. In operations research (or industrial engineering), "optimization" is equivalent to deterministic math programming, starting with linear programs (and the simplex algorithm), and then transitioning through integer linear programs and nonlinear programs. If you are in departments like electrical or mechanical engineering, optimization means teaching optimal control. And if you are in computer science, optimization today could be interpreted in the context of machine learning (such as fitting models to data) or as reinforcement learning. This book claims that the traditional style of teaching optimization is misguided and out of date. First, while the simplex algorithm is a powerful strategy for solving linear programs, the details of the simplex algorithm are completely inappropriate in an introductory course in optimization. Second, while linear programs are appropriate for solving many problems, they are only applicable to a tiny fraction of all decisions. Third, linear programs (along with integer and nonlinear programs) are static models for problems with (typically) vector-valued decisions. By contrast, most decisions are sequential since they are made periodically over time as new information is arriving. In addition, the vast majority of these decisions are scalar (possibly continuous or discrete). This book is designed for instructors (or potential instructors) looking to introduce the science of making good decisions to the broadest possible audience. It should also be of interest to anyone who has already had a traditional course in optimization of any type. The presentation is organized around a series of topics that suggest a fundamentally different approach to teaching "optimization", starting with sequential decision problems (which offer the simplest problem settings) before transitioning to more complex vector-valued decisions. It also makes the case that most problems modeled as linear (or integer, or nonlinear) programs actually serve as methods for making decisions in a sequential setting. For this reason, these topics are introduced with much less emphasis on algorithms than is traditionally used, both in static and sequential settings.

  • by Pierre Alquier
    1 226,-

    Probably approximately correct (PAC) bounds have been an intensive field of research over the last two decades. Hundreds of papers have been published and much progress has been made, resulting in PAC-Bayes bounds becoming an important technique in machine learning. The proliferation of research has made the field somewhat daunting for a newcomer. In this tutorial, the author guides the reader through the topic's complexity and large body of publications. Covering both empirical and oracle PAC-Bayes bounds, this book serves as a primer for students and researchers who want to get to grips quickly with the subject. It provides a friendly introduction that illuminates the basic theory and points to the most important publications for gaining a deeper understanding of any particular aspect.
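For orientation (a generic illustration, not a construction from the tutorial): the simplest PAC-style guarantee, a two-sided Hoeffding bound for a single fixed hypothesis with 0-1 loss, states that with probability at least 1 − δ the true risk lies within sqrt(ln(2/δ)/(2n)) of the empirical risk on n i.i.d. samples.

```python
# Hoeffding-style PAC bound for a single fixed hypothesis with 0-1 loss:
# with probability >= 1 - delta over an i.i.d. sample of size n,
#   true_risk <= empirical_risk + sqrt(ln(2/delta) / (2*n)).
import math

def pac_upper_bound(empirical_risk: float, n: int, delta: float) -> float:
    """Empirical risk plus the two-sided Hoeffding confidence radius."""
    radius = math.sqrt(math.log(2.0 / delta) / (2.0 * n))
    return empirical_risk + radius

if __name__ == "__main__":
    # Example: 120 errors on 1000 samples, 95% confidence.
    bound = pac_upper_bound(empirical_risk=0.12, n=1000, delta=0.05)
    print(f"risk <= {bound:.3f} with probability >= 0.95")
```

PAC-Bayes bounds of the kind the tutorial covers refine this idea by averaging over a distribution of hypotheses and paying a KL-divergence term instead of a union bound.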

  • by Alexander Scriven
    1 286,-

    The Technological Emergence of AutoML presents a comprehensive snapshot of how AutoML has permeated into mainstream use within the early 2020s. This work surveys both the implementation and the application of AutoML systems in the context of industry. It also defines what a 'performant' AutoML system is - HCI support is valued highly here - and assesses how the current crop of available packages and services lives up to expectations. To do so in a systematic manner, this survey is structured as follows. Section 2 begins by elaborating on the notion of an ML workflow, conceptually framing AutoML in terms of the high-level operations required to develop, deploy, and maintain an ML model. Section 3 uses this workflow to support the introduction of industry-related stakeholders and their interests/obligations. These requirements are unified into a comprehensive set of criteria, supported by methods of assessment, that determine whether an AutoML system can be considered performant. Section 4 launches the survey in earnest, assessing the nature and capabilities of existing AutoML technology beginning with an examination of open-source AutoML packages. The section additionally investigates AutoML systems that are designed for specific domains, as well as commercial products. Subsequently, Section 5 assesses where AutoML technology has been used and how it has fared. Academic work focusing on real-world applications is surveyed, as are vendor-based case studies. All key findings and assessments are then synthesized in Section 6, with commentary around how mature AutoML technology is, as well as whether there are obstacles and opportunities for future uptake. Finally, Section 7 provides a concluding overview on the technological emergence of AutoML.

  • by Anna Stuhlmacher
    1 040,-

    The electrical distribution system has undergone significant transformations, which have had a profound impact on distribution system development and expansion. These changes have been primarily driven by changing load profiles, distributed generation sources, and increasingly extreme weather events. Advancements in sensor and communication technologies have played a pivotal role in addressing and adapting to these changes. These changes have also led to an increased focus on reliability and resilience in planning, with priority placed on ensuring robust grid connectivity and flexibility. Three decades ago, power distribution systems were primarily radial with unidirectional power flow. Today's electrical distribution systems have distributed energy resources, leading to bidirectional power flow. The utility's geographic information system network, advanced metering infrastructure, and other technologies are leveraged to allow feeders and distributed energy resources to be interconnected. This has facilitated the integration of the electric grid with networked microgrids, which has improved the overall resilience and efficiency of the distribution system. While there have been notable improvements in grid planning, the power grid remains vulnerable to high-impact, low-frequency events caused by climate change, such as hurricanes and tornadoes. This book outlines potential solutions for addressing future electric grid issues, including transformer overloading due to electric vehicles, optimization challenges, advanced feeder reconfiguration, and contingency planning for extreme events. The proposed approach focuses on the implementation and operation of new technologies, such as renewable energy sources, batteries, flexible loads, and advanced sensors, that have the potential to transform distribution network planning and operation. 
From traditional methods to innovative networked microgrids within existing infrastructure and non-wire alternative strategies, this book provides a comprehensive overview of state-of-the-art strategies for future problems.

  • by Yunan Chen
    1 040,-

    Data-driven health informatics technologies such as mobile health apps and wearable and smart medical devices have become ubiquitous in people's daily lives. As these technologies advance and become more pervasive, the datafication of personal health research has grown substantially in recent years. The field, however, remains primarily focused on adult users, leaving a limited understanding of children's data practices and of technology for managing their health and well-being. In this work, the authors aim to delve deeper into children's health datafication practices, navigating the landscape of their technology use, caregiver involvement, and the distinct factors associated with their development and literacy. The authors' intention is to catalyze future innovations, improving the design and utility of health technologies tailored for children. The authors present an overview of the history of personal health datafication research, child development theories, and child-computer interaction studies. This work contributes to the literature by characterizing the trends in children's health datafication research, reflecting on key research themes to guide future health datafication research focused on children, and by providing recommendations for future research and design of data-driven technologies that support children's health and well-being.

  • by Andrea Montanari
    1 270,-

    Spin glass models were introduced by physicists in the 1970s to model the statistical properties of certain magnetic materials. Over the last half century, these models have motivated a blossoming line of mathematical work with applications to multiple fields, at first sight distant from physics. This tutorial is deliberately written in a somewhat non-standard style, from several viewpoints. Rather than developing the theory in the most general setting, the authors focus on two concrete problems that are motivated by questions in statistical estimation. Their treatment is far from exhaustive, but they do not hesitate to pursue detours that are interesting, but indirectly related to the original questions posed by the examples. The authors also present a mixture of non-rigorous and rigorous techniques. The authors clearly indicate when something is proven and explain non-rigorous techniques on examples for which rigorous alternatives are available. Written by two recognized experts and based on a course given at Stanford University, this tutorial is a unique introduction to a topic that has many avenues for furthering research in statistics, mathematics, and computer science. It provides an accessible introduction to understanding and using theories that have been deployed in physics for over 50 years.

  • by Ziran Wang
    1 270,-

    The recent development of cloud computing and edge computing shows great promise for the Connected and Automated Vehicle (CAV), by enabling CAVs to offload their massive on-board data and heavy computing tasks. Leveraging the Internet of Things (IoT) technology, different entities in the intelligent transportation system (e.g., vehicles, infrastructure, traffic management centers, etc.) get connected with each other, thus making the entire system smarter, faster, and more efficient. However, these advances also bring significant challenges to public authorities, industry, as well as scientific communities. In terms of system design and control, current cloud and edge architectures for CAVs need to be refined or even redesigned to better function under uncertainties in demand, and to better cooperate with existing conventional vehicles and infrastructure. From the performance assessment perspective, models and simulation tools based on artificial intelligence and big data have been widely developed for validation and evaluation of cloud computing and edge computing, but the validity of these models needs to be re-examined with field implementations. Finally, while the increasing connectivity among vehicles and infrastructures may help improve their perception of the environment and enable coordinated decision making, it also presents new challenges to ensure system safety and security, with inherent disturbances to wireless communication networks and also the inevitably larger attack surface that may be exploited by malicious attackers. In this tutorial, experts from academia and industry introduce the trends and challenges of applying cloud and edge computing for CAVs, highlight representative works in the literature and discuss their limitations, present new promising solutions, and outline future directions in research and engineering.
Particular focus will be given to methodologies and tools for building digital twin frameworks with cloud and edge computing for CAVs, quantitative and formal analysis for ensuring CAV safety under disturbances and uncertainties, system-level CAV security threat landscape and defense solution space, and experiences from practical deployment of cloud and edge computing for CAVs.

  • by Zhiang Chen
    1 010,-

    Environmental monitoring is a crucial field encompassing diverse applications, including marine exploration, wildlife conservation, ecosystem assessment, and air quality monitoring. Collecting accurate and timely data from inaccessible locations and challenging environments is essential for understanding and addressing environmental issues. Robots offer a promising solution by enabling data collection at unprecedented spatio-temporal scales. However, relying solely on teleoperation is impractical and limits the efficiency and effectiveness of environmental monitoring efforts. Autonomy plays a pivotal role in unlocking the full potential of robots, allowing them to operate independently and intelligently in complex environments. This monograph focuses on high-level decision-making problems in autonomous environmental monitoring robots. Decision-making at the high level involves strategic planning and coordination to optimize data collection. Addressing these challenges allows robots to autonomously navigate, explore, and gather scientific data in a wide range of environmental monitoring applications. Included in the monograph are representations for different environments, as well as a discussion of using these representations to solve tasks of interest, such as learning, localization, and monitoring. To efficiently implement the tasks, decision-theoretic optimization algorithms consider: (1) where to take measurements; (2) which tasks to assign; (3) what samples to collect; (4) when to collect samples; (5) how to learn the environment; and (6) with whom to communicate. Finally, this work concludes by presenting the challenges and opportunities in robotic environmental monitoring.

  • by Yao Chen
    1 010,-

    Smart Grid is a power grid system that uses digital communication technologies. By deploying intelligent devices throughout the power grid infrastructure, from power generation to consumption, and enabling communication among them, it revolutionizes the modern power grid industry with increased efficiency, reliability, and availability. However, reliance on information and communication technologies has also exposed smart grids to new vulnerabilities and complications that may negatively impact the availability and stability of electricity services, which are vital for people's daily lives. The purpose of this monograph is to provide an up-to-date and comprehensive survey and tutorial on the cybersecurity aspect of smart grids. The monograph focuses on the sources of the cybersecurity issues, the taxonomy of threats, and the survey of various approaches to overcome or mitigate such threats. It covers the state-of-the-art research results in recent years, along with remaining open challenges. This monograph can be used both as learning materials for beginners who are embarking on research in this area and as a useful reference for established researchers in this field.

  • by Sebastian Fixson
    936,-

    An Operations Management Perspective on Design Thinking provides a map of what is known about mechanisms of design thinking when looking through an operations management lens and identifies areas where knowledge gaps still exist. In applying the operations management lens, the author constructs a simple framework for how to assess progress in design thinking activities. To provide improved design thinking progress measures, the author expands this framework by considering multiple dimensions of these measures in greater detail: the outcomes of an operation and its transformation function. Applying the reference set to these multiple dimensions of the expanded framework identifies contributions from other disciplines that can help explain the conditions under which design thinking operations can be managed successfully and pinpoints unexplained gaps that are worthy of future research. The monograph first prepares the methodological ground by putting the attempt to search for better design thinking process measures in the context of existing research approaches. The next section summarizes the origins and characteristics of design thinking and provides an overview of the progress measures that have been proposed for design thinking. The monograph then introduces an operations management perspective for design thinking as an innovation production process. The next section expands this perspective by introducing multiple dimensions and finer-grained measures and applies this extended framework to the data set from earlier sections to pull together the current understanding of design thinking and to identify future research opportunities. The monograph concludes with some broader reflections.

  • by Mikhail Chernov
    816,-

    Currency Risk Premiums: A Multi-Horizon Perspective reviews the literature on multi-horizon currency risk premiums. It shows how the multi-horizon implications arise from the classic present-value relationship. The authors further show how these implications manifest themselves in the interaction between bond and currency risk premiums. This link is strengthened by explicitly accounting for stochastic discount factors. Information about currency risk premiums at different horizons presents a wealth of new evidence and challenges for existing models.

  • by Kasper Johansson
    936,-

    A Simple Method for Predicting Covariance Matrices of Financial Returns makes three contributions. First, it proposes a new method for predicting the time-varying covariance matrix of a vector of financial returns, building on a specific covariance estimator suggested by Engle in 2002. The second contribution proposes a new method for evaluating a covariance predictor, by considering the regret of the log-likelihood over some time period such as a quarter. The third contribution is an extensive empirical study of covariance predictors. The authors compare their method to other popular predictors, including rolling-window, exponentially weighted moving average (EWMA), and generalized autoregressive conditional heteroskedasticity (GARCH) methods. After an introduction, Section 2 describes some common predictors, including the one that this method builds on. Section 3 introduces the proposed covariance predictor. Section 4 discusses methods for validating covariance predictors that measure both overall performance and reactivity to market changes. Section 5 describes the data used in the authors' first empirical studies and the results are provided in Section 6. The authors then discuss some extensions of and variations on the method, including realized covariance prediction (Section 7), handling large universes via factor models (Section 8), obtaining smooth covariance estimates (Section 9), and using the authors' covariance model to generate simulated returns (Section 10).
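    To illustrate the kind of baseline predictor the blurb mentions, the sketch below implements a generic EWMA covariance estimator. This is an illustrative example of the EWMA family only, not the authors' proposed method or Engle's exact estimator; the halflife parameter and debiasing step are assumptions for the sketch.

    ```python
    import numpy as np

    def ewma_covariance(returns, halflife=63):
        """Exponentially weighted moving-average covariance estimate.

        A minimal sketch: the covariance estimate after day t is a
        decayed average of past outer products of daily returns,
        debiased by the accumulated weight.
        """
        beta = 0.5 ** (1.0 / halflife)  # per-day decay factor
        n_days, n_assets = returns.shape
        cov = np.zeros((n_assets, n_assets))
        weight_sum = 0.0
        for t in range(n_days):
            r = returns[t]
            # rank-one update with the day's outer product of returns
            cov = beta * cov + (1.0 - beta) * np.outer(r, r)
            weight_sum = beta * weight_sum + (1.0 - beta)
        return cov / weight_sum  # debias for the finite sample
    ```

    Rolling-window and GARCH predictors differ mainly in how they weight past observations; the EWMA's single decay parameter is what makes it a common baseline in comparisons like the one described above.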

  • by Yong-Shik Lee
    756,-

    A seminal case in corporate law (Dodge v. Ford Motor Co.) set the cardinal principle that corporations must serve the interests of shareholders rather than the interests of employees, customers, or the community. This principle, referred to as "shareholder primacy," has been considered a tenet of the fiduciary duty owed by corporate directors. The shareholder primacy norm has influenced corporate behavior and encouraged short-term profit-seeking behavior with significant social ramifications. Corporations have been criticized for undermining the interests of employees, customers, and the community in the name of profit maximization. Shareholder Primacy as an Untenable Corporate Norm argues that corporate interests and broader social interests, such as benefits to consumers and employees, are not mutually exclusive and can be reconciled by allowing corporate managers and majority shareholders to define corporate interests more broadly, beyond the narrow confines of shareholder primacy. This article examines the flaws of shareholder primacy as the principle for corporate governance and discusses an alternative approach (the stakeholder approach). It also discusses the necessity of a statutory adjustment and proposes legal reform to clarify the current ambiguity about the legal status of shareholder primacy.

  • by Albert N. Link
    950,-

    The primary purpose of Entrepreneurs' Search for Sources of Knowledge is to explore the search process for knowledge used by entrepreneurs and entrepreneurial firms in pursuit of new opportunities, new product innovation opportunities in particular. The second purpose of this monograph is to present empirical evidence about the sources of knowledge that entrepreneurs and entrepreneurial firms actually use (and actually do not use) in an effort to allow observed behavior to inform future economics and management theory about the search for and use of knowledge sources. The third purpose of this monograph is to motivate new and more complete empirical efforts to construct databases and to conduct analyses, both empirical analyses and case studies, related not only to entrepreneurs' and entrepreneurial firms' search for and use of sources of knowledge but also to measuring trends in the impacts of their use.

  • by Henrik Hagtvedt
    966,-

    Aesthetic design is pervasive in the marketplace, where it influences consumer behavior, endows products with value, and differentiates between brands. In fact, research suggests that aesthetic appeal drives sales across most product categories. The time is ripe for taking stock of the state of research in this domain. Aesthetics in Marketing begins with a characterization of this domain of research and then organizes extant literature in two ways: First, it provides an overview of aesthetics principles, outcomes stemming from these principles, and contexts in which these principles operate. Second, it zooms in on the principle of ambiguity in particular to provide a detailed discussion of ambiguous versus accessible aesthetic elements. The author also provides directions for future research.

  • by Bryan Kelly
    1 300,-

    Financial Machine Learning surveys the nascent literature on machine learning in the study of financial markets. The authors highlight the best examples of what this line of research has to offer and recommend promising directions for future research. This survey is designed both for financial economists interested in grasping machine learning tools, as well as for statisticians and machine learners seeking interesting financial contexts where advanced methods may be deployed. The survey is organized as follows. Section 2 analyzes the theoretical benefits of highly parameterized machine learning models in financial economics. Section 3 surveys the variety of machine learning methods employed in the empirical analysis of asset return predictability. Section 4 focuses on machine learning analyses of factor pricing models and the resulting empirical conclusions for risk-return tradeoffs. Section 5 presents the role of machine learning in identifying optimal portfolios and stochastic discount factors. Section 6 offers brief conclusions and directions for future work.

  • by Tim Kraft
    906,-

    Supply Chain Transparency and Sustainability examines the academic literature that investigates both the visibility and disclosure dimensions of supply chain transparency within the context of social and environmental responsibility. In order to present a clear picture of the research landscape for the operations management community, the discussions are focused on research from the behavioral and analytical modeling literature. The primary goal is to discuss the most representative and emerging works in this space so as to highlight future research directions and inspire more research on supply chain transparency. While supply chain transparency is a topic of relevance for many management contexts, this monograph focuses on its role in the context of sustainability. The monograph is organized as follows. First, there is a brief background on the topic of supply chain transparency. The authors then review the behavioral literature on supply chain transparency. This is then followed by a review of the analytical modeling literature that examines transparency-related contexts. Finally, the monograph concludes by discussing potential future research directions.

  • by Stanley H. Chan
    1 306,-

    Since the seminal work of Andrey Kolmogorov in the early 1940s, imaging through atmospheric turbulence has grown from a pure scientific pursuit to an important subject across a multitude of civilian, space-mission, and national security applications. Fueled by the recent advancement of deep learning, the field is experiencing a new wave of momentum. However, for these deep learning methods to perform well, new efforts are needed to build faster and more accurate computational models while at the same time maximizing the performance of image reconstruction. The goal of this book is to present the basic concepts of turbulence physics together with the tools needed for image reconstruction. Starting with an exploration of optical modeling and computational imaging in Chapter 1, the book continues to Chapter 2, discussing the essential optical foundations required for the subsequent chapters. Chapter 3 introduces a statistical model elucidating atmospheric conditions and the propagation of waves through the atmosphere. The practical implementation of the Zernike-based simulation is discussed in Chapter 4, paving the way for the machine learning solutions to reconstruction in Chapter 5. In this concluding chapter, classical and contemporary trends in turbulence mitigation are discussed, providing readers with a comprehensive understanding of the field's evolution and a sense of its direction. The book is written primarily for image processing engineers, computer vision scientists, and engineering students who are interested in the fields of atmospheric turbulence, statistical optics, and image processing. The book can be used as a graduate text or for advanced-topics classes for undergraduates.
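    As a rough illustration of the Zernike idea mentioned above, the toy sketch below builds a phase screen as a weighted sum of a few low-order Zernike modes over a unit-disk aperture. This is a hypothetical sketch, not the book's simulator: the choice of modes (tilt and defocus, in Noll ordering), the grid size, and the function name are all assumptions for illustration.

    ```python
    import numpy as np

    def zernike_phase_screen(coeffs, grid_size=64):
        """Sum three low-order Zernike modes into a toy phase screen.

        coeffs: weights for (x-tilt, y-tilt, defocus), i.e. the
        Noll-indexed modes Z2, Z3, Z4. Values outside the unit-disk
        aperture are set to zero.
        """
        x = np.linspace(-1.0, 1.0, grid_size)
        xx, yy = np.meshgrid(x, x)
        rho = np.sqrt(xx**2 + yy**2)       # radial coordinate
        theta = np.arctan2(yy, xx)         # azimuthal coordinate
        mask = rho <= 1.0                  # unit-disk aperture
        modes = [
            2.0 * rho * np.cos(theta),             # Z2: x-tilt
            2.0 * rho * np.sin(theta),             # Z3: y-tilt
            np.sqrt(3.0) * (2.0 * rho**2 - 1.0),   # Z4: defocus
        ]
        screen = sum(c * z for c, z in zip(coeffs, modes))
        return np.where(mask, screen, 0.0)
    ```

    In turbulence simulation, the coefficients would be drawn randomly with statistics matched to the atmospheric model; here they are simply passed in by hand.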
