Similar Documents
20 similar documents found (search time: 0 ms)
1.
The biological species concept (BSC) is the cornerstone of neo-Darwinian thinking. In BSC, species do not exchange genes either during or after speciation. However, as gene flow during speciation is increasingly being reported in a substantial literature, it seems time to reassess the revered, but often doubted, BSC. Contrary to the common perception, BSC should expect substantial gene flow at the onset of speciation, not least because geographical isolation develops gradually. Although BSC does not stipulate how speciation begins, it does require a sustained period of isolation for speciation to complete its course. Evidence against BSC must demonstrate that the observed gene flow does not merely occur at the onset of speciation but continues until its completion. Importantly, recent genomic analyses cannot reject this more realistic version of BSC, although future analyses may still prove it wrong. The ultimate acceptance or rejection of BSC is not merely about a historical debate; rather, it is about the fundamental nature of species – are species (and, hence, divergent adaptations) driven by a relatively small number of genes, or by thousands of them? Many levels of biology, ranging from taxonomy to biodiversity, depend on this resolution.

2.
Knowledge Management Research & Practice - A knowledge management (KM) scope denotes our conception of what is relevant and useful in KM: what we are to manage, and how. KM scopes...

3.
Paul DB 《Endeavour》1999,23(4):159-161
How the term ‘genetic test’ is defined matters for social policy. The past few years have witnessed many efforts to enact legal barriers specifically against genetic discrimination. To the extent that information derived from genetic tests receives special protection, both enthusiasts for genetic medicine and those who stress its perils have an incentive to adopt a broad interpretation of genetic testing. However, the consequences have not always been those anticipated.

4.
5.
Knowledge Management Research & Practice - The knowledge management (KM) mindset is a precursor to a knowledge-sharing culture. It is often assumed that developing a KM mindset is somewhat...

6.
“The important thing is not to win, it is to take part.” This famous saying by Pierre de Coubertin asserts that the value athletes draw from the Olympic Games lies in their participation in the event, not in the gold they collect during it. We find similar evidence for scientists involved in grant competitions. Relying on unique data from a Swiss funding program, we find that scientists taking part in a research grant competition boost their number of publications and average impact factor while extending their knowledge base and their collaboration network, regardless of the result of the competition. Receiving the funds increases the probability of co-authoring with co-applicants but has no additional impact on individual productivity.

7.
Ethics and Information Technology - Developing and implementing artificial intelligence (AI) systems in an ethical manner faces several challenges specific to the kind of technology at hand,...

8.
Previous research has contributed to our understanding of technology-intensive firms by proposing alternative typologies or classifications of these firms. This study explores the Koberg typology that characterizes firms by growth stage and production technology into four types of organizations: Embryonic, Start-up, Growth, and Mature Multiline. We propose that, consistent with the typology, firms in each type will differ in strategy, structure, practices, and leadership and that performance is contingent on the fit between organizational factors and typology type. In a study of 377 technology-intensive firms, we find support for differences among organizations in organizational variables by typology type. The characteristics of the organizations as well as the typology are significant in explaining organizational performance. However, except for organizational structure, the firm characteristics that are related to performance do not significantly vary by type. The results are discussed in the context of theory development and managerial implications.

9.
In spite of often compelling reasons why people should seek information, they persistently engage in lower levels of information seeking than might be expected, at times seeking no information at all. The idealized model of dogged persistence, in which rational search strategies are pursued until a high-quality answer is found, is often assumed in the design of information systems. However, many people confronted with health problems engage in avoidance and denial, making a health care system dependent on proactivity problematic. This essay explores six conditions (the idealized model, avoidance, bewilderment, serendipity, ignorance is bliss, and indolence) that arise from low- and high-effort strategies when typed by good, contingent, and bad health outcomes. Since a substantial proportion of the population does not act in accordance with our assumptions, it may be time for policy makers, system designers, and researchers to revisit their approaches to facilitating health-related information seeking.

10.
Taylor (1968) dramatically stated that information seekers do not use their real, Q1-level information need when formulating their query to the system. Instead, they use a compromised, Q4-level form of their need. The article directly confronts what Taylor's (1968) Q1-level information need is: the “actual” or “real” information need of the searcher. The article conceptually and operationally defines Taylor's Q1-level information need, using Belkin's (1980) ASK concept as a basis for designing a system intervention that shifts the searcher from representing the compromised Q4-level form of the need in her query to representing instead her real Q1-level information need. The article describes the Q1 Actualizing Intervention Model, which can be built into a system capable of actualizing the uncertainty distribution of the searcher's belief ASK so that information search is directed by the searcher's real Q1-level information need. The objective of the Q1 Actualizing Intervention Model is to enable, in our Knowledge Age, the introduction of intervention IR systems that are organic and human-centric, designed to initiate organic knowledge production processes in the searcher.

11.
This paper is concerned with whether there is a moral difference between simulating wrongdoing and consuming non-simulatory representations of wrongdoing. I argue that simulating wrongdoing is (as such) a pro tanto wrong whose wrongness does not tarnish other cases of consuming representations of wrongdoing. While simulating wrongdoing (as such) constitutes a disrespectful act, consuming representations of wrongdoing (as such) does not. I aim to motivate this view in part by bringing a number of intuitive moral judgements into reflective equilibrium, and in part by describing the case of a character I call the Devious Super Geek, who simulates wrongs against particular people he knows personally. I build bridging cases from the case of the Devious Super Geek to capture games in which one simulates wrongs against imaginary members of extant, morally salient categories. The surprising conclusions we are led to include not just that simulated wrongdoing is pro tanto wrong, but that simulated just killing is pro tanto wrong, and that the simulated killing of zombies and aliens is pro tanto wrong as well. Finally, I describe how I propose to handle some potential objections and attempt to weigh the pro tanto wrong identified in the paper against some countervailing considerations in some all-things-considered judgements.

12.
To determine the normal range of hemoglobin and cutoff values in healthy adults of Southern India, blood samples from 177 male and 203 female medical students were analyzed for parameters of RBC and iron metabolism. The data were compared with the American white population (NHANES III) and the WHO criteria for detection of anemia. The mean values for hemoglobin and hematocrit in male students differed minimally from those of American white males. However, values for parameters of iron metabolism were lower, except total iron binding capacity (TIBC), which was higher. In female students, hemoglobin, hematocrit, and parameters of iron metabolism were lower than in American white females, except TIBC, which was higher. The lower 5th percentile cutoff points (mean − 1.645 SD) in males and females were 13.5 and 10 g/dl, respectively. In conclusion, South Indian adult males have Hb values similar to American adult males, but South Indian females have considerably lower Hb levels than American females, raising questions about the appropriateness of WHO or US criteria for detection of anemia in Indian females.
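The cutoff arithmetic above (mean − 1.645 SD, i.e., the lower 5th percentile under a normal assumption) can be sketched as follows; the function name and the sample mean/SD values are illustrative assumptions, not data from the study.

```python
# Lower 5th percentile cutoff under a normal assumption: mean - 1.645 * SD.
# The function name and the numeric inputs below are assumed for illustration.
def lower_cutoff(mean: float, sd: float, z: float = 1.645) -> float:
    """Return the lower cutoff: mean minus z standard deviations."""
    return mean - z * sd

# Assumed mean/SD chosen so the cutoff lands near the 13.5 g/dl
# value reported for males.
print(round(lower_cutoff(15.0, 0.91), 1))  # → 13.5
```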

13.
One of the most significant recent technological developments concerns the development and implementation of ‘intelligent machines’ that draw on recent advances in artificial intelligence (AI) and robotics. However, there are growing tensions between human freedoms and machine controls. This article reports the findings of a workshop that investigated the application of the principles of human freedom throughout intelligent machine development and use. Forty IS researchers from ten different countries discussed four contemporary AI and humanity issues and the most relevant IS domain challenges. This article summarizes their experiences and opinions regarding four AI and humanity themes: Crime & conflict, Jobs, Attention, and Wellbeing. The outcomes of the workshop discussions identify three attributes of humanity that need preservation: a critique of the design and application of AI, and the intelligent machines it can create; human involvement in the loop of intelligent machine decision-making processes; and the ability to interpret and explain intelligent machine decision-making processes. The article provides an agenda for future AI and humanity research.

14.
XML has become a universal standard for information exchange over the Web due to features such as simple syntax and extensibility. Processing queries over these documents has been the focus of several research groups. In fact, there is a broad literature on efficient XML query processing that explores indexes, fragmentation techniques, etc. However, for answering complex queries, existing approaches mainly analyze information that is explicitly defined in the XML document. A few works investigate the use of Prolog to increase the query possibilities, allowing inference over the data content. This can significantly increase the query possibilities and expressive power, allowing access to non-obvious information. However, it requires translating the XML documents into Prolog facts. But for regular queries (which do not require inference), is this a good alternative? What kind of queries could benefit from the Prolog translation? Can we always use Prolog engines to execute XML queries efficiently? There are many questions involved in adopting an alternative approach to running XML queries. In this work, we investigate this matter by translating XML queries into Prolog queries and comparing the query processing times using Prolog and native XML engines. Our work contributes a set of heuristics that helps users decide when to use Prolog engines to process a given XML query. In summary, our results show that queries that search elements by a key value or by position (simple search) are more efficient when run in Prolog than in native XML engines, while queries over large datasets, or that search for substrings, perform better when run by native XML engines.
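To make the translation step concrete, here is a minimal sketch of turning an XML tree into Prolog-style facts; the node/3 and text/2 predicates, identifier scheme, and sample document are assumptions for illustration, not the encoding used in the paper.

```python
# Sketch: translate an XML tree into Prolog-style facts that a Prolog
# engine could later query. The predicates node(Id, Tag, ParentId) and
# text(Id, Text) are an assumed encoding, not the paper's actual scheme.
import xml.etree.ElementTree as ET
from itertools import count

def to_facts(root):
    ids = count(1)
    facts = []

    def walk(elem, parent):
        nid = f"n{next(ids)}"
        facts.append(f"node({nid}, {elem.tag}, {parent}).")
        # Only emit a text/2 fact for non-empty element text.
        if elem.text and elem.text.strip():
            facts.append(f"text({nid}, '{elem.text.strip()}').")
        for child in elem:
            walk(child, nid)

    walk(root, "root")
    return facts

doc = ET.fromstring("<book><title>XML</title></book>")
print("\n".join(to_facts(doc)))
# → node(n1, book, root).
#   node(n2, title, n1).
#   text(n2, 'XML').
```

A Prolog engine loaded with such facts could then answer structural queries (e.g., find all title nodes under a book) by unification, which is the style of translation the trade-off study above compares against native XML engines.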

15.
Research Policy, 2022, 51(5): 104505
In this study we attempt to shed more light on the relationship between speed of new technology imitation and the sales performance of the imitator compared to the innovator, with a particular focus on the performance outcomes resulting from the rapid imitation of technologies introduced by the market leader. Using data on handset technologies mounted on more than 600 devices introduced to the UK market by 14 mobile phone vendors operating from 1997 to 2008, we study hundreds of imitative actions to test hypotheses on the extent to which an imitator can catch up (i.e., reduce the market share gap) with the market leader by rapidly imitating its innovations. First, we show that gaining advantage by rapidly imitating a technology pioneer is contingent on whether the pioneer is the market leader or a non-leader rival. Second, we find that the risks of rapid imitation of the market leader's technologies are mitigated when industry clockspeed is high, i.e., during a period of fast innovation and imitation cycles in an industry, resulting in rapid variations in product design. Third, we observe that the degree of competitive responsiveness of the technology pioneer when its innovations are imitated represents an important mechanism that can explain why speed of imitation may affect how an imitator can improve its market share gains relative to the pioneer. This paper advances competitive dynamics and imitation as predictive theories of how rapid imitators might catch up with market leaders in technology-intensive industries.

16.
17.
Narratives are composed of stories that provide insight into social processes. To facilitate the analysis of narratives more efficiently, natural language processing (NLP) methods have been employed to automatically extract information from textual sources, e.g., newspaper articles. Existing work on automatic narrative extraction, however, has ignored the nested character of narratives. In this work, we argue that a narrative may contain multiple accounts given by different actors. Each individual account provides insight into the beliefs and desires underpinning an actor's actions. We present a pipeline for automatically extracting accounts, consisting of NLP methods for: (1) named entity recognition, (2) event extraction, and (3) attribution extraction. Machine learning-based models for named entity recognition were trained based on a state-of-the-art neural network architecture for sequence labelling. For event extraction, we developed a hybrid approach combining the use of semantic role labelling tools, the FrameNet repository of semantic frames, and a lexicon of event nouns. Meanwhile, attribution extraction was addressed with the aid of a dependency parser and Levin's verb classes. To facilitate the development and evaluation of these methods, we constructed a new corpus of news articles in which named entities, events, and attributions have been manually marked up following a novel annotation scheme that covers over 20 event types relating to socio-economic phenomena. Evaluation results show that, relative to a baseline method underpinned solely by semantic role labelling tools, our event extraction approach improves recall by 12.22–14.20 percentage points (reaching as high as 92.60% on one data set). Meanwhile, the use of Levin's verb classes in attribution extraction obtains optimal performance in terms of F-score, outperforming a baseline method by 7.64–11.96 percentage points.
Our proposed approach was applied to news articles focused on industrial regeneration cases, facilitating the generation of accounts of events attributed to specific actors.

18.
Even though knowledge assets have been widely recognized as the principal drivers of a firm's competitive advantage, few frameworks have explained how these strategic assets are transformed into value and how the value creation process occurs. There is also confusing terminology in the literature surrounding many concepts explaining the dynamics of value creation. By conducting a Systematic Review – an evidence-based methodology for theory building – this paper seeks to define a ‘common language’ of the concepts used to explain this phenomenon, and to build the assumptions of a theoretical model that explains how knowledge assets, through learning mechanisms, are linked, renewed, and leveraged into socio-technical processes or organizational routines that, in turn, form the basis of organizational capabilities. As they are socially constructed, these organizational capabilities, when leveraged into products and services, generate value and provide firms with a sustainable competitive advantage and long-term superior performance. The model should therefore serve as a theoretical contribution to the literature, and it has the further potential benefit of opening an inquiry, for both theory building and management, into the nature of a firm's knowledge assets and organizational capabilities, and the sources of sustainable competitive advantage. Some of these avenues are outlined in this paper.

19.

Self-driving cars promise solutions to some of the hazards of human driving but there are important questions about the safety of these new technologies. This paper takes a qualitative social science approach to the question ‘how safe is safe enough?’ Drawing on 50 interviews with people developing and researching self-driving cars, I describe two dominant narratives of safety. The first, safety-in-numbers, sees safety as a self-evident property of the technology and offers metrics in an attempt to reassure the public. The second approach, safety-by-design, starts with the challenge of safety assurance and sees the technology as intrinsically problematic. The first approach is concerned only with performance—what a self-driving system does. The second is also concerned with why systems do what they do and how they should be tested. Using insights from workshops with members of the public, I introduce a further concern that will define trustworthy self-driving cars: the intended and perceived purposes of a system. Engineers’ safety assurances will have their credibility tested in public. ‘How safe is safe enough?’ prompts further questions: ‘safe enough for what?’ and ‘safe enough for whom?’


20.
In the face of ubiquitous information communication technology, the presence of blogs, personal websites, and public message boards gives the illusion of uncensored criticism and discussion of the ethical implications of business activities. However, little attention has been paid to the limitations on free speech posed by the control of access to the Internet by private entities, enabling them to censor content that is deemed critical of corporate or public policy. The premise of this research is that transparency alone will not achieve the desired results if ICT is used in a one-way system controlled by the provider of information. Stakeholders must have an avenue, using the same technology, to respond to and interact with the information. We propose a model that imposes on corporations a public trust, requiring these gatekeepers of communication technology to preserve individual rights to criticism and review.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司). ICP license: 京ICP备09084417号