Sort order: 3,432 results found in total (search time: 15 ms)
921.
The current study reports on the process of developing a self-assessment instrument for vocational education students’ generic working life competencies. The instrument was developed based on a competence framework and in close collaboration with several vocational education teachers and intermediary organisations offering various human resource services. A first version of the questionnaire was presented to 26 students and 5 recent graduates, who were asked to comment on the items. The pilot version of the questionnaire was completed by 826 students. Half of the data were used to explore the structure of the questionnaire (n = 413) and the other half were used to confirm the structure (n = 413). The results showed that 8 factors could be distinguished. Further analysis reduced this to 7 usable factors: empathy, listening, assertiveness, professional attitude, problem solving, cooperation ability, and planning and prioritising. The revised questionnaire, containing 44 items, was tested a second time to determine the stability and measurement invariance of the instrument. In total, 456 students from the first sample completed the questionnaire. The structure was confirmed, and measurement invariance across students with and without work experience was established.
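The split-half design described above (826 pilot respondents, half for exploring the factor structure and half for confirming it) can be sketched as follows; the abstract does not describe the actual randomisation procedure, so the random split here is an assumption for illustration:

```python
import random

# Split-half design: 826 pilot respondents are randomly divided into
# an exploration half and a confirmation half (n = 413 each).
random.seed(0)
respondent_ids = list(range(826))
random.shuffle(respondent_ids)

explore_half = respondent_ids[:413]   # used to explore the factor structure
confirm_half = respondent_ids[413:]   # used to confirm the structure

# The two halves are equal-sized and non-overlapping.
assert len(explore_half) == len(confirm_half) == 413
assert set(explore_half).isdisjoint(confirm_half)
```

Fitting the exploratory model on one half and confirming it on the untouched other half guards against capitalising on chance in the factor solution.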
922.
Individual consumers in the household sector increasingly develop products, services, and processes in their discretionary time, without payment. Household sector innovation is becoming a pervasive phenomenon, representing a significant share of the innovation activity in any economy. Such innovation emerges from personal needs or self-rewards, and is distinct from and complementary to producer innovations motivated by commercial gains. In this introductory paper to the special issue on household sector innovation, we take stock of emerging research on the topic. We categorize the research into four areas: scope, emergence, implications for business, and diffusion. We develop a conceptual basis for the phenomenon, introduce the articles in the special issue, and show how each article contributes new insights. We end by offering a research agenda for scholars interested in the salient phenomenon of household sector innovation.
923.
Introduction: Most laboratories routinely determine haemolysis, icterus and lipemia (HIL) indices to identify lipemic samples and reject potentially affected results. Hypertriglyceridemia is the most common cause of lipemia, and severe hypertriglyceridemia (≥ 11.3 mmol/L) is a major risk factor for acute pancreatitis.
Laboratory analysis: A 56-year-old woman attended the outpatient clinic for a follow-up visit 1 month after a kidney transplantation. Her immunosuppressive therapy consisted of corticosteroids, cyclosporine, and mycophenolic acid. The routine clinical chemistry sample was rejected due to extreme lipemia. The comment “extreme lipemic sample” was added to the report, but the requesting physician could not be reached. The Cobas 8000 gave a technical error (absorbance > 3.3) for the HIL indices (L-index: 38.6 mmol/L), which persisted after high-speed centrifugation. The patient was given a new appointment 2 days later. The new sample was also grossly lipemic and gave the same technical error (L-index: 35.9 mmol/L).
What happened: The second sample was manually diluted 20-fold after centrifugation to obtain a result for triglycerides within the measuring range (0.10–50.0 mmol/L). Triglycerides were 169.1 mmol/L, corresponding to very severe hypertriglyceridemia. This result was communicated to the nephrologist, and the patient was immediately recalled to the hospital. She received therapeutic plasma exchange the next day and did not develop acute pancreatitis.
Main lesson: This case illustrates the delicate balance between avoiding the release of unreliable results due to lipemia and the risk of delayed diagnosis when results are rejected. Providing an estimate of the degree of hypertriglyceridemia might be preferable to rejecting the result.
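The dilution step in the case above is a simple back-calculation: the diluted reading must land inside the assay's measuring range, and the reported value is that reading multiplied by the dilution factor. A minimal sketch (the on-instrument reading of 8.455 mmol/L is inferred here from 169.1 / 20; it is not stated in the case report):

```python
# Back-calculating an analyte concentration from a diluted measurement.
DILUTION_FACTOR = 20
measured_diluted = 8.455  # inferred on-instrument reading, mmol/L (hypothetical)

# Sanity check: the diluted reading falls inside the assay's
# measuring range of 0.10-50.0 mmol/L, so it is reportable.
assert 0.10 <= measured_diluted <= 50.0

true_concentration = measured_diluted * DILUTION_FACTOR
print(round(true_concentration, 1))  # 169.1 mmol/L
```

Without the dilution, any reading above 50.0 mmol/L would only be reportable as "> 50.0", which is why the 20-fold dilution was needed to quantify the result.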
924.
This paper tackles the compensation problem of linear time-invariant systems affected by unmatched perturbations. The proposed methodology exploits a high-order sliding mode observer, guaranteeing theoretically exact state and perturbation estimation. A compensation-based strategy is proposed to cope with the unmatched perturbations. The compensation of the desired coordinate is carried out through a nested backward sliding surface design, which compensates some of the non-actuated state components while keeping the remaining states bounded. The feasibility of the technique was tested on an active vehicle suspension system.
925.
In this work, we present a new approach for loss probability estimation in a single-server link. We show how to obtain the estimates analytically once we assume multifractal input traffic. To make the estimation procedure numerically tractable without losing accuracy, we propose the use of a Gaussian mixture model to represent the heavy-tailed distribution of modern network traffic traces. The adopted evaluation procedure is based on two performance measures: the empirical traffic arrival load distribution and the loss probability at the connection. Extensive experimental tests validate the efficiency and accuracy of the proposed loss probability estimation approach against results obtained by simulations with real traffic and by comparison with other multifractal approaches suggested in the literature.
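As a rough illustration of why a Gaussian mixture is a tractable stand-in for a heavy-tailed load distribution, the sketch below compares the tail mass of a two-component mixture with that of a single Gaussian matched to the same mean and variance. All parameters are illustrative, not the paper's fitted values, and the tail mass beyond a buffer threshold is only a crude proxy for the loss probability:

```python
import math

# Illustrative two-component mixture: a "bulk" component plus a rarer,
# wider "burst" component (weights, means, std devs are made up).
weights = [0.9, 0.1]
means = [1.0, 5.0]
sigmas = [0.5, 2.0]

def gaussian_tail(c, mu, sigma):
    """P(X > c) for a single Gaussian via the complementary error function."""
    return 0.5 * math.erfc((c - mu) / (sigma * math.sqrt(2)))

def mixture_tail(c):
    """P(X > c) for the mixture: the weighted sum of component tails."""
    return sum(w * gaussian_tail(c, m, s)
               for w, m, s in zip(weights, means, sigmas))

# Single Gaussian with the same overall mean and variance as the mixture.
mean = sum(w * m for w, m in zip(weights, means))
var = sum(w * (s**2 + m**2) for w, m, s in zip(weights, means, sigmas)) - mean**2

threshold = 8.0
print(mixture_tail(threshold))                         # ~7e-3
print(gaussian_tail(threshold, mean, math.sqrt(var)))  # ~2e-6
```

The moment-matched single Gaussian underestimates the tail mass by roughly three orders of magnitude here, which is exactly the kind of error that would corrupt a loss probability estimate; the mixture captures the burst component explicitly while each component remains analytically simple.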
926.
Microfluidics approaches have gained popularity in the field of directed cell migration, enabling control of the extracellular environment and integration with live-cell microscopy; however, technical hurdles remain. Among the challenges are the stability and predictability of the environment, which are especially critical for the observation of fibroblasts and other slow-moving cells. Such experiments require several hours and are typically plagued by the introduction of bubbles and other disturbances that naturally arise in standard microfluidics protocols. Here, we report on the development of a passive pumping strategy, driven by the high capillary pressure and evaporative capacity of paper, and its application to study fibroblast chemotaxis. The paper pumps—flowvers (flow + clover)—are inexpensive, compact, and scalable, and they allow nearly bubble-free operation, with a predictable volumetric flow rate on the order of μl/min, for several hours. To demonstrate the utility of this approach, we combined the flowver pumping strategy with a Y-junction microfluidic device to generate a chemoattractant gradient landscape that is both stable (6+ h) and predictable (by finite-element modeling calculations). Integrated with fluorescence microscopy, we were able to recapitulate previous live-cell imaging studies of fibroblast chemotaxis to platelet-derived growth factor (PDGF), with an order-of-magnitude gain in throughput. The increased throughput of single-cell analysis allowed us to more precisely define PDGF gradient conditions conducive for chemotaxis; we were also able to interpret how the orientation of signaling through the phosphoinositide 3-kinase pathway affects the cells’ sensing of and response to conducive gradients.
927.
Open-content communities that focus on co-creation without requirements for entry have to face the issue of institutional trust in contributors. This research investigates the various ways in which these communities manage this issue. It is shown that open-source software communities continue to rely mainly on hierarchy (reserving write access for higher echelons), which substitutes for (the need for) trust. Encyclopedic communities, though, largely avoid this solution. In the particular case of Wikipedia, which is confronted with persistent vandalism, another arrangement has been pioneered instead. Trust (i.e. full write access) is 'backgrounded' by means of a permanent mobilization of Wikipedians to monitor incoming edits. Computational approaches have been developed for the purpose, yielding both sophisticated monitoring tools that are used by human patrollers and bots that operate autonomously. Measures of reputation are also under investigation within Wikipedia; their incorporation into monitoring efforts, as an indicator of the trustworthiness of editors, is envisaged. These collective monitoring efforts are interpreted as focusing on avoiding possible damage being inflicted on Wikipedian spaces, thereby allowing the discretionary powers of editing to be kept intact for all users. Further, the essential differences between backgrounding and substituting trust are elaborated. Finally, it is argued that the Wikipedian monitoring of new edits, especially through its heavy reliance on computational tools, raises a number of moral questions that need to be answered urgently.
928.
This article examines the politics of calculative devices in one of the most successful areas of finance, the life insurance business. By empirically tracing an insurance applicant's risk trajectory, it analyses how calculative devices perform insurance underwriting through acting on insurance risk decisions. This allows one to document exactly what calculative devices do, and to point out the political effects of what they do. First, it highlights the fact that, contrary to thinking in terms of 'the insurance logic', there are multiple ways of calculating life insurance risks. Second, it underscores the crucial role of calculative devices in that process by demonstrating how they align considerations as divergent as economics and medicine to perform a life insurance market. It then demonstrates the political effects of these calculative devices by making explicit how the latter contribute to the production of inequalities in calculative power in life insurance. In this way, the article links up insights from the performativity approach in the sociology of markets with the broader question of governing economic life. Such an approach, it is argued, provides the opportunity to open up the organization of economic markets and to put classic questions of justice and power struggles in economic markets on the agenda again.
929.
Most studies of the effects of the Bayh-Dole Act have focused on universities. In contrast, we analyze patenting activity at two prominent national laboratories, Sandia National Laboratories and the National Institute of Standards and Technology, before and after the enactment of this legislation and the Stevenson-Wydler Act. It appears as though the enactment of the Bayh-Dole and Stevenson-Wydler Acts was not sufficient to induce an increase in patenting at these labs. However, the establishment of financial incentive systems, embodied in the passage of the Federal Technology Transfer Act, as well as the allocation of internal resources to support technology transfer, stimulated an increase in such activity.
930.
Monoclonal antibodies played a key role in the development of the biotechnology industry of the 1980s and 1990s. Investments in the sector and commercial returns have rivaled those of recombinant DNA technologies. Although the monoclonal antibody technology was first developed in Britain, the first patents were taken out by American scientists. During the first Thatcher government in Britain, blame for the missed opportunity fell on the scientists involved as well as on the National Research and Development Corporation, which had been put in place after World War II to avoid a repeat of the penicillin story, when patent rights were not sought. Instead of apportioning the blame, this essay suggests that despite past experiences and despite the new channels that were in place, Britain was not in a "patent culture" in the 1970s. It traces the long and painful process that made a commercial attitude among publicly funded British research scientists and in a civil service institution like the Medical Research Council both possible and desirable. In this process the meaning of the term "public science" also changed dramatically.
Copyright©北京勤云科技发展有限公司  京ICP备09084417号