143.
Rats fed a hypercholesterolemic diet showed a significant increase in serum total cholesterol, liver homogenate total cholesterol and HDL-cholesterol, and changes in LDL-cholesterol and the HDL/LDL ratio compared with controls. A diet supplemented with flaxseed chutney (FC; 15%, w/w) was found to be more effective in restoring the lipid profile changes in rats fed cholesterol (1.0%). The activities of the serum marker enzymes glutamate oxaloacetate transaminase (GOT), glutamate pyruvate transaminase (GPT) and alkaline phosphatase (ALP) were significantly elevated in carbon tetrachloride-treated rats. Administration of flaxseed chutney (15%, w/w) reduced these serum marker enzymes toward normal levels, showing a significant hepatoprotective effect. A flaxseed chutney-supplemented diet could thus lower serum cholesterol and, as a potential source of antioxidants, protect against hepatotoxic damage induced by carbon tetrachloride (CCl4) in rats.
144.
Acute coronary syndrome (ACS) is a term for a range of clinical signs and symptoms suggestive of myocardial ischemia. It results in functional and structural changes that ultimately release proteins from injured cardiomyocytes. These cardiac markers play a major role in the diagnosis and prognosis of ACS. This study aims to assess the efficacy of heart-type fatty acid binding protein (h-FABP) as a marker for ACS alongside the routinely used hs-TropT. In our observational study, plasma h-FABP (cut-off 6.32 ng/ml) and routinely measured hs-TropT (cut-offs 0.1 and 0.014 ng/ml) were estimated by immunometric laboratory assays in 88 patients with acute chest pain. Based on the clinical and laboratory findings, the patients were grouped into ACS (n = 41) and non-ACS (n = 47). Diagnostic sensitivity, specificity, NPV, PPV and the ROC curve at 95 % CI were determined. The sensitivities of hs-TropT (0.1 ng/ml), hs-TropT (0.014 ng/ml) and h-FABP were 53, 86 and 78 % respectively, and the corresponding specificities were 98, 73 and 70 %. Sensitivity, specificity and NPV for the combination of hs-TropT at 0.014 ng/ml and h-FABP were 100, 51 and 100 % respectively. These results were substantiated by ROC analysis. Measurement of plasma h-FABP and hs-TropT together on admission appears to be a more precise predictor of ACS than either marker alone.
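The diagnostic measures reported above follow directly from a 2x2 contingency table. A minimal sketch of the arithmetic; the counts below are hypothetical, chosen only for illustration, not taken from the study:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Compute standard diagnostic performance measures from a 2x2 table."""
    sensitivity = tp / (tp + fn)   # true positive rate
    specificity = tn / (tn + fp)   # true negative rate
    ppv = tp / (tp + fp)           # positive predictive value
    npv = tn / (tn + fn)           # negative predictive value
    return sensitivity, specificity, ppv, npv

# Hypothetical counts: 41 ACS and 47 non-ACS patients, with the marker
# positive in 35 ACS patients and 14 non-ACS patients.
sens, spec, ppv, npv = diagnostic_metrics(tp=35, fp=14, tn=33, fn=6)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} PPV={ppv:.2f} NPV={npv:.2f}")
```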
145.
The pre-analytical extraction process for accurate detection of organic acids is a crucial step in the diagnosis of organic acidemias by GC-MS analysis. This process is accomplished either by solid phase extraction (SPE) or by liquid-liquid extraction (LLE); both procedures are used in metabolic laboratories around the world. In this study we compared the two extraction procedures with respect to precision, accuracy, percent recovery of metabolites, number of metabolites isolated, time and cost in a resource-constrained setup. We observed a mean recovery of 84.1 % for SPE and 77.4 % for LLE (p < 0.05). Moreover, the average number of metabolites isolated by SPE and LLE was 161.8 ± 18.6 and 140.1 ± 20.4 respectively. The processing cost of LLE was lower, so in a cost-constrained setting LLE may be the practical option for organic acid analysis.
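Percent recovery, the headline comparison above, is simple arithmetic over spiked and measured amounts. A minimal sketch; the per-metabolite values below are hypothetical, chosen only so the means match the reported 84.1 % and 77.4 %:

```python
def percent_recovery(measured, spiked):
    """Percent recovery of a spiked metabolite: measured / spiked * 100."""
    return 100.0 * measured / spiked

# Hypothetical measured amounts (ng) for three spiked metabolites.
spe_measured = [84.5, 82.0, 85.8]   # recovered by solid phase extraction
lle_measured = [78.1, 76.0, 78.1]   # recovered by liquid-liquid extraction
spiked = 100.0

spe_mean = sum(percent_recovery(m, spiked) for m in spe_measured) / len(spe_measured)
lle_mean = sum(percent_recovery(m, spiked) for m in lle_measured) / len(lle_measured)
print(f"mean recovery: SPE {spe_mean:.1f}%  LLE {lle_mean:.1f}%")
```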
146.
Neutralization theory and online software piracy: An empirical analysis
Accompanying the explosive growth of information technology is the increasing frequency of antisocial and criminal behavior on the Internet. Online software piracy is one such behavior, and this study approaches the phenomenon through the theoretical framework of neutralization theory. The suitability and applicability of nine techniques of neutralization in explaining the act are tested via logistic regression analyses on cross-sectional data collected from a sample of university students in the United States. Generally speaking, neutralization was found to be only weakly related to experience with online software piracy; other elements that appear more salient are suggested and discussed in conclusion.
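Logistic regression of the kind described models the log-odds of piracy as a linear function of neutralization-technique scores. A minimal sketch of that mapping; the technique names, coefficients and scores below are invented for illustration, not the study's estimates:

```python
import math

def logistic(z):
    """Logistic (sigmoid) function mapping a linear predictor to a probability."""
    return 1.0 / (1.0 + math.exp(-z))

def piracy_probability(scores, coefs, intercept):
    """Predicted probability of self-reported piracy from technique scores."""
    z = intercept + sum(b * x for b, x in zip(coefs, scores))
    return logistic(z)

# Hypothetical coefficients for three neutralization techniques
# ("denial of injury", "denial of victim", "appeal to higher loyalties").
coefs = [0.20, 0.05, 0.10]
intercept = -1.5

p = piracy_probability([4, 3, 5], coefs, intercept)
odds_ratio = math.exp(coefs[0])   # odds multiplier per unit of "denial of injury"
print(f"predicted probability: {p:.3f}, OR per unit: {odds_ratio:.2f}")
```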
147.
Text document clustering provides an effective and intuitive navigation mechanism for organizing a large number of retrieval results by grouping documents into a small number of meaningful classes. Many well-known text clustering methods use a long list of words as the vector space, which is often unsatisfactory for two reasons: first, it keeps the dimensionality of the data very high, and second, it ignores important relationships between terms such as synonyms or antonyms. Our unsupervised method solves both problems by using ANNIE, WordNet lexical categories and the WordNet ontology to create a well-structured document vector space whose low dimensionality allows common clustering algorithms to perform well. For the clustering step we chose the bisecting k-means and the Multipole tree, a modified version of the Antipole tree data structure, for their accuracy and speed respectively.
Diego Reforgiato Recupero
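The low-dimensional category vectors described above can be compared with a standard similarity measure before clustering. A minimal sketch, assuming three hypothetical WordNet lexical categories as dimensions and invented term counts:

```python
import math

# Hypothetical document vectors: counts of terms falling into three
# WordNet lexical categories (noun.artifact, noun.cognition, verb.motion).
docs = {
    "doc_a": [12, 3, 0],
    "doc_b": [10, 4, 1],
    "doc_c": [0, 2, 9],
}

def cosine(u, v):
    """Cosine similarity between two vectors, a common measure for clustering."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

sim_ab = cosine(docs["doc_a"], docs["doc_b"])
sim_ac = cosine(docs["doc_a"], docs["doc_c"])
print(f"sim(a,b)={sim_ab:.3f}  sim(a,c)={sim_ac:.3f}")  # a and b should cluster together
```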
148.
Intelligent use of the many diverse forms of data available on the Internet requires new tools for managing and manipulating heterogeneous forms of information. This paper uses WHIRL, an extension of relational databases that can manipulate textual data using statistical similarity measures developed by the information retrieval community. We show that although WHIRL is designed for more general similarity-based reasoning tasks, it is competitive with mature systems designed explicitly for inductive classification. In particular, WHIRL is well suited for combining different sources of knowledge in the classification process. We show on a diverse set of tasks that the use of appropriate sets of unlabeled background knowledge often decreases error rates, particularly if the number of examples or the size of the strings in the training set is small. This is especially useful when labeling text is a labor-intensive job and when there is a large amount of information available about a particular problem on the World Wide Web.
Haym Hirsh
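WHIRL's similarity-based classification can be approximated in miniature: label an item by its most similar labeled string. The sketch below uses simple word-overlap (Jaccard) similarity as a stand-in for WHIRL's TF-IDF-based measure; the training strings and labels are invented:

```python
def jaccard(a, b):
    """Word-overlap similarity between two short texts."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / len(sa | sb)

# Hypothetical labeled training strings.
train = [
    ("apple macbook pro laptop", "computers"),
    ("dell xps laptop computer", "computers"),
    ("nike running shoes", "apparel"),
]

def classify(text):
    """Label an item by its most similar training string."""
    return max(train, key=lambda pair: jaccard(text, pair[0]))[1]

print(classify("lenovo thinkpad laptop"))  # → computers
```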
149.
Previous papers on grey literature by the authors have described (1) the need for formal metadata to allow machine understanding and therefore scalable operations; (2) the enhancement of repositories of grey (and other) e-publications by linking with CRIS (Current Research Information Systems); (3) the use of the research process to collect metadata incrementally, reducing the threshold barrier for end-users and improving quality in an ambient GRIDs environment. This paper takes the development one step further and proposes “intelligent” grey objects. The hypothesis is in two parts: (1) that the use of passive catalogs of metadata does not scale (a) in a highly distributed environment with millions of nodes and (b) with vastly increased volumes of R&D output grey publications with associated metadata; (2) that a new paradigm is required that (a) integrates grey with white literature and other R&D outputs such as software, data, products and patents, (b) in a self-managing, self-optimizing way, and that this paradigm automatically manages curation, provenance, digital rights, trust, security and privacy. Concerning (1), existing repositories provide catalogs; harvesting takes ever more time, so the catalogs are never current, and the end-user expends much manual effort and intelligence to utilize the results. The combined elapsed time of (1) the network, (2) searches over centralized (or centrally controlled distributed) catalog servers and (3) end-user intervention becomes unacceptable. Concerning (2), no paradigm currently known to the authors satisfies the requirement. Our proposal is outlined below. Hyperactive combines both hyperlinking and active properties of a (grey) object. Hyperlinking implies multimedia components linked to form the object and also external links to other resources. The term active implies that objects do not lie passively in a repository waiting to be retrieved by end-users. They “get a life”, and the object moves through the network knowing where it is going.
A hyperactive grey object is wrapped by its (incrementally recorded) formal metadata and an associated (software) agent. It moves through process steps such as initial concept, authoring, reviewing and depositing in a repository. The workflow is based on the rules and information in the corporate data repository with which the agent interacts. Once the object is deposited, the agent associated with it actively pushes the object to the end-users (or systems) whose metadata indicate interest or an obligation in a workflowed process. The agents check the object and user (or system) metadata for rights, privacy and security parameters, and for any charges, and assure compatibility. Alternatively, the object can be found passively by end-user or system agents. The object can also associate itself with other objects, forming relationships utilising metadata or content. Declared relationships include references and citations; workflowed relationships include versions and also links to corporate information and to research datasets and software; inferenced relationships are discovered relationships, such as between documents by different authors developed from an earlier idea of a third author. Components of this paradigm have been implemented to some extent. The challenge is implementing the integration architecture in a way that respects part two of the hypothesis. This surely is harnessing the power of grey.
Anne Asserson
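The agent behaviour described, in which a deposited object is pushed to users whose metadata indicate interest, can be sketched minimally. All class and field names below are illustrative assumptions, not the authors' design:

```python
from dataclasses import dataclass, field

@dataclass
class GreyObject:
    """A grey object wrapped by its formal metadata."""
    title: str
    metadata: dict
    content: str = ""

@dataclass
class User:
    """An end-user whose metadata record interests."""
    name: str
    interests: set = field(default_factory=set)

def push(obj, users):
    """Agent behaviour: return the users whose metadata indicate interest
    in the deposited object's subject."""
    subject = obj.metadata.get("subject")
    return [u.name for u in users if subject in u.interests]

users = [User("alice", {"metadata", "CRIS"}), User("bob", {"photography"})]
obj = GreyObject("Intelligent grey objects", {"subject": "CRIS"})
print(push(obj, users))  # → ['alice']
```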
150.
The emergence of the new format of electronic/digital records provides an opportunity for archivists to reconsider the presumed format-neutrality of professional practice. As research in electronic records has re-emphasised, without an understanding of the needs and forms of material, the work of archivists can have a profound impact on the evidential value and long-term research potential of that material. This paper attempts to broaden the debate about the requirements of all archival formats, and to build a new regime of 21st-century format specialists.
Joanna Sassoon
Dr. Joanna Sassoon is currently seconded to Edith Cowan University as Senior Lecturer in the School of Computer and Information Science. Her permanent position is in the State Records Office of Western Australia. She has long experience in managing archival collections and has written extensively on a range of topics including digitisation, the effect of institutional practice on archival materials, environmental and indigenous history, and photographs as archives; her work has been recognised with two Mander-Jones awards from the Australian Society of Archivists. She holds a Ph.D. in history with distinction from the University of Western Australia.