Similar Documents
1.

Background:

In vitro hemolysis can be induced by several biological and technical sources, and may be worsened by forced aspiration of blood into vacuum tubes. This study aimed to compare the probability of hemolysis when drawing blood with a commercial evacuated blood collection tube and with the S-Monovette used in either "vacuum" or "aspiration" mode.

Materials and methods:

The study population consisted of 20 healthy volunteers. A sample was drawn into a 4.0 mL BD Vacutainer serum tube from a vein of one upper arm. Two other samples were drawn with a second venipuncture from a vein of the opposite arm into 4.0 mL S-Monovette serum tubes, using both the vacuum and aspiration modes. After separation, serum potassium, lactate dehydrogenase (LD) and the hemolysis index (HI) were measured on a Beckman Coulter DxC.

Results:

In no case did the HI exceed the limit of significant hemolysis. Compared with the BD Vacutainer, no significant differences were observed for potassium and LD using the S-Monovette in vacuum mode. Significantly increased values of both parameters were, however, found in serum collected into the BD Vacutainer and into the S-Monovette in vacuum mode, compared with serum drawn by the S-Monovette in aspiration mode. The mean potassium bias was 2.2% versus the BD Vacutainer and 2.4% versus the S-Monovette in vacuum mode; the mean LD bias was 2.7% versus the BD Vacutainer and 2.1% versus the S-Monovette in vacuum mode. None of these variations exceeded the allowable total error.
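For orientation, the reported mean biases follow from simple percentage-difference arithmetic against the reference tube. A minimal Python sketch of that calculation, using made-up potassium means and an assumed allowable-total-error limit (neither are values from the study):

    def percent_bias(mean_test, mean_reference):
        """Mean percentage bias of a test tube type against a reference tube type."""
        return 100.0 * (mean_test - mean_reference) / mean_reference

    # Illustrative potassium means (mmol/L); these numbers are invented for the example.
    k_aspiration, k_vacutainer = 4.10, 4.19
    bias = percent_bias(k_vacutainer, k_aspiration)

    # Hypothetical allowable total error for potassium (a commonly cited desirable
    # specification is around 5.6%, but this limit is an assumption here).
    TEA_POTASSIUM = 5.6
    verdict = "within" if abs(bias) <= TEA_POTASSIUM else "exceeds"
    print(f"Potassium bias: {bias:.1f}% ({verdict} allowable total error of {TEA_POTASSIUM}%)")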

Conclusions:

Although no significant macro-hemolysis was observed with any collection system, the lower chance of producing micro-hemolysis with the S-Monovette in aspiration mode suggests that this device may be preferred when a difficult venipuncture, combined with the vacuum, could increase the probability of spurious hemolysis.

2.

Introduction:

Preanalytical variables account for most laboratory errors, and a wide range of factors affects the reliability of the laboratory report. The most convenient sample type for routine laboratory analysis is serum. The BD Vacutainer® Rapid Serum Tube (RST) (Becton, Dickinson and Company, Franklin Lakes, NJ, USA) blood collection tube provides a rapid clotting time, allowing fast serum separation. Our aim was to evaluate the comparability of routine chemistry parameters in the BD Vacutainer® RST blood collection tube against the BD Vacutainer® Serum Separating Tube II Advance (SST) (Becton, Dickinson and Company, Franklin Lakes, NJ, USA).

Materials and methods:

Blood specimens were collected from 90 participants to compare results, clotting time and analyte stability for six routine biochemistry parameters: glucose (Glu), aspartate aminotransferase (AST), alanine aminotransferase (ALT), calcium (Ca), lactate dehydrogenase (LD) and potassium (K), measured with an Olympus AU2700 analyzer (Beckman Coulter, Tokyo, Japan). The significance of the differences between samples was assessed by the paired t-test or the Wilcoxon matched-pairs rank test after checking for normality, as sketched below.
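A minimal Python sketch of that test-selection rule using scipy, with placeholder paired glucose values rather than the study's data:

    import numpy as np
    from scipy import stats

    def compare_paired(sst_values, rst_values, alpha=0.05):
        """Paired t-test if the paired differences look normal, otherwise the Wilcoxon test."""
        diffs = np.asarray(rst_values) - np.asarray(sst_values)
        _, p_normal = stats.shapiro(diffs)  # normality check on the paired differences
        if p_normal > alpha:
            test_name, (_, p_value) = "paired t-test", stats.ttest_rel(rst_values, sst_values)
        else:
            test_name, (_, p_value) = "Wilcoxon signed-rank", stats.wilcoxon(rst_values, sst_values)
        return test_name, p_value

    # Placeholder glucose results (mmol/L) for a handful of paired samples.
    sst = [5.1, 5.4, 4.9, 6.2, 5.8, 5.0]
    rst = [5.0, 5.3, 4.8, 6.0, 5.7, 4.9]
    print(compare_paired(sst, rst))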

Results:

The clotting process was significantly shorter in the RSTs than in the SSTs (2.49 min vs. 19.47 min, respectively; P < 0.001). There was a statistically significant difference between the RST and SST II tubes for glucose, calcium and LD (P < 0.001), and the differences for glucose and LD were also clinically significant. Analyte stability studies showed that all analytes were stable for 24 h at 4 °C.

Conclusions:

Most results from the RST (except LD and glucose) are comparable with those from the SST. In addition, the RST provides a shorter clotting time.

3.
Cancer biomarkers have significant potential as reliable tools for the early detection of the disease and for monitoring its recurrence. However, most current methods for biomarker detection have technical difficulties (such as sample preparation and specific detector requirements) which limit their application in point-of-care diagnostics. We developed an extremely simple, power-free microfluidic system for the direct detection of cancer biomarkers in microliter volumes of whole blood. CEA and CYFRA21-1 were chosen as model cancer biomarkers. The system automatically extracted blood plasma from less than 3 μl of whole blood and performed a multiplex sample-to-answer assay (a nano-ELISA (enzyme-linked immunosorbent assay) technique) without the use of external power or extra components. By taking advantage of the nano-ELISA technique, the microfluidic system detected CEA at a concentration of 50 pg/ml and CYFRA21-1 at a concentration of 60 pg/ml within 60 min. The combination of a PnP polydimethylsiloxane (PDMS) pump and the nano-ELISA technique in a single microchip system shows great promise for the detection of cancer biomarkers in a drop of blood.

4.
This paper analyses the strategic implications of using financial models to evaluate research and development (R&D) technology investments. It also discusses the challenges of applying financial models to the valuation of intellectual property (IP). The paper shows how conventional financial models fail to recognise the importance of strategic positioning, since there are limits to using quantified approaches to measure R&D performance. From the technology management perspective, it is argued that financial models need to be reconciled with technology strategies, since today's investments are linked to firms' long-run competitiveness.

5.
Detecting real-world events by following posts in microblogs has motivated numerous recent studies. In this work, we focus on the spatio-temporal characteristics of events detected in microblogs, and propose a method to estimate their locations using Dempster–Shafer theory. We utilize three basic location-related features of the posts, namely the latitude-longitude metadata provided by the GPS sensor of the user's device, the textual content of the post, and the location attribute in the user profile, as three independent sources of evidence. Considering this evidence in a complementary way, we apply the combination rules of Dempster–Shafer theory to fuse them into a single model and estimate the whereabouts of a detected event. Locations are treated at two levels of granularity, namely city and town. Using Dempster–Shafer theory to solve this problem allows uncertainty and missing data to be tolerated, and estimations to be made for sets of locations in terms of upper and lower probabilities. We demonstrate our solution using public tweets on Twitter posted in Turkey. Experimental evaluations conducted on a wide range of events, including earthquakes, sports, weather, and street protests, indicate higher success rates than existing state-of-the-art methods.
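A minimal Python sketch of the kind of evidence fusion described, using Dempster's rule of combination over a toy frame of three cities; the mass assignments for the GPS, text, and profile sources are invented for illustration and do not reproduce the paper's model:

    from itertools import product

    def combine(m1, m2):
        """Dempster's rule of combination for two mass functions.

        Each mass function maps frozensets of candidate locations to masses summing to 1.
        """
        combined, conflict = {}, 0.0
        for (a, ma), (b, mb) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb  # mass assigned to contradictory hypotheses
        if conflict >= 1.0:
            raise ValueError("Sources are completely conflicting")
        return {s: m / (1.0 - conflict) for s, m in combined.items()}

    def belief(m, hypothesis):
        """Lower probability: total mass on subsets of the hypothesis."""
        return sum(v for s, v in m.items() if s <= hypothesis)

    def plausibility(m, hypothesis):
        """Upper probability: total mass on sets intersecting the hypothesis."""
        return sum(v for s, v in m.items() if s & hypothesis)

    # Hypothetical evidence from the three sources for one detected event.
    frame = frozenset({"Istanbul", "Ankara", "Izmir"})
    gps_evidence     = {frozenset({"Istanbul"}): 0.6, frame: 0.4}           # sparse GPS metadata
    text_evidence    = {frozenset({"Istanbul", "Ankara"}): 0.5, frame: 0.5} # place names in posts
    profile_evidence = {frozenset({"Ankara"}): 0.3, frame: 0.7}             # user profile locations

    fused = combine(combine(gps_evidence, text_evidence), profile_evidence)
    h = frozenset({"Istanbul"})
    print(f"Istanbul: belief={belief(fused, h):.3f}, plausibility={plausibility(fused, h):.3f}")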

6.
The theoretical literature on technological change distinguishes between paradigmatic changes and changes in trajectories. Recently, several scholars have performed empirical studies on the way technological trajectories evolve in specific industries, often by predominantly looking at the artifacts. Much less empirical work, if any, has been done on paradigmatic changes, even though these have a much more profound impact on today's industry. It follows from the theory that such studies would need to focus more on the knowledge level than on the artifact level, raising questions about how to operationalize such phenomena. This study aims to fill this gap by applying network-based methodologies to knowledge networks, represented here by patents and patent citations. The rich technological history of telecommunications switches shows how engineers in the post-war period were confronted with huge challenges to meet drastically changing demands. This historical background is the starting point for an in-depth analysis of patents, in search of information about technological direction, technical bottlenecks, and engineering heuristics. We aim to identify when such changes took place across the seven generations of technological advances this industry has seen. In this way we can distinguish genuine paradigmatic changes from more regular changes in trajectory.
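As one simplified illustration of a network-based analysis of a patent-citation knowledge network, the Python sketch below builds a directed citation graph and ranks patents by in-degree and betweenness centrality; the patent numbers and citation pairs are invented, and the paper's actual methodology may differ:

    import networkx as nx

    # Hypothetical citation pairs: (citing patent, cited patent). Edges point from the
    # citing patent to the prior art it builds on.
    citations = [
        ("US5000001", "US4000001"), ("US5000002", "US4000001"),
        ("US5000003", "US5000001"), ("US5000004", "US5000001"),
        ("US5000004", "US5000002"), ("US5000005", "US5000003"),
    ]
    G = nx.DiGraph(citations)

    # Patents that many later patents cite (high in-degree) are candidate "pivotal" nodes
    # where a technological direction may have been set or changed.
    most_cited = sorted(G.in_degree(), key=lambda kv: kv[1], reverse=True)
    print("Most-cited patents:", most_cited[:3])

    # Betweenness centrality highlights patents that bridge otherwise separate strands of
    # knowledge, one possible signature of a shift in trajectory or paradigm.
    bridging = nx.betweenness_centrality(G)
    print("Bridging patents:", sorted(bridging, key=bridging.get, reverse=True)[:3])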

7.
Several matters pertaining to the part played by tube walls, and by surface and space charges, in electrical discharges through rarefied gases are briefly discussed, and experiments are described which throw light on some of the processes involved in such discharges. Measurements made under identical conditions of the falls of potential between striæ in the positive columns of discharges in hydrogen at various pressures, in two glass tubes of different diameters, showed that for pressures above 1.1 mm these falls were increasingly greater in the larger tube. The results of experiments with discharges produced under unusual conditions, bearing on the radial and axial fields present in the positive columns of discharges, are described, and photographs are reproduced of some of the discharges in which part of the positive column was made to pass through a long metal tube placed inside the glass discharge tube, the two ends of the metal tube being at times both open and at other times both closed with fine-mesh wire gauzes. The potential assumed by the metal tube and the current flowing through its walls were measured, and special experiments were done to elucidate the process by which the current manages to penetrate the more or less field-free space inside the metal cylinder. The distributions of the space and surface charges along the length of a glass discharge tube were measured separately, at least roughly, and the nature of these distributions was found to depend markedly upon whether the anode or the cathode of the discharge tube was connected to earth, although the combined effects of the two charges of necessity produced the same field distribution within the discharge tube in both cases. The gases used in the experiments were air and hydrogen.

8.
Intracerebral hemorrhage (ICH) is the most serious type of stroke, resulting in a high disability and mortality rate. Therefore, accurate and rapid ICH region segmentation is of great significance for the clinical diagnosis and treatment of ICH. In this paper, we use deep neural networks to automatically segment ICH regions. Firstly, we propose an encoder-decoder convolutional neural network (ED-Net) architecture to comprehensively utilize both low-level and high-level semantic information. Specifically, the encoder extracts multi-scale semantic feature information, while the decoder integrates it to form a unified ICH feature representation. Secondly, we introduce a synthetic (composite) loss function that pays more attention to small ICH regions to overcome the data-imbalance problem; one possible form is sketched below. Thirdly, to improve the clinical adaptability of the proposed model, we collected 480 patient cases with ICH from four hospitals to construct a multi-center dataset, in which each case contains the first and the review CT scans. In particular, the CT scans of different patients are diverse, which greatly increases the difficulty of segmentation. Finally, we evaluate ED-Net on the multi-center ICH clinical dataset with different model parameters and different loss functions, and compare the results of ED-Net with nine state-of-the-art methods from the literature. Both quantitative and visual results show that ED-Net outperforms the other methods by providing more accurate and stable performance.
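The abstract does not specify the exact form of the synthetic loss; a minimal sketch of one common composite choice for imbalanced binary segmentation (weighted cross-entropy plus Dice), assuming a PyTorch setup with float binary masks of shape (N, 1, H, W), is:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class CompositeSegLoss(nn.Module):
        """Weighted binary cross-entropy plus Dice loss for imbalanced segmentation."""

        def __init__(self, pos_weight=10.0, dice_weight=1.0, eps=1e-6):
            super().__init__()
            self.pos_weight = pos_weight    # up-weights the rare foreground (hemorrhage) pixels
            self.dice_weight = dice_weight  # relative weight of the Dice term
            self.eps = eps

        def forward(self, logits, target):
            # logits, target: float tensors of shape (N, 1, H, W); target values in {0, 1}
            bce = F.binary_cross_entropy_with_logits(
                logits, target,
                pos_weight=torch.tensor(self.pos_weight, device=logits.device))
            probs = torch.sigmoid(logits)
            inter = (probs * target).sum(dim=(1, 2, 3))
            union = probs.sum(dim=(1, 2, 3)) + target.sum(dim=(1, 2, 3))
            dice = 1.0 - ((2.0 * inter + self.eps) / (union + self.eps)).mean()
            return bce + self.dice_weight * dice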

9.
Higton H. Endeavour 2001;25(1):18-22
Gaining an insight into what it meant to be a mathematical practitioner in the 17th century is difficult. People who thought themselves mathematical included navigators, sundial makers and book-keepers. For some, the simple use of instruments was construed as mathematical; for others, this was merely a 'showing of tricks'. One profession reliant on mathematics was land surveying. An analysis of this profession throws up many interesting questions about what it meant to be mathematical in early modern England.

10.
11.
12.
13.
To what extent should humans transfer, or abdicate, “responsibility” to computers? In this paper, I distinguish six different senses of ‘responsible’ and then consider in which of these senses computers can, and in which they cannot, be said to be “responsible” for “deciding” various outcomes. I sort out and explore two different kinds of complaint against putting computers in greater “control” of our lives: (i) as finite and fallible human beings, there is a limit to how far we can achieve increased reliability through complex devices of our own design; (ii) even when computers are more reliable than humans, certain tasks (e.g., selecting an appropriate gift for a friend, solving the daily crossword puzzle) are inappropriately performed by anyone (or anything) other than oneself. In critically evaluating these claims, I arrive at three main conclusions: (1) While we ought to correct for many of our shortcomings by availing ourselves of the computer's larger memory, faster processing speed and greater stamina, we are limited by our own finiteness and fallibility (rather than by whatever limitations may be inherent in silicon and metal) in the ability to transcend our own unreliability. Moreover, if we rely on programmed computers to such an extent that we lose touch with the human experience and insight that formed the basis for their programming design, our fallibility is magnified rather than mitigated. (2) Autonomous moral agents can reasonably defer to greater expertise, whether human or cybernetic. But they cannot reasonably relinquish “background-oversight” responsibility. They must be prepared, at least periodically, to review whether the “expertise” to which they defer is indeed functioning as he/she/it was authorized to do, and to take steps to revoke that authority, if necessary. (3) Though outcomes matter, it can also matter how they are brought about, and by whom. Thus, reflecting on how much of our lives should be directed and implemented by computer may be another way of testing any thoroughly end-state or consequentialist conception of the good and decent life. To live with meaning and purpose, we need to actively engage our own faculties and empathetically connect up with, and resonate to, others. Thus there is some limit to how much of life can be appropriately lived by anyone (or anything) other than ourselves.

14.
Many universities have developed large-scale interdisciplinary research centers to address societal challenges and to attract the attention of private philanthropists and federal agencies. However, prior studies have mostly shown that interdisciplinary centers relate to a narrow band of outcomes such as publishing and grants. Therefore, we shift attention to include outcomes that such centers are mandated to influence, namely outreach to the media and private industry, as well as broader research endeavors and securing external funding. Using data covering Stanford University between 1993 and 2014, we study whether being weakly or strongly affiliated with interdisciplinary centers in one year relates to and increases (1) knowledge production (publications, grants and inventions), (2) instruction (numbers of students taught, PhDs and postdocs advised), (3) intellectual prominence (media mentions, awards won and centrality within the larger collaboration network), and (4) the acquisition of various sources of funding in the next year. Our results indicate that interdisciplinary centers select productive faculty and further increase their activity across a broad range of outcomes, to a greater extent than departments and traditional interdisciplinary affiliations such as courtesy and joint appointments.

15.
How can universities develop a knowledge management dynamic in order to train knowledge workers who are effective in an organizational learning process? Can games, and more specifically serious games, contribute to reaching this goal? To answer these questions, we hypothesize that play can serve as a lever for knowledge management and double-loop learning. The purpose of this article is to show that serious games contribute to training knowledge workers in an organizational learning process. From this perspective, we attempt to understand how serious games promote the acquisition of knowledge, and we explain the research method used in the field (participant observation, investigation using questionnaires). The final part analyses the main results: a community of practice and organizational learning, internalization through Learning by Doing and better understanding of the environment's complexity, movement towards double-loop learning, and student satisfaction with the serious game.

16.
This paper reports on a survey of 221 web sites chosen at random from a subset of the .co.uk area of the Internet. A breakdown of the types of business represented shows that whilst computing companies continue to have a sizeable presence on the web, certain other types of business now also have a relatively large number of sites. The survey found many media-related businesses and professional-based companies and practices, reflecting the increasing use of Information Technology in these areas but also the increasing knowledge about the Internet that is needed or helpful in many types of job. The results are consistent with general business use of the web in the UK still being in an experimental phase of web site creation by those who can rather than by those who should.

17.
This paper presents findings from quantitative analyses of UK press and print media coverage of evolutionary psychology during the 1990s. It argues that evolutionary psychology presents an interesting case for studies of science in the media in several different ways. First, press coverage of evolutionary psychology was found to be closely linked with the publication of popular books on the subject. Secondly, when compared to coverage of other subjects, a higher proportion of academics and authors wrote about evolutionary psychology in the press, contributing to the development of a scientific controversy in the public domain. Finally, it was found that evolutionary psychology coverage appeared in different areas of the daily press, and was rarely written about by specialist science journalists. The possible reasons for these features are then explored, including the boom in popular science publishing during the 1990s, evolutionary psychology's status as a new subject of study and discussion, and the nature of the subject as theoretically based and with a human, "everyday" subject matter.

18.
It is known that there is a significant interplay of insulin resistance, oxidative stress, dyslipidemia, and inflammation in type 2 diabetes mellitus (T2DM). The study was undertaken to investigate the effect of turmeric as an adjuvant to anti-diabetic therapy. Sixty diabetic subjects on metformin therapy were recruited and randomized into two groups (30 each). Group I received standard metformin treatment, while group II received standard metformin therapy with turmeric (2 g) supplements for 4 weeks. The biochemical parameters were assessed at the time of recruitment and after 4 weeks of treatment. Turmeric supplementation in metformin-treated type 2 diabetic patients significantly decreased fasting glucose (95 ± 11.4 mg/dl, P < 0.001) and HbA1c levels (7.4 ± 0.9 %, P < 0.05). The turmeric-administered group showed reduced lipid peroxidation, MDA (0.51 ± 0.11 µmol/l, P < 0.05), and enhanced total antioxidant status (511 ± 70 µmol/l, P < 0.05). Turmeric also exhibited beneficial effects on dyslipidemia: LDL cholesterol (113.2 ± 15.3 mg/dl, P < 0.01), non-HDL cholesterol (138.3 ± 12.1 mg/dl, P < 0.05) and the LDL/HDL ratio (3.01 ± 0.61, P < 0.01), and reduced the inflammatory marker hsCRP (3.4 ± 2.0 mg/dl, P < 0.05). Turmeric supplementation as an adjuvant in T2DM patients on metformin treatment had a beneficial effect on blood glucose, oxidative stress and inflammation.

19.
More than 20 years ago, electrical impedance spectroscopy (EIS) was proposed as a potential characterization method for flow cytometry. As the setup is comparatively simple and the method is label-free, EIS has attracted considerable interest from the research community as a potential alternative to standard optical methods, such as fluorescence-activated cell sorting (FACS). However, to this day, FACS remains by and large the laboratory standard, with highly developed capabilities and broad use in research and clinical settings. Nevertheless, can EIS still provide a complement or an alternative to FACS in specific applications? In this Perspective, we will give an overview of the current state of the art of EIS in terms of technologies and capabilities. We will then describe recent advances in EIS-based flow cytometry, compare their performance to that of FACS methods, and discuss potential prospects of EIS in flow cytometry.

20.
What's in a Name? Some Reflections on the Sociology of Anonymity (total citations: 1; self-citations: 0; citations by others: 1)
To paraphrase Mark Twain, reports of either the recent death or coming dominance of anonymity have been greatly exaggerated. This article is a beginning effort to lay out some of the conceptual landscape needed to better understand anonymity and identifiability in contemporary life. I suggest seven types of identity knowledge, involving legal name, location, symbols linked and not linked back to these through intermediaries, distinctive appearance and behavior patterns, social categorization, and certification via knowledge or artifacts. I identify a number of major rationales and contexts for anonymity (free flow of communication, protection, experimentation) and identifiability (e.g., accountability, reciprocity, eligibility), and suggest a principle of truth in the nature of naming, which holds that those who use pseudonyms on the Internet in personal communications have an obligation to indicate they are doing so. I also suggest 13 procedural questions to guide the development and assessment of any Internet policy regarding anonymity.
