Similar Documents
 20 similar documents found; search time: 281 ms
1.
Pedagogical agents in multimedia learning environments have frequently been designed to produce pointing (deictic) gestures to direct spatial awareness. Although this may benefit native English-speaking students who possess high levels of comprehension, relying only on pointing gestures with foreign language students learning English is problematic, because these students need more assistance with language comprehension than spatial direction alone. The purpose of this study was to explore how gesture type and gesture frequency affect foreign language students’ perception of the agent’s persona and their ability to recall procedural information. The results revealed one significant interaction between the average-gesture and no-gesture conditions on the facilitation subscale; all other conditions and persona subscales showed no significant differences. For learning outcomes, the enhanced gesture condition recalled significantly more information than the no-gesture condition (p = 0.017) and approached significance relative to the conversational gesture condition (p = 0.059). The findings suggest that when the learning population consists of foreign language students, pedagogical agents should use representational and beat gestures to help students comprehend more of the language, and gesture frequency should be increased to compensate for this population’s weaker verbal listening skills.

2.
The significance of pointing gestures in the development of linguistic communication is linked to their referential character and to the formation of common ground in the joint use of gestures and speech. Our longitudinal study aimed to define the nature of this relationship more precisely and to explore whether the relevance (vs. irrelevance) of a child’s pointing gestures is related to the development of language abilities. We developed a special protocol to measure relevant and irrelevant pointing gestures in 18-month-olds, sampled their production of spontaneous speech, and measured their language comprehension at two years of age. A group of 343 children was tested, and structural equation modelling showed that relevant gestures predict the level of development of language production and comprehension. As predicted, this association did not hold for irrelevant gestures. A child’s more frequent use of relevant pointing gestures likely helps the caregiver to recognize the child’s communicative intentions and to comment on his or her behaviour appropriately. The identified developmental/predictive relationship is valid under both mentalistic and teleological interpretations of early communicative development.

3.
This study examined whether poor pointing gestures and imitative actions at 18 months of age uniquely predicted late language production at 36 months, beyond the role of poor language at 18 months of age. Data from the Norwegian Mother and Child Cohort Study were utilized. Maternal reports of the children's nonverbal skills and language were gathered for 42,517 children aged 18 months and for 28,107 of the same children at 36 months. Panel analysis of latent variables revealed that imitative actions, language comprehension, and language production uniquely contributed to predicting late development of language production, while pointing gestures did not. It is suggested that the results can be explained by underlying symbolic representational skills at 18 months.

4.
Infants’ pointing gestures are a critical predictor of early vocabulary size. However, it remains unknown precisely how pointing relates to word learning. The current study addressed this question in a sample of 108 infants, testing one mechanism by which infants’ pointing may influence their learning. In Study 1, 18-month-olds, but not 12-month-olds, more readily mapped labels to objects if they had first pointed toward those objects than if they had referenced those objects via other communicative behaviors, such as reaching or gaze alternations. In Study 2, when an experimenter labeled an object the infant had not pointed to, 18-month-olds’ pointing was no longer related to enhanced fast mapping. These findings suggest that infants’ pointing gestures reflect a readiness and, potentially, a desire to learn.

5.
Previous research suggests that presenting redundant nonverbal semantic information in the form of gestures and/or pictures may aid word learning in first and foreign languages. But do nonverbal supports help all learners equally? We address this issue by examining the role of gestures and pictures as nonverbal supports for word learning in a novel (invented) language in a sample of 62 preschoolers who differed in language abilities, language background, and gender. We tested children’s ability to learn novel words for familiar objects using a within-subjects design with three conditions: word only; word + gesture; word + picture. Children were assessed on English translation, immediate comprehension, and follow-up comprehension one week later. Overall performance on the tasks differed by characteristics of the learners. The importance of considering the interplay between learner characteristics and instructional strategies is discussed.

6.
There have been many theories about how children learn to use language. Professor Narasimhan proposed a theory of child language acquisition based on behavioural pragmatics. In this article we present a simplified version of his theories about how children learn to communicate, and to describe, manipulate, and explore the world around them from exposure to a variety of language utterances and non-verbal inputs such as gestures and pointing. We also discuss the method he used to substantiate his ideas, and briefly present a computational model of the ideas arising from his work. Raman Chandrasekar has been with Microsoft Research, Redmond, USA, since 1998. His current research interests lie at the intersection of information retrieval, natural language processing, and machine learning. He received his PhD from TIFR, Bombay.

7.
In typical development, gestures precede and predict language development. This study examines the developmental sequence of expressive communication and the relations between specific gestural and language milestones in toddlers with autism spectrum disorder (ASD), who demonstrate marked difficulty with gesture production and language. Communication skills across five stages (gestures, word approximations, first words, gesture-word combinations, and two-word combinations) were assessed monthly by blind raters for toddlers with ASD participating in a randomized controlled trial of parent-mediated treatment (N = 42, 12–30 months). Findings revealed that toddlers acquired skills in a reliable (vs. idiosyncratic) sequence, and that the majority of toddlers combined gestures with words before combining words in speech; however, in contrast to the pattern observed in typical development, a significant subset acquired pointing after first words.

8.
Teachers' nonverbal behavior, an important component of classroom teaching behavior, plays a role in the teaching process that should not be overlooked. In the classroom, teachers' nonverbal behavior chiefly takes the form of hand gestures, facial expressions, body movements, paralanguage, and the use of space. It plays a significant role in supporting classroom instruction, stimulating students' interest in learning, regulating the pace of teaching, and strengthening the teacher-student relationship. In classroom teaching, teachers should consciously regulate their own nonverbal behavior, strengthen practical training to improve their nonverbal expression skills, and at the same time use such behavior in moderation, grasping the proper "degree".

9.
Learning and Instruction, 2002, 12(3): 285-304
Whereas anthropological and psychological studies have shown that gestures are a central feature of communication and cognition, little is known about the role of gesture in learning and instruction. Drawing from a large database on student learning, we show that when students engage in conversations in the presence of material objects, these objects provide a phenomenal ground against which students can enact metaphorical gestures that embody (give a body to) entities that are conceptual and abstract. In such instances, gestures are often subsequently replaced by an increasing reliance upon the verbal mode of communication. If gestures constitute a bridge between experiences in the physical world and abstract conceptual language, as we conjecture here, our study has significant implications for both learning and instruction.

10.
Gestures are a natural form of communication between preverbal children and parents which supports children's social and language development; however, low-income parents gesture less frequently, disadvantaging their children. In addition to pointing and waving, children are capable of learning many symbolic gestures, known as “infant signs,” if these are modeled by adults. The practice of signing with infants is increasingly popular in middle-income populations around the world, but it has not been examined as an intervention to promote positive qualities of the parent–child relationship. This study tested whether an infant sign intervention (ISI) encouraging low-income parents to use symbolic gestures could enhance the parent–child relationship. A final sample of twenty-nine toddlers and their families were followed for 7 months after assignment to the ISI or a control group. Children and mothers in ISI group families used more symbolic gestures than those in control families. Mothers in the ISI group were more attuned to changes in children's affect and more responsive to children's distress cues. Mothers in the intervention group also viewed their children more positively, reducing parenting-related stress. This study provides evidence that a simple infant sign intervention is an effective tool to promote bidirectional communication and positive interactions for preverbal children and their parents.

11.
It has previously been demonstrated that enactment (i.e., performing representative gestures during encoding) enhances memory for concrete words, in particular action words. Here, we investigate the impact of enactment on abstract word learning in a foreign language. We further ask whether learning novel words with gestures facilitates sentence production. In a within-subjects paradigm, participants first learned 32 abstract sentences from an artificial corpus conforming to Italian phonotactics. Sixteen sentences were encoded audiovisually. Another set of 16 sentences was also encoded audiovisually, but, in addition, each single word was accompanied by a symbolic gesture. Participants were trained for 6 days. Memory performance was assessed daily using different tests. The overall results support the prediction that learners have better memory for words encoded with gestures. In a transfer test, participants produced new sentences with the words they had acquired. Items encoded through gestures were used more frequently, demonstrating their enhanced accessibility in memory. The results are interpreted in terms of embodied cognition. Implications for teaching and learning are suggested.

12.
Two experiments examined the effects of a pedagogical agent's (PA's) pointing gestures, eye gaze, and eye contact on learning processes (measured by learners' eye fixations on relevant elements) and learning outcomes (as measured by retention and transfer test scores) with a multimedia lesson on neural transmission. In Experiment 1, having the PA look at and point to relevant elements as she lectured led to more eye fixations on the relevant portion of the graphic and better retention and transfer test scores. Keeping eye contact with learners tended to improve retention test scores and increased their eye fixations on relevant elements when the PA also looked at and pointed to the graphic. In Experiment 2, the PA's pointing gestures as a stand-alone feature caused better retention test scores and more fixations on relevant elements of the graphic, but eye gaze direction did not. These findings help extend the embodiment principle.

13.
This article examines how emergent bilingual students used gestures in science class, and the consequences of students’ gestures when their language repertoire limited their possibilities to express themselves. The study derived from observations in two science classes in Sweden. In the first class, 3rd grade students (9–10 years old) were involved in a unit concerning electricity. The second class consisted of 7th‐grade students (13–14 years old) working with acids and bases. Data were analyzed by using practical epistemological analysis (PEA). When students’ language proficiency limited their possibility to express themselves, using gestures resulted in the continuation of the science activities. Furthermore, both peers and teachers drew on the used gestures to talk about the science content. In some situations, the meaning of the gestures needed to be negotiated. Regardless, the gestures were always related to language. Both students and teachers participated in this process, but the teachers directed the communication toward the goal of the lessons: learning how to talk science. The study contributes to the field by showing the importance of paying attention to and valuing bilingual students’ use of gestures as a way to express scientific knowledge. In addition, it demonstrates how teachers might draw on students’ gestures to teach science and discusses the importance of creating multimodal learning environments. © 2017 Wiley Periodicals, Inc. J Res Sci Teach 55: 121–144, 2018

14.
Language and gesture are viewed as highly interdependent systems. Besides supporting communication, gestures also affect memory for verbal information, compared with purely verbal encoding, in native as well as foreign language learning. This article presents a within-subject longitudinal study lasting 14 months that tested the use of gestures in the classroom, with the experimenter presenting the items to be acquired. Participants learned 36 words distributed across two training conditions: in the audio-visual condition, subjects read, heard, and spoke the words; in the gestural condition, subjects additionally accompanied the words with symbolic gestures. Memory performance was assessed through cued native-to-foreign translation tests at five time points. The results show that gestures significantly enhance vocabulary learning, both in quantity and over time. The findings are discussed in terms of Klimesch's connectivity model (CM) of information processing, according to which a code (here, a word) is better integrated into long-term memory if it is deep, that is, comprised of many interconnected components.

15.
I. Introduction. In China, classroom oral instruction is the chief, or indeed the only, source of English learning, which is quite different from the natural ways of language learning. Teachers usually pay much attention to their language in teaching. However, language is not just composed of verbal or vocal forms. Instead, "language is a systematic means of communicating ideas or feelings by the use of conventionalized signs, sounds, gestures, or marks". Body

16.
Explanations are typically accompanied by hand gestures. While research has shown that gestures can help learners understand a particular concept, the learning effects of different types of gestures are less well understood. To address this issue, the current study focused on whether different types of gestures lead to different levels of improvement in understanding. Two types of gestures were investigated, and thus three instructional videos (two gesture videos plus a no-gesture control) on the subject of mitosis—all identical except for the type of gesture used—were created. After watching one of the three videos, participants were tested on their level of understanding of mitosis. The results showed that (1) differences in comprehension were obtained across the three groups, and (2) representational (semantic) gestures led to a deeper level of comprehension than both beat gestures and the no-gesture control. Finally, a language proficiency effect is discussed as a moderator that may affect understanding of a concept. Our findings suggest that teachers are encouraged to use representational gestures even with adult learners, but more work is needed to establish the benefit of using gestures for adult learners across subject areas.

17.
Early in development, many word‐learning phenomena generalize to symbolic gestures. The current study explored whether children avoid lexical overlap in the gestural modality, as they do in the verbal modality, within the context of ambiguous reference. Eighteen‐month‐olds’ interpretations of words and symbolic gestures in a symbol‐disambiguation task (Experiment 1) and a symbol‐learning task (Experiment 2) were investigated. In Experiment 1 (N = 32), children avoided verbal lexical overlap, mapping novel words to unnamed objects; children failed to display this pattern with symbolic gestures. In Experiment 2 (N = 32), 18‐month‐olds mapped both novel words and novel symbolic gestures onto their referents. Implications of these findings for the specialized nature of word learning and the development of lexical overlap avoidance are discussed.

18.
杜新秀 《时代教育》2009,(6):214-216
Taking a vocabulary analysis of the Guangzhou edition of primary school English textbooks as an example, this paper finds that the overall proportion of high-frequency words conforms to the regularities of language learning, but the total vocabulary exceeds the requirements of the curriculum standards, its distribution across grade levels is not well balanced, and repetition appears in the arrangement. It is therefore suggested that textbook compilers conduct quantitative analyses of vocabulary, taking into account total vocabulary size, the proportion of high-frequency words, and the placement of core vocabulary in the word list; it is further suggested that English teachers analyze the vocabulary themselves to form their own teaching word lists.

19.
Embodied cognition and evolutionary educational psychology perspectives suggest pointing and tracing gestures may enhance learning. Across two experiments, we examine whether explicit instructions to trace out elements of geometry worked examples with the index finger enhance learning processes and outcomes. In Experiment 1, the tracing group solved more test questions than the non-tracing group, solved them more quickly, made fewer errors, and reported lower levels of test difficulty. Experiment 2 replicated and extended the findings of Experiment 1, providing evidence for a performance gradient across conditions, such that students who traced on the paper outperformed those who traced above the paper, who in turn outperformed those who simply studied by reading. These results are consistent with the activation of an increasing number of working memory channels (visual, kinaesthetic and tactile) for learning-related processing.

20.
This study considers the role and nature of co-thought gestures when students process map-based mathematics tasks. These gestures are typically spontaneously produced, silent gestures that do not accompany speech and consist of small movements of the hands or arms, often directed toward an artefact. The study analysed 43 students (aged 10–12 years) over a 3-year period as they solved map tasks that required spatial reasoning. The map tasks were representative of those typically found in mathematics classrooms for this age group and required route finding and coordinate knowledge. The results indicated that co-thought gestures were used to navigate the problem space and to monitor movements within the spatial challenges of the respective map tasks. Gesturing was most influential when students encountered unfamiliar tasks or found the tasks spatially demanding. From a teaching and learning perspective, explicit co-thought gesturing highlights the cognitive challenges students are experiencing, since students tended not to gesture in tasks where the spatial demands were low.


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号