Bordegoni Monica
Full Professor
Politecnico di Milano
monica.bordegoni@polimi.it
Institutional website
SCOPUS ID: 6602375436
ORCID: 0000-0001-9378-8295
Scientific publications
Abstract: As the Metaverse gains popularity due to its use in various industries, so does the desire to take advantage of all its potential. While visual and audio technologies already provide access to the Metaverse, there is increasing interest in haptic and olfactory technologies, which are less developed and have been studied for a shorter time. Currently, there are limited options for users to experience the olfactory aspect of the Metaverse. This paper introduces an open-source kit that makes it simple to add the sense of smell to the Metaverse. The solution is modular, allowing for the simultaneous use of multiple odors and compatibility with both desktop and wearable applications. The details of the solution, including its technical specifications, are outlined to enable potential users to utilize, test, and enhance the project and make it available to the scientific community.
Keywords: Metaverse | olfactory display | smell | virtual and augmented reality environments
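As a rough illustration of how a host application might drive a modular, multi-channel scent kit of this kind, the sketch below sends a hypothetical "emit" command over a serial link; the command format, port name, and function name are illustrative assumptions, not the kit's documented interface.

```python
# Illustrative only: the command protocol (channel index + duration over serial)
# is an assumption, not the open-source kit's actual API.
import serial  # pyserial

def emit_odor(port: str, channel: int, duration_ms: int) -> None:
    """Ask the scent device on `port` to open one odor channel for `duration_ms`."""
    with serial.Serial(port, 115200, timeout=1) as device:
        # Hypothetical ASCII command, e.g. "EMIT 2 500\n" -> channel 2 for 500 ms.
        device.write(f"EMIT {channel} {duration_ms}\n".encode("ascii"))

if __name__ == "__main__":
    emit_odor("/dev/ttyUSB0", channel=2, duration_ms=500)  # port name is an example
```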
Abstract: The increasing interest in technologies such as Metaverse, Augmented Reality (AR), and Artificial Intelligence (AI) opens up numerous possibilities that have the potential to shape our daily experiences in the future. We investigated how integrating today’s popular chatbots and multisensory AR technologies can create personalized museum visiting experiences. The paper outlines the concept of a digital assistant-based application for museum visits. The visitor, equipped with AR technology, can ask questions about the exhibits and receive precise and contextually relevant answers from the digital assistant who can virtually appear as an avatar with accompanying multimedia AR information. In the paper, we describe the implementation of a prototype of the application.
Keywords: Augmented Reality | Chatbots | Metaverse | Museums
Abstract: Stroke affects approximately fifteen million people worldwide annually, with impaired hand function being one of its most common effects. Hemiparetic post-stroke patients suffer a mild loss of strength involving one side of their body: though not fully debilitating, it still impacts their everyday life activities. To prevent mobility deterioration, patients must perform well-focused and repetitive exercises during chronic rehabilitation. Virtual Reality (VR) emerges as an interesting tool in this framework, offering the possibility of training and measuring the patient’s performances in ecologically valid, engaging, and challenging environments. In recent years, there has been an increasing diffusion of accessible head-mounted displays that enhance the sense of realism and immersion in a virtual scene. To explore the feasibility and efficacy of VR immersion and game mechanics in rehabilitation programs, a VR system that allows users to rehabilitate their motor skills in a home-based environment has been designed and tested considering standard measures related to usability, immersion, workload, and simulator sickness, and with the involvement of rehabilitation experts. The results demonstrate how users and experts have received the application positively, highlighting the potential of VR applications for the future development of home-based rehabilitation programs.
Keywords: Exergames | Post-Stroke Rehabilitation | Virtual Reality
Abstract: Teaching technical contents related to product design and 3D modeling is a challenging task. Students enrolled in Master of Science courses often have different backgrounds and technical skills, due to their different educational paths at the bachelor’s level. From the student’s perspective, the main difficulties concern defining the 3D modeling procedure, understanding technical drawings, and figuring out the proper assembly sequence to obtain the final product. Over the years, some methods to fill these gaps have been tried, such as tutoring and Massive Open Online Courses. Although these passive approaches have proven useful for students, digital technologies, such as Virtual Reality, can boost the implementation of active learning sessions and hands-on experiences. In this paper, we present a Virtual Reality application for active learning developed to help students in the initial stage of the “Methods and tools for detailed design” course acquire missing knowledge, including technical terminology related to mechanical parts and components, the ability to decompose mechanical systems into sub-parts, and the ability to create a 3D model using a CAD tool from a 2D engineering drawing. To evaluate the effectiveness of the proposed approach, the application has been tested with a cohort of students with different backgrounds and knowledge of mechanical design and CAD.
Keywords: Active learning | hands-on experience | Virtual learning | Virtual Reality | Virtual Reality for education
Abstract: In an era characterized by the increasing complexity of products and the rapid turnover of the workforce across different companies, there is a growing need to invest significantly in quick and efficient training methods. Concurrently, the advancement of digitalization has rendered certain training practices anchored to paper-based materials obsolete. Numerous companies are directing their investments toward digital training, yet the optimal format to exploit the full advantages of digitalization remains unclear. This study undertakes a comparison of four distinct digital versions of the same training process with the aim of comprehending the tangible benefits. The findings indicate that to fully capitalize on the advantages of digital technology, a complete rethinking of training practices is necessary.
Keywords: virtual and augmented reality environments | virtual prototyping
Abstract: Addressing sustainability challenges requires shifts in consumption patterns and lifestyles. Design for Sustainable Behavior (DfSB) aims to cultivate sustainable attitudes and behaviors through product-based interventions. However, there can be a disconnect between design strategy and its embodiment and sometimes conflicts between designers’ intent and users’ interpretation. This paper explores the role of metaphors in DfSB in terms of using metaphorical thinking during the design process and/or creating product metaphors in the final design. It begins by identifying barriers that prevent people from engaging in sustainable practices, such as human nature and ambiguity in design. It then examines the roles of metaphor in design and its key strengths in DfSB. Furthermore, the paper outlines three methods to generate metaphors in DfSB: (1) The source domain implies the target domain. (2) The source domain serves design goals and strategies. (3) Cross-domain mapping is based on embodied experience. In conclusion, the paper discusses potential issues surrounding its use in DfSB.
Keywords: Behavior change | Design for sustainable behavior | Metaphor | Metaphorical thinking | Product design | Sustainability communication
Abstract: Nowadays, there is a pressing need to incorporate sustainable practices and promote environmental awareness. The fashion industry, characterized by resource-intensive consumption, significantly contributes to global sustainability threats. The authors proposed the adoption of Virtual Reality (VR) to design and develop the Fashion Footprint application, an experience aimed at enhancing sustainability awareness and fostering behavioral change. The findings suggest that VR technology is valuable in promoting sustainability awareness and driving positive behavioral shifts in the fashion industry.
Keywords: behavioural design | sustainability | virtual reality (VR)
Abstract: People's growing attention to aesthetics has led to a greater demand for dental whitening treatments. Several solutions can be utilized to obtain the desired visual whiteness of teeth but, according to the literature, at-home supervised treatments are the standard in dental bleaching. They require soft plastic trays to contain a whitening gel, with active chemical agents, and keep it in contact with the patient’s teeth. The fitting, comfort, and tightness of trays play a fundamental role in the treatment. Any gel leakage can compromise the effectiveness of the treatment and damage soft tissues. Commonly, the trays are ready-made or based on physical dental impressions and manually modified by the dental technician. These procedures have low repeatability and do not always ensure high accuracy. This work presents an automatic digital algorithm to design customized whitening trays. Starting from a digital scan acquisition of the patient’s dental arches, it generates the 3D models of the bespoke trays, in approximately two minutes per arch, ready to be produced by additive manufacturing and thermoforming technologies. The evaluation of the method involved 20 patients. The results emphasize that the custom trays were comfortable and ensured high levels of tightness and fitting.
Keywords: automatic product design | custom teeth trays | dental whitening | digital process
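The abstract above does not disclose the algorithm itself; as a minimal sketch of one step such a digital pipeline could include, the code below thickens a scanned dental-arch surface into a shell by offsetting it along vertex normals. The library choice (trimesh), thickness value, and file names are assumptions for illustration, not the authors' method.

```python
# Sketch only: offset a triangulated intraoral scan outward to obtain a tray-like shell.
import trimesh

def offset_shell(scan_path: str, thickness_mm: float = 1.0) -> trimesh.Trimesh:
    arch = trimesh.load(scan_path)                     # triangulated dental-arch scan
    outer = arch.copy()
    outer.vertices = arch.vertices + arch.vertex_normals * thickness_mm
    outer.invert()                                     # flip faces for consistent outward normals
    return trimesh.util.concatenate([arch, outer])     # open shell; rim closing omitted

if __name__ == "__main__":
    shell = offset_shell("upper_arch_scan.stl", thickness_mm=1.0)  # hypothetical file name
    shell.export("tray_shell.stl")                     # ready for further processing
```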
Abstract: Olfactory Displays are devices used to generate and deliver scented air that is eventually smelled by the users. As the literature reports, their development and evaluation mostly rely on experimental activities based on a “trial-and-error” approach, which prevents a comparative analysis of designed solutions and their technical performances, thus leading to prototypes with low potential to become future products. In this paper, an innovative framework embedding Computational Fluid Dynamics (CFD) simulations for designing, prototyping and testing new Olfactory Displays is proposed. After presenting the framework, the paper illustrates the settings for a multi-phase CFD analysis based on Discrete Particles Modeling for simulating olfactory displays. The design of a new wearable olfactory display is presented, detailing all the steps of the framework. A first architecture is devised, and an initial set of simplified 2D multi-phase CFD simulations has been used to propose possible improvements. A new design has been developed, and a 3D CFD simulation has been run to predict its performance. A set of experiments has been conducted to test the real prototypes and compare the performance with the one predicted by the simulations. The experimental results are in good accordance with the simulations, which have proven their effectiveness in improving the design of the olfactory displays.
Keywords: CFD | Olfactory Display | Rapid Prototyping | Virtual and Physical Prototyping
Abstract: eXtended Reality (XR) technology can enhance the visitors’ experience of museums. Because the available XR technologies differ in performance, quality of the experience they provide, and cost, it is helpful to refer to evaluations of the various technologies performed through user studies when selecting the most suitable ones. This paper presents a set of empirical studies on XR applications for museums aimed at selecting the technologies that best meet visitors’ expectations and maximise their willingness to repeat and recommend the experience. The studies provide valuable insights for developing Virtual Museum applications that increase the level of presence and the experience economy.
Keywords: Extended reality | Multisensory experience | Sense of smell | User experience | Virtual museum
Abstract: The demand for orthodontic and aesthetic treatments, aimed at having healthier teeth and more beautiful smiles, is steadily growing. The devices on which these treatments are based must be rigorously bespoke for each patient. This amplifies the need to develop digitized workflows, ranging from scanning to Additive Manufacturing (AM). The present work proposes an alternative workflow for designing and manufacturing orthodontic aligners, also known as clear aligners, starting from the intraoral scanning of the patient’s dentition. Orthodontic aligners are an alternative to metal brackets for correcting dental malocclusions, and patients often prefer them because of their lower impact on facial aesthetics and their higher comfort. Treatments based on aligners use a series of aligners, each with a geometry slightly different from the previous one. Each aligner applies a force to the teeth, gradually aligning them until the end of the treatment. The workflow we propose in the present study is based on three main stages: intraoral scanning of the patient’s dentition, design of the aligners through a semi-automatic algorithm, and direct additive manufacturing of the aligners through the VAT photopolymerization technique. The possibility of directly additive manufacturing the aligners allows us to rethink current orthodontic treatments. The aligner geometry can be redesigned, with the possibility of locally manipulating the thickness. This approach would allow the regulation of the amount of force applied locally to each tooth, thus optimizing the treatment and its duration. A feasibility study of the proposed workflow is reported in the present paper, with a focus on the semi-automatic design algorithm and on the additive manufacturing process of the aligners.
Keywords: Additive Manufacturing | Bespoke Medical Devices | Dental Appliances | Design Algorithms for Medical Applications | DfAM
Abstract: The decline of global bee populations is one of the environmental issues addressed in the United Nations Sustainable Development Goals agenda. Virtual Reality technologies can play a strategic role in raising awareness and education for sustainability by allowing users to visualize, demonstrate and emphasize information, making it more accessible. Virtual Reality can also elicit emotional involvement, which is important for influencing user behavior. The authors have designed and developed an educational Virtual Reality application called Colonies, which allows users to understand the issues related to the bee problem and empathize with bee colonies. The experience also offers suggestions for more sustainable behaviors.
Keywords: Behavioral research | Sustainable development
Abstract: This chapter presents selected case studies showcasing the design and testing of user experiences through prototypes created with eXtended Reality (XR) technologies. These case studies were developed by students participating in the Virtual and Physical Prototyping course at the School of Design, Politecnico di Milano. The projects assigned to student teams aimed to create prototypes prioritizing user interaction and experience, such as apps connecting young people with nature or aiding language learning. The students followed a simplified product development process, starting with defining the target audience and requirements, followed by concept development and design execution, and appropriate XR technologies selection for the implementation. The chapter presents a set of guidelines that outline this simplified product development process for crafting XR application prototypes. Subsequent chapters showcase how these guidelines were applied by the students in their respective case studies.
Keywords: Product design | Product development | Software prototyping
Abstract: This chapter presents three examples demonstrating the advantages of multisensory experiences in eXtended Reality applications. The first case study involves a training application designed to familiarize users with laboratory machinery, incorporating the senses of sight, hearing, and smell. The second case study concerns the utilization of multisensory experiences in Augmented Reality to facilitate language learning and enhance the exploration of a new city. The third one concerns a multisensory application including visual components, sounds, and odors for learning music.
Keywords: Augmented reality
Abstract: This chapter provides an overview of the fundamental concepts, methodologies, and interdisciplinary nature of User eXperience Design (UX Design). UX Design is a multidisciplinary field that focuses on creating meaningful and valuable experiences for users. This chapter explores the concept of experience and its relation to perception, emotions, and individual needs. It discusses the User Centered Design (UCD) philosophy, which places the user at the center of the design process. The chapter outlines the four main phases of the UCD process: specifying the user and context of use, specifying user requirements, producing design solutions, and evaluating designs. It emphasizes the importance of user participation throughout the design process and highlights various qualitative and quantitative methods used in each phase. The chapter also introduces different models and frameworks, including the Double Diamond model and the Design Thinking model, which provide structured approaches to problem-solving and innovation. Additionally, it addresses the misconception that UX Design is limited to digital products and emphasizes the broader scope of the field, encompassing both digital and physical interactions.
Keywords: Behavioral research | Product design
Abstract: This chapter presents three examples that demonstrate the effective integration of Reality and Virtuality, resulting in captivating applications that significantly enhance and improve the user experience through their seamless combination. The primary objective of the first case study is to explore and examine various methods of fostering connections among individuals through the utilization of Augmented Reality technology for collaborative art experiences. The second case study focuses on proposing a connection between humans and plants through Virtual Reality to enhance mental and physical well-being, as well as overall quality of life. The third case study focuses on leveraging Augmented Reality technology for maintenance operations.
Keywords: Arts computing | Augmented reality | User experience
Abstract: This chapter provides an overview of eXtended Reality (XR), a term that encompasses technologies such as Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR). XR allows the real world to be enriched with virtual data, objects, and information, offering varying levels of immersion and interaction. The chapter explores the components of XR and discusses classification frameworks, including Milgram’s Reality-Virtuality Continuum and the 3I model. It further delves into the developments and technologies of VR, highlighting stereoscopic vision, VR headsets and immersive walls. The chapter also explores AR, its accessibility through everyday devices, and its applications. The importance of considering all sensory modalities for creating immersive XR experiences is emphasized, and the role of multimodal interaction in enhancing user experiences is discussed. The chapter highlights the significance of a multisensory approach in achieving a sense of presence and explores examples of early immersive and multisensory technology. Finally, it discusses the impact of touch in human interaction and the incorporation of tactile sensations in VR applications, as well as of olfaction, which plays a significant role in human perception and memory, as certain smells can evoke strong emotional responses and trigger vivid recollections of past experiences. Incorporating odors into XR applications can enhance the overall sense of presence and realism by providing users with scent cues that align with the virtual environment. Olfactory displays are devices that release specific odors or scents to accompany virtual content.
Keywords: Behavioral research | Display devices | Mixed reality | Odors | Stereo image processing
Abstract: In this chapter, three examples are given of how Augmented Reality (AR) technologies can be used to promote sustainability. The initial case study involves an application that illustrates how the environment affects biochemical processes. The second case study features sustainable handmade toys that educate children on electronic circuits. The third one supports children in understanding the entire lifespan of a plant and learning its scientific microscopic processes.
Keywords: Augmented reality | Toys
Abstract: This chapter discusses the complex process of introducing new products to the market and the challenges involved. It highlights the importance of understanding user needs, both explicit and latent, in order to create products that meet customer requirements and provide a positive user experience. Failure to properly analyze user needs can lead to product dissatisfaction and costly design changes later in the process. The chapter emphasizes key requirements for new products, including style, functionality, quality, safety, and sustainability, and also explores the cost of design changes and the significance of addressing design problems early in the development process. It explains how the cost of fixing issues increases as the product development progresses and emphasizes the importance of timely validation and testing to minimize the cost impact of changes. Prototyping is identified as a valuable tool in the product development process for evaluating user needs and refining design concepts. It explains the iterative nature of prototyping and its role in eliminating unfeasible or undesirable solutions. The chapter discusses the purposes of prototypes, such as experimentation, learning, communication, testing, exploration, education, and research. Different types and scopes of prototypes are explained, including comprehensive and focused prototypes, as well as the concept of fidelity, which measures the degree of resemblance between the prototype and the final product. The chapter also provides an overview of various prototyping resources, such as sketches, wireframes, physical models, 2D and 3D renderings, interactive prototyping, and virtual prototyping using eXtended Reality technologies.
Keywords: Iterative methods | Product design | Three dimensional computer graphics
Abstract: This chapter provides insights into the dynamic nature of product design and the key factors influencing consumer perceptions and preferences in the modern era. It explores the multidimensional field of product design, which combines engineering, design, psychology, and other disciplines to create new products. It emphasizes the importance of understanding the concept of a product, which comprises both tangible and intangible elements that provide value to consumers. Today's products are complex systems that constantly evolve due to advancing technology and the demands of customers. The chapter discusses the shift in customer preferences towards personalized products and the challenges faced in manufacturing customized items, and highlights the three primary attributes of products: functions, style, and usability. The chapter also emphasizes the increasing importance of customization and personalization in product design, and explores the significance of efficient product development processes to meet market demands and user expectations.
Keywords: Advancing technology | Consumer perception | Consumers preferences | Customer preferences
Abstract: The industry's interest in Virtual and Augmented Reality (VR and AR) technologies dates back to their first appearance in the research world. Over the years, scholars have observed ups and downs, to which various factors contributed. In recent years these technologies, now known as eXtended Reality (XR), have returned to fascinate the industrial world, mainly because most of the related enabling technologies have improved to the point of pushing companies to re-invest in them. The introduction of approaches such as the digital twin and the recent hype around the metaverse also push in this direction. A few questions arise: what are the benefits of such technologies in industry today, and what are the unexplored possibilities? Starting from a systematic literature review and exploring the practical implications of integrating these technologies in the industrial field, the paper tries to answer these questions. The paper is not intended as a technological forecast but as a stimulus for future research.
Keywords: Industry 4.0 | Training | Virtual reality
Abstract: As technology advances, we are surrounded by more complex products that can be challenging to use and troubleshoot. We often turn to online resources and the help of others to learn how to use a product's features or fix malfunctions. This is a common issue in both everyday life and industry. The key to being able to use a product or fix malfunctions is having access to accurate information and instructions and gaining the necessary skills to perform the tasks correctly. This paper offers an overview of how artificial intelligence, digital twins, and the metaverse, currently popular technologies, can enhance the process of acquiring knowledge, know-how, and skills, with a focus on industrial maintenance. However, the concepts discussed may also apply to the maintenance of consumer products.
Keywords: computer-aided design | cyberphysical system design and operation | eXtended reality | maintenance | metaverse
Abstract: Nowadays, industrial training is gaining popularity, and in particular, Virtual Reality (VR) technologies are often adopted to recreate training experiences that can transform the learning process by exploiting the concept of learning by doing. Moreover, VR allows us to recreate realistic and immersive environments that provide trainees with hands-on experience in a safe and controlled setting. One of the advantages of using VR technologies for training simulations lies in the ability these tools offer to treat the different senses as independent variables and manipulate them one at a time. In this way, the effect of using one sense on the specific task under analysis can be examined in a controlled scenario while the others are held constant. One of the least used senses in VR is undoubtedly smell. In this paper, we use a new olfactory device specifically designed to be adopted with VR technologies, integrate it with an immersive VR training application, and propose a series of uses enabled by this combination. In the paper, we report the details of the multisensory approach that integrates the different senses. Additionally, the paper presents the findings of a pilot experiment conducted to validate the adopted approach and assess the results. The study demonstrates that the developed Olfactory Display had no adverse impact on users' interaction with VR content.
Keywords: Olfactory Display | Virtual Reality | Virtual Training
Abstract: This chapter provides final remarks on the book’s content and explores the promising future of prototyping in product design, particularly with the advent of emerging technologies such as the metaverse and Artificial Intelligence, which offer significant possibilities and advancements for the prototyping process.
Abstract: Immersive technologies are often used in museum visits due to their numerous advantages, including the possibility of enhancing Cultural Heritage dissemination, improving accessibility, and supporting learning activities. Despite the advantages, museum professionals may be reluctant to integrate immersive technologies into the museum visit due to their potential intrusiveness. In addition, a technology-driven approach is often used, which sometimes leads to scattered results and does not exploit the technological potential to meet the museum's objectives. Moreover, the design process for the creation of immersive exhibition visits is based on a trial-and-error approach rather than on specific guidelines regarding the use of immersive technologies in the museum context. The paper presents a study that investigates the immersive technology-related factors that influence the visitors’ experience in museum exhibitions and the immersive technology awareness, benefits, and hindering factors perceived by museum professionals. Specifically, the study focuses on the current design process for integrating immersive technology in museum exhibitions, which involves multidisciplinary professionals. It presents mixed methods that include experimental case studies, online surveys, semi-structured interviews, and a participatory action research activity. The result consists of a conceptual framework for a new collaborative design process that aims to facilitate the design of immersive museum exhibitions and, consequently, to help museums achieve their objectives and improve visitors’ experience.
Keywords: Design process | Extended Reality | Immersive technology | Museum exhibitions
Abstract: The Metaverse is defined as an interconnected network of 3D environments that enable multisensory interactions with virtual environments, digital objects, and people. A central component of a Metaverse is the realism of human interaction with the environment and each other, which is achieved through emotional expression. However, there are limitations in realistically representing human emotions and facial expressions in real time through virtual avatars. In this paper, the authors present a research project called Meta-EmoVis aimed at improving the realism of the Metaverse by detecting human facial expressions and translating them into emotional feedback that influences aspects of the virtual avatar and environment. The Emoj tool is used to collect data and map facial expressions to basic emotions, as well as detect the human engagement level. The goal is to create a world of real social interaction and emotional sharing in virtual spaces that are sensitive and adaptive to human emotional and mental states. Preliminary user tests to evaluate the effectiveness of the application are presented.
Keywords: Emotion | Metaverse | Virtual Reality
Abstract: The use of digital technologies in museums is becoming increasingly popular, with a growing trend towards using multisensory eXtended Reality (XR) technologies to enhance visitor engagement and involvement. XR technologies like Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) have the potential to bring numerous advantages to museums, including increased accessibility and inclusivity. By allowing visitors to experience a sense of immersion and actively engage with cultural artifacts, XR technologies can improve emotional engagement and learning activities. However, the integration of these technologies is often limited to temporary exhibitions, and there are concerns about the costs associated with implementation and the potential for technology to negatively impact the visitor experience. In addition to these challenges, evaluating the impact of multisensory XR technologies on the visitor experience is difficult. This paper outlines a laboratory-based research activity that compares the impact of traditional exhibitions and XR technologies on visitor experience and learning performance. The study simulates a museum exhibition and allows the user to interact with different paintings using different XR technologies in a controlled environment.
Keywords: Cultural Heritage | Multisensory XR technologies | Museum exhibitions
Abstract: Virtual Reality (VR) technologies are increasingly being adopted in fashion marketing to create innovative new channels for sales, advertising, brand communication, and strengthening relationships with consumers. In addition, virtual technologies can be useful for personalizing the offer, gathering insights, and engaging users. However, there seem to be few cases where the impact of virtual technologies on User eXperience (UX) and engagement is explored and the advantages compared to real-life experiences are evaluated. Analyzing these aspects for virtual fashion experiences can be challenging. For measuring the UX, quantitative methods such as sensors capable of recording and examining the body's responses and automatic systems used to recognize emotions by tracking facial movements and expressions have been increasingly employed. In the context of fashion marketing, they can help analyze the implications of virtual experiences, also considering content personalization and company promotion. This paper presents the design and development of a VR application that includes emotion-tracking technology that can be used to evaluate users’ emotional reactions to virtual experiences and compare them with real ones. The final aim of the study is to increase the degree of user engagement and the quality of the virtual fashion show UX of the Italian clothing brand Bacon. Preliminary test sessions were organized to evaluate users’ emotional reactions to the virtual experience and verify its effectiveness in relation to the objectives of the study.
Keywords: Emotion tracking | User eXperience | Virtual Fashion Show
Abstract: In recent years, interactive exhibitions based on digital technologies have become widespread, thanks to their flexibility and effectiveness in engaging visitors and creating memorable experiences. One of the topics for which digital technologies can be particularly effective is the communication of abstract concepts that are difficult for the human mind to imagine. An emblematic example is the discipline of astronomy, which requires us to imagine and understand phenomena far removed from our everyday life. In this paper, the authors present a research project, MARSS, in which digital technologies are used effectively to enhance the User Experience of the Museo Astronomico di Brera, located in Milan. Specifically, the MARSS project aims at designing and developing a new digital journey inside the museum to allow different categories of visitors to enjoy the exhibition in an engaging and interactive way. The paper presents the design and development phases of the experience and its evaluation with users. The results of the evaluation indicate that the digital interactive experience is appreciated by users and is successful in translating content of high scientific value into more engaging and easily understandable elements.
Keywords: augmented reality | cultural heritage | extended reality | interactive exhibitions | science museums | user experience
Abstract: This study presents an alternative process for designing and manufacturing customized trays for dental-whitening treatments. The process is based on a digitized approach consisting of three main stages: design of a reference model, its manufacturing by AM, and thermoforming of the tray. The aim of the study was to develop a high-performance tray, able to guarantee comfort, safety, and efficacy for whitening treatments. To evaluate the patient’s experience, some tests under real operating conditions were performed. Twenty people carried out a nighttime treatment of 14 days. Each patient was asked to assess the overall level of satisfaction and the comfort of the tray and its ability to retain the gel. Tooth whitening was also determined according to the VITAPAN scale. All patients involved in the study were satisfied and provided positive feedback about comfort and tightness of the tray. At the end of the treatment, 15 out of 20 patients achieved shade A1 on the VITAPAN scale. The mean improvement in color shades was about 7. These results confirmed the great potential of the proposed dental tray. Its use was proven to guarantee a high level of quality, flexibility, and customization of dental-whitening treatments, improving comfort, safety, and efficacy.
Keywords: additive technologies | bespoke dental trays | custom design | dental engineering | digital manufacturing | esthetic dentistry | tooth whitening
Abstract: The issue of training operators in the use of machinery is topical in the industrial field and in many other contexts, such as university laboratories. Training is about learning how to use machinery properly and safely. Beyond the possibility of studying manuals to learn how to use a machine, operators typically learn through on-the-job training. Indeed, learning by doing is in general more effective, tasks done practically are remembered more easily, and the training is more motivating and less tiresome. On the other hand, this training method has several negative factors. In particular, safety may be a major issue in some training situations. An approach that may contribute to overcoming these negative factors is using Virtual Reality and digital simulation techniques for operator training. The research work presented in this paper concerns the development of a multisensory virtual reality application for training operators to properly use machinery and personal protective equipment (PPE). The context selected for the study is a university laboratory hosting manufacturing machinery. The application allows users to navigate the laboratory, approach a machine, learn how to operate it, and use proper PPE while operating it. Specifically, the paper describes the design and implementation of the application and presents the results of preliminary testing sessions.
Abstract: Augmented Reality seems a promising tool for providing engaging and effective educational experiences, thanks to its potential to stimulate intrinsic motivation, which can influence the learning process and users' attitudes towards behaviours. This paper presents the Resized Plastic Augmented Reality learning experience, designed on the basis of Dunleavy's framework to provide a systemic overview of the microplastics issue, allowing users to understand its mechanisms, educating them about their role in the system, and helping them connect this information to their everyday actions.
Keywords: augmented reality (AR) | digital learning | sustainability | user experience
Abstract: The development of Virtual Reality in a wide range of fields, including engineering-related applications, has pushed the investigation of novel solutions that are able to take advantage of such new possibilities, while possibly integrating them seamlessly within currently established workflows. Regarding conceptual sketching, which commonly represents one of the first activities in Product Design development workflows, there are examples of applications that allow users to shift from the 2D layout of traditional drawing to a fully immersive 3D environment where the user is able to produce strokes in space by means of a set of natural gestures. Despite sounding extremely intuitive, this kind of approach also comes with potential issues: the lack of a supportive surface on which the user can rely to produce strokes with a high degree of precision, without feeling tired after prolonged sessions, can be problematic. Based on these premises, a new hybrid approach is proposed: the user is still immersed in the Virtual Environment, but uses a traditional tablet device lying on a physical desk to produce visible strokes in Virtual Reality, while having the possibility to simultaneously manipulate the position and orientation of the scene thanks to a hand-tracking device to break into the third dimension. As designed, the application supports the generation of simple line strokes and a few basic commands, but a thorough testing session is still needed to validate the solution and investigate the necessary improvements.
Keywords: Conceptual design | Product design | Virtual reality
Abstract: Design for Sustainability is a research area based on a multidisciplinary approach, which has become increasingly important in recent years. Among the several approaches to Design for Sustainability that have emerged in the past decades, great attention is paid to the “Design for sustainable behavior” approach, used to design products that can impact users' behaviors through embedded smart technologies, e.g., the Internet of Things (IoT). In fact, IoT systems are able to "dialogue" with the users, supporting the identification of any misbehavior and suggesting more sustainable alternatives. The authors identified the opportunity to design and develop interactive AR applications aiming to generate awareness about the impact of humans on Earth, make people reason about how they can directly contribute to limiting the expansion of this problem, and induce behavioral change. The applications are meant to engage users in an active process of exploring and discovering informative contents and to foster them to elaborate a personal and critical vision and change their bad habits. Specifically, this paper presents two case studies about the design and development of Augmented Reality applications and IoT products to be used for supporting users towards more conscious food consumption in their daily life, in order to reduce food waste.
Keywords: Augmented reality | Design for behavioral change | Design for sustainable behavior | Food waste
Abstract: eXtended Reality applications include multiple sensorial media to increase the quality of the User Experience. In addition to the traditional media of video and sound, two other senses are typically integrated: touch and smell. The development of applications that integrate multiple sensorial media requires a framework for properly managing their activation and synchronization. The paper describes a framework for the development of eXtended Reality mulsemedia applications and presents some applications based on the integration of smells developed using the framework.
Keywords: Users experiences | extended reality
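As a toy sketch of the kind of cue scheduling such a framework must handle, the code below registers sensory cues (scent, haptics, sound) against a shared media timeline and fires them in order; the class and method names are illustrative assumptions, not the framework described in the paper.

```python
import time
from typing import Callable, List, Tuple

class CueTimeline:
    """Minimal timeline that fires registered sensory cues at given offsets (seconds)."""
    def __init__(self) -> None:
        self._cues: List[Tuple[float, Callable[[], None]]] = []

    def add_cue(self, at_seconds: float, action: Callable[[], None]) -> None:
        self._cues.append((at_seconds, action))

    def play(self) -> None:
        start = time.monotonic()
        for at, action in sorted(self._cues, key=lambda cue: cue[0]):  # timeline order
            delay = at - (time.monotonic() - start)
            if delay > 0:
                time.sleep(delay)
            action()

if __name__ == "__main__":
    tl = CueTimeline()
    tl.add_cue(0.0, lambda: print("start video"))
    tl.add_cue(2.5, lambda: print("release scent: coffee"))  # stand-in for an olfactory display call
    tl.add_cue(4.0, lambda: print("haptic pulse"))
    tl.play()
```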
Abstract: This paper presents a case study regarding the simulation of a robotic workstation to virtually test object detection and obstacle avoidance. Testing these features inside a virtual environment is useful, especially when human-robot cooperation and interaction are involved. Indeed, it allows users to avoid real dangerous conditions, lowering the possible risk of injury to users and cutting down costs compared to a real testing environment. The work presented here exploits a framework where a virtual environment is connected to a Robot Operating System (ROS) able to simulate the kinematics of the robot and, on the other side, a physical ultrasonic sensor acts as the bridge with the real world. The latter, driven by an Arduino board, allows the virtual robot to recognize static obstacles in the real world, mapping the surrounding environment and computing a suitable trajectory to accomplish the given task. Thanks to this sensing capability, the virtual robot is also able to react to the presence of other obstacles (e.g. humans) entering the workspace at runtime. The seamless connection between the virtual and the physical world makes the framework suitable for the fast testing of new algorithms driving the behavior of the robot when interacting with dynamic environments.
Keywords: Costs | Human robot interaction | Object detection | Ultrasonic applications | Virtual reality
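A sketch of the kind of serial-to-ROS bridge node such a framework relies on is given below: it reads distances streamed by an Arduino-driven ultrasonic sensor and republishes them as sensor_msgs/Range messages. The topic name, serial port, baud rate, and message framing are assumptions for illustration, not the authors' implementation.

```python
import rospy
import serial  # pyserial
from sensor_msgs.msg import Range

def main() -> None:
    rospy.init_node("ultrasonic_bridge")
    pub = rospy.Publisher("obstacle_range", Range, queue_size=10)
    with serial.Serial("/dev/ttyACM0", 9600, timeout=1) as arduino:
        rate = rospy.Rate(20)                              # 20 Hz publishing loop
        while not rospy.is_shutdown():
            line = arduino.readline().decode("ascii", errors="ignore").strip()
            if line:
                try:
                    distance_m = float(line) / 100.0       # assume the Arduino streams centimetres
                except ValueError:
                    rate.sleep()
                    continue
                msg = Range()
                msg.header.stamp = rospy.Time.now()
                msg.radiation_type = Range.ULTRASOUND
                msg.range = distance_m
                pub.publish(msg)
            rate.sleep()

if __name__ == "__main__":
    main()
```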
Abstract: Memorization techniques are of primary importance in education. Two relevant and extensively used techniques are Concept maps and the method of loci. Both methods are based on the visualization of information, which helps memorization and retrieval. For these reasons, they are also considered inclusive learning tools for people with Specific Learning Disability. Augmented Reality is a technology that has gained popularity in many sectors, from industry to medicine, for its effectiveness in visualizing graphical items on top of real contexts. The paper demonstrates that Augmented Reality can also be beneficial for representing and interacting with Concept maps in a 3D virtual space that is linked to the real world. Specifically, the authors developed an Augmented Reality application in which the key features of both Concept maps (such as visual-spatial logic and concept organization) and the method of loci are integrated with those of Augmented Reality technologies to improve the learning process and the retrieval of information.
Keywords: Augmented Reality | Concept maps | virtual learning
Abstract: This paper describes the design and preliminary test of a virtual reality driving simulator capable of conveying haptic and visual messages to promote eco-sustainable driving behavior. The driving simulator was implemented through the Unity game engine; a large street environment, including high-speed and urban sections, was created to examine different driving behaviors. The hardware setup included a gaming driving seat, equipped with a steering wheel and pedals; the virtual scenarios were displayed through an Oculus Rift headset to guarantee an immersive experience. Haptic stimulation (i.e., vibrations) was delivered to the driver through the accelerator pedal, while visual stimuli (i.e., icons and colors) were shown on a virtual head-up display. The sensory feedbacks were presented both alone and in combination, providing information about excessive acceleration and speed. Four different virtual scenarios, each one including a distracting element (i.e., navigator, rain, call, and traffic), were also created. Ten participants tested the simulator. Fuel consumption was evaluated by calculating a mean power index (MPI) in reference to the sensory feedback presentation; physiological reactions and responses to a usability survey were also collected. The results revealed that the haptic and visuo-haptic feedback produced MPI reductions of 14% and 11%, respectively, compared with a condition of no feedback presentation, while visual feedback alone resulted in an MPI increase of 11%. The efficacy of haptic feedback was also accompanied by a more relaxed physiological state of the users, compared with the visual stimulation. The system’s usability was adequate, although haptic stimuli were rated slightly more intrusive than the visual ones. Overall, these preliminary results highlight how promising the use of the haptic channel can be in communicating and guiding the driver toward a more eco-sustainable behavior.
Keywords: Eco-driving | Haptics | Multisensory | Virtual reality
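The paper's exact definition of the mean power index is not reproduced in the abstract; as a back-of-the-envelope sketch of such a fuel-consumption proxy, the code below averages the positive traction power m·a(t)·v(t) over a drive, so the formula, vehicle mass, and speed profiles should be read as illustrative assumptions.

```python
from typing import List

def mean_power_index(speeds_mps: List[float], dt: float, mass_kg: float = 1300.0) -> float:
    """Average positive traction power (W) over a sampled speed profile."""
    total, samples = 0.0, 0
    for prev, curr in zip(speeds_mps, speeds_mps[1:]):
        accel = (curr - prev) / dt            # finite-difference acceleration
        power = mass_kg * accel * curr        # instantaneous traction power
        total += max(power, 0.0)              # count only propulsive effort
        samples += 1
    return total / samples if samples else 0.0

if __name__ == "__main__":
    steady = [20.0] * 60                                                  # constant 20 m/s
    stop_go = ([i * 2.0 for i in range(10)] + [20.0 - i * 2.0 for i in range(10)]) * 3
    print(mean_power_index(steady, dt=0.5))   # 0.0: no positive acceleration
    print(mean_power_index(stop_go, dt=0.5))  # much higher: repeated accelerations
```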
Abstract: Aims: The aim of the study was to show a case with a midline diastema in a patient with high periodontal risks and gingival recessions treated with clear aligners. The objective was to predict and quantify root movements using dedicated software that extrapolates data from Cone Beam Computed Tomography (CBCT). Case Presentation: A 31-year-old female with a mandibular midline diastema asked for an aesthetic treatment. She had vertical bone loss on the lower central incisors, so a CBCT was necessary in order to plan the root movements. The purpose of the treatment was to avoid uncontrolled tipping of the incisors and, therefore, a vestibular movement of the roots, which could cause serious periodontal problems. Conclusion: At the end of the treatment, complete closure of the diastema and radiographic healing of the vertical bone loss between the mandibular central incisors were achieved. The superimpositions with the virtual setup demonstrated a predictability of root movements of 76%.
Keywords: Clear aligners | Cone beam computed tomography | Midline diastema | Periodontal conditions | Periodontal disease | Virtual root setup
Abstract: The issue of training operators in the use of machinery is topical in the industrial field and in many other contexts, such as university laboratories. Training is about learning how to use machinery properly and safely. Beyond the possibility of studying manuals to learn how to use a machine, operators typically learn through on-the-job training. Indeed, learning by doing is in general more effective, tasks done practically are remembered more easily, and the training is more motivating and less tiresome. On the other hand, this training method has several negative factors. In particular, safety may be a major issue in some training situations. An approach that may contribute to overcoming these negative factors is using Virtual Reality and digital simulation techniques for operator training. The research work presented in this paper concerns the development of a multisensory Virtual Reality environment for training operators to properly use machinery and Personal Protective Equipment (PPE). The context selected for the study is a university laboratory hosting manufacturing machinery. An application has been developed that allows users to navigate the laboratory, approach a machine, learn how to operate it, and understand what PPE to use while operating it. Specifically, the paper describes the design and implementation of the application.
Keywords: Multisensory interaction | Virtual reality | Virtual training
Abstract: Nature has always been a source of inspiration for designers and engineers, through the imitation of biological patterns and structures. This emulating and creative process is nowadays supported by technologies and tools such as additive manufacturing and computational design. This paper describes the design and prototyping of a lamp inspired by a plant called Physalis Alkekengi, known as the Chinese Lantern. We present the development of an algorithm, based on a computational model from the literature, to realize the 2D pattern and the leaves. These were then 3D printed to create the structure of the lamp and obtain an aesthetic and symbolic shading effect.
Keywords: Bio-inspired design / biomimetics | Computational design methods | Design for Additive Manufacturing (DfAM)
Abstract: Design for Sustainability is a research area based on a multidisciplinary approach, which has become increasingly important in recent years. Great attention is paid to the design of products that can impact users' behaviours through embedded smart technologies, e.g. the Internet of Things (IoT). In fact, IoT systems are able to "dialogue" with the users, supporting the identification of any misbehaviour and suggesting more sustainable alternatives. This paper presents research aimed at supporting users towards more conscious food consumption in their daily life to reduce food waste. As a case study, an interactive system has been developed in which chicken eggs are used as the main communication element. Indeed, the environmental footprint of the egg industry is very heavy, and eggs are among the most wasted foods. The interactive system consists of a physical product, an egg tray, integrating sensors and actuators for handling the interaction with users. It is accompanied by an interactive application for monitoring egg consumption and displaying egg waste statistics, and an Augmented Reality part for children, aimed at improving their awareness of food waste and the impact of their food habits through an "edutainment" approach.
Keywords: Industrial design | Sustainability | Virtual reality
Abstract: The era of the fourth industrial revolution has fundamentally transformed the manufacturing landscape. Products are getting increasingly complex and customers expect a higher level of customization and quality. Manufacturing in the Era of 4th Industrial Revolution explores three technologies that are the building blocks of next-generation advanced manufacturing. The first technology, covered in Volume 1, is Additive Manufacturing (AM). AM has emerged as a very popular manufacturing process. The most common form of AM is referred to as “three-dimensional (3D) printing”. Overall, the revolution of additive manufacturing has led to many opportunities in fabricating complex, customized, and novel products. As the number of printable materials increases and AM processes evolve, manufacturing capabilities for future engineering systems will expand rapidly, resulting in a completely new paradigm for solving a myriad of global problems. The second technology is industrial robots, covered in Volume 2 on Robotics. Traditionally, industrial robots have been used on mass production lines, where the same manufacturing operation is repeated many times. Recent advances in human-safe industrial robots present an opportunity for creating hybrid work cells, where humans and robots can collaborate in close physical proximity. These cobots, or collaborative robots, have opened up the opportunity for humans and robots to work more closely together. Recent advances in artificial intelligence are striving to make industrial robots more agile, with the ability to adapt to changing environments and tasks. Additionally, recent advances in force and tactile sensing enable robots to be used in complex manufacturing tasks. These new capabilities are expanding the role of robotics in manufacturing operations and leading to significant growth in the industrial robotics area. The third technology, covered in Volume 3, is augmented and virtual reality. Augmented and virtual reality (AR/VR) technologies are being leveraged by the manufacturing community to improve operations in a wide variety of ways. Traditional applications have included operator training and design visualization, with more recent applications including interactive design and manufacturing planning, human and robot interactions, ergonomic analysis, information and knowledge capture, and manufacturing simulation. The advent of low-cost solutions in these areas is expected to accelerate the rate of adoption of these technologies in the manufacturing and related sectors. Consisting of chapters by leading experts from around the world, Manufacturing in the Era of 4th Industrial Revolution provides a reference set for supporting graduate programs in the advanced manufacturing area.
Abstract: The following sections are included: • Introduction • Related Works • VR Maintenance Training: A Case Study • Implementation of the VR Application • Experimental Setup and Testing • Analysis of Test Results • Discussion and Conclusion • References.
Abstract: Virtual and Augmented Reality technologies are being leveraged by the manufacturing community to improve operations in a wide variety of ways. Opportunities arise from multiple manufacturing engineering research areas which can potentially enable such technologies to provide both innovative support and solutions to improve product design, planning, manufacturing and support throughout the product life cycle. These not only extend to the manufacturing domain itself but also to other areas where engineering-type virtual manufacturing approaches are relevant to the tools and skill sets required in other domains, such as healthcare, health and safety, ergonomics, and education and training, to name but a few. Such solutions and technologies must be considered from the point of view of both the individual engineer and their needs as well as team-based engineering when collaborative working, design reviews and data/information provenance are necessary. All of the capabilities of real-time engineering interaction must be considered when interfacing with VR/AR models and associated data; this will facilitate rapid product development, agile manufacturing and product support. With careful virtual manufacturing system planning and development, including front-end design, this will enable the effective generation of downstream data and information as well as upstream feedback. These capabilities will be critical for the engineering of products through a range of contemporary concepts now very much relevant to manufacturing applications, such as digital twins, Industry 4.0, intelligent asset management, etc.
Abstract: This research aims to provide an example of how Virtual Reality technology can support the design and development process of Human Machine Interaction systems in the field of autonomous vehicles. Autonomous vehicles will play an important role in everyday life in the future. However, the relationship between the human user and the vehicle changes in the autonomous driving scenario, so a new interaction modality should be established to guarantee operational precision and the comfort of the user. As the sector is still under development, there are no mature guidelines for interaction design in autonomous vehicles. In the early phase of autonomous vehicle popularization, the first challenge is to build the user's trust in the autonomous vehicle. Keeping the autonomous vehicle's behavior highly transparent to the user is very helpful; however, it is not possible to communicate all the information that the vehicle's sensors are collecting, because doing so can create safety risks. In this research, two hierarchical Human Machine Interaction information systems have been introduced and a virtual reality scenario has been developed, based on the most popular application scenario: the autonomous taxi. Possible verification methods for applying the tool are also discussed, considering the current design and development procedures in industry, in order to give constructive help to researchers and practitioners in the field.
Keywords: Autonomous Vehicle | Fully autonomous vehicle | HMI design | Human Machine Interaction | Virtual Reality | VR
Abstract: This paper describes an innovative 3D-printed beam-based lightweight structure that is used to increase the adhesion strength of metal-composite joints without damaging the composite fibers. It is conceived as the interface between the two parts to be joined: by filling the voids of this structure with resin, a mechanical interlocking effect can be generated to enhance the mechanical properties of the junction. A dedicated design workflow was defined to explore different types of 3D beam-based structures, starting from the analysis of the main failure modes of this type of junction. Tensile tests were performed on both polymeric and metal samples to validate the effectiveness of this interlocking strategy. Results demonstrated an increase in the adhesion strength relative to standard adhesive joints. A possible practical implementation is also discussed: a new type of insert is presented for application in metal-to-polymer composite joints. Finally, such a beam-based joining approach also represents an innovative application in the field of design for additive manufacturing.
Keywords: Design for additive manufacturing | Lattice structures | Material extrusion | Metal-composite junctions | Powder bed fusion
Abstract: We present the design and test of a wearable device capable of detecting the user's trunk orientation with respect to the gravitational field and of providing tactile stimulation to correct tilted positions. Vibrations are delivered to the shoulders and to the frontal and dorsal parts of the trunk, using the human body as an indicator of the four cardinal directions. The device was experimentally tested in normal gravity conditions by thirty-nine volunteers. The efficacy of tactile cues was investigated in comparison to visual and visuo-tactile cues. The results revealed that, although the time needed to complete the task was shorter when people were guided by visual signals, the tactile cues were equally informative and, in some cases, the trunk spatial orientation was even more accurate. Overall, tactile cues were evaluated by users as more intuitive, effective and accurate.
Keywords: Haptics | Trunk orientation | Wearable device
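As an illustration of the correction logic described in this abstract, the following minimal Python sketch derives trunk tilt from a measured gravity vector and selects which of the four vibration motors to drive. The axis conventions, the 10-degree tolerance, and the motor names are assumptions made for the example, not the device's actual firmware.

```python
import math

def tilt_to_motor(ax, ay, az, threshold_deg=10.0):
    """ax, ay, az: gravity components in a trunk-fixed frame where z points
    up when the trunk is upright, x forward and y to the right (assumed)."""
    pitch = math.degrees(math.atan2(ax, az))   # forward/backward lean
    roll = math.degrees(math.atan2(ay, az))    # left/right lean
    if abs(pitch) < threshold_deg and abs(roll) < threshold_deg:
        return None                            # posture within tolerance: no cue
    if abs(pitch) >= abs(roll):                # correct the dominant tilt first
        return "front" if pitch > 0 else "back"
    return "right_shoulder" if roll > 0 else "left_shoulder"
```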
Abstract: The recent interest in human-robot interaction requires the development of new gripping solutions, beyond those already available and widely used. One of the most advanced solutions in nature is the human hand, and several research contributions try to replicate its functionality. Advances in manufacturing technologies and design tools are opening up new possibilities for the design of new solutions. The paper reports the results of the design of an underactuated artificial robotic hand, designed by exploiting the benefits offered by additive manufacturing technologies.
Keywords: 3D printing | additive manufacturing | design for additive manufacturing | mechatronics
Abstract: Within the scope of Design for Sustainable Behaviour, the connection between behavioural change strategies and design idea generation has received limited attention. This paper highlights metaphorical thinking in product design to stimulate sustainable behaviour. In particular, the current study proposes a metaphor-based design method to guide designers on how to associate product features with behavioural and experiential cues through metaphors. We next report two design cases to evaluate this method. In the end, the shortcomings of current research and future developments are also discussed.
Keywords: design methodology | ecodesign | human behaviour | product design | sustainable design
Abstract: During the concept phase of the industrial design process, drawings are used to represent the designer's ideas. More specifically, the designer's goal is to put the characteristics of ideas on paper so that they can later act as pivotal points in the development of a project. Sketching is also the ideal tool to continue developing an idea: because it is imprecise, the sketch guarantees a high degree of freedom, allowing changes to be made and new ideas to be added. Another possibility is to translate ideas into sketches using computer tools. This approach can allow the designer to use the created 3D model as the basis for further developing ideas. At present, however, this type of solution is not extensively used by designers during the concept phase. Some researchers have identified technical problems as the reason why these instruments have been unsuccessful on the market, while for others this is related to systems still too rigid to be adapted to the often-diverse needs of designers. The research presented in this position paper aims at analyzing what has so far been understood with respect to the process of generating ideas, their initial representation in the concept phase and the tools that have been developed so far to support this phase. Consequently, a discussion on these themes and some hypotheses from which to develop new research lines will be presented.
Abstract: Museums have been subjected to important changes in the approach they use to involve visitors. Among other trends, storytelling and interactive exhibitions are two of the most common approaches used to make exhibitions more interesting for users. Virtual Reality and Augmented Reality methods can be effectively used in the context of a museum exhibition to support both storytelling and interaction. The primary objective of the use of these technologies is to make museum visits much more engaging and suitable for different types of visitors. Among the several museums that are moving in this direction, there is the Museo Astronomico di Brera. The museum mainly consists of a corridor, hosting instruments used by astronomers, and the Cupola Schiaparelli, which is an observatory dome. The aim of the research presented in this paper is to develop an interactive Virtual Reality application to be used for improving the users' experience of visits to the Museo Astronomico di Brera. Specifically, the paper presents a VR application to virtually visit the Dome. Preliminary tests have been carried out to evaluate the users' sense of presence in the VR environment. An analysis of the collected data is presented in the paper.
Keywords: Augmented Reality | Museum exhibitions | User Experience | Virtual Reality
Abstract: In recent years, cultural and educational institutions have undergone important changes in the approach they use to involve the public. For example, museums are more and more playing a pedagogical role, referring not only to exhibitions of pieces of art, but also to exhibitions concerning current topics in cultural and social affairs. Storytelling and interaction are two of the most popular methods used to make exhibitions more interesting for visitors, and many works have demonstrated that Virtual Reality and Augmented Reality technologies can be effectively used to support these approaches in the context of museum exhibitions. This paper presents a research work aimed at designing and developing an interactive multisensory AR application (based on the senses of sight, hearing, and olfaction), which can be used for improving users' engagement in exhibitions and generating awareness about the dramatic outcomes of pollution on the environment. Specifically, the paper describes a case study of multisensory Augmented Reality interactive experiences concerning the negative effects of human activities on natural environments.
Keywords: Augmented Reality | Multisensory Perception | User Experience
Abstract: Kandinsky-Experience Book is a multisensory Augmented Reality experience that involves the senses of sight, hearing and smell and aims at improving the users' engagement with Kandinsky's artworks. Specifically, the aim of the application is to augment the user's experience by creating a journey throughout Kandinsky's work using an AR application for smartphones integrated with audio and olfactory stimuli, allowing him/her to be more immersed in the piece of art. The research project has been inspired by the synesthetic approach of the abstract painter to the theory and the perception of art in his books. Starting from the artist's considerations about the relationship between different sensorial stimuli in works of art, we decided to amplify some of his theories suggesting a connection between the main pictorial elements and some corresponding olfactory stimuli, grounding our hypotheses on the content of papers concerning the crossmodal synesthetic correspondences between olfactory stimuli and other sensorial modalities. Thanks to the simultaneous presentation of the specifically developed AR contents and the olfactory stimuli, the users' feelings and emotions during the experience are amplified as a result of the sensory integration. Moreover, by using AR technology and olfactory devices to stimulate the visual and olfactory perceptual channels, we aimed at increasing the generation of longer-lasting memories in the users' minds.
Keywords: Augmented Reality | Cultural Heritage | Multisensory Perception | User Experience
Abstract: The increasing concern for sustainability-related issues leads to the rise of new fields in design research, dedicated to limiting the negative impact of human activities on the environment and society. After addressing issues related to production, efficiency, recyclability and disassembly, designers have started to recognize their responsibility in guiding users to behave in a more responsible and sustainable way. For this reason, designing products to support users' behaviour change is becoming one of the most popular trends in design research at the moment. To achieve the desired results, design for behaviour change, and in particular Design for Sustainable Behaviour, exploits a variety of approaches. In this Chapter, we explore the use of Design for Sustainable Behaviour techniques through a literature review of theories and case studies. Then, we define a framework which describes the use of multisensory stimuli as elements to support different phases of interaction during the user experience with an interactive product. We relate this framework to previous works and then discuss two case studies.
Abstract: In order to increase safety on the road and improve the user experience in the vehicle, user studies have been conducted by researchers and practitioners in the automobile industry over the decades. Also, over time, the technology and design inside the car have changed and are leading to a faster, safer, and more comfortable user experience in driving, thanks to the results gained from these user studies. On the other hand, rapidly advancing automated driving technology poses new challenges to user studies in the validation of new technologies from the user's perspective, improving acceptance, ensuring correct usage, and so on. Laboratory driving simulation has become one of the main methods for user studies because of its safety, ease of control, and precision in scene reproduction. In this paper, a typical fixed-base driving simulator is introduced together with a user interaction model, in order to help researchers define the user study scope at each vehicle automation level and even predict potential user study issues in future autonomous vehicle technologies and scenarios. The strategy of the current study is to treat the different levels of vehicle automation differently. Three case studies are provided accordingly, from low-automated to semi-autonomous driving and eventually fully autonomous driving. Each one addresses some of the critical points that should be paid attention to in the user studies of the corresponding automation level, applying the previous model. In the low-automation condition, the case study showed the effectiveness of the proposed method in the verification of olfactory-modality interaction for maintaining the driver's attention. The case study in the semi-automation condition demonstrated the capacity of the method to capture the user's behavior changes in the take-over task, which is the most critical scenario in conditional autonomous driving. The last case study showed the possibility of conducting comfort-related user studies in the full-automation condition using the method, by monitoring the cognitive workload of users under different autonomous driving styles.
Keywords: Driving simulation | Driving simulator | User behavior studies | User experience | User studies
Abstract: Human beings interact with the external world through the perception that they get by touching, looking at, listening to, tasting and smelling it. Even if this exploration is essential to identify opportunities and dangers, today it is also used to investigate, understand and enjoy the objects that surround us and to use them to manage our life, have fun, increase our knowledge, relax, etc. To date, interaction is based primarily on sight, hearing and touch. Very little interaction, however, is based on smell, a sense considered very difficult to manage and to use for creating more pleasant and effective experiences. Yet, olfactory stimuli can make the interaction between users and objects more engaging and effective on sub-conscious levels and long-term memory. This can be particularly relevant for museums, which are undergoing important changes in the way they involve visitors in their exhibitions. This work presents two case studies of olfactory experiences integrated into applications for cultural heritage purposes, and effectively used to enhance the user interaction and experience.
Keywords: Augmented Reality | Interactive exhibitions | Multisensory experiences
Abstract: In Mass Customisation (MC), products are intrinsically variable, because they aim at satisfying end-users’ requests. Modular design and flexible manufacturing technologies are useful strategies to guarantee a wide product variability. However, in the eyewear field, the current strategies are not easily implementable, due to some eyewear peculiarities (e.g., the large variability of the frame geometry and material, and the necessity to use specific manufacturing phases). For example, acetate spectacle-frames are bent through a thermoforming process. This particular phase requires dedicated moulds, whose geometry strictly depends on the frame model to be bent; consequently, changes of the frame geometry continuously require new moulds, which have to be designed, manufactured, used, and finally stored. The purpose of this paper is to propose a new strategy to transform a dedicated tool (i.e., a thermoforming mould) into a reconfigurable one, to optimise the tool design, manufacturing and use. First, how the frame features influence the mould geometry has been investigated, creating a map of relations. On the basis of this map, the conventional monolithic-metallic mould was divided into “standard” (re-usable) and “special” (ad-hoc) modules, where the “special” ones are in charge of managing the variability of the product geometry. The mapped relations were formalised as mathematical equations and then, implemented into a Knowledge Based Engineering (KBE) system, to automatically design the “special” modules and guarantee the mould assemblability. This paper provides an original example of how a reconfigurable thermoforming mould can be conceived and how a KBE system can be used to this aim.
Keywords: Eyewear | KBE system | Mass customisation | Thermoforming mould
Abstract: The potentiality of the Fused Deposition Modeling (FDM) process for multi-material printing has not yet been thoroughly explored in the literature. That is a limitation considering the wide diffusion of dual-extruder printers and the possibility of increasing the number of these extruders. An exploratory study, based on tensile tests and performed on double-material butt-joined bars, was thus conceived; the aim was to explore how the adhesion strength between 3 pairs of filaments (TPU-PLA, PLA-CPE, CPE-TPU) is influenced by the material printing order, the type of slicing pattern used for the layers at the interface, and the infill density of the layers below the interface. Results confirm the effectiveness of mechanical interlocking strategies in increasing the adhesion strength even when the thermodynamic and diffusion mechanisms of adhesion are not robust enough. In addition, thermal aspects were also shown to play a relevant role in influencing the performance of the interface.
Keywords: design for additive manufacturing | fused deposition modelling (FDM) | multi-material adhesion | Multi-material printing | slicing parameters
Abstract: The paper describes the design of an innovative virtual reality (VR) system, based on a combination of an olfactory display and a visual display, to be used for investigating the directionality of the sense of olfaction. In particular, the paper describes the design of an experimental setup to understand and determine to what extent the sense of olfaction is directional and whether the sense of vision prevails over that of smell when determining the direction of an odor. The experimental setup is based on low-cost VR technologies. In particular, the system is based on a custom directional olfactory display (OD), a head mounted display (HMD) to deliver both visual and olfactory cues, and an input device to register subjects' answers. The paper reports the design of the olfactory interface as well as its integration with the overall system.
Abstract: Smart environments are a key challenge for current ICT research: they are one of the solutions that can enhance people's quality of life and enable users with impairments to live independently. Over the years, scientific research has proposed several solutions to help and improve the capabilities of their occupants, but they are often developed for a specific context (e.g. a particular disease or impairment). These systems do not adapt to the real needs of users with different profiles, and neglect that the user's requirements may evolve over time. This research work aims to develop a new adaptive smart system able to support users (with and without disabilities) in performing daily tasks by recognizing their preferences and actions and adapting the system feedback accordingly. With the aim of developing an easy, efficient and usable adaptive smart system, the final users have been involved in the whole design and development process. The system was validated through a virtual reality system, allowing the evaluation of user interaction and helping to improve usability.
Keywords: Adaptive and adaptable user interface | Bayesian network | ICT | Smart environment | User-centered design | Virtual reality system
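The keywords above mention a Bayesian network; purely as an illustration of the underlying adaptation idea, and not of the system's actual model, the sketch below applies a single Bayesian update to the probability that a user prefers a given feedback modality. The likelihood values and the accept/reject observations are invented.

```python
def update_preference(prior, p_obs_given_pref, p_obs_given_not):
    """Posterior probability that the user prefers a feedback modality,
    after observing one interaction event (plain Bayes rule)."""
    evidence = p_obs_given_pref * prior + p_obs_given_not * (1.0 - prior)
    return p_obs_given_pref * prior / evidence

# Example: start from an uninformative prior and update after each observed
# accept/reject of the proposed feedback (likelihoods are illustrative).
belief = 0.5
for accepted in (True, True, False, True):
    belief = update_preference(belief,
                               0.8 if accepted else 0.2,
                               0.3 if accepted else 0.7)
print(round(belief, 3))
```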
Abstract: Magika is an interactive Multisensory Environment that enables new forms of playful interventions for children, especially those with Special Education Needs. Designed in cooperation with more than 30 specialists at local care centers and primary schools, Magika integrates digital worlds projected on the wall and the floor with a gamut of "smart" physical objects (toys, ambient lights, materials, and various connected appliances) to enable tactile, auditory, visual, and olfactory stimuli. The room is connected with an interface for educators that enables them to: control the level of stimuli and their progression; define and share a countless number of game-based learning activities; customize such activities to the evolving needs of each child. This paper describes Magika and discusses its potential benefits for play, education and inclusion.
Keywords: Children | Education | Inclusion | Multisensory Environments | Play | Special Education Needs
Abstract: The use of odors in different areas, such as marketing, entertainment, wellbeing, and the arts, is nowadays attracting increasing attention. Various types of olfactory displays to be integrated into Virtual Reality applications have been proposed in recent years. However, several characteristics of the sense of smell and of odors still remain not fully understood, and the development of olfactory displays presents a high level of complexity. Therefore, new strategies for the development of more reliable and effective olfactory displays are required. This paper presents a wearable olfactory display, based on solid fragrances, to be integrated into multisensory applications for museum exhibitions. The motivation is that the use of odors can be particularly effective for improving the users' involvement, comprehension and experience of art exhibitions. The paper also describes the experimental setup for the evaluation of the display.
Keywords: Multisensory Application | Olfactory Display | Virtual Reality
Abstract: Augmented Reality (AR) is one of the most promising technologies for technical manuals in the context of Industry 4.0. However, the implementation of AR documentation in industry is still challenging because specific standards and guidelines are missing. In this work, we propose a novel methodology for the conversion of existing “traditional” documentation, and for the authoring of new manuals in AR, in compliance with Industry 4.0 principles. The methodology is based on the optimization of text usage with the ASD Simplified Technical English, the conversion of text instructions into 2D graphic symbols, and the structuring of the content through the combination of Darwin Information Typing Architecture (DITA) and Information Mapping (IM). We tested the proposed approach with a case study of a maintenance manual for hydraulic breakers. We validated it with a user test collecting subjective feedback from 22 users. The results of this experiment confirm that the manual obtained using our methodology is clearer than other templates.
Keywords: Augmented reality | Industry 4.0 | Maintenance support | Technical documentation
Abstract: The paper describes the design of a wearable and wireless system that allows the real-time identification of some gestures performed by basketball players. This system is specifically designed as a support for coaches to track the activity of two or more players simultaneously. Each wearable device is composed of two separate units, positioned on the wrists of the user, connected to a personal computer (PC) via Bluetooth. Each unit comprises a triaxial accelerometer and gyroscope, a microcontroller, installed on a TinyDuino platform, and a battery. The concept of activity recognition chain is investigated and used as a reference for the gesture recognition process. A sliding window allows the system to extract relevant features from the incoming data streams: mean values, standard deviations, maximum values, minimum values, energy, and correlations between homologous axes are calculated to identify and differentiate the performed actions. Machine learning algorithms are implemented to handle the recognition phase.
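To make the feature-extraction step concrete, here is a minimal Python sketch of the sliding-window computation described above (mean, standard deviation, maximum, minimum, energy, and a correlation term per window). The window size, the overlap, and the channel pairing chosen for the correlation are assumptions for illustration, not the system's actual parameters; feature vectors produced this way would then feed the machine-learning classifier handling the recognition phase.

```python
import numpy as np

def window_features(window):
    """window: array of shape (n_samples, 6) holding accelerometer (x, y, z)
    and gyroscope (x, y, z) samples from one wrist unit (assumed layout)."""
    feats = []
    feats.extend(window.mean(axis=0))           # mean values
    feats.extend(window.std(axis=0))            # standard deviations
    feats.extend(window.max(axis=0))            # maximum values
    feats.extend(window.min(axis=0))            # minimum values
    feats.extend((window ** 2).sum(axis=0))     # signal energy per channel
    for axis in range(3):                       # correlation between paired channels
        feats.append(np.corrcoef(window[:, axis], window[:, axis + 3])[0, 1])
    return np.asarray(feats)

def sliding_windows(stream, size=50, step=25):
    """Yield overlapping windows (50% overlap by default) from a sensor stream."""
    for start in range(0, len(stream) - size + 1, step):
        yield stream[start:start + size]
```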
Abstract: Littering is a highly diffused anti-environmental and anti-social behavior, especially among young people. Furthermore, cigarette butts are one of the most littered items and are responsible for both severe environmental damage and high clean-up expenses. The aim of this project is to design an interactive ashtray for the campus environment to limit cigarette butt littering behavior in an engaging and effective way. Qualitative and quantitative data were collected. Coded observations were implemented throughout the research process, including 2 pre (without the prototype) and 2 post (with the prototype) sessions. Also, a user experience test and one-to-one interviews were conducted to deepen the understanding of the littering phenomenon and the reasons behind this behavior among young people. The prototype indeed reduced cigarette butt littering among the observed behaviors of 156 students, especially in the male sample. Final results indicate that the behavior change of disposers is moderated by other factors, such as environmental cleanliness. Future developments are also discussed.
Keywords: Design for Behavior Change | Gamification | Multisensory product experience | Sustainability | User centred design
Abstract: The scarce availability of water in highly populated cities is about to become a social problem. While the water service companies work on improving the distribution network in order to reduce losses, it is evident that one of the main problems is due to an excess of use of this resource by users. This consumption is relatively controlled when excessive consumption is clearly associated, in the consumer's mind, with high costs. However, when users are in public places they tend to consume more water because the correlation with costs is lost. In this paper, we describe the design of a device to be installed in public environments, which aims to reduce the consumption of water. The device measures the flow of water in real time and sends the user visual and sound information, trying to create a link between consumption and costs. The device has been installed in a university campus bathroom and has been tested. Test results show a reduction in water consumption, especially in the interactive prototype approach compared to the conventional treatment. Further modifications for the future development of the interactive device are also discussed.
Keywords: Design for sustainable behavior | Multisensory product experience | Sustainability | User centred design | Water conservation
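A rough Python sketch of the consumption-to-cost feedback idea described above follows; the tariff, the update period, and the sensor/display callables are hypothetical placeholders rather than the device's actual implementation.

```python
import time

PRICE_PER_LITRE_EUR = 0.002   # illustrative tariff, not the paper's value

def feedback_loop(read_flow_lpm, show, period_s=1.0):
    """read_flow_lpm: callable returning the current flow (litres/minute);
    show: callable receiving (litres_used, cost_eur) and driving the
    visual/sound feedback; both are hypothetical interfaces."""
    litres = 0.0
    while True:
        litres += read_flow_lpm() * period_s / 60.0   # integrate flow over the period
        show(litres, litres * PRICE_PER_LITRE_EUR)    # update the consumption/cost cue
        time.sleep(period_s)
```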
Abstract: The control shifting between a human driver and a semi-autonomous vehicle is one of the most critical scenarios in the road-map of autonomous vehicle development. This paper proposes a methodology to study drivers' behaviour in semi-autonomous driving with driving simulators integrated with physiological sensors. A virtual scenario simulating take-over tasks has been implemented. The behavioural profile of the driver has been defined by analysing key metrics collected by the simulator, namely lateral position, steering wheel angle, throttle time, brake time, speed, and take-over time. In addition, heart rate and skin conductance changes have been considered as physiological indicators to assess cognitive workload and reactivity. The methodology has been applied in an experimental study whose results are crucial for gaining insights into users' behaviour. Results show that different individual driving styles and performances can be distinguished by processing and elaborating the data collected by the system. This research provides potential directions for establishing a method to characterize a driver's behaviour in a semi-autonomous vehicle.
Keywords: Evaluation | Semi-autonomous vehicle | Simulation | User behaviour | Virtual reality
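As a concrete illustration of how the listed metrics could be derived from a simulator log, the following Python sketch computes a take-over time together with simple lateral-position and speed statistics. The column names, thresholds, and data layout are assumptions made for the example, not the study's actual pipeline.

```python
import pandas as pd

def takeover_metrics(log: pd.DataFrame, request_time: float) -> dict:
    """log columns assumed: 'time' (s), 'lateral_pos' (m), 'steer_angle' (deg),
    'throttle' (0-1), 'brake' (0-1), 'speed' (km/h)."""
    after = log[log["time"] >= request_time]
    # first sample in which the driver acts on steering wheel, throttle or brake
    acted = after[(after["steer_angle"].abs() > 2.0) |
                  (after["throttle"] > 0.05) |
                  (after["brake"] > 0.05)]
    takeover_time = float(acted["time"].iloc[0] - request_time) if not acted.empty else None
    return {
        "takeover_time_s": takeover_time,
        "sd_lateral_pos_m": float(after["lateral_pos"].std()),
        "mean_speed_kmh": float(after["speed"].mean()),
    }
```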
Abstract: 3D printed heterogeneous lattice structures are beam-and-node based structures characterised by a variable geometry. This variability is obtained starting from a periodic structure and modifying the relative density of the unit cells, or by combining unit cells having different shapes. While several consolidated design approaches are available to implement the first strategy, there are still computational issues to be addressed to properly combine different cells. In this paper, we describe a preliminary experimental study focused on exploring the design issues to be addressed, as well as the advantages that this second type of heterogeneous structure could provide. The Three-Point-Bending test was used to compare the behaviour of different types of heterogeneous structures printed using the Fused Deposition Modeling (FDM) technology. Results demonstrated that the possibility of combining multiple unit cells represents a valid strategy for performing a more effective tuning of the material distribution within the design space. However, further studies are necessary to explore the behaviour of these structures and develop guidelines for helping designers exploit their potential.
Keywords: 3D printing | Design for Additive Manufacturing (DfAM) | Heterogeneous lattice structures | Lightweight design
Abstract: A large number of achievements in the autonomous vehicle industry have been obtained during the past decades. Various systems have been developed to make automated driving possible. Depending on the algorithm used in the autonomous vehicle system, the performance of the vehicle differs from one to another. However, very few studies have given insight into the influence of implementing different algorithms from a human factors point of view. Two systems based on two algorithms with different characteristics were utilized to generate two driving styles of the autonomous vehicle, which were implemented into a driving simulator in order to create the autonomous driving experience. Users' skin conductance (SC) data, which enable the evaluation of users' cognitive workload and mental stress, were recorded and analyzed. Subjective measures were applied by filling out the Swedish Occupational Fatigue Inventory (SOFI-20) to get users' self-reported view of their behavior changes along with the experiments. The results showed that humans' states were affected by the driving styles of the different autonomous systems, especially in periods of speed variation. By analyzing users' self-assessment data, a correlation was observed between the users' “Sleepiness” and the driving style of the autonomous vehicle. These results are meaningful for the future development of autonomous vehicle systems, in terms of balancing the performance of the vehicle and the user's experience.
Keywords: Autonomous vehicle | Driving style | Human behavior
Abstract: An area of research interest concerns the design of solutions for improving the life conditions of users in extreme environmental situations. An example is the spacecraft environment, where astronauts are subject to particular conditions due to the extreme environment. The isolated and confined environment influences behaviors and perceptions. This situation can affect astronauts' moods, cause states of depression, and impact their performance in working activities. A spacecraft can be the Space Station orbiting the Earth, or a future means of transportation used for travelling to other planets. In both cases the space should be designed so as to offer the best possible living and working conditions to the astronauts. The research presented in this paper aims at designing and developing a multisensory VR system for the entertainment and relaxation of astronauts. The use of VR technology allows us to overcome the physical and psychological boundaries of the confined space, which is typical of a spacecraft environment. The sense of smell, which is more linked to visceral emotions than the other senses and can affect various aspects of humans' physiological and psychological conditions, is used to improve astronauts' productivity and concentration, and also to relieve their stress and anxiety.
Keywords: Entertainment | Multisensory environment | Relaxation | Scents simulation | Virtual Reality
Abstract: The use of collaborative robots in the manufacturing industry has widely spread in the last decade. In order to be efficient, the human-robot collaboration needs to be properly designed by also taking into account the operator’s psychophysiological reactions. Virtual Reality can be used as a tool to simulate human-robot collaboration in a safe and cheap way. Here, we present a virtual collaborative platform in which the human operator and a simulated robot coordinate their actions to accomplish a simple assembly task. In this study, the robot moved slowly or more quickly in order to assess the effect of its velocity on the human's responses. Ten participants tested this application by using an Oculus Rift head-mounted display; ARTracking cameras and a Kinect system were used to track the operator's right arm movements and hand gestures respectively. Performance, user experience, and physiological responses were recorded. The results showed that while humans’ performances and evaluations varied as a function of the robot’s velocity, no differences were found in the physiological responses. Taken together, these data highlight the relevance of the kinematic aspects of robot’s motion within a human-robot collaboration and provide valuable insights to further develop our virtual human-machine interactive platform.
Keywords: Human-robot collaboration | Stress | Virtual reality | Workload
Abstract: More and more modern digital applications allow users to have experiences that engage their senses. More traditional applications allow users to have visual and auditory experiences. Recently, the sense of touch has been introduced to enrich users' experiences with digital worlds. The sense of smell is equally important for enriching experiences and making them engaging, but it has been largely neglected so far, mostly because of the limited knowledge about olfaction and olfactory technologies. This chapter presents a methodology for the development of applications including multisensory user experiences based also on the sense of olfaction. The methodology has been used and tested for the development of applications in various sectors, which are reported in the chapter.
Abstract: Virtual, Augmented and Mixed Reality technologies are attracting more and more attention from tourism researchers and professionals, because of their recognized potential to support marketing activities. The paper describes the development of a multisensory environment conceived for a travel agency, which combines visual, auditory, tactile and olfactory stimuli. The idea is to develop an experience able to provide a virtual preview of the desired holiday destination, resulting in both an attractive experience for the customer and an effective way to increase sales. A case study about the multisensory experience of a walk in the Italian Alps has been developed. The multisensory experience is based on a video stream, recorded in the real environment, synchronously matched with a haptic interface. The haptic interface is made up of a pair of slippers provided with actuators, plus an actuator positioned on the customer's trunk, used to reproduce the feeling of being hit by a snowball. Moreover, an olfactory display is used to provide pine smell during the walk. During the experience, the user sits on a yoga ball, whose inclination allows him/her to start and stop the multisensory virtual experience.
Keywords: Multisensory simulation | User experience | Virtual prototyping | Virtual tourism
Abstract: This review focuses on the design process of additively manufactured mesoscale lattice structures (MSLSs). They are arrays of three-dimensional (3D) printed trussed unit cells, whose dimensions span from 0.1 to 10.0 mm. This study intends to detail the phases of the MSLSs design process (with a particular focus on MSLSs whose unit cells are made up of a network of struts and nodes), proposing an integrated and holistic view of it, which is currently lacking in the literature. It aims at guiding designers' decisions with respect to the settled functional requirements and the manufacturing constraints. It also aims to provide an overview for software developers and researchers concerning the design approaches and strategies currently available. A further objective of this review is to stimulate researchers in exploring new MSLSs functionalities, consciously considering the impact of each design phase on the whole process, and on the manufactured product.
Keywords: additive manufacturing | design for additive manufacturing | design process | mesoscale lattice structures | multifunctional lattice structures
Abstract: Advanced driver assistance systems (ADASs) provide information through visual, auditory, and haptic signals to achieve multidimensional goals of mobility. However, processing information from ADAS imposes a mental workload that drivers incur from their limited attentional resources. Changes in driving conditions can modulate drivers' workload and potentially impair drivers' interaction with ADAS. This paper shows how measures of cardiac activity (heart rate and indexes of the autonomic nervous system (ANS)) can discriminate the influence of different driving conditions on drivers' workload, associated with the attentional resources engaged while driving with ADAS. Fourteen drivers performed a car-following task with visual ADAS in simulated driving. Drivers' workload was manipulated in two driving conditions: a monotonous condition (constant speed) and a more active condition (variable speed). Results showed that drivers' workload was similarly affected, but the amount of attentional resource allocation was slightly distinct between the two conditions. The analysis of the main effect of time demonstrated that drivers' workload increased over time without alterations in autonomic indexes, regardless of driving condition. However, the main effect of driving condition produced a higher level of sympathetic activation in variable-speed driving compared to driving at constant speed. Variable-speed driving requires more adjustment of steering wheel movement (SWM) to maintain lane-keeping performance, which led to a higher level of task involvement and increased task engagement. The proposed measures appear promising for helping design new adaptive working modalities for ADAS that account for variations in driving conditions.
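For readers unfamiliar with such cardiac measures, the sketch below computes a few standard heart-rate-variability indexes from a series of inter-beat (RR) intervals. It is a generic illustration under assumed units (milliseconds), not the specific autonomic indexes used in this study.

```python
import numpy as np

def cardiac_indexes(rr_ms):
    """rr_ms: sequence of inter-beat (RR) intervals in milliseconds."""
    rr = np.asarray(rr_ms, dtype=float)
    hr = 60000.0 / rr                                   # instantaneous heart rate (bpm)
    sdnn = float(rr.std(ddof=1))                        # overall heart-rate variability
    rmssd = float(np.sqrt(np.mean(np.diff(rr) ** 2)))   # short-term (vagally mediated) variability
    return {"mean_hr_bpm": float(hr.mean()), "sdnn_ms": sdnn, "rmssd_ms": rmssd}
```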
Abstract: Designing for pleasurable and engaging product experiences requires an understanding of how users will experience the product, sometimes at a very abstract level. This focus on user experiences, rather than on the formal qualities of the product, might cause difficulties for designers in the materialization of design ideas. Designers need to navigate through several choices, shaping and refining the product qualities in order to elicit the intended experience. To support this process, we propose a tool, the Experience Map, guiding designers in the progressive transformation of an ‘experiential vision’ into tangible formal qualities, considering all the opportunities perceived by the different senses. The paper presents the results of two studies in which we verified the potential of the Experience Map, first in a workshop with design students and second in four design cases with professional designers. The results show that the Experience Map can provide a good structure to organize creative thoughts and progressively decrease the level of abstraction, particularly to support novice designers. It stimulates greater confidence and awareness of design decisions, while allowing the exploration of several design directions in parallel. These benefits, together with the visually stimulating layout and its ability to foster awareness on design decisions, make the Experience Map an effective tool to support experience-driven design practice, especially in the early phases of the creative process and in the educational context.
Keywords: Design intentions | Design process | Experience design | Experience map | Multi sensory design
Abstract: Emotional expression is an important human behavior, which enriches communication. Sensory organs play a crucial role in emotional perception. Today, communication is mostly done via digital mediators, which dominantly address vision and exclude other senses; therefore, communication becomes less affective. Wearable technology provides an intimate and continuous interaction with sensory organs. Hence, this technology can be used to make distant communication more affective by enabling multi-sensory interaction. This article presents a user-centered design practice on wearable products that simulate sensorial feedback (tactile, visual, and auditory) to express basic emotions. Three prototypes that transmit emotional messages were designed, built, and tested to observe user behavior. This article discusses how the user experience obtained through the user tests can be taken further to design new communication products that can provide solutions for different user needs.
Keywords: design practice | emotion | multi-sensory interaction | sensory perception | wearable technology
Abstract: The evolution of technical documentation in the age of Industry 4.0 is moving towards the use of visual manuals, in particular exploiting Augmented Reality (AR) technology. Traditional manuals are rich in text instructions, which are not advisable in AR applications. In fact, text occludes the real scene behind it and poses an issue for translation. For this reason, we propose to create and adopt a controlled and exhaustive vocabulary of graphical symbols, to be used in AR to represent maintenance instructions. In particular, in this work we identified the most frequent maintenance actions used in manuals and converted them into graphical symbols. Then, we carried out an elicitation of the designed symbols and created different candidate vocabularies of symbols based on the criteria of guessability and homogeneity found in the literature. Moreover, the vocabularies had to respect two constraints: conflict set and reversibility. Finally, we identified the best set of symbols and integrated it into a real AR application for remote maintenance.
Keywords: Augmented Reality | Industry 4.0 | Maintenance
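One way to automate the kind of vocabulary selection described above could be a greedy pick of the highest-guessability symbol per action, subject to the conflict-set constraint and a reversibility check. The Python below is purely illustrative of that idea and is not the authors' actual procedure; all data structures are invented.

```python
def select_vocabulary(candidates, conflicts, reverse_of):
    """candidates: {action: [(symbol, guessability_score), ...]}
    conflicts: set of frozensets of symbols that must not co-occur
    reverse_of: {action: reverse_action} for reversible action pairs"""
    chosen = {}
    for action, options in candidates.items():
        for symbol, _score in sorted(options, key=lambda o: -o[1]):
            taken = set(chosen.values())
            clash = any(frozenset({symbol, s}) in conflicts for s in taken)
            if symbol not in taken and not clash:
                chosen[action] = symbol      # keep the best non-conflicting symbol
                break
    # reversibility: both actions of a reversible pair must receive a symbol
    reversible_ok = all(rev in chosen for act, rev in reverse_of.items() if act in chosen)
    return chosen, reversible_ok
```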
Abstract: Eco-driving assistance devices are being introduced to reduce CO2 emissions, but the overall changes in user behavior have not been sufficiently explored. While advanced in-vehicle driver systems are designed to support a single driving task (e.g. reducing emissions), they also imply the adoption of a different driving behavior and different driving attitudes in order to be efficient. Adopting a new driving style for the first time could affect the driver's acceptance and undermine the efficacy of new technologies. The purpose of the present research is to measure and evaluate users' responses to the first-time use of eco-driving assisting technology. Drivers' performances in a virtual simulator were compared between an experimental and a control group. The actual driving parameters and CO2 emissions were recorded and compared to the optimal eco-driving style calculated by the CarMaker software. The cognitive costs of the new driving style were measured by changes in the modulation of the autonomic nervous system and by the NASA-TLX workload scale. Acceptance of the assisted driving style and general eco-friendly attitudes were analyzed by self-reported measures. Results show that being exposed for the first time to eco-driving technology produces a reduction of cumulative fuel consumption due only to speed reduction, and not to changes in the driving style parameters recommended by the assisting software. The overall CO2 emissions of the eco-driving group were not different from those of the control group. Rather, the first-time use of the eco-driving assistance increased perceived fatigue and shifted the cardiac autonomic balance in a way related to increased workload over time. These difficulties show that an eco-driving style cannot simply be adopted by following the assistance device indications. It is rather a process, which requires specific support during the first interaction with eco-driving technology. The design of assistance devices that aim to change the driving style could benefit from the measurement of the user's workload, to avoid a primacy effect that could potentially undermine the technology's efficacy in supporting user-sustainable behaviors.
Keywords: Acceptance | Assistance technologies | CO2 emissions | Human-machine interaction | Pro-environmental behavior | Workload
Abstract: The successful combination of aesthetic and engineering specifications is a long-standing issue. The literature reports some examples where this problem was addressed by developing tools to support the automatic generation of new product shapes, embedding and linking predefined rule-sets. Nevertheless, these kinds of tools are effective if and only if the relations among these specifications are known. Other complementary strategies act upstream by building a common ground: they aid in the formalisation of these specifications, fostering the use of a shared language and the same level of detail. This paper lies in between the previous approaches, since its purpose is the description of a strategy to formalise the relations among aesthetic and engineering specifications whose validity is not affected by product variability. Indeed, fashion-driven products are subject to continuous innovations and changes; therefore, the identification of these predefined rule-sets is challenging. In detail, the paper's objective is to build a high-level and long-lasting formalisation of these relations, based on topological and functional rules. To demonstrate the effectiveness of this approach, we developed a case study in the eyewear industry. We started by considering the spectacle-frame functionality and derived the high-level formulation linking aesthetic and engineering specifications. We used this formulation to generate an abstraction of the frame geometry, i.e., an archetype, to be used as a reference for the design of new collections. We implemented the archetype through a MATLAB script, and we translated it into a design tool, namely an Excel spreadsheet. The validity of both the archetype and the tool has been tested, in collaboration with an eyewear manufacturer, by designing and manufacturing two new models of frames.
Keywords: Archetype | Design methods | Design specifications | Design tool | Eyewear industry | Product variability
Abstract: This research aims to design and develop an innovative system, based on an olfactory display, to be used for investigating the directionality of the sense of olfaction. In particular, the paper describes the design of an experimental setup to understand and determine to what extent the sense of olfaction is directional and whether the sense of vision prevails over that of smell when determining the direction of an odor. The experimental setup is based on low-cost Virtual Reality (VR) technologies. In particular, the system is based on a custom directional olfactory display, an Oculus Rift Head Mounted Display (HMD) to deliver both visual and olfactory cues, and an input device to register subjects' answers. The VR environment is developed in Unity3D. The paper describes the design of the olfactory interface as well as its integration with the overall system. Finally, the results of the initial testing are reported.
Abstract: 3D virtual reconstruction of human body parts is nowadays a common practice in many fields, such as medicine, the manufacturing of customized products, or the creation of personal avatars for gaming purposes. The acquisition can be performed with an active stereo system (i.e., laser scanner, structured light sensors) or with a passive image-based approach. While the former represents a consolidated approach in human modeling, the latter is still an active research field. Usually, the reconstruction of a body part through a scanning system is expensive and requires projecting light onto the patient's body. On the other hand, the image-based approach can use multi-photo techniques to reconstruct a real scene and provides some advantages: low equipment costs (only one camera) and a rapid acquisition process of the photo set. In this work, the use of the photogrammetry approach for the reconstruction of the human face has been investigated as an alternative to active scanning systems. Two different photogrammetric approaches have been tested to verify their potential and their sensitivity to configuration parameters. An initial comparison between them has been performed, considering the overall number of points detected (sparse point cloud reconstruction, dense point cloud reconstruction). Besides, to evaluate the accuracy of the reconstruction, a set of measures used in the design of wearable head-related products has been assessed.
Abstract: Both physical and virtual prototyping are core elements of the design and engineering process. In this paper, we present an industrial case-study in conjunction with a collaborative agile design engineering process and "methodology." Four groups of heterogeneous Post-doc and Ph.D. students from various domains were assembled and instructed to fulfill a multi-disciplinary design task based on a real-world industry use-case. We present findings, evaluation, and results of this study.
Keywords: Augmented reality (AR) | Collaborative design | Engineering design | Virtual prototyping | Virtual reality (VR)
Abstract: In this paper, we present the development of an application that allows us to simulate the multisensory experience of tasting a glass of wine. To this end, technologies for the senses of touch, sight, hearing, and smell have been integrated, creating an interactive multisensory experience. The user, after picking up a glass, activates an application in which he is guided by a virtual sommelier through a tasting, and is able to perceive the multisensory experience of wine tasting. The paper describes the application and its potential use in marketing.
Keywords: Multisensory product experience | User experience | Virtual reality (VR)
Abstract: The growing potential of virtual reality (VR) systems presents an immense opportunity for amputee patients suffering from Phantom Limb Pain (PLP), a medical condition which causes the sensation of pain at the location of the missing limb. The occurrence rate of PLP is reported to be 60–80% [1] among amputee patients, and treatment methods vary between physical therapy, surgery and medication. One of the proven treatment methods is “Mirror Therapy” [3], which is applied by visually projecting a healthy limb onto the amputated part with the help of a mirror to create the perception of presence. A similar approach can also be reproduced in a virtual environment, and the immersion of VR can enhance the rehabilitation experience. Therefore, we aimed to improve the mirror therapy treatment with a VR application and to overcome its physical limitations by presenting potentially engaging activities and a way to treat double-limb amputees.
Keywords: Mirror therapy | Phantom Limb Pain | Tele-rehabilitation
Abstract: Despite the large number of studies on the multisensory aspects of tactile perception, very little is known regarding the effects of visual and auditory sensory modalities on the tactile hedonic evaluation of textures, especially when the presentation of the stimuli is mediated by a haptic device. In this study, different haptic virtual surfaces were rendered by varying the static and dynamic frictional coefficients of a Geomagic® Touch device. In Experiment 1, the haptic surfaces were paired with pictures representing everyday materials (glass, plastic, rubber and steel); in Experiment 2, the haptic surfaces were paired with sounds resulting from the haptic exploration of paper or sandpaper. In both the experiments, participants were required to rate the pleasantness and the roughness of the virtual surfaces explored. Exploration times were also recorded. Both pleasantness and roughness judgments, as well as the durations of exploration, varied as a function of the combinations of the visuo-tactile and the audio-tactile stimuli presented. Taken together, these results suggest that vision and audition modulate haptic perception and hedonic preferences when tactile sensations are provided through a haptic device. Importantly, these results offer interesting suggestions for designing more pleasant, and even more realistic, multisensory virtual surfaces.
Keywords: Haptics | hedonic touch | multisensory interaction | virtual surface
Abstract: Road accidents represent the primary cause of death among young drivers. To understand the emerging issue of young drivers' involvement in traffic accidents abroad, risk-taking behaviors were investigated in familiar and unfamiliar driving situations. Twenty-two young Chinese drivers completed a road regulation test followed by a simulated test drive. The number of traffic violations and accidents at familiar and unfamiliar driving intersections was correlated with road regulation knowledge, risk perception scores and self-assessment of driving skills. A significant number of mistakes was found in risk-taking situations, regardless of the familiarity of the situation, especially for drivers who presented high ratings of self-assessed driving skills. Results show that risk-taking behaviors while driving in unfamiliar conditions are mediated by psychological factors, like the self-assessment of being a good driver, more than by the actual knowledge of road regulation rules. Implications for international driving can be considered for future research development.
Keywords: Driving skills | International driving | Knowledge of traffic laws | Risk perception | Road intersection | Road violations and accidents | Young drivers
Abstract: The paper deals with prototyping strategies aimed at supporting engineers in the design of the multisensory experience of products. It is widely recognised that the most effective strategy to design it is to create working prototypes and analyse users' reactions when interacting with them. Starting from this awareness, we discuss how virtual reality (VR) technologies can support engineers in building prototypes suited to this aim. Furthermore, we demonstrate how VR-based prototypes not only represent a valid alternative to physical prototypes, but also a step forward, thanks to the possibility of simulating and rendering multisensory and real-time modifiable interactions between the user and the prototype. These characteristics of VR-based prototypes enable engineers to rapidly test different variants with users and to optimise the multisensory experience perceived by them during the interaction. The discussion is supported both by examples available in the literature and by case studies we have developed over the years on this topic. Specifically, in our research we have concentrated on what happens in the physical contact between the user and the product. Such contact strongly influences the user's impression of the product.
Keywords: Experience design | Interaction design | User experience | Virtual-mixed prototyping
Abstract: The i.Drive Lab has developed an interdisciplinary methodology for the analysis and modelling of behavioral and physiological responses related to the interaction between driver, vehicle, infrastructure, and virtual environment. The present research outlines the development of a validation study for the combination of virtual and real-life research methodologies. The i.Drive driving simulator was set up to replicate the acquisition of environmental and physiological data coming from an i.Drive electric vehicle equipped with the same sensors. i.Drive tests focus on the identification of drivers' affective states, able to define recurring situations and psychophysical conditions that are relevant for road safety and drivers' comfort. Results show that it is possible to combine different research paradigms to collect low-level vehicle control behavior and higher-level cognitive measures, in order to develop data collection and elaboration for future mobility challenges.
Keywords: driving assessment | on-road tests | physiological measures | test vehicle | virtual reality simulator
Abstract: This demonstration offers the opportunity to explore a multisensory digital interface as part of the wider research project 'Mapping Memory Routes: Eliciting Culturally Diverse Memes for Digital Archives'. The interface is conceived as a tool for capturing memes rooted in the rich intangible heritage of culturally diverse communities in London, opening up a space for intercultural exchange to be used in meaningful urban design. Based on a model developed by artist and researcher Alda Terracciano for her multisensory installation 'Streets of...7 cities in 7 minutes', the interface is used to explore new design methods to elicit cultural memories through the use of multisensory technology. The tool aims to stimulate collective curatorial practices aimed at democratising decision-making processes in critical heritage studies and urban planning.
Keywords: Community curation | Critical heritage studies | Design research | Multisensory interface design | Sensorial mapping | Sensorial urbanism | Urban planning
Abstract: Books are the tools used for reading novels and stories, but also for educational purposes. Conventional books have undergone a radical transformation in recent years due to the use of new technologies. However, even today the technological devices used for reading e-books are still poorly exploited, despite the fact that they represent a fundamental tool to make the reading experience more immersive through a complete multisensory approach. In this perspective, one sense that represents an important element of human perception is the sense of smell. Consequently, the authors hypothesize that the introduction of odours during reading sessions could improve the user experience and the learning performance. To test these hypotheses, the authors have defined and carried out several experimental testing sessions. The analysis of the collected data proved that the introduction of odour does not disturb the reader during reading activities but, on the contrary, can actually make the experience more immersive. Similarly, odours do not disturb studying activities, but can instead increase the level of concentration and people's learning performance.
Keywords: Augmented reality | Multisensory environment | Olfactory technologies | User experience
Abstract: The design discipline is faced with radical changes related to new technologies and to an increasingly globalized world with more and more competitive markets. These factors are profoundly influencing design methods and processes, and the knowledge and skills related to the designer's role. Consequently, design educational models are radically changing. Today, one of the most impacting evolutions is related to rapid prototyping techniques, which are bringing design practice closer to self-production. This emerging trend can no longer be supported by traditional didactic approaches; it is necessary to create spaces that allow students to learn, design and experiment in a shared way. This paper presents the Polifactory Lab at Politecnico di Milano, an innovative makerspace established with the aim of creating a new research and teaching space. In this paper, the authors present the Polifactory Lab, its theoretical purposes, and some examples of didactic activities carried out in the lab.
Keywords: Design educational models | Innovation in education | Makerspace | Rapid prototyping
Abstract: This paper presents an immersive virtual reality system (IVRS) that has been designed for unilateral amputees in order to reduce phantom limb pain (PLP). The patient's healthy limb is tracked by using a motion sensor. Data of the limb in motion are used as input parameters to move the phantom limb in the immersive virtual reality system. In this way, the patient has the illusion of moving the phantom limb while moving the real, contralateral limb. The system has been implemented by using low-cost and open technologies, and combines the Oculus Rift DK2 device, the Leap Motion motion sensor, and an engine for interactive 3D content and gaming generation (Unity 3D). The Oculus Rift head-mounted display is used to provide the immersive experience.
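A minimal sketch (not taken from the paper) of one plausible way to implement the mirroring step described above: joint positions tracked on the healthy hand are reflected across the body's sagittal plane to animate the virtual phantom limb. The function name, the plane choice, and the sample data are illustrative assumptions.

```python
import numpy as np

def mirror_hand(joint_positions, plane_normal=np.array([1.0, 0.0, 0.0])):
    """Reflect tracked joint positions of the healthy hand across the
    sagittal plane (assumed here to be x = 0) so that they can drive
    the virtual contralateral (phantom) limb."""
    n = plane_normal / np.linalg.norm(plane_normal)
    # Householder reflection: p' = p - 2 (p . n) n
    return joint_positions - 2.0 * (joint_positions @ n)[:, None] * n

# Example: three tracked joints of the right hand (metres)
right_hand = np.array([[0.20, 1.10, 0.35],
                       [0.22, 1.12, 0.40],
                       [0.25, 1.15, 0.45]])
left_phantom = mirror_hand(right_hand)
print(left_phantom)  # x-coordinates are negated, y and z are preserved
```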
Abstract: In-vehicle technology for driving safety has advanced considerably. Current Advanced Driver-Assistance Systems (ADAS) make roads safer by alerting the driver, through visual, auditory, and haptic signals, about dangerous driving situations, and consequently preventing possible collisions. However, in some circumstances the driver can fail to respond properly to the alert, since human cognition can be influenced by the driving context. Driving simulation can help evaluate this aspect, since different ADAS can be reproduced in safe driving conditions. However, driving simulation alone does not provide information about how the change in the driver's workload affects the interaction of the driver with ADAS. This paper presents a driving simulator system integrating physiological sensors that acquire heart activity, blood volume pulse, respiration rate, and skin conductance parameters. Through a specific processing of these measurements, it is possible to measure the different cognitive processes that contribute to the change in the driver's workload while using ADAS, in different driving contexts. The preliminary studies conducted in this research show the effectiveness of the system and provide guidelines for the future acquisition and treatment of the physiological data to assess ADAS-related workload.
Abstract: In this paper, we discuss the possibilities available, as well as the challenges to be faced, when designing for metal additive manufacturing, through the description of an application of Selective Laser Melting technology within the professional sports equipment field. We describe the redesign activity performed on the cam system of a compound bow, from the analysis of the functional, manufacturing and assembly constraints to the strategies applied to guarantee the printability of the object. This activity has thus provided the opportunity to analyse the difficulties currently encountered by practitioners when designing for additive manufacturing, due to the lack of integrated design approaches and the high number of aspects that need to be taken into account simultaneously when making design choices.
Keywords: Design for Additive Manufacturing | Metal Additive Manufacturing | Selective Laser Melting | Sports Equipment
Abstract: Due to the Industry 4.0 initiative, Augmented Reality (AR) has started to be considered one of the most interesting technologies companies should invest in, especially to improve their maintenance services. Several technological limitations have prevented AR from becoming an effective industrial tool in the past. Some of them have now been overcome by off-the-shelf technologies, while others have not yet. In this paper, we present a solution for remote maintenance based on off-the-shelf mobile and AR technologies. The architecture of the application allows us to remotely connect a skilled operator in a control room with an unskilled one located where the maintenance task has to be performed. This application, which was initially described in a previous work, has been improved on the basis of feedback received from industrial partners. We describe the important features we have added and the rationale behind them to make the remote communication more effective.
Keywords: Augmented Reality | Industry 4.0 | Remote Maintenance
Abstract: One of the most fascinating possibilities of Additive Manufacturing technologies is their capability to realize objects that include various types of joints and moving parts. The research presented in this paper proposes to embed elastic elements in these joints in order to control their compliance. Two applications are also presented, in order to demonstrate, firstly, the practical feasibility of this innovative joint, and, secondly, the possibility to control joint elastic behavior in order to force the connected parts to automatically return to their initial positions when the actuating load is removed.
Keywords: Adaptive-Grip | Additive Manufacturing | Flexi-Hand | Fused Deposition Modelling | Multi-Material-Deposition | Print-in-Place
Abstract: The very rapid evolution of digital technologies and the "Internet of Things" phenomenon are today some of the most important issues that product designers have to face. Consequently, designers need to understand and manage these new technologies in order to exploit their potential in innovative products. Therefore, it is advisable that designers focus their activities on the design of the meaning and of the user interaction of products, in order to create smart products that are easy to use and enjoyable. To address all these issues, the authors set up an experimental workshop in which students with different backgrounds in design-related disciplines were asked to collaborate on the design of a domestic product that allows new tangible interaction with live data streams. In addition, students were asked to develop a functioning prototype of their design solution, using rapid prototyping and physical computing techniques. The students were able to develop working prototypes of products capable of communicating information derived from real-time data streams. Some of the most representative results of this workshop are presented in the paper.
Keywords: 3D printing | Design education | User centred design
Abstract: In this paper we describe the design of a smart alarm clock, conceived as a persuasive system to foster sustainable urban mobility. By automatically retrieving and elaborating information available on the web, such as means of transport and weather forecasts, the device is able to suggest to the user the most sustainable travelling solution, helping him/her to wake up and reach the destination on time. Following a user-centred design approach, the elaboration of the best travelling solution also takes into account, together with the user's next-day appointments, his/her needs and habits, such as the time he/she needs to get ready in the morning and his/her travelling preferences. A functional prototype has been built to test the effectiveness of the device, using the city of Milan as the context.
Keywords: Design for sustainable behaviour | Multisensory product experience | Smart mobility | Sustainability | User centred design
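A minimal sketch of the scheduling logic behind the smart alarm clock described above, under illustrative assumptions: the appointment time, the travel duration retrieved from a journey planner, and the user's preparation time are already available, and the function name and safety margin are hypothetical.

```python
from datetime import datetime, timedelta

def compute_wake_up(appointment: datetime,
                    travel_minutes: int,
                    prep_minutes: int,
                    safety_margin_minutes: int = 10) -> datetime:
    """Wake-up time = appointment - travel time - preparation time - margin."""
    return appointment - timedelta(minutes=travel_minutes
                                   + prep_minutes
                                   + safety_margin_minutes)

# Example: 9:00 meeting, 35 min by public transport, 45 min to get ready
alarm = compute_wake_up(datetime(2024, 5, 6, 9, 0),
                        travel_minutes=35, prep_minutes=45)
print(alarm)  # 2024-05-06 07:30:00
```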
Abstract: In order to cope with the issues that emerged from the analysis of the first version of the system presented in Chap. 5, a new version of the module has been designed and will be described in this chapter. It is based on direct actuation of the joints. Furthermore, we will describe the introduction of an important feature, namely the possibility of regulating in real time the nominal distance between the modules. This feature allows the control sectors to slide on the strip, thus increasing the performance of the whole system. In this chapter we will also describe the kinematic analyses that have been performed, the design of the components and the developed prototype, so as to evaluate the pros and the limits of the implemented solution. Finally, we will describe the developed control process, which allows the rendering of virtual surfaces to be performed.
Abstract: This chapter will introduce the research work described in this book. The final output of the research work is a new concept of a desktop Tactile Shape Display. This device allows the designer to perform the tactile evaluation of the virtual model of the products they are creating in real time. In particular, we will describe the hypothesis and the starting point of the research.
Abstract: As explained in Chap. 3, the aim of the development of the tactile interface is to allow users to perform the tactile evaluation together with the visual one. Thanks to the desktop and portability features of the developed tactile display, the user can employ as a visualisation device the monitor of the PC/laptop on which he/she is creating the digital model. However, Augmented Reality (AR) technology can be exploited to create a more immersive experience. AR technology allows us to superimpose a 3D visual representation of the digital model directly onto the tactile interface. In this way, the user can interact at the same time with the tactile interface, in order to perform the tactile evaluation, and with the digital representation of the model shape, thus performing the visual evaluation.
Abstract: In this last chapter we will summarize the results of the described research work, which consist in the development of an innovative Tactile Shape Display. Moreover, we will describe the future work that we are planning in order to proceed with the research activities related to tactile devices.
Abstract: Tactile interaction consists in providing the user of a Virtual Reality (VR) system with sensations related to touch, mainly during the evaluation and manipulation of virtual objects. In some cases the term tactile is used to refer to mechanical stimulation of the skin, which, together with the kinaesthetic sense, creates the haptic feedback. For these reasons, tactile devices are closely related to haptic interfaces. Therefore, in order to give a complete and exhaustive overview of the interfaces related to touch, we have analysed the different categories of devices, such as vibrotactile interfaces, force feedback devices, and local and full shape displays. Nowadays, a large number of applications have been developed for tactile and haptic interaction in Virtual Reality. These applications belong to various fields: medicine (surgical simulators, rehabilitation), education (display of physical or mathematical phenomena), industry (virtual prototyping, training, maintenance simulations), entertainment (video games, theme parks), arts and creation (virtual sculpture, virtual instruments), etc. Hereafter the state of the art related to tactile and haptic technologies is presented.
Abstract: In this chapter we will describe the characteristics of the developed system in terms of performance, modularity, portability and implementation costs. In order to perform an accurate analysis of the rendered trajectory and of the usability of the system, we will describe the user tests performed and analyse their results.
Abstract: In order to develop the tactile interface that allows users to touch a 3D shape by means of continuous and free-hand interaction, a preliminary study is needed to evaluate the concept. In this chapter we will analyse how a 3D shape can be rendered by means of a trajectory, according to the designers' needs. After that, we will study how the physical rendering of this trajectory can be made possible by controlling the elastic behaviour of a continuous plastic strip. This analysis allows us to define the degrees of freedom that have to be controlled in order to actuate the system, as well as the different kinematic solutions. Various conceptual solutions for the actuation systems will be investigated, and the best solution according to the project goals will be chosen.
Abstract: The product design process is typically based on loops between the design and the evaluation phases, which require the creation of digital models and physical prototypes. In recent years, real prototypes have been partially substituted with Virtual Prototypes (VPs). These consist in computer simulations of physical products that can be presented, analysed, and tested with respect to the relevant product life-cycle aspects, such as design/engineering, manufacturing, maintenance, and aesthetic and ergonomic evaluation, as if these activities were conducted on a real physical model. VPs allow designers to evaluate the characteristics of the future product in order to reduce the number of physical prototypes needed, thus also leading to a reduction of development time and costs. The currently available tools enable designers to access only the visual information of the virtual prototypes, while physical prototypes are still needed in order to get tactile information. This chapter describes a product development methodology in which the designer can perform the tactile evaluation of a model of a product at the same time as the visual one. Furthermore, the concept of the system that will allow this goal to be achieved is presented.
Abstract: This chapter presents the concept of the solution based on an indirect actuation approach. This configuration aims at positioning the control points of the rendered trajectory in space. The degrees of freedom of the device are controlled by using indirect actuation systems, which are described in the following paragraphs. Moreover, in this chapter we will describe the kinematic analysis performed, the design of the components and the developed prototype. In this manner, it has been possible to evaluate the pros and the limits of the presented solution.
Abstract: More than one million people die each year on the world's roads. Research has identified drivers' cognitive aspects as the major cause of human errors in 80% of crash events. Driver-Assistance Systems (DAS) have been developed to detect data about the vehicle, the environment and the driver, and to communicate information, usually through the senses of vision and hearing. However, the growth of in-vehicle devices increases the visual and auditory demand on the driver. This research aims at investigating whether olfactory stimuli can be used to elicit drivers' cognitive aspects. An experimental framework has been set up, and testing sessions have been organised. The analysis of the data collected from the tests shows that olfactory stimuli are more effective than auditory ones in increasing some of the subjects' physiological parameters. Therefore, smells may be used as a DAS for increasing drivers' attention.
Keywords: Driver-assistance systems | Drivers’ attention | Olfactory stimuli
Abstract: The Italian fashion industry is nowadays subject to radical transformation; therefore, it needs to remain competitive and, at the same time, innovate itself, in order to strengthen its position in the global market. An important opportunity for innovation is the introduction of ICT technologies into the garment design process, which today is based on traditional methods and tools. Moreover, this innovation could be particularly important for online sales, in order to reduce customers' doubts during purchasing. The research presented in this paper describes a framework for designing clothes as realistic 3D digital models and for allowing customers to evaluate the designed clothes by using realistic virtual mannequins of their bodies instead of standard ones. A case study is presented in the paper. The results obtained show that the framework can innovate the traditional garment design process and could have a considerable impact on the fashion industry and on customers' behaviour.
Keywords: Body scanning | Cloth simulation | Design process | Motion capture | Virtual prototype
Abstract: This paper presents a novel system that allows product designers to design, experience, and modify new shapes of objects, starting from existing ones. The system allows designers to acquire and reconstruct the 3D model of a real object and to visualize and physically interact with this model. In addition, the system allows designers to modify the shape through physical manipulation of the 3D model and eventually to print it using 3D printing technology. The system is developed by integrating state-of-the-art technologies in the sectors of reverse engineering, virtual reality, and haptic technology. The 3D model of an object is reconstructed by scanning its shape by means of a 3D scanning device. Then, the 3D model is imported into the virtual reality environment, which is used to render the 3D model of the object through an immersive head-mounted display (HMD). The user can physically interact with the 3D model by using the desktop haptic strip for shape design (DHSSD), a 6-degrees-of-freedom servo-actuated developable metallic strip, which reproduces cross-sectional curves of 3D virtual objects. The DHSSD device is controlled by means of hand gestures recognized by a Leap Motion sensor.
Abstract: Innovation of fashion-related products implies the continuous search for new and appealing shapes and materials in a short period of time, due to the seasonality of the market. The design and manufacturing of such products have to deal with dimensional variability as a consequence of the new shapes. An additional difficulty concerns properly forecasting the technological behaviour of the new materials in relation to the phases of the manufacturing process. The control of dimensional variations requires time- and resource-intensive activities. Manual and visual inspection by human operators is more common than automatic solutions for performing such control, and skilled operators are typically the only ones capable of immediately facing non-standard situations. The full control of such variations is even more subtle and mandatory in the field of spectacles, which are fashion-related products and also medical devices. This paper describes an inspection system developed to monitor the dimensional variations of a spectacles frame during the manufacturing process. We discuss the methodological approach followed to develop the system, and the experimental campaign carried out to test its effectiveness. The system is intended as an alternative to current inspection practices used in the field, and also provides a methodological approach that enables engineers to systematically study the correlations existing among the frame's main functional and dimensional parameters, the material behaviour and the technological variables of the manufacturing process. Hence, the system can be considered a method to systematically acquire and formalise new knowledge. The inspection system consists of a workbench equipped with four high-quality commercial webcams that are used to acquire orthogonal-view images of the front of the frame. A software module controls the system and allows the automatic processing of the acquired images, in order to extract the dimensional data of the frame that are relevant for the analysis. A case study is discussed to demonstrate the system's performance.
Keywords: Eyewear industry | Image processing | Inspection systems | Knowledge-based engineering | Product variability
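A minimal sketch of the kind of image processing involved in the inspection system above, under illustrative assumptions: a fixed top-view webcam with a known millimetre-per-pixel calibration, OpenCV as the processing library, and a simple Otsu segmentation. This is not the system's actual pipeline, only an example of extracting one dimensional parameter.

```python
import cv2

MM_PER_PIXEL = 0.12  # assumed calibration factor of the fixed webcam

def frame_width_mm(image_path: str) -> float:
    """Estimate the overall width of a spectacles frame from one
    orthogonal-view image by segmenting the largest dark object."""
    img = cv2.imread(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    blur = cv2.GaussianBlur(gray, (5, 5), 0)
    # Otsu threshold, inverted so the (dark) frame becomes the foreground
    _, mask = cv2.threshold(blur, 0, 255,
                            cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    frame = max(contours, key=cv2.contourArea)
    _, _, w_px, _ = cv2.boundingRect(frame)
    return w_px * MM_PER_PIXEL

print(frame_width_mm("front_view.jpg"))  # hypothetical image file
```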
Abstract: The paper presents an integrated technological framework, which aims to make a step forward in contract furniture design methods and supporting web-enabled applications. The implemented platform enables an extensible and temporary cluster of companies to manage the entire contract furniture process. The main contribution regards the integration of different software modules into an overall system that exploits E-marketing Intelligence applications, 3D web-based tools and Augmented Reality techniques. Different user interfaces are implemented to accomplish the involved stakeholders' needs and the furniture development goals.
Keywords: Collaborative Product Development | E-marketing | Web-enabled design
Abstract: The sense of smell has great importance in our daily life. Recently, smells have been used for marketing purposes, to improve people's mood and to communicate information about products such as household cleaners and food. However, the scent design discipline can be used to create a "scent identity" for products not traditionally associated with a specific smell, in order to communicate their features to customers. In the area of virtual reality (VR), several studies have concerned the integration of smells in virtual environments. The research questions addressed in this paper concern whether virtual prototypes (VP) including smell simulation can be used for evaluating products as effectively as studies performed in real environments, and also whether smells can enhance the users' sense of presence in virtual environments. For this purpose, a VR experimental framework including a prototype of a wearable olfactory display (wOD) has been set up, and experimental tests have been carried out.
Abstract: The sense of smell has great importance in our daily life. In recent years, scents have been used for marketing purposes, to improve people's mood and to communicate information about products such as household cleaners and food. However, scents can also be used for any kind of product to communicate its features to customers. In the area of Virtual Reality, several studies have focused on integrating scents in virtual environments. The research questions addressed in this work concern whether scents can contribute to increasing the users' sense of presence in the virtual environment, and also whether Virtual Prototypes including the sense of smell can be used for evaluating products as effectively as studies performed in real environments. For this purpose, a Virtual Reality experimental framework including a prototype of a Wearable Olfactory Display has been set up, and experimental tests have been performed.
Keywords: Multisensory environment | olfactory display | product evaluation
Abstract: This work evaluates the impact of augmented reality (AR) technology to support operators during manual industrial tasks. The work focuses specifically on monitor-based augmented reality as a solution to provide instructions to the operator. Instructions are superimposed directly onto a video representation of the physical workspace and are displayed on a standard monitor. In contrast to previous AR solutions proposed to support manual tasks, the current work is more industrially acceptable because it meets most of the industrial requirements and it is cost effective for a deployment in an industrial environment. The developed prototype is described and evaluated by means of a user study to compare the monitor-based augmented reality solution to provide instructions and the traditional method. The test shows that AR can be a valid substitute to support the operators during manual tasks because it allows them to be more time efficient and it reduces their mental workload compared to traditional instructional manuals.
Keywords: Augmented reality | Comparative test | Instructional media | Manual task
Abstract: The product design process is based on a sequence of phases where the concept of the shape of a product is typically represented through a digital 3D model of the shape, and often also by means of a corresponding physical prototype. The digital model allows designers to perform the visual evaluation of the shape, while the physical model is used to better evaluate the aesthetic characteristics of the product, i.e. its dimensions and proportions, by touching and interacting with it. Design and evaluation activities are typically cyclical, repeated as many times as needed in order to reach the optimal and desired shape. This reiteration leads to an increase of the development time and, consequently, of the overall product development cost. The aim of this research work is to develop a novel system for the simultaneous visual and tactile rendering of product shapes, thus allowing designers to both touch and see new product shapes already during the product conceptual development phase. The proposed system for visual and tactile shape rendering consists in a Tactile Display able to represent in the real environment the shape of a product, which can be explored naturally through free-hand interaction. The device is designed to be portable, low cost, modular and high performing in terms of the types of shapes that can be represented. The developed Tactile Display can be effectively used if integrated with an Augmented Reality system, which allows the rendering of the visual shape on top of the tactile haptic strip. This allows a simultaneous representation of the visual and tactile properties of a shape. By using the Tactile Display in the initial conceptual phases of product design, designers will be able to change the shape of a product according to the tactile evaluation, before the development of the physical prototype. This feature will lead to a decrease in the number of physical prototypes needed, thereby reducing both the cost and the overall time of the product development process.
Keywords: augmented reality | shape rendering | Tactile display | virtual prototyping
Abstract: The paper describes a system to supply meaningful insights to designers during the concept generation of new car interiors. The aim of the system is to capture the movements of car passengers and to make these acquisitions directly available as a generative input. The system has been developed by integrating the Abstract Prototyping technique with Motion Capture technology. In addition, a systematic procedure allows the treatment of the collected data to obtain a graphical representation, which can be easily used with standard NURBS-based modeling software. The effectiveness of the system has been evaluated through a testing session conducted with subjects. The outcomes of the testing sessions have highlighted benefits and limitations of the implemented system.
Keywords: Abstract prototyping | human factors | motion capture
Abstract: Every year more than one million people die on the world's roads. Human factors are the largest contributor to traffic crashes and fatalities, and recent research has identified drivers' cognitive aspects as the major cause of human errors in 80% of crash events. Thus, the development of countermeasures to manage drivers' cognitive aspects is an important challenge to address. Driver-Assistance Systems have been developed and integrated into vehicles to acquire data about the environment and the driver, and to communicate information to the driver, usually via the senses of vision and hearing. Unfortunately, these senses are already subjected to high demands, and visual and auditory stimuli can be underestimated or perceived as annoying. However, other sensory channels could be used to elicit drivers' cognitive aspects. In particular, smell can impact various aspects of a person's psychological state, such as the attention level, and can induce activation states in people. The research presented in this paper aims at investigating whether olfactory stimuli, instead of auditory ones, can be used to influence the cognitive aspects of drivers. For this purpose, an experimental framework has been set up and experimental testing sessions have been performed. The experimental framework is a multisensory environment consisting of an active stereo-projector and a screen used for displaying a video that reproduces a very monotonous car trip, a seating buck for simulating the car environment, a wearable Olfactory Display, in-ear earphones and the BioGraph Infiniti system for acquiring the subjects' physiological data. The analysis of the data collected in the testing sessions shows that, in comparison to the relaxation state, olfactory stimuli are more effective than auditory ones in increasing the subjects' attention level.
Abstract: Today's world is facing numerous problems due to an uncontrolled waste of energy and of primary resources in general. To manage this, on one side designers are asked to improve the efficiency of products; on the other side, users must be trained towards a more sustainable lifestyle. Some researchers are exploring the idea of trying to change users' behavior while they interact with products, in order to make it more sustainable. This trend is known as "design for sustainable behavior" applied to energy/resource consumption issues. Our idea is to stimulate users to change their behavior by introducing a multisensory communication with the product. This communication is not meant as warning messages informing users about wrong habits or actions; instead, it should consist of sensory stimuli able to naturally drive users to perform the right actions. However, before designing these stimuli, it is fundamental to highlight the aspects and conditions that prevent users from behaving in a sustainable way when interacting with products. In this paper we discuss the aspects that could be useful to explore in order to retrieve the specifications that drive both the design and prototyping phases, so as to faithfully test the effectiveness of the feedback with final users.
Abstract: Product designers are more and more addressing the design of product experience, in addition to the more traditional product design, where the focus of the design practice is on the interaction between the product and its users. Traditional methods and tools used so far for the design of products are not suitable to the design of experience. Among the emerging methods, experience prototyping seems effective and well addressing the new requirements of experience design. The paper describes the emerging technologies enabling experience prototyping, and provides some examples where this new methodology for experience design has been applied.
Abstract: Moving from conceptual design intentions to the materialization in product sensory qualities can be challenging. For Experience-driven designers this transition can be even more difficult, as they need to move from the abstract level of user experience to the concrete level of product features. In this paper, we suggest an approach to progressively deconstruct experiential visions and decrease the level of abstraction. We propose the use of a tool, namely the Experience Map, which describes five steps to develop a well-refined materialization and maintain a solid correlation with the initial intention. To investigate its value and challenge the approach in design practice, we set up four case studies. The analysis of designers' attitudes towards the Experience Map gave insights on its ability to provide a structure for creative thoughts, while suiting different and subjective attitudes of designers. Moreover, the map supports the integration of several different elements and the exploration of alternative design directions to achieve the intended, holistic experience. Some limitations were also highlighted by the case studies, which are discussed in light of future work.
Keywords: Aesthetics | Experience design | Experience prototyping | Materialization | Multisensory
Abstract: Along with the rise of Experience Design, the term Experience Prototyping was coined to describe the practice of prototyping for an experience-driven design approach. However, limited resources are available to define what Experience Prototyping is, which approaches it entails, and its scopes within the product development process. To answer these questions, we first discuss the fundamental definition of Prototyping itself, proposing a model that can describe Experience Prototyping too. The model details the possible focuses that Experience Prototyping can take, aiming at fostering greater awareness on how to prototype for future experiences. Furthermore, we discuss the role of new emerging technologies in shaping the practices related to prototyping. As an example, we report one case in which we used virtual technologies to perform an Experience Prototyping activity at the early phases of design process. The aim of this paper is to contribute to both design research and design practice by providing significant knowledge to shed light on the multifaceted practice of Experience Prototyping and thus tackle the prototyping decisions with greater awareness.
Keywords: design theory | Experience Design | Experience Prototyping | virtual technologies
Abstract: This paper describes the implementation of a Multimodal Guidance System (MGS) for upper limb rehabilitation through vision, haptics and sound. The system consists of a haptic device that physically renders virtual paths of 2D shapes through a point-based approach, while sound technology provides audio feedback about the patient's actions while performing a manual task, for example starting and/or finishing a sketch, or different sounds related to the hand's velocity while sketching. The goal of this sonification approach is to strengthen the patient's understanding of the virtual shape used in the rehabilitation process, and to inform the patient about attributes that could otherwise remain unnoticed. Our results provide conclusive evidence that using sound as additional feedback increases the accuracy in the task operations.
Keywords: Haptic guidance | Sound interaction | Upper-limb rehabilitation
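A minimal sketch of a velocity-to-sound mapping of the kind mentioned above, under illustrative assumptions: hand positions sampled at a fixed rate are converted to speed and mapped linearly onto a pitch range. The sampling rate, pitch band, and speed ceiling are not the MGS's actual parameters.

```python
import numpy as np

SAMPLE_RATE_HZ = 60          # assumed tracking rate of the haptic device
F_MIN, F_MAX = 220.0, 880.0  # pitch range (A3 to A5), an arbitrary choice
V_MAX = 0.5                  # speed (m/s) mapped to the highest pitch

def speed_to_pitch(positions: np.ndarray) -> np.ndarray:
    """Map instantaneous hand speed to an audio frequency in [F_MIN, F_MAX]."""
    velocity = np.diff(positions, axis=0) * SAMPLE_RATE_HZ  # m/s per sample
    speed = np.linalg.norm(velocity, axis=1)
    ratio = np.clip(speed / V_MAX, 0.0, 1.0)
    return F_MIN + ratio * (F_MAX - F_MIN)

# Example: a short 2D stroke on the drawing plane (metres)
stroke = np.array([[0.00, 0.00], [0.01, 0.00], [0.03, 0.01], [0.06, 0.02]])
print(speed_to_pitch(stroke))  # one frequency per sampled movement step
```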
Abstract: The study of new systems for supporting upper limb rehabilitation is of primary importance, due to the high number of people in need of rehabilitation and the limited effectiveness of most of the current systems. The research work described in this paper proposes a VR system for upper-limb rehabilitation that is immersive, is based on hand gestures to interact with virtual objects, and which can deliver odours when a goal is reached.
Abstract: This paper presents an immersive virtual reality system that includes a natural interaction approach based on free hand gestures, used to drive a Desktop Haptic Strip for Shape Rendering (DHSSR). The DHSSR is a mechatronic display of virtual curves intersecting 3D virtual objects, and aims at allowing designers to evaluate the quality of shapes during the conceptual design phase of new products. The DHSSR consists of a 6DOF servo-actuated developable metallic strip, which reproduces cross-sectional curves of 3D virtual objects. Virtual curves can be interactively generated on the 3D surface of the virtual object, and the DHSSR haptic interface renders them coherently. An intuitive and natural modality for interacting with the 3D virtual objects and 3D curves is offered to users, who are mainly industrial designers. This consists of an immersive virtual reality system for the visualization of the 3D virtual models and a hand gestural interaction approach used for handling the models. The system has been implemented by using low-cost and open technologies, and combines a software engine for interactive 3D content generation (Unity 3D), the Oculus Rift head-mounted display for 3D stereo visualization, a motion capture sensor (Leap Motion) for tracking the user's hands, and the Arduino Leonardo board for controlling the components. The results reported in the paper are positive as concerns both the quality of the surface rendering and the proposed interaction modality.
Abstract: Today smells are used for communicating information about products such as household cleaners and food. However, smells can also be applied to any kind of product. Several studies have focused on integrating smells in virtual environments. The research questions addressed in this work concern whether Virtual Prototypes, including the sense of smell, can be used for evaluating products as effectively as studies performed in real environments, and whether smells can increase the users' sense of presence in the virtual environment. For this purpose, an experimental framework including a wearable olfactory display has been developed, and experimental tests have been performed.
Keywords: Olfactory Display | Presence | Virtual Reality | Wearable Device
Abstract: This paper presents a hand gestural interaction system that is used to handle a Desktop Haptic Strip for Shape Rendering (DHSSR). The strip is a physical display of virtual curves intersecting 3D virtual objects, and aims at allowing designers to evaluate the quality of shapes during the conceptual design phase of new products. The DHSSR consists of a servo-actuated developable metallic strip, which reproduces cross-sectional curves of 3D virtual objects. In order to generate the virtual curve intersecting the virtual object, a hand gestural interaction system has been implemented, which allows moving either the curve or the 3D object. Specifically, through an interaction approach based on a magnetic pinch, the user can move the virtual strip until the desired position is reached. Instead, an interaction modality based on index finger recognition allows moving the 3D object while the virtual curve is floating in space. The virtual curve generated by the interaction is used for deforming the physical strip. The hand gestural interaction system has been implemented by using low-cost and open technologies, and combines an engine for interactive 3D content generation (Unity 3D), a motion capture sensor (Leap Motion) and the Arduino Leonardo board. In particular, the user's hands are tracked by means of the Leap Motion sensor. The Unity 3D environment is used to render the virtual objects (curve and shape) and the virtual avatar of the user's hands. The virtual environment is connected with the Arduino Leonardo board in order to control the six servo-actuators of the DHSSR. The paper also presents some studies performed to evaluate the performance of the DHSSR system and the usability of the hand gestural interaction modalities.
Keywords: Hand gestural interaction | Haptic strip | Motion capture sensor | Shape rendering
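A minimal sketch of how the virtual-environment side might stream the six DHSSR servo set-points to the Arduino Leonardo over a serial link. The message format, port name, and baud rate are illustrative assumptions; the abstract does not describe the actual protocol.

```python
import serial  # pyserial

def send_servo_angles(port: str, angles_deg: list[float]) -> None:
    """Send the six DHSSR servo set-points as one comma-separated line,
    e.g. '90.0,85.5,100.2,95.0,88.7,92.3\n', to be parsed on the Arduino side."""
    assert len(angles_deg) == 6, "the DHSSR strip uses six servo-actuators"
    line = ",".join(f"{a:.1f}" for a in angles_deg) + "\n"
    with serial.Serial(port, 115200, timeout=1) as link:
        link.write(line.encode("ascii"))

# Example set-points computed from the current cross-sectional curve
send_servo_angles("/dev/ttyACM0", [90.0, 85.5, 100.2, 95.0, 88.7, 92.3])
```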
Abstract: Experience Design has gained a lot of attention from both academic and professional research. The state of the art covers the theoretical notions of User Experience and provides designers with step-by-step methodologies. Another large body of references addresses specific moments of the design process. While being specific and extensive on these topics, the literature falls short in explaining how to move from the abstract level of Experience to the pragmatic choice of product features. Designers who intentionally aim at creating products able to elicit specific meaningful experiences can benefit from the introduction of a methodological tool that supports them throughout the Experience-driven design process. The final goal of the tool is to help designers visualise and deconstruct the Experience they wish to recreate in the product into a set of sensory features. The article introduces a 'working principle', a strategy to fulfil the Experience Design process, considering some fundamental scientific resources. On this basis, we present a first draft of the tool and report the results of a pilot validation study with designers. The paper ends with an exploration of future developments and possible directions of research in the Experience Design domain.
Keywords: accordance | design methodology | design thinking | experience design | experience map
Abstract: This paper describes the design and implementation of a system for rendering virtual shape through vision, haptic and sound. The system consists of a haptic strip that physically renders virtual curves. A flexible capacitive touch sensor (FCTS) is integrated with the haptic strip, and allows the system to track the position of the user's fingers on the strip. According to the position, the system renders curve properties such as curve shape, inflexion points and curvature through sound metaphors. The goal of this sonification approach is to strengthen the user's understanding of the shape of a virtual prototype, and to inform the user about geometrical attributes that could otherwise remain unseen. Such unseen attributes may either be a result of limitations in the visual and haptic display hardware or a result of limitations in human perception.
Keywords: conceptual design | haptic rendering | human-computer interaction | immersive virtual reality | product design
Abstract: The paper proposes an alternative approach to well-known feedback solutions, such as visual displays or warning sound messages, to make users perceptually aware of the energy consumption occurring when using a product. The approach is grounded on the use of multisensory feedback interfaces that are designed to make the user experience the consumption process directly during the interaction with the product. Such multisensory feedback should be intended as indications, rather than alarms, so as to naturally guide users towards a more sustainable behaviour. The daily task of opening the fridge door has been used as a case study. All the steps followed to ideate and test the effectiveness of the designed multisensory interfaces are discussed. The results demonstrate how even simple stimuli, such as a gradual colour change of the fridge cavity from a cold tone to a warm one, may be able to reduce the time users keep the fridge door open.
Keywords: Design for behaviour change | Human behaviour in design | Multisensory design | Sustainability | User centred design
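A minimal sketch of the gradual cold-to-warm colour shift used above as a feedback stimulus, assuming an RGB-controllable cavity light and a linear blend over a fixed time window; the actual stimulus parameters tested in the paper may differ.

```python
def cavity_colour(seconds_open: float, full_warm_after: float = 30.0):
    """Linearly blend the fridge cavity light from a cold blue to a warm
    red as the door stays open, saturating after `full_warm_after` seconds."""
    cold = (80, 160, 255)   # cold white-blue RGB
    warm = (255, 120, 40)   # warm orange-red RGB
    t = min(max(seconds_open / full_warm_after, 0.0), 1.0)
    return tuple(round(c + t * (w - c)) for c, w in zip(cold, warm))

for s in (0, 10, 20, 30):
    print(s, cavity_colour(s))  # colour drifts warmer as the door stays open
```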
Abstract: Human movements express non-verbal communication: The way humans move, live and act within a space influences and reflects the experience with a product. The study of postures and gestures can bring meaningful information to the design process. This paper explores the possibility to adopt Motion Capture technologies to inform the design process and stimulate concept generation with an Experience Design perspective. Motion data could enable designers to tackle Experience-driven design process and come up with innovative designs. However, due to their computational nature, these data are largely inaccessible for designers. This study presents a method to process the raw data coming from the Motion Capture system, with the final goal of reaching a comprehensible visualization of human movements in a modelling environment. The method was implemented and applied to a case study focused on User Experience within the car space. Furthermore, the paper presents a discussion about the conceptualization of human movement, as a way to inform and facilitate Experience-driven design process, and includes some propositions of applicable design domains.
Keywords: Body tracking | Conceptual design | Data visualization | Motion capture | User experience
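A minimal sketch of the raw-data treatment step described in the preceding abstract: a noisy marker trajectory is smoothed and resampled to a small, fixed number of points that can be imported as a polyline into a NURBS-based modelling tool. The moving-average filter, window size, and point count are illustrative assumptions, not the method's actual parameters.

```python
import numpy as np

def simplify_trajectory(raw_xyz: np.ndarray, window: int = 9,
                        n_points: int = 25) -> np.ndarray:
    """Smooth an (N, 3) marker trajectory and resample it to n_points,
    producing a compact polyline usable for curve fitting in a CAD tool."""
    kernel = np.ones(window) / window
    smooth = np.column_stack(
        [np.convolve(raw_xyz[:, i], kernel, mode="same") for i in range(3)]
    )
    # Resample uniformly in sample index (arc length would also be reasonable)
    old_t = np.linspace(0.0, 1.0, len(smooth))
    new_t = np.linspace(0.0, 1.0, n_points)
    return np.column_stack(
        [np.interp(new_t, old_t, smooth[:, i]) for i in range(3)]
    )

# Example: noisy samples along a straight reaching movement
raw = np.linspace([0, 0, 0], [0.4, 0.1, 0.3], 200) + np.random.normal(0, 0.002, (200, 3))
print(simplify_trajectory(raw).shape)  # (25, 3)
```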
Abstract: The interest of people working in rehabilitation in the possibilities offered by Virtual Reality (VR) technologies has been growing over the years. Through VR technologies, rehabilitation can become more engaging with respect to traditional methods, since exercises can be performed in different simulated scenarios. They can be adapted on the basis of the patient's requests, and can be easily modified to have increasing difficulty, according to the rehabilitation progress. Furthermore, results can be collected and monitored, even remotely, if necessary. The paper describes the development and testing of a set of exercises in a multimodal VR environment for upper limb rehabilitation. The VR environment includes technologies addressing three senses: vision, hearing and touch. The patient is asked to grab and move a number of objects in an ecologically valid environment, which corresponds to a household scenario. While s/he performs the exercises, object trajectories are recorded in order to be analyzed later on. The development as well as a preliminary testing activity are reported in the paper.
Abstract: This paper presents a new concept of a desktop tangible shape display for virtual surface rendering. The proposed system is able to represent in the real environment the shape of a digital model of a product, which can be explored naturally through free-hand interaction. The aim of the shape display is to allow product designers to explore the rendered surface through a continuous touch of curves lying on the product shape. Ideally, the designer selects curves on the shape surface, which can be considered style features of the shape, and evaluates the aesthetic quality of these curves by manual exploration. In order to physically represent the selected curves, a flexible surface is modelled by means of servo-actuated modules controlling a physical deforming strip. The behaviour of the strip is controlled by acting on the position and rotations of a discrete number of control sectors. Each control sector is controlled by a module, which is based on an absolute positioning approach and equipped with five degrees of freedom. The developed system is able to manage the elastic behaviour of the strip in terms of bending, twisting and local tangency. The tangency control allows us to manage the local tangency of the strip to the rendered trajectory, thus increasing the accuracy of the representation. Moreover, a preliminary second version of the module is presented, which has been designed to allow the control sectors to slide on the strip. Thanks to this feature, it will be possible to place a control sector at a given point of the trajectory, such as a point of maximum, a point of minimum or an inflection point. The device is designed to be portable, low cost, modular and high performing in terms of the types of shapes that can be represented. A prototype equipped with three modules has been developed in order to evaluate the usability and the performance of the display.
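A minimal sketch of one geometric step implied by the abstract above: given a densely sampled planar target curve, compute the position and local tangent angle at which each of the strip's control sectors should be placed, spacing the sectors uniformly by arc length. The number of sectors and the example curve are illustrative, not the device's actual placement strategy.

```python
import numpy as np

def control_sector_poses(curve_xy: np.ndarray, n_sectors: int = 3) -> np.ndarray:
    """Return (x, y, tangent_angle_rad) for n_sectors points spaced
    uniformly by arc length along a densely sampled planar curve."""
    seg = np.diff(curve_xy, axis=0)
    arc = np.concatenate([[0.0], np.cumsum(np.linalg.norm(seg, axis=1))])
    targets = np.linspace(0.0, arc[-1], n_sectors)
    x = np.interp(targets, arc, curve_xy[:, 0])
    y = np.interp(targets, arc, curve_xy[:, 1])
    # Tangent estimated from the neighbouring segment of the dense curve
    idx = np.searchsorted(arc, targets, side="left").clip(1, len(arc) - 1)
    tang = np.arctan2(seg[idx - 1, 1], seg[idx - 1, 0])
    return np.column_stack([x, y, tang])

# Example: a shallow arc sampled densely
t = np.linspace(0, np.pi / 2, 200)
curve = np.column_stack([t, 0.2 * np.sin(t)])
print(control_sector_poses(curve, n_sectors=3))
```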
Abstract: The study of experience in the interaction with goods, both physical and virtual, is consistently affected by the role of the context in which this interaction occurs. The paper analyses the global problem starting from three independent variables: product, people and context, and focuses on the analysis of single couples of independent variables: goods-person, goods-context and person-context. From this analysis, a general framework is derived, oriented to experimenting with and validating the use of Virtual Prototyping to optimize the perceived value of the products To-Be.
Abstract: The sense of smell has great importance in our daily life. In recent years, smells have been used for marketing purposes, with the aim of improving people's mood and of communicating information about products such as household cleaners and food. However, the scent design discipline can also be applied to any kind of product to communicate its features to customers. In the area of Virtual Reality, several studies have focused on integrating smells in virtual environments. The research questions addressed in this work concern whether Virtual Prototypes, including the sense of smell, can be used for evaluating products as effectively as studies performed in real environments, and also whether smells can contribute to increasing the users' sense of presence in the virtual environment. For this purpose, a Virtual Reality experimental framework including a prototype of a wearable olfactory display has been set up, and experimental tests have been performed.
Abstract: Emotional expression is an important human behaviour, which enriches communication. Sensory organs play a crucial role in emotional perception. Today communication is mostly mediated by digital devices, which predominantly address vision and exclude the other senses; therefore, communication becomes less affective. Wearable technology can appeal to the sensory organs from a very close distance due to its intimate interaction with the human body. Hence, this technology can be used to make distant communication more affective by enabling multisensory interaction. This paper presents a user-centred design practice on wearable products that simulate sensory feedback (tactile, visual and auditory) to express basic emotions. Three prototypes that transmit emotional messages were designed, built and tested to observe user behaviour. This paper discusses how the user experience obtained through the user test can be taken further to design new communication products.
Keywords: Design practice | Emotion | Sensory perception | Wearable technology
Abstract: Beyond ergonomic measurements, the study of human movements can help designers in exploring the rich, non-verbal communication of users’ perception of products. This paper explores the ability of human gestures to express subjective experiences and therefore, to inform the design process at its early stages. We will investigate the traditional techniques used in the Experience Design domain to observe human gestures, and propose a method to couple Experience-driven design approach with Motion Capture technique. This will allow integrating qualitative user observations with quantitative and measurable data. However, the richness of information that Motion Capture can retrieve is usually inaccessible for designers. This paper presents a method to visualize human motion data so that designers can make sense of them, and use them as the starting point for concept generation.
Keywords: Body Tracking | Concept design | Data visualization | Motion Capture | User experience
Abstract: The sense of smell has great importance in our daily life: today pleasant odors are used to elicit positive emotions in users. In the marketing area, much work has been done concerning the use of odors for communicating information about products such as household cleaners and foods. In the area of Virtual Reality (VR), several studies have focused on presenting odors in virtual environments. The introduction of odor simulation in virtual environments could represent an easy and flexible tool for evaluating the characteristics of industrial products. This research work aims at evaluating in which way odors can influence the users' evaluation of products, and whether studies on the influence of odors on the users' evaluation of products in a VR environment and in a real environment are comparable. For this purpose, an experimental framework has been defined, a wearable olfactory display has been developed, and experimental testing sessions have been performed.
Keywords: Multisensory interaction | Olfactory display | Product evaluation | User experience
Abstract: The paper describes an interactive Finite Element Analysis (FEA) tool that aims to improve the learning of the mechanical behavior of materials in industrial engineering schools. We implemented a "user in the loop" approach where students can explore the mechanical behavior of virtual specimens selected from a library of standard elements (cantilever beam, IPE beams, etc.). Users can apply forces or displacements interactively by mouse or haptic device, and visualize and "feel" the stress configurations of the structures. We extended our previous work and compared this novel approach with traditional FEA learning techniques. A test with twenty engineering students showed that learners following the interactive approach are faster in completing the given assignment, with a reduced error rate.
Keywords: engineering education | haptics | real-time finite element analysis | virtual reality
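As a point of reference for the cantilever-beam case in the specimen library above, the closed-form tip deflection of an end-loaded cantilever is delta = F L^3 / (3 E I). The short sketch below only evaluates this textbook formula with illustrative numbers; the actual tool solves the problem interactively with real-time FEA rather than with this analytical expression.

```python
def cantilever_tip_deflection(force_N: float, length_m: float,
                              youngs_modulus_Pa: float, inertia_m4: float) -> float:
    """Analytical tip deflection of an end-loaded cantilever: F*L^3 / (3*E*I)."""
    return force_N * length_m ** 3 / (3.0 * youngs_modulus_Pa * inertia_m4)

# Example: steel beam, L = 0.5 m, rectangular 20 x 10 mm cross-section, F = 10 N
b, h = 0.020, 0.010
I = b * h ** 3 / 12.0                    # second moment of area of the section
delta = cantilever_tip_deflection(10.0, 0.5, 210e9, I)
print(f"{delta * 1000:.2f} mm")          # about 1.19 mm
```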
Abstract: This paper proposes a new interactive Augmented Reality (AR) system, which has been conceived to allow a user to freely interact with virtual objects integrated in a real environment without the need to wear cumbersome equipment. The AR system has been developed by integrating the Fog Screen display technology, stereoscopic visualization and the Microsoft Kinect. The user can select and manage the position of virtual objects visualized on the Fog Screen display directly with his/her hands. A specific software application has been developed to perform evaluation testing sessions with users. The aim of the testing sessions was to verify the influence of issues related to tracking, visualization and interaction modalities on the overall usability of the AR system. The collected experimental results demonstrate the ease of use and the effectiveness of the new interactive AR system and highlight the features preferred by the users.
Keywords: augmented reality | design review | gesture-based interface
Abstract: In this paper, we present a pilot study of a haptic system that leads the patient's limb to follow trajectories performed on a plane or in space (2D or 3D haptic trajectories). This function is implemented by the Multimodal Guidance System (MGS), whose aim is to provide robotic assistance during the rehabilitation of the upper extremities while patients perform 2D and 3D tasks during manual activities such as drawing, coloring and gaming. The MGS consists of a virtual environment integrating several technologies, such as haptics, sound and video gaming. Patients are able to feel virtual objects and haptic trajectories, which act as virtual guides by taking advantage of the force feedback capabilities of the device. A virtual environment forms a haptic interface between the patient and the game. The haptic device is driven by the user's movements and assisted through the Magnetic Geometry Effect (MGE). Several 2D and 3D haptic trajectories have been tested in order to analyze the user interaction with the MGS. A preliminary evaluation has been performed in order to obtain more information related to the accuracy of the haptic trajectories. The haptic device has been used as an input means for tracking the hand trajectory made by the patient according to the feedback received from the 2D and 3D tasks. The performance has been evaluated by comparing the analysis of the tracking results.
Keywords: desktop haptic system | haptic guidance | multimodal guidance | rehabilitation system
Abstract: This paper describes an efficient methodology, based on co-simulation between several software tools, that has been developed to drive and improve the dynamic behavior of a Desktop Mechatronic Interface (DMI) for shape rendering. The co-simulation is performed by using a multi-body software package linked to MATLAB/Simulink. With the multi-body software, the virtual simulation of the DMI is performed in order to sense the rotation angles of the virtual servo-actuators. By using this interaction, it is possible to control the real servo-actuators present in the DMI. In addition, through this methodology it is possible to perform experimental simulations of the kinematics, dynamics and control of the DMI. A collision approach is presented, taking into account both the friction and restitution coefficients, which are required in the virtual simulation of the DMI. Simulation results show that the co-simulation platform is established successfully.
Abstract: The paper describes the design and preliminary virtual testing of OSHap, an open-source low-cost 2DOF haptic interface. The aim of the research is to create a mechatronic platform for haptic interaction based mainly on cheap and open-source components. The results will be distributed freely in order to provide guidelines for expert users to develop their own haptic interfaces. Both the mechanical design philosophy and the control electronics are discussed here, together with preliminary testing carried out using virtual prototyping tools. The kinematic layout is based on a serial double-joint scheme. This choice is claimed to be the least expensive one, inasmuch as tolerances do not have to be as narrow as for parallel systems. The system also allows future users to easily tune dimensions (and the relative workspace). The interface is admittance controlled to maximize transparency with low-cost drives. The device is based on the Intel Galileo open-source development board: this board embeds a powerful chipset that enables us to perform real-time 2DOF control tasks that are hardly reachable with the most common open-source electronic platforms. All the choices listed above are discussed and evaluated, with a focus on the performance-to-cost ratio.
Abstract: Our goal is to allow creators to focus on their creative activity, developing their ideas for physical products in an intuitive way. We propose a new CAD system that allows users to draw virtual lines on the surface of a physical object using see-through AR, and also to import 3D data and produce the corresponding real object through 3D printing.
Keywords: 3D printing | AR/MR | CAD | Collaboration
Abstract: In this work, a method for monitoring fatigue crack growth in a metal-to-composite bonded joint, based on the strain field, is proposed and applied in the framework of a visualization tool based on Augmented Reality (AR). This tool superimposes virtual objects, namely the data acquired by the sensors and the crack length, directly on top of the specimen under inspection and in real time. Finite element (FE) analyses show a good correlation between the crack tip position and the strain field in a single-lap specimen, and this feature is exploited to monitor the crack length during fatigue tests and to feed the AR system, which virtually visualizes the crack on the real specimen. An array of electrical resistance strain gauges is bonded to the surface of one adherend. A Matlab function collects the values from the strain gauges mounted on the specimen under investigation, analyses them on the basis of the FE results and finally feeds the AR system. The process is validated by measuring the crack with an optical microscope. The procedure is also tested with Fiber Bragg Grating (FBG) optical strain gauges. Copyright © 2014 Taylor & Francis Group, LLC.
Keywords: Aluminum and alloys | Augmented reality | Composites | Fatigue | Fiber Bragg Grating
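The strain-to-crack-length mapping described above could, in principle, be sketched as an interpolation on an FE-derived calibration curve; the following Python fragment is a hypothetical illustration with invented values, not the actual Matlab function or FE data from the paper.

```python
# Hedged sketch of strain-based crack-length estimation: an FE-derived
# calibration curve relates a scalar strain feature to crack length.
# All values are invented for illustration.
import bisect

# Hypothetical FE calibration: (strain feature value, crack length in mm)
fe_feature = [0.10, 0.25, 0.40, 0.55, 0.70]
fe_crack_mm = [2.0, 5.0, 9.0, 14.0, 20.0]

def estimate_crack_length(feature):
    """Linear interpolation on the FE calibration curve."""
    feature = min(max(feature, fe_feature[0]), fe_feature[-1])
    i = bisect.bisect_left(fe_feature, feature)
    if i == 0:
        return fe_crack_mm[0]
    x0, x1 = fe_feature[i - 1], fe_feature[i]
    y0, y1 = fe_crack_mm[i - 1], fe_crack_mm[i]
    return y0 + (y1 - y0) * (feature - x0) / (x1 - x0)

gauge_strains = [520e-6, 480e-6, 390e-6, 240e-6, 120e-6]   # measured strains
feature = max(gauge_strains) / sum(gauge_strains)           # toy strain feature
print(f"estimated crack length: {estimate_crack_length(feature):.1f} mm")
```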
Abstract: Product maintenance is a service offered to customers and represents an interesting business for companies, which are interested both in providing a high-quality service and in cutting operational costs. In this view, companies are seeking tools, among those offered by the rapidly evolving ICT sector, that enable them to reach both goals. The paper describes an application based on augmented reality and mobile technologies aimed at supporting remote maintenance operations and improving the maintenance services that companies offer to their customers. The paper describes the main idea underlying the application, its requirements and its implementation. Finally, a case study is presented.
Abstract: This paper describes a shape and force tracking approach aimed at the assessment and training of the functionality of patients' upper extremities while performing 2D tasks in a post-stroke rehabilitation program. The 2D tasks are assisted by a Multimodal Guidance System (MGS), which consists of a combination of visual, haptic and sound interaction. The device enables users to haptically interact with a virtual template, which acts as a virtual tool path by taking advantage of its force-feedback capabilities while the patient performs a 2D task such as sketching and hatching operations. Furthermore, the patient receives sound information, which provides audio feedback related to the hand velocity. By tracking the shape and the forces required to complete the tasks according to the visual feedback provided on the computer screen, the system can provide a quantitative measurement of the patient's progress. The paper concludes by presenting a preliminary test using the device for sketching and hatching operations.
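As a rough illustration of how tracked shape and force data might be turned into quantitative progress indicators, the sketch below computes an RMS path deviation and a mean applied force on invented samples; the real MGS metrics and logs are not reproduced here.

```python
# Illustrative progress indicators: RMS deviation of the traced path from the
# template and mean applied force. The sampled data below are invented.
import math

template = [(x * 0.01, 0.0) for x in range(100)]                      # ideal 2D path [m]
traced   = [(x * 0.01, 0.002 * math.sin(x * 0.3)) for x in range(100)]
forces   = [1.5 + 0.2 * math.sin(x * 0.1) for x in range(100)]        # applied force [N]

def nearest_distance(p, path):
    """Distance from point p to the closest template sample."""
    return min(math.dist(p, q) for q in path)

rms_dev = math.sqrt(sum(nearest_distance(p, template) ** 2 for p in traced) / len(traced))
print(f"RMS deviation from template: {rms_dev * 1000:.2f} mm")
print(f"mean applied force: {sum(forces) / len(forces):.2f} N")
```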
Abstract: The objective of the paper is to present a case study to exploit interactive Virtual Prototypes (iVPs) for investigating the way humans experience products. This method can be used for "prototyping" new product experiences, for monitoring users' emotional reactions during the interaction and finally, for practically redesigning these experiences on the basis of the users' feedback. Products considered here are domestic appliances, where the experience consists of the interaction with their physical interfaces.
Abstract: In this paper, a framework based on Augmented Reality (AR) and a mobile device is proposed for monitoring mechanical components during fatigue tests. This solution enables the user to move around and to inspect the component through AR from different points of view. In addition, the framework provides a solution to estimate the crack growth in real time and to visualize it directly on the component. A user application developed according to the proposed framework is used for case studies of fatigue tests on adhesively bonded joints.
Abstract: The perception of haptic textures depends on the mechanical interaction between a surface and a biological sensor. A texture is apprehended by sliding one’s fingers over the surface of an object. We describe here an apparatus that makes it possible to record the mechanical fluctuations arising from the friction between a human fingertip and easily interchangeable samples. Using this apparatus, human participants tactually scanned material samples. The analysis of the results indicates that the biomechanical characteristics of individual fingertips clearly affected the mechanical fluctuations. Nevertheless, the signals generated for a single material sample under different conditions showed some invariant features. We propose that this apparatus can be a valuable tool for the analysis of natural haptic surfaces.
Keywords: Apparatus | Biomechanics | Biotribology | Humans | Texture
Abstract: Understanding users' points of view is rapidly becoming an urgent issue for companies and designers, while User Experience and Experience Design are proving to be of wide interest for academic research. This paper presents an experiment-based methodology aimed at guiding designers during product optimisation. It is meant to support designers in choosing the right strategy to assess users' emotional reactions towards a product at an early stage of product development. The methodology consists of three different phases: 1) Design Challenge definition, to help clarify the research question; 2) Interaction Study, aimed at understanding the user experience; 3) Sensory Boost, to improve the product's perceived pleasurableness. The methodology includes a review of the methods and tools used for capturing users' emotional reactions to products. Moreover, a computer-based version of the methodology is introduced, together with two case studies used to validate it.
Keywords: Emotional design | Methodology | Sensory boost | User experience
Abstract: In the product development process, one of the crucial phases is the evaluation of the design of the product, which must satisfy the marketing targets based on the analysis of users' needs. It is commonly acknowledged that a product is successful if people like and buy it. In the ideation phase of a new product, it is paramount to test functionality and performance as well as the users' appreciation of and feeling towards the new product. More specifically, in the case of consumer products characterised by a plurality of offers, interaction and experience should be addressed in the user studies in addition to function and aesthetics. Recent research has focused on the study of the user's emotional reaction when interacting with and experiencing products, which is correlated with the global appreciation of the product and of its attributes. This paper presents an emotional engineering methodology using interactive virtual prototyping for evaluating the user experience and the emotional response to newly designed products early in the development process. The methodology suggests a way to optimise those aspects at the product concept phase. © 2014 Taylor & Francis.
Keywords: emotional engineering | product experience | virtual prototyping
Abstract: This paper describes a system that combines haptic, virtual reality and game technologies in order to assist patients who are recovering from neurological motor deficits in the repetitive performance of manual tasks. Users are able to feel virtual objects by using a haptic device, which acts as a virtual guide by taking advantage of its force-feedback capabilities. A virtual environment forms a haptic interface between the patient and the game. The haptic device is driven by the user's movements and assisted through the Magnetic Geometry Effect (MGE). A preliminary evaluation has been performed in order to validate the system, in which two different tasks have been carried out (knocking down bricks of a hexagonal tower with and without haptic assistance) with the aim of obtaining more information about the accuracy of the device. © 2014 Springer International Publishing.
Keywords: Gaming | Haptic interface | Post-stroke Rehabilitation | Virtual Reality
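A hedged sketch of a magnet-like guidance law in the spirit of the MGE mentioned above: the stylus is attracted toward the closest point of a guide trajectory with a saturated spring force. The gains, the guide path and the formulation itself are assumptions for illustration, not the published implementation.

```python
# Sketch of a magnet-like guidance force: the stylus is attracted toward the
# closest point of the guide trajectory with a clamped spring law.
import math

guide = [(math.cos(a / 50.0), math.sin(a / 50.0)) for a in range(315)]  # unit circle
K_ATTRACT = 40.0     # spring stiffness toward the guide [N/m]
MAX_FORCE = 3.0      # saturation to keep the rendered force safe [N]

def guidance_force(stylus):
    closest = min(guide, key=lambda p: math.dist(p, stylus))
    fx = K_ATTRACT * (closest[0] - stylus[0])
    fy = K_ATTRACT * (closest[1] - stylus[1])
    mag = math.hypot(fx, fy)
    if mag > MAX_FORCE:                      # clamp the rendered force
        fx, fy = fx * MAX_FORCE / mag, fy * MAX_FORCE / mag
    return fx, fy

print("force at (1.05, 0.00):", tuple(round(f, 2) for f in guidance_force((1.05, 0.0))))
```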
Abstract: The research activity presented in this paper aims at extending the traditional planar navigation adopted by many desktop applications for searching information to an experience in a Virtual Reality (VR) environment. In particular, the work proposes a system that allows the user to navigate in virtual environments in which the objects are spatially organized and sorted. The visualization of the virtual objects has been designed, and an interaction method based on gestures has been proposed to trigger the navigation in the environment. The article describes the design and the development of the system, starting from some considerations about the intuitiveness and naturalness required for three-dimensional navigation. In addition, an initial case study has been carried out, which consists of using the system with a virtual 3D catalogue of furniture. © 2014 Springer International Publishing.
Keywords: Gestures | Natural User Interfaces | Navigation | Virtual Catalogue | Virtual Reality
Abstract: The paper proposes a framework for supporting maintenance services in industrial environments through the use of a mobile device and Augmented Reality (AR) technologies. 3D visual instructions about the task to carry out are represented in the real world by means of AR and are visible through the mobile device. In addition to the solutions proposed so far, the framework introduces the possibility of monitoring the operator's work from a remote location. The mobile device stores information for each completed maintenance step and makes it available in a remote database. Supervisors can consequently check the maintenance activity from a remote PC at any time. The paper also presents a prototype system, developed according to the framework, and an initial case study in the field of the food industry. © 2014 Springer International Publishing.
Keywords: Augmented Reality | Framework | Maintenance tasks | Remote Supervision
Abstract: Designing physical interfaces, like the doors of consumer products, able to elicit a positive experience during interaction is now becoming a key priority for design teams. One of the main difficulties of this activity consists of translating all the qualitative perceptual feedback that can be captured from customers into quantitative specifications. Performing this translation is not an easy task, since there are still no effective tools, methodologies or approaches able to guide designers in accomplishing this goal. To overcome this gap, a reverse engineering-based approach is proposed, which guides designers towards the modelling, parameterisation and reproduction of the behaviour of the product interface to be redesigned within a multisensory virtual environment. The intent is to let users experience different behaviours and ask them to identify the desired one, or to express preferences used to update it in real time according to the indications provided. At the same time, a detailed physics model built by the designer is used to convert this desired behaviour into detailed quantitative design specifications. The method is defined as a reverse engineering one for two main reasons: first, the new interaction is derived on the basis of the behaviour of an existing interface taken as reference, and second, a reverse engineering of the user's perceptual preferences is applied to derive new specifications. A case study is discussed to demonstrate the method's effectiveness and to highlight its limitations. © 2014 Taylor & Francis.
Keywords: haptics | interactive systems | product experience | reverse engineering | virtual prototyping
Abstract: In this research work, we present a Multimodal Guidance System (MGS) whose aim is to provide dynamic assistance to persons with disabilities (PWD) while performing manual activities such as drawing, coloring-in and foam-cutting tasks. The MGS provides robotic assistance in the execution of 2D tasks through haptic and sound interactions. Haptic technology provides the virtual path of 2D shapes through a point-based approach, while sound technology provides audio feedback related to the hand's velocity during sketching, filling or cutting operations. By combining this multimodal system with haptic assistance, we have created a new approach with possible applications in fields as diverse as physical rehabilitation, the scientific investigation of sensorimotor learning and the assessment of hand movements in PWD. The MGS has been tested by people with specific disorders affecting coordination, such as Down syndrome and developmental disabilities, under the supervision of their teachers and care assistants inside their learning environment. A Graphic User Interface has been designed for teachers and care assistants in order to provide training during the test sessions. Our results provide conclusive evidence that using the MGS increases accuracy in task execution. Implications for Rehabilitation: The Multimodal Guidance System (MGS) is an interface that offers haptic and sound feedback while performing manual tasks. Several studies have demonstrated that haptic guidance systems can help people recover cognitive function at different levels of complexity and impairment. The applications supported by our device could also play an important role in supporting physical therapists and cognitive psychologists in helping patients recover motor and visuo-spatial abilities. © 2014 Informa UK Ltd. All rights reserved: reproduction in whole or part not permitted.
Keywords: Disabilities | Down syndrome | Haptic feedback | Manual tasks | Multimodal guidance device | People without skills
Abstract: The paper describes the issues related to the implementation and control of a low-cost 1-DOF haptic device. Arduino, which has been widely adopted in interaction design schools as a means to produce low-cost, low-fi and rapid prototypes of interactive systems, is used here as an alternative to higher-performance electronic control boards to control a DC motor and simulate a haptic knob. The idea is to explore the possibility of creating simple prototypes of haptic devices before building the final product. At this stage we have explored the implementation of a simple haptic knob, trying to identify the main issues. In particular, some control strategies have been used for the motor, and their pros and cons are described in the paper. © 2014 IEEE.
Keywords: H.5.2 [Information Interfaces and Presentation]: User Interfaces - Haptic I/O
Abstract: This paper presents a new concept for a desktop haptic shape display that allows rendering and exploring virtual surfaces. The device allows continuous, free-hand contact on a developable plastic strip actuated by a modular handling system. Each module is based on an absolute positioning approach, which allows controlling the represented trajectory and thus overcomes the limits of relative handling systems. The developed system is able to manage the elastic behaviour of the strip in terms of bending, twisting and local curvature control; the latter allows us to handle the local tangency of the strip to the rendered trajectory. The absolute approach allows configuring the system with a variable number of modules, so as to customize it according to the application needs. © 2014 IEEE.
Keywords: Curve rendering | Desktop haptic system | Haptic strip | Virtual Prototyping
Abstract: This study evaluated newly proposed Human-Machine Interface (HMI) design concepts for improving the ergonomics of hydraulic excavators. The design concepts were based on an augmented interaction technique involving the use of a heads-up display (HUD) and coordinated control as HMI elements. Two alternative HMI designs were elaborated in order to separately evaluate the ergonomic impact of the heads-up display and of the coordinated control by comparing them to the standard HMI design. The effectiveness of these three HMI designs in reducing the operators' mental and physical workload was assessed by conducting experiments with human subjects aged 23-35 years. The National Aeronautics and Space Administration's Task Load Index (NASA TLX) method was used for collecting subjective workload scores based on a weighted average of ratings of six factors: Mental Demand, Physical Demand, Temporal Demand, Own Performance, Effort, and Frustration Level. The results showed that the type of HMI design affects different aspects of the operator's workload. Indeed, the proposed augmented interaction proved to be an effective solution for reducing the ergonomic gaps, in terms of mental workload and to a lesser extent physical workload, imposed by the standard HMI design. Relevance to industry: This study proposes innovative HMI solutions featuring a heads-up display and coordinated control to improve the ergonomics of the hydraulic excavator HMI, particularly in reducing the operators' mental and physical workload. The results of this study promise to be an innovative approach for the development of new HMI designs by hydraulic excavator manufacturers. © 2014 Elsevier B.V.
Keywords: Coordinated control | Ergonomics | Heads-up display | Human-Machine Interface | Hydraulic excavator | NASA TLX
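For reference, the weighted NASA TLX score used in the study above combines the six ratings through the weights obtained from the 15 pairwise comparisons; the short worked example below uses fictitious ratings and weights, not data from the experiments.

```python
# Worked example of the weighted NASA TLX score: each 0-100 rating is weighted
# by the number of times the factor was chosen in the 15 pairwise comparisons.
ratings = {           # fictitious 0-100 rating per factor
    "Mental Demand": 70, "Physical Demand": 40, "Temporal Demand": 55,
    "Own Performance": 30, "Effort": 65, "Frustration": 45,
}
weights = {           # fictitious tally of pairwise-comparison wins, summing to 15
    "Mental Demand": 5, "Physical Demand": 1, "Temporal Demand": 3,
    "Own Performance": 2, "Effort": 3, "Frustration": 1,
}
assert sum(weights.values()) == 15
overall = sum(ratings[f] * weights[f] for f in ratings) / 15.0
print(f"weighted NASA TLX workload: {overall:.1f}")   # 57.0 for these values
```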
Abstract: This paper describes a Desktop Haptic System that supports designers in the evaluation and modification of aesthetic virtual shapes. The system exploits the advantages of associating the data of both CAD and multi-body tools, thus maintaining the parametric dependencies between them. In this way, as the parametric data model is modified according to the designer's needs in the CAD system, the changes are consistently reflected in the multi-body system. Modifications of the virtual shape can be made by using a global approach or by using a CAD drawing; the data model in the multi-body system is then used to render a real 2D cross-section of the aesthetic virtual shape through a Desktop Haptic Interface (DHI), which allows free-hand interaction with the aesthetic virtual object. The DHI is linked to the multi-body system by means of MATLAB and Simulink in order to control its servo-actuators. We also present the results of the validation process specifically carried out to evaluate the accuracy and effectiveness of the DHI while rendering 2D cross-sections of a virtual shape. © 2014 CAD Solutions, LLC.
Keywords: CAD associativity | desktop haptic system | multi-body interaction
Abstract: This paper presents a novel transportable and cost-effective Augmented Reality system developed to support interior design activities in the contract design sector. The main functioning principles and technical information about its implementation are provided to show how this system allows overcoming some of the issues related to the use of the Augmented Reality in interior design activities. The effectiveness of this system is verified through two different testing sessions based on a case study, which relates to the contract design sector. The testing sessions involved many interior designers with the intent of demonstrating the possibility of integrating this Augmented Reality system in the "everyday" interior design practice in the specific context of the contract design. © 2014 CAD Solutions, LLC.
Keywords: augmented reality | interior design | virtual prototyping
Abstract: In common industrial practice, the definition of the shapes of styling products is performed by product designers. They usually produce aesthetic shapes by handcrafting scale models made of malleable materials like clay, and expensive physical prototypes are often also manufactured. The paper describes a Desktop Mechatronic System (DMS) conceived to support designers in the creation, evaluation and modification of aesthetic virtual shapes. The objective is a system allowing continuous and smooth free-hand physical interaction with a virtual shape, which is rendered through a dynamic cross-sectional contour or curve in three-dimensional (3D) space. The DMS is a useful tool for designers who are not certain about the quality of the shape of the product and can use it for testing out ideas. The DMS consists of a servo-actuated developable metallic strip that reproduces 3D cross-sectional contours; it has been devised and implemented by using the Minimal Energy Curve (MEC) spline approach, in which an equidistant interpolation points scheme has been adopted to imitate, on a physical bendable strip-like surface, the virtual object created via a Computer Aided Design (CAD) tool. © 2013 Taylor & Francis.
Keywords: 2D cross-section rendering | CAD associativity | desktop mechatronic system | industrial design | virtual prototyping
Abstract: We report a preliminary experiment designed to investigate people's product expectations (for a liquid soap) as a function of its fragrance and packaging. To this end, a series of soap bottles was produced that were identical in shape but had different intensities of colouring (white, pink, or red). The weight of the bottles also varied (either light, 350 g, or heavy, 450 g). Two different concentrations of perfume were added to the liquid soap contained in the bottles (either low or high). The participants evaluated the perceived intensity of the fragrance contained in each bottle, the perceived weight of each bottle, and the expected efficacy of the soap itself (that is, the soap's expected "cleaning ability"). The results revealed a significant main effect of the colour of the packaging on the perceived intensity of the soap's fragrance. Significant effects of the perceived weight of the container on both the perceived intensity of the fragrance and the expected efficacy of the soap were also documented. These results are discussed in terms of the design of multisensory packaging and containers for liquid body soap and, more generally, for body care and beauty products. © 2013 Elsevier Ltd.
Keywords: Multisensory design | Multisensory integration | Packaging | Skincare products
Abstract: The term User Experience (UX) is commonly associated with interactive computer-based systems. Companies operating in the consumer market are recently discovering the importance of designing the UX, and in particular the multisensory UX, of any kind of system, not necessarily high-tech products. One of the most effective ways to design UX is to enable users to interact with a prototype of the system during the design process, in particular already during its initial stages. These prototypes should provide the same experience as interacting with the corresponding real product. To this aim, Virtual Prototypes (VPs) may be effectively used, especially in the early design stages when the activities are still in progress and changes are frequent. Multisensory UX can be effectively designed through VPs only if all the senses involved in the real interaction are recreated in the virtual simulation. To date, despite the growing interest of research and industry in the development and use of VPs, many applications are still limited to visual and sound simulations. This paper focuses on the use of VPs to design multisensory UX, concentrating on the introduction of the sense of touch in the simulation. The methodological approach as well as the development of a case study are described in the paper. © 2013 IEEE.
Keywords: Haptics | User experience design | Virtual prototyping
Abstract: The paper describes a methodological approach specifically developed to capture and transform the qualitative User Experience (UX) of a consumer product into quantitative technical specifications. Merging the potentialities of Virtual Prototypes (VPs) and Digital Mock-Ups (DMU), a flexible design scenario is built to interpret users' desires. Visual, sound and haptic stimuli are reproduced in order to let users live a realistic multisensory experience interacting with the virtual replica of the product. Parametric models are defined to acquire users' preferences while optimization algorithms are used to transform them into technical specifications. The aim of the approach is to propose a robust technique to objectify users' desires and enable their direct and active participation within the product development process. The methodology is derived merging insights coming from four case studies as well as indications available in literature. Specifically the paper describes how to design the multisensory UX with household appliance doors and drawers with a specific focus on the haptic/force feedback objectification. © 2013 The Design Society.
Keywords: Experience design | Haptic feedback | Human in the loop | User centred design | Virtual reality
Abstract: In recent years, the production of consumer goods has exceeded that of industrial products. This has led to changes in the areas of design and production. The target users of industrial products (in the Business-to-Business, B2B, market) are industries that decide to purchase a product on the basis of its technical features, functions and performance. Differently, the target users of consumer products (in the Business-to-Consumer, B2C, market) are consumers who choose a product driven, besides features and functions, by other aspects such as the perceived value, the expected benefits and the emotions elicited. This has brought a paradigm shift in the design process: the design of consumer products is increasingly focusing on the so-called user experience. The designer must design not only the product, but also the user experience related to its use. The resulting product should have a high perceived value and generate positive emotions in the consumer; these factors make a product successful on the market. Therefore, the role of the designer is to design the products and the perceptual aspects of their use, that is, to design the user experience and derive from it the products' specifications. Consequently, the design process has changed in recent years, and the user is now at the center of the design process, which is named user-centered design. Since the new focus is the target users, the evaluation of their interaction with the newly designed products is expected to be rigorous and systematic. An efficient approach has proved to be one in which the validation is carried out by involving users in the early stages of product design. Since at the concept level the product, or a prototype comparable with it at the perceptual level, is typically not available, a thorough validation of its use with users is not possible. However, new Virtual Prototyping methodologies allow us to simulate multisensory user interaction with product concepts early in the design process. This chapter introduces the use of the interactive Virtual Prototyping (iVP) methodology for the design of the user experience with the concept of a new product. Interactive components of new products and their behavior are simulated through functional models, and users can experience them through multisensory Virtual Reality (VR) technologies.
Abstract: In this article, we present an approach that uses two force-sensitive handles (FSH) and a flexible capacitive touch sensor (FCTS) to drive a haptic-based immersive system. The immersive system has been developed as part of a multimodal interface for product design. The haptic interface consists of a strip that can be used by product designers to evaluate the quality of a 3D virtual shape by using touch, vision and hearing, and also to interactively change the shape of the virtual object. Specifically, the user interacts with the FSH to move the virtual object and to appropriately position the haptic interface, retrieving the six degrees of freedom required for both the manipulation and modification modalities. The FCTS allows the system to track the movement and position of the user's fingers on the strip, which is used for rendering visual and sound feedback. Two evaluation experiments are described, involving both the evaluation and the modification of a 3D shape. Results show that the use of the haptic strip for the evaluation of aesthetic shapes is effective and supports product designers in the appreciation of the aesthetic qualities of the shape. © 2013 by the authors; licensee MDPI, Basel, Switzerland.
Keywords: Conformable sensor | Flexible sensor | Haptic interface | Haptic strip | Tactile data processing
Abstract: The need for companies to improve their competitiveness may lead to innovation and the reconceptualization of traditional products and processes, with companies making an effort to enhance product elements related to functionality, attractiveness, technology and sustainability, and implementing mass-customisation concepts. Mass-customised products are developed to satisfy specific customer needs, in line with increasing demand for product variety and customisation. The analysis of what customers really want, capturing the Voice of the Customer (VOC), is one of the strategies used to establish effective product development processes. Using a VOC survey, it is possible to transform customer needs into the functional and psychological requirements of the product. This paper presents a methodology based on Virtual Reality (VR) technologies to support the capturing of the VOC in regard to the visual, haptic and auditory characteristics of products. This method can be applied to the beginning of the product development process, to allow companies to deduce from the data the requirements of new industrial customised products. A flexible and interactive Virtual Prototype (VP) of a product category is then developed as a product platform in a draft version by designers and configured according to customer needs, using an immersive VR environment. This method, based on the use of VP, reduces the number of physical prototypes that need to be manufactured during the product development process, thus reducing overall costs. In addition, the VP based method supports the mass-customisation process of products through the real-time integration and collection of data for product configuration preferences, involving as many users as possible representative of the target users of the new products. To demonstrate this process a case study concerning the development of the VP for a washing machine, a summary of test sessions with users and results are presented. Specifically, the results presented in this paper are related to improvements in capturing the VOC and reductions in Virtual Prototyping cost and time. © 2012 Springer Science+Business Media, LLC.
Keywords: Product customisation | Virtual Prototyping | Voice of the Customer
Abstract: Haptic perception constitutes an important component of our everyday interaction with many products. At the same time, several studies have, in recent years, demonstrated the importance of involving the emotions in the user-product interaction process. The present study was designed to investigate whether haptic interactions can affect, or modulate, people's responses to standardized emotional stimuli. Thirty-six participants completed a self-assessment test concerning their emotional state, utilizing as a pointer either a PHANToM device simulating a viscous force field while they moved the stylus, or else a stylus with no force field. During the presentation of the emotional pictures, various physiological parameters were recorded from the participants. The results revealed a significant difference in the self-reported arousal associated with the pictures but no significant difference in the physiological measures. The behavioural findings are interpreted in terms of an effect of the haptic feedback on participants' perceived/interpreted emotional arousal. These results suggest that haptic feedback could, in the future, be used to modify participants' interpretation of their physiological states. © 2013 IEEE.
Keywords: Affective Haptics | Emotions | Haptic Interactions & Design | Physiology
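The viscous force field condition described above can be summarized as a force opposing the stylus velocity, F = -b v; the snippet below is a minimal illustration with an arbitrary damping value, not the parameters actually used with the PHANToM in the experiment.

```python
# Minimal sketch of a viscous force field: the rendered force opposes the
# stylus velocity, F = -b * v. Damping value and velocity are illustrative.
B_VISCOUS = 2.5   # damping coefficient [N*s/m]

def viscous_force(velocity):
    """Return the 3D force opposing the stylus velocity."""
    return tuple(-B_VISCOUS * v for v in velocity)

stylus_velocity = (0.10, -0.05, 0.02)    # [m/s]
print("rendered force [N]:", viscous_force(stylus_velocity))
```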
Abstract: Several activities of the product development process, such as ergonomic analyses, usability testing, and User Experience (UX) design in general, require humans to be involved as testers. To be effective, these tests must be performed on prototypes that are as similar as possible to the final product, which is costly and sometimes difficult to achieve during the development process, especially at its earliest stages. Functional Mock-Up (FMU) methods and tools can be of great help, because they allow technological aspects of the products, such as electronics, hydraulics and mechanics, to be represented and managed in a simple and effective way. Mathematical equations allow the product behavior to be determined from input values representing the application environment of the product. At the moment, an FMU model is effective in simulating product behavior from the technological point of view, but user-interaction issues are left aside. The research described in this paper aims at widening the coverage of FMU to user-product interaction issues. The goal is to evaluate the possibility of substituting real users with a characterization of them, and to model and simulate the interaction in a homogeneous way together with all the other product aspects. All of this makes the research activities very challenging, and the result is a sort of FMU-assisted interaction modeling. As an evolution of what is generally recognized as hardware- and software-in-the-loop, this methodology will be referred to as human-in-the-loop. © 2013 Springer-Verlag Berlin Heidelberg.
Keywords: Functional Mock-Up | Interaction | User Experience
Abstract: This paper describes a desktop-mechatronic interface conceived to support designers in the evaluation of aesthetic virtual shapes. The device allows continuous and smooth free-hand contact interaction on a real, developable plastic tape actuated by a servo-controlled mechanism. The objective in designing this device is to reproduce a virtual surface with a consistent physical rendering well adapted to designers' needs. The desktop-mechatronic interface consists of a servo-actuated plastic strip devised and implemented using seven interpolation points: by using the MEC (Minimal Energy Curve) spline approach, a developable real surface is rendered taking into account the CAD geometry of the virtual shapes. In this paper, we describe the working principles of the interface, using both absolute and relative approaches to control the position of each control point on the MEC spline. We then describe the implemented methodology, which starts from the CAD geometry linked to VisualNastran in order to maintain the parametric properties of the virtual shape, present the co-simulation between VisualNastran and MATLAB/Simulink used to control the system, and finally report the results of the testing session specifically carried out to evaluate the accuracy and effectiveness of the mechatronic device. © 2013 Covarrubias et al.
Keywords: Desktop haptic strip | Mec (minimal energy curve) spline approach | Surface rendering
Abstract: The paper describes the results of an ongoing research activity whose aim is to allow companies operating in the consumer goods market to design the multisensory experience of their products. In the case of the household appliances market, which is the research context of this study, the user experience derives from the interaction with specific product features such as the door, buttons, and drawers. Designing a good multisensory experience is complex, since it means taking into account the combination of visual, auditory and haptic feedback a user perceives when interacting with the product. Virtual Reality offers the technologies to design and test that experience through virtual prototypes, even if to date there is a lack of methodological approaches to practically guide and support this design activity. Relying on the results of the authors' previous research, the paper describes further methodological advances focused on making the proposed approach usable in current design practice. The case study chosen to demonstrate the effectiveness of the method is a dishwasher door, and the paper describes how to re-engineer the haptic feedback of a commercial model in order to make it more perceptually appealing at the moment of purchase. © 2013 CAD Solutions, LLC.
Keywords: Haptic interaction | Reverse engineering | User experience design | Virtual prototyping
Abstract: The paper describes an application based on Virtual and Augmented Reality technologies specifically developed to support maintenance operations of industrial products. Two scenarios have been proposed. In the first an operator learns how to perform a maintenance operation in a multimodal Virtual Reality environment that mixes a traditional instruction manual with the simulation, based on visual and haptic technologies, of the maintenance training task. In the second scenario, a skilled user operating in a multimodal Virtual Reality environment can remotely train another operator who sees the instructions about how the operations should be correctly performed, which are superimposed onto the real product. The paper presents the development of the application as well as its testing with users. Furthermore limits and potentialities of the use of Virtual and Augmented Reality technologies for training operators for product maintenance are discussed. © 2013 Copyright Taylor and Francis Group, LLC.
Keywords: assembly | haptics | virtual prototyping
Abstract: The current critical situation of Western industry and the introduction of information technologies have profoundly changed the product development process. One of the most important changes is the integration of the two traditional fields of product design and product engineering into the new concept of industrial Design & Engineering. The consequent shift and extension of the designer's area of expertise have led to an expansion of the designer's typical knowledge: more technical knowledge, as well as knowledge more focused on conceptual issues and creativity. This trend is fostered by the intensive use of computer systems and also by various Design & Engineering courses, where it has been possible to observe experimentally the difficulty students have in learning the necessary body of knowledge and in managing a wider product development process. However, the multidisciplinary approach to the product development process and to the related education issues is still based on tools and methods developed for a specific kind of user, or often used in an uncoordinated way. The research presented in this paper aims at defining the guidelines for developing an integrated framework to support Design & Engineering education and the multidisciplinary design process, based on a structured integration of the knowledge and tools currently used by the two main areas of reference. The paper presents the guidelines for the integrated framework and the experimental activities used for its validation. These activities have allowed the authors to check the framework's usability during team design activities and also to verify its effectiveness in improving students' learning capabilities. Elements derived from the analysis of the experimental data demonstrate that both students and professional designers can use the framework to assist them throughout the design process and that the learning of project-related knowledge is fostered. © 2013 TEMPUS Publications.
Keywords: Design & engineering | Design process | Design education | Multidisciplinary design
Abstract: In the domain of industrial product development, products have been traditionally communicated to final users by means of technical drawings, sketches and physical prototypes. Recently, companies have tended to develop digital versions of products, which can be used with the purpose of communicating a new product to users, and also with the aim of allowing users to evaluate the product and its variants before its physical construction. Virtual prototyping is a relatively recent practice used in various industrial domains, which aims at anticipating a product that does not exist in reality yet. This practice can be used for evaluating the aesthetic quality of a product, its functional features and also its ergonomic and usability aspects. Current virtual reality technology well supports the implementation of effective virtual prototypes. In fact, one can touch, move, manipulate, and operate the virtual prototypes of products, such as household appliances, electronic devices, etc., with a good degree of realism. In order to do this, the interaction with virtual prototypes is multimodal and multisensory, and is often based on the combination of visual, haptic and auditory modalities. This paper shows how virtual prototypes can be used instead of physical artefacts or mock-ups for the communication of a new product or of variants of an existing product, and for the preliminary evaluation of its usage. The effectiveness of this practice is proved by tests performed by users. © 2013 Copyright Taylor and Francis Group, LLC.
Keywords: haptic interaction design | virtual environments | virtual prototyping
Abstract: The paper describes a methodology that can be employed to analyze aspects related to human interaction with consumer products during the Product Development Process, thanks to the use of mixed prototypes. The methodology aims at helping designers take decisions earlier than in current practice, which is based on physical prototypes that are not easily modifiable. The authors' method considers the interaction with adaptable mixed prototypes as a possible validating procedure for product interaction-enabling features: a multimodal environment integrating three sensory modalities, namely vision, hearing and touch, is created to perform these validations. The paper first describes the requirements for the creation of the multimodal environment. It then focuses on the opportunity of using an approach based on mixed prototypes rather than on completely virtual ones: the intent is to increase the level of "realism" of the simulation by overcoming the limitations of current technologies for the sense of touch. Finally, a case study is discussed, starting from the analysis of a commercial consumer product up to the interaction with the developed mixed prototype. The expected benefits for the product development process are highlighted. © 2013 CAD Solutions, LLC.
Keywords: Haptic interaction | Product virtualization | Virtual and mixed prototyping
Abstract: Today, the tests of a new product in its conceptual and design stages can be performed by using digital models with various levels of complexity. The level of complexity depends on the nature and the accuracy of the tests to be performed. Moreover, the tests may or may not involve interaction with humans. This second aspect in particular must be taken into account when developing the simulation model, since it introduces a different kind of complexity with respect to simulations where humans are not involved. Simulation models used for numerical analyses of the behavior of the product (such as Finite Element Analysis, multi-body analysis, etc.) are typically named Digital Mock-Ups. Instead, simulations that are interactive in nature, requiring humans-in-the-loop, are named interactive Virtual Prototypes. They cannot be intended as a simple upgrade of a CAD model of a product; rather, they are a combination of functional models, mapped into sensorial terms and then accessed through multisensory and multimodal interfaces. In this paper, the validity of this concept is demonstrated through some case studies where interactive Virtual Prototypes are used to substitute the corresponding physical ones during activities concerning product conceptualization and design. © 2013 CAD Solutions, LLC.
Keywords: CAD modeling | Mixed prototyping | Simulations | Virtual prototyping
Abstract: The paper describes an application based on low-cost Virtual Reality technologies whose aim is to help designers test some functional aspects of their product concepts without the need to build physical mockups. In particular, the application allows designers to test aspects related to the usage of products. A case study that enables designers to verify the ergonomic aspects of the interactive Virtual Prototype (iVP) of a commercial refrigerator has been implemented. The iVP is based on a three-dimensional scale representation of the product and the use of a low-cost user interface as input/output device. The paper describes the implementation of the iVP and the results of a preliminary user study performed in order to validate the visuo-haptic interaction approach. © Springer-Verlag London 2013.
Abstract: For more than 40 years the development of Computer Aided Design (CAD) tools has been focused on the description of the geometry of products. More recently, CAD tools have evolved into tools supporting Product Lifecycle Management (PLM), which are more oriented towards the management aspects of the product development process than towards the design process itself. Recently, a new design method has been introduced that adopts a top-down approach, starting from the definition of a Functional Mock-Up (FMU) that allows simulating the overall behavior and use of the product at the concept level, before the detailed design. This method is closer to the typical logical sequence of design, where the designer first has an overall view of a system and of its sub-components, and then takes care of the details. The method is supported by commercial tools, such as the LMS-Amesim suite, or by open-source software tools based on the Modelica language. Modelica is an open-source language that allows designers to integrate and describe at the functional level several aspects of a system, including mechanical, electrical, thermal, hydraulic, control and others, and to simulate them all together. The paper analyses this methodological approach and presents some applications where systems are designed using a functional modeling approach. Copyright © 2013 by ASME.
Abstract: The paper describes the results of a research activity on the design of a positioning system that includes both a physical and a virtual 3-DOF platform carrying a Desktop Haptic Interface (DHI). The positioning system allows the user to interact with a virtual shape through a combination of linear and rotational motions, some of them driven by the user and some driven by the virtual shape. In addition, rendering a physical 2D cross-section through the DHI permits the assessment of virtual prototypes of industrial products with aesthetic value. Typically, virtual objects are modified several times before reaching the desired design, increasing the development time and, consequently, the final product cost. The desktop haptic system proposed here (which includes the positioning system and the DHI) will reduce the number of physical mockups during the design process, allowing designers to perform several phases of the product design process continuously and without interruption. In particular, the system has been developed with the aim of supporting designers during the evaluation of the aesthetic quality of a virtual product. Copyright © 2013 by ASME.
Abstract: The paper describes a technique that allows measuring and annotating real objects in an Augmented Reality (AR) environment. The technique is based on marker tracking and aims at enabling the user to define the three-dimensional position of points within the AR scene by selecting them directly on the video stream. The technique consists of projecting the points, selected directly on the monitor, onto a virtual plane defined according to the two-dimensional marker used for the tracking. This plane can be seen as a virtual depth cue that helps the user place the points in the desired position, and the user can move it to place points within the whole 3D scene. By using this technique, the user can place virtual points around a real object with the aim of taking measurements of the object, by calculating the minimum distance between the points, or of adding annotations to the object. To date, these kinds of activities can be carried out only by using more complex systems or by knowing the shape of the real object a priori. The paper describes the functioning principles of the proposed technique and discusses the results of a testing session carried out with users to evaluate the overall precision and accuracy. Copyright © 2013 by ASME.
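The point-placement principle described above amounts to intersecting the viewing ray of a selected pixel with a virtual plane attached to the marker; the following sketch shows this with a pinhole camera model and made-up intrinsics and poses, not the system's actual calibration.

```python
# Hedged sketch: back-project a selected pixel into a viewing ray and intersect
# it with a marker-defined virtual plane. Intrinsics and poses are invented.
import numpy as np

def pixel_to_ray(u, v, fx=800.0, fy=800.0, cx=320.0, cy=240.0):
    """Back-project a pixel into a camera-frame ray direction (pinhole model)."""
    d = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    return d / np.linalg.norm(d)

def intersect_plane(ray_origin, ray_dir, plane_point, plane_normal):
    """Return the 3D intersection of a ray with the marker-defined plane."""
    denom = float(np.dot(plane_normal, ray_dir))
    if abs(denom) < 1e-9:
        return None                     # ray parallel to the plane
    t = float(np.dot(plane_normal, plane_point - ray_origin)) / denom
    return ray_origin + t * ray_dir if t > 0 else None

origin = np.zeros(3)                                 # camera centre
ray = pixel_to_ray(410, 260)                         # pixel selected on the video
p = intersect_plane(origin, ray, np.array([0.0, 0.0, 0.5]), np.array([0.0, 0.0, 1.0]))
print("placed point (camera frame, m):", np.round(p, 3))
```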
Abstract: Several studies in the field of Emotional Engineering have focused on studying new methods to design products that elicit positive emotions in the future consumers, focusing on the study of the visual perception of the products. A typical application of Emotional Engineering (E-E) is in the product design domain, where the design of new products tends to be validated even with respect to emotional responses, well before releasing the final product design. This paper proposes a new method and process for product design that is driven by the validation of emotional aspects, elicited by multi-sensory experience with products, more in line with the E-E perspective. Copyright © 2013 by ASME.
Abstract: The paper describes a novel application of real-time finite element analysis controlled by a haptic device. The user can impose displacement constraints on a virtual structure using a probe and sense in real time the response in terms of forces on her/his hand. In addition, conventional color-map results are visualized on a desktop monitor. The application has been developed with the aim of simplifying the teaching of the mechanical behavior of materials in engineering schools by transforming the learning phase into an enactive process. A set of examples commonly used in mechanical engineering courses has been implemented and tested. The paper describes and discusses the system implementation, the potentialities and the issues of such an approach. Copyright © 2013 by ASME.
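As a toy illustration of the enactive principle above, the sketch below assembles the stiffness matrix of a 1D bar discretised into linear finite elements, imposes a tip displacement as a stand-in for the haptic probe, and returns the reaction force; geometry, material and scaling are arbitrary and not the examples used in the paper.

```python
# Toy linear FE example: a 1D bar with a prescribed tip displacement; the
# reaction force at the probed node is what would be fed back to the hand.
import numpy as np

E, A, L, n_el = 210e9, 1e-4, 1.0, 10          # bar: E [Pa], A [m^2], L [m], elements
k = E * A / (L / n_el)                        # element stiffness [N/m]
n = n_el + 1
K = np.zeros((n, n))
for e in range(n_el):                         # assemble global stiffness matrix
    K[e:e + 2, e:e + 2] += k * np.array([[1, -1], [-1, 1]])

u = np.zeros(n)
u[0] = 0.0                                    # clamped end
u[-1] = 1e-6                                  # displacement imposed by the probe [m]
free = slice(1, n - 1)                        # interior (free) nodes
u[free] = np.linalg.solve(K[free, free], -K[free, [0, -1]] @ u[[0, -1]])
reaction = K[-1, :] @ u                       # force at the probed node [N]
print(f"reaction force at the probe: {reaction:.1f} N")
```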
Abstract: The paper describes a desktop haptic interface (DHI) conceived to support designers in the evaluation and modification of aesthetic virtual shapes. The objective in designing this device is to reproduce a 2D cross-section of virtual objects with a consistent physical rendering well adapted to designers' needs, allowing continuous and smooth free-hand contact interaction on a real, developable metallic strip. The desktop haptic interface consists of a servo-actuated metallic strip devised and implemented by using the Minimal Energy Curve (MEC) spline approach, in which an equidistant interpolation points scheme has been adopted for modeling the structural chain consisting of several rigid struts and pivots. We then present the co-simulation between the multi-body environment and MATLAB/Simulink used to control the system, and finally the results of the experimental validation session specifically carried out to evaluate the accuracy and effectiveness of the interface. Copyright © 2013 by ASME.
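As a stand-in for the MEC spline step described above, the sketch below fits a natural cubic spline (which minimises the integrated squared second derivative among interpolants, in the spirit of a minimal-energy curve) through equidistant points and samples its curvature; the control points are invented, and the strut-and-pivot kinematics of the interface are not reproduced.

```python
# Stand-in for the interpolation step: a natural cubic spline through
# equidistant control points, with the local curvature sampled along the strip.
import numpy as np
from scipy.interpolate import CubicSpline

x = np.linspace(0.0, 0.6, 7)                                  # equidistant points [m]
y = np.array([0.00, 0.02, 0.05, 0.06, 0.04, 0.01, -0.01])     # cross-section heights [m]

spline = CubicSpline(x, y, bc_type="natural")                 # zero curvature at the ends
xs = np.linspace(x[0], x[-1], 200)
curvature = spline(xs, 2) / (1.0 + spline(xs, 1) ** 2) ** 1.5 # signed curvature [1/m]

print(f"max |curvature| along the strip: {np.max(np.abs(curvature)):.2f} 1/m")
```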
Abstract: This paper describes the design and implementation of a 1-DOF servo-actuated stylus used as an end effector in a desktop haptic device. The desktop haptic device is part of a multimodal system aimed at the assessment, training and rehabilitation of the arm, forearm and hand while the user performs several tasks. Patients use the haptic device, which carries the servo-actuated stylus, to draw simple and complex sketches; in this way, the patient is able to feel the virtual sketch through the haptic device, which acts as a virtual guide by taking advantage of its force-feedback capabilities. The therapist is able to control the rotation of the 1-DOF stylus according to the requirements of the patient. © Springer-Verlag Berlin Heidelberg 2013.
Keywords: Haptic guidance | Multimodal system | Sketching task | Stroke patient
Abstract: Space Planning (SP) is a process that makes an environment more ergonomic, functional and aesthetically pleasing. The introduction of Computer Aided tools for this kind of practice has led to an increase in the quality of the final result, thanks to versatile tools used for generating different options to consider in the evaluation. In particular, tools based on Augmented Reality (AR) technologies allow evaluating several options directly in a real room. In this paper, an AR system developed with the aim of supporting Space Planning activities is proposed. The system has been developed in order to overcome some problems related to tracking in wide environments and to be usable in different types of Space Planning environments. The paper also presents a qualitative evaluation of the AR system in three different scenarios. The positive results obtained through these evaluation tests show the effectiveness and suitability of the system in different Space Planning contexts. © 2013 Springer-Verlag Berlin Heidelberg.
Keywords: Augmented Reality | HCI | Space Planning design
Abstract: This chapter presents a methodology that the authors developed for the evaluation of a novel device based on haptic guidance to support people with disabilities in sketching, hatching, and cutting shapes. The user's hand movement is assisted by a sort of magnet or spring effect attracting the hand towards an ideal shape. The haptic guidance device has first been used as an input system for tracking the sketching movements made by the user according to the visual feedback received from a physical template, without haptic assistance; it has then been used as an output system that provides force-feedback capabilities. The drawn shape can also be physically produced as a piece of polystyrene foam. The evaluation methodology is based on a sequence of tests aimed at assessing the usability of the device and at meeting the real needs of unskilled people. The system has been evaluated by a group of healthy, unskilled people, by comparing the analysis of the tracking results. The authors have used the results of the tests to define guidelines about the device and its applications, switching from the concept of "testing the device on unskilled people" to the concept of "testing the device with unskilled people." © 2013, IGI Global.
Abstract: The paper describes the preliminary results of a research activity on the design of consumer-product interaction by means of interactive Virtual Prototypes (iVPs). Unlike Virtual Prototypes (VPs), which can be defined as an integration of geometries and functional multi-domain and multi-physics models, interactive Virtual Prototypes can be implemented as an integration of functional models for each sense - in this case vision, touch and hearing - which are parametric and independent from each other, so that they can be modified on request. Thanks to the use of iVPs, the interaction design approach can be reversed. First, the iVP can be used for the design and evaluation of the consumer-product interaction with final users, and then the resulting interaction parameters can be mapped back to the functional models of the VPs, following a sort of reverse engineering activity. The interaction specifications are therefore not derived from questionnaires and focus groups with potential customers; instead, products are shown, tested, and customised directly with the potential buyers. This method presents several advantages for companies. It allows designers to more easily and directly capture the voice of the customers by gathering immediate feedback about new interaction modalities, and to design and validate at the same time the emotional response to their products. Besides, these studies and validations can be performed when the product design is in its infancy and technical decisions have not yet been taken, with the consequent advantage that design changes are not too expensive. The paper demonstrates the validity and potential of the methodology through some case studies. Copyright © 2012 by ASME.
Abstract: The paper presents the design, construction, validation and testing of a Haptic Guidance Device whose aim is to provide dynamic assistance while performing manual activities such as drawing, hatching and cutting. A commercial PHANTOM haptic device was modified by adding a pantograph mechanism in order to increase the haptic working area. The force feedback workspace provided by the PHANTOM device is quite limited, 160 W x 120 H mm. This workspace is sometimes not enough to reproduce manual tasks over a large-scale area, as is often required in several educational activities (e.g. sketching, hatching and cutting tasks). This paper evaluates a low-cost solution for increasing the haptic working area provided by the PHANTOM device. The pantograph mechanism has been linked to the haptic device in order to increase the working area at a 2:1 scale. The user's hand moves a pen linked to the device along 2D predefined shapes, and the pen's position has been tracked in 2D coordinates at 25 kHz in order to record all the data for subsequent analysis. The haptic guidance device is also equipped with a hot-wire cutting system for physically producing the drawn shape as a piece of polystyrene foam. The haptic guidance device has been tested by people with specific disorders affecting coordination, such as Down syndrome and mental retardation, under the supervision of their teachers and care assistants. The results of the study show that higher performance levels can be achieved while performing manual tasks such as sketching, hatching and cutting using the haptic guidance device. Copyright © 2012 by ASME.
Abstract: In face-to-face communication, touch can establish intimacy, and therefore the presence of tactile stimulation can enhance interpersonal relationships. While human-human interaction has been shifting from face-to-face physical conversations to electronically mediated forms of communication, current technologies are not able to provide a multimodal sensory experience that supports haptic interaction in addition to the visual and auditory channels. Within the haptic research field, affective haptics explores emotional interaction and perception mediated via touch that is simulated by technology. Besides, wearable technology and tangible interfaces can be employed as a solution to bridge the gap between the digital and physical worlds by making the body fully engaged with the interaction. This paper presents the findings of a design practice that explores the avenues of affective tactile interaction through wearable technology, which can artificially produce tactile stimulations as a medium for instant communication between two people. The findings are presented in the light of the theoretical background, observations and analysis of the design practice. Copyright © 2012 by ASME.
Abstract: In this paper, we present a new Augmented Reality (AR) tracking technique that integrates the marker-based tracking with the tracking ability of a commercial mobile robot. The role of the mobile robot is to co-work with the user for extending the working space of the marker-based tracking technique. The robot follows the user's movements during the exploration in the AR environment and updates the position of a fiducial marker, which is fixed on it. The robot is automatically controlled through the device used to visualize the AR scene. The paper discusses the issues related to the integration of these two tracking techniques and proposes an AR application, which has been developed to demonstrate the feasibility of our approach. Technical issues and performances have been assessed through several testing sessions. Copyright © 2012 by ASME.
Abstract: The design of a new product requires a series of validations before its approval and manufacture. Virtual prototyping based on mixed reality technology seems a promising technique, especially when applied to the design review of products that are characterised by interaction with users. This paper presents a new methodology that allows the collaborative design review and modification of some components of automotive interiors. Professionals can work together through a mixed reality distributed design platform by interacting intuitively and naturally with the virtual prototype of the product. The methodology has been validated by means of tests with users, aiming at assessing the effectiveness of the approach, and at identifying potential usability issues. © 2012 Copyright Taylor and Francis Group, LLC.
Keywords: automotive design | collaborative design | design methodology | mixed reality | virtual reality
Abstract: Emotion is an intangible fact that is woven into interpersonal relationships in order to sustain social bonds with more understanding, trust and intimacy. Today, people live in an era of digitally mediated HHI (Human-to-Human Interaction), which is transforming the traditional ways of communication and social dynamics. In order to increase the affective level of communication, research areas such as affective haptics and wearable technology have emerged to find more affective interaction avenues and convey human emotions through technology in a more physical way. This paper presents a survey and a design practice that aim to explore how wearable technology can enhance emotional intimacy by introducing new ways of expressing emotions. The analysis of this study addresses the privacy issues of emotional expression through wearable technology and the intimacy of empathic communication through these agents.
Keywords: Design practice | Emotion | Interaction | Intimacy | Wearable technology
Abstract: The paper describes a multimodal guidance system whose aim is to improve the manual skills of people with specific disorders, such as Down syndrome, mental retardation, blindness, autism, etc. The multimodal guidance system provides assistance in the execution of 2D tasks, for example sketching, hatching and cutting operations, through haptic and sound interactions. The haptic technology provides the virtual path of 2D shapes through a point-based approach, while the sound technology provides audio feedback about the user's actions while performing a manual task, for example starting and/or finishing a sketch, and alarms related to the hand's velocity during sketching, filling or cutting operations. Unskilled people use these interfaces in their educational environment. © 2012 Springer-Verlag.
Keywords: Haptic Guidance | Sound Interaction | Unskilled People
Abstract: In common industrial practice, the definition of the shapes of styling products is performed by product designers by handcrafting scale models made of clay, exploiting their manual skills. Several Computer-Aided Styling tools have been introduced over the years with the aim of allowing product shapes to be created digitally. However, developers' limited interest in providing designers with a natural interface that would allow them to continue using their learned manual skills has led designers to keep working with clay and producing expensive physical prototypes. Enactive Interfaces have been demonstrated to be effective in supporting the exploitation of human skills, including the manual skill of industrial designers. The paper describes an Enactive Interface that has been conceived to support designers in the evaluation of aesthetic shapes, which consists of a combination of active manual and visual exploration of the shape. The Enactive Interface is a combination of visuo-haptic technologies and has been implemented on the basis of observations of the shape evaluation activity performed by manually skilled designers. In the paper we describe the testing sessions performed to capture the designers' manual skill, the working principles of the enactive interface and, finally, the results of the subsequent testing activities specifically carried out to evaluate the effectiveness of the interface. © 2012 Springer-Verlag.
Keywords: Aesthetic shape evaluation | Designers' skills | Enactive Interface | Haptic interface | Multimodal interface
Abstract: The paper describes the development of a multimodal environment that can be used as a tool to train operators to perform maintenance tasks on industrial products during their lifecycle. The environment provides a visual and haptic representation of a product through the use of commercially available hardware and software technologies. The new environment mixes interactive simulations of products, based on Virtual Reality technologies, with the information included in traditional instruction manuals. The paper discusses what can be effectively simulated with the available technology, and how the current technological limitations can be overcome by using the potential offered by a multimodal approach. Technology limitations can also be overcome by mixing two different strategies for coupling parts: the physics-based and the constraint-based approach. A case study provided by a company working in the field of household appliances has been used to prove the concept, and to develop, test and finally refine the multimodal environment. The paper describes the development of the multimodal environment and presents the results of the user tests. Copyright © 2012 by ASME.
Abstract: Interacting with computers by using body motion is one of the challenging topics in the Virtual Reality field, especially as regards interaction with large-scale virtual environments. This paper presents a device for interacting with a Virtual Reality environment that is based on the detection of the muscular activity and movements of the user through the fusion of two different signals. The idea is that through muscular activity a user is capable of moving a cursor in the virtual space and performing actions through gestures. The device is based on an accelerometer and on electromyography, a technique that derives from the medical field and is able to recognize the electrical activity produced by skeletal muscles during their contraction. The device consists of cheap and easy-to-replicate components: seven electrode pads and a small, wearable board for the acquisition of the sEMG signals from the user's forearm, a 3-DOF accelerometer positioned on the user's wrist (used for moving the cursor in the space) and a glove worn on the forearm in which these components are inserted. The device can be easily used without tedious settings and training. In order to test the functionality, performance and usability of the device, we have implemented an application that has been tested by a group of users. Specifically, the device has been used as a natural interaction technique in an application for drawing in a large-scale virtual environment. The muscular activity is acquired by the device and used by the application to control the dimension and color of the brush. Copyright © 2011 by ASME.
Abstract: The paper describes a collaborative Mixed-Reality (MR) environment to support product design assessment. In particular, we have developed a collaborative platform that enables us to improve the design and the evaluation of car interiors. The platform consists of two different systems: the 3D Haptic Modeler (3DHM) and the Mixed Reality Seating Buck (MRSB). The 3DHM is a workbench that allows us to modify the 3D model of a car dashboard by using a haptic device, while the MRSB is a configurable structure that enables us to simulate different driving seats. The two systems allow collaboration among designers, engineers and end users in order to obtain, as the final result, a concept design of the product that satisfies both design constraints and end users' preferences. The usability of our collaborative MR environment has been evaluated by means of testing sessions, based on two different case studies, with the involvement of users. Copyright © 2011 by ASME.
Abstract: The haptic feedback perceived during the interaction with consumer products is an important aspect, since it contributes to creating, together with the aesthetic features and sonic feedback, the emotional response during the first contacts with a product. This may be decisive for the user's decision to purchase one product instead of another. The design of the haptic behavior of a product's interaction elements can therefore be both a successful strategy for capturing consumers' attention and a necessity for avoiding problems during use. The paper describes the process of virtualization of the interaction with an industrial consumer product by means of haptic, sound and visualization technologies in order to obtain a prototype (an interactive Virtual Prototype) useful to design and test the haptic feedback of interaction elements directly with end users. Copyright © 2011 by ASME.
Abstract: Clothing is the most intimate artefact that interacts with both body and society. Over the past quarter century, with the introduction of new technologies, people have been experiencing unprecedented changes in their behaviours and way of living. Technology is becoming a large part of daily life and its unchecked influence has many emotional consequences, many of which are overlooked. The aim of this research is to integrate textiles with new technologies to create garments that provide new social interactions and avenues for emotional expression. An experimental project has been carried out to explore new possible interaction scenarios through wearable technologies by turning an intangible phenomenon, emotion, into a tangible artefact. The paper addresses the research question 'How can an intangible fact, which is known to exist but has no physical matter, emotion, be embodied and transmitted through technology?' by means of a theoretical study on wearable technologies and their role in emotional communication, followed by an experimental project carried out through both virtual and real prototypes. The paper not only focuses on the prototyping process, but also addresses the user experience during the interaction through various user perception tests. © 2011 ACM.
Keywords: emotion | intangible | mediation | social interaction | wearable technologies
Abstract: The paper deals with the application of haptic technology to improving the manual skills of people with specific disorders, such as Down syndrome, mental retardation, etc. The development of a haptic cutting system forms the basis of this paper, showing the specifics of developing and verifying hardware for this group of users. The paper shows the suitability of haptic technology for a concrete application designed for users with specific disorders. The cutting operation is performed using a hot-wire tool, which is linked through an R-R mechanism to a PHANTOM device. The haptic cutting device is able to cut soft materials such as expanded and extruded polystyrene foam from 5 to 20 mm thickness. The device is driven by the user's hand movement and assisted through the Magnetic Geometry Effect (MGE). The haptic cutting device has been used as an input system for tracking the sketching movements made by the user according to the visual feedback received from a physical template without haptic assistance. Then the cutting device has been used as an output system that provides force feedback capabilities. Finally, the system performance has been evaluated by comparing the analysis of the tracking results with the final polystyrene components. © 2011 IADIS.
Keywords: Assisted cutting system | Haptic technology | Unskilled people
Abstract: This paper describes the servo-actuated transmission system required to drive a desktop haptic strip interface. The haptic strip is a mechatronic device used for the exploration of virtual surfaces with aesthetic value. The simulation of tasks such as the exploration of real aesthetic surfaces, performed by industrial designers in order to check the quality of prototypes, requires full-hand contact with the shape on a one-to-one scaled representation of the object. Our mechatronic device allows continuous, free-hand contact on a developable real plastic tape actuated by a servo-controlled mechanism in which the tessellation approach is used: the triangular mesh simplifies conforming the real developable surface to the virtual one. This paper discusses the design concept, novel kinematics and mechanics, improvements of the transmission system and the control for the Desktop Strip. © 2011 by ASME.
Abstract: Due to the huge impact of communication technologies, the meaning of social existence is changing towards the use of electronic devices as extensions of the senses. While technology is becoming intimate, reaching farther into users' lives than ever before, wearable technology has emerged as a new research field where technology is worn to provide a sensory interface. Through the integration of technology and garments, the research aims to discover new ways of creating wearables that provide new avenues for emotional expression and social interaction. Emotional embodiment through Wearable Technology can strengthen social bonds through a paradigm of increased emotional expression, understanding, and trust. To verify this hypothesis, a set of dynamic garments has been built by developing both virtual and real prototypes and performing user tests. This paper addresses new scenarios of sensing, interacting, and interpreting emotions through Wearable Technology and its effects on the user's perception. © 2011 by ASME.
Abstract: This paper describes a multifunctional haptic machine, a device that is used for assisting unskilled people in the assessment and training of hand movements. Sketching, hatching and cutting operations are assisted through the multifunctional device by using the haptic point-based approach. The device enables users to haptically interact with a 2D virtual template, which acts as a virtual tool path taking advantage of its force feedback capabilities. For sketching, hatching and cutting operations, the haptic device is driven by the user's movement and assisted through the Magnetic Geometry Effect (MGE). Two configurations of the multifunctional device have been analyzed: the Cartesian mechanism and the R-R mechanism attached to the PHANTOM device. For sketching and hatching, several pencil and pen colors are available. For the cutting operation, a hot-wire cutting tool can be mounted for cutting soft materials such as expanded and extruded polystyrene foam. This paper discusses the design concept, kinematics and mechanics of the multifunctional haptic device. A brief test using the device in sketching, hatching and cutting operations is also given. © 2011 by ASME.
Abstract: The paper describes a novel 3D interaction technique based on Eye Tracking (ET) for Mixed Reality (MR) environments. We have developed a system that integrates a commercial ET technology with a MR display technology. The system elaborates the data coming from the ET in order to obtain the 3D position of the user's gaze. A specific calibration procedure has been developed for correctly computing the gaze position according to the user. The accuracy and the precision of the system have been assessed by performing several tests with a group of users. Besides, we have compared the 3D gaze position in real, virtual and mixed environments in order to check if there are differences when the user sees real or virtual objects. The paper also proposes an application example by means of which we have assessed the global usability of the system. © 2011 by ASME.
Abstract: The choice of the interaction menu to use is an important aspect for the usability of an application. In recent years, different solutions related to menu shape, location and interaction modalities have been proposed. This paper investigates the influence of haptic features on the usability of 3D menus. We have developed a haptic menu for a specific workbench, which integrates stereoscopic visualization and haptic interaction. Several versions of this menu have been developed with the aim of performing testing sessions with users. The results of these tests are discussed to highlight the impact that these features have on the user's learning capabilities. © 2011 IFIP International Federation for Information Processing.
Keywords: Haptic Interaction | Haptic Menu | Mixed Reality
Abstract: The preliminary study described in this paper is about the assessment of the control parameters of a non-grounded haptic device giving kinaesthetic stimuli to indicate to the user a direction to follow to reach a target location. The haptic sensation is created by tilting one or more rotating flywheels along an axis, controlling the direction and the amount of tilt, the velocity of the tilt (during the pulse and the repositioning of the flywheel), and the frequency of the pulses. The current prototype has only one flywheel and is able to tilt along a single axis, thus indicating only a "right" or "left" direction. The experiments performed with the prototype allowed us to define a set of requirements (torque produced, duration of the stimuli) for a more complex prototype (2 flywheels and 2 DOF) to be developed and tested as future work. © 2011 IEEE.
Keywords: H.1.2 [Models and Principles]: User/Machine System - Evaluation/Methodology | H.5.2 [Information Interfaces and Presentation]: User Interfaces - Haptic I/O
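For reference, the kinaesthetic cue described in the abstract above relies on the gyroscopic reaction torque that appears when a spinning flywheel is tilted. A minimal sketch of the first-order estimate, tau = omega_tilt x L, is given below; the flywheel mass, radius, spin rate and tilt rate are purely illustrative and are not taken from the prototype.

```python
import numpy as np

# Illustrative flywheel figures only (not taken from the prototype)
m = 0.05                       # flywheel mass [kg]
r = 0.02                       # flywheel radius [m]
I = 0.5 * m * r**2             # moment of inertia of a solid disc [kg m^2]

spin = np.array([0.0, 0.0, 2 * np.pi * 100.0])  # spin at 100 rev/s about z [rad/s]
tilt = np.array([2 * np.pi * 2.0, 0.0, 0.0])    # tilt pulse at 2 rev/s about x [rad/s]

L = I * spin                   # angular momentum of the flywheel
tau = np.cross(tilt, L)        # gyroscopic reaction torque: tau = omega_tilt x L
print(np.linalg.norm(tau))     # about 0.08 N m, directed along the y axis
```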
Abstract: This paper describes the 2D sketching haptic system (2DSHS) designed for the assessment and training of sketching control movements. The system has been developed for people with Down syndrome, who can use it for drawing simple and complex sketches. These users are able to feel virtual objects by using a haptic device, which acts as a virtual guide taking advantage of its force feedback capabilities; in fact, the haptic device is driven by the user's movements and assisted through the Magnetic Geometry Effect (MGE). The 2DSHS has been used as an input device for tracking the sketching movements made by a user according to the visual feedback received from a physical template without haptic assistance. Then, the 2DSHS has been used as an output device that provides force feedback capabilities through a point-based approach. A preliminary evaluation has been performed in order to validate the system. Two different tasks have been performed - sketching a template and hatching a surface - with the aim of obtaining more information related to the accuracy of the device. The performance has been evaluated by comparing the analysis of the tracking results. © 2011 Springer-Verlag.
Keywords: assisted sketching | Haptic technology | unskilled people
Abstract: The ability to capture customers' needs and the voice of customers, and to translate them into a set of product specifications that best satisfy the target customers, has increasingly become a key element of business strategy. The common practice consists in evaluating products at the end of the design process through physical prototypes with the participation of users and potential customers. The same practice can be implemented by using virtual replicas of real products, reducing the cost and time necessary to build the variants. The paper presents a methodology for the development of the virtual prototype of a piece of furniture, produced by a company interested in studying how customers perceive and evaluate some variants of the hinge mechanism. The virtual prototype has been implemented using a tool for virtual reality applications oriented to non-expert programmers. The modularity and flexibility of the approach used for implementing the virtual replica have allowed us to re-use the components and to easily change the parameters, also during the test activities. © 2011 Springer-Verlag.
Keywords: Fast Prototyping | Virtual Products | Virtual Prototyping
Abstract: The paper describes a collaborative platform to support the development and evaluation of car interiors by using a Mixed Prototyping (MP) approach. The platform consists of two different systems: the 3D Haptic Modeler (3DHM) and the Mixed Reality Seating Buck (MRSB). The 3DHM is a workbench that allows us to modify the 3D model of a car dashboard by using a haptic device, while the MRSB is a configurable structure that enables us to simulate different driving seats. The two systems allow collaboration among designers, engineers and end users in order to obtain, as the final result, a concept design of the product that satisfies both design constraints and final users' preferences. The platform has been evaluated by means of several testing sessions, based on two different scenarios, so as to demonstrate the benefits and the potential of our approach. © 2011 Springer-Verlag.
Keywords: Collaborative design | Ergonomic assessment | Haptic modeling | Mixed Reality | Virtual Prototype
Abstract: Gesture interaction is one of the most important topics in human-computer interaction. In this field, the main research activities are oriented towards recognizing and classifying different gestures in order to interact with the computer directly with the body, without using classical mobile devices such as a touchpad or trackball. This paper describes the development and testing of our wearable interaction system that uses surface electromyography (sEMG) signals to recognize and process the gestures of the users. The core of the system is the "Eracle-board", a wearable 3-channel board developed in order to acquire the sEMG signals from the user's forearm. The acquired data are subsequently processed by an external device, which allows us to recognize and classify seven different gestures through the implementation of a neural network. Finally, the effectiveness of the system has been evaluated through tests carried out with users. © 2011 Springer-Verlag.
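A toy sketch of the classification stage described above follows: root-mean-square features from three sEMG channels feed a small neural network that separates seven gesture classes. The synthetic signals, window length and network size are assumptions made for illustration and do not reproduce the Eracle-board processing chain.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

def rms_features(window):
    """Root-mean-square amplitude of each sEMG channel over a window (channels x samples)."""
    return np.sqrt(np.mean(window**2, axis=1))

# Synthetic stand-in data: 7 gestures, 3 channels, 200-sample windows.
# Each gesture gets a different characteristic amplitude pattern across channels.
n_gestures, n_channels, n_samples, n_windows = 7, 3, 200, 40
X, y = [], []
for g in range(n_gestures):
    amplitude = 0.2 + rng.random(n_channels) + 0.1 * g
    for _ in range(n_windows):
        window = amplitude[:, None] * rng.standard_normal((n_channels, n_samples))
        X.append(rms_features(window))
        y.append(g)
X, y = np.array(X), np.array(y)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X[::2], y[::2])              # train on every other window
print(clf.score(X[1::2], y[1::2]))   # accuracy on the held-out windows
```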
Abstract: Several haptic devices have been developed in recent years in order to reproduce the sensation of physical contact with virtual objects. Many of these devices are point-based, and some haptic interfaces behave like small surfaces that conform to a virtual shape. None of these allows full-hand contact with the shape, and they are, in general, too small to render large surfaces. The simulation of tasks, such as the exploration of aesthetic surfaces made by industrial designers in order to check the quality of prototypes, requires full-hand contact with the shape on a one-to-one scaled representation of the object. These explorations follow trajectories that can be approximated with planar or geodesic curves. In this paper, we describe the design and implementation of a linear haptic device that is able to render these trajectories. The device is part of a multimodal system including stereoscopic visualization that allows visual representation of the entire surface. Industrial designers use the system for checking the quality of shapes while exploiting their manual and visual skills. The system has been tested by industrial designers and the results are reported in this paper. © 2011 IEEE.
Keywords: curve rendering | Haptic strip | industrial design | mixed reality | multimodal interfaces | virtual prototyping
Abstract: This research work describes the Desktop Strip haptic interface, a device which is used for exploration of virtual surfaces with aesthetic value. Such a device allows a continuous, free hand contact on a developable plastic tape actuated by a modular servo-controlled mechanism using the tessellation approach. The device has enabled users to interact with and feel a wide variety of virtual objects by using the palm of the hand. This research work discusses the design concept, novel kinematics and mechanics of the Desktop Strip. © 2011 IEEE.
Keywords: H.1.2 [Models and Principles]: User/Machine System - Evaluation/Methodology | H.5.2 [Information Interfaces and Presentation]: User Interfaces - Haptic I/O
Abstract: Virtual Prototyping (VP) aims at substituting the physical prototypes currently used in industrial design practice with their virtual replicas. The ultimate goal of VP is to reduce the cost and time necessary to implement and test different design solutions. The paper describes a pilot study that aims at understanding how interactive Virtual Prototypes (iVPs) of consumer products (where interaction is based on the combination of haptic, sound and 3D visualization technologies) would allow us to design the interaction parameters that contribute to creating the first impression that customers have of products when interacting with them. We selected two commercially available products and, once the corresponding virtual replicas had been created, we first checked the fidelity of the iVPs by comparing them with the corresponding real products when used to perform the same activities. Then, differently from the traditional use of Virtual Prototypes for product design evaluation, we used them for haptic interaction design, i.e. as a means to define some design variables used for the specification of new products: variations are applied to iVP haptic parameters so as to test with final users their preferences concerning the haptic interaction with a simulated product. The iVP configuration that users liked most has then been used for the definition of the specifications for the design of the new product. © 2011 IEEE.
Keywords: I.3.6 [Computing Methodologies]: Methodology and Techniques - Interaction techniques | J.6 [Computer Applications]: COMPUTER-AIDED ENGINEERING - Computer-aided design (CAD)
Abstract: The paper describes the use of interactive Virtual Prototypes (iVPs) for the specification of consumer products and for the evaluation of the perceived quality of the product already in its conceptual form. iVPs are based on multimodal interaction including force feedback and sound in addition to 3D stereoscopic visualization. The fidelity of the prototypes has been evaluated in comparison with the corresponding real products, when used for performing the same tests. Differently from the traditional use of Virtual Prototypes, which aims at evaluating the product design, we have used iVPs for the interaction design of a new product, i.e. as a means to define the design parameters used for the specification of a new product. © 2011 IEEE.
Keywords: I.3.6 [Computing Methodologies]: Methodology and Techniques - Interaction techniques | J.6 [Computer Applications]: COMPUTER-AIDED ENGINEERING - Computer-aided design (CAD)
Abstract: The core of the paper is the experimental characterization of four different 3D laser scanners based on the Time of Flight principle, through the extraction of resolution, accuracy and uncertainty parameters from specifically designed 3D test objects. The testing process leads to four results: z-uncertainty, xy-resolution, z-resolution and z-accuracy. The first is obtained by evaluating the random residuals from the 3D capture of a planar target, the second from the scanner response to an abrupt z-jump, and the last two from the direct evaluation of the image extracted from different geometric features placed progressively closer to each other. The aim of this research is to suggest a low-cost characterization process, mainly based on calibrated test objects that are easy to duplicate, which allows an objective and reliable comparison between 3D TOF scanner performances. © 2011 SPIE-IS&T.
Keywords: Accuracy | Characterization | Precision | Range sensor | Resolution | Standardization | TOF laser scanner
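As an illustration of the z-uncertainty figure mentioned in the abstract above, the sketch below fits a least-squares plane to a (here synthetic) scan of a planar target and takes the dispersion of the residuals; the grid size, tilt and noise level are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic scan of a planar target: a regular grid plus Gaussian sensor noise in z
x, y = np.meshgrid(np.linspace(0, 100, 50), np.linspace(0, 100, 50))
z_true = 0.02 * x - 0.01 * y + 500.0               # slightly tilted plane [mm]
z = z_true + rng.normal(scale=0.3, size=x.shape)   # 0.3 mm noise, as an example

# Least-squares plane fit z = a*x + b*y + c
A = np.column_stack([x.ravel(), y.ravel(), np.ones(x.size)])
coeffs, *_ = np.linalg.lstsq(A, z.ravel(), rcond=None)

residuals = z.ravel() - A @ coeffs
z_uncertainty = residuals.std(ddof=3)   # dispersion of the residuals [mm]
print(round(z_uncertainty, 3))          # close to the injected 0.3 mm
```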
Abstract: This paper presents the analysis of a 2-DOF end-effector that carries a haptic tool consisting of a developable servo-actuated plastic strip. To overcome some drawbacks of conventional systems related to the use of point-based force-feedback, tactile and glove devices, we have developed the triangular mesh approach in the developable haptic interface. The developable haptic strip has evolved as a result of this research. The haptic interface of our system consists of a haptic strip inspired by the deformable tape that designers use for creating and modifying aesthetic shapes. The first step in designing the end effector and the haptic strip mechanism has been the assignment of both the total length of the haptic strip and the number of tessellation triangles required to control the strip as a developable surface. The strip is aimed at haptically rendering medium-sized design objects (vases, lamps, household appliances, etc.). Thus, the total length of the strip has been set to 160 mm. This length value also ensures the possibility to easily manipulate the strip. Seven triangles have been assigned in order to keep the haptic strip symmetric; this consideration is particularly important because it guarantees adequate balancing of the weight of the mechanism. The device is based on a modular architecture of elements that deform a plastic tape. Copyright © 2011 by ASME.
Abstract: Nowadays, increasing importance is devoted to product experience, which induces customers to choose a particular product over another. A shift in the current product development process is necessary to design, validate, and produce several variants of a product, in order to better satisfy customers' needs, preferences, and skills. This is possible if customers become a central part of the product development process and are more directly involved in the design and evaluation of new products, through the use of more natural and easy-to-use tools. Their involvement would lead to better products, more effective and in line with customers' aesthetic and functional expectations, and would contribute to the acceptance and success of these products. © 2011 Springer-Verlag London Limited.
Abstract: Ballistic requirements in the design of military aerospace components are critical for the structural assessment. Although experimental tests remain fundamental, numerical analyses allow us to simulate such a very complex scenario. However, in order to improve the consistency of such simulations, several aspects have to be considered in detail. The material behavior is a key one: constitutive laws able to describe the material behavior in terms of hardening, strain rate, damage criteria, temperature, etc., are fundamental. Moreover, a detailed validation of the numerical results is needed. The analysis of ballistic data (such as the residual velocity and the direction of the bullet after the impact) is only one option, but reliable simulations of damage shape and size are the real target. Therefore, the paper focuses on the comparison between the experimental damage and the simulated one, using a 3D acquisition of the experimentally impacted area. This approach allows us to evaluate the damage not only at the macro scale but also at the near-micro scale: the shape of the border of the damage. © 2011 Published by Elsevier Ltd.
Keywords: AL-6061-T6 shaft | Ballistic impact | NATO 7.62 mm | Numerical simulations | Reverse engineering
Abstract: This paper describes a framework based on a publish/subscribe paradigm for interprocess communication based on XML messages sent over a TCP/IP connection. The framework manages the exchange of data among the clients of a system and permits the definition of a specific behavior for each client using a finite-state machine approach. Whilst the server side of the framework is able to receive and dispatch events and data, the client side of the framework is modeled as a finite-state machine able to perform state transitions after receiving the correct message. This architecture permits loose coupling between producers and consumers of data and a bidirectional mapping between the design of the behavior of a system and its implementation. © 2010 IEEE.
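A compressed sketch of the two ideas combined in the framework, XML-encoded events and clients modeled as finite-state machines, is shown below. The message schema, topics and state names are invented for illustration, and the TCP/IP transport is omitted: in a real deployment the serialized bytes would travel over a socket between server and clients.

```python
import xml.etree.ElementTree as ET

def make_event(topic, payload=""):
    """Serialize an event as a small XML message (schema invented for illustration)."""
    msg = ET.Element("event", topic=topic)
    ET.SubElement(msg, "payload").text = payload
    return ET.tostring(msg)

class FsmClient:
    """Client modeled as a finite-state machine: each (state, topic) pair
    maps to the next state; unknown messages leave the state unchanged."""
    transitions = {
        ("idle", "start"): "running",
        ("running", "data"): "running",
        ("running", "stop"): "idle",
    }

    def __init__(self):
        self.state = "idle"

    def on_message(self, raw_bytes):
        msg = ET.fromstring(raw_bytes)
        topic = msg.get("topic")
        self.state = self.transitions.get((self.state, topic), self.state)
        return self.state

client = FsmClient()
for raw in (make_event("start"), make_event("data", "42"), make_event("stop")):
    print(client.on_message(raw))   # running, running, idle
```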
Abstract: In Augmented Reality (AR) applications, the technological solutions used for tracking objects in the real environment generally depend on the conditions of the environment and on the application purposes. The selection of the most suitable tracking technology satisfies the best compromise among several issues, including performance, accuracy and ease of use. This paper describes an AR tracking approach based on marker-based tracking and its extension to wide environments by displaying the marker on a monitor. This solution allows us to improve the visibility of the marker and the tracking, thanks to a dynamic control of the marker's dimensions. The technical features and performances of our approach have been assessed through several testing sessions focused on comparative analyses. © The Eurographics Association 2010.
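One way to realize the dynamic control of the marker's dimensions mentioned above is to scale the marker drawn on the monitor with the estimated camera distance, so that its projected size in the camera image stays roughly constant. The sketch below uses a plain pinhole-camera relation with illustrative figures; it is not the scaling law used in the paper.

```python
def marker_side_mm(distance_mm, target_px=120, focal_px=800, pixel_pitch_mm=0.28):
    """Side of the marker to draw on the monitor so that its image in the camera
    keeps roughly target_px pixels, using the pinhole relation
    projected_px ~ focal_px * side_mm / distance_mm (all figures illustrative)."""
    side_mm = target_px * distance_mm / focal_px   # physical side on the screen
    return side_mm, side_mm / pixel_pitch_mm       # same size in screen pixels

for d in (500, 1000, 2000, 4000):                  # user moving away [mm]
    side, px = marker_side_mm(d)
    print(f"distance {d} mm -> marker {side:.0f} mm ({px:.0f} screen px)")
```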
Abstract: This paper addresses issues related to the assessment of product designs, which today successfully uses virtual prototyping methods and techniques. Design assessment is usually performed by engineers and designers. The current trend is to involve customers in product design assessment in order to check the level of acceptability and satisfaction before proceeding with production. In this view, methods for design review may be used in participatory design sessions where customers can intervene to customize products. The paper presents a tool that was originally developed for designers for the validation and modification of the aesthetic properties of digital shapes, and that has also been used by customers, in participatory design sessions, for customizing their products. Copyright © 2010 by ASME.
Abstract: In this paper, the set-up and execution of experimental ballistic tests on tail rotor transmission shafts for helicopters, impacted by a 7.62 NATO projectile, are presented. After the tests, a 3D acquisition of the impacted area on each shaft has been performed in order to acquire the exact shape of the damage. The acquisition has been carried out with a 3D range camera. The experimental activities have been compared with the results of a numerical simulation of the impact, computed with an explicit finite element code. The direct comparison has been done by superimposing the two meshes (from the FE analysis and from the 3D acquisition). This method has proved to be effective for identifying analogies and differences and for making it possible to promote a "quantitative" discussion with the aim of improving the accuracy of the numerical models and simulation conditions. The adoption of the Reverse Engineering practice has proved to be a powerful method for integrating and comparing the simulation data with real data, and gives suggestions for further analysis. Copyright © 2010 by ASME.
Abstract: Most of the activities concerning the design review of new products based on Virtual Reality are conducted from a visual point of view, thus limiting the realism of the reviewing activities. Adding the sense of touch and the sense of hearing to traditional virtual prototypes may help in making the interaction with the prototype more natural, realistic and similar to the interaction with real prototypes. Consequently, this would also contribute to making the design review phases more effective, accurate and reliable. In this paper we describe an application for product design review where the haptic, sound and vision channels have been used to simulate the interaction with a household appliance. © 2010 Springer-Verlag.
Keywords: Interaction Design | Multimodal Interaction | Product Design Review | Virtual Prototyping
Abstract: This paper presents an innovative tool based on enactive interaction for the industrial design sector. Enactive interaction is based on the action-perception paradigm, where users learn how to do things by doing. Recent progress in haptics research has allowed us to build an innovative tool for the modification of the shape of a digital product that includes a haptic strip as the interaction device. The tool allows designers to easily and intuitively modify digital shapes just by acting on the extremities of the strip. Compared to traditional design tools based on the manipulation of mathematical surfaces through geometrical manipulators and control points, this tool has proved to better exploit designers' skills and creativity. © 2010 CAD Solutions, LLC.
Keywords: Enactive interaction | Haptics | Product design | Tools for design
Abstract: The paper describes a haptic device whose aim is to permit the assessment of digital prototypes of industrial products with aesthetic value. The device haptically renders curves belonging to digital surfaces. The device is a haptic strip consisting of a modular servo-controlled mechanism able to deform itself, allowing the user to feel the resulting shape with his free hands. The haptic strip is also equipped with two force sensitive handles placed at the extremities, and a capacitive touch sensor along its length, which are used for applying deformations to the digital shape. © 2010 Springer-Verlag Berlin Heidelberg.
Keywords: Haptic linear strip | Multimodal application | Virtual Prototyping
Abstract: This paper presents a system for the evaluation of the shape of aesthetic products. The evaluation of shapes is based on characteristic curves, which is a typical practice in the industrial design domain. The system, inspired by characteristic curves, is based on a haptic strip that conforms to a curve that the designer wishes to feel, explore, and analyze by physically touching it. The haptic strip is an innovative solution in the haptics domain, although it has some limitations concerning the domain of curves that can be actually represented. In order to extend this domain and make users feel the various curve features, for example curvature discontinuities, sound has been exploited as an additional information modality. © 2010 by the Massachusetts Institute of Technology.
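As a sketch of how curve features can be turned into sound, the code below estimates discrete curvature along a sampled planar curve and maps its magnitude to a pitch; the linear curvature-to-frequency mapping is an arbitrary choice for illustration, not the sonification scheme adopted in the system.

```python
import numpy as np

def discrete_curvature(points):
    """Approximate curvature at the interior samples of a planar polyline as the
    turning angle divided by the mean length of the two adjacent segments."""
    p = np.asarray(points, dtype=float)
    v1, v2 = p[1:-1] - p[:-2], p[2:] - p[1:-1]
    cross = v1[:, 0] * v2[:, 1] - v1[:, 1] * v2[:, 0]
    dot = np.einsum("ij,ij->i", v1, v2)
    angle = np.arctan2(cross, dot)
    mean_len = 0.5 * (np.linalg.norm(v1, axis=1) + np.linalg.norm(v2, axis=1))
    return angle / mean_len

def curvature_to_pitch(kappa, f0=220.0, gain=400.0):
    """Map |curvature| to a pitch in Hz (linear mapping chosen only as an example)."""
    return f0 + gain * np.abs(kappa)

# Example: a circular arc of radius 50 gives nearly constant curvature and pitch
t = np.linspace(0, np.pi / 2, 50)
arc = np.column_stack([50 * np.cos(t), 50 * np.sin(t)])
print(curvature_to_pitch(discrete_curvature(arc))[:3])   # about 228 Hz throughout
```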
Abstract: The performance of 2D digital imaging systems depends on several factors related to both optical and electronic processing. These concepts have originated standards, conceived for photographic equipment and bi-dimensional scanning systems, aimed at estimating different parameters such as resolution, noise or dynamic range. Conversely, no standard test protocols currently exist for evaluating the corresponding performance of 3D imaging systems such as laser scanners or pattern-projection range cameras. This paper focuses on investigating experimental processes for evaluating some critical parameters of 3D equipment, by extending the concepts defined by the ISO standards to the 3D domain. The experimental part of this work concerns the characterization of different range sensors through the extraction of their resolution, accuracy and uncertainty from sets of 3D data acquisitions of specifically designed test objects whose geometrical characteristics are known in advance. The major objective of this contribution is to suggest an easy characterization process for generating a reliable comparison between the performances of different range sensors and for checking whether a specific piece of equipment is compliant with the expected characteristics. © 2010 by the authors.
Keywords: 3D measurement | Accuracy | Laser scanner | Metrological characterization | Pattern projection | Resolution | Uncertainty
Abstract: Currently, the design of aesthetic products is a process that requires a set of activities where digital models and physical mockups play a key role. Typically, these are modified (and built) several times before reaching the desired design, increasing the development time and, consequently, the final product cost. In this paper, we present an innovative design environment for computer-aided design (CAD) surface analysis. Our system relies on a direct visuo-haptic display system, which enables users to visualize models using a stereoscopic view, and allows the evaluation of sectional curves using touch. Profile curves are rendered using a haptic device that deforms a plastic strip, thanks to a set of actuators, to reproduce the curvature of the shape co-located with the virtual model. By touching the strip, users are able to evaluate shape characteristics, such as curvature or discontinuities (rendered using sound), and to assess the surface quality. We believe that future computer-aided styling (CAS)/CAD systems based on our approach will contribute to improving the design process at the industrial level. Moreover, these will allow companies to reduce product development time by reducing the number of physical mockups necessary for product design evaluation and by increasing the quality of the final product, allowing a wider exploration and comparative evaluation of alternatives in the given time. © 2009 Springer-Verlag.
Keywords: Haptic | Immersive | Stereo | Tracking
Abstract: The design review process of new products is time consuming, requires the collaboration and synchronization of activities performed by various experts with different competences and roles, and is today performed using different tools and different product representations. In order to improve the performance of the overall product design process, it would be beneficial to have Computer Aided tools supporting both conceptual design and analysis activities within an integrated environment based on a multi-disciplinary model paradigm. The paper presents an environment named PUODARSI that allows product designers to intuitively modify the shape of a product through haptic interaction and to test in real time the structural and fluid-dynamic impact of these changes. The research work shows that a smooth and effective integration of modeling tools based on haptic interfaces, fluid-dynamics analysis tools, and Virtual Reality visualization systems is feasible in real time through the use of a proper data model exchange. © 2010 Springer-Verlag.
Keywords: Design review | Haptic interaction | Interactive simulation | Multi-disciplinary model | Virtual design
Abstract: Some Virtual Reality applications are based on the use of haptic interfaces for a more intuitive and realistic manipulation of virtual objects. Typically, the haptic devices have a fixed position in the real space, and their working space is rather limited. As a consequence, there are locations in the virtual space that are outside the working space of the haptic device, and thus cannot be reached by users during the manipulation of virtual objects. The paper describes a multimodal navigation modality based on the integrated use of various low-cost interaction devices that can be operated by a user, taking into account that one of his hands is engaged in the manipulation of the haptic device. Therefore, we have decided to implement the user interface by using the Nintendo® Wii Remote™ and the BalanceBoard™, which can be operated by the user with the other hand and with the feet. The navigation modality has been integrated and tested in a Virtual Reality application for the virtual manual assembly of mechanical components. A preliminary validation of the application has been performed by an expert user with the aim of identifying major usability and performance issues by using the heuristic evaluation method. © 2010 by ASME.
Abstract: This paper describes a multimodal system whose aim is to replicate in a virtual reality environment some typical operations performed by professional designers with real splines laid over the surface of a physical prototype of an aesthetic product, in order to better evaluate the characteristics of the shape they are creating. The system described is able not only to haptically render a continuous contact along a curve, by means of a servo-controlled haptic strip, but also to allow the user to modify the shape by applying force directly on the haptic device. The haptic strip is able to bend and twist in order to better approximate the portion of the surface of the virtual object over which the strip is laying. The device is 600 mm long and is controlled by 11 digital servos for the control of the shape (6 for bending and 5 for twisting), and by two MOOG-FCS HapticMaster devices and two additional digital servos for 6-DOF positioning. We have developed additional input devices integrated with the haptic strip, which consist of two force-sensitive handles positioned at the extremities of the strip, a capacitive linear touch sensor placed along the surface of the strip, and four buttons. These devices are used to interact with the system, to select menu options, and to apply deformations to the virtual object. The paper describes the interaction modalities and the developed user interface, the applied methodologies, the achieved results and the conclusions elicited from the user tests. © 2010 by ASME.
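A rough sketch of how a target curve can be turned into joint commands for a chain of bending servos follows: the curve is resampled into equal-length links and the relative heading change at each internal joint is taken as the bend angle. The link count and the planar, bend-only simplification are assumptions made for illustration; the actual strip also twists and is positioned by the two HapticMaster devices.

```python
import numpy as np

def resample_equal_links(curve, n_links):
    """Resample a planar curve (n x 2 array) into n_links chords of equal arc length."""
    seg = np.diff(curve, axis=0)
    s = np.concatenate([[0.0], np.cumsum(np.linalg.norm(seg, axis=1))])
    targets = np.linspace(0.0, s[-1], n_links + 1)
    return np.column_stack([np.interp(targets, s, curve[:, 0]),
                            np.interp(targets, s, curve[:, 1])])

def joint_bend_angles(nodes):
    """Relative heading change at each internal joint of the link chain [rad]:
    the commands a chain of bending servos would receive."""
    v = np.diff(nodes, axis=0)
    heading = np.arctan2(v[:, 1], v[:, 0])
    return np.diff(heading)

# Example: approximate a quarter circle of radius 300 mm with a 6-link strip
t = np.linspace(0, np.pi / 2, 200)
target = np.column_stack([300 * np.cos(t), 300 * np.sin(t)])
angles = joint_bend_angles(resample_equal_links(target, 6))
print(np.degrees(angles))   # five joint angles of roughly 15 degrees each
```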
Abstract: The paper presents a system we have studied and developed to aid the stereoscopic visualisation of virtual models of products whose shape can be explored and modified using a haptic interface. The haptic system has a wide working space that allows users to stand in front of the system and to operate as they are used to do in real life for making physical prototypes. The paper also shows several visualisation solutions that have been studied before obtaining the best performing one in which the stereoscopic glasses worn by the user as well as his point of view are tracked in order to enhance the realistic perception of the 3D mixed image. Some initial tests have been performed with end-users, both CAS designers and model makers, with the support of human factors experts in order to correctly assess the usability of the system and to understand how potentially effective it is in aiding product designers in their work. © 2010 Inderscience Enterprises Ltd.
Keywords: haptic interfaces | projection based visualisation systems | stereoscopic visualisation | virtual prototyping | visuo-haptic display
Abstract: The paper presents the results of a research project aiming to develop an innovative framework for the conceptual design of products based on haptic technology. The system consists of a Computer-Aided Design (CAD) system enhanced with intuitive designer-oriented interaction tools and modalities. The system integrates innovative haptic tools with 6 Degrees of Freedom (DOF) for modelling digital shapes, with sweep operators applied to class-A surfaces and force computation models based on chip formation models. The system aims to exploit designers' skills in modelling products, improving the products' design process by reducing the necessity to build several physical models for the evaluation and testing of product designs. The system requirements have been defined after observing designers during their daily work and translating the way they model shapes using their hands and craft tools into specifications for the system. The system has been tested by designers, who have found it intuitive and effective to use. Copyright © 2010 Inderscience Enterprises Ltd.
Keywords: Conceptual design | Haptic modeling | Product design | Product development | Virtual prototyping
Abstract: The paper presents a haptic device that allows a user to explore a virtual object along a continuous line. In particular, the device has been developed with the aim of supporting designers during the evaluation of the aesthetic quality of a virtual product. This is generally done by means of the global and local analysis of the shape in terms of curvature characteristics, presence of inflection points and discontinuities. In order to evaluate such features, designers are used to working on physical prototypes, relying on their skilled sense of touch. It is known that physical prototypes are expensive in terms of cost and time for their realization, and a modification on a physical prototype implies a reverse engineering process for applying such modifications to the virtual model. A linear haptic interface, which adapts its shape to reproduce a generic curve on a surface, has been developed to replicate the behavior of a physical strip. This is the way to replace real prototypes with virtual ones without changing the evaluation paradigms that designers are used to. The physical limitations encountered in representing discontinuities in position, tangency and curvature, not renderable by bending and deforming a physical strip, have been overcome thanks to the application of some principles of the theory of haptic illusions by means of the sonification of some curve characteristics. The paper describes the linear haptic interface developed and the solution based on haptic illusion that has been implemented to overcome the strip limitations. © 2009 by ASME.
Abstract: This paper addresses issues related to enactive interfaces, which are human-computer interfaces based on enactive knowledge, i.e. the information that the user gains through perception-action interaction in the environment. These interfaces are typically multimodal, i.e. are based on the use of several channels for the communication between the user and the computer system, and in addition can be effectively used for exploiting users' skills during the interaction with computers, differently from traditional interfaces. This paper reasons about enactive interfaces and describes an example of an application based on enactive interface, and its evaluation. © 2009 by ASME.
Abstract: This paper describes a mixed reality application for the assessment of the manual assembly of mechanical systems. The application aims to use low-cost technologies while offering an effective environment for the assessment of a typical task consisting of assembling two components of a mechanical system. The application is based on the use of a 6-DOF interaction device for positioning an object in space, and a haptic interface, closer to reality, used for simulating the insertion of a second component into the first one while feeling force feedback. The application has been validated by an expert user in order to identify the main usability and performance problems and improve its design. © 2009 Springer Berlin Heidelberg.
Keywords: Haptic-based interaction | Virtual Manual Assembly | VR system evaluation
Abstract: Market rules show that, most of the time, the aesthetic impact of a product is an important aspect that makes the difference in terms of success among different products. The product shape is generally created and represented during the conceptual phase of the product, and recent trends show that the use of haptic devices allows users to interact more naturally and effectively with 3D models. Nevertheless, the shape needs to satisfy some engineering requirements, and its aesthetic and functional analysis requires the collaboration and synchronization of activities performed by various experts having different competences and roles. This paper presents the description of an environment named PUODARSI that allows designers to modify the shape of a product and engineers to evaluate in real time the impact of these changes on the structural and fluid-dynamic properties of the product, describing the choice of the software tools, the implementation and some usability tests. © 2009 Springer Berlin Heidelberg.
Keywords: Haptic interaction | Interactive simulation | Mixed reality
Abstract: The design and implementation of a PLM solution in a cross-company environment is a complex and labour-intensive operation, which is often coupled with a Business Process Re-engineering (BPR) project to better deploy technologies as well as methodologies and to target the system implementation on the real company needs. Enterprise Modelling (EM) languages are typically used to collect and share process knowledge among the BPR participants. Plenty of techniques are available for this purpose, and it is not always easy to understand how to select and use them in the different steps of re-engineering. The main purpose of this paper is to perform a qualitative analysis of three well-known EM languages (IDEF, UML and ARIS) and to propose a new methodology, based on their integrated use, supporting BPR efforts in the Product Development domain. © 2009 Elsevier B.V. All rights reserved.
Keywords: ARIS | Business Process Re-engineering | Enterprise Modelling | IDEF | UML
Abstract: The design of complex industrial products requires more knowledge than a single person can possess. A successful industrial design entails a collaborative activity between designers and engineers during which they share their specialized knowledge, accepting and interpreting each other's points of view through a multi-disciplinary and collaborative methodology. In particular, this paper focuses on issues related to knowledge creation, management and enrichment for industrial collaborative design in a virtual environment. The idea is to endow the virtual environment, used to create the 3D models and to carry out the analyses, with a knowledge management (KM) module able to gather information about different versions of the product and simulation analysis data. Moreover, this KM module includes an annotation tool used to support annotation-based collaboration between designers and engineers, through which they can exchange impressions, ideas, and comments in order to achieve a shared solution to an industrial design issue. © 2009 Springer-Verlag Berlin Heidelberg.
Abstract: The paper presents a reference framework for applications based on the mixed prototyping practice and mixed reality techniques and technologies. This practice can be effectively used for rapid design assessment of new products. In particular, the paper addresses applications regarding the use of mixed prototyping practice for the design review of information appliances. Various methods and technologies can be used according to the product aspects to validate and test. The paper describes mixed prototyping applications for positioning information appliances within systems and for the evaluation of ergonomics aspects of interactive devices. © Springer-Verlag 2009.
Keywords: Haptic devices | Mixed prototyping | Mixed reality | Product assessment | Reference framework
Abstract: Resolution analysis is a 2D imaging topic concerning the use of particular targets for equipment characterization. These concepts can be extended to 3D imaging through the use of a specific three-dimensional target object. The core of this paper is the experimental characterization of seven different 3D laser scanners through the extraction of resolution, accuracy and uncertainty parameters from a 3D target object. Processing range maps acquired at the same nominal resolution leads to different results in terms of z-resolution, optical resolution, and linear and angular accuracy. The aim of this research is to suggest a characterization process, mainly based on resolution and accuracy parameters, that allows a reliable comparison of 3D scanner performances. © 2009 SPIE-IS&T.
Keywords: 3D Scanner | Accuracy | Characterization | Resolution | SFR | Target Object
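To make the accuracy figures mentioned in the entry above concrete, the sketch below fits a plane to a simulated range map of a nominally flat target and reports RMS and maximum deviations. It is one possible accuracy metric under an assumed planar target; it does not reproduce the characterization protocol of the paper.

# Illustrative sketch (assumed workflow): accuracy of a range map estimated as
# the residuals of a best-fit plane through points scanned on a flat target.
import numpy as np

def plane_fit_residuals(points):
    """points: (N, 3) array of scanned XYZ samples of a nominally flat target."""
    p = np.asarray(points, dtype=float)
    centroid = p.mean(axis=0)
    # best-fit plane normal = right singular vector of the smallest singular value
    _, _, vt = np.linalg.svd(p - centroid, full_matrices=False)
    normal = vt[-1]
    residuals = (p - centroid) @ normal            # signed point-to-plane distances
    return np.sqrt(np.mean(residuals ** 2)), np.abs(residuals).max()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    xy = rng.uniform(0, 100, size=(5000, 2))       # mm
    z = 0.05 * rng.standard_normal(5000)           # simulated 50 um scanner noise
    rms, worst = plane_fit_residuals(np.c_[xy, z])
    print(f"flatness RMS = {rms:.3f} mm, max deviation = {worst:.3f} mm")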
Abstract: This paper describes a haptic device whose aim is to render the contact with a continuous and developable surface by means of the representation of a geodesic trajectory. Some preliminary tests conducted with industrial designers have shown that the trajectories performed while exploring the surface of a style product, for a qualitative evaluation, follow particular paths that may be mathematically described as geodesic curves. In order to represent these particular curves, a haptic strip based on a modular servo-controlled mechanism has been developed. Each module of the mechanism allows us to control both the curvature and the torsion. This device, compared with existing commercial haptic devices, allows a hand-surface contact with the virtual model in real scale without artifacts, by self-deforming its shape in order to conform to the mathematical curve to render. The strip is 900 mm long and has 9 control points for bending and 8 control points for torsion. Due to these characteristics, it allows us to render exploration trajectories of several kinds of product shapes and dimensions. In order to allow users to fully explore an object surface, we have mounted the strip on a platform consisting of two MOOG-FCS HapticMaster devices, which permits 6-DOF orientation of the strip and force feedback control. The paper describes the mechanism of the strip and the 6-DOF platform, starting from the empirical observations of the exploration of surfaces, and highlights the problems encountered and the solutions adopted.
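The sketch below shows one way the 9 bending and 8 torsion control points mentioned above could be fed: sample curvature and torsion along the target curve with standard differential-geometry formulas and interpolate them at the control-point positions. The mapping from these values to actuator commands is assumed, not the control law of the paper; the function and parameter names (strip_setpoints, n_bend, n_tors) are hypothetical.

# Illustrative sketch (assumed mapping, not the paper's controller): per-module
# curvature and torsion set-points for a strip rendering a sampled 3D curve.
import numpy as np

def strip_setpoints(curve, n_bend=9, n_tors=8):
    """curve: (N, 3) densely sampled target curve on the virtual surface."""
    p = np.asarray(curve, dtype=float)
    d1 = np.gradient(p, axis=0)                    # derivative estimates along the curve
    d2 = np.gradient(d1, axis=0)
    d3 = np.gradient(d2, axis=0)
    cross = np.cross(d1, d2)
    cross_norm = np.linalg.norm(cross, axis=1)
    speed = np.linalg.norm(d1, axis=1)
    curvature = cross_norm / np.maximum(speed, 1e-12) ** 3
    torsion = np.einsum('ij,ij->i', cross, d3) / np.maximum(cross_norm, 1e-12) ** 2
    u = np.linspace(0.0, 1.0, len(p))
    bend = np.interp(np.linspace(0.0, 1.0, n_bend), u, curvature)
    tors = np.interp(np.linspace(0.0, 1.0, n_tors), u, torsion)
    return bend, tors

if __name__ == "__main__":
    t = np.linspace(0.0, 4.0 * np.pi, 400)
    helix = np.c_[np.cos(t), np.sin(t), 0.1 * t]   # constant curvature and torsion
    bend, tors = strip_setpoints(helix)
    print("bending set-points:", np.round(bend, 3))
    print("torsion set-points:", np.round(tors, 3))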
Abstract: The aesthetic impact of a product is an important parameter that makes the difference among products that are technologically similar and have the same functionalities. Product shape, which is strictly connected to the aesthetic impact, has different meanings if seen from the design and the engineering points of view. The conceptual design of the shape of aesthetic products is usually performed by designers at the beginning of the product development cycle. Subsequent engineering design and studies, such as structural and fluid-dynamic analyses, lead to several design reviews where the original shape of the object is often modified. The design review process is time consuming, requires the collaboration and synchronization of activities performed by various experts having different competences and roles, and is performed using different tools and different product representations. Therefore, computer-aided tools supporting conceptual design and analysis activities within the same environment are envisaged. The paper presents the conceptual description of an environment named PUODARSI that allows designers to modify the shape of a product and evaluate in real time the impact of these changes on the results of the structural and fluid-dynamic analyses in an Augmented Reality (AR) collaborative environment. The main problems in integrating tools developed for different purposes, such as haptic interaction, FEM and CFD analyses, and AR visualization, concern the feasibility of the integration, the data exchange, and the choice of algorithms that allow all that while guaranteeing low computational time. The paper describes the main issues related to the choice of hardware and software technologies, and the PUODARSI system implementation. © The Eurographics Association 2008.
Abstract: The aesthetic impact of a product is an important parameter that makes the difference among products that are technologically similar and have the same functionalities. The shape, which is strictly connected to the aesthetic impact, has different meanings if seen from the design and the engineering points of view. This paper describes an environment based on an integration of Mixed-Reality technologies, haptic tools and interactive simulation systems, named PUODARSI, whose aim is to support designers and engineers during the phase of design review of aesthetic products. The environment allows the designer to modify the object shape, through the use of haptic devices, and the engineer to run the fluid-dynamics simulation on the product shape. The paper describes the main problems faced in integrating tools originally developed for different purposes, in particular issues concerning data exchange and the choice of algorithms that guarantee the low computational time required by the application.
Keywords: Design review | Fluiddynamics analysis | Haptic interfaces | Mixed-Reality
Abstract: Exploring how to enhance and assist the design process with the support of new Information and Communication Technologies is a growing field of research. The analysis of the design processes of aesthetic shapes highlights open issues that make the processes inefficient and generate some questions about how to improve the process performances. Examples of questions are: how to generate more and better ideas, how to become more creative and how to exploit at best people know-how. The research presented in this paper reasons about issues related to aesthetic shapes perception and proposes some solutions at the basis of future design tools based on new ways of working and new ways of interaction. Copyright © 2008 by ASME.
Abstract: Virtual Reality (VR) systems provide new modes of human-computer interaction that can support several industrial design applications, improving time savings, reducing prototyping costs, and supporting the identification of design errors before production. By enhancing the interaction between humans and virtual prototypes through multiple sensory modalities, VR can be adopted to perform ergonomic analysis. The main problems concern the evaluation of both the functional and cognitive behavior of sample users, as VR interfaces influence the perception of ergonomic human factors. We state that an ergonomic analysis performed in a virtual environment can be successful only if supported by a structured protocol for the study of both functional and cognitive aspects, and by the proper combination of VR technologies that answers the specific analysis tasks. An ergonomic analysis protocol is presented. It allows the assessment of the consumers' response in terms of behavioral and cognitive human factors, comprehending both operational and emotional agents. The protocol is also used to identify the best combination of visualization and haptic interfaces to perform the analysis. An experimental example, belonging to the household appliances field, is adopted to investigate the application of the protocol in the virtual set-up. Copyright © 2008 by ASME.
Abstract: This paper presents a haptic system the authors have developed for shape exploration in the field of industrial design. The system consists of a novel haptic-based digital technology, allowing designers to add the tactile experience to the visual one. The haptic interface developed allows designers to see and haptically feel, through free hand motions, an object surface during its creation and evolution. The system closes the loop of shape modification and its subsequent evaluation: it is possible to evaluate the "what if" related to a new product shape, applying the modification and comparing the solutions over and over, and to generate and maintain different versions. That improves the level of interaction of designers with the digital models, exploits their skills, and shortens the product development life cycle. Copyright © 2008 by ASME.
Keywords: Haptic interfaces | Haptically enhanced design tools | Haptics in design
Abstract: One of the recent research topics in the area of design and virtual prototyping is offering designers tools for creating and modifying shapes in a natural and interactive way. Multimodal interaction is part of this research. It allows conveying to the users information through different sensory channels. The use of more modalities than touch and vision augments the sense of presence in the virtual environment and can be useful to present the same information in various ways. In addition, multimodal interaction can sometimes be used to augment the perception of the user by transferring information that is not generally perceived in the real world, but which can be emulated by the virtual environment. The paper presents a prototype of a system that allows designers to evaluate the quality of a shape with the aid of touch, vision and sound. Sound is used to communicate geometrical data, relating to the virtual object, which are practically undetectable through touch and vision. In addition, the paper presents the preliminary work carried out on this prototype and the results of the first tests made in order to demonstrate the feasibility. The problems related to the development of this kind of application and the realization of the prototype itself are highlighted. This paper also focuses on the potentialities and the problems relating to the use of multimodal interaction, in particular the auditory channel. ©2008 IEEE.
Keywords: Audio & visual sensors and displays | Haptic | Human-computer interaction
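The entry above uses sound to convey geometric data that touch and vision miss. The sketch below shows one plausible mapping, curvature to pitch on a logarithmic scale, written out as a short WAV file; the mapping, frequency range and tone length are assumptions, not the sonification used in the prototype.

# Illustrative sketch (assumed curvature-to-pitch mapping): sonify a curvature
# profile as a sequence of short sine tones written to a WAV file.
import wave
import numpy as np

def sonify_curvature(curvature, path="curvature.wav", f_low=220.0, f_high=880.0,
                     tone_s=0.05, rate=44100):
    k = np.asarray(curvature, dtype=float)
    k = (k - k.min()) / max(k.max() - k.min(), 1e-12)   # normalise to [0, 1]
    freqs = f_low * (f_high / f_low) ** k                # logarithmic pitch mapping
    t = np.arange(int(tone_s * rate)) / rate
    samples = np.concatenate([np.sin(2 * np.pi * f * t) for f in freqs])
    pcm = (32767 * 0.8 * samples).astype(np.int16)
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(rate)
        w.writeframes(pcm.tobytes())

if __name__ == "__main__":
    s = np.linspace(0.0, 1.0, 80)
    sonify_curvature(np.abs(np.sin(4 * np.pi * s)))      # oscillating curvature profile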
Abstract: Virtual Reality (VR) technologies provide novel modes of human-computer interaction that can be used to support industrial design processes. The integration can be successful if supported by a method to qualify, select and design the VR technologies according to the company's requirements, in order to improve collaboration in extended enterprises and to save time. The aim of the present work is the definition of a method to translate the company's expectations into heuristic values that allow the benchmarking of VR systems. The method has been tested on a real test case. Copyright © 2007 by ASME.
Keywords: Benchmarking criteria | Design review | Virtual reality
Abstract: The paper presents the results of a research project aiming at developing an innovative framework for the conceptual design of products based on innovative haptic interfaces. The framework consists of a CAD (Computer Aided Design) system enhanced with intuitive interaction tools and modalities for modeling, modification and evaluation of shapes. The system aims at exploiting designers' skills in modeling products, improving the product design process by reducing the necessity to build several physical models for evaluating and testing the product designs. The system proposes an emotional engineering approach by supporting product designers in translating their sensibility into physical design factors, allowing easy and fast creation and evaluation of diverse shape solutions. The system requirements have been defined after observing designers during their daily work and translating the way they evaluate and modify shapes using hands and craft tools into specifications for the modeling system and the haptic tool. The system prototype has been tested by designers, who have found it intuitive and effective to use. Copyright © 2007 by ASME.
Abstract: Reverse Engineering and Rapid Prototyping are consolidated practices in the industrial product design process. In any case, best practices should be set up according to the type of product and objectives. This paper presents the results of a product re-engineering application in the field of high-quality stainless steel cutlery. This activity included the re-modeling of a basis shape that reproduces at best the original shape of a cutlery set designed in the Thirties for Alessi, which is still considered to be very fashionable. The shape of the manufactured physical pieces of cutlery has changed over time with respect to the original design, mainly because of some problems related to their manufacturing, which has been transferred from the central plant to other branches. We have obtained the digital model of the pieces of cutlery on the basis of the analysis of the available original 2D drawings, and of the laser scanning and subsequent re-engineering design of the available manufactured objects. A particularly complex activity was the removal of the shape deviations due to the different forming techniques used in the various production factories. At the end of the reconstruction activity, physical prototypes of the new models were produced by means of a sintering process in order to evaluate and compare them with the original ones. © The Eurographics Association 2007.
Abstract: Virtual Reality and related technologies are becoming a common practice in the design industry in general and more specifically in the automotive field. Through the joint use of virtual prototyping methodologies it is possible to reduce the time-to-market, as well as the costs deriving from the creation of physical prototypes. Ergonomics tests conducted in Virtual Reality environments can be successfully used to influence the design of products. We have set up a mixed reality environment that allows us to simulate and test postural aspects as well as various levels and modalities of users' interaction. In order to achieve an effective interaction, correct registration and visualization parameters must be carefully set to prevent possible interaction errors, such as failing to point with precision to the assigned target. The paper presents a methodology for tuning the stereoscopic properties of the mixed reality environment used for ergonomic tests of a car interior, in order to achieve correct visualization and positioning parameters. The environment includes a virtual human controlled by an optical motion tracking system, physical objects onto which the graphic virtual environment is superimposed, and a rotary haptic device for controlling the car on-board infotainment interface. © The Eurographics Association 2007.
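The stereoscopic tuning mentioned above ultimately comes down to controlling on-screen parallax. The sketch below applies the standard stereoscopic geometry (parallax = eye separation times (depth minus screen distance) over depth); it is a generic check, not the tuning procedure of the paper, and the eye separation, screen size and distances are example values.

# Illustrative sketch (standard stereo geometry, example values): on-screen
# parallax of a virtual point, in millimetres and pixels.
def screen_parallax(point_depth_mm, screen_dist_mm, eye_sep_mm=65.0,
                    screen_width_mm=520.0, screen_width_px=1920):
    """Positive parallax: point behind the screen plane; negative: in front."""
    p_mm = eye_sep_mm * (point_depth_mm - screen_dist_mm) / point_depth_mm
    return p_mm, p_mm * screen_width_px / screen_width_mm

if __name__ == "__main__":
    # example depths in front of, on, and behind a screen placed 900 mm from the viewer
    for depth in (600.0, 900.0, 1800.0):
        mm, px = screen_parallax(depth, screen_dist_mm=900.0)
        print(f"depth {depth:6.0f} mm -> parallax {mm:6.1f} mm ({px:6.1f} px)")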
Abstract: This paper presents the results of a research project aimed at developing haptic tools for virtual shape modelling resembling real tools like rakes and sandpaper used by modelers and designers in the real workshop. The developed system consists of a CAD (computer aided design) system enhanced with intuitive designer-oriented interaction tools and modalities. The system requirements have been defined on the basis of the observation of designers during their daily work, and translating the way they model shapes using hands and craft tools into specifications for the modelling system based on haptic tools. © Springer-Verlag 2007.
Keywords: 3D interaction techniques | Haptic interfaces | Haptic modelling | Virtual shape modelling
Abstract: In this short paper we describe a Direct Visuo-Haptic Display System (DVHDS) set-up based on a DLP projector, an overhead projection plane and a half-silvered mirror. The system is specifically designed to be integrated with a haptic system. It offers a large viewing area with a proper haptic workspace, ideal for full arm movements in an immersive environment. The system allows superimposing the virtual scene onto the user's perspective of the real world. The system supports controlling the sense of depth and correcting the distortion of the projected image.
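One common way to correct the kind of projection distortion mentioned above is to pre-warp the projected image with a homography fitted to the observed corner positions. The sketch below uses OpenCV for that; it is an assumed correction scheme with made-up corner coordinates, not the implementation of the DVHDS.

# Illustrative sketch (assumed correction scheme): homography pre-warp so the
# projected image lands undistorted after the oblique optical path.
import numpy as np
import cv2

def prewarp(image, measured_corners_px):
    """measured_corners_px: where the image's TL, TR, BR, BL corners were observed
    on the projection plane, expressed in the image's own pixel coordinates."""
    h, w = image.shape[:2]
    desired = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    measured = np.float32(measured_corners_px)
    # homography that undoes the measured distortion (measured corners -> desired corners)
    H = cv2.getPerspectiveTransform(measured, desired)
    return cv2.warpPerspective(image, H, (w, h))

if __name__ == "__main__":
    img = np.full((768, 1024, 3), 255, np.uint8)
    cv2.rectangle(img, (100, 100), (924, 668), (0, 0, 0), 5)
    corrected = prewarp(img, [[30, 20], [1000, 60], [980, 740], [10, 700]])
    cv2.imwrite("prewarped.png", corrected)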
Abstract: The field of computer graphics is greatly increasing its overall performance, consequently enabling the implementation of most of the product design process phases in virtual environments. The benefits deriving from the use of virtual practices in product development have proved highly valuable, since they foster the reduction of time to market and process uncertainty, and the translation of most prototyping activities into the virtual environment. In this paper we present a platform developed in a mixed environment for ergonomic validation. Specifically, we defined a methodology for testing aspects related both to design and to ergonomic validation, by allowing the tester to interact visually and physically with the car dashboard control devices and the related interface by means of a rotary haptic device. Experimental sessions have highlighted that it is possible to gather qualitative data about the design and to find typical occlusion problems, and that quantitative data can also be collected by testing the infotainment interface and the resulting users' distraction during device use. © Springer-Verlag Berlin Heidelberg 2007.
Keywords: Ergonomic analysis | Mixed environments | Tactons
Abstract: This paper presents some results of a research project aiming at developing haptic tools for virtual shape modeling that resemble the real tools, like rakes and sandpaper, used by modelers and designers in the real workshop. The developed system consists of a CAS (Computer Aided Styling) system enhanced with intuitive designer-oriented interaction tools and modalities. The system requirements have been defined on the basis of the observation of designers during their daily work, translating the way they model shapes using hands and craft tools into specifications for the modeling system based on haptic tools. The haptic tool and the interaction modality developed for exploring and sanding a surface are presented in the paper. © Springer-Verlag Berlin Heidelberg 2007.
Keywords: 3D interaction techniques | Designer-centered user interface | Haptic interfaces | Shape modeling
Abstract: The paper presents the results of a research project aimed at developing an innovative system for modeling industrial products based on haptic technology. The system consists of a Computer Aided Design (CAD) system enhanced with intuitive designer-oriented interaction tools and modalities. The system integrates innovative six degrees of freedom (DOF) haptic tools for modeling digital shapes, with sweep operators applied to class-A surfaces and force computation models based on chip formation models. The system aims at exploiting designers' existing skills in modeling products, improving the product design process by reducing the necessity of building several physical models for evaluating and testing the product designs. The system requirements have been defined by observing designers during their daily work and translating the way they model shapes using hands and craft tools into specifications for the modeling system and the haptic tool. The system prototype has been tested by designers, who have found it intuitive and effective to use. © Springer-Verlag London Limited 2005.
Keywords: Haptic modeling | Haptics | Product design | Virtual prototyping
Abstract: The paper presents the results of a research project aiming at developing an innovative framework for the conceptual design of products based on innovative haptic interfaces. The framework consists of a CAD (Computer Aided Design) system enhanced with intuitive designer-oriented interaction tools and modalities. The system integrates innovative 6 degrees of freedom haptic tools for modeling digital shapes, with sweep operators applied to class-A surfaces and force computation models based on chip formation models. The system aims at exploiting designers' skills in modeling products, improving the product design process by reducing the necessity to build several physical models for evaluating and testing the product designs. The system requirements have been defined after observing designers during their daily work and translating the way they model shapes using hands and craft tools into specifications for the modeling system and the haptic tool. The system prototype has been tested by designers, who have found it intuitive and effective to use. Copyright © 2006 by ASME.
Abstract: The paper presents two applications of haptic technologies to demonstrate how they can increase human-computer interaction during different steps of the design process. The first application aims at developing a system to generate digital shapes by manipulating haptic tools that resemble the physical ones that modelers use in everyday work. The second is focused on the use of haptic interfaces to evaluate the ergonomics of virtual products' control boards. We designed and developed the mentioned haptic devices; the first uses two FCS HapticMaster devices equipped with an innovative, strong and stiff 6-DOF device carrying simulated clay modeling tools. The second is an "ad hoc" mechatronic device able to simulate some controls with rotary motions (knobs). The described haptic devices are integrated into more complex virtual reality applications; the paper describes their architecture and the methodologies proposed to simulate material shaping and ergonomic validation. The main aspects of haptic modeling and rendering are also discussed. © 2006 Elsevier Ltd. All rights reserved.
Keywords: Haptic modeling | Haptics | Product design | Virtual prototyping
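For the rotary knob device described in the preceding entry, a classic way to render detents is a sinusoidal torque profile plus viscous damping. The sketch below uses that textbook model with made-up parameter values (n_detents, detent_torque_nm, damping_nms); it is not the torque law implemented in the paper's mechatronic device.

# Illustrative sketch (classic detent model, assumed parameters): torque command
# for a haptic knob with N detents per revolution and viscous damping.
import math

def knob_torque(angle_rad, velocity_rad_s, n_detents=24,
                detent_torque_nm=0.02, damping_nms=0.001):
    spring = -detent_torque_nm * math.sin(n_detents * angle_rad)   # pulls toward detents
    damping = -damping_nms * velocity_rad_s
    return spring + damping

if __name__ == "__main__":
    # sweep the knob through one detent period and print the commanded torque
    for i in range(9):
        a = i * (2 * math.pi / 24) / 8
        print(f"angle {a:5.3f} rad -> torque {knob_torque(a, 0.0):+.4f} Nm")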
Abstract: Current tools aimed at supporting the conceptual phase of product design are not intuitive to use, and do not exploit designers' skill and creativity. This paper presents the results of a research work aiming at integrating user-friendly and effective ways of interaction based on ad-hoc haptic interfaces into free-form shape modeling systems. Copyright © 2005 by the Association for Computing Machinery, Inc.
Keywords: Concept product design | Free-form shape modeling | Haptic modeling | Haptics | Virtual prototyping
Abstract: Gestures, besides speech, represent the most widely used means of expression by humans. In the product design field, designers have multiple ways of communicating their ideas and concepts. One of them concerns the model-making activity, where designers make their concepts explicit by using appropriate tools and specific hand movements on plastic material with the intent of obtaining a shape. Some studies have demonstrated that visual, tactile and kinesthetic feedback are equally important in the shape creation and evaluation process [1]. The European project "Touch and Design" (T'nD) (www.kaemart.it/touch-and-design) proposes the implementation of an innovative virtual clay modeling system based on a novel haptic interaction modality oriented to industrial designers. In order to develop an intuitive and easy-to-use system, a study of designers' hand modeling activities has been carried out by the project industrial partners supported by cognitive psychologists. The users' manual operators and tools have been translated into corresponding haptic tools and multimodal interaction modalities in the virtual free-form shape modeling system. The paper presents the project research activities and the results achieved so far. Copyright 2005 ACM.
Keywords: Haptic interaction | Haptic modeling | Virtual prototyping
Abstract: The efforts made by a company to focus on the manufacturing process to minimize production costs are no longer sufficient to launch competitive products on the market. In recent years, the industry has focused on the integration and optimization of the phases of the product development process and on the introduction of innovations in the attempt to tackle and solve the above-mentioned issues. The paper presents the results of a research project whose aim is to study a methodology for the evaluation of the impact and costs related to the adoption of new and innovative technologies for knowledge and innovation management within companies' currently implemented product development processes (As-Is process).
Keywords: Knowledge management | Product development process | Product innovation | Technical creativity
Abstract: This paper outlines new trends in geometric modelling, showing how systems are moving from a geometry-based approach to a physically-based approach. The possibility to simulate the actual behaviour of a product is at the basis of Virtual Prototypes, which are becoming a common practice in today's product development process. As a consequence, interaction modalities and techniques also have to be improved in order to satisfy new system requirements and functionalities. The research work described in this paper shows how haptic interaction techniques represent an evolution of current interaction technologies, providing intuitive and realistic modalities for interacting with virtual applications. In particular, the paper describes a research work we have carried out integrating haptic technologies with physically-based modelling and simulation techniques. © 2001 Springer Science+Business Media New York.
Keywords: Haptic interaction | Human-computer interaction | Non-rigid material modelling | Physically-based modelling | Virtual prototyping
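To give a concrete flavour of coupling physically-based simulation with haptic interaction, the sketch below steps a minimal mass-spring strip and returns the reaction force of a virtual coupling between one node and the haptic proxy. Node count, stiffness and damping values are arbitrary; the non-rigid material models and haptic rendering of the paper are not reproduced here.

# Illustrative sketch (generic virtual coupling, assumed parameters): one
# simulation step of a deformable strip plus the force fed back to the device.
import numpy as np

def step(x, v, rest_len, proxy_pos, grabbed=0, k=200.0, k_proxy=500.0,
         mass=0.05, damping=0.5, dt=1e-3):
    n = len(x)
    f = np.zeros_like(x)
    for i in range(n - 1):                          # internal spring forces
        d = x[i + 1] - x[i]
        length = np.linalg.norm(d)
        fs = k * (length - rest_len) * d / max(length, 1e-9)
        f[i] += fs
        f[i + 1] -= fs
    f_proxy = k_proxy * (proxy_pos - x[grabbed])    # virtual coupling to the haptic proxy
    f[grabbed] += f_proxy
    f -= damping * v
    v = v + dt * f / mass                           # semi-implicit Euler integration
    x = x + dt * v
    return x, v, -f_proxy                           # reaction force sent to the device

if __name__ == "__main__":
    x = np.c_[np.linspace(0.0, 1.0, 11), np.zeros(11), np.zeros(11)]
    v = np.zeros_like(x)
    for _ in range(200):                            # proxy pulls the first node upward
        x, v, feedback = step(x, v, rest_len=0.1, proxy_pos=np.array([0.0, 0.05, 0.0]))
    print("force to haptic device:", np.round(feedback, 3))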
Abstract: Haptic interfaces represent a revolution in human-computer interface technology since they make it possible for users to touch and manipulate virtual objects. In this work we describe a cross-modal interaction experiment to study the effect of adding haptic cues to visual cues when vision is not enough to disambiguate the images. We relate the results to those obtained in experimental psychology as well as to more recent studies on the subject.
Abstract: Kinematic modelling of robots and flexible material simulation have been important aspects of robotics research in recent years. In this article a new approach for the design of arbitrary gripper and robot architectures is presented. It is shown how these robot systems can be simulated based on the presented design structure. Furthermore, the integration of flexible material simulation is presented. A particle-based object model is described together with its mathematical basics. Finally, the integration of the flexible material simulation into the robot simulation software is presented and evaluated.
Abstract: The design of mechanical assemblies carried out using today's CAD systems is mainly developed by modelling single parts and then assembling them in a subsequent design phase. This bottom-up approach is not consistent with the way designers are used to designing assemblies. Besides, most CAD systems are weak with regard to several functionalities. For example, the manipulation or the modification of the part geometry or the mating conditions of a created assembly is not supported. The introduction of the feature-based approach in assembly design, already successfully used in the design of single parts, would offer several advantages. For example, it would allow designers to start drawing the assembly from a top-level view, leaving the design of details and subsystems to a following phase. This paper describes some results of research work done within a project funded by the European Union aiming at extending and validating the use of features to help in the solution of assembly problems in aeronautical applications. Copyright © 1999 Inderscience Enterprises Ltd.
Keywords: Assembly feature | Feature-based design | Knowledge formalization | Knowledge-based system
Abstract: This paper describes the preliminary results of the research work currently ongoing at the University of Parma, partially carried out within a basic research project funded by the European Union. The research work aims at applying the techniques used in gesture analysis and recognition to understanding the human skills involved in grasping and manipulating non-rigid objects. The various grasping gestures have been classified on the basis of some quantitative features extracted from the hand gesture analysis. Finally, it is planned to map the formalized skill into a robotic system that will be able to grasp and manipulate non-rigid objects, also coping with unexpected situations.
Abstract: This paper presents novel interaction techniques and tools to support the direct manipulation of flexible material models. We discuss interaction techniques and tools at two different levels: definition of the object/environment model, and manipulation of the digital environment as if it were real in order to modify and validate the model.
Abstract: The design of mechanical assemblies is mainly done after having modeled the single parts. Most CAD systems are still weak with regard to assembly. For example, they do not support the manipulation or the modification of the part geometry or the mating conditions once the assembly is made. The introduction of a feature-based approach in assembly design, successfully used in the design of single parts, would offer several advantages. This paper describes some results of the research work done within a project funded by the European Union aiming at extending the use of features to help in the solution of assembly problems in aeronautical applications.
Abstract: This article summarizes the main results of a joint endeavor towards a standard reference model (SRM) for intelligent multimedia presentation systems (IMMPSs). After a brief motivation, we give basic definitions for media terms and presentation systems. The core of this contribution is a generic reference architecture that reflects an implementation-independent view of the processes required for the generation of multimedia presentations. The reference architecture is described in terms of layers, components, and knowledge servers. Our SRM focuses on the functions assigned to the layers and components, rather than on the methods or communication protocols that may be employed to realize this functionality. Finally, we point to some possible extensions of the reference model. © 1997 Elsevier Science B.V.
Keywords: Automated generation of multimedia presentation | Intelligent multimedia presentation system | Standard reference model
Abstract: Multiple input devices are increasingly used in user interfaces to make human-computer communication more efficient and effective. Interface designers have to decide not only which input modes should be supported, but also how to fuse them into a single representation format that can be processed by the underlying application system. Making appropriate decisions requires, however, a sufficient understanding of the properties of fusion itself. While others have informally characterized input fusion as a transformation between information types, the purpose of this paper is to explore fusion by means of formal process modelling. That is, fusion processes are defined in a formal framework which supports proof of the existence of necessary properties following directly from the process definitions. The presented approach can be applied to analyse and compare fusion processes in existing systems, as well as to aid interface designers who have to verify the behaviour of their systems.
Keywords: Ambiguity | Formal methods | Fusion | Multimodal interfaces
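A concrete, much simpler instance of the fusion the entry above formalizes is merging two modality-specific events into one application command. The sketch below does that with a time window; the event types (SpeechEvent, PointingEvent, Command) and the one-second window are invented for illustration and carry none of the formal guarantees discussed in the paper.

# Illustrative sketch (assumed event types and window): fuse a spoken command
# and a pointing event into a single application command.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SpeechEvent:
    t: float            # seconds
    verb: str           # e.g. "delete", "move"

@dataclass
class PointingEvent:
    t: float
    object_id: str

@dataclass
class Command:
    verb: str
    object_id: str

def fuse(speech: SpeechEvent, pointing: PointingEvent,
         window_s: float = 1.0) -> Optional[Command]:
    """Return a fused command, or None if the two inputs cannot be unified."""
    if abs(speech.t - pointing.t) <= window_s:
        return Command(speech.verb, pointing.object_id)
    return None         # ambiguity left to a higher-level dialogue component

if __name__ == "__main__":
    print(fuse(SpeechEvent(10.2, "delete"), PointingEvent(10.6, "valve_3")))
    print(fuse(SpeechEvent(10.2, "delete"), PointingEvent(14.0, "valve_3")))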
Abstract: User interfaces of current 3D and virtual reality environments require highly interactive input/output (I/O) techniques and appropriate input devices, providing users with natural and intuitive ways of interacting. This paper presents an interaction model, some techniques, and some ways of using novel input devices for 3D user interfaces. The interaction model is based on a tool-object syntax, where the interaction structure syntactically simulates an action sequence typical of a human's everyday life: One picks up a tool and then uses it on an object. Instead of using a conventional mouse, actions are input through two novel input devices, a hand- and a force-input device. The devices can be used simultaneously or in sequence, and the information they convey can be processed in a combined or in an independent way by the system. The use of a hand-input device allows the recognition of static poses and dynamic gestures performed by a user's hand. Hand gestures are used for selecting, or acting as, tools and for manipulating graphical objects. A system for teaching and recognizing dynamic gestures, and for providing graphical feedback for them, is described. © 1994, Taylor & Francis Group, LLC. All rights reserved.
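The tool-object syntax described above can be read as a two-step state machine: a gesture first selects a tool, and a subsequent gesture applies it to an object. The sketch below is a minimal reading of that idea; the gesture names ("pinch", "grab", "point") and tool names are hypothetical, not those of the system in the paper.

# Illustrative sketch (hypothetical gestures and tools): a minimal tool-object
# interaction loop in which one gesture picks a tool and the next applies it.
class ToolObjectInteraction:
    TOOL_GESTURES = {"pinch": "scale_tool", "grab": "move_tool"}

    def __init__(self):
        self.current_tool = None

    def on_gesture(self, gesture, target_object=None):
        if gesture in self.TOOL_GESTURES and self.current_tool is None:
            self.current_tool = self.TOOL_GESTURES[gesture]    # "pick up the tool"
            return f"selected {self.current_tool}"
        if gesture == "point" and self.current_tool and target_object:
            tool, self.current_tool = self.current_tool, None  # "use it on an object"
            return f"applied {tool} to {target_object}"
        return "ignored"

if __name__ == "__main__":
    ui = ToolObjectInteraction()
    print(ui.on_gesture("grab"))
    print(ui.on_gesture("point", target_object="cube_7"))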
Abstract: Novel 3-D input devices are taking the place of traditional 2-D input devices in current 3-D graphical user interfaces. Among them, hand-input devices seem intuitive and powerful for performing tasks in 3-D environments. They offer the possibility to interact by means of hand gestures, i.e. in a natural way, close to human habits of interacting with the surrounding world. This paper describes a system supporting hand gestures for interacting with 3-D user interfaces. The system provides a visual programming environment for the design of gesture languages, defined as a collection of hand gestures. Moreover, a system for interpreting a gesture language, where gestures are performed by a user wearing a hand-input device, and for providing graphical feedback during the interaction, is presented. © 1994 Academic Press Limited.
Abstract: In the user interfaces of modern systems, users get the impression of directly interacting with application objects. In 3D-based user interfaces, novel input devices, like hand- and force-input devices, are being introduced. They aim at providing natural ways of interaction. The use of a hand-input device allows the recognition of static poses and dynamic gestures performed by a user's hand. This paper describes the use of a hand-input device for interacting with a 3D graphical application. A dynamic gesture language, which allows users to teach the system some hand gestures, is presented. Furthermore, a user interface integrating the recognition of these gestures and providing feedback for them is introduced. Particular attention has been paid to implementing a tool for the easy specification of dynamic gestures, and to strategies for providing graphical feedback to users' interactions. To demonstrate that the introduced 3D user interface features, and the way the system presents graphical feedback, are not restricted to a hand-input device, a force-input device has also been integrated into the user interface. © 1993 Eurographics Association
Keywords: Interactive techniques | novel graphic applications | novel input devices
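One generic way to recognize taught dynamic gestures like those in the entries above is template matching over joint-angle sequences with dynamic time warping. The sketch below is such a stand-in; the gesture names and synthetic sequences are invented, and this is not the recognition method actually used in these papers.

# Illustrative sketch (generic DTW template matcher, hypothetical gestures):
# classify a dynamic hand gesture recorded as a sequence of joint-angle vectors.
import numpy as np

def dtw_distance(a, b):
    """a, b: (Ta, D) and (Tb, D) sequences of joint-angle vectors."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    D = np.full((len(a) + 1, len(b) + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[len(a), len(b)]

def classify(sequence, templates):
    """templates: dict gesture_name -> taught example sequence."""
    return min(templates, key=lambda name: dtw_distance(sequence, templates[name]))

if __name__ == "__main__":
    t = np.linspace(0.0, 1.0, 30)[:, None]
    templates = {"open_hand": np.hstack([t, 1 - t]), "fist": np.hstack([1 - t, t])}
    observed = np.hstack([t[::2], 1 - t[::2]]) + 0.05    # shorter, noisy "open_hand"
    print("recognized gesture:", classify(observed, templates))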
Abstract: This paper presents a toolkit for the generation of User Interfaces (UIs) for CIM applications, in particular scheduling and manufacturing control packages. An analysis of selected application requirements and of the state of the art in UI development led to the design and implementation of a UIMS (User Interface Management System) based on the System Builder approach. The prototype allows the design and fully automatic implementation of UIs using graphical and interactive tools, without having to depend on routine libraries or programming-language expertise.