Caruso Giandomenico

Associate Professor


Politecnico di Milano
giandomenico.caruso@polimi.it

Institutional website
SCOPUS ID: 23466272900
Orcid: 0000-0003-2654-093X



Scientific publications

[1] Picardi A., Ballabio G., Arquilla V., Caruso G., A Study on Haptic Actuators to Improve the User Experience of Automotive Touchscreen Interfaces, Computer-Aided Design and Applications, 22(1), 136-149, (2025).

Abstract: This paper presents a novel analysis of haptic actuators for touchscreen interfaces within the automotive industry. The research, distinct from existing studies, aims to identify the most suitable actuators for delivering effective and efficient haptic feedback, focusing on practical implications. Six experts reviewed three haptic effects and four actuators, providing qualitative feedback on force, quality, reactivity, and duration using a 7-point Likert scale. The test was conducted in a car-simulated environment with a car seat and a central touchscreen display. The four actuators were positioned behind the touchscreen display, each connected to separate control modules communicating via serial interface with the computer managing the Graphic User Interface. This research provides practical insights into choosing haptic actuators for automotive touchscreen interfaces, focusing on enhancing user experience and feedback. The findings can be directly applied to enhance car haptic feedback systems, improving safety, user engagement, and driving experiences.

Keywords: Automotive | Haptic Feedback | Human-Machine Interface

[2] Micoli L.L., Guidi G., Caruso G., Automatic 3D Modeling Process for Predefined Geometrical Categories Based on Convolutional Neural Network and Computer-Vision Analysis of Orthographic Images, Computer-Aided Design and Applications, 21(4), 677-692, (2024).

Abstract: Implementing an alternative reality for Metaverse implies modeling optimized digital content to guarantee real-time interaction and high-quality rendering. Even if 3D reconstruction based on 3D scanning techniques provides a good replica of real objects, the output files are challenging to use in this application. On the other hand, manually developed optimized 3D models require much time and effort. This aspect becomes crucial in scenarios including thousands of 3D models with which humans should interact. This paper proposes a method to automate the 3D modeling process of items whose shapes can be classified according to predefined geometrical categories. The dataset for this study relates to products, which present a wide variety of shapes but are attributable to just a few formal archetypes. In the proposed pipeline, metric orthographic images of the object to be digitally reproduced are analyzed by Convolutional Neural Networks (CNNs). Subsequently, the same images are analyzed with Computer-Vision (CV) algorithms to extrapolate the characteristic dimensions related to the assigned archetypes. The method has been tested on different items, and the results proved the effectiveness of the whole approach in terms of correct archetype recognition, parameter extraction, and creation of the 3D model, which is comparable with 3D models digitized with high-quality scanning tools but much lighter in model size.

Keywords: 3D Object Database | Artificial Intelligence | Automatic 3D Modeling | Computer Vision | Convolutional Neural Network | Process Optimization
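
The pipeline in [2] couples CNN-based archetype classification with computer-vision measurement of metric orthographic images. The following Python sketch only illustrates the general idea and is not the authors' implementation: the classifier is a stub standing in for the trained CNN, and the image, scale, and archetype name are hypothetical.

# Illustrative sketch only -- not the authors' code. Assumes OpenCV and NumPy;
# the archetype classifier is stubbed out where a trained CNN would be used.
import cv2
import numpy as np

MM_PER_PIXEL = 0.5  # hypothetical metric scale of the orthographic image

def classify_archetype(image: np.ndarray) -> str:
    """Placeholder for the trained CNN that assigns a geometrical archetype."""
    return "box"  # e.g. one of a few predefined formal archetypes

def extract_dimensions(image: np.ndarray) -> dict:
    """Measure characteristic dimensions of the object silhouette."""
    _, binary = cv2.threshold(image, 127, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
    return {"width_mm": w * MM_PER_PIXEL, "height_mm": h * MM_PER_PIXEL}

# Synthetic orthographic view: a white rectangle on a black background.
ortho = np.zeros((400, 400), dtype=np.uint8)
cv2.rectangle(ortho, (100, 150), (300, 250), 255, -1)

archetype = classify_archetype(ortho)
params = extract_dimensions(ortho)
print(archetype, params)  # these parameters would drive a parametric 3D model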

[3] Aydin F., Caruso G., Mussone L., An Image Processing-Based Method to Analyze Driver Visual Behavior Using Eye-Tracker Data, Applied Sciences (Switzerland), 14(14), (2024).

Abstract: This paper presents a practical method for analyzing drivers' eye movements, providing a valuable tool for understanding their behavior during driving simulations. The method, which utilizes an image processing technique, addresses the challenges that arise when the driver's attention is on points without information about the image depth, since the screen image changes or moves with the simulation. It allows us to identify the gaze position relative to the road, determining whether the glance is inside or outside. This is achieved by transforming the RGB images (frames) collected by the eye-tracker video camera into b/w images using the Canny filter. This filter can identify objects' contours by evaluating the change in color of their surfaces. A window is then applied to these new images to extract information about the gaze position in the real world. Four drivers were used as a sample for the method's testing. The findings demonstrate variations among drivers and a disparity between driving in curved and rectilinear segments. The gaze is typically inside the road in curved sections, whereas in rectilinear sections, the gaze is frequently outside.

Keywords: driver behavior | driving simulator | eye-tracker | gaze analysis | image processing
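
The core image-processing step in [3] converts each eye-tracker frame into an edge map with the Canny filter and then inspects a window around the gaze point. The sketch below, assuming OpenCV, only illustrates that step on a synthetic frame; the frame content, gaze coordinates, Canny thresholds, and edge-density decision rule are hypothetical and not the authors' parameters.

# Illustrative sketch only -- not the authors' code.
import cv2
import numpy as np

def gaze_window_edges(frame_bgr, gaze_xy, half_size=40):
    """Return the fraction of edge pixels inside a window centred on the gaze point."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)   # contours of objects in the scene
    x, y = gaze_xy
    win = edges[max(0, y - half_size):y + half_size, max(0, x - half_size):x + half_size]
    return float(np.count_nonzero(win)) / win.size

# Synthetic frame with a bright stripe standing in for a road edge, and a hypothetical gaze point.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
cv2.line(frame, (0, 300), (640, 300), (255, 255, 255), 3)
density = gaze_window_edges(frame, (320, 300))
print("inside road region" if density > 0.02 else "outside road region")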

[4] Picardi A., Caruso G., User-Centered Evaluation Framework to Support the Interaction Design for Augmented Reality Applications, Multimodal Technologies and Interaction, 8(5), (2024).

Abstract: The advancement of Augmented Reality (AR) technology has been remarkable, enabling the augmentation of user perception with timely information. This progress holds great promise in the field of interaction design. However, the mere advancement of technology is not enough to ensure widespread adoption. The user dimension has been somewhat overlooked in AR research due to a lack of attention to user motivations, needs, usability, and perceived value. The critical aspects of AR technology tend to be overshadowed by the technology itself. To ensure appropriate future assessments, it is necessary to thoroughly examine and categorize all the methods used for AR technology validation. By identifying and classifying these evaluation methods, researchers and practitioners will be better equipped to develop and validate new AR techniques and applications. Therefore, comprehensive and systematic evaluations are critical to the advancement and sustainability of AR technology. This paper presents a theoretical framework derived from a cluster analysis of the most efficient evaluation methods for AR extracted from 399 papers. Evaluation methods were clustered according to the application domains and the human–computer interaction aspects to be investigated. This framework should facilitate rapid development cycles prioritizing user requirements, ultimately leading to groundbreaking interaction methods accessible to a broader audience beyond research and development centers.

Keywords: augmented reality | evaluation framework | human–computer interaction | methods and tools | user evaluation | user testing

[5] Yan M., Rampino L.R.E., Caruso G., User Acceptance of Autonomous Vehicles: Review and Perspectives on the Role of the Human-machine Interfaces, Computer-Aided Design and Applications, 20(5), 987-1004, (2023).

Abstract: Although autonomous driving has significantly developed in the last years, its acceptance by users is still low, also due to the different interaction modalities between the human agent and Autonomous Vehicles (AVs). Therefore, this paper proposes an analysis of the existing research on the influence of Human-Machine Interfaces (HMIs) on the user acceptance of AVs from the perspective of interaction design. The authors reviewed the fundamental changes in the way users interact with AVs. The paper focuses on the transfer of the vehicle control between the human and the artificial intelligence agent, the user experience of Non-Driving-Related Tasks (NDRTs) and sharing autonomous driving in public transportation, and the impact of external HMI on Vulnerable Road Users (VRUs). In addition, the paper analyzes the concept of acceptability and describes the existing user acceptance models. Finally, the paper explores the future challenges for promoting the design potential of autonomous vehicle HMIs and proposes areas worthy of research to increase the user's acceptance of this technology.

Keywords: autonomous vehicles | human-machine interface | user acceptance model | vulnerable road users

[6] Bellani P., Picardi A., Caruso F., Gaetani F., Brevi F., Arquilla V., Caruso G., Enhancing User Engagement in Shared Autonomous Vehicles: An Innovative Gesture-Based Windshield Interaction System, Applied Sciences (Switzerland), 13(17), (2023).

Abstract: With the rapid advancement of autonomous vehicles, a transformative transportation paradigm is emerging in the automotive industry, necessitating a re-evaluation of how users engage with and utilize these evolving settings. This research paper introduces an innovative interaction system tailored for shared autonomous vehicles, focusing on its development and comprehensive evaluation. The proposed system uses the car’s windshield as an interactive display surface, enabling infotainment and real-time information about the surrounding environment. The integration of two gesture-based interfaces forms a central component of the system. Through a study involving twenty subjects, we analyzed and compared the user experience facilitated by these interfaces. The study outcomes demonstrated that the subjects exhibited similar behaviors and responses across both interfaces, thus validating the potential of these interaction systems for future autonomous vehicles. These findings collectively emphasize the transformative nature of the proposed system and its ability to enhance user engagement and interaction within the context of autonomous transportation.

Keywords: autonomous vehicle | gesture | haptic | head up display | human computer interaction | human machine interface | user experience

[7] Malik U.S., Micoli L.L., Caruso G., Guidi G., Integrating quantitative and qualitative analysis to evaluate digital applications in museums, Journal of Cultural Heritage, 62, 304-313, (2023).

Abstract: Analyzing the effectiveness of a digital museum application is generally a complex process. This is due to the interaction between a digital tool capable of a limited number of narration paths and a human being whose multifaceted reactions are uniquely affected by many cultural, psychological, and emotional elements. The classical approaches in the literature can be divided into two main areas: qualitative methods, usually based on interviews at the end of the experience for assessing the cognitive effect of the digital interaction, and quantitative approaches, generally more oriented toward the assessment of the subjects' emotional reactions. Among the quantitative methods, we can mention those dealing with measurements of physiological parameters during the experience, such as ECG, EEG, temperature, etc. This paper proposes a novel approach based on integrating a qualitative process using a two-stage interview, with the quantitative analysis of the user's keystrokes during the digital interaction with the application. The results show how the quantitative component can decouple the interview answers from the interviewer's possible interferences, simultaneously delivering: i) a more transparent assessment of the user's learning experience and ii) a mapping of the application's least effective sections. This provides crucial input for improving the design of digital applications for museums.

Keywords: Digital application for museums | Digital interaction assessment | Learning experience | Qualitative analysis | Quantitative analysis

[8] Morosi F., Becattini N., Caruso G., Cascini G., Measuring the Impact of Augmented Prototyping Systems in Co-Design Activities, Multimodal Technologies and Interaction, 7(11), (2023).

Abstract: In recent years, research has reached a very high level of development and validation of augmented prototyping systems in support of collaborative design activities. However, there is still great scepticism in companies when it comes to integrating these new technologies within a consolidated working model. Among others, the main barrier to overcome concerns the lack of understanding of the impact of AR systems on the key objectives of a business, such as improving its efficiency and revenue. For this reason, this paper aims to quantify these indicators by observing the technological impact not on a single design session but on an entire product development process, during which the aspects related to its integration are also considered. Thanks to the collaboration with a design agency, it was possible to compare parameters such as the lead time, number of iterations, person-hours and costs between two similar and realistic projects, of which only one was supported by projection-based AR technology.

Keywords: co-design process | industrial validation | key performance indicator | spatial augmented reality | technology acceptance

[9] Yan M., Lin Z., Lu P., Wang M., Rampino L., Caruso G., Speculative Exploration on Future Sustainable Human-Machine Interface Design in Automated Shuttle Buses, Sustainability (Switzerland), 15(6), (2023).

Abstract: Automated Shuttle buses (ASB) are considered an essential and sustainable direction for the future application of autonomous driving technology in public transportation. As the driver’s role gradually decreases and disappears, the Human–Machine Interface (HMI) for information exchange and communication between users and ASB takes a more prominent role and progressively becomes a hotspot in research. However, the unpredictability and complexity of autonomous driving, an exceptionally fast-growing technology, have hindered its future study. This work first reviewed related literature in three categories: internal, external, and station of the ASB. Secondly, the importance of systemic and speculative design is affirmed by exploring existing HMI designs for ASB. Thirdly, the concepts for ASB resulting from three parallel workshops were analyzed. Finally, online questionnaires and interviews completed the critical reflection and discussion. The results show that the introduction of tools and methods related to systemic and speculative design into the design process of the HMI for ASB may help designers to think critically about the future uncertainty of ASB and to deal with the complexity of the system.

Keywords: automated shuttle buses | human-machine interface | speculative design | systemic design | user experience

[10] Arfini S., Bellani P., Picardi A., Yan M., Fossa F., Caruso G., Design for Inclusivity in Driving Automation: Theoretical and Practical Challenges to Human-Machine Interactions and Interface Design, Studies in Applied Philosophy, Epistemology and Rational Ethics, 67, 63-85, (2023).

Abstract: This chapter explores Human–Machine Interaction (HMI) challenges to inclusivity in driving automation by focusing on a case study concerning an automated vehicle for public transportation. Connected and Automated Vehicles (CAVs) are widely considered a tremendous opportunity to provide new mobility options to many who currently cannot drive: the young, the elderly, people suffering from cognitive or physical impairments, and so on. However, the association between driving automation and inclusivity rests too much on the mere automation of driving tasks. More attention should instead be dedicated to the HMI field for CAVs to be suitable for multiple users with different needs. From this perspective, inclusivity is an utterly complicated objective to accomplish—one that requires a new definition of human agents, fine-grained methodological discussions, innovative design solutions, and extensive testing. Based on these considerations, we present a case study and discuss the challenges that need to be faced when designing inclusive CAVs.

Keywords: Automated vehicles | Design ethics | Human–machine interaction | Inclusivity | Interface design

[11] Morosi F., Caruso G., Becattini N., Cascini G., Projected Augmented Reality for Industrial Design: Challenges and Opportunities, Lecture Notes in Networks and Systems, 745 LNNS, 61-73, (2023).

Abstract: Spatial-Augmented Reality (SAR) technology has the potential to impact decision-making in various design fields, such as architecture, engineering, and product design. However, despite its benefits, SAR technology has not yet been fully embraced by professionals. This study aims to identify the essential features of a SAR platform that would meet the demands and expectations of different design activities, including product appearance, interface, and ergonomics. The research involved experts who used a SAR platform in real-life product development, and data was collected through semi-structured questionnaires and interviews. The analysis revealed a significant correlation between the features of the technology, considering both potential and already available ones, and the needs of specific design sectors during product development processes. The study’s outcomes provide valuable insights for advancing SAR technology in the design field and promoting its adoption by professionals.

Keywords: Augmented Reality | Design activity | Technology assessment

[12] Yan M., Lu P., Arquilla V., Brevi F., Rampino L., Caruso G., Systemic Design Strategies for Shaping the Future of Automated Shuttle Buses, Applied Sciences (Switzerland), 13(21), (2023).

Abstract: Automated shuttle buses entail adopting new technologies and modifying users’ practices, cultural and symbolic meanings, policies, and markets. This results in a paradigmatic transition for a typical sociotechnical system: the transport system. However, the focus of the extant literature often lacks an overall vision, addressing a single technology, supply chain, or societal dimension. Although systemic design can manage multiple-level and long-term transitions, the literature does not discuss how systemic design tools can support implementation. This paper takes the four strategies proposed by Pereno and Barbero in 2020 as the theoretical framework to fill this literature gap, discussing the specific systemic design methods applicable to the design of automated shuttle bus systems. A six-week workshop to facilitate the exploration of future autonomous public transportation is taken as a case study. The systemic design approach was applied to enrich the Human–Machine Interaction (HMI) and functional architecture of automated shuttle buses.

Keywords: automated shuttle buses | sociotechnical system | speculative design | systemic design

[13] Shi Y., Boffi M., Piga B.E.A., Mussone L., Caruso G., Perception of Driving Simulations: Can the Level of Detail of Virtual Scenarios Affect the Driver's Behavior and Emotions?, IEEE Transactions on Vehicular Technology, 71(4), 3429-3442, (2022).

Abstract: Human factors studies are becoming more and more crucial in the automotive sector due to the need to evaluate the driver's reactions to the increasingly sophisticated driving-assistant technologies. Driving simulators allow performing this kind of study in a controlled and safe environment. However, the driving simulation's Level of Detail (LOD) can affect the users' perception of driving scenarios and make an experimental campaign's outcomes unreliable. This paper proposes a study investigating possible correlations between drivers' behaviors and emotions and simulated driving scenarios. Four scenarios replicating the same real area were built with four LODs, from LOD0 (only the road is drawn) to LOD3 (all buildings with real textures for facades and roofs are inserted together with items visible from the road). 32 participants drove in all the four scenarios on a fixed-base driving simulator; their performance relating to the vehicle control (i.e., speed, trajectory, brake and gas pedal use, and steering wheel), their physiological data (electrodermal activity and eye movements), and their subjective perceptions, opinions and emotional state were measured. The results showed that drivers' behavior changes in a very complex way. Geometrical features of the route and environmental elements constrain driving behavior much more than LOD does. Emotions are not affected by LODs. Generally, different signals showed different correlations with the LOD level, suggesting that future studies should consider their measures while modeling the virtual scenario. It is hypothesized that scenario realism is more relevant during leisurely environmental interaction, whilst simulator fidelity is crucial in task-driven interactions.

Keywords: driver behavior | Driving simulator | environmental psychology | eye tracker | level of detail (LOD) | simulation reliability

[14] Boffi M., Piga B.E.A., Mussone L., Caruso G., Investigating objective and perceived safety in road mobility, Transportation Research Procedia, 60, 600-607, (2022).

Abstract: The paper presents the human-centered interdisciplinary methodology "SafeMob - Safe Mobility Experiential and Environmental Assessment" developed by the authors. The method combines Objective Safety (OS) with Perceived Safety (PS) to evaluate the performance of mobility solutions. The interrelation of data from the person (physio/psycho) and the environment (road, vehicle, and the surrounding or interacting context - including flows, buildings, and weather) makes the methodology holistic and interdisciplinary. The final goal is to provide a 'Decision Support System' for stakeholders in the mobility field, the automotive sector, and the urban planning area. The paper describes the overall theoretical approach and a specific case study application using a car simulator. Emotional reactions of users, driving through the same virtual scenario with different Levels of Detail, are assessed to gather information about the perceived safety of the environment.

Keywords: Drive simulator | Emotions | Level of Detail | Physiological measures | Virtual reality

[15] Morosi F., Caruso G., Cascini G., Spatial Augmented Reality as a Visualization Support for Engineering Analysis, Lecture Notes in Mechanical Engineering, 103-115, (2022).

Abstract: Projector-based Spatial Augmented Reality (P-SAR) is a technology that makes it possible to alter the external appearance of a physical object by means of an almost infinite variety of computer-generated contents. Thanks to the adoption of colored lights, a projected image acquires coherent spatial properties with respect to the data it represents; this has been demonstrated to facilitate the users' interpretation of complex information. The current paper presents the development of a P-SAR system aiming at supporting the real-time visualization and the inspection of engineering simulation results. Particular attention is paid to detailing the algorithms necessary for the generation of the color maps to be displayed on the prototype's surface. These are interpolated starting from a discrete array of output data coming from a generic simulation to resemble the configuration of sensors commonly adopted in real experimental setups. An illustrative case study applied to CFD analysis is finally discussed to show the applicability of such immersive environments in engineering fields that require performing testing activities with equipment like wind tunnels.

Keywords: Engineering analysis | Simulation | Spatial augmented reality
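
The colour maps in [15] are interpolated from a discrete array of simulation outputs before being projected on the prototype. The sketch below, assuming SciPy, illustrates such an interpolation with synthetic probe positions and values; it is not the authors' algorithm.

# Illustrative sketch only -- not the authors' algorithm.
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(1)
sensor_xy = rng.uniform(0.0, 1.0, size=(12, 2))   # positions of 12 virtual probes on the surface
sensor_val = rng.uniform(20.0, 80.0, size=12)     # e.g. pressure or temperature from a CFD run

# Dense grid covering the projected surface, interpolated from the sparse probe values.
gx, gy = np.mgrid[0.0:1.0:100j, 0.0:1.0:100j]
field = griddata(sensor_xy, sensor_val, (gx, gy), method="linear")

# Normalise to [0, 1]; a colormap lookup would then turn this into the projected texture.
norm = (field - np.nanmin(field)) / (np.nanmax(field) - np.nanmin(field))
print(norm.shape)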

[16] Caruso G., Yousefi M.K., Mussone L., From Human to Autonomous Driving: A Method to Identify and Draw Up the Driving Behaviour of Connected Autonomous Vehicles, Vehicles, 4(4), 1430-1449, (2022).

Abstract: The driving behaviour of Connected and Automated Vehicles (CAVs) may influence the final acceptance of this technology. Developing a driving style suitable for most people implies the evaluation of alternatives that must be validated. Intelligent Virtual Drivers (IVDs), whose behaviour is controlled by a program, can test different driving styles along a specific route. However, multiple combinations of IVD settings may lead to similar outcomes due to their high variability. The paper proposes a method to identify the IVD settings that can be used as a reference for a given route. The method is based on the cluster analysis of vehicular data produced by a group of IVDs with different settings driving along a virtual road scenario. Vehicular data are clustered to find IVDs representing a driving style to classify human drivers who previously drove on the same route with a driving simulator. The classification is based on the distances between the different vehicular signals calculated for the IVD and recorded for human drivers. The paper includes a case study showing the practical use of the method applied on an actual road circuit. The case study demonstrated that the proposed method allowed identifying three IVDs, among the 29 simulated, which have been subsequently used as a reference to cluster 26 human driving styles. These representative IVDs, which ideally replicate the driving style of human drivers, can be used to support the development of CAV control logic that better fits human expectations. A closing discussion about the flexibility of the method in terms of the different natures of data collection depicts future applications and perspectives.

Keywords: automotive engineering | autonomous driving | driving behaviour | driving simulator | intelligent agents | pattern clustering
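
A minimal sketch of the clustering-and-matching idea described in [16], assuming scikit-learn; the two summary features, the number of clusters, and the random data standing in for the vehicular signals are hypothetical.

# Illustrative sketch only -- not the authors' code.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
ivd_features = rng.normal(size=(29, 2))    # 29 simulated IVD settings, 2 summary features each
human_features = rng.normal(size=(26, 2))  # 26 human drivers on the same route

# Cluster the IVDs and keep, per cluster, the IVD closest to the centroid as representative.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(ivd_features)
representatives = np.array([
    ivd_features[np.argmin(np.linalg.norm(ivd_features - c, axis=1))]
    for c in km.cluster_centers_
])

# Assign each human driver to the nearest representative IVD (i.e. the closest driving style).
dist = np.linalg.norm(human_features[:, None, :] - representatives[None, :, :], axis=2)
assigned_style = dist.argmin(axis=1)
print(assigned_style)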

[17] Ariansyah D., Pardamean B., Caruso G., The effect of visual advanced driver assistance systems on a following human driver in a mixed-traffic condition, Procedia Computer Science, 216, 221-229, (2022).

Abstract: Rapid development in vehicular technology has caused more automated vehicle control to increase on the roads. Studies showed that driving in mixed traffic with an autonomous vehicle (AV) had a negative impact on the time headway (THW) of conventional vehicles (CVs) (i.e., driven by humans). To address this issue, there is a need to equip CVs with visual advanced driver assistance systems (ADASs) that help the driver maintain a safe headway when driving near AVs. This study examines the perception of drivers using visual ADAS and their associated risk while driving behind the AV at constant and varying speeds. The preliminary results showed that while visual ADAS could help drivers keep a safe THW, it could affect drivers' ability to react to emergencies. This implies that visual modality alone might not be sufficient and therefore requires some other feedback or intelligent transport systems to help drivers maintain safe driving in a mixed-traffic condition.

Keywords: car-following | sustainable transport management platooning | virtual reality | visual advanced driver assistance system
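
Time headway (THW), the safety measure discussed in [17], is the inter-vehicle gap divided by the speed of the following vehicle. A minimal numerical sketch with hypothetical values:

# Illustrative sketch: time headway of a conventional vehicle following an AV.
gap_m = 30.0        # distance to the lead (autonomous) vehicle, metres (hypothetical)
speed_mps = 20.0    # speed of the following, human-driven vehicle, metres per second
thw_s = gap_m / speed_mps
print(f"THW = {thw_s:.1f} s")  # often compared against the commonly cited two-second rule of thumb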

[18] Fossa F., Arrigoni S., Caruso G., Cholakkal H.H., Dahal P., Matteucci M., Cheli F., Operationalizing the Ethics of Connected and Automated Vehicles: An Engineering Perspective, International Journal of Technoethics, 13(1), (2022).

Abstract: In response to the many social impacts of automated mobility, in September 2020 the European Commission published “Ethics of Connected and Automated Vehicles,” a report in which recommendations on road safety, privacy, fairness, explainability, and responsibility are drawn from a set of eight overarching principles. This paper presents the results of an interdisciplinary research where philosophers and engineers joined efforts to operationalize the guidelines advanced in the report. To this aim, the authors endorse a function-based working approach to support the implementation of values and recommendations into the design of automated vehicle technologies. Based on this, they develop methodological tools to tackle issues related to personal autonomy, explainability, and privacy as domains that most urgently require fine-grained guidance due to the associated ethical risks. Even though each tool still requires further inquiry, they believe that the work might already prove the productivity of the function-based approach and foster its adoption in the CAV scientific community.

Keywords: Application | Autonomy | Connected and Automated Vehicles | Ethics | Explainability | Principles | Privacy

[19] Ulrich L., Nonis F., Vezzetti E., Moos S., Caruso G., Shi Y., Marcolin F., Can ADAS distract driver's attention? An RGB-D camera and deep learning-based analysis, Applied Sciences (Switzerland), 11(24), (2021).

Abstract: Driver inattention is the primary cause of vehicle accidents; hence, manufacturers have introduced systems to support the driver and improve safety; nonetheless, advanced driver assistance systems (ADAS) must be properly designed not to become a potential source of distraction for the driver due to the provided feedback. In the present study, an experiment on auditory and haptic ADAS has been conducted, involving 11 participants, whose attention has been monitored during their driving experience. An RGB-D camera has been used to acquire the drivers' face data. Subsequently, these images have been analyzed using a deep learning-based approach, i.e., a convolutional neural network (CNN) specifically trained to perform facial expression recognition (FER). Analyses to assess possible relationships between these results and both ADAS activations and event occurrences, i.e., accidents, have been carried out. A correlation between attention and accidents emerged, whilst facial expressions and ADAS activations turned out not to be correlated; thus, no evidence that the designed ADAS are a possible source of distraction has been found. In addition to the experimental results, the proposed approach has proved to be an effective tool to monitor the driver through the usage of non-invasive techniques.

Keywords: ADAS | CNN | DADA | Deep learning | Driver’s attention | RGB-D camera

[20] Morosi F., Carli I., Caruso G., Cascini G., Dekoninck E., Boujut J.F., Exploring Tablet Interfaces for Product Appearance Authoring in Spatial Augmented Reality, International Journal of Human Computer Studies, 156, (2021).

Abstract: Users' acceptance of innovative product appearance authoring tools based on Spatial Augmented Reality (SAR) is still limited due to their perception of a high technology complexity and a low performance/functionality of the current interaction systems. The integration of SAR technologies in professional design activities is still marginal, though many studies in this field have already proved their potential as supporting tools. To overcome this barrier, efficient means for interacting with the digital images projected onto the surfaces of real objects are essential. The aim of the current study is to respond to this demand by proposing and validating three UI configurations displayed by a unique and portable device embedded with a touch screen. These interface layouts, designed to cooperate with the output of the SAR system and to not affect the well-known benefits of its augmented environment, provide different types of visual feedback to the user by duplicating, extending or hiding the information already displayed by the projected mock-up. The experimental study reported here, performed with a panel of 41 subjects, revealed that accuracy, efficiency and perceived usability of the proposed solutions are comparable with each other and with standard desktop setups commonly used for design activities. According to these findings, the research simultaneously demonstrates (i) the high performances achieved by the touch device when coupled with a SAR system during the execution of authoring tasks, (ii) the capability of the projected mock-up to behave as an actual three-dimensional display for the real-time rendering of the product appearance and (iii) the possibility to freely select - according to the user's preference, the design task or the type of product - one of the three UI configurations without affecting the quality of the result.

Keywords: Authoring tool | Human-Computer interface | Interface validation | Spatial Augmented Reality | Touch interaction | Usability evaluation

[21] Bellani P., Carulli M., Caruso G., Gestural interfaces to support the sketching activities of designers, Proceedings of the ASME Design Engineering Technical Conference, 2, (2021).

Abstract: The several loops characterizing the design process used to slow down the development of new projects. Since the 70s, the design process has changed due to the new technologies and tools related to Computer-Aided Design software and Virtual Reality applications that make almost the whole process digital. However, the concept phase of the design process is still based on traditional approaches, while digital tools are poorly exploited. In this phase, designers need tools that allow them to rapidly save and freeze their ideas, such as sketching on paper, which is not integrated in the digital-based process. The paper presents a new gestural interface to give designers more support by introducing an effective device for 3D modelling to improve and speed up the conceptual design process. We designed a set of gestures to allow people from different backgrounds to 3D model their ideas in a natural way. A testing session with 17 participants allowed us to verify whether the proposed interaction was intuitive or not. At the end of the tests, all participants succeeded in 3D modelling a simple shape (a column), exactly as they expected it to be built, by only using air gestures in a relatively short amount of time, confirming the validity of the proposed interaction.

Keywords: Design process | Gesture recognition

[22] Morosi F., Caruso G., Configuring a VR simulator for the evaluation of advanced human–machine interfaces for hydraulic excavators, Virtual Reality, (2021).

Abstract: This study is aimed at evaluating the impact of different technical solutions of a virtual reality simulator to support the assessment of advanced human–machine interfaces for hydraulic excavators based on a new coordinated control paradigm and haptic feedbacks. By mimicking the end-effector movements, the control is conceived to speed up the learning process for novice operators and to reduce the mental overload on those already trained. The design of the device can fail if ergonomics, usability and performance are not grounded on realistic simulations where the combination of visual, auditory and haptic feedbacks makes the users feel like being in a real environment rather than a computer-generated one. For this reason, a testing campaign involving 10 subjects was designed to discriminate the optimal set-up for the hardware to ensure a higher immersion into the VR experience. Both the audio–video configurations of the simulator (head-mounted display and surround system vs. monitor and embedded speakers) and the two types of haptic feedback for the soil–bucket interaction (contact vs. shaker) are compared in three different scenarios. The performance of both the users and simulator are evaluated by processing subjective and objective data. The results show how the immersive set-up improves the users' efficiency and ergonomics without putting any extra mental or physical effort on them, while the preferred haptic feedback (contact) is not the more efficient one (shaker).

Keywords: Excavator coordinated control | Haptic control | Human–machine interface | Multi-sensory feedbacks | Virtual reality simulator

[23] Piñones E., Cascini G., Caruso G., Morosi F., Overcoming augmented reality adoption barriers in design: A mixed prototyping content authoring tool supported by computer vision, Proceedings of the Design Society, 1, 2359-2368, (2021).

Abstract: Enhancing the appearance of physical prototypes with digital elements, also known as mixed prototyping, has proved to be a valuable approach in the product development process. However, its adoption is limited, also due to the considerable time and competence required for authoring the digital contents. This paper presents a content authoring tool that aims to improve the user acceptance by reducing the specific competence required, which is needed for segmentation and UV mapping of the 3D model used to implement a mixed prototype. Part of the tasks related to 3D modelling software, in fact, has been transferred to simpler manual tasks applied onto the physical prototype. Moreover, the proposed tool can recognise these manual inputs thanks to a computer-vision algorithm and automatically manage the segmentation and UV mapping tasks, freeing time for the user in a task that otherwise would require complete engagement. To preliminarily evaluate the effectiveness and potential of the tool, it has been used in a case study to build up the mixed prototype of a coffee machine. The result demonstrated that the tool can correctly segment the 3D model of a physical prototype into its relevant parts and generate their corresponding UV maps.

Keywords: Augmented reality | Computational design methods | Industrial design | Mixed prototyping | New product development

[24] Shi Y., Azzolin N., Picardi A., Zhu T., Bordegoni M., Caruso G., A virtual reality-based platform to validate HMI design for increasing user's trust in autonomous vehicle, Computer-Aided Design and Applications, 18(3), 502-518, (2021).

Abstract: This research aims at providing an example of how Virtual Reality technology can support the design and development process of the Human Machine Interaction system in the field of Autonomous Vehicles. Autonomous vehicles will play an important role in future daily life, as is widely recognized. However, the relationship between the human user and the vehicle changes in the autonomous driving scenario; therefore, a new interactive modality should be established to guarantee the operational precision and the comfort of the user. Since the sector is still under development, there are no mature guidelines for interaction design in autonomous vehicles. In the early phase of autonomous vehicle popularization, the first challenge is to build the trust of the user towards the autonomous vehicle. Keeping high transparency of the autonomous vehicle's behavior to the user will be very helpful; however, it is not possible to communicate the information that the sensors of the autonomous vehicle are collecting because it can create safety risks. In this research, two hierarchical Human Machine Interaction information systems have been introduced and a virtual reality scenario has been developed, based on the most popular application scenario: the autonomous taxi. Possible verification methods to apply the tool are also discussed, considering the current design and development procedure in industry, in order to give constructive help to researchers and practitioners in the field.

Keywords: Autonomous Vehicle | Fully autonomous vehicle | HMI design | Human Machine Interaction | Virtual Reality | VR

[25] Cascini G., O'Hare J., Dekoninck E., Becattini N., Boujut J.F., Ben Guefrache F., Carli I., Caruso G., Giunta L., Morosi F., Exploring the use of AR technology for co-creative product and packaging design, Computers in Industry, 123, (2020).

Abstract: Extended Reality technologies, including Virtual Reality (VR) and Augmented Reality (AR), are being applied in a wide variety of industrial applications, but their use within design practice remains very limited, despite some promising research activities in this area over the last 20 years. At the same time, design practice has been evolving to place greater emphasis on the role of the client or end-user in the design process through ‘co-creative design’ activities. Whilst offering many benefits, co-creative design activities also present challenges, notably in the communication between designers and non-designers, which can hinder innovation. In this paper, we investigate the potential of a novel, projection-based AR system for the creation of design representations to support co-creative design sessions. The technology is tested through benchmarking experiments and in-situ trials conducted with two industrial partners. Performance metrics and qualitative feedback are used to evaluate the effectiveness of the new technology in supporting co-creative design sessions. Overall, AR technology allows quick, real-time modifications to the surfaces of a physical prototype to try out new ideas. Consequently, designers perceive the possibility to enhance the collaboration with the end-users participating in the session. Moreover, the quality and novelty of ideas generated whilst using projection-based AR outperform conventional sessions or handheld display AR sessions. Whilst the results of these early trials are not conclusive, the results suggest that projection-based AR design representations provide a promising approach to supporting co-creative design sessions.

Keywords: Co-creation | Co-design | Design representation | Prototype | Spatial augmented reality

[26] Micoli L.L., Caruso G., Guidi G., Design of digital interaction for complex museum collections, Multimodal Technologies and Interaction, 4(2), 1-20, (2020).

Abstract: Interactive multimedia applications in museums generally aim at integrating into the exhibition complementary information delivered through engaging narratives. This article discusses a possible approach for effectively designing an interactive app for museum collections whose physical pieces are mutually related by multiple and articulated logical interconnections referring to elements of immaterial cultural heritage that would not be easy to bring to the public with traditional means. As proof of this concept, a specific case related to ancient Egyptian civilization has been developed. A collection of Egyptian artifacts such as mummies, coffins, and amulets, associated with symbols, divinities, and magic spells through the structured funerary ritual typical of that civilization, has been explained through a virtual application based on the concepts discussed in the methodological section.

Keywords: Ancient Egypt | Cultural heritage exhibition | Digital heritage | Intangible heritage | Interactive devices | Multimedia | Virtual museum

[27] Bruno F., Ceriani A., Zhan Z., Caruso G., Del Mastro A., Virtual reality to simulate an inflatable modular hydroponics greenhouse on Mars, Proceedings of the ASME Design Engineering Technical Conference, 9, (2020).

Abstract: A human mission to Mars has long been advocated. As scientific research brings mankind closer each year to establishing human habitats on Mars, the question of how astronauts can sustain themselves whilst away from the blue planet becomes crucial. The project presented in this paper aims at designing and developing the Virtual Reality (VR) simulation of an inflatable modular greenhouse featuring a system that manages the growth of the plants and helps the astronauts control and monitor the whole greenhouse more extensively. The use of VR technology allows simulating an immersive environment of a Mars habitat highlighting its greenhouse and overcoming the limitations of physical locations. Wearing the Oculus Rift head-mounted display (HMD) while holding Oculus Rift Touch Controllers, astronauts or Mars exploration enthusiasts could experience the highly interactive and realistic environment. Its goal is to provide training and evaluative simulations of astronauts' basic tasks and performances in the greenhouse on Mars while testing the growing method of hydroponics equipped with a smart growing controlling and monitoring system.

Keywords: Control | Greenhouse | Hydroponics | Interactive environment | Mars exploration | Mars habitat | Monitor system | Oculus Rift | Virtual Reality

[28] Aruanno B., Caruso G., Rossini M., Molteni F., Espinoza M.C.E., Covarrubias M., Virtual and augmented reality platform for cognitive tele-rehabilitation based system, Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 12376 LNCS, 130-137, (2020).

Abstract: Virtual and Augmented Reality systems have been increasingly studied, becoming an important complement to traditional therapy as they can provide high-intensity, repetitive and interactive treatments. Several systems have been developed in research projects and some of these have become products mainly intended for use at hospitals and care centers. After the initial cognitive rehabilitation performed at rehabilitation centers, patients are obliged to keep going to the centers, with many consequences, such as costs, loss of time, discomfort and demotivation. However, it has been demonstrated that patients recovering at home heal faster because they are surrounded by the love of their relatives and supported by the community.

Keywords: Cognitive rehabilitation | Gaming | LeapMotion | Oculus rift | VR/AR

[29] Morosi F., Caruso G., High-fidelity rendering of physical colour references for projected-based spatial augmented reality design applications, Computer-Aided Design and Applications, 18(2), 343-356, (2020).

Abstract: Spatial Augmented Reality allows users to visualise information on physical objects by projecting digital contents onto them. Product design applications could profitably exploit this feature to create prototypes partially real and partially virtual (mixed prototypes) to be used for the evaluation of products during the development processes. A mixed prototype needs a high visual quality, because design decisions are taken on the basis of its appearance, and projected colours should match the colour standards (e.g. Pantone, RAL, etc.) so that the visualised colours can be relied upon. The current paper analyzes the effect of a colour calibration method, based on the iteration of comparison and compensation phases, on the projected images using objective measurements and subjective users' evaluations. The procedure, whose effectiveness is verified thanks to the presented results, makes it possible to replicate any colour available inside the projector gamut by simply using a physical sample.

Keywords: Colour calibration | Colour fidelity | Product design | Spatial augmented reality
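
The calibration in [29] iterates comparison and compensation phases until the projected colour matches a physical reference. The sketch below reproduces only the structure of such a loop with a simulated linear projector response; the gain values and the damped correction step are hypothetical, not the authors' procedure, which relies on measured colours.

# Illustrative sketch only -- not the authors' procedure.
import numpy as np

PROJECTOR_GAIN = np.array([0.9, 0.8, 1.1])  # hypothetical per-channel response of projector + surface

def projected_colour(rgb_input):
    """Simulated measurement of the colour actually seen on the prototype surface."""
    return np.clip(rgb_input * PROJECTOR_GAIN, 0.0, 1.0)

target = np.array([0.55, 0.30, 0.20])  # colour of the physical reference sample (normalised RGB)
command = target.copy()                # initial guess sent to the projector

for _ in range(20):                    # compare-and-compensate iterations
    error = target - projected_colour(command)
    if np.max(np.abs(error)) < 1e-3:
        break
    command = np.clip(command + 0.5 * error, 0.0, 1.0)  # damped correction step

print(command, projected_colour(command))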

[30] Shi Y., Bordegoni M., Caruso G., User studies by driving simulators in the era of automated vehicle, Computer-Aided Design and Applications, 18(1), 211-226, (2020).

Abstract: In order to increase safety on the road and improve the user experience in the vehicle, user studies have been conducted by researchers and practitioners in the automobile industry over the decades. Also, over time, the technology and design inside the car have changed and are leading to a faster, safer, and more comfortable user experience in driving, thanks to the results gained from user studies. On the other hand, the rapidly advancing automated driving technology poses new challenges to user studies in the validation of new technologies from the user's perspective, improving the acceptance, employing the right usage, and so on. Laboratory driving simulation becomes one of the main methods for user studies because of its safety, ease of control, and precision in the scene restoration. In this paper, a typical fixed-base driving simulator is introduced together with a user interaction model in order to help researchers define the user study scope at each vehicle automation level and even predict the potential user study issues in future autonomous vehicle technology and scenarios. The strategy in the current study is to treat the different levels of automation in vehicles differently. Three case studies are provided accordingly, from low-automated to semi-autonomous driving and eventually fully autonomous driving. Each one addresses some of the critical points that should be paid attention to in the user studies of the corresponding automation level, applying the previous model. In the low automation condition, the case study showed the effectiveness of the proposed method in the verification of olfactory modality interaction in the driver's attention maintenance. The case study in the semi-automation condition demonstrated the capacity of the current method to capture the user's behavior changes in the take-over task, which is the most critical scenario in conditional autonomous driving. The last case study showed the possibility to conduct comfort-related user studies in the full automation condition using the method, by monitoring the cognitive workload of users under different autonomous driving styles.

Keywords: Driving simulation | Driving simulator | User behavior studies | User experience | User studies

[31] Morosi F., Rossoni M., Caruso G., Coordinated control paradigm for hydraulic excavator with haptic device, Automation in Construction, 105, (2019).

Abstract: The usability of heavy construction equipment is strongly affected by the design of its human-machine interfaces. Lack of confidence with the current input devices is due to their counterintuitive design and the absence of loop feedback between the end effector and human hands. In the last few years, many researchers have demonstrated that haptic devices, joined with a suitable design of the control levers, could help to face this problem. In this paper, an innovative control logic for hydraulic excavators is proposed, based on the inverse kinematics of the excavator arm. The aim of this control is to reduce the cognitive effort of the users compared with that required by the current control systems. The implementation of this control logic has been based on previous research projects, technical documentation and interviews with experts. The proposed control logic has been evaluated by means of experimental activities with a virtual simulator testing the usability and efficiency of the proposed solution.

Keywords: Coordinated control | Excavator | Haptic device | Human-machine interface | Usability evaluation | Virtual reality
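
The coordinated control in [31] maps the desired end-effector (bucket) position to joint angles through the inverse kinematics of the excavator arm. The sketch below is a textbook planar two-link solution, not the authors' controller; the boom and arm lengths are purely illustrative.

# Illustrative sketch only -- a textbook two-link planar IK, not the authors' controller.
import math

BOOM = 5.7  # boom length, metres (hypothetical)
ARM = 2.9   # arm (stick) length, metres (hypothetical)

def inverse_kinematics(x, y):
    """Return boom and arm joint angles (radians) that place the arm tip at (x, y)."""
    d2 = x * x + y * y
    cos_arm = (d2 - BOOM**2 - ARM**2) / (2.0 * BOOM * ARM)
    if abs(cos_arm) > 1.0:
        raise ValueError("target out of reach")
    arm_angle = -math.acos(cos_arm)          # elbow-down configuration
    k1 = BOOM + ARM * math.cos(arm_angle)
    k2 = ARM * math.sin(arm_angle)
    boom_angle = math.atan2(y, x) - math.atan2(k2, k1)
    return boom_angle, arm_angle

print(inverse_kinematics(6.0, 2.0))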

[32] Micoli L.L., Gonizzi Barsanti S., Caruso G., Guidi G., Digital Contents for Enhancing the Communication of Museum Exhibition: The PERVIVAL Project, ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, 42(2/W9), 487-493, (2019).

Abstract: The PERVIVAL project aims at developing an interactive system with the preliminary function of explaining a complex museum collection in a simple and immediate way, allowing the visitor to better understand the museum collection he is about to see. In particular, the interactive system aims at enhancing the understanding of the collections of funeral furnishings of the Egyptians, which are characterized by a multiplicity of objects of rich symbolism and connected to each other through complex funeral rituals. The idea is to explain the religious creed of the ancient Egyptians through the objects placed in the tomb, having in this way a double benefit: enlightening the rituals and placing the objects back in their primary function. In this way, the knowledge of the visitor is not only enlarged through the description of something that is described on papyruses or inscriptions (hence, not comprehensible), but the proper function of every single object will also be explained through the connection among them, as a function of amulets or goods necessary to travel through the World of the Dead. The connection between the different objects allows a much greater understanding of the exposed collection, which would be perceived in this way not as a set of single isolated pieces, but as a harmonious set of complementary elements that together represent a specific historical-cultural context.

Keywords: Cultural Heritage | Dissemination | Egyptians | Exhibition | Museum | Storytelling | Virtual Reality

[33] Shi Y., Maskani J., Caruso G., Bordegoni M., Explore user behaviour in semi-autonomous driving, Proceedings of the International Conference on Engineering Design, ICED, 2019-August, 3871-3880, (2019).

Abstract: The control shifting between a human driver and a semi-autonomous vehicle is one of the most critical scenarios in the road-map of autonomous vehicle development. This paper proposes a methodology to study the driver's behaviour in semi-autonomous driving with physiological-sensors-integrated driving simulators. A virtual scenario simulating take-over tasks has been implemented. The behavioural profile of the driver has been defined by analysing key metrics collected by the simulator, namely lateral position, steering wheel angle, throttle time, brake time, speed, and the take-over time. In addition, heart rate and skin conductance changes have been considered as physiological indicators to assess cognitive workload and reactivity. The methodology has been applied in an experimental study whose results are crucial for gaining insights into users' behaviour. Results show that individual driving styles and performance can be distinguished by calculating and elaborating the data collected by the system. This research provides potential directions for establishing a method to characterize a driver's behaviour in a semi-autonomous vehicle.

Keywords: Evaluation | Semi-autonomous vehicle | Simulation | User behaviour | Virtual reality

[34] Shi Y., Huang W., Cheli F., Bordegoni M., Caruso G., Do autonomous vehicle driving styles affect user state?: A preliminary investigation, Proceedings of the ASME Design Engineering Technical Conference, 1, (2019).

Abstract: A large number of achievements in the autonomous vehicle industry have been obtained during the past decades. Various systems have been developed to make automated driving possible. Due to the algorithm used in the autonomous vehicle system, the performance of the vehicle differs from one to another. However, very few studies have given insight into the influence caused by implementing different algorithms from a human factors point of view. Two systems based on two algorithms with different characteristics are utilized to generate the two driving styles of the autonomous vehicle, which are implemented into a driving simulator in order to create the autonomous driving experience. Users' skin conductance (SC) data, which enable the evaluation of users' cognitive workload and mental stress, were recorded and analyzed. Subjective measures were applied by filling out the Swedish Occupational Fatigue Inventory (SOFI-20) to get a self-reported view of users' behavior changes along with the experiments. The results showed that humans' states were affected by the driving styles of different autonomous systems, especially in the period of speed variation. By analyzing users' self-assessment data, a correlation was observed between user "Sleepiness" and the driving style of the autonomous vehicle. These results would be meaningful for the future development of autonomous vehicle systems, in terms of balancing the performance of the vehicle and the user's experience.

Keywords: Autonomous vehicle | Driving style | Human behavior

[35] Ariansyah D., Caruso G., Ruscio D., Bordegoni M., Analysis of autonomic indexes on drivers' workload to assess the effect of visual ADAS on user experience and driving performance in different driving conditions, Journal of Computing and Information Science in Engineering, 18(3), (2018).

Abstract: Advanced driver assistance systems (ADASs) allow information provision through visual, auditory, and haptic signals to achieve multidimensional goals of mobility. However, processing information from ADAS imposes a mental workload cost that drivers pay from their limited attentional resources. The change in driving condition can modulate drivers' workload and potentially impair drivers' interaction with ADAS. This paper shows how the measure of cardiac activity (heart rate and the indexes of the autonomic nervous system (ANS)) could discriminate the influence of different driving conditions on drivers' workload associated with the attentional resources engaged while driving with ADAS. Fourteen drivers performed a car-following task with visual ADAS in simulated driving. Drivers' workload was manipulated in two driving conditions: one in a monotonous condition (constant speed) and another in a more active condition (variable speed). Results showed that drivers' workload was similarly affected, but the amount of attentional resource allocation was slightly distinct between the two conditions. The analysis of the main effect of time demonstrated that drivers' workload increased over time without alterations in autonomic indexes regardless of driving condition. However, the main effect of driving condition produced a higher level of sympathetic activation in variable-speed driving compared to driving at constant speed. Variable-speed driving requires more adjustment of steering wheel movement (SWM) to maintain lane-keeping performance, which led to a higher level of task involvement and increased task engagement. The proposed measures appear promising to help design new adaptive working modalities for ADAS on account of variations in driving conditions.
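
Two of the cardiac measures mentioned in [35] can be illustrated with a short computation on RR intervals: mean heart rate and RMSSD, a common time-domain index of autonomic (vagal) activity. The series below is synthetic and the sketch is not the analysis pipeline used in the paper.

# Illustrative sketch only. Mean heart rate and RMSSD computed from a synthetic series
# of RR intervals (seconds between successive heartbeats).
import numpy as np

rr = np.array([0.82, 0.80, 0.78, 0.81, 0.79, 0.83, 0.80, 0.77])  # hypothetical RR intervals

mean_hr_bpm = 60.0 / rr.mean()
rmssd_ms = np.sqrt(np.mean(np.diff(rr * 1000.0) ** 2))  # root mean square of successive differences

print(f"mean HR = {mean_hr_bpm:.1f} bpm, RMSSD = {rmssd_ms:.1f} ms")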

[36] Ruscio D., Caruso G., Mussone L., Bordegoni M., Eco-driving for the first time: The implications of advanced assisting technologies in supporting pro-environmental changes, International Journal of Industrial Ergonomics, 64, 134-142, (2018). Abstract

Abstract: Eco-driving assistance devices are being introduced to reduce CO2 emissions, but the overall changes in user behavior have not been sufficiently explored. While advanced in-vehicle driver systems are designed to support a single driving task (e.g., reducing emissions), they also require the adoption of a different driving behavior and different driving attitudes in order to be effective. Adopting a new driving style for the first time could affect drivers' acceptance and undermine the new technologies' efficacy. The purpose of the present research is to measure and evaluate users' responses to the first-time use of eco-driving assisting technology. Drivers' performances in a virtual simulator were compared between an experimental and a control group. The actual driving parameters and CO2 emissions were recorded and compared with the optimal eco-driving style calculated by the CarMaker software. The cognitive costs of the new driving style were measured through changes in the modulation of the autonomic nervous system and the NASA-TLX workload scale. Acceptance of the assisted driving style and general eco-friendly attitudes were analyzed through self-reported measures. Results show that being exposed to eco-driving technology for the first time produces a reduction of cumulative fuel consumption due only to speed reduction, and not to the changes in driving style parameters recommended by the assisting software. Overall CO2 emissions of the eco-driving group did not differ from those of the control group. Rather, the first-time use of the eco-driving assistance increased perceived fatigue and altered the physiological cardiac autonomic balance, reflecting increased workload over time. These difficulties show that an eco-driving style cannot simply be adopted by following the assistance device's indications. Rather, it is a process that requires specific support during the first interaction with eco-driving technology. The design of assistance devices that aim to change the driving style could benefit from measuring the user's workload, to avoid primacy effects that could undermine the technology's efficacy in supporting sustainable user behaviors.

Keywords: Acceptance | Assistance technologies | CO2 emissions | Human-machine interaction | Pro-environmental behavior | Workload

[37] Morosi F., Carli I., Caruso G., Cascini G., Dhokia V., Ben Guefrache F., Analysis of co-design scenarios and activities for the development of a spatial-augmented reality design platform, Proceedings of International Design Conference, DESIGN, 1, 381-392, (2018). Abstract

Abstract: This paper discusses how Spatial Augmented Reality (SAR) can support design sessions in the fields of product, interface and packaging design. We analyse how the scope of a design session and the type of collaboration require different features of the SAR technology. We benchmark a SAR platform under development within the SPARK project (http://spark-project.net/) and state-of-the-art solutions against the proposed classification framework, to evaluate the current state of the platform and its limitations, and to outline SAR technology requirements for future developments.

Keywords: Collaborative design | Design creativity | Mixed prototype | Participatory design | Spatial augmented reality (SAR)

[38] Ruscio D., Bascetta L., Gabrielli A., Matteucci M., Ariansyah D., Bordegoni M., Caruso G., Mussone L., Collection and comparison of driver/passenger physiologic and behavioural data in simulation and on-road driving, 5th IEEE International Conference on Models and Technologies for Intelligent Transportation Systems, MT-ITS 2017 - Proceedings, 403-408, (2017). Abstract

Abstract: The i.Drive Lab has developed an interdisciplinary methodology for the analysis and modelling of behavioral and physiological responses related to the interaction between driver, vehicle, infrastructure, and virtual environment. The present research outlines the development of a validation study for the combination of virtual and real-life research methodologies. The i.Drive driving simulator was set up to replicate the acquisition of environmental and physiological information coming from an i.Drive electric vehicle equipped with the same sensors. The i.Drive tests focus on the identification of drivers' affective states, in order to define recurring situations and psychophysical conditions that are relevant for road safety and drivers' comfort. Results show that it is possible to combine different research paradigms to collect low-level vehicle control behavior and higher-level cognitive measures, in order to develop data collection and elaboration for future mobility challenges.

Keywords: driving assessment | on-road tests | physiological measures | test vehicle | virtual reality simulator

[39] Caruso G., Ruscio D., Ariansyah D., Bordegoni M., Driving simulator system to evaluate driver's workload using ADAS in different driving contexts, Proceedings of the ASME Design Engineering Technical Conference, 1, (2017). Abstract

Abstract: In-vehicle technology for driving safety has advanced considerably. Current Advanced Driver-Assistance Systems (ADAS) make roads safer by alerting the driver, through visual, auditory, and haptic signals, about dangerous driving situations and, consequently, preventing possible collisions. However, in some circumstances the driver can fail to respond properly to the alert, since human cognition can be influenced by the driving context. Driving simulation can help evaluate this aspect, since it allows different ADAS to be reproduced in safe driving conditions. However, driving simulation alone does not provide information about how changes in the driver's workload affect the interaction of the driver with ADAS. This paper presents a driving simulator system integrating physiological sensors that acquire cardiac activity, blood volume pulse, respiration rate, and skin conductance. Through specific processing of these measurements, it is possible to measure the different cognitive processes that contribute to changes in the driver's workload while using ADAS in different driving contexts. The preliminary studies conducted in this research show the effectiveness of the system and provide guidelines for the future acquisition and treatment of physiological data to assess ADAS workload.

[40] Carulli M., Vitali A., Caruso G., Bordegoni M., Rizzi C., Cugini U., ICT technology for innovating the garment design process in fashion industry, Smart Innovation, Systems and Technologies, 65, 525-535, (2017). Abstract

Abstract: The Italian fashion industry is currently undergoing a radical transformation; it needs to remain competitive and, at the same time, innovate in order to strengthen its position in the global market. An important opportunity for innovation is the introduction of ICT technologies in the garment design process, which today is based on traditional methods and tools. Moreover, this innovation could be particularly important for online sales, in order to reduce customers' doubts during purchasing. The research presented in this paper describes a framework for designing clothes as realistic 3D digital models and for allowing customers to evaluate the designed clothes by using realistic virtual mannequins of their own bodies instead of standard ones. A case study is presented in the paper. The obtained results show that the framework can innovate the traditional garment design process and could have a considerable impact on the fashion industry and on customer behaviour.

Keywords: Body scanning | Cloth simulation | Design process | Motion capture | Virtual prototype

[41] Bordegoni M., Covarrubias M., Caruso G., Cugini U., Freehand Gesture and Tactile Interaction for Shape Design, Journal of Computing and Information Science in Engineering, 16(4), (2016). Abstract

Abstract: This paper presents a novel system that allows product designers to design, experience, and modify new shapes of objects, starting from existing ones. The system allows designers to acquire and reconstruct the 3D model of a real object and to visualize and physically interact with this model. In addition, the system allows designers to modify the shape through physical manipulation of the 3D model and to eventually print it using 3D printing technology. The system is developed by integrating state-of-the-art technologies in the sectors of reverse engineering, virtual reality, and haptics. The 3D model of an object is reconstructed by scanning its shape with a 3D scanning device. The 3D model is then imported into the virtual reality environment, which renders it through an immersive head-mounted display (HMD). The user can physically interact with the 3D model by using the desktop haptic strip for shape design (DHSSD), a six-degree-of-freedom servo-actuated developable metallic strip that reproduces cross-sectional curves of 3D virtual objects. The DHSSD device is controlled by means of hand gestures recognized by a Leap Motion sensor.

[42] Mansutti A., Covarrubias Rodriguez M., Caruso G., Bordegoni M., Cugini U., Visuo-tactile system for 3D digital models rendering, Computer-Aided Design and Applications, 13(2), 236-245, (2016). Abstract

Abstract: The product design process is based on a sequence of phases in which the concept of the shape of a product is typically represented through a digital 3D model and often also by means of a corresponding physical prototype. The digital model allows designers to perform a visual evaluation of the shape, while the physical model is used to better evaluate the aesthetic characteristics of the product, i.e., its dimensions and proportions, by touching and interacting with it. Design and evaluation activities are typically cyclical, repeated as many times as needed to reach the optimal, desired shape. This reiteration leads to an increase in development time and, consequently, in the overall product development cost. The aim of this research work is to develop a novel system for the simultaneous visual and tactile rendering of product shapes, thus allowing designers both to touch and to see new product shapes already during the conceptual development phase. The proposed system for visual and tactile shape rendering consists of a Tactile Display able to represent the shape of a product in the real environment, where it can be explored naturally through free-hand interaction. The device is designed to be portable, low cost, modular, and high performing in terms of the types of shapes that can be represented. The developed Tactile Display can be effectively used when integrated with an Augmented Reality system, which renders the visual shape on top of the tactile haptic strip. This allows a simultaneous representation of the visual and tactile properties of a shape. By using the Tactile Display in the initial conceptual phases of product design, designers will be able to change the shape of a product according to the tactile evaluation, before the development of the physical prototype. This feature will lead to a decrease in the number of physical prototypes needed, thereby reducing both the cost and the overall time of the product development process.

Keywords: augmented reality | shape rendering | Tactile display | virtual prototyping

[43] Caruso G., Camere S., Bordegoni M., System based on abstract prototyping and motion capture to support car interior design, Computer-Aided Design and Applications, 13(2), 228-235, (2016). Abstract

Abstract: The paper describes a system to supply meaningful insights to designers during the concept generation of new car interiors. The aim of the system is to capture the movements of car passengers and to make these acquisitions directly available as a generative input. The system has been developed by integrating the Abstract Prototyping technique with Motion Capture technology. In addition, a systematic procedure allows the collected data to be processed into a graphical representation, which can be easily used with standard NURBS-based modeling software. The effectiveness of the system has been evaluated through a testing session conducted with subjects. The outcomes of the testing sessions have highlighted the benefits and limitations of the implemented system.

Keywords: Abstract prototyping | human factors | motion capture

[44] Barsanti S.G., Caruso G., Guidi G., Virtual navigation in the ancient Egyptian funerary rituals, Proceedings of the 2016 International Conference on Virtual Systems and Multimedia, VSMM 2016, (2016). Abstract

Abstract: This paper presents a VR application for explaining the meaning of the pictograms and hieroglyphs typical of ancient Egyptian funerary rituals. The interaction between the user and the 3D environment is obtained through an Oculus Rift head-mounted stereoscopic display, coupled with a Leap Motion controller as input device, which digitizes the hands of the end user in real time and displays a skeletal version of them in the virtual environment. The interactive application is based on Unity3D and explains the details of the rituals, starting from the case of an Egyptian sarcophagus and some typical funerary objects such as the Heart Scarab and the Ushabty.

Keywords: ancient Egypt | funerary rituals | heart scarab | Hieroglyph | Leap Motion | Oculus Rift | sarcophagus | Unity3D | ushabty | VR

[45] Rodriguez M.C., Rossini M., Caruso G., Samali G., Giovanzana C., Molteni F., Bordegoni M., Sound feedback assessment for upper limb rehabilitation using a multimodal guidance system, Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 9759, 529-536, (2016). Abstract

Abstract: This paper describes the implementation of a Multimodal Guidance System (MGS) for upper limb rehabilitation through vision, haptics, and sound. The system consists of a haptic device that physically renders virtual paths of 2D shapes through a point-based approach, while sound technology provides audio feedback about the patient's actions while performing a manual task, for example starting and/or finishing a sketch, or different sounds related to the hand's velocity while sketching. The goal of this sonification approach is to strengthen the patient's understanding of the virtual shape used in the rehabilitation process and to inform the patient about attributes that could otherwise remain unnoticed. Our results provide evidence that using sound as additional feedback increases accuracy in the task operations.

Keywords: Haptic guidance | Sound interaction | Upper-limb rehabilitation
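
The velocity-dependent audio feedback described above can be reduced, in its simplest form, to a mapping from hand velocity to tone pitch. The sketch below illustrates such a mapping; the velocity and frequency ranges and the linear law are hypothetical, not the ones implemented in the MGS.

# Minimal sketch of a velocity-to-pitch sonification mapping, in the spirit of
# the audio feedback described above. All numerical ranges are illustrative
# assumptions, not values from the paper.

def velocity_to_pitch(v_mm_s, v_min=0.0, v_max=400.0, f_min=220.0, f_max=880.0):
    """Map a hand velocity (mm/s) to a tone frequency (Hz): velocities are
    clamped to [v_min, v_max] and mapped linearly onto [f_min, f_max]."""
    v = max(v_min, min(v_max, v_mm_s))
    alpha = (v - v_min) / (v_max - v_min)
    return f_min + alpha * (f_max - f_min)

if __name__ == "__main__":
    for v in (0, 100, 250, 500):
        print(f"{v:>3} mm/s -> {velocity_to_pitch(v):6.1f} Hz")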

[46] Camere S., Caruso G., Bordegoni M., Di Bartolo C., Mauri D., Pisino E., Form follows data: A method to support concept generation coupling experience design with motion capture, Proceedings of the International Conference on Engineering Design, ICED, 5(DS 80-05), 135-144, (2015). Abstract

Abstract: Human movements express non-verbal communication: the way humans move, live, and act within a space influences and reflects the experience with a product. The study of postures and gestures can bring meaningful information to the design process. This paper explores the possibility of adopting Motion Capture technologies to inform the design process and stimulate concept generation from an Experience Design perspective. Motion data could enable designers to tackle an Experience-driven design process and come up with innovative designs. However, due to their computational nature, these data are largely inaccessible to designers. This study presents a method to process the raw data coming from the Motion Capture system, with the final goal of reaching a comprehensible visualization of human movements in a modelling environment. The method was implemented and applied to a case study focused on User Experience within the car space. Furthermore, the paper presents a discussion about the conceptualization of human movement as a way to inform and facilitate an Experience-driven design process, and includes some propositions of applicable design domains.

Keywords: Body tracking | Conceptual design | Data visualization | Motion capture | User experience

[47] Gonizzi Barsanti S., Caruso G., Micoli L.L., Covarrubias Rodriguez M., Guidi G., 3D visualization of cultural heritage artefacts with virtual reality devices, International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences - ISPRS Archives, 40(5W7), 165-172, (2015). Abstract

Abstract: Although 3D models are useful to preserve information about historical artefacts, the potential of these digital contents is not fully realized until they are used to interactively communicate their significance to non-specialists. Starting from this consideration, a new way to provide museum visitors with more information was investigated. The research is aimed at valorising, and making more accessible, the Egyptian funerary objects exhibited in the Sforza Castle in Milan. The results of the research will be used for the renewal of the current exhibition at the Archaeological Museum in Milan, making it more attractive. A 3D virtual interactive scenario regarding the "path of the dead", an important ritual in ancient Egypt, was realized to augment the experience and the comprehension of the public through interactivity. Four important artefacts were considered for this scope: two ushabty, a wooden sarcophagus, and a heart scarab. The scenario was realized by integrating low-cost Virtual Reality technologies, namely the Oculus Rift DK2 and the Leap Motion controller, and implementing specific software using Unity. The 3D models were enhanced by adding responsive points of interest in relation to important symbols or features of the artefact. This allows single parts of the artefact to be highlighted, in order to better identify the hieroglyphs and provide their translation. The paper describes the process for optimizing the 3D models, the implementation of the interactive scenario, and the results of some tests carried out in the lab.

Keywords: 3D modelling | Cultural Heritage | Leap Motion | Oculus Rift | Unity | Virtual Reality | Visualisation

[48] Bordegoni M., Camere S., Caruso G., Cugini U., Body tracking as a generative tool for experience design, Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 9185, 122-133, (2015). Abstract

Abstract: Beyond ergonomic measurements, the study of human movements can help designers explore the rich, non-verbal communication of users' perception of products. This paper explores the ability of human gestures to express subjective experiences and, therefore, to inform the design process at its early stages. We investigate the traditional techniques used in the Experience Design domain to observe human gestures, and propose a method to couple an Experience-driven design approach with the Motion Capture technique. This allows integrating qualitative user observations with quantitative, measurable data. However, the richness of information that Motion Capture can retrieve is usually inaccessible to designers. This paper presents a method to visualize human motion data so that designers can make sense of them and use them as the starting point for concept generation.

Keywords: Body Tracking | Concept design | Data visualization | Motion Capture | User experience

[49] Caruso G., Carulli M., Bordegoni M., Augmented Reality System for the Visualization and Interaction with 3D Digital Models in a Wide Environment, Computer-Aided Design and Applications, 12(1), 86-95, (2015). Abstract

Abstract: This paper proposes a new interactive Augmented Reality (AR) system, conceived to allow a user to freely interact with virtual objects integrated in a real environment without the need to wear cumbersome equipment. The AR system has been developed by integrating the Fog Screen display technology, stereoscopic visualization, and the Microsoft Kinect. The user can select and manage the position of the virtual objects visualized on the Fog Screen display directly with his/her hands. A specific software application has been developed to perform evaluation testing sessions with users. The aim of the testing sessions was to verify the influence of issues related to tracking, visualization, and interaction modalities on the overall usability of the AR system. The collected experimental results demonstrate the ease of use and the effectiveness of the new interactive AR system and highlight the features preferred by the users.

Keywords: augmented reality | design review | gesture-based interface

[50] Akyeampong J., Udoka S., Caruso G., Bordegoni M., Evaluation of hydraulic excavator Human-Machine Interface concepts using NASA TLX, International Journal of Industrial Ergonomics, 44(3), 374-382, (2014). Abstract

Abstract: This study evaluated newly proposed Human-Machine Interface (HMI) design concepts for improving the ergonomics of hydraulic excavators. The design concepts were based on an augmented interaction technique involving the use of a heads-up display (HUD) and coordinated control as HMI elements. Two alternative HMI designs were elaborated in order to separately evaluate the ergonomic impacts of the heads-up display and of the coordinated control, by comparing them to the standard HMI design. The effectiveness of these three HMI designs in reducing the operators' mental and physical workload was assessed by conducting experiments with human subjects aged 23-35 years. The National Aeronautics and Space Administration's Task Load Index (NASA TLX) method was used for collecting subjective workload scores based on a weighted average of ratings of six factors: Mental Demand, Physical Demand, Temporal Demand, Own Performance, Effort, and Frustration Level. The results showed that the type of HMI design affects different aspects of the operator's workload. Indeed, they showed that the proposed augmented interaction is an effective solution for reducing the ergonomic gaps, in terms of mental workload and, to a lesser extent, physical workload, imposed by the standard HMI design. Relevance to industry: This study proposes innovative HMI solutions featuring a heads-up display and coordinated control to improve the ergonomics of the hydraulic excavator HMI, particularly in reducing the operators' mental and physical workload. The results of this study promise to be an innovative approach for developing new HMI designs by hydraulic excavator manufacturers.

Keywords: Coordinated control | Ergonomics | Heads-up display | Human-Machine Interface | Hydraulic excavator | NASA TLX
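
For reference, the weighted NASA TLX score mentioned above combines the six subscale ratings with weights obtained from 15 pairwise comparisons between the factors. The sketch below shows that standard computation; the example ratings and weights are made-up values.

# Sketch of the weighted NASA TLX score: each of the six subscales is rated
# (here on a 0-100 scale) and weighted by the number of times it was chosen
# in the 15 pairwise comparisons. Example values are invented.

SCALES = ("Mental Demand", "Physical Demand", "Temporal Demand",
          "Own Performance", "Effort", "Frustration Level")

def weighted_tlx(ratings, weights):
    """Overall workload = sum(rating_i * weight_i) / 15, where the weights
    are tally counts from the 15 pairwise comparisons."""
    assert set(ratings) == set(SCALES) == set(weights)
    assert sum(weights.values()) == 15, "weights must come from 15 comparisons"
    return sum(ratings[s] * weights[s] for s in SCALES) / 15.0

if __name__ == "__main__":
    ratings = {"Mental Demand": 70, "Physical Demand": 40, "Temporal Demand": 55,
               "Own Performance": 30, "Effort": 60, "Frustration Level": 45}
    weights = {"Mental Demand": 5, "Physical Demand": 1, "Temporal Demand": 3,
               "Own Performance": 2, "Effort": 3, "Frustration Level": 1}
    print(f"Weighted TLX: {weighted_tlx(ratings, weights):.1f}")   # 56.0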

[51] Caruso G., Re G.M., Carulli M., Bordegoni M., Novel Augmented Reality system for Contract Design Sector, Computer-Aided Design and Applications, 11(4), 389-398, (2014). Abstract

Abstract: This paper presents a novel transportable and cost-effective Augmented Reality system developed to support interior design activities in the contract design sector. The main functioning principles and technical information about its implementation are provided to show how this system allows overcoming some of the issues related to the use of Augmented Reality in interior design activities. The effectiveness of this system is verified through two different testing sessions based on a case study, which relates to the contract design sector. The testing sessions involved many interior designers, with the intent of demonstrating the possibility of integrating this Augmented Reality system in the "everyday" interior design practice in the specific context of contract design.

Keywords: augmented reality | interior design | virtual prototyping

[52] Gatti E., Caruso G., Bordegoni M., Spence C., Can the feel of the haptic interaction modify a user's emotional state, 2013 World Haptics Conference, WHC 2013, 247-252, (2013). Abstract

Abstract: Haptic perception constitutes an important component of our everyday interaction with many products. At the same time, several studies have, in recent years, demonstrated the importance of involving the emotions in the user-product interaction process. The present study was designed to investigate whether haptic interactions can affect, or modulate, people's responses to standardized emotional stimuli. Thirty-six participants completed a self-assessment test concerning their emotional state using as a pointer either a PHANToM device simulating a viscous force field while they moved the stylus, or a stylus with no force field. During the presentation of the emotional pictures, various physiological parameters were recorded from the participants. The results revealed a significant difference in the self-reported arousal associated with the pictures but no significant difference in the physiological measures. The behavioural findings are interpreted in terms of an effect of the haptic feedback on participants' perceived/interpreted emotional arousal. These results suggest that haptic feedback could, in the future, be used to modify participants' interpretation of their physiological states.

Keywords: Affective Haptics | Emotions | Haptic Interactions & Design | Physiology

[53] Ferrise F., Caruso G., Bordegoni M., Multimodal training and tele-assistance systems for the maintenance of industrial products: This paper presents a multimodal and remote training system for improvement of maintenance quality in the case study of washing machine, Virtual and Physical Prototyping, 8(2), 113-126, (2013). Abstract

Abstract: The paper describes an application based on Virtual and Augmented Reality technologies specifically developed to support maintenance operations of industrial products. Two scenarios have been proposed. In the first, an operator learns how to perform a maintenance operation in a multimodal Virtual Reality environment that mixes a traditional instruction manual with the simulation, based on visual and haptic technologies, of the maintenance training task. In the second scenario, a skilled user operating in a multimodal Virtual Reality environment can remotely train another operator, who sees the instructions about how the operations should be correctly performed superimposed onto the real product. The paper presents the development of the application as well as its testing with users. Furthermore, the limits and potentialities of the use of Virtual and Augmented Reality technologies for training operators in product maintenance are discussed.

Keywords: assembly | haptics | virtual prototyping

[54] Caruso G., Polistina S., Bordegoni M., Simple measurement and annotation technique of real objects in augmented reality environments, Proceedings of the ASME Design Engineering Technical Conference, 2 B, (2013). Abstract

Abstract: The paper describes a technique that allows measuring and annotating real objects in an Augmented Reality (AR) environment. The technique is based on marker tracking and aims at enabling the user to define the three-dimensional position of points within the AR scene by selecting them directly on the video stream. The technique consists in projecting the points, selected directly on the monitor, onto a virtual plane defined according to the two-dimensional marker used for the tracking. This plane can be seen as a virtual depth cue that helps the user place the points in the desired position. The user can also move this virtual plane to place points within the whole 3D scene. Using this technique, the user can place virtual points around a real object in order to take measurements of the object, by calculating the minimum distance between the points, or to put annotations on the object. To date, these kinds of activities require more complex systems or a priori knowledge of the shape of the real object. The paper describes the functioning principles of the proposed technique and discusses the results of a testing session carried out with users to evaluate the overall precision and accuracy.
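
The core of the technique is the projection of a pixel selected on the video stream onto the virtual plane attached to the marker, after which distances between placed points can be computed. The sketch below illustrates that geometry with an assumed pinhole camera model; the intrinsic parameters and plane pose are hypothetical and no specific AR library is used.

import numpy as np

def pixel_to_plane_point(u, v, K, plane_point, plane_normal, cam_origin=np.zeros(3)):
    """Back-project pixel (u, v) through a pinhole camera with intrinsics K and
    intersect the viewing ray with a plane given by a point and a normal, both
    expressed in the camera frame (e.g. derived from the marker pose).
    Returns the 3D intersection point. Purely illustrative geometry."""
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])          # viewing ray direction
    denom = plane_normal @ ray
    if abs(denom) < 1e-9:
        raise ValueError("viewing ray is parallel to the virtual plane")
    t = plane_normal @ (plane_point - cam_origin) / denom
    return cam_origin + t * ray

if __name__ == "__main__":
    K = np.array([[800.0, 0.0, 320.0],                      # assumed intrinsics
                  [0.0, 800.0, 240.0],
                  [0.0, 0.0, 1.0]])
    plane_point = np.array([0.0, 0.0, 0.5])                 # plane 0.5 m from camera
    plane_normal = np.array([0.0, 0.0, 1.0])
    p1 = pixel_to_plane_point(300, 240, K, plane_point, plane_normal)
    p2 = pixel_to_plane_point(420, 240, K, plane_point, plane_normal)
    print("measured distance [m]:", np.linalg.norm(p1 - p2))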

[55] Re G.M., Caruso G., Bordegoni M., Augmented reality interactive system to support space planning activities, Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 8022 LNCS(PART 2), 291-300, (2013). Abstract

Abstract: Space Planning (SP) is a process that makes an environment more ergonomic, functional, and aesthetically pleasing. The introduction of computer-aided tools for this kind of practice has led to an increase in the quality of the final result, thanks to versatile supports used for generating the different options to consider during evaluation. In particular, tools based on Augmented Reality (AR) technologies allow several options to be evaluated directly in a real room. In this paper, an AR system developed with the aim of supporting Space Planning activities is proposed. The system has been developed in order to overcome some problems related to tracking in wide environments and to be usable in different typologies of Space Planning environments. The paper also presents a qualitative evaluation of the AR system in three different scenarios. The positive results obtained through these evaluation tests show the effectiveness and suitability of the system in different Space Planning contexts.

Keywords: Augmented Reality | HCI | Space Planning design

[56] Re G.M., Caruso G., Belluco P., Bordegoni M., Hybrid technique to support the tracking in unstructured augmented reality environments, Proceedings of the ASME Design Engineering Technical Conference, 2(PARTS A AND B), 1361-1370, (2012). Abstract

Abstract: In this paper, we present a new Augmented Reality (AR) tracking technique that integrates marker-based tracking with the tracking ability of a commercial mobile robot. The role of the mobile robot is to work alongside the user in order to extend the working space of the marker-based tracking technique. The robot follows the user's movements during the exploration of the AR environment and updates the position of a fiducial marker fixed on it. The robot is automatically controlled through the device used to visualize the AR scene. The paper discusses the issues related to the integration of these two tracking techniques and proposes an AR application developed to demonstrate the feasibility of our approach. Technical issues and performances have been assessed through several testing sessions.
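
The essence of the hybrid approach is chaining the robot's estimated pose with the known mounting of the fiducial marker and the marker pose returned by the AR tracker. The sketch below shows that composition of homogeneous transforms; the poses and the planar simplification are illustrative assumptions, not data from the paper.

import numpy as np

def pose2d(x, y, theta):
    """Homogeneous 4x4 transform for a planar pose (translation x, y and
    rotation theta about the vertical axis)."""
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]
    T[:2, 3] = [x, y]
    return T

# Camera pose in the world obtained by chaining:
#   world <- robot   (robot odometry)
#   robot <- marker  (fixed mounting of the fiducial on the robot)
#   marker <- camera (inverse of the marker pose estimated by the AR tracker)
T_world_robot = pose2d(2.0, 1.0, np.deg2rad(30))
T_robot_marker = pose2d(0.1, 0.0, 0.0)
T_camera_marker = pose2d(0.0, 0.3, np.deg2rad(-10))   # marker pose from the AR tracker
T_world_camera = T_world_robot @ T_robot_marker @ np.linalg.inv(T_camera_marker)
print("camera position in world frame:", np.round(T_world_camera[:3, 3], 3))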

[57] Bordegoni M., Caruso G., Mixed reality distributed platform for collaborative design review of automotive interiors: This paper presents how mixed reality technologies allow a closer collaboration among designers, final users and engineers and hence reduce the time for reviewing and validating car interior designs, Virtual and Physical Prototyping, 7(4), 243-259, (2012). Abstract

Abstract: The design of a new product requires a series of validations before its approval and manufacture. Virtual prototyping based on mixed reality technology seems a promising technique, especially when applied to the design review of products that are characterised by interaction with users. This paper presents a new methodology that allows the collaborative design review and modification of some components of automotive interiors. Professionals can work together through a mixed reality distributed design platform by interacting intuitively and naturally with the virtual prototype of the product. The methodology has been validated by means of tests with users, aiming at assessing the effectiveness of the approach, and at identifying potential usability issues.

Keywords: automotive design | collaborative design | design methodology | mixed reality | virtual reality

[58] Caruso G., Polistina S., Bordegoni M., Collaborative mixed-reality environment to support the industrial product development, ASME 2011 World Conference on Innovative Virtual Reality, WINVR 2011, 207-215, (2011). Abstract

Abstract: The paper describes a collaborative Mixed-Reality (MR) environment to support product design assessment. In particular, we have developed a collaborative platform that enables us to improve the design and the evaluation of car interiors. The platform consists of two different systems: the 3D Haptic Modeler (3DHM) and the Mixed Reality Seating Buck (MRSB). The 3DHM is a workbench that allows us to modify the 3D model of a car dashboard by using a haptic device, while the MRSB is a configurable structure that enables us to simulate different driving seats. The two systems allow the collaboration among designers, engineers, and end users in order to obtain, as the final result, a concept design of the product that satisfies both design constraints and end users' preferences. The usability of our collaborative MR environment has been evaluated by means of testing sessions, based on two different case studies, with the involvement of users.

[59] Caruso G., Bordegoni M., A novel 3D interaction technique based on the eye tracking for mixed reality environments, Proceedings of the ASME Design Engineering Technical Conference, 2(PARTS A AND B), 1555-1563, (2011). Abstract

Abstract: The paper describes a novel 3D interaction technique based on Eye Tracking (ET) for Mixed Reality (MR) environments. We have developed a system that integrates a commercial ET technology with an MR display technology. The system elaborates the data coming from the ET in order to obtain the 3D position of the user's gaze. A specific calibration procedure has been developed for correctly computing the gaze position according to the user. The accuracy and the precision of the system have been assessed by performing several tests with a group of users. In addition, we have compared the 3D gaze position in real, virtual and mixed environments in order to check whether there are differences when the user sees real or virtual objects. The paper also proposes an application example by means of which we have assessed the global usability of the system.
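
A common way to obtain a 3D gaze position from binocular eye-tracking data is to take the point of closest approach of the two gaze rays. The sketch below shows that triangulation; it is a generic formulation with hypothetical eye positions, not necessarily the procedure implemented in the paper.

import numpy as np

def closest_point_of_gaze(o_l, d_l, o_r, d_r):
    """Return the midpoint of the shortest segment between the left and right
    gaze rays (origin o, direction d). A common way to turn binocular gaze
    directions into a single 3D gaze point; assumed here, not taken from
    the paper."""
    d_l, d_r = d_l / np.linalg.norm(d_l), d_r / np.linalg.norm(d_r)
    w0 = o_l - o_r
    a, b, c = d_l @ d_l, d_l @ d_r, d_r @ d_r
    d, e = d_l @ w0, d_r @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-9:                       # parallel rays: no vergence
        return None
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    return 0.5 * ((o_l + s * d_l) + (o_r + t * d_r))

if __name__ == "__main__":
    target = np.array([0.1, 0.0, 0.6])          # a point 60 cm in front of the head
    o_l, o_r = np.array([-0.03, 0.0, 0.0]), np.array([0.03, 0.0, 0.0])
    gaze = closest_point_of_gaze(o_l, target - o_l, o_r, target - o_r)
    print("estimated 3D gaze point:", np.round(gaze, 3))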

[60] Caruso G., Gatti E., Bordegoni M., Study on the usability of a haptic menu for 3d interaction, Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 6947 LNCS(PART 2), 186-193, (2011). Abstract

Abstract: The choice of the interaction menu is an important aspect for the usability of an application. In recent years, different solutions related to menu shape, location, and interaction modalities have been proposed. This paper investigates the influence of haptic features on the usability of 3D menus. We have developed a haptic menu for a specific workbench, which integrates stereoscopic visualization and haptic interaction. Several versions of this menu have been developed with the aim of performing testing sessions with users. The results of these tests are discussed to highlight the impact that these features have on the user's learning capabilities.

Keywords: Haptic Interaction | Haptic Menu | Mixed Reality

[61] Caruso G., Polistina S., Bordegoni M., Aliverti M., Collaborative mixed-reality platform for the design assessment of cars interior, Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 6774 LNCS(PART 2), 299-308, (2011). Abstract

Abstract: The paper describes a collaborative platform to support the development and evaluation of car interiors using a Mixed Prototyping (MP) approach. The platform consists of two different systems: the 3D Haptic Modeler (3DHM) and the Mixed Reality Seating Buck (MRSB). The 3DHM is a workbench that allows us to modify the 3D model of a car dashboard by using a haptic device, while the MRSB is a configurable structure that enables us to simulate different driving seats. The two systems allow the collaboration among designers, engineers, and end users in order to obtain, as the final result, a concept design of the product that satisfies both design constraints and final users' preferences. The platform has been evaluated by means of several testing sessions, based on two different scenarios, so as to demonstrate the benefits and the potential of our approach.

Keywords: Collaborative design | Ergonomic assessment | Haptic modeling | Mixed Reality | Virtual Prototype

[62] Caruso G., Belluco P., Robotic arm for car dashboard layout assessment in Mixed Reality environment, Proceedings - IEEE International Workshop on Robot and Human Interactive Communication, 62-68, (2010). Abstract

Abstract: Mixed Reality (MR) technologies allow us to create different environments by merging real and virtual objects, and can be successfully used for the assessment of industrial products during the product development phases. This paper describes the integration of an MR environment with an industrial robotic arm for the ergonomic assessment of the driving seat in the automotive field. The robotic arm has been used to configure the MR environment automatically, while guaranteeing the correct merging of real and virtual objects and the repeatability of the testing sessions. The user's presence near an industrial robot raises some issues concerning safety and human-robot interaction, which have been addressed in order to improve the execution of the ergonomic tests. Finally, the developed system has been validated through testing sessions with end users to verify the effectiveness of our solution.

[63] Re G.M., Caruso G., Belluco P., Bordegoni M., Monitor-based tracking system for wide Augmented Reality environments, Eurographics Italian Chapter Conference 2010, 153-158, (2010). Abstract

Abstract: In Augmented Reality (AR) applications, the technological solutions used for tracking objects in the real environment generally depend on the conditions of the environment and on the application purposes. Selecting the most suitable tracking technology means finding the best compromise among several issues, including performance, accuracy, and ease of use. This paper describes an AR tracking approach based on marker-based tracking and its extension to wide environments by displaying the marker on a monitor. This solution allows us to improve the visibility of the marker and the tracking, thanks to a dynamic control of the marker's dimensions. The technical features and performances of our approach have been assessed through several testing sessions focused on comparative analyses.
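
The dynamic control of the marker's dimensions can be thought of as keeping the marker's projected size in the camera image roughly constant by rescaling the marker shown on the monitor as the user moves. The sketch below illustrates that proportional law under a pinhole-camera assumption; the focal length, target size, and limits are made-up values, not parameters from the paper.

# Sketch of a distance-dependent marker-size control: scale the marker shown
# on the monitor so that its image stays ~target_px pixels wide
# (pinhole model: pixels = focal_px * side_mm / distance_mm).

def marker_side_mm(distance_mm, focal_px=800.0, target_px=120.0,
                   min_mm=40.0, max_mm=400.0):
    """Physical marker side (mm) to display for a given camera-to-monitor
    distance, clamped to the monitor's usable size. Illustrative values."""
    side = target_px * distance_mm / focal_px
    return max(min_mm, min(max_mm, side))

if __name__ == "__main__":
    for d in (500, 1000, 2000, 4000):
        print(f"distance {d:>4} mm -> marker side {marker_side_mm(d):6.1f} mm")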

[64] Caruso G., Tedioli L., Mixed reality seating buck system for ergonomic evaluation, Proceedings of the 8th International Symposium on Tools and Methods of Competitive Engineering, TMCE 2010, 1, 511-524, (2010). Abstract

Abstract: The paper describes the development of a seating buck system for the ergonomic evaluation of the driver's cab. The seating buck is a configurable structure that, thanks to mixed reality technologies, allows us to simulate different driving seats and to perform different evaluation tests. In particular, we are interested in evaluating the ergonomics of the car dashboard with its knobs, buttons, displays, and other control systems. For this reason, we studied and developed a seating buck system that addresses these issues. In particular, we investigated the possibility of changing, in real time, the position of some components of the car dashboard with the aim of analysing different layouts. The effectiveness of the system has subsequently been validated through test sessions with users.

Keywords: Ergonomic analysis | Haptic devices | Mixed reality | Rapid prototyping | Seating buck

[65] Caruso G., Re G.M., AR-Mote: A wireless device for augmented reality environment, 3DUI 2010 - IEEE Symposium on 3D User Interfaces 2010, Proceedings, 99-102, (2010). Abstract

Abstract: One of the open issues in Augmented Reality (AR) applications is certainly related to interaction techniques. In recent years, many different solutions have been proposed with the intent of providing user interfaces that allow users to interact with the AR environment in a natural and intuitive way. Most of them have addressed the issue of representing users' hands in the AR environment. We propose the use of a commercial, low-cost wireless device as an input device for AR. This paper describes the integration of this device into an AR application and some preliminary tests aimed at evaluating the tracking accuracy and precision. In addition, we demonstrate the usability of our system through a preliminary testing session with users.

Keywords: Multimedia Information Systems | Artificial, augmented, and virtual realities | User Interfaces | Ergonomics | Input devices and strategies | Computer-Aided Engineering | CAD

[66] Caruso G., Re G.M., Interactive augmented reality system for product design review, Proceedings of SPIE - The International Society for Optical Engineering, 7525, (2010). Abstract

Abstract: The product development process of industrial products includes a phase dedicated to design review, a crucial phase where various experts cooperate in selecting the optimal product shape. Although computer graphics allows us to create very realistic virtual representations of products, it is not uncommon for designers to decide to build physical mock-ups of their newly conceived products, because they need to physically interact with the prototype and to evaluate the product within a plurality of real contexts. This paper describes the hardware and software development of our Augmented Reality design review system, which allows us to overcome some issues related to 3D visualization and to the interaction with virtual objects. Our system is composed of a Video See-Through Head Mounted Display, which improves the 3D visualization by controlling the convergence of the video cameras automatically, and a wireless control system, which allows us to create some metaphors to interact with the virtual objects. During the development of the system, in order to define and tune the algorithms, we performed some testing sessions. We then performed further tests in order to verify the effectiveness of the system and to collect additional data and comments about usability and ergonomic aspects.

Keywords: 3D interaction | Augmented reality | Design review | HMD | Stereoscopic visualization
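
The automatic convergence control mentioned above can be reduced, in a symmetric toe-in model, to pointing each camera at the current fixation distance. The sketch below shows the corresponding angle computation; the baseline and distances are assumed values, not those of the described hardware.

import math

def convergence_angle_deg(baseline_m, fixation_distance_m):
    """Toe-in angle (degrees) of each camera of a stereo pair so that both
    optical axes intersect at the fixation distance, in a symmetric model:
    theta = atan((baseline / 2) / distance). Illustrative values only."""
    return math.degrees(math.atan((baseline_m / 2.0) / fixation_distance_m))

if __name__ == "__main__":
    baseline = 0.065   # assumed camera separation, close to a typical human IPD
    for d in (0.4, 0.8, 1.5, 3.0):
        print(f"fixation at {d:3.1f} m -> toe-in {convergence_angle_deg(baseline, d):5.2f} deg per camera")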

[67] Caruso G., Cugini U., Augmented reality video see-through HMD oriented to product design assessment, Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 5622 LNCS, 532-541, (2009). Abstract

Abstract: Current state of the art technology offers various solutions for developing virtual prototyping applications that also allow the interaction with the real environment. In particular, Augmented Reality (AR) technologies include tracking systems, stereoscopic visualization systems, photorealistic rendering tools, hi-resolution video overlay systems that allow us to create various types of applications where the virtual prototype is contextualized within the real world. One application domain is product design: AR technologies allow designers to perform some evaluation tests on the virtual prototype of industrial products without the necessity to produce a physical prototype. This paper describes the development of a new Video See-Through Head Mounted Display (VST-HMD) that is high-performing and based on stereoscopic visualization. The developed display system overcomes some issues concerning the correct visualization of virtual objects that are close to the user's point of view. The paper also presents the results of some tests about an AR application developed for product design assessment.

Keywords: Augmented Reality | Design Assessment | Head Mounted Display | Video See-Through HMD

[68] Bordegoni M., Cugini U., Caruso G., Polistina S., Mixed prototyping for product assessment: A reference framework, International Journal on Interactive Design and Manufacturing, 3(3), 177-187, (2009). Abstract

Abstract: The paper presents a reference framework for applications based on the mixed prototyping practice and mixed reality techniques and technologies. This practice can be effectively used for rapid design assessment of new products. In particular, the paper addresses applications regarding the use of mixed prototyping practice for the design review of information appliances. Various methods and technologies can be used according to the product aspects to validate and test. The paper describes mixed prototyping applications for positioning information appliances within systems and for the evaluation of ergonomics aspects of interactive devices.

Keywords: Haptic devices | Mixed prototyping | Mixed reality | Product assessment | Reference framework

[69] Bordegoni M., Ferrise F., Ambrogio M., Caruso G., Bruno F., Caruso F., Environment based on Augmented Reality and interactive simulation for product design review, 6th Eurographics Italian Chapter Conference 2008 - Proceedings, 27-34, (2008). Abstract

Abstract: The aesthetic impact of a product is an important parameter that differentiates technologically similar products with the same functionalities. Product shape, which is strictly connected to the aesthetic impact, has different meanings if seen from the design and the engineering points of view. The conceptual design of the shape of aesthetic products is usually performed by designers at the beginning of the product development cycle. Subsequent engineering design and studies, such as structural and fluid-dynamic analyses, lead to several design reviews in which the original shape of the object is often modified. The design review process is time consuming, requires the collaboration and synchronization of activities performed by various experts with different competences and roles, and is carried out using different tools and different product representations. Therefore, computer-aided tools supporting conceptual design and analysis activities within the same environment are envisaged. The paper presents the conceptual description of an environment, named PUODARSI, that allows designers to modify the shape of a product and evaluate in real time the impact of these changes on the results of structural and fluid-dynamic analyses in an Augmented Reality (AR) collaborative environment. The main problems in integrating tools developed for different purposes, such as haptic interaction, FEM and CFD analyses, and AR visualization, concern the feasibility of the integration, the data exchange, and the choice of algorithms that allow all this while guaranteeing low computational time. The paper describes the main issues related to the choice of hardware and software technologies, and the implementation of the PUODARSI system.

[70] Bordegoni M., Ferrise F., Ambrogio M., Caruso G., Bruno F., Caruso F., Mixed-reality environment based on haptics and interactive simulation for product design review, 20th European Modeling and Simulation Symposium, EMSS 2008, 140-145, (2008). Abstract

Abstract: The aesthetic impact of a product is an important parameter that differentiates technologically similar products with the same functionalities. The shape, which is strictly connected to the aesthetic impact, has different meanings if seen from the design and the engineering points of view. This paper describes an environment, named PUODARSI, based on the integration of Mixed-Reality technologies, haptic tools, and interactive simulation systems, whose aim is to support designers and engineers during the design review of aesthetic products. The environment allows the designer to modify the object shape through the use of haptic devices, and the engineer to run fluid-dynamics simulations on the modified shape. The paper describes the main problems faced in integrating tools originally developed for different purposes, in particular the issues concerning data exchange and the choice of algorithms that guarantee the low computational time required by the application.

Keywords: Design review | Fluiddynamics analysis | Haptic interfaces | Mixed-Reality

[71] Mengoni M., Peruzzini M., Mandorli F., Bordegoni M., Caruso G., Performing ergonomic analysis in virtual environments: A structured protocol to assess humans interaction, Proceedings of the ASME Design Engineering Technical Conference, 3(PARTS A AND B), 1461-1472, (2008). Abstract

Abstract: Virtual Reality (VR) systems provide new modes of human-computer interaction that can support several industrial design applications, saving time, reducing prototyping costs, and supporting the identification of design errors before production. By enhancing the interaction between humans and virtual prototypes through multiple sensorial modalities, VR can be adopted to perform ergonomic analysis. The main problems concern the evaluation of both the functional and the cognitive behavior of sample users, as VR interfaces influence the perception of ergonomic human factors. We argue that ergonomic analysis performed in a virtual environment can be successful only if supported by a structured protocol for the study of both functional and cognitive aspects, and by the combination of VR technologies that best answers the specific analysis tasks. An ergonomic analysis protocol is presented. It allows the assessment of the consumers' response in terms of behavioral and cognitive human factors, comprising both operational and emotional aspects. The protocol is also used to identify the best combination of visualization and haptic interfaces to perform the analysis. An experimental example, belonging to the household appliances field, is adopted to investigate the application of the protocol in the virtual set-up.

[72] Bordegoni M., Caruso G., Ferrise F., Giraudo U., A mixed environment for ergonomic tests: Tuning of the stereo viewing parameters, 5th Eurographics Italian Chapter Conference 2007 - Proceedings, 1, 127-134, (2007). Abstract

Abstract: Virtual Reality and related technologies are becoming common practice in the design industry in general and in the automotive field in particular. Through the joint use of virtual prototyping methodologies, it is possible to reduce the time-to-market as well as the costs deriving from the creation of physical prototypes. Ergonomics tests conducted in Virtual Reality environments can be successfully used to influence the design of products. We have set up a mixed reality environment that allows us to simulate and test postural aspects as well as various levels and modalities of user interaction. In order to achieve effective interaction, correct registration and visualization parameters must be carefully set to prevent possible interaction errors, such as failing to point precisely at the assigned target. The paper presents a methodology for tuning the stereoscopic properties of the mixed reality environment used for ergonomic tests of a car interior, in order to achieve correct visualization and positioning parameters. The environment includes a virtual human controlled by an optical motion tracking system, physical objects onto which the graphic virtual environment is superimposed, and a rotary haptic device for controlling the car's on-board infotainment interface.

[73] Bordegoni M., Giraudo U., Caruso G., Ferrise F., Ergonomic interactive testing in a mixed-reality environment, Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 4563 LNCS, 431-440, (2007). Abstract

Abstract: The performance of computer graphics is increasing rapidly, enabling the implementation of most phases of the product design process in virtual environments. The benefits deriving from the use of virtual practices in product development have proved highly valuable, since they foster the reduction of time to market and process uncertainty, and the translation of most prototyping activities into the virtual environment. In this paper we present a platform, developed in a mixed environment, for ergonomic validation. Specifically, we defined a methodology for testing aspects related to both design and ergonomic validation, by allowing the tester to interact visually and physically with the car dashboard control devices and the related interface by means of a rotary haptic device. The experimental sessions highlighted that it is possible to gather qualitative data about the design and to find typical occlusion problems, but also that quantitative data can be collected by testing the infotainment interface and the consequent user distraction during device use.

Keywords: Ergonomic analysis | Mixed environments | Tactons

[74] Bruno F., Caruso G., Muzzupappa M., 3D input devices integration in CAD environment, 4th Eurographics Italian Chapter Conference 2006 - Proceedings, 189-195, (2006). Abstract

Abstract: Virtual Reality (VR) technologies are becoming commonly used tools in the product development process, from the styling review in the conceptual design phase up to the Digital Mock-up (DMU) validation in the advanced stages of the design process. What has not yet been sufficiently investigated is the possibility of interacting with the DMU directly inside the CAD environment using 3D input devices. Although a few CAD systems, like CATIA V5, have an additional module to support VR devices, in most cases it is still necessary to customize the application to obtain the functionalities desired by the user. The present paper discusses the difficulties and advantages related to the integration of VR techniques and 3D input devices in CAD systems. The work has been conducted by analyzing and testing the potentialities of the Unigraphics NX3 CAD environment and by implementing a software plug-in that allows the user to perform such interaction tasks employing 3D input devices.
