ABSTRACT

This paper presents a novel method to tie geometric boundary representation (BREP) to voxel-based collision detection for use in haptic manual assembly simulation. Virtual Reality, and haptics in particular, has been applied with promising results to improve preliminary product design, assembly prototyping, and maintenance operations. However, current methodologies do not support low-clearance assembly tasks, restricting the applicability of haptics to a small subset of potential situations. This paper presents a new approach that combines highly accurate CAD geometry (boundary representation) with voxel models in a hybrid method, using both geometric constraint enforcement and voxel-based collision detection to provide stable haptic force feedback. With the methods presented here, BREP data can be accessed during voxel-based collision detection. This information can be used for constraint recognition and enables constraint guidance during the assembly process.