Target / Actual Comparison
The applications aim to research integrated solutions for manufacturability and assembly validation in virtual as well as mixed-reality environments.
Based on the architecture developed during the ARVIDA project, information from various sources (mixed reality, part tracking and human tracking) is checked, processed and merged or summarised.
The sub-project is structured in two work packages.
Work package 3.2.1, “Analytical mixed reality”, deals with manufacturability examination for the automotive industry. The target is an easy-to-operate and precise digitalisation, i.e. a three-dimensional recording of the real world in both large and small measuring volumes. It also includes research into efficient procedures for integrating this environment data into mixed-reality manufacturability assessments.
During the first half of the project, an initial prototype was created by Daimler Protics GmbH, 3D Interactive, TUM and DFKI in order to capture real components three-dimensionally in small measurement volumes and to compare the captured data with the related CAD planning data. The following figures show examples of results from the first half of the project.
For the three-dimensional capture of large measurement volumes, a prototype was created in the first half of the project by Daimler Protics GmbH, 3D Interactive and SICK AG. Here, a SLAM-based (Simultaneous Localisation and Mapping) localisation method is used to merge the static three-dimensional single scans of an environment (recorded with a measuring carriage) into a holistic 3D environment model. The following figures show some results in the form of a 2D SLAM localisation map and of a reconstructed three-dimensional environment model.
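Conceptually, merging the single scans reduces to transforming each scan's local points by the pose the SLAM localiser estimated for it and concatenating the results. A minimal 2D sketch of this step (function and variable names are hypothetical, not taken from the actual prototype):

```python
import math

def transform_scan(points, pose):
    """Transform one scan's local 2D points into the global frame.
    pose = (x, y, theta): the scan position estimated by the SLAM localiser."""
    x, y, theta = pose
    c, s = math.cos(theta), math.sin(theta)
    return [(x + c * px - s * py, y + s * px + c * py) for px, py in points]

def merge_scans(scans_with_poses):
    """Merge several static single scans into one holistic environment map."""
    merged = []
    for points, pose in scans_with_poses:
        merged.extend(transform_scan(points, pose))
    return merged

# Two toy scans: the second was taken half a metre away, rotated 90 degrees.
scan_a = [(1.0, 0.0), (2.0, 0.0)]
scan_b = [(1.0, 0.0)]
global_map = merge_scans([
    (scan_a, (0.0, 0.0, 0.0)),
    (scan_b, (0.5, 0.0, math.pi / 2)),
])
```

The same composition applies in 3D with 6-DoF poses; the quality of the merged model then depends entirely on the accuracy of the SLAM pose estimates.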
The interoperability of previously independent virtual-technology systems with respect to the tracking methods used was also demonstrated within the first half of the project. To this end, tracking data of the partners Daimler Protics GmbH, TUM, Extend3D and Fraunhofer IGD were exchanged via the ARVIDA reference architecture. Thus, single tracking components can easily be replaced in systems that implement the ARVIDA reference architecture. Furthermore, the partners developed solutions to exchange captured 3D data between different virtual-technology systems.
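The exchange works because each tracker publishes its pose as a resource with an agreed representation, so consumers depend only on that representation rather than on a specific tracking system. The ARVIDA reference architecture describes such resources with RDF vocabularies; the following simplified sketch uses plain JSON and purely illustrative field names:

```python
import json

def pose_to_resource(translation, rotation_quat):
    """Serialise a tracker pose into a neutral resource body.
    Field names are illustrative, not the actual ARVIDA vocabulary."""
    return json.dumps({
        "translation": dict(zip("xyz", translation)),
        "rotation": dict(zip("xyzw", rotation_quat)),
    })

def resource_to_pose(body):
    """Consumer side: parse the neutral body back into pose tuples,
    independent of which tracking system produced it."""
    doc = json.loads(body)
    t = tuple(doc["translation"][k] for k in "xyz")
    q = tuple(doc["rotation"][k] for k in "xyzw")
    return t, q

body = pose_to_resource((0.1, 0.2, 0.3), (0.0, 0.0, 0.0, 1.0))
t, q = resource_to_pose(body)
```

Because both sides only agree on the resource format, swapping one tracking component for another leaves the consumer code unchanged.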
Work package 3.2.2, “MR Fabrication”, deals with mixed-reality applications in industry. The target is a system solution that supports operating personnel in their work by means of virtual technologies.
For the scenario "Producing without drawing", PMI (Product and Manufacturing Information) is to be attached directly to the objects. It can then be visualised and displayed with a suitable tool or, with the help of a combined tracking and projection system, projected as manufacturing information directly onto the component. In addition, an interactive assembly guide will assist workers in the assembly of complex modules.
The same principle applies in another scenario: front projection onto free-form surfaces made of GFK (glass-fibre reinforced plastic).
In this context, in addition to fringe projection, other forms of structured illumination were analysed, with special emphasis on the use of structure (edges, distinctive points, etc.) that is already contained in the projected content. This is, in effect, the counterpart of marker-less tracking.
The aim is to cover large shapes. These scenarios are supported by the company Extend3D. The partner Fraunhofer IGD has developed methods for the real-time fusion of mobile 3D scans. These methods can be used when the 3D scan data can be transferred in real time from the 3D scanning system to a mobile computer. The 3D scanner has to be combined with a video camera that is used for tracking the pose (position and orientation) of the mobile 3D scanner. For this purpose, calibration methods have been developed that make it possible to calibrate the 3D scanning system to the video camera, so that the pose of the tracked video camera can be transferred to the mobile 3D scanner. To improve usability, the trained initialisation edges can be rendered, so that the user knows which areas can be used for initialisation.
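Transferring the camera pose to the scanner amounts to composing the tracked camera pose with the fixed camera-to-scanner transform determined once by the calibration. A minimal sketch with homogeneous 4x4 matrices (all names are illustrative):

```python
def mat_mul(a, b):
    """4x4 homogeneous matrix product (row-major nested lists)."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(x, y, z):
    """Homogeneous matrix for a pure translation (toy poses for the demo)."""
    return [[1, 0, 0, x], [0, 1, 0, y], [0, 0, 1, z], [0, 0, 0, 1]]

def scanner_pose(T_world_cam, T_cam_scanner):
    """T_world_cam: camera pose from model-based tracking.
    T_cam_scanner: fixed offset found by the calibration procedure.
    The scanner pose follows by composition."""
    return mat_mul(T_world_cam, T_cam_scanner)

# Camera tracked 1 m along x; calibration says the scanner sits 2 m along
# the camera's y axis -> scanner ends up at (1, 2, 0) in the world frame.
T = scanner_pose(translation(1, 0, 0), translation(0, 2, 0))
```

With rotations involved, the same composition holds; only the calibration matrix has to be determined accurately once.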
Technologies for model-based tracking were developed so that the contour-based descriptors used for tracking can be trained with the help of rendered 3D images. For this, the 3D scans of the ship section to be tracked are rendered through a virtual camera. The virtual camera used for training the initialisation poses is moved on a hemisphere around the identified initialisation point. The size of the initialisation area trained around the initialisation point can be set via parameters. The tracking has been optimised so that multiple cameras can be synchronised, which supports stereo tracking.
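Sampling the training viewpoints on such a hemisphere can be sketched as follows; the parameter names and the regular azimuth/elevation sampling scheme are illustrative, and the actual implementation may sample the hemisphere differently:

```python
import math

def hemisphere_viewpoints(center, radius, n_azimuth, n_elevation):
    """Camera positions on the upper hemisphere around an initialisation
    point; each viewpoint is later rendered to train contour descriptors."""
    cx, cy, cz = center
    views = []
    for i in range(n_elevation):
        elev = (i + 0.5) * (math.pi / 2) / n_elevation   # 0..90 degrees
        for j in range(n_azimuth):
            azim = j * 2 * math.pi / n_azimuth
            views.append((
                cx + radius * math.cos(elev) * math.cos(azim),
                cy + radius * math.cos(elev) * math.sin(azim),
                cz + radius * math.sin(elev),
            ))
    return views

# 8 azimuth steps x 3 elevation rings = 24 training viewpoints,
# all at 2 m distance from the initialisation point.
views = hemisphere_viewpoints((0, 0, 0), 2.0, n_azimuth=8, n_elevation=3)
```

The density of the sampling trades training time against the size of the pose range from which initialisation succeeds, matching the parameterisable initialisation area described above.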
The company 3DInteractive identified three main areas where development is needed regarding the target/actual comparison:
- asynchronous update
- laserscan import
- geometry reconstruction
"Asynchronous update" enables a visualisation database to be built up while it is being rendered. For this, the concept previously designed by 3DInteractive was implemented, tested and provided to the project partners as an interface. This interface is a prerequisite for a more general REST visualisation API, which also covers the operation of multiple clients and simultaneous reading and writing of the visualisation database.
The E57 format was agreed for the exchange of 3D scan data between the subproject partners. It can already be imported by 3DInteractive and was also tested in collaboration with SICK AG (stationary and mobile scanners) and DFKI (improved KinectFusion data). Furthermore, the continuous update of a scan for so-called unordered point clouds (UPCs) was connected, together with different visualisation filters, to the asynchronous interface.
The "geometry reconstruction" comprises generating a mesh (triangles) from scan data (both ordered and unordered). For this mesh filter, the levels of detail (LOD) are generated by considering the filter and a quality measure per point, with the colour data being stored in a so-called mega-texture. For the latter, the "Virtual Texturing" technology was connected to the asynchronous interface in order to transmit texture data from multiple sources simultaneously during rendering and thereby update the visualisation database.
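The idea of deriving LODs from a per-point quality measure can be pictured as a threshold filter per level: coarse levels keep only the highest-quality points, finer levels keep progressively more. The thresholds and data layout below are hypothetical:

```python
def build_lods(points, levels):
    """Derive level-of-detail point sets by filtering against a per-point
    quality measure.  points: list of (x, y, z, quality) with quality in
    0..1; levels: descending quality thresholds, coarse LOD first."""
    return [[p for p in points if p[3] >= threshold] for threshold in levels]

# Toy cloud with three points of decreasing capture quality.
cloud = [(0, 0, 0, 0.9), (1, 0, 0, 0.5), (2, 0, 0, 0.2)]
lods = build_lods(cloud, levels=[0.8, 0.4, 0.0])
```

In the actual pipeline the meshing step would run per LOD, and the colour data per point would be looked up in the mega-texture rather than stored with the geometry.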
The focus of Siemens Industry Software in Work Package 1 was the participation in and support of the working group “Data”. In particular, open questions regarding the REST and SOA interfaces of Teamcenter with respect to the reference architecture were discussed in web sessions and were addressed and explained on site at the partner IGD, with appropriate recommendations.
The integration of heterogeneous database structures can be implemented in practice on the basis of various mechanisms and standards. The granularity of REST-based requests can be adapted to the data structure of the supplying system, so that an implicit format adjustment of structure and attributes can be realised. If format conversions of documents become necessary, they have to be conducted separately, for example CATIA -> JT or DWG -> TIFF. Format conversions can be done in the target system, in the source system or in between (in transit). The following figure shows the scheme of the coupling between the Teamcenter software in place at the partner TKMS and third-party systems.
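The implicit format adjustment can be pictured as a thin mapping layer in the REST interface that filters and renames the supplying system's attributes into a neutral representation; the attribute names below are purely illustrative, not actual Teamcenter fields:

```python
# Hypothetical mapping from source-system attribute names to the
# neutral names exposed by the REST layer.
NEUTRAL_ATTRS = {
    "object_name": "name",
    "item_revision_id": "revision",
    "owning_user": "owner",
}

def to_neutral(record):
    """Implicit format adjustment: keep only mapped attributes, renamed
    to the neutral vocabulary; unmapped attributes are dropped."""
    return {neutral: record[src]
            for src, neutral in NEUTRAL_ATTRS.items() if src in record}

rec = {"object_name": "bracket", "item_revision_id": "A",
       "uid": "xyz123", "owning_user": "infodba"}
neutral = to_neutral(rec)
```

Document payloads (e.g. CAD geometry) bypass this layer and go through the separate format converters mentioned above.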
Through surveys of the project partners, the database landscape was recorded, with its various databases and the ARVIDA tools/applications to be connected. How and where the REST connection takes place is shown schematically in this figure:
The attribute flow of the product-model-related data within the heterogeneous system landscape has been recorded. On this basis, the neutral reference architecture based on semantic knowledge can be developed optimally. A key component is keeping the information in the SQL database of Teamcenter; another characteristic is the connection to partner-specific systems such as Geomagic.