<?xml version="1.0" encoding="utf-8"?>
<!-- generator="Joomla! - Open Source Content Management" -->
<feed xmlns="http://www.w3.org/2005/Atom"  xml:lang="en-gb">
	<title type="text">Prystine - Demonstrators</title>
	<subtitle type="text"></subtitle>
	<link rel="alternate" type="text/html" href="https://prystine.automotive.oth-aw.de"/>
	<id>https://prystine.automotive.oth-aw.de/index.php/project/prystine-poster-and-flyer</id>
	<updated>2024-07-23T13:31:09+00:00</updated>
	<author>
		<name>Prystine</name>
	</author>
	<generator uri="https://www.joomla.org">Joomla! - Open Source Content Management</generator>
	<link rel="self" type="application/atom+xml" href="https://prystine.automotive.oth-aw.de/index.php/project/prystine-poster-and-flyer?format=feed&amp;type=atom"/>
	<entry>
		<title>Demonstrators</title>
		<link rel="alternate" type="text/html" href="https://prystine.automotive.oth-aw.de/index.php/project/prystine-poster-and-flyer/197-demonstrator"/>
		<published>2021-06-28T12:42:21+00:00</published>
		<updated>2021-06-28T12:42:21+00:00</updated>
		<id>https://prystine.automotive.oth-aw.de/index.php/project/prystine-poster-and-flyer/197-demonstrator</id>
		<author>
			<name>r.furche</name>
		</author>
		<summary type="html">&lt;p&gt;{pdf=images/Dissemination/PRYSTINE_Demonstrators.pdf|90%|500|native}&lt;/p&gt;</summary>
		<content type="html">&lt;p&gt;{pdf=images/Dissemination/PRYSTINE_Demonstrators.pdf|90%|500|native}&lt;/p&gt;</content>
		<category term="Dissemination" />
	</entry>
	<entry>
		<title>Prystine Demonstrators</title>
		<link rel="alternate" type="text/html" href="https://prystine.automotive.oth-aw.de/index.php/project/prystine-poster-and-flyer/200-prystine-demonstrators"/>
		<published>2021-08-04T08:59:35+00:00</published>
		<updated>2021-08-04T08:59:35+00:00</updated>
		<id>https://prystine.automotive.oth-aw.de/index.php/project/prystine-poster-and-flyer/200-prystine-demonstrators</id>
		<author>
			<name>r.furche</name>
		</author>
		<summary type="html">&lt;h1&gt; &lt;/h1&gt;
&lt;h1&gt;1.1 LiDAR + AURIX&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;This demonstrator showcases the 2D MEMS-based LiDAR.&lt;/p&gt;
&lt;h4&gt; &lt;/h4&gt;
&lt;h4&gt;PRYSTINE - 1.1 Murata LIDAR demo - person walking away&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610234&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610234&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h4&gt;PRYSTINE - 1.1 Murata LIDAR demo - car passing near&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610268&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610268&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h4&gt;PRYSTINE - 1.1 Murata LIDAR demo - person walking away and back&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610287&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610287&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;1.2 RADAR + AURIX&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;This demonstrator showcases the clustering of radar components: when a fault occurs in one cluster, the non-affected cluster keeps operating, which provides higher availability.&lt;/p&gt;
&lt;p&gt; {pdf=images/Dissemination/PRYSTINE_2nd_review_20200717_demos_IFAG.pdf|90%|500|native}&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;1.3 Radar&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;NXPNL’s novel radar-to-radar interference detection technique reduces the number of false negatives with respect to the state of the art by reducing the noise floor when interference is present. Lab measurements have shown a 5 to 10 dB noise-floor reduction compared with the state of the art. On-the-road measurements with a radar sensor prototype, built from off-the-shelf automotive components, confirm this and show that the detection range can be doubled in the presence of a nearby interferer, restoring most of the radar’s original performance.&lt;/p&gt;
&lt;h4&gt;PRYSTINE - 1.3 NXPNL vehicle-level health monitoring in Toyota Prius v3&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610323&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610323&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;1.4 Radar&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;IMEC’s demonstration platform comprises the PRYSTINE scalable 60 GHz radar designed in 28 nm CMOS. This demonstrator is described in D6.1 and D6.10; no video demonstration is foreseen.&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;1.5 IC-, vehicle-level health monitoring&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;NXPNL’s health monitors analyze the safety of in-vehicle ICs and software components by detecting five fault models: wrong communication timing, corrupt packet data, implausible message streams, as well as OS and hardware anomalies. The prototyped health monitors were integrated in an autonomous vehicle and demonstrated to detect diverse malfunctions. Based on this monitoring information, the redundant automated-driving ECUs in the vehicle can respond in time to various faults and realize fail-operational behavior of the system.&lt;/p&gt;
&lt;h4&gt;PRYSTINE - 1.5 NXPNL vehicle-level health monitoring in Toyota Prius v3&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610323&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610323&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;2.1 Fail-operational autonomous driving platform&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;This demonstrator illustrates how hardware architectures can be brought to the next level of safety for highly automated driving. The sensor fusion failover mechanism developed by TTTech Auto in the project enables the embedded control needed for safe automated driving and thus contributes valuably to the mobility of the future. The modularity concept combines COTS elements such as SoCs, Infineon’s AURIX™ automotive microcontroller, a power supply, a deterministic backbone network for low-latency data exchange, and multiple cameras; this keeps the developed solution flexible and advances the automotive market.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://www.tttech-auto.com/enabling-state-of-the-art-robustness-and-enhanced-reliability-by-developing-fail-operational-architectures-for-highly-automated-safe-driving/&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;More Information&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;2.2 Drive-by-wire car&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;EDI’s demonstrator presents a novel approach to software component integration: the developed COMPAGE framework (a fail-operational system component management framework) and AI-based algorithms can identify faulty sensors by analyzing different types of data, e.g. from LiDAR, radar, and cameras. The system is equipped with an AURIX microcontroller that provides an additional safety integrity level and redundancy, acting as a fallback for the LiDAR and radar perception subsystems to facilitate successful Automatic Emergency Braking.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://www.youtube.com/watch?v=6_0ch9_m13U&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;See the Video here&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;2.3 Data Fusion and Fall-back&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;A fully integrated security engineering process for realizing secure autonomous driving as well as a trust model for evaluating the trustworthiness of sensor data, with the data fusion module for improving the accuracy of object detection and tracking, is presented in this demonstrator by the University of Turku in collaboration with TTS.&lt;/p&gt;
&lt;p&gt;The main building blocks of this novel approach are: fail-operational middleware, secure data communication, and the sensors’ reliability-aware data fusion for assisting automated heavy-duty vehicle driving.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://www.youtube.com/watch?v=Z9Cytnp7oa8&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;See the Video here &lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;2.4 Passenger vehicle for low speed autonomy&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;The main objective of this demonstrator is to show an autonomous parking solution utilizing the newly developed FUSION algorithms. The developed perception algorithms also provide a working basis for the Ford heavy-duty truck demonstrator in SC5. The proposed solution relates to Automated Valet Parking systems and provides fail-operational behavior and robustness through the utilization and fusion of multiple sensor sources, including LiDAR, cameras, and radar.&lt;/p&gt;
&lt;p&gt;Watch the results: &lt;a href=&quot;https://www.academia.edu/video/k68D31&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Watch&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Path planning for parking: &lt;a href=&quot;https://www.youtube.com/watch?v=OV1AdaRoUd0&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Watch&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Multi Object Tracking: &lt;a href=&quot;https://www.youtube.com/watch?v=xVZKEfezkZI&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Watch&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;3D Object Detection: &lt;a href=&quot;https://www.youtube.com/watch?v=Pr71kOZ-OW0&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Watch&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Occupancy grid filtering: &lt;a href=&quot;https://www.youtube.com/watch?v=YR8K2sN453Q&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Watch&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Semantic segmentation: &lt;a href=&quot;https://www.youtube.com/watch?v=QANzA4D8duc&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Watch&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;2.5 Fail-operational AI Inference Processing&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;The fail-operational multi-processor demonstrates run-time fault detection of a multi-processor system at lower hardware overhead than the full duplication required for lock-step operation. Videantis GmbH is developing this fail-operational multiprocessor system with flexible redundancy at reduced silicon overhead for an AI algorithm in the project. Results are being finalized and will be available later.&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;3.1 - 3.3 Demonstrators&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;See an overview of the demonstrators here: &lt;a href=&quot;https://www.youtube.com/watch?v=9RCvlfiJS5w&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Watch&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;3.1 E/E architecture demonstrator for automotive electronics enabling AD&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;Optimized E/E architecture enabling FUSION-based connected vehicles with autonomous functionality.&lt;/p&gt;
&lt;h4&gt;PRYSTINE - 3.1 E/E architecture demonstrator for automotive electronics enabling AD&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610532&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610532&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;3.2 Simulation, development and validation framework for fail-operational sensor-fusion E/E architecture&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;Optimized E/E architecture enabling FUSION-based connected vehicles with autonomous functionality.&lt;/p&gt;
&lt;h4&gt;PRYSTINE - 3.2 Simulation, development and validation framework for fail-operational sensor-fusion E/E architecture&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610563&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610563&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;3.3 Dynamically shaped, reliable mobile communication&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;* LiDAR / RADAR sensor compound demonstrator&lt;/p&gt;
&lt;p&gt;* Enhanced reliability and performance of V2N data connections&lt;/p&gt;
&lt;p&gt;* Dependable embedded control by co-integration of cellular connections and network-level connection management&lt;/p&gt;
&lt;p&gt;* Fail-operational V2N communication for urban and rural environments based on FUSION&lt;/p&gt;
&lt;h4&gt;PRYSTINE - 3.3 Dynamically shaped, reliable mobile communication&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610375&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610375&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;4.1 Hardware In the Loop (HIL) for LiDAR sensor data processing&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;{pdf=images/Dissemination/24_Passenger_vehicle_for_low_speed_autonomy.pdf|90%|500|native}&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;4.2 Hardware In the Loop (HIL) for back-maneuver assist&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;Results are being finalized and will be available later.&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;4.3 Hardware In the Loop (HIL) for data fusion based VRU detection&lt;/h1&gt;
&lt;h4&gt; &lt;/h4&gt;
&lt;h4&gt;PRYSTINE - 4.3 Hardware In the Loop (HIL) for data fusion based VRU detection&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610762&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610762&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;4.4 Hardware In the Loop (HIL) for back-maneuver assist&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;Results are being finalized and will be available later.&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;4.5 CiThruS field test&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h4&gt;PRYSTINE - 4.5 CiThruS field test&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610718&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610718&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;4.6 Trajectory planning and vehicle dynamics control&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h4&gt;PRYSTINE - 4.6 Trajectory planning and vehicle dynamics control&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610798&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610798&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;4.7 Fusion of real and virtual sensor data for chassis control&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h4&gt;PRYSTINE - 4.7 Fusion of real and virtual sensor data for chassis control&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610651&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610651&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;4.9 Lab demo for Programmable Accelerator Architecture for multi-sensor data fusion and perception&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;Results are being finalized and will be available later.&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;5.1 Heavy Duty Truck&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h4&gt;PRYSTINE - 5.1 Heavy Duty Truck - Use Case 1&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610429&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610429&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h4&gt;PRYSTINE - 5.1 Heavy Duty Truck - Use Case 2&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610689&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610689&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h4&gt;PRYSTINE - 5.1 Heavy Duty Truck - Use Case 3&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610458&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610458&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;5.2 Truck (3-axle lorry with full-size trailer)&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h4&gt;PRYSTINE - 5.2 Truck (3-axle lorry with full-size trailer)&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610489&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610489&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;6.1 &quot;Traffic light time-to-green&quot;&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;Based on the received traffic light phase schedule, the system calculates the approach speed at which the vehicle catches the green wave, taking the actual traffic conditions (vehicles ahead) into account.&lt;/p&gt;
&lt;h4&gt;PRYSTINE - 6.1 &quot;Traffic light time-to-green&quot;&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610593&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610593&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;6.2 &quot;Vulnerable Road User (VRU) detection and Trajectory recognition&quot;&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;The vehicle, with the support of the infrastructure (I2V), recognizes the type of obstacle and predicts the trajectories of Vulnerable Road Users (VRUs) in order to avoid potential collisions.&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;6.3 &quot;Driver monitoring and emergency maneuver “&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;The Driver Monitoring System (DMS) detects the driver’s cognitive status to decide whether s/he is still capable of controlling the vehicle or, alternatively, whether s/he is able to get back into the control loop in case of a “take-over request” (TOR) from the system.&lt;/p&gt;
&lt;h4&gt;PRYSTINE - 6.3 &quot;Driver monitoring and emergency maneuver“&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610899&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610899&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;7.1 Shared control and arbitration (Level 2-3), studying driver-automation interaction and methods for vehicle authority transition in a Driver in the Loop (DiL) simulator&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;a) The demo shows how the automation assists the driver under two different conditions (driver distraction and possible collision).&lt;/p&gt;
&lt;h4&gt;PRYSTINE - 7.1 Shared control and arbitration (Level 2-3)&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610622&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610622&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;b) Demo on the visualization of the HMI.&lt;/p&gt;
&lt;h4&gt;PRYSTINE - 7.1 Demo on the visualization of the HMI&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/597067678&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/597067678&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;c) Demo on the Driver monitoring system (Driver Activity).&lt;/p&gt;
&lt;h4&gt;PRYSTINE - 7.1 Demo on the Driver monitoring system (Driver Activity)&lt;/h4&gt;
&lt;div class=&quot;style__VideoLink-v39vxn-7 bpfdkX&quot;&gt;&lt;a href=&quot;https://vimeo.com/597067678&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/597069383&lt;/a&gt;&lt;/div&gt;
&lt;div class=&quot;style__VideoLink-v39vxn-7 bpfdkX&quot;&gt; &lt;/div&gt;
&lt;div class=&quot;style__VideoLink-v39vxn-7 bpfdkX&quot;&gt; &lt;/div&gt;
&lt;div class=&quot;style__VideoLink-v39vxn-7 bpfdkX&quot;&gt; &lt;/div&gt;
&lt;p&gt;d) Demo on the Driver monitoring system (Drowsiness detection).&lt;/p&gt;
&lt;h4&gt;PRYSTINE - 7.1 Demo on the Driver monitoring system (Drowsiness detection)&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/597069383&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610193&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h4&gt; &lt;/h4&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;7.1 - 7.3 Demonstrators&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;See an overview of the demonstrators here: &lt;a href=&quot;https://www.youtube.com/watch?v=gfZaopPwoB4&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Watch&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;7.2 Layered Control (Level 2-3-4), studying cooperation between a passenger car and a bus, and the driver’s role in supervising or controlling the vehicle when requested&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;a) The demo shows how the automation and the driver operate under different levels of scenario complexity.&lt;/p&gt;
&lt;h4&gt;PRYSTINE - 7.2 Layered Control (Level 2-3-4)&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610852&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610852&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;b) The demo shows the performance of the Driver monitoring system.&lt;/p&gt;
&lt;h4&gt;PRYSTINE - 7.2 Demo shows the performance of the Driver monitoring system&lt;/h4&gt;
&lt;div class=&quot;style__VideoLink-v39vxn-7 bpfdkX&quot;&gt;&lt;a href=&quot;https://vimeo.com/597075188&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/597075188&lt;/a&gt;&lt;/div&gt;
&lt;div class=&quot;style__VideoLink-v39vxn-7 bpfdkX&quot;&gt; &lt;/div&gt;
&lt;div class=&quot;style__VideoLink-v39vxn-7 bpfdkX&quot;&gt; &lt;/div&gt;
&lt;div class=&quot;style__VideoLink-v39vxn-7 bpfdkX&quot;&gt; &lt;/div&gt;
&lt;div class=&quot;style__VideoLink-v39vxn-7 bpfdkX&quot;&gt; &lt;/div&gt;
&lt;div class=&quot;style__VideoLink-v39vxn-7 bpfdkX&quot;&gt;
&lt;h1&gt;7.3 Highly automated vehicle (Level 3-4), study AI-based decision algorithms for urban and highway scenarios.&lt;/h1&gt;
&lt;/div&gt;
&lt;div class=&quot;style__VideoLink-v39vxn-7 bpfdkX&quot;&gt; &lt;/div&gt;
&lt;div class=&quot;style__VideoLink-v39vxn-7 bpfdkX&quot;&gt;{pdf=images/Dissemination/73_Highly_automated_vehicle.pdf|90%|500|native}&lt;/div&gt;</summary>
		<content type="html">&lt;h1&gt; &lt;/h1&gt;
&lt;h1&gt;1.1 LiDAR + AURIX&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;This demonstrator showcases the 2D MEMS-based LiDAR.&lt;/p&gt;
&lt;h4&gt; &lt;/h4&gt;
&lt;h4&gt;PRYSTINE - 1.1 Murata LIDAR demo - person walking away&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610234&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610234&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h4&gt;PRYSTINE - 1.1 Murata LIDAR demo - car passing near&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610268&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610268&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h4&gt;PRYSTINE - 1.1 Murata LIDAR demo - person walking away and back&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610287&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610287&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;1.2 RADAR + AURIX&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;This demonstrator showcases the clustering of radar components: when a fault occurs in one cluster, the non-affected cluster keeps operating, which provides higher availability.&lt;/p&gt;
&lt;p&gt; {pdf=images/Dissemination/PRYSTINE_2nd_review_20200717_demos_IFAG.pdf|90%|500|native}&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;1.3 Radar&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;NXPNL’s novel radar-to-radar interference detection technique reduces the number of false negatives with respect to the state of the art by reducing the noise floor when interference is present. Lab measurements have shown a 5 to 10 dB noise-floor reduction compared with the state of the art. On-the-road measurements with a radar sensor prototype, built from off-the-shelf automotive components, confirm this and show that the detection range can be doubled in the presence of a nearby interferer, restoring most of the radar’s original performance.&lt;/p&gt;
&lt;h4&gt;PRYSTINE - 1.3 NXPNL vehicle-level health monitoring in Toyota Prius v3&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610323&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610323&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;1.4 Radar&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;IMEC’s demonstration platform comprises the PRYSTINE scalable 60 GHz radar designed in 28 nm CMOS. This demonstrator is described in D6.1 and D6.10; no video demonstration is foreseen.&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;1.5 IC-, vehicle-level health monitoring&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;NXPNL’s health monitors analyze the safety of in-vehicle ICs and software components by detecting five fault models: wrong communication timing, corrupt packet data, implausible message streams, as well as OS and hardware anomalies. The prototyped health monitors were integrated in an autonomous vehicle and demonstrated to detect diverse malfunctions. Based on this monitoring information, the redundant automated-driving ECUs in the vehicle can respond in time to various faults and realize fail-operational behavior of the system.&lt;/p&gt;
&lt;h4&gt;PRYSTINE - 1.5 NXPNL vehicle-level health monitoring in Toyota Prius v3&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610323&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610323&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;2.1 Fail-operational autonomous driving platform&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;This demonstrator illustrates how hardware architectures can be brought to the next level of safety for highly automated driving. The sensor fusion failover mechanism developed by TTTech Auto in the project enables the embedded control needed for safe automated driving and thus contributes valuably to the mobility of the future. The modularity concept combines COTS elements such as SoCs, Infineon’s AURIX™ automotive microcontroller, a power supply, a deterministic backbone network for low-latency data exchange, and multiple cameras; this keeps the developed solution flexible and advances the automotive market.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://www.tttech-auto.com/enabling-state-of-the-art-robustness-and-enhanced-reliability-by-developing-fail-operational-architectures-for-highly-automated-safe-driving/&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;More Information&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;2.2 Drive-by-wire car&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;EDI’s demonstrator presents a novel approach to software component integration: the developed COMPAGE framework (a fail-operational system component management framework) and AI-based algorithms can identify faulty sensors by analyzing different types of data, e.g. from LiDAR, radar, and cameras. The system is equipped with an AURIX microcontroller that provides an additional safety integrity level and redundancy, acting as a fallback for the LiDAR and radar perception subsystems to facilitate successful Automatic Emergency Braking.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://www.youtube.com/watch?v=6_0ch9_m13U&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;See the Video here&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;2.3 Data Fusion and Fall-back&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;A fully integrated security engineering process for realizing secure autonomous driving as well as a trust model for evaluating the trustworthiness of sensor data, with the data fusion module for improving the accuracy of object detection and tracking, is presented in this demonstrator by the University of Turku in collaboration with TTS.&lt;/p&gt;
&lt;p&gt;The main building blocks of this novel approach are: fail-operational middleware, secure data communication, and the sensors’ reliability-aware data fusion for assisting automated heavy-duty vehicle driving.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://www.youtube.com/watch?v=Z9Cytnp7oa8&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;See the Video here &lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;2.4 Passenger vehicle for low speed autonomy&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;The main objective of this demonstrator is to show an autonomous parking solution utilizing the newly developed FUSION algorithms. The developed perception algorithms also provide a working basis for the Ford heavy-duty truck demonstrator in SC5. The proposed solution relates to Automated Valet Parking systems and provides fail-operational behavior and robustness through the utilization and fusion of multiple sensor sources, including LiDAR, cameras, and radar.&lt;/p&gt;
&lt;p&gt;Watch the results: &lt;a href=&quot;https://www.academia.edu/video/k68D31&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Watch&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Path planning for parking: &lt;a href=&quot;https://www.youtube.com/watch?v=OV1AdaRoUd0&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Watch&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Multi Object Tracking: &lt;a href=&quot;https://www.youtube.com/watch?v=xVZKEfezkZI&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Watch&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;3D Object Detection: &lt;a href=&quot;https://www.youtube.com/watch?v=Pr71kOZ-OW0&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Watch&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Occupancy grid filtering: &lt;a href=&quot;https://www.youtube.com/watch?v=YR8K2sN453Q&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Watch&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Semantic segmentation: &lt;a href=&quot;https://www.youtube.com/watch?v=QANzA4D8duc&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Watch&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;2.5 Fail-operational AI Inference Processing&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;The fail-operational multiprocessor demonstrates run-time fault detection in a multiprocessor system at lower hardware overhead than the full duplication required by lock-step operation. Within the project, Videantis GmbH is developing a fail-operational multiprocessor system with flexible redundancy at reduced silicon overhead for an AI algorithm. Results are being finalized and will be available later.&lt;/p&gt;
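&lt;p&gt;The trade-off behind flexible redundancy can be sketched conceptually (this is our software analogy, not Videantis' hardware design): instead of lock-stepping every operation on duplicated hardware, only a subset of work items is re-executed and compared, trading fault coverage for overhead.&lt;/p&gt;

```python
# Conceptual sketch of selective (non-lock-step) redundancy.
def run_with_selective_redundancy(tasks, compute, check_every=5):
    """Run each task once; re-run every N-th task and compare results."""
    results, faults = [], []
    for i, task in enumerate(tasks):
        r1 = compute(task)
        if i % check_every == 0:      # only this subset is duplicated
            r2 = compute(task)        # notionally executed on a second core
            if r1 != r2:
                faults.append(i)      # runtime fault detected
        results.append(r1)
    return results, faults
```

&lt;p&gt;Full lock-step corresponds to check_every=1 (every task duplicated); larger values reduce overhead at the cost of detection latency and coverage.&lt;/p&gt;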
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;3.1 - 3.3 Demonstrators&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;See an overview of the demonstrators here: &lt;a href=&quot;https://www.youtube.com/watch?v=9RCvlfiJS5w&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Watch&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;3.1 E/E architecture demonstrator for automotive electronics enabling AD&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;Optimized E/E architecture enabling FUSION-based connected vehicles with autonomous functionality.&lt;/p&gt;
&lt;h4&gt;PRYSTINE - 3.1 E/E architecture demonstrator for automotive electronics enabling AD&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610532&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610532&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;3.2 Simulation, development and validation framework for fail-operational sensor-fusion E/E architecture&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;Optimized E/E architecture enabling FUSION-based connected vehicles with autonomous functionality.&lt;/p&gt;
&lt;h4&gt;PRYSTINE - 3.2 Simulation, development and validation framework for fail-operational sensor-fusion E/E architecture&lt;/h4&gt;
&lt;h4&gt;&lt;a href=&quot;https://vimeo.com/580610563&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610563&lt;/a&gt;&lt;/h4&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;3.3 Dynamically shaped, reliable mobile communication&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;* LiDAR / RADAR sensor compound demonstrator&lt;/p&gt;
&lt;p&gt;* Enhanced reliability and performance of V2N data connections&lt;/p&gt;
&lt;p&gt;* Dependable embedded control by co-integration of cellular connections and network-level connection management&lt;/p&gt;
&lt;p&gt;* Fail-operational V2N communication for urban and rural environments based on FUSION&lt;/p&gt;
&lt;h4&gt;PRYSTINE - 3.3 Dynamically shaped, reliable mobile communication&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610375&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610375&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;4.1 Hardware In the Loop (HIL) for LiDAR sensor data processing&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;{pdf=images/Dissemination/24_Passenger_vehicle_for_low_speed_autonomy.pdf|90%|500|native}&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;4.2 Hardware In the Loop (HIL) for back-maneuver assist&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;Results are being finalized and will be available later.&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;4.3 Hardware In the Loop (HIL) for data fusion based VRU detection&lt;/h1&gt;
&lt;h4&gt; &lt;/h4&gt;
&lt;h4&gt;PRYSTINE - 4.3 Hardware In the Loop (HIL) for data fusion-based VRU detection&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610762&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610762&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;4.4 Hardware In the Loop (HIL) for back-maneuver assist&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;Results are being finalized and will be available later.&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;4.5 CiThruS field test&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h4&gt;PRYSTINE - 4.5 CiThruS field test&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610718&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610718&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;4.6 Trajectory planning and vehicle dynamics control&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h4&gt;PRYSTINE - 4.6 Trajectory planning and vehicle dynamics control&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610798&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610798&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;4.7 Fusion of real and virtual sensor data for chassis control&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h4&gt;PRYSTINE - 4.7 Fusion of real and virtual sensor data for chassis control&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610651&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610651&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;4.9 Lab demo for Programmable Accelerator Architecture for multi-sensor data fusion and perception&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;Results are being finalized and will be available later.&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;5.1 Heavy Duty Truck&lt;/h1&gt;
&lt;h1&gt; &lt;/h1&gt;
&lt;h4&gt;PRYSTINE - 5.1 Heavy Duty Truck - Use Case 1&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610429&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610429&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h4&gt;PRYSTINE - 5.1 Heavy Duty Truck - Use Case 2&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610689&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610689&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h4&gt;PRYSTINE - 5.1 Heavy Duty Truck - Use Case 3&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610458&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610458&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;5.2 Truck (3-axle lorry with full-size trailer)&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h4&gt;PRYSTINE - 5.2 Truck (3-axle lorry with full-size trailer)&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610489&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610489&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;6.1 &quot;Traffic light time-to-green&quot;&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;Based on the received traffic light phase schedule, the system calculates the approach speed at which the vehicle catches the green wave, taking the actual traffic conditions (vehicles ahead) into account.&lt;/p&gt;
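&lt;p&gt;The core of such a time-to-green calculation can be sketched as follows (a simplified, hypothetical version that ignores vehicles ahead): given the distance to the stop line and the signal's green window, pick a speed that lets the vehicle arrive while the light is green.&lt;/p&gt;

```python
# Simplified "time-to-green" advisory-speed sketch (hypothetical helper).
def advisory_speed_mps(distance_m, green_start_s, green_end_s, v_max_mps):
    """Return a speed that reaches the stop line inside the green window,
    or None if no legal speed can make it."""
    t_min = distance_m / v_max_mps           # earliest arrival at the limit
    t_arrive = max(t_min, green_start_s)     # otherwise, wait for green
    if t_arrive > green_end_s:
        return None                          # green window cannot be reached
    return min(v_max_mps, distance_m / t_arrive)

# 200 m to the light, green from t=15 s to t=40 s, limit 13.9 m/s (50 km/h):
v = advisory_speed_mps(200.0, 15.0, 40.0, 13.9)   # about 13.3 m/s
```

&lt;p&gt;A production system would additionally cap the result from below (no crawling speeds) and adjust for the queue of vehicles ahead, as the demonstrator description notes.&lt;/p&gt;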
&lt;h4&gt;PRYSTINE - 6.1 &quot;Traffic light time-to-green&quot;&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610593&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610593&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;6.2 &quot;Vulnerable Road User (VRU) detection and trajectory recognition&quot;&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;The vehicle, supported by the infrastructure (I2V), recognizes the type of obstacle and predicts the trajectory of Vulnerable Road Users (VRUs) in order to avoid potential collisions.&lt;/p&gt;
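&lt;p&gt;A minimal illustration of trajectory-based collision checking (ours, not the project's algorithm): predict the VRU's path with a constant-velocity model and flag a risk when the predicted minimum separation from the ego vehicle falls below a safety threshold.&lt;/p&gt;

```python
# Constant-velocity VRU prediction sketch (hypothetical helper names).
def min_separation(ego_pos, ego_vel, vru_pos, vru_vel, horizon_s, dt=0.1):
    """Smallest predicted ego-VRU distance over the horizon (metres)."""
    best = float("inf")
    steps = int(horizon_s / dt)
    for k in range(steps + 1):
        t = k * dt
        dx = (ego_pos[0] + ego_vel[0] * t) - (vru_pos[0] + vru_vel[0] * t)
        dy = (ego_pos[1] + ego_vel[1] * t) - (vru_pos[1] + vru_vel[1] * t)
        best = min(best, (dx * dx + dy * dy) ** 0.5)
    return best

# Ego driving along +x at 10 m/s; pedestrian crossing from the side at 1.5 m/s.
gap = min_separation((0, 0), (10, 0), (30, -4), (0, 1.5), horizon_s=4.0)
collision_risk = 2.0 > gap   # hypothetical 2 m safety threshold
```

&lt;p&gt;Real VRU predictors use richer motion models and the infrastructure's (I2V) detections, but the separation check above captures the basic decision.&lt;/p&gt;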
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;6.3 &quot;Driver monitoring and emergency maneuver&quot;&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;The Driver Monitoring System (DMS) detects the driver's cognitive status to decide whether they are still capable of controlling the vehicle or, alternatively, whether they can get back into the control loop in case of a &quot;take-over request&quot; (TOR) from the system.&lt;/p&gt;
&lt;h4&gt;PRYSTINE - 6.3 &quot;Driver monitoring and emergency maneuver&quot;&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610899&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610899&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;7.1 Shared control and arbitration (Level 2-3), studying driver-automation interaction and methods for vehicle authority transition in a Driver-in-the-Loop (DiL) simulator&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;a) Demo shows how the automation assists the driver under two different conditions (driver distraction and possible collision).&lt;/p&gt;
&lt;h4&gt;PRYSTINE - 7.1 Shared control and arbitration (Level 2-3)&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610622&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610622&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;b) Demo on the visualization of the HMI.&lt;/p&gt;
&lt;h4&gt;PRYSTINE - 7.1 Demo on the visualization of the HMI&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/597067678&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/597067678&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;c) Demo on the Driver monitoring system (Driver Activity).&lt;/p&gt;
&lt;h4&gt;PRYSTINE - 7.1 Demo on the Driver monitoring system (Driver Activity)&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/597069383&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/597069383&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;d) Demo on the Driver monitoring system (Drowsiness detection)&lt;/p&gt;
&lt;h4&gt;PRYSTINE - 7.1 Demo on the Driver monitoring system (Drowsiness detection)&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610193&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610193&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h4&gt; &lt;/h4&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;7.1 - 7.3 Demonstrators&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;See an overview of the demonstrators here: &lt;a href=&quot;https://www.youtube.com/watch?v=gfZaopPwoB4&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Watch&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;7.2 Layered Control (Level 2-3-4), studying cooperation between a passenger car and a bus, and the driver's role in supervising or controlling the vehicle when requested&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;a) Demo shows how the automation and the driver operate under different levels of scenario complexity.&lt;/p&gt;
&lt;h4&gt;PRYSTINE - 7.2 Layered Control (Level 2-3-4)&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/580610852&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/580610852&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;b) Demo shows the performance of the Driver monitoring system&lt;/p&gt;
&lt;h4&gt;PRYSTINE - 7.2 Demo shows the performance of the Driver monitoring system&lt;/h4&gt;
&lt;p&gt;&lt;a href=&quot;https://vimeo.com/597075188&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://vimeo.com/597075188&lt;/a&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;h1&gt;7.3 Highly automated vehicle (Level 3-4), studying AI-based decision algorithms for urban and highway scenarios&lt;/h1&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;{pdf=images/Dissemination/73_Highly_automated_vehicle.pdf|90%|500|native}&lt;/p&gt;</content>
		<category term="Disseminaton" />
	</entry>
	<entry>
		<title>PRYSTINE DISSEMINATION MATERIAL</title>
		<link rel="alternate" type="text/html" href="https://prystine.automotive.oth-aw.de/index.php/project/dissemination"/>
		<published>2021-09-28T13:08:41+00:00</published>
		<updated>2021-09-28T13:08:41+00:00</updated>
		<id>https://prystine.automotive.oth-aw.de/index.php/project/dissemination</id>
		<author>
			<name>r.furche</name>
		</author>
		<category term="Disseminaton" />
	</entry>
</feed>
