Popis: |
Cooperative, Connected and Automated Mobility will enable the close coordination of actions between vehicles, road users and traffic infrastructure, resulting in profound socioeconomic impacts. In this context, the location and yaw angle of vehicles are considered vital for safe, secure and efficient driving. Motivated by this fact, we formulate a multimodal sensor fusion problem that provides more accurate localization and yaw information than the original sources. Simultaneously estimating the location and yaw parameters of vehicles can be treated as a task of cooperative odometry, or awareness. To this end, the problem formulation considers V2V communication as well as multimodal self- and inter-vehicular measurements from various sensors. The solution strategy is based on the maximum likelihood criterion together with a novel alternating gradient descent approach. To simulate realistic traffic conditions, the CARLA autonomous driving simulator has been used. A detailed evaluation study has shown that each vehicle, relying only on its neighborhood, is able to accurately re-estimate both its own and its neighbors' states (comprising locations and yaws), effectively realising the vision of 360° awareness.