Popis: |
INTRODUCTION

A core aspect of Global Navigation Satellite Systems (GNSSs) is the time scale each uses to operate. Since the systems use independent time scales, inter-system time offsets are among the most significant biases to be accounted for in a multi-constellation solution and in the framework of interoperability. In [1], a performance analysis considering GPS, Galileo, GLONASS and BeiDou shows inter-system time offsets on the order of 10 to 100 ns. While a multi-system solution enables more satellites in view and possibly a better Geometric Dilution of Precision (GDOP), it must be taken into account that each additional constellation introduces an additional bias. So, while a single-constellation solution involves four unknowns (the user's three spatial coordinates and the receiver time offset), a multi-system solution exploiting measurements from N_GNSS constellations involves 4 + (N_GNSS - 1) unknowns, where the additional N_GNSS - 1 unknowns are the inter-system time offsets to be estimated. This means that, to improve on a single-system solution, at least two satellites from each additional constellation must be in view. In general, on-Earth users have enough satellites in view to improve GDOP through a multi-GNSS solution. However, this is not always true when the user is in a low-visibility environment. In those cases, a multi-GNSS solution would ideally be beneficial, providing more satellites in view; on the other hand, the inter-system time biases may constitute the bottleneck and actually make the solution unavailable. Different approaches have been proposed to overcome this issue. The ICG-IGS Joint Trial Project (IGS-IGMA), led by the International Committee on GNSS (ICG) and the International GNSS Service (IGS), includes among its long-term objectives to "make all performance standard entries for each GNSS openly available" and to "provide a multi-GNSS service performance standard" [2].
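The unknown-counting argument above can be sketched as a short check. This is an illustrative snippet, not code from the paper; the function names are hypothetical, and it only counts measurements against unknowns without modeling geometry.

```python
# Illustrative sketch of the unknown count in a multi-GNSS least-squares
# solution: 3 position coordinates + 1 receiver clock offset
# + (N_GNSS - 1) inter-system time offsets.

def num_unknowns(n_constellations: int) -> int:
    """Number of unknowns when n_constellations contribute measurements."""
    return 4 + (n_constellations - 1)

def is_solvable(sats_per_constellation: list) -> bool:
    """A solution needs at least as many measurements as unknowns.
    Constellations contributing zero satellites add no unknown."""
    used = [n for n in sats_per_constellation if n > 0]
    return sum(used) >= num_unknowns(len(used))

# Single constellation: 4 satellites suffice.
print(is_solvable([4]))     # True
# A second constellation with one satellite adds one measurement
# but also one unknown, so it does not help:
print(is_solvable([3, 1]))  # 4 measurements vs 5 unknowns -> False
print(is_solvable([3, 2]))  # 5 measurements vs 5 unknowns -> True
```

This illustrates why at least two satellites from each additional constellation are required before that constellation contributes to the solution.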
The IGS Multi-GNSS Experiment (MGEX) [3-5] has among its objectives to provide multi-GNSS products, exploit the IGS monitoring station network, estimate biases, and provide standards. In [6], different methods for the estimation of the inter-system biases are evaluated; the measurement model is constrained by assuming the inter-system offset constant over short time intervals, enabling a solution with only four satellites from mixed constellations. Another possible approach is to provide the users with inter-system time-offset estimates. [7] describes the implementation of the GPS to Galileo Time Offset (GGTO), which is currently broadcast as part of the Galileo message with an accuracy of 20 ns (95%, initial service target) [8]. However, as analyzed in [6], [9] and detailed in [10], different receivers have different impacts on the inter-system bias, which is on the order of 20 ns and therefore comparable with the GGTO itself [11]. This means that, to exploit the broadcast estimate, inter-system biases due to the receiver must be calibrated or bounded. Discussion of this still-open topic, and different possible approaches to address the receiver biases, are presented, for example, in [8], [12], [13]. However, some test results show that in poor visibility conditions some users may benefit from using the broadcast value of GGTO even in the presence of the inter-system bias due to receiver effects [14]. While some users with limited satellite visibility may be able to estimate the inter-system bias and retain that estimate for the epochs when visibility is poorer, other users may have visibility so limited that this kind of approach is impractical, for instance users in the high-altitude Space Service Volume (SSV), such as GEO and HEO satellites. Such users would likely benefit greatly from interoperable GNSS.
Given the increasing number of applications related to the high-altitude SSV, there is growing interest in providing those SSV users with a PVT solution from GNSS [15-20]. Different approaches have been evaluated, including the opportunity of exploiting GNSS side-lobe signals [21], given that some missions, for instance [22-23], demonstrated navigation performance in the high-altitude SSV exploiting the GNSS side lobes that greatly exceeded the expected performance [24]. As detailed in [24], these results are due to a combination of factors, including that the actual transmitted GPS power exceeds the specified levels, although to different degrees for different satellite blocks, in particular [22], and that receiver technology now allows very weak signals to be tracked. However, transmissions from the antenna side lobes are entirely excluded from performance specifications, in terms of both power and errors. Therefore, an analysis of side-lobe measurements was conducted, detailed in [24]. As stated in [25], GPS is a critical infrastructure for space navigation, on which space users rely; however, space users are vulnerable to design changes if the service provider does not specify requirements on that performance. Following these guidelines, the Interface Specification document [26] specifies, for GPS Block III, the SSV user-received signal levels; however, only the signals' main lobes are considered. The analysis here has been conducted considering only the main lobes of the GNSS signals, using the minimum performance in terms of main-lobe beamwidth and minimum radiated transmit power as specified in the performance standard documents provided by the GNSS service providers and summarized in [27]. Different analyses have been performed to evaluate the availability of GNSS to those users [17], [28].
In this paper, an analysis is presented that considers not only the availability in terms of number of satellites in view given a desired received power, as in [28], but also the geometry and the resulting GDOP that a user in the SSV would experience with or without the provision of inter-system time-offset estimates. The user receiver calibration is not further detailed here; for this analysis, it is assumed that the user's receiver has been calibrated and has a residual bias small enough to satisfy the user's requirements. This analysis has been performed in the framework of the Bobcat-1 project at Ohio University, to analyze a possible application of inter-constellation time-offset estimates, which is one of the objectives of the Bobcat-1 project. Bobcat-1 is the first CubeSat being developed in the Avionics Engineering Center (AEC) at Ohio University, Electrical Engineering and Computer Science (EECS) department, in Athens, Ohio; Bobcat-1 has been selected for launch through the NASA CubeSat Launch Initiative (CLI) and is expected to be launched in the third quarter of 2020. Figure 1 shows the CubeSat under development at Ohio University. The details of the CubeSat and the mission development are not the focus of this paper. The primary objectives of Bobcat-1 are educational on one side, providing Ohio University graduate and undergraduate students with hands-on experience on a spacecraft, and scientific on the other. The primary experiment to be carried out by Bobcat-1 is a feasibility and performance study of inter-constellation time-offset estimation from Low Earth Orbit (LEO). Given the growing applications of CubeSat technology, interest in LEO measurements is increasing and different studies have been conducted, for instance [29].
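The effect that providing inter-system time-offset estimates has on GDOP can be sketched as follows. This is a minimal illustrative example, not the paper's analysis code: the satellite geometry is invented, and the function names are hypothetical. It builds the standard least-squares design matrix from unit line-of-sight vectors and, when the offsets are not provided, appends one bias column per constellation beyond the first.

```python
import numpy as np

def gdop(los_vectors, constellation_ids, offsets_provided):
    """GDOP = sqrt(trace((H^T H)^-1)) for a design matrix H built from
    unit line-of-sight vectors (one row per satellite) plus clock columns.
    If inter-system offsets are not provided, each constellation beyond
    the first adds one inter-system bias column."""
    los = np.asarray(los_vectors, dtype=float)
    n = los.shape[0]
    cols = [los, np.ones((n, 1))]              # position + common clock
    if not offsets_provided:
        consts = sorted(set(constellation_ids))
        for c in consts[1:]:                   # first constellation = reference
            cols.append(np.array([[1.0 if cid == c else 0.0]
                                  for cid in constellation_ids]))
    H = np.hstack(cols)
    return float(np.sqrt(np.trace(np.linalg.inv(H.T @ H))))

# Invented geometry: 3 GPS ("G") and 2 Galileo ("E") satellites.
los = [[1, 0, 0], [0, 1, 0], [0, 0, 1],
       [0.577, 0.577, 0.577], [-0.577, 0.577, 0.577]]
ids = ["G", "G", "G", "E", "E"]
g_with = gdop(los, ids, offsets_provided=True)
g_without = gdop(los, ids, offsets_provided=False)
```

With the offset provided, the same five satellites support a four-unknown solution; without it, a fifth unknown must be estimated and the GDOP degrades, which is the trade-off examined in this paper.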
After the analysis of high-altitude SSV performance when estimates of inter-system time offsets are provided, this paper presents a discussion of the time-offset estimation method, outlining methodology, challenges and calibration techniques.