Head-worn displays (HWDs) and aircraft-mounted sensors are common means of supporting helicopter pilots who operate in degraded visual environments. The use of see-through HWDs is beneficial in brownout and adverse weather conditions because these displays can visualize occluded real-world features such as the horizon, nearby obstacles, or the desired landing spot. The German Aerospace Center (DLR) investigates an enhanced vision concept called “Virtual Cockpit”. Instead of a see-through display, an immersive HWD is used to give helicopter pilots an enhanced out-the-window view. As shown in previous publications, virtual reality (VR) technology creates benefits for several applications. This contribution explores the advantages and limitations of displaying an exocentric perspective view on the VR glasses. Moving the pilot’s eye point out of the cockpit to a viewpoint behind and above the aircraft appears to be especially useful in situations where the pilot’s natural view is degraded by the aircraft’s own structure. Moreover, it is beneficial for certain maneuvers in which the real location of the pilot’s eye is not optimal for capturing the whole situation. The paper presents results from a simulator study with 8 participants, in which the developed symbology was tested in confined-area hover and landing scenarios. The 3D exocentric perspective views increased spatial awareness in the tested scenarios and significantly reduced the required head motion. Further research is needed regarding attitude awareness with such displays. Apart from helicopter operations, the results may also be relevant for remote piloting solutions and for other types of vehicles with restricted external vision.
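The exocentric viewpoint described above can be illustrated with a short sketch. The following Python snippet is not from the study; it is a minimal illustration, assuming a simple north/east/up world frame and arbitrary offset values, of how an eye point "behind and above" the ownship could be derived from its position and heading.

```python
import math

def exocentric_eye_point(ac_pos, heading_deg, back=30.0, up=15.0):
    """Place a virtual eye point behind and above the aircraft.

    ac_pos: (north, east, up) position of the aircraft in a flat
    world frame (an assumption; the paper does not specify its
    coordinate convention). back/up are illustrative offsets in meters.
    """
    h = math.radians(heading_deg)
    # Unit vector along the aircraft heading in the north/east plane.
    forward = (math.cos(h), math.sin(h))
    n, e, u = ac_pos
    # Step 'back' meters opposite the heading and 'up' meters above.
    eye = (n - back * forward[0], e - back * forward[1], u + up)
    # The camera would look from 'eye' toward ac_pos.
    return eye
```

In a real implementation this eye point would feed the view matrix of the renderer each frame, so the exocentric camera follows the aircraft as it moves.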
Increasing the pilot’s situational awareness is a major goal in the design of next-generation aircraft cockpits. A fundamental problem is posed by the pilot’s out-the-window view, which is often degraded by adverse weather, darkness, or the aircraft structure itself. A common approach to this problem is to generate an enhanced model of the surroundings via aircraft-mounted sensors and databases containing terrain and obstacle information. In the helicopter domain, the resulting picture of the environment is then presented to the pilot either on a panel-mounted display or on a see-through head-worn display. We investigate a third method of information display. The concept, called Virtual Cockpit, applies a non-see-through head-worn display. With such a virtual reality display, the advantages of established synthetic and enhanced vision systems can be combined while existing limitations can be overcome. In addition to a theoretical discussion of advantages and drawbacks, two practical implementation examples of this concept are shown for helicopter offshore operations. Two human factors studies were conducted in a simulation environment based on the game engine Unity. They demonstrate the general potential of the Virtual Cockpit to become a candidate for a future cockpit in the long term.
This paper introduces a display concept for helicopter obstacle awareness and warning systems. The key feature of the concept is the integration of a 360-degree coplanar orthogonal top view into the egocentric perspective of a helmet-mounted see-through display. The concept intends to provide obstacle awareness while pilots are looking outside. It should further improve the situational and spatial awareness as well as the workload of helicopter pilots operating in challenging surroundings. The display concept is applied to two helicopter offshore operations and their specific obstacle situations. The first is a hoist operation at the lower access point of an offshore wind turbine; the second concerns an offshore platform landing operation. The paper depicts the two use cases and related work concerning obstacle awareness and warning systems, recapitulates situational awareness, and compares the properties of orthogonal coplanar representations with those of perspective representations. Thereafter, the two main aspects of the developed HMI concept are presented: first, the combination of the exocentric orthogonal coplanar top view with the egocentric perspective view, and second, three ways of integrating the top view inside the helmet-mounted display. The implemented HMI design represents work in progress, aiming toward an optimal, holistic, and balanced display concept for helicopter obstacle awareness and warning systems.
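At its core, a heading-up orthogonal top view is a translation and rotation of world coordinates relative to the ownship. The sketch below is not taken from the paper; it assumes a flat north/east world frame and an illustrative function name, and shows how an obstacle position could be mapped into heading-up display coordinates.

```python
import math

def to_top_view(own_pos, heading_deg, obstacle_pos, scale=1.0):
    """Project a world point into a heading-up orthogonal top view.

    own_pos / obstacle_pos: (north, east) in a flat world frame
    (an assumption). Returns screen-like (x, y): x is right of the
    ownship symbol, y is up, with the ownship heading pointing up.
    """
    dn = obstacle_pos[0] - own_pos[0]
    de = obstacle_pos[1] - own_pos[1]
    h = math.radians(heading_deg)
    # Rotate the relative vector so the heading points to the top
    # of the display; scale converts meters to display units.
    x = (de * math.cos(h) - dn * math.sin(h)) * scale
    y = (dn * math.cos(h) + de * math.sin(h)) * scale
    return x, y
```

With this mapping, an obstacle directly ahead of the ownship always appears at the top of the top view regardless of the current heading, which is the property that lets the orthogonal view coexist with the egocentric perspective view.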
Helicopters play an important role during the construction and operation of offshore wind farms. Most of the time, helicopter offshore operations are conducted over open water and often in degraded visual environments. Such scenarios provide very few usable visual cues for the crew to safely pilot the aircraft. For instance, no landmarks exist for navigation, and orientation is hindered by weather phenomena that reduce visibility and obscure the horizon. To overcome this problem, we are developing an external vision system which uses a non-see-through, head-worn display (HWD) to show fused sensor and database information about the surroundings. This paper focuses on one aspect of our system: the computer-generated representation of relevant visual cues of the water surface. Our motivation is to develop a synthetic view of the surroundings that is superior to the real out-the-window view. The moving water surface does not provide fixed references for orientation and sometimes even produces wrong motion cues. Thus, we replace it with a more valuable, computer-generated clear view. Since pilots estimate wind direction and speed by checking the movement characteristics of the water surface, our synthetic display also integrates this information. This paper presents several options for a synthetic vision display supporting offshore operations. Further, it comprises results from simulator trials in which helicopter pilots performed final approaches and landings on an offshore platform supported by our display. The results will contribute to the advancement of our HWD-based virtual cockpit concept. Additionally, our findings may be relevant to conventional, head-down synthetic vision displays visualizing offshore environments.
In recent years, the number of offshore wind farms has been increasing rapidly. Coastal European countries in particular are building numerous offshore wind turbines in the Baltic Sea, the North Sea, and the Irish Sea. During both construction and operation of these wind farms, many specially equipped helicopters are on duty. Due to their flexibility, their hover capability, and their higher speed compared to ships, these aircraft perform important tasks like helicopter emergency medical services (HEMS) as well as passenger and freight transfer flights. The missions often include specific challenges like platform landings or hoist operations to drop off workers onto wind turbines. However, adverse weather conditions frequently limit helicopter offshore operations. In such scenarios, the application of aircraft-mounted sensors and obstacle databases together with helmet-mounted displays (HMDs) seems to offer great potential to improve the operational capabilities of the helicopters used. By displaying environmental information in a visually conformal manner, these systems mitigate the loss of visual reference to the surroundings. This helps the pilots maintain proper situational awareness. This paper analyzes the specific challenges of helicopter offshore operations in wind farms by means of an online survey and a structured interview with pilots and operators. Further, the work presents how our previously introduced concept of an HMD-based virtual flight deck could enhance helicopter offshore missions. The advantages of this system – for instance its “see-through the airframe” capability and its highly flexible cockpit setup – enable us to design entirely novel pilot assistance systems. The gained knowledge will be used to develop a virtual cockpit that is tailor-made for helicopter offshore maneuvers.
Synthetic vision systems (SVS) are an emerging technology in the avionics domain. Several studies demonstrate enhanced situational awareness when using synthetic vision. Since the introduction of synthetic vision, a steady evolution of the primary flight display (PFD) and the navigation display (ND) has taken place. The main improvements of the ND comprise the representation of enhanced ground proximity warning system (EGPWS), weather radar, and TCAS information. Synthetic vision seems to offer high potential to further enhance cockpit display systems. In particular, given the current trend of providing a 3D perspective view in an SVS-PFD while leaving the navigational content as well as the methods of interaction unchanged, the question arises whether and how the gap between both displays might evolve into a serious problem. This issue becomes important in relation to the transition between and combination of strategic and tactical flight guidance. Hence, the pros and cons of 2D and 3D views in general, as well as the gap between the egocentric perspective 3D view of the PFD and the exocentric 2D top and side views of the ND, are discussed. Further, a concept for the integration of a 3D perspective view, i.e., a bird’s-eye view, into a synthetic vision ND is presented. The combination of 2D and 3D views in the ND enables a better correlation between the ND and the PFD. Additionally, it supports the building of the pilot’s mental model. The authors believe it will improve situational and spatial awareness and may further raise the safety margin when operating in mountainous areas.
ARINC 661 – Cockpit Display System Interfaces to User Systems – is an evolving standard for the next generation of avionic systems. The standard defines a client-server architecture including user applications which control layers in certain windows of a cockpit display system (CDS). When creating avionic displays for a CDS, one can use a predefined set of simple to complex widgets. These are very useful when designing and implementing avionic displays. However, a proper widget and concept enabling synthetic vision, e.g. for terrain visualization, is not provided by the standard. Because synthetic vision systems are becoming more and more popular, there is a need to enable synthetic vision with ARINC 661. This contribution deals with the question of how synthetic vision (SV) might be realized, e.g. in an ARINC 661 compliant primary flight display (PFD) or navigation display (ND). Hence, different approaches for the implementation of SV are discussed. One approach was implemented to perform a feasibility study. A first study was done using the open-source software project j661 managed by Dassault Aviation. Secondly, another implementation of an SV PFD was done using the SCADE ARINC 661 tools provided by Esterel Technologies. X-Plane was used as the terrain rendering application. The paper gives some rough figures on the programmed SV PFD and presents the results of the feasibility study.
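One way to realize the external-terrain-renderer approach described above is to composite the rendered terrain frame behind the ARINC 661 symbology layer. The following Python sketch is purely illustrative: the pixel-buffer representation and function name are assumptions, not part of the standard or of the described implementations, and a real CDS would do this on the GPU.

```python
def composite(terrain_px, symbology_px):
    """Alpha-blend a symbology layer over an externally rendered
    terrain frame (e.g. one produced by X-Plane in the study's setup).

    Both inputs are flat lists of (r, g, b, a) tuples with 0..255
    channels; this naive per-pixel loop only illustrates the layering
    idea, not a deployable implementation.
    """
    out = []
    for (tr, tg, tb, _), (sr, sg, sb, sa) in zip(terrain_px, symbology_px):
        a = sa / 255.0
        # Standard "over" blend: symbology in front, terrain behind.
        out.append((round(sr * a + tr * (1 - a)),
                    round(sg * a + tg * (1 - a)),
                    round(sb * a + tb * (1 - a)),
                    255))
    return out
```

The design point this captures is that the terrain image behaves as an opaque background layer, while all standard-conformant widgets remain ordinary ARINC 661 layers drawn on top of it.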