Digital Airborne Camera: Introduction and Technology


The sensor technologies used in today's automatic ground detection systems are mainly infrared cameras, but in some cases optical cameras may be used. The images are sent to a central control unit via a radio communication system [ 2 , 3 ]. Major fires are detectable from space and their progress can be monitored using satellite technology.


Satellite systems are by far the most developed fire remote sensing tools. Satellites providing information in the infrared spectrum through TIR imaging spectrometers are suitable for fire detection. However, none of them has been used operationally for fire detection, mainly due to insufficient revisit times; the exceptions are MSG-SEVIRI and GOES, which are in geostationary orbit and provide images every 15 minutes, albeit with a coarse spatial resolution on the ground.

It is noteworthy that these satellites can be used in a multi-sensor approach to improve the temporal resolution, and may also be used in a complementary way with other surveillance systems. Apart from the operational satellites, whose primary task is not fire detection, there are plans for certain satellite missions dedicated to fire detection. For instance, FUEGO [4] is planned as a satellite system intended to provide the fire-fighting forces with early fire warning and fire monitoring capabilities over selected regions of the Earth considered to be at high fire risk.

Each satellite will have a payload including a sensor that will acquire information in the middle infrared region of the spectrum, in addition to dedicated thermal sensors designed especially for the detection of fires. Since the temporal resolution is the primary mission requirement and the number of satellites must be kept as low as possible for economic reasons, the orbital configuration is of exceptional importance. If this mission succeeds, it will be the first one specifically dedicated to fire detection.

It has to be stressed, though, that the cost-effectiveness of satellite networks strictly dedicated to fire detection is somewhat questionable.


Another innovative initiative reported is that of DLR (the German Aerospace Agency), which participates in the development of new satellites and sensors for fire detection. BIRD, which was launched in October 2001, incorporates thematic on-board data processing through a neural network classifier, and real-time discrimination between smoke and water clouds. The fixed orbit of satellite platforms imposes a constraint on revisiting capability, both in tactical operations during a crisis, which require frequent and repetitive observations, and in the surveillance of vulnerable areas under high-risk conditions.

Aircraft, manned or unmanned, are manoeuvrable and may very quickly revisit critical areas, providing rapid response in emergency situations. Airborne TIR sensors (usually cameras) can detect new hot spots that develop rapidly into wildland fires. Besides, aircraft equipped with TIR sensors can be used for supporting fire-fighters in safety tasks, and for detecting escape routes or security zones in areas where visibility is restricted due to smoke.

The first research and development activities on infrared fire imaging from aircraft started at the National Interagency Fire Center (NIFC). During the operation of these airborne cameras, the pressing need was realised for an integrated system that would transmit temperature information to the ground in real time. In the following years AIRDAS was further developed to include improved telemetry, rapid geo-referencing algorithms, and multiple spectral band data collection and transmission.

A review of airborne infrared systems is provided by Rodriguez y Silva [3].


This brief review shows that in the last decade there have been advancements in technologies supporting wildland fire management and tactical emergency response [8, 9, 10]. These advancements include: (a) improved capability for remote sensing from towers, aircraft and satellites; (b) Geographic Information Systems and Decision Support Systems [11]; (c) image processing and image geo-referencing; (d) the Global Positioning System (GPS) in combination with Inertial Navigation Systems (INS); and (e) fire behaviour models [12, 13].

In addition, uncooled forward-looking infrared scanners and cameras provide an opportunity for lower-cost fire detection and monitoring systems. The present paper introduces SITHON, a complete system tailored to the unique requirements of the wildland fire-fighting community. Section 2 presents the system in detail, whilst Section 3 demonstrates the results of a test flight. In Section 4 the main conclusions are drawn. The project lasted three years and was initiated in July. The main objectives of the project were to foster collaboration between the private service sector, technological partners and the research community; to demonstrate the synergy of evolving terrestrial and airborne technologies for increasing the information content of the data collected; and to enhance the timeliness of data collection, processing and transmission towards effective detection, monitoring and management of wildfire suppression actions.

It is well known that time is a crucial parameter in fire combating and containment. The level of efficiency depends on the availability of fire-fighting resources and the endurance of the fire crews, as well as on knowledge of the terrain, the fire location and the ways to access it.


For this purpose, the SITHON project applied remote sensing systems of terrestrial and airborne detection sensors for the localization, notification and monitoring of active forest fires. The whole SITHON system comprises a wireless network of in-situ optical cameras and an airborne fire detection system based on a fully digital thermal imaging sensor. The network of sensors is linked to an integrated GIS environment in order to facilitate fire-fighting management and support the decision-making process during a crisis.

The GIS database incorporates the qualitative and quantitative information layers needed to estimate fire risk, such as the vegetation types and quantity of fuel matter, the road network for accessing the active fires, the area's morphology, the sensitive and endangered locations (settlements, camps, folds, archaeological sites, etc.), the high-danger infrastructures (fuel stations, flammable materials, industrial areas, etc.), the natural or artificial water reservoirs, and so on.

The platform of SITHON includes a Crisis Operating Centre, which receives information in the form of images and data from the wireless sensor detection systems, displays it on wide-screen monitors and analyses it to derive the dynamic picture of fire evolution. The integration of the GPS, IMU and camera data, along with a digital elevation model (DEM), allows for real-time orthorectification and geo-positioning of the acquired frames without using ground-surveyed control points.

The ultimate aims of the SITHON system design were to meet operational tactical requirements and to generate products useful for crisis management. Timeliness is of paramount importance for the success of such a system. The system is designed to ensure automatic fire detection. It is mountable on any airborne platform and can be operated within 15 to 20 minutes after the first fire announcement. Clearly, the post-processing of stereo pairs of images is useful only for post-crisis management, allowing precise mapping and assessment of the damage in the impacted zones.

During a crisis, the imaging system is integrated with telemetry equipment ready to transmit the output products to fire-fighting forces in real time. The integration of the camera was simplified due to existing design elements of the plane. For instance, the aircraft was equipped with a gyro-stabilized camera mount, on which the infrared camera body was mounted and fixed. The Thermovision camera used is a lightweight, high-performance, uncooled forward-looking infrared camera.


It uses advanced uncooled Focal Plane Array (FPA) micro-bolometer technology and stores images and data to memory cards. The detector consists of a cavity with 77 microbolometers made of Si elements coated with a thermal resistance layer. The microbolometers are thermal detectors and are operated without cooling. However, besides the thermal radiance of the target, the radiation of internal camera components may be detected by the uncooled detector arrays, and this can decrease the measurement accuracy. To tackle this problem, the inside case temperature is measured at several locations and these measurements are used in the automatic re-calibration of the camera at predefined time intervals.

In addition, in order to avoid any convective heat transfer to the detector elements, the bolometers are placed in a vacuum chamber and their temperature is kept close to the ambient temperature using Peltier elements. Although the response time of bolometers is considerably longer than that of quantum detectors, an image frequency of 60 Hz is achievable with this system. Bolometers are not selective with respect to spectral range; therefore optical filters are used with this camera to optimise the response at a spectral range of 7.

The camera optical system uses lenses with a focal length (f) of 20 mm. The technical characteristics of the Thermovision infrared camera are summarized in Table 1. For this range of temperatures, the discrimination was approximately 1. This was considered effective for the purposes of the SITHON project, as the interest was focused on mapping relative temperature differences between points rather than estimating absolute temperatures. The resulting interior camera orientation, after extensive tests (an example is shown in Figure 2b), is reported in Table 1.

This calibration procedure is repeated annually, prior to the fire season and camera deployment. The black and white box indicates the effective area of measurements. The forward motion of the aircraft during camera operation may result in a blurring effect in the acquired image. The image blur parameter is defined as follows.
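The formula itself does not survive in this text. One standard formulation, consistent with the comparison against the FPA pixel size in the next paragraph, expresses the blur as the distance the image of a ground point moves across the focal plane during the integration time. The sketch below uses that form; every symbol and numeric value in it (ground speed, integration time, pixel pitch) is an illustrative assumption rather than a SITHON specification.

```python
# Hedged sketch: one common way to estimate forward-motion blur on the focal
# plane and compare it with the detector pixel pitch. All numbers are
# illustrative assumptions, not SITHON specifications.

def forward_motion_blur(v_ground, t_int, focal_length, flying_height):
    """Blur on the focal plane (same units as focal_length) caused by the
    aircraft moving v_ground * t_int over the ground during one integration."""
    ground_displacement = v_ground * t_int          # metres moved during exposure
    image_scale = focal_length / flying_height      # image/ground scale factor
    return ground_displacement * image_scale        # displacement on focal plane

# Illustrative values (assumed): 60 m/s cruise, 1/60 s integration time,
# f = 20 mm (quoted in the text), H = 1500 m, 35 um pixel pitch.
blur_m = forward_motion_blur(v_ground=60.0, t_int=1/60, focal_length=0.020,
                             flying_height=1500.0)
pixel_pitch_m = 35e-6
print(f"blur = {blur_m*1e6:.1f} um, pixel pitch = {pixel_pitch_m*1e6:.0f} um,"
      f" blur/pixel = {blur_m/pixel_pitch_m:.2f}")
```

With these assumed values the blur is a fraction of a pixel, which is the kind of result the comparison in the next paragraph reports.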

Table 2 shows the variation of the blur parameter for different flying heights and aircraft velocities during the demonstration of the SITHON airborne system. In all cases the estimated size of the forward motion blur is less than the FPA pixel size. In addition, Table 2 shows the ground surface covered by the SITHON imaging sensor for the combinations of flying altitude and aircraft velocity reported during the system demonstration. As mentioned above, the direct geo-referencing of the acquired images is based on the combination of GPS measurements with inertial measurements of the camera orientation angles (pitch, roll, and yaw).

The inertial measurement unit used (the MIDG II) is a low-cost, lightweight, small-size and low-power package which integrates an internal GPS receiver that collects GPS position and velocity information and passes it to the data fusion processor, where it is combined with the inertial data to generate the state vector.

During operations the device provides, in real time, differential measurements of linear accelerations and rotational rate increments. The actual position, velocity and platform attitude relative to an inertial coordinate system are obtained by integration of these dynamic differential recordings. Gyro and accelerometer data from the IMU allow the accurate determination of both attitude (pitch, roll, and yaw) and position (Xo, Yo, Zo) of the camera projection centre at the time of exposure. To account for inaccuracies caused by device drift during long operations, the MIDG II inertial system uses a GPS receiver, whose data are coupled with the inertial data using a Kalman filter.
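As a rough illustration of this coupling, the sketch below runs a deliberately simplified, one-dimensional Kalman filter in which accelerometer samples drive the prediction step and an occasional GPS position fix corrects the accumulated drift. It is a didactic sketch with assumed noise values; it does not reproduce the multi-state filter actually running inside the MIDG II.

```python
# Highly simplified, one-dimensional illustration of the GPS/INS coupling idea:
# inertial data are integrated in the prediction step and drift is corrected
# whenever a GPS position fix arrives. All noise values are assumptions.
import numpy as np

dt = 0.01                                  # 100 Hz inertial rate (assumed)
F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition for [position, velocity]
B = np.array([0.5 * dt**2, dt])            # how an acceleration sample enters the state
H = np.array([[1.0, 0.0]])                 # GPS observes position only
Q = np.diag([1e-4, 1e-3])                  # process noise (assumed)
R = np.array([[4.0]])                      # GPS position variance ~ (2 m)^2 (assumed)

x = np.zeros(2)                            # initial state [position, velocity]
P = np.eye(2)

def predict(x, P, accel):
    """Propagate the state with one accelerometer sample (strapdown-style)."""
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, gps_pos):
    """Correct the drifting inertial solution with a GPS position fix."""
    y = gps_pos - H @ x                    # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Example: integrate 1 s of accelerometer data, then apply one GPS fix.
for _ in range(100):
    x, P = predict(x, P, accel=0.2)        # assumed constant acceleration sample
x, P = update(x, P, gps_pos=np.array([0.12]))
print("fused position/velocity:", x)
```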

This coupling allows accurate updating of the estimated navigation and positioning data [9]. Significant software developments were made to allow remote control and operation of the camera system by the payload engineer on board. The camera and the IMU are rigidly attached and oriented towards the camera opening at the bottom of the aircraft (Figure 5). The system commands the acquisition of images at the appropriate time intervals, accounting for the aircraft velocity, the flying height and the physical camera frame dimensions, in order to acquire imagery without gaps in terrain coverage and to allow adequate overlapping along the flight axis in case stereoscopic acquisition is required for off-line processing.

This operation is fully automatic and is based on a formula relating the aircraft velocity, the flying height, the frame footprint and the required overlap (a plausible form is sketched after this paragraph). Table 4 illustrates the time intervals of successive image acquisitions that were tested during the demonstration of the SITHON system, for different combinations of flying velocity, flying height and image overlap percentage. During operations the payload engineer is provided with several interfaces showing dynamically (i) the status of the system's components and subcomponents, (ii) the data input to the system (images, GPS and INS data), (iii) the status of communication between the system controller on board and the data server on the ground, and (iv) the output products (geo-referenced image, geographic coordinates of the detected hot spots, the id of the data package processed, etc.).
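The triggering relation referred to above can be sketched with the standard photogrammetric expression linking flying speed, flying height, frame footprint and forward overlap; the along-track sensor dimension and all numeric values below are assumptions for illustration, not SITHON parameters.

```python
# Hedged sketch of the usual relation between flying speed, flying height,
# frame size and forward overlap that governs the camera triggering interval.
# Symbols and numbers are illustrative assumptions, not SITHON values.

def acquisition_interval(v_ground, flying_height, focal_length,
                         sensor_size_along_track, overlap):
    """Seconds between exposures so that consecutive frames overlap by
    'overlap' (0..1) along the flight axis."""
    ground_footprint = sensor_size_along_track * flying_height / focal_length
    baseline = (1.0 - overlap) * ground_footprint   # ground distance between exposures
    return baseline / v_ground

# Example: f = 20 mm (quoted), assumed 320-pixel array at 35 um pitch (11.2 mm),
# H = 1500 m, v = 60 m/s, 60 % forward overlap.
dt = acquisition_interval(v_ground=60.0, flying_height=1500.0,
                          focal_length=0.020, sensor_size_along_track=0.0112,
                          overlap=0.60)
print(f"trigger every {dt:.1f} s")
```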

For every camera exposure the corresponding GPS and IMU data are automatically registered together with the image data to create a new data package. The data package is forwarded to a server PC system (Pentium IV, 2 GHz), installed at the ground station or on board the aircraft depending on the chosen communication architecture. The data package is decoded on the server PC and transformed into useful products before the next camera frame is captured.
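As a concrete, purely hypothetical illustration of such a per-exposure data package, the sketch below pairs one thermal frame with the navigation record captured at the moment of exposure; the field names, types and encoding are invented for illustration and do not describe the actual SITHON packet format.

```python
# Hypothetical sketch of a per-exposure data package: one thermal frame plus
# the GPS/IMU record captured at the moment of exposure. Field names are
# illustrative only; the real SITHON packet layout is not specified here.
from dataclasses import dataclass
import numpy as np

@dataclass
class NavRecord:
    gps_time: float          # seconds of week
    lat: float               # degrees
    lon: float               # degrees
    alt: float               # metres above ellipsoid
    roll: float              # degrees
    pitch: float             # degrees
    yaw: float               # degrees

@dataclass
class DataPackage:
    frame_id: int
    image: np.ndarray        # raw thermal frame (e.g. 320 x 240 counts)
    nav: NavRecord

    def encode(self) -> bytes:
        """Serialize for transmission to the server PC (illustrative only)."""
        header = (f"{self.frame_id},{self.nav.gps_time},{self.nav.lat},"
                  f"{self.nav.lon},{self.nav.alt},{self.nav.roll},"
                  f"{self.nav.pitch},{self.nav.yaw}\n").encode()
        return header + self.image.astype(np.uint16).tobytes()

# Example package with a dummy frame
pkg = DataPackage(frame_id=42,
                  image=np.zeros((240, 320), dtype=np.uint16),
                  nav=NavRecord(345600.0, 40.1, 23.8, 2100.0, 1.2, -0.4, 178.5))
payload = pkg.encode()
print(len(payload), "bytes to send")
```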


The following operations are carried out on the server station in real time. Direct image geo-referencing is based on (a) the GPS data for the positioning of the camera exposure centre, and (b) the inertial measurements returning the camera's orientation angles at the time of exposure.
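A minimal sketch of this direct geo-referencing step is given below. For brevity it intersects the pixel viewing ray with a horizontal plane instead of tracing it through the full DEM, and it assumes a standard omega-phi-kappa rotation convention together with illustrative values for pixel pitch and frame size; none of these choices is taken from the SITHON documentation beyond the 20 mm focal length quoted earlier.

```python
# Minimal sketch of direct geo-referencing: project one image pixel to the
# ground using the camera position from GPS and the attitude from the IMU.
# The terrain is treated as a horizontal plane at z_ground instead of a DEM;
# all numeric values are illustrative assumptions.
import numpy as np

def rotation_opk(omega, phi, kappa):
    """Rotation matrix from camera to mapping frame (angles in radians)."""
    co, so = np.cos(omega), np.sin(omega)
    cp, sp = np.cos(phi), np.sin(phi)
    ck, sk = np.cos(kappa), np.sin(kappa)
    Rx = np.array([[1, 0, 0], [0, co, -so], [0, so, co]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]])
    return Rx @ Ry @ Rz

def pixel_to_ground(row, col, cam_xyz, omega, phi, kappa,
                    f=0.020, pixel_pitch=35e-6, width=320, height=240,
                    z_ground=0.0):
    """Intersect the viewing ray of pixel (row, col) with the plane z = z_ground."""
    # Image-space coordinates of the pixel relative to the principal point
    x_img = (col - width / 2.0) * pixel_pitch
    y_img = (height / 2.0 - row) * pixel_pitch
    ray_cam = np.array([x_img, y_img, -f])          # viewing ray in the camera frame
    ray_map = rotation_opk(omega, phi, kappa) @ ray_cam
    scale = (z_ground - cam_xyz[2]) / ray_map[2]    # stretch the ray down to the ground
    return cam_xyz + scale * ray_map

# Example: camera 1500 m above a ground plane at 0 m, near-nadir attitude.
ground_pt = pixel_to_ground(row=120, col=160,
                            cam_xyz=np.array([500000.0, 4400000.0, 1500.0]),
                            omega=np.radians(1.0), phi=np.radians(-0.5),
                            kappa=np.radians(90.0))
print("ground coordinates (E, N, h):", ground_pt)
```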

The restored and ortho-rectified thermal or video images are projected to the reference cartographic projection system and undergo a number of additional processing operations, including global image thresholding, in order to detect areas with temperatures exceeding a predetermined value (hot spots). The generated temperature and hot-spot map products are then combined with existing background and asset maps of the area under monitoring.
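A minimal sketch of the global thresholding step, assuming the geo-referenced frame arrives as an array of temperature values, is shown below; the threshold value and the use of connected-component labelling to group hot pixels are illustrative choices, not documented SITHON parameters. The resulting pixel centroids would then be converted to map coordinates with the geo-referencing described above.

```python
# Minimal sketch of hot-spot extraction by global thresholding on a
# geo-referenced temperature frame. The threshold and the SciPy labelling
# step are illustrative assumptions, not documented SITHON parameters.
import numpy as np
from scipy import ndimage

def detect_hot_spots(temperature, threshold_c=150.0):
    """Return the centroid (row, col) of every connected region hotter than the threshold."""
    mask = temperature > threshold_c                 # global threshold
    labels, n = ndimage.label(mask)                  # group adjacent hot pixels
    return ndimage.center_of_mass(mask, labels, range(1, n + 1))

# Example: a synthetic 240 x 320 frame with one warm patch
frame = np.full((240, 320), 25.0)
frame[100:105, 200:206] = 300.0                      # simulated flame pixels
for centroid in detect_hot_spots(frame):
    print("hot spot at pixel (row, col):", centroid)
```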


Figure 6a illustrates the set of operations undertaken by the software package that has been developed for sensor control and for processing the raw input data into a set of usable information products. Moreover, Figure 6b summarizes the operations applied on the server PC station, including data package decoding, image thresholding and geo-referencing.

The first experiments were carried out in May. The system in its final configuration was flown over the Sithonia Peninsula of Chalkidiki in May (Figure 7a). The system's sensing and imaging capability was evaluated under different conditions by changing the flight height and aircraft cruising speed, as well as the size and intensity of the fire flames detected. To achieve this, a number of different pre-designed navigation paths, as in Figure 7b, were inserted into the aircraft's navigation control system to ensure that the entire Sithonia Peninsula was covered without gaps in image acquisition.
