
Friday, July 24, 2015

Pix4Dmapper and DJI Phantom Used in Debris Search of Airplane Collision



Late Tuesday morning on July 7th, a U.S. Air Force F-16 fighter jet and a small single-engine Cessna 150 collided mid-air over South Carolina, forcing the jet pilot to eject and killing the two people aboard the Cessna.

The situation was grim. Debris was scattered across an 11 km stretch, most of it underwater in swampy, alligator- and snake-infested waters. Fuel from the crash had leaked or was still burning from the tanks, and outside temperatures were sizzling at nearly 39 °C. Over 150 personnel from at least 20 local, federal and state agencies were called out to locate, document and recover both the debris from the airplanes and the victims' bodies.

Bill Salsbury, Berkeley County Coroner and Incident Commander for the search and recovery, had already dealt with more than six airplane crashes, but never one of this size and scope. He called drone-service company SkyView Aerial Solutions and asked CEO Tom Fernandez and CBDO Tom Lucey to come to the crash site the next day and gather aerial photographs with their drone. The two were happy to volunteer in the recovery effort, and met with several authorities to explain that not only could they help document debris by taking photographs, but they could also map the area using the image-processing software Pix4Dmapper.

“The expectation was that we were going to go 100-200 feet in the air and take pictures, so they could zoom in on them,” said Fernandez. “We explained that with the drone and Pix4Dmapper, we could map the entire area they wanted, but at a lower altitude and at higher resolution.”


Debris Highlights
Fernandez and Lucey first mapped a former rice paddy with 1.8 meters of standing water, where most of the large debris from the Cessna had fallen. The two flew their DJI Phantom 2 Vision+ at 30 meters, using the Pix4Dmapper Capture App on their Android device to handle the flight plan and image acquisition. In approximately six minutes the flight was complete, at which point the 87 acquired images were processed in the cloud with Pix4D technology. Orthomosaic rendering and full-resolution processing took only one hour.
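For a sense of what such a low flight buys in image detail, the ground sampling distance (GSD) can be estimated from the flying height and the camera geometry. The sketch below is illustrative only: the 30-meter altitude comes from the flight described above, but the sensor width, focal length and image width are assumed placeholder values, not specifications taken from this article.

```python
def ground_sampling_distance(sensor_width_mm, focal_length_mm,
                             image_width_px, altitude_m):
    """Estimate the ground sampling distance in metres per pixel.

    GSD = (sensor width * altitude) / (focal length * image width)
    """
    return (sensor_width_mm / 1000.0) * altitude_m / (
        (focal_length_mm / 1000.0) * image_width_px)


# Assumed, illustrative camera values: a small 1/2.3" sensor (~6.2 mm wide),
# ~5 mm focal length and 4384-pixel-wide images, flown at the 30 m altitude
# mentioned above.
gsd_m = ground_sampling_distance(sensor_width_mm=6.2, focal_length_mm=5.0,
                                 image_width_px=4384, altitude_m=30.0)
print(f"Approximate ground resolution: {gsd_m * 100:.1f} cm per pixel")
```

A lower altitude or a longer focal length shrinks the GSD, which is why a 30-meter flight can resolve details that a higher pass, or a helicopter, would miss.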

The emergency responders had a helicopter on site but were unable to use it for visual identification of debris, because the downwash from the rotor blades rippled the water surface.

“When we finished mapping the rice paddy, we were approached by the dive team,” said Fernandez. “They knew that we would be able to provide them with information before anyone else.”

Fernandez and Lucey took a screenshot of the resulting orthomosaic in their laptop's browser, then put the .jpg and the original geotagged photos onto a thumb drive for the North Charleston Police Department Dive Team, who viewed the information on their own laptop on site.

With a ground resolution of less than 2 cm per pixel, close examination of the orthomosaic revealed many pieces of wreckage and other items submerged at the bottom of the paddy but visible in the map. The dive team took the data with them on their airboat, but even then had difficulty seeing debris on the bottom because of the silt kicked up by the boat. Fortunately, by using the coordinates on the orthomosaic they were still able to recover debris as small as half a meter despite the low visibility. The divers faced one more problem, said Lucey:

“While they went to each coordinate, there were officers on shore with binoculars, calling out to warn the divers of alligators!”

After the first successful mapping, authorities requested more images and orthomosaics, and Fernandez and Lucey teamed with the National Transportation Safety Board (NTSB) to map areas farther away, beneath a forest canopy. Traditionally in a case like this, the NTSB documents all of the debris by hand, using an iPhone, pen, and pad of paper. Each piece must be geotagged and logged, so staff take a picture of the item, find the geolocation using Google Maps, and record it on paper. For the forest mapping, the Phantom was particularly useful because it could spot debris resting on the treetops, something traditional documentation methods could not capture except from a helicopter.
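Much of that manual logging revolves around geotags, which drone photos already carry in their EXIF metadata. The short sketch below shows one common way to read those coordinates programmatically with the Pillow library; the file name is hypothetical, and the snippet is a general illustration rather than part of SkyView's or the NTSB's actual workflow.

```python
from PIL import Image
from PIL.ExifTags import TAGS, GPSTAGS


def read_geotag(path):
    """Return (latitude, longitude) in decimal degrees from a JPEG's EXIF
    GPS block, or None if the image has no geotag."""
    exif = Image.open(path)._getexif()  # EXIF tags keyed by numeric ID
    if not exif:
        return None

    gps = None
    for tag_id, value in exif.items():
        if TAGS.get(tag_id) == "GPSInfo":
            gps = {GPSTAGS.get(k, k): v for k, v in value.items()}
    if not gps or "GPSLatitude" not in gps:
        return None

    def to_degrees(dms, ref):
        # dms is (degrees, minutes, seconds); assumes a recent Pillow where
        # EXIF rationals convert with float(). Flip the sign for S and W.
        deg = float(dms[0]) + float(dms[1]) / 60.0 + float(dms[2]) / 3600.0
        return -deg if ref in ("S", "W") else deg

    lat = to_degrees(gps["GPSLatitude"], gps.get("GPSLatitudeRef", "N"))
    lon = to_degrees(gps["GPSLongitude"], gps.get("GPSLongitudeRef", "E"))
    return lat, lon


# Hypothetical file name for illustration:
# print(read_geotag("DJI_0001.JPG"))
```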

By the end of the day, seven flights had been conducted over three areas to assist emergency workers with their arduous task. In total, the recovery of the debris and bodies took more than three days, and estimates from various authorities claim the maps and images saved approximately three additional days of work.

The mapping was not planned in advance as part of the emergency workflow, but emerged as a practical and innovative solution to a crash of unexpected scale. It also raises the question of similar applications in both emergency response and a variety of other industries. At the end of the ordeal, workers were simply happy to have a tool that enabled them to do their jobs more efficiently.

“It’s so enlightening to have a tool that can make their job more efficient,” said Lucey, “Not only for the emergency responders but also for the families of the victims.”

Contacts:
Krista Montgomery


Wednesday, July 15, 2015

How 3 technologies equal 1 huge step forward for Remote Sensing



By Bill McNeil


Over the last several decades, great strides have been made with respect to camera and lens combinations, sensor technology and image processing software, yet, until recently, remote sensing has remained expensive and complex. Thanks to the confluence of three different technologies — inexpensive unmanned aerial vehicles, lightweight action cameras and powerful new photomosaic software — the process has gotten less expensive and easier to use.

Solo UAVs

For years, much of the data collected for remote sensing has been gathered from cameras or other sensing devices carried by expensive manned aircraft. Now, new drones from the likes of 3D Robotics, DJI and others can perform many of the same tasks at less than a third of the cost.


3DR’s Solo with a GoPro Camera

The recently introduced Solo from 3DR is a good example of how the industry is taking advantage of this new technology. Solo is a photographic platform, with the actual photography accomplished through an attached gimbal and a GoPro camera. This pairing enables plug-and-play live streaming of HD video from the GoPro directly to the 3DR mobile app.

One of the many features of Solo is the ability to fly autonomously, or "hands-free". This is invaluable for remote sensing: once an area to be mapped is selected, Solo computes the flight path, and while in flight, onboard software automatically communicates with the GoPro camera, capturing and geotagging photos. The combination of HD streaming and autonomous flight means data processing for remote sensing can begin while Solo is still airborne.


An autonomous flight plan created by a 3D Robotics UAV
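The flight planning itself is handled by 3DR's software, but the basic idea behind a survey plan like the one pictured, covering the selected area with evenly spaced parallel passes whose image strips overlap, can be sketched roughly as follows. The rectangle size, image footprint and overlap figures are assumed for illustration.

```python
def lawnmower_waypoints(width_m, height_m, footprint_width_m, side_overlap=0.7):
    """Generate a back-and-forth ("lawnmower") survey pattern over a
    width_m x height_m rectangle, spacing the flight lines so that
    neighbouring image strips overlap by `side_overlap`."""
    spacing = footprint_width_m * (1.0 - side_overlap)  # distance between lines
    waypoints, x, northbound = [], 0.0, True
    while x <= width_m:
        y_start, y_end = (0.0, height_m) if northbound else (height_m, 0.0)
        waypoints.append((x, y_start))   # start of this flight line
        waypoints.append((x, y_end))     # end of this flight line
        x += spacing
        northbound = not northbound      # reverse direction for the next pass
    return waypoints


# Illustrative numbers: a 120 m x 80 m field, a ~40 m image footprint and
# 70 % side overlap give one flight line every 12 m.
for waypoint in lawnmower_waypoints(120.0, 80.0, footprint_width_m=40.0):
    print(waypoint)
```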

Additional Solo features include the ability to automatically circle a specific location while keeping the area framed to render a wraparound shot, a Follow mode and straight-line autonomous flight.


The Solo gimbal with a GoPro Hero4 camera

GoPro cameras

GoPro is a U.S. corporation that develops, manufactures and sells small, high-definition, battery-powered, lightweight, portable cameras often used to create action videos. Their waterproof cameras can take 12 MP pictures at 30 frames per second and record slow-motion video at 240 frames per second.

Although GoPros are extremely popular, they’ve had limited use in remote sensing applications because it wasn’t possible to access the camera controls remotely. In other words, you couldn’t take individual shots, start and stop recording, or change the settings during flight. Another issue was that the wide-angle view, created by the fisheye lens, distorted images and made them difficult to use for mapping or terrestrial 3D modeling.

GoPro’s latest cameras, the Hero3 and Hero4, now provide access to the controls via Wi-Fi and Bluetooth. Their Smart Remote technology also allows the control of multiple GoPro cameras from distances of up to 600 feet. Solo, through its attached gimbal, is one of the first platforms to take advantage of these controls. Without this capability it would be difficult for Solo, or any UAV, to collect useful data for remote sensing applications.

Pix4D processes terrestrial and aerial imagery

Despite the advances 3DR and GoPro bring to the industry, there is still one more component: data processing software. The images captured during an automated flight need to be processed into a photomosaic map. Think of it as a huge jigsaw puzzle with many pieces that don’t exactly fit. They don’t fit because some images are taken at a slight oblique angle, vary in color (often because of cloud shadows), suffer from lens distortion, or some or all of the above.
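Photogrammetric mapping goes well beyond simple panorama stitching, but the basic "jigsaw" step of matching and aligning overlapping photos can be illustrated with OpenCV's high-level stitcher. The file names below are placeholders, and this is a rough sketch of the general idea, not the software discussed in this article.

```python
import cv2

# Placeholder file names: any set of overlapping aerial photos will do.
paths = ["IMG_0001.JPG", "IMG_0002.JPG", "IMG_0003.JPG"]
images = [cv2.imread(p) for p in paths]

# The stitcher finds matching features between overlapping images, estimates
# the transforms that relate them, and blends them into a single mosaic.
stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)  # SCANS mode suits flat, nadir imagery
status, mosaic = stitcher.stitch(images)

if status == cv2.Stitcher_OK:
    cv2.imwrite("mosaic.jpg", mosaic)
else:
    print("Stitching failed with status", status)
```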

Pix4D, based in Lausanne, Switzerland, has the problem figured out. Their Pix4Dmapper software is an elegant solution that automatically processes terrestrial and aerial imagery acquired by GoPro cameras. Not only can the application remove fisheye lens distortion, it can also process input from any lens or angle, and optionally include geotags and ground control points.
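Pix4Dmapper handles the fisheye correction internally as part of its own camera model. For readers who want to see the general idea, remapping each image through a calibrated fisheye model so that straight lines stay straight, a minimal sketch with OpenCV is shown below. The camera matrix K and distortion coefficients D are placeholder values that would normally come from calibrating the actual camera; this is not Pix4D's implementation.

```python
import cv2
import numpy as np

# Placeholder intrinsics: in practice these come from calibrating the GoPro.
K = np.array([[1800.0,    0.0, 2000.0],
              [   0.0, 1800.0, 1500.0],
              [   0.0,    0.0,    1.0]])
D = np.array([0.05, -0.02, 0.01, -0.005])  # fisheye distortion coefficients k1..k4

distorted = cv2.imread("gopro_frame.jpg")  # hypothetical input image

# Remap the image through the fisheye model; the result is a rectilinear image
# that mapping and 3D reconstruction pipelines can work with.
undistorted = cv2.fisheye.undistortImage(distorted, K, D, Knew=K)
cv2.imwrite("gopro_frame_rectilinear.jpg", undistorted)
```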

GoPro data processed by Pix4Dmapper can be used to measure extracted volumes, generate contour lines and build precise maps and models for construction, cultural heritage or archaeology sites. The 3D images below show a fountain created from GoPro data processed by Pix4Dmapper.
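Volume measurement of this kind ultimately reduces to summing, cell by cell, how far a surface model rises above a chosen base plane. The sketch below assumes a tiny, made-up elevation grid just to show the arithmetic; a real project would use the dense surface model exported from the photogrammetry software.

```python
import numpy as np


def volume_above_base(dsm, base_elevation_m, cell_size_m):
    """Cut-style volume: total height of every surface-model cell above a
    flat base plane, multiplied by the ground area of one cell."""
    heights = np.clip(dsm - base_elevation_m, 0.0, None)  # ignore cells below the base
    return heights.sum() * cell_size_m ** 2


# Made-up example: a 4 x 4 grid of elevations (metres) on a 0.5 m raster.
dsm = np.array([[10.0, 10.2, 10.1, 10.0],
                [10.1, 11.0, 11.2, 10.1],
                [10.0, 11.3, 11.1, 10.0],
                [10.0, 10.1, 10.0, 10.0]])
volume = volume_above_base(dsm, base_elevation_m=10.0, cell_size_m=0.5)
print(f"Volume above the base plane: {volume:.2f} cubic metres")
```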



Together, 3DR’s Solo, GoPro cameras and Pix4Dmapper are making remote sensing easier and less costly, and as a result, more available for widespread use.