After photographing the remains of the Titanic, standing alone in King Tut’s tomb and suction-cupping cameras to humpback whales, Corey Jaskolski is convinced he scored the “coolest job in the world.”
Jaskolski, an engineer, National Geographic Fellow and founder of Virtual Wonders, opened Week Three, “A Planet in Balance: A Week in Partnership with National Geographic Society,” with his lecture Monday morning in the Amphitheater, which started around 11:15 a.m. due to technical difficulties.
“I get to build tools to help people see the world in a different light,” Jaskolski said. “I get to explore the world and help conservationists protect species and places, but I also get to deploy (those tools) in these incredible places.”
Photography and videography were Jaskolski’s passions growing up, but he figured becoming an engineer was the “adult thing to do.” He attended the Massachusetts Institute of Technology to study electrical engineering and computer science.
While at MIT, he developed the world’s first pressure-tolerant lithium-polymer battery pack. The packs got the attention of James Cameron, director of “Titanic,” who wanted to use them to power underwater robots to explore the ship’s remains.
Cameron led the expedition aboard the Russian research vessel Akademik Mstislav Keldysh, while Jaskolski drove small robots through the Titanic, creating the first images of the ship’s interior since it sank. It was there, 12,500 feet down in the ocean, that Jaskolski realized he didn’t have to choose between his passion and his career.
“I didn’t have to decide between my passion for photography and video and what seemed like a good career, electrical engineering and computer science,” he said. “I could merge those two things together into a career of technology for exploration, and I’ve been doing that with National Geographic ever since.”
Jaskolski shares what it’s like to work for @NatGeo. #CHQ2019 pic.twitter.com/ecA2q3aK9C
— The Chautauquan Daily (@chqdaily) July 8, 2019
Along with his work for National Geographic, Jaskolski runs Virtual Wonders, a company he founded to develop new technologies to “capture the world in a different light.”
The technologies include 3D scanning, aerial imaging and artificial intelligence, all used to generate data that can help scientists and citizens better understand placeology, poaching in parks and the degradation of archaeological sites due to acid rain, and to aid conservationists against the “war on conservation.”
“These assets we develop are also amazing for democratizing exploration,” he said. “We don’t just develop this imagery and these tools to show scientists the world better; we do it to share the world with everyone, so everyone can understand the beauty of these places and start to really understand how important it is that we protect them — whether they are cultural sites or natural sites.”
Jaskolski started small with landscape scans. Because mountains can’t move, the technology was simple to use, but when he started scanning animals, the process had to be taken to a “new level.”
“This is a really challenging, new problem for us because normally the things that we scan don’t move,” he said. “We scan mountains, we scan archaeological ruins, we scan cultural places, giant natural jungles — but to scan a living animal, we can’t do our traditional technique, which is taking thousands of photos over months.”
Jaskolski shares some images from his 3D scannings. The Sumatran Rhino (seen here) is on the brink of extinction. This isn’t a photo, but an actual 3D scanning of the Rhino. @NatGeo #CHQ2019 pic.twitter.com/O3uKXOFiLB
— The Chautauquan Daily (@chqdaily) July 8, 2019
Instead, Jaskolski and his team have to take all the necessary images at once. To do so, they built a wall of cameras that fires when an animal steps into a certain spot. Rather than a single standard photo, the synchronized shots are combined into a 3D scan that can be rendered in three dimensions or used for virtual and augmented reality.
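To make the idea concrete, here is a minimal sketch, in Python, of how such a triggered camera array could work: a trigger event fires every camera at the same instant, and the synchronized frames are handed to a photogrammetry step that fuses them into one 3D model. This is not Jaskolski’s actual system; every name, interface and the camera count below is a hypothetical placeholder.

```python
# Minimal sketch of a triggered multi-camera photogrammetry capture.
# All classes and functions are stand-ins, not a real camera or solver API.
import time
from dataclasses import dataclass

@dataclass
class Camera:
    cam_id: int

    def capture(self, timestamp: float) -> str:
        # Stand-in for a real camera API; returns a filename for the frame.
        return f"cam{self.cam_id}_{timestamp:.3f}.jpg"

def animal_on_trigger_plate() -> bool:
    # Stand-in for a pressure pad or beam-break sensor reading.
    return True

def reconstruct_3d(frames: list[str]) -> str:
    # Stand-in for a photogrammetry solver that aligns frames taken at
    # the same instant into a single 3D model of the animal.
    return "animal_scan.obj"

cameras = [Camera(i) for i in range(64)]          # an illustrative 64-camera "wall"
if animal_on_trigger_plate():
    t = time.time()
    frames = [cam.capture(t) for cam in cameras]  # one synchronized burst
    model = reconstruct_3d(frames)
    print("3D model written to", model)
```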
After working on animal scans and in natural environments, Jaskolski said he felt “well prepared” when National Geographic Labs asked to collaborate. The team proposed he build technologies that could provide conservation intelligence: “data driven, technology powered and community sustained conservation.”
The main component of the project was creating programs to reduce poaching.
In Africa, poaching for profit is the most pressing issue facing wildlife and, according to Jaskolski, it supports “war efforts and criminal trade.”
The poachers come in groups, effectively “small military groups,” armed with AK-47s, machine guns and rocket-propelled grenades.
“They are not just going and shooting one or two elephants and they’re not, unfortunately, just shooting the male elephants that have big tusks,” he said. “When they find a herd of elephants, they spray automatic gunfire on the entire herd of elephants, killing every single one of them. This is a huge problem and it’s funding a tremendous amount of not just animal extinction, but human rights abuses.”
Of the many challenges in anti-poaching work, the biggest is that it’s an “asymmetric problem.”
“There are only so many anti-poaching teams and so many park rangers versus the large amount of poachers and people doing this for profit,” he said. “Because of that, there’s not a lot of data, there is a lack of situational awareness and there’s the inability to track and respond to illegal activity.”
There is also an inability to analyze and predict future threats. A lot of the funding for AI technology goes toward industries like finance, healthcare and social media. For conservation, Jaskolski said technology is in the “dark ages.”
National Geographic Labs decided to change that by combining conservation intelligence software and hardware like smart, AI-powered sensors and aerial imaging systems with training programs for local people on the ground to help them understand, monitor and detect poaching.
All of these elements are compiled into a “conservation intelligence toolbox,” a single system that can be customized for each location to best reduce poaching and strengthen conservation there.
One of the most commonly used tools is high-resolution aerial scanning, similar to the interactive platform Google Earth.
But Google Earth doesn’t collect detailed enough data for National Geographic’s purposes. While Google captures around 50 centimeters per pixel, aerial scanning can capture 2 to 3 centimeters per pixel. That level of detail allows Jaskolski to pick out individual animals, and even their species, in the data sets.
“We can cover huge areas, thousands of square kilometers a day, using aerial 3D scanning, and we can get resolutions that are as much as 100 times the satellite imagery,” he said. “We can start to use this data in the same way satellite data was used: for mapping, artificial intelligence and understanding where deforestation is happening.”
Sorting through terapixels of data, they can detect not only species, but illegal activity like gold mining and charcoal burning, and human activity like the buildup of settlements or refugee camps.
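Some back-of-the-envelope arithmetic shows why those numbers add up to terapixels. The ground resolutions are the ones quoted above; the 2,000 square kilometers per day is an assumed figure for illustration, within the “thousands of square kilometers a day” Jaskolski cited.

```python
# Back-of-the-envelope arithmetic using the figures quoted above.
# The 2,000 km²/day survey rate is an assumed illustrative value.

satellite_gsd_cm = 50.0   # ~50 cm per pixel (Google Earth-style imagery)
aerial_gsd_cm = 2.5       # within the 2-3 cm per pixel from aerial scanning

linear_ratio = satellite_gsd_cm / aerial_gsd_cm   # 20x finer per axis
area_ratio = linear_ratio ** 2                    # 400x more pixels per unit area

pixels_per_km2 = (100_000 / aerial_gsd_cm) ** 2   # 1 km = 100,000 cm
daily_pixels = pixels_per_km2 * 2_000             # assumed 2,000 km² per day

print(f"{linear_ratio:.0f}x finer per axis, {area_ratio:.0f}x more pixels per area")
print(f"{pixels_per_km2 / 1e9:.1f} gigapixels per square kilometer")
print(f"{daily_pixels / 1e12:.1f} terapixels per day")
```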
Jaskolski also developed AI that can locate particular objects in an image; the program can then generate a multitude of maps showcasing that data. To put the technology to work against poaching, National Geographic Labs created a “smart sensor network,” an array of camera-and-microphone sensors that act as eyes and ears for rangers across a park.
“We can deploy hundreds of thousands of these covertly — on trees and hiding in swamps, and they can stay there for years and years, all solar powered,” he said.
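The sketch below illustrates, in broad strokes, what one node in such a sensor network might do: capture a frame and a snippet of audio, run a small on-device model, and transmit only confident detections to rangers. It is a hypothetical outline, not the Labs’ real firmware, and every function name is a stand-in.

```python
# Hypothetical sketch of one solar-powered "smart sensor" node.
import time
from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # e.g. "person", "vehicle", "gunshot"
    confidence: float

def capture_frame_and_audio():
    # Stand-in for reading the node's camera and microphone.
    return b"jpeg-bytes", b"wav-bytes"

def run_onboard_model(frame: bytes, audio: bytes) -> list[Detection]:
    # Stand-in for a small on-device AI model that flags people,
    # vehicles or gunfire so the node only transmits real events.
    return [Detection("person", 0.91)]

def send_alert(node_id: str, dets: list[Detection]) -> None:
    # Stand-in for a low-bandwidth radio or satellite message to rangers.
    print(f"[{node_id}] alert:", [(d.label, d.confidence) for d in dets])

def sensor_loop(node_id: str, threshold: float = 0.8, poll_seconds: int = 30):
    while True:
        frame, audio = capture_frame_and_audio()
        hits = [d for d in run_onboard_model(frame, audio)
                if d.confidence >= threshold]
        if hits:
            send_alert(node_id, hits)   # rangers see only filtered events
        time.sleep(poll_seconds)        # idle between polls to stretch solar power

# sensor_loop("swamp-17") would run indefinitely on the node.
```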
For animal species, the AI was easy to train because thousands of images already existed for it to learn from. But for objects like rifles, Jaskolski had to get creative.
He scanned an AK-47 and rendered it in thousands of orientations, then placed the rifle in various scenes to train the AI to locate it.
“This not only gives us a way to say ‘Hey, that’s not just a person in the park, it’s a poacher,’ it also gives the rangers in the park an assessment of how big of a threat this is,” Jaskolski said. “If you have 10 or 12 people walking around with AK-47s, they might respond very differently than they would if it was just one or two people out there poaching elephants.”
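A rough sketch of that synthetic-training-data trick, with placeholder functions rather than a real rendering pipeline, might look like this: render the scanned rifle at random orientations, composite it into background scenes, and keep the paste coordinates as free labels for an object detector. File names and scene choices here are illustrative only.

```python
# Hypothetical sketch of generating synthetic training data from a 3D scan.
import random

def render_model(model_path: str, yaw: float, pitch: float, roll: float):
    # Stand-in for rendering the scanned 3D model at one orientation;
    # returns an RGBA cutout of the object.
    return f"cutout(yaw={yaw:.0f},pitch={pitch:.0f},roll={roll:.0f})"

def composite(background: str, cutout, x: int, y: int):
    # Stand-in for pasting the cutout into the scene with matched
    # lighting and scale; returns the composited training image.
    return f"{background}+{cutout}@({x},{y})"

backgrounds = ["savanna_001.jpg", "swamp_014.jpg", "forest_road_203.jpg"]
dataset = []
for _ in range(10_000):                     # thousands of synthetic examples
    yaw, pitch, roll = (random.uniform(0, 360) for _ in range(3))
    cutout = render_model("ak47_scan.obj", yaw, pitch, roll)
    bg = random.choice(backgrounds)
    x, y = random.randint(0, 1920), random.randint(0, 1080)
    image = composite(bg, cutout, x, y)
    dataset.append((image, ("rifle", x, y)))  # image plus an automatic label

# `dataset` would then train an object detector to find the rifle in real
# imagery without anyone hand-labeling thousands of photos.
```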
Outside of poaching, Jaskolski has started using the technology in areas that are difficult to “ground truth,” or verify through direct, on-the-ground observation.
He tested the program on the Osa Peninsula of southwestern Costa Rica, an immediate conservation concern because it holds 2.5% of all biodiversity on Earth. The peninsula has hundreds of tree species, which would normally mean scientists walking the forest, tagging each tree and recording its exact GPS location to collect the data. Now, Jaskolski’s team can create a 3D scan, build a virtual jungle and train AI to identify each of the tree species.
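As a hedged illustration of that workflow, the sketch below swaps the manual walk-and-tag survey for a loop over tree crowns segmented out of the 3D scan; the function names, file names, coordinates and the example species are placeholders, not the team’s actual pipeline.

```python
# Hypothetical sketch of building a tree inventory from a 3D canopy scan.
from dataclasses import dataclass

@dataclass
class TreeRecord:
    species: str
    latitude: float
    longitude: float

def extract_tree_crowns(scan_path: str):
    # Stand-in for segmenting individual crowns from the 3D canopy model;
    # yields (crown_image, lat, lon) tuples.
    yield "crown_0001.png", 8.54, -83.47      # illustrative point near the Osa Peninsula

def classify_species(crown_image: str) -> str:
    # Stand-in for a classifier trained on labeled crown examples.
    return "Carapa guianensis"                # placeholder species name

inventory = [
    TreeRecord(classify_species(img), lat, lon)
    for img, lat, lon in extract_tree_crowns("osa_peninsula_scan.laz")
]
print(len(inventory), "trees catalogued without manual tagging")
```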
Jaskolski is helping develop technology to be able to find every animal in a certain area/population. This will help conservation with animals. pic.twitter.com/Vzw39Cu22Y
— The Chautauquan Daily (@chqdaily) July 8, 2019
Although the introduction of these technologies has made it easier for conservationists to take on environmental threats, Jaskolski admits there is a long way to go. It is his hope that technology will continue to advance so future generations have a world just as robust to experience.
“It’s through developing technologies like this that we hope to really drive forward these new technologies of conservation,” he said. “(We hope to) take technology out of the dark ages and provide what technology industries are already using to really change the tide of war on conservation.”