
Real-time Planning for Automated Multi-View Drone Cinematography


TOBIAS NÄGELI, AIT Lab, ETH Zurich
LUKAS MEIER, AIT Lab, ETH Zurich
ALEXANDER DOMAHIDI, Embotech GmbH
JAVIER ALONSO-MORA, Delft University of Technology
OTMAR HILLIGES, AIT Lab, ETH Zurich

Fig. 1. We propose a method to jointly optimize 3D trajectories and control inputs for automated drone videography in dynamic scenes. Taking user-specified high-level plans and screen-space framing objectives as input (a), our method generates trajectories that respect the physical limits of the quadrotor and constraints imposed by the environment while fulfilling the high-level plan and aesthetic objectives as well as possible (b). The method automates single-shot recording of complex multi-view scenes in cluttered and dynamic environments (d+c).

We propose a method for automated aerial videography in dynamic and cluttered environments. An online receding horizon optimization formulation facilitates the planning process for novices and experts alike. The algorithm takes high-level plans as input, which we dub virtual rails, alongside interactively defined aesthetic framing objectives, and jointly solves for 3D quadcopter motion plans and associated velocities. The method generates control inputs subject to constraints of a non-linear quadrotor model and dynamic constraints imposed by actors moving in an a priori unknown way. The output plans are physically feasible for the horizon length, and we apply the resulting control inputs directly at each time-step, without requiring a separate trajectory tracking algorithm. The online nature of the method enables incorporation of feedback into the planning and control loop, making the algorithm robust to disturbances. Furthermore, we extend the method to include coordination between multiple drones to enable dynamic multi-view shots, typical for action sequences and live TV coverage. The algorithm runs in real-time on standard hardware and computes motion plans for several drones in the order of milliseconds. Finally, we evaluate the approach qualitatively with a number of challenging shots involving multiple drones and actors, and quantitatively characterize the computational performance experimentally.

This work is partially supported by a Microsoft Research grant.
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [emailprotected].
© 2017 ACM. 0730-0301/2017/7-ART132 $15.00
DOI: http://dx.doi.org/10.1145/3072959.3073712

CCS Concepts: • Computing methodologies → Computer graphics; Robotic planning; Motion path planning;

    Additional Key Words and Phrases: aerial videography, multi-drone, collision-avoidance

ACM Reference format:
Tobias Nägeli, Lukas Meier, Alexander Domahidi, Javier Alonso-Mora, and Otmar Hilliges. 2017. Real-time Planning for Automated Multi-View Drone Cinematography. ACM Trans. Graph. 36, 4, Article 132 (July 2017), 10 pages. DOI: http://dx.doi.org/10.1145/3072959.3073712

1 INTRODUCTION
Accessible quadrotor hardware now allows for end-user creation of aerial videography, which previously resided firmly in the realm of high-end film studios. However, designing trajectories that fulfill aesthetic objectives and respect the physical limits of real robots remains a challenging task both for non-experts and professionals. Especially when filming in dynamic environments with moving subjects, the operator has to consider and trade off many degrees of freedom relating to subjects' motions, aesthetic considerations and the physical limits of the robot simultaneously, rendering manual approaches infeasible.


Existing methods for planning of quadcopter trajectories [Gebhardt et al. 2016; Joubert et al. 2015; Roberts and Hanrahan 2016] allow users to specify shots in 3D virtual environments and to generate flight plans automatically. Typically, this is formulated as an offline optimization problem which generates a timed reference trajectory and control input parameters from user-specified 3D positions and camera look-at directions, subject to a model of the robot dynamics. The resulting plan is then tracked online using a feedback controller. Due to this feedforward, open-loop nature of trajectory planning and tracking, such algorithms are not well suited to handle drastic environmental disturbances [Chen et al. 1992], typical for cluttered environments with moving subjects. Therefore they are restricted to filming of mostly static scenes. In contrast, dynamic scenes require continuous re-planning in real-time to guarantee collision-free trajectories and record the intended footage, for example to keep a moving actor properly framed.

In this paper, we propose a general method for planning of aerial videography in cluttered and dynamic environments. The method jointly optimizes 3D motion paths, the associated velocities and control inputs for a flying camera in an online fashion. Our method takes user-specified, high-level plans alongside image-based framing objectives as input (Fig. 1, a+b). The input paths do not need to be physically feasible in the sense of [Roberts and Hanrahan 2016], since our method only uses them for guidance. Furthermore, the inputs can be updated interactively at every time-step by the user. The algorithm adapts the high-level plans in real-time to produce dynamically feasible trajectories for the drones. It takes the motion of the filmed subjects into consideration and inherently accounts for the dynamic constraints due to the actuation limits of the drone, which is crucial to generate collision-free paths.
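As an illustration of how such a guidance path might be represented — this is a minimal sketch of our own, not the paper's implementation — a "virtual rail" can be stored as a piecewise-linear curve parameterized by arc-length progress. A planner can then query positions and tangents along it even though the rail itself need not be dynamically feasible:

```python
import numpy as np

class VirtualRail:
    """Piecewise-linear guidance path p(theta), theta in [0, 1].

    An illustrative stand-in for the paper's 'virtual rail': it only
    guides the optimizer and need not be dynamically feasible itself.
    """

    def __init__(self, waypoints):
        self.wp = np.asarray(waypoints, dtype=float)        # (N, 3) user waypoints
        seg = np.linalg.norm(np.diff(self.wp, axis=0), axis=1)
        s = np.concatenate([[0.0], np.cumsum(seg)])
        self.theta = s / s[-1]                              # arc-length progress per waypoint

    def position(self, theta):
        """Interpolate the rail at progress theta (clamped to [0, 1])."""
        theta = np.clip(theta, 0.0, 1.0)
        return np.array([np.interp(theta, self.theta, self.wp[:, k])
                         for k in range(3)])

    def tangent(self, theta, eps=1e-4):
        """Finite-difference unit tangent, useful for contour/lag error terms."""
        d = self.position(theta + eps) - self.position(theta - eps)
        return d / (np.linalg.norm(d) + 1e-12)

rail = VirtualRail([[0, 0, 1], [2, 0, 1], [2, 2, 2]])
print(rail.position(0.5))   # point halfway along the rail by arc length
```

Because the rail is indexed by a progress variable rather than by time, the optimizer is free to choose how fast (and how closely) the camera advances along it, which is the essential property a contouring formulation needs from its reference path.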

These multiple objectives and constraints are expressed mathematically in a non-linear model predictive contouring control (MPCC) formulation, solving for quadrotor states and control inputs online and simultaneously in a receding horizon fashion: the first control move of the plan is applied to the quadrotors, and the entire trajectory is re-computed at the next sampling instance. Solving non-linear MPCC problems numerically at the sampling rates required by fast mechanical systems, i.e. on the order of a few milliseconds, is a computationally demanding task, and solving such problems in real-time has only recently become feasible thanks to specialized solvers [Domahidi and Jerez 2016].
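The receding-horizon loop described above — solve over a short horizon, apply only the first control move, then re-solve from the new state — can be sketched on a toy system. The snippet below is an illustrative stand-in, not the paper's MPCC solver: it uses a 1-D double integrator and a brute-force search over a coarse control set, where the paper relies on a specialized non-linear solver. All constants (time step, horizon length, cost weights, target) are made up for the example.

```python
import itertools

DT, HORIZON, TARGET = 0.1, 6, 1.0
U_SET = (-1.0, 0.0, 1.0)            # coarse admissible accelerations (actuation limit)

def rollout_cost(x, v, controls):
    """Predicted cost of one control sequence for a 1-D double integrator."""
    cost = 0.0
    for u in controls:
        v += u * DT
        x += v * DT
        cost += (x - TARGET) ** 2 + 0.5 * v ** 2 + 0.01 * u ** 2  # tracking + damping + effort
    return cost

def first_move(x, v):
    """Solve the finite-horizon problem by brute force; keep only the first input."""
    best = min(itertools.product(U_SET, repeat=HORIZON),
               key=lambda seq: rollout_cost(x, v, seq))
    return best[0]

def run(steps=120):
    x, v = 0.0, 0.0
    for _ in range(steps):          # closed loop: re-plan at every sampling instant,
        u = first_move(x, v)        # apply only the first control move of the plan...
        v += u * DT                 # ...then recompute the whole trajectory next step
        x += v * DT
    return x, v

if __name__ == "__main__":
    x, v = run()
    print(f"final position {x:.2f}, velocity {v:.2f}")
```

Even in this toy form the structural point survives: because the plan is recomputed from the measured state at every step, disturbances are absorbed by the loop itself, without a separate trajectory-tracking controller.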

Furthermore, the algorithm allows for planning of multi-angle shots and for the positioning of several quadrotors to film one or more dynamic subjects simultaneously. This is a common approach in films and TV broadcasts when depicting moving subjects, such as in action and sports sequences. In such settings, it is desirable to provide different views of the subjects, which have to be filmed in a single take, since humans struggle to precisely reproduce their motions from recorded footage. To enable such shots, we extend our method to produce collision-free paths between multiple drones and subjects simultaneously. The formulation also minimizes mutual visibility of multiple cameras so that the recorded shots are unobstructed and do not contain the other flying cameras.
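The two multi-drone ingredients mentioned here — a hard inter-drone separation constraint and a soft cost that discourages one camera from appearing in another's shot — can be sketched as simple geometric terms. The constants and functional forms below are illustrative assumptions, not the paper's actual formulation:

```python
import numpy as np

D_MIN = 2.0                          # assumed minimum inter-drone separation (metres)
FOV_COS = np.cos(np.radians(45))     # assumed cosine of the camera's half field of view

def separation_ok(p_i, p_j):
    """Hard collision-avoidance constraint between two drone positions."""
    return np.linalg.norm(np.asarray(p_i) - np.asarray(p_j)) >= D_MIN

def mutual_visibility_penalty(cam_pos, view_dir, other_pos):
    """Soft cost that grows as the other drone enters this camera's view cone.

    view_dir is assumed to be a unit vector; the penalty is zero whenever
    the other drone lies outside the cone.
    """
    d = np.asarray(other_pos) - np.asarray(cam_pos)
    dist = np.linalg.norm(d)
    if dist < 1e-9:
        return 1.0
    cos_angle = float(np.dot(d / dist, view_dir))
    return max(0.0, cos_angle - FOV_COS)

# drone j sits directly ahead of drone i's camera -> positive penalty
print(mutual_visibility_penalty([0, 0, 2], np.array([1.0, 0.0, 0.0]), [5, 0, 2]))
```

In an optimizer, the hard term would enter as a constraint for every drone pair at every horizon step, while the visibility term would be added to the objective with a weight, so trajectories that keep other cameras out of frame are preferred but not strictly required.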

We demonstrate our method via several challenging, and in some cases previously impossible, shots involving multiple moving subjects and using several flying cameras simultaneously. Furthermore, we report initial feedback elicited from an expert camera operator. Finally, we characterize the computational cost of our method in controlled experiments and show that it is capable of generating feasible trajectories in the order of milliseconds, even for multiple subjects and multiple drones.


2 RELATED WORK
Aerial videography design tools: Various tools support the task of planning quadrotor-based video shots. Commercially available applications are often limited to placing waypoints on a 2D map [apm 2015; dji 2015; lit 2015], and some consumer-grade drones allow the user to interactively control the quadrotor's camera as it tracks a pre-determined path (e.g., [3dr 2015]). These tools generally do not provide means to ensure feasibility of the resulting plans. In consequence, several algorithms for the planning of physically feasible quadcopter trajectories have been proposed. Such tools allow for planning of aerial shots in 3D virtual environments [Gebhardt et al. 2016; Joubert et al. 2015; Roberts and Hanrahan 2016] and employ offline optimization methods to ensure that both aesthetic objectives and robot modelling constraints are considered. Joubert et al.'s method [2015] computes control inputs along a pre-defined path and detects violations of the robot model constraints; however, correcting these violations is offloaded to the user. Gebhardt et al. [2016] generate feasible trajectories subject to a linearized quadrotor model and hence require conservative limits on the control inputs. The method proposed in [Roberts and Hanrahan 2016] takes physically infeasible trajectories and computes the closest possible feasible trajectory by re-timing the user-defined velocities subject to a non-linear quadrotor model.

While [Joubert et al. 2015] allows the velocity along the planned trajectory to be adjusted at execution time, all of the above methods are offline and convert the user's desired path into a time-dependent
