
A Linear N-Point Solver for Structure and Motion from Asynchronous Tracks (ICCV 2025 Highlight)

ICCV 2025 Highlight | arXiv | Video | Poster

Hang Su1, Yunlong Feng1, Daniel Gehrig2, Panfeng Jiang1, Ling Gao3, Xavier Lagorce1, Laurent Kneip1

1 ShanghaiTech University     2 University of Pennsylvania     3 Amap, Alibaba Group


Abstract

Structure and continuous motion estimation from point correspondences is a fundamental problem in computer vision that has been powered by well-known algorithms such as the familiar 5-point or 8-point algorithm. However, despite their acclaim, these algorithms are limited to processing point correspondences originating from a pair of views, each representing an instantaneous capture of the scene. Yet, in the case of rolling shutter cameras, or more recently, event cameras, this synchronization breaks down. In this work, we present a unified approach for structure and linear motion estimation from 2D point correspondences with arbitrary timestamps, from an arbitrary set of views. By formulating the problem in terms of first-order dynamics and leveraging a constant velocity motion model, we derive a novel, linear point incidence relation allowing for the efficient recovery of both linear velocity and 3D points with predictable degeneracies and solution multiplicities. Owing to its general formulation, it can handle correspondences from a wide range of sensing modalities such as global shutter, rolling shutter, and event cameras, and can even combine correspondences from different collocated sensors. We validate the effectiveness of our solver on both simulated and real-world data, where we show consistent improvement across all modalities when compared to recent approaches. We believe our work opens the door to efficient structure and motion estimation from asynchronous data.
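
To make the constraint concrete, here is one plausible reading of the point incidence relation described above, sketched under the simplifying assumption of pure translation at a constant velocity v (the paper's exact formulation may differ in detail). A 3D point X observed at time t_i along the bearing f_i, from a camera center located at t_i v, must satisfy

    \mathbf{f}_i \times \left(\mathbf{X} - t_i\,\mathbf{v}\right) = \mathbf{0}
    \;\Longleftrightarrow\;
    [\mathbf{f}_i]_\times\,\mathbf{X} - t_i\,[\mathbf{f}_i]_\times\,\mathbf{v} = \mathbf{0},

which is linear in the unknowns (X, v), so constraints from all timestamped observations can be stacked and solved up to scale with standard linear algebra.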

Teaser Image
A linear N-point solver for recovering 3D points and the velocity of a camera undergoing quasi-linear motion, given a set of timestamped observations
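
The numerical recipe implied by such a relation can be sketched as follows. The snippet below is a minimal, self-contained illustration, not the repository's actual solver or API: it assumes Eigen is available, pure translation at constant velocity (no rotation), and synthetic noise-free bearings of a single point, and it recovers the point and the velocity up to scale from the null space of the stacked constraints.

    // Minimal sketch (not the repository's API): recover a 3D point X and a
    // linear camera velocity v, up to scale, from timestamped bearings f_i of
    // that point. Each observation yields [f_i]_x (X - t_i v) = 0, which is
    // stacked into a homogeneous linear system and solved via SVD.
    #include <Eigen/Dense>
    #include <iostream>
    #include <vector>

    Eigen::Matrix3d skew(const Eigen::Vector3d& f) {
      Eigen::Matrix3d S;
      S <<     0.0, -f.z(),  f.y(),
             f.z(),    0.0, -f.x(),
            -f.y(),  f.x(),    0.0;
      return S;
    }

    int main() {
      // Hypothetical ground truth, used only to generate synthetic observations.
      const Eigen::Vector3d X_true(1.0, -0.5, 4.0);  // 3D point
      const Eigen::Vector3d v_true(0.3, 0.1, 0.0);   // camera velocity

      const std::vector<double> times = {0.0, 0.1, 0.2, 0.3, 0.4};
      Eigen::MatrixXd A(3 * static_cast<int>(times.size()), 6);
      for (int i = 0; i < static_cast<int>(times.size()); ++i) {
        const double t = times[i];
        // Bearing of the point as seen from the camera center c(t) = v_true * t.
        const Eigen::Vector3d f = (X_true - v_true * t).normalized();
        A.block<3, 3>(3 * i, 0) = skew(f);       // coefficient of X
        A.block<3, 3>(3 * i, 3) = -t * skew(f);  // coefficient of v
      }

      // Generically, A [X; v] = 0 has a one-dimensional null space; take the
      // right singular vector associated with the smallest singular value.
      Eigen::JacobiSVD<Eigen::MatrixXd> svd(A, Eigen::ComputeFullV);
      Eigen::VectorXd sol = svd.matrixV().col(5);
      // Fix the unknown global scale (and sign) with the known depth, for display only.
      sol /= sol(2) / X_true(2);

      std::cout << "X ~ " << sol.head<3>().transpose() << "\n"
                << "v ~ " << sol.tail<3>().transpose() << std::endl;
      return 0;
    }

In a multi-point setting, constraints from all tracked points sharing the same velocity would be stacked together; the system stays linear while each additional point contributes three more unknowns.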

Quick Start

Build using Pixi (recommended)

Pixi is a fast, cross‑platform package and environment manager for reproducible builds. Install Pixi from the official website: https://pixi.sh/

Build steps:

  • Ensure Pixi is installed (see site above).
  • From the repository root, run:
    pixi run build
    The first run will resolve dependencies and create an isolated environment automatically.
  • To see available tasks:
    pixi run --list

Build from source

cmake -B build .
cmake --build build

Run experiments

After building, executables are placed in build/app.

Run the Noise Analysis Experiment:

pixi run noise_analysis
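
If the Pixi task simply wraps a binary of the same name in build/app (an assumption based on the task name, not something stated in this README), the experiment could also be launched directly:

    ./build/app/noise_analysis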

License

Copyright 2025 Hang Su, Yunlong Feng, Mobile Perception Lab

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
