National Center for Supercomputing Applications master calendar


Event Detail Information

Advanced Computing Fellowship Brown Bag: Autonomous Vision-based Progress Monitoring of Building and Infrastructure Construction Projects


Mani Golparvar-Fard, Civil and Environmental Engineering

Date: Jan 22, 2014
Time: 12:00 pm

Location: Room 1040, NCSA Building, 1205 W. Clark St., Urbana

Originating Calendar: IACAT events

Mani Golparvar-Fard, assistant professor of Civil and Environmental Engineering (CEE), will give a brown bag talk outlining the progress made during his fellowship collaboration with NCSA. Rosati's pizza and water will be provided.

Abstract: Construction is a $900 billion industry with many inefficiencies due to the complexity, scale, and time-sensitivity of its large projects. A substantial portion of that cost and delivery time could be reduced with tools that detect deviations from plans and understand the allocation of resources (workers, equipment, and materials). Despite their importance, current monitoring practices rely on manual site data collection, non-systematic reporting, and visually and spatially complex representations. To address these limitations, this talk presents ongoing research focused on validating the feasibility of exploiting static and dynamic visual feeds of a building under construction, together with semantically rich 4D (3D + time) CAD models, to (1) generate knowledge about physical construction progress, (2) measure the productivity of construction operations, and (3) predict future performance given the current state of the sensed activities. We particularly leverage visual data collected with emerging Micro Aerial Vehicles (MAVs) and commodity smartphones used by field personnel, and explore the following: (1) D4AR (4D augmented reality) models, which integrate image-based 3D point clouds with 4D CAD models to understand "when" performance deviations happen, and how MAVs support large-scale image data collection; (2) techniques for video-based detection, tracking, and activity recognition of construction resources (labor and equipment) to understand "why" performance deviations happen; and (3) a marker-less, infrastructure-independent mobile augmented reality system for real-time field reporting. Experimental results from testing these new methods, which address fundamental research challenges in computer vision, robotics, and construction management, on several high-profile construction projects, including the World Trade Center transportation hub construction site, will be discussed.
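The core monitoring idea in the abstract, detecting "when" performance deviates from the 4D plan, can be illustrated with a toy comparison of per-element completion expected by the schedule versus completion observed from an as-built reconstruction. This is only a minimal sketch of the concept, not the speaker's actual system; the element names, percentages, and tolerance are hypothetical.

```python
# Toy sketch of plan-versus-progress deviation flagging (hypothetical data,
# not the D4AR system itself): each building element has a planned completion
# fraction from the 4D CAD schedule and an observed fraction estimated from
# sensed visual data.

def flag_deviations(planned, observed, tolerance=0.10):
    """Return elements whose observed progress lags the plan by more than
    `tolerance` (all values are completion fractions in [0, 1])."""
    flagged = {}
    for element, expected in planned.items():
        actual = observed.get(element, 0.0)  # unseen elements count as 0%
        lag = expected - actual
        if lag > tolerance:
            flagged[element] = round(lag, 2)
    return flagged

# Hypothetical example: the slab is 30 points behind schedule, the others
# are within the 10% tolerance.
planned = {"column_A1": 1.0, "slab_L2": 0.8, "wall_W3": 0.5}
observed = {"column_A1": 0.95, "slab_L2": 0.5, "wall_W3": 0.55}
print(flag_deviations(planned, observed))  # prints {'slab_L2': 0.3}
```

In a real pipeline the `observed` fractions would come from registering the image-based point cloud against the CAD model rather than being entered by hand; the comparison step itself stays this simple.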

PIs: Mani Golparvar-Fard (CEE), Tim Bretl (AE), and Derek Hoiem (CS)
