National Center for Supercomputing Applications master calendar
NCSA staff who would like to submit an item for the calendar can email firstname.lastname@example.org.
Event Detail Information
Advanced Computing Fellowship Brown Bag: Autonomous Vision-based Progress Monitoring of Building and Infrastructure Construction Projects
Mani Golparvar-Fard, Civil and Environmental Engineering
Room 1040, NCSA Building, 1205 W. Clark St., Urbana
Mani Golparvar-Fard, assistant professor of Civil and Environmental Engineering (CEE), will give a brown bag talk outlining progress made during his fellowship collaboration with NCSA. Rosati’s pizza and water will be provided.
Abstract: Construction is a $900 billion industry with many inefficiencies due to the complexity, scale, and time-sensitivity of its large projects. A substantial portion of that cost and delivery time could be reduced with tools that detect deviations from plans and track the allocation of resources (workers, equipment, and materials). Despite their importance, current monitoring practices rely on manual site data collection, non-systematic reporting, and visually and spatially complex representations. To address these limitations, this talk presents ongoing research tasks focused on validating the feasibility of exploiting static and dynamic visual feeds of a building under construction, together with a semantically rich 4D (3D + time) CAD model, to (1) generate knowledge about physical construction progress, (2) measure the productivity of construction operations, and (3) predict future performance given the current state of the sensed activities. We particularly leverage visual data collected with emerging Micro Aerial Vehicles (MAVs) and commodity smartphones used by field personnel, and explore the following: (1) D4AR (4D augmented reality) models, which integrate image-based 3D point clouds with 4D CAD models to understand “when” performance deviations happen, and how MAVs support large-scale image data collection; (2) techniques for video-based detection, tracking, and activity recognition of construction resources (labor and equipment) to understand “why” performance deviations happen; and (3) a marker-less, infrastructure-independent mobile augmented reality system for real-time field reporting. The talk will discuss experimental results from testing these new methods, which address fundamental research challenges in computer vision, robotics, and construction management, on several high-profile construction projects, including the World Trade Center transportation hub construction site.
PIs: Mani Golparvar-Fard (CEE), Tim Bretl (AE), and Derek Hoiem (CS)