Lead Data Infrastructure Engineer
Redwood City, CA
Location/City : CA - Redwood City
Area Code : 650
Job Type : Full Time
Id : 23695
Problem: In the $12T construction industry, 98% of projects are delivered over budget, by 80% on average. Our client is on a mission to stop that from happening.
Root cause? The inability to measure site progress in an objective, trustworthy, and frequent manner. Construction managers are frequently surprised to discover that their multi-million dollar projects have been running behind for weeks or months. By that point, the money has already been spent and it's too late to fix the problem.
Solution: Use robots to survey sites every day and proprietary deep learning and AI-based algorithms to assess progress, turning the terabytes of data collected into simple insights for project managers that enable them to react to issues in minutes, not months. This lets them continuously correct site inefficiencies, with demonstrated ROI of eliminating budget overages entirely and even delivering projects up to 11% below budget.
They have closed deals with Kaiser Permanente, Sutter Health, and the Lucas Museum of Narrative Art.
Team: They are some of the brightest minds in Silicon Valley, including PhDs, engineers, business leaders, and civil engineering professionals on the Forbes 30 Under 30 list, graduates of Stanford University, and alumni of organizations such as Google Advanced Technologies and Projects.
Backed by Andreessen Horowitz, the famed investment firm that also backed Facebook, Coinbase, Slack, Airbnb, GitHub, and Lyft.
Our client is building a distributed infrastructure to capture physical data via LIDAR scanners. They'll have dozens of robots running in the field, multi-gigabyte data uploads from remote locations, and large data pipelines processing data and applying cutting-edge ML techniques. Their customers expect fast data turnaround and reliable reports. Your job will be to build the software that turns point clouds into insights for the world's top construction companies.
Responsibilities:
- Lead a team of 5-10 developers working with Big Data architectures (Airflow, Dataflow/Spark, BigQuery, etc.)
- Own the on-time shipping of the company's key deliverables and help land their next dozen customers
- Be part of an early-stage team and have a significant stake in defining its future, with considerable potential to impact hundreds of thousands of construction personnel and $12T+ in construction spending.
Requirements:
- Expertise in at least one of: C++, Python, Go, Java (two or more preferred)
- 5+ years of experience working with Apache Beam/Dataflow or other big data architectures (Spark, Hadoop, MapReduce) in high-volume environments.
- Experience building and managing ETL pipelines from inception to production rollout.
- Experience with workflow management tools: Airflow, Oozie, Azkaban, etc.
- Experience supporting hosted services in a high-volume, customer-facing environment
- Proficiency with SQL (relational databases, Redshift, Hive, Presto, Vertica)
- Ability to manage and communicate data warehouse project plans to internal clients
- Experience with computer vision or 3D scanning technologies
Please send resumes to firstname.lastname@example.org