Sr. Manager, Data Engineering & Architecture

LVT (LiveView Technologies)

Software Engineering, IT, Data Science
American Fork, UT, USA
Posted on Mar 10, 2026

ABOUT LVT

LVT is redefining how businesses operate in the physical world, moving beyond traditional security solutions to deliver AI-driven, actionable intelligence that makes sites smarter, safer, and more secure. Since pioneering our first mobile, solar-powered units, our commitment to scrappy, hands-on innovation has made us an established leader and one of the fastest-growing companies in intelligent site technology. We are building the next generation of solutions—from our physical units in the field to a powerful Agentic AI platform—that allows our customers to gain unprecedented visibility and control over safety, compliance, and operations. This is your chance to join a cutting-edge team that isn't just watching the world change, but actively building the technology that is changing it.

We’re a team that’s focused on growth and innovation, and we’re proud that our crew, products, and leadership are being recognized for it.

  • A Top-Tier Growth Company: Named one of the Financial Times’ Fastest Growing Companies 2025 and #10 on the Inc. 5000 Rocky Mountain Regional list for 2025.

  • Innovative Leadership: Our CEO, Ryan Porter, was named an EY Entrepreneur of the Year 2025, and our CTO, Steve Lindsey, was inducted into the Silicon Slopes CTO Hall of Fame in 2024.

  • Product & Software Excellence: We were named one of The Software Report’s Top 100 Software Companies of 2023 and are a winner of the Security Today Govies Award for 2025.

ABOUT THIS ROLE

As the Sr. Manager of Data Engineering and Architecture, you will be a hands-on leader who defines and executes the data engineering strategy, architecture, and technology stack. You will own the foundational data infrastructure that powers analytics and decision-making across the organization.

A critical part of this role will be building, mentoring, and managing a team of 3 data engineers. You will guide the team's efforts while also contributing directly to the development of our modern data warehouse in Snowflake using dbt and SQL, transforming raw data into reliable and accessible datasets. Crucially, you will spearhead the data migration efforts for our ongoing Oracle Fusion Cloud deployment, designing a robust, cohesive data ecosystem that bridges our new Oracle environment with Snowflake.

Your leadership will ensure the successful design, build, and maintenance of robust data pipelines that ingest data from a variety of internal and external sources into Snowflake. Your team's work will support reporting, dashboarding, and analysis across the company, enabling teams to make informed decisions based on trusted data. Given the green-field nature of this initiative, the role is expected to be approximately 30% technical leadership, strategy, and people management, and 70% direct, hands-on engineering and architecture.

This position is based in a hybrid work environment and requires regular in-office collaboration. It offers an opportunity to lead with modern data tools, build a high-performing team, and make a direct, strategic impact on data quality and accessibility during a massive phase of enterprise scaling.

ROLE RESPONSIBILITIES

  • Data Strategy & Architecture: Define the long-term vision, strategy, and architecture for the company’s data platform. Design a cohesive hybrid architecture that maximizes the strengths of our full tech stack, ensuring it drives measurable business value and scales efficiently to support hyper-growth.

  • Team Leadership & Management: Build, mentor, and manage a team of 3 data engineers, fostering a culture of technical excellence, accountability, and continuous improvement.

  • Data Modeling & Transformation: Lead the team in building and maintaining robust data models using dbt and SQL that support complex analytics and reporting needs. Contribute directly as an individual contributor as needed.

  • Snowflake Development: Oversee the design and optimization of the Snowflake data warehouse to ensure performance, scalability, and usability. Participate directly in key development efforts.

  • Cross-Functional Collaboration: Act as the primary technical partner to analysts, business stakeholders, and data teams to deeply understand requirements and translate them into strategic engineering solutions and delivery plans.

  • Performance Tuning: Guide the optimization of SQL queries and data transformations to improve execution speed and resource efficiency across the platform.

  • Tooling & Automation: Identify, evaluate, and implement opportunities to automate data workflows, improve pipeline reliability, and establish a formal DataOps/MLOps framework using modern orchestration tools (e.g., Airflow, Prefect, cloud-native serverless functions).

  • BI Tool Support: Ensure the team provides clean, well-structured data models to enable effective use of BI tools like Looker, Sigma, Tableau, or similar platforms.

  • Pipeline Engineering: Direct the development and maintenance of scalable data ingestion pipelines that pull data from APIs and other sources into Snowflake, including exploring solutions for near real-time data feeds.

  • Data Governance & Quality: Champion best practices in data governance, data lifecycle management, and dimensional modeling. Implement data validation checks, documentation standards, and lineage tracking to maintain high data integrity across all of our systems.

  • Oracle Fusion Cloud Migration & Integration: Lead the complex data migration strategy for our active Oracle deployment. Architect, build, and maintain secure, high-performing data flows and syncs between Oracle and Snowflake to ensure operational continuity and analytical excellence.

OUR IDEAL CANDIDATE

  • Experience: 8+ years in data engineering or related roles, with a strong focus on data modeling and pipeline development.

  • Education: Bachelor’s degree in Computer Science, Engineering, Data Analytics, or a related field.

  • Technical Skills:

    • Proven experience leading complex data migration or implementation projects.

    • Advanced proficiency in SQL and experience with data modeling (e.g., star/snowflake schemas).

    • Hands-on experience with dbt for building modular and testable data transformations.

    • Experience developing data pipelines using Python and workflow orchestration tools (e.g., Airflow, Prefect).

    • Deep understanding of Snowflake.

    • Demonstrated ability to design and implement a modern data architecture from scratch.

  • BI & Analytics Tools:

    • Familiarity with BI platforms such as Looker, Tableau, or Sigma is helpful.

  • Infrastructure & Governance:

    • Understanding of ELT/ETL workflows, data governance, and monitoring practices.

    • Experience defining and enforcing organizational standards for data quality, metadata management, and cost optimization within a cloud data warehouse (Snowflake).

  • Communication & Problem Solving:

    • Ability to clearly explain technical details to both technical and non-technical stakeholders.

    • Strong analytical and debugging skills; attention to detail in code and data quality.

BENEFITS

We believe you do your best work when your whole life is supported. We invest in our crew’s health, families, and financial futures with a benefits package designed to support you inside and outside the office.

LVT IS PROUD TO BE AN EQUAL OPPORTUNITY EMPLOYER. All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status. All candidates must pass a drug screening and background check upon employment. Some roles may also require passing a federal background check and fingerprinting. Must be authorized to work in the U.S. If reasonable accommodation is needed to participate in the job application or interview process, and/or to perform essential job functions, please reach out to your recruiter.