Data & Infrastructure Engineer in Draper
Company: WorkBay
Location: Salt Lake City, UT area | On-Site
Employment Type: Full-Time
Compensation: Based on experience
About WorkBay
WorkBay is the largest owner/operator of micro-bay industrial warehouse parks in the United
States. Fully vertically integrated, we develop, acquire, and manage micro-bay properties across
the country's top markets. We move fast, think strategically, and build systems that scale, and
we're looking for an engineer who does the same.
Position Overview
We are seeking a skilled and motivated Data & Infrastructure Engineer to own and evolve the
data platform that powers WorkBay's operations, analytics, and strategic decision-making. You
will maintain and extend our data pipelines, API integrations, cloud deployments, and AI-powered tools, ensuring the integrity, security, and reliability of our proprietary systems as we continue to scale.
Key Responsibilities
- Maintain and extend scheduled data sync pipelines that pull from third-party APIs into our PostgreSQL database
- Monitor and troubleshoot cloud deployments across Vercel (serverless functions) and Railway (container services), including uptime, logs, and performance
- Build and manage API integrations, designing endpoints, handling authentication, and ensuring reliability across internal and external services
- Enforce data integrity standards, including validation, cleansing, deduplication, and quality assurance across complex multi-source data
- Implement and maintain data security protocols to protect WorkBay's proprietary information, including access controls and encryption
- Support and extend our AI infrastructure, prompt configurations, evaluation suites, and cost monitoring
- Own database schema migrations, indexing, and query optimization in PostgreSQL
- Manage version-controlled codebases using GitHub, including code reviews, CI/CD pipelines, and deployment workflows
- Collaborate with leadership to design data-driven tools and dashboards that support operational decision-making
- Document data architecture, runbooks, and processes to ensure organizational continuity
Qualifications
Required
- 3-5 years of professional experience in data engineering, platform engineering, or a related field
- Strong proficiency in Python and SQL
- Production experience with PostgreSQL (schema design, migrations, query optimization)
- Experience building and consuming REST APIs
- Proficiency with GitHub and modern version control workflows (branching, PRs, CI/CD)
- Experience with serverless platforms (Vercel, AWS Lambda, Cloudflare Workers, or similar)
- Comfort with Linux CLI, shell scripting, and remote server management
- Strong problem-solving skills with the ability to work independently and manage multiple
priorities
Preferred
- Exposure to AI/LLM architecture: understanding how large models consume data and how to structure pipelines that support AI-driven applications
- Experience with container-based deployment platforms (e.g., Railway, Docker, Fly.io)
- Experience with CRM integrations (e.g., Salesforce, HubSpot) or property management systems (e.g., Yardi, MRI, RealPage)
- Background in real estate, property management, or commercial real estate tech
- Experience with ETL/ELT patterns and workflow orchestration (e.g., APScheduler, Airflow, Dagster)
- Familiarity with React/TypeScript (not required, but helpful for full-stack debugging)
- Experience in a high-growth or scaling company environment
What We Offer
- Competitive compensation based on experience
- Generous PTO policy
- Opportunity to own critical infrastructure at a fast-scaling company in commercial real estate
- A lean, entrepreneurial environment where your work directly impacts company growth
- Cutting-edge AI/ML workflows for real estate: you'll work with LLM APIs, prompt engineering, and AI-powered automation in a live business context
WorkBay is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.