Steneral Consulting

Senior Data Architect (local candidates only)

Job: Senior Data Architect

Duration: 4-6 months, contract to hire

Location: Hybrid 3x a week onsite in Dallas, TX 75254

Additional updates received from the client:

We are seeking a candidate with strong Azure Databricks experience.

  • Day to day: code reviews, building diagrams, prepping the dev environment, and building jobs in Databricks.
    • Will handle new work that comes in and needs to be built, as well as fixing broken items and making changes to pipelines. Must be able to read and understand code and the architecture of the setup well enough to say, "if you change X, this will be the impact."
Looking For

  • Heavy Databricks background: Unity Catalog, workspaces, workflows, setting up pipelines
  • Databricks certification preferred
  • A Spark background and an understanding of how Spark operates are the most important things, along with Databricks and PySpark
  • Proficient in Python or Scala, and PySpark
  • They do NOT use ADF (Azure Data Factory); they use Databricks only, so ADF experience is irrelevant
  • Medallion architecture: bronze, silver, and gold layers
  • Reviewing architectures, data flows, and data pipelines; signing off on them and implementing them in Databricks in an orchestrated format
  • Proficiency in PySpark or Scala required
  • Strong in Python and SQL; they will be developing code
  • Designing data models for the future of the logistics program
  • Will work with Data Engineers, Data Scientists, Developers, and BAs to understand requirements, and will review the domains and dependencies in current Databricks workflows to assess what a change will impact
  • Will work with the CI/CD process across Dev, QA, and Prod environments. They use versioning, so they must keep the lineage of released products and scale those solutions without impacting what is currently in the system.
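Since the role stresses reading pipeline code and the medallion (bronze/silver/gold) architecture, here is a minimal sketch of that layering pattern in plain Python. On Databricks these stages would be PySpark DataFrames backed by Delta tables; the record fields and helper names below are purely illustrative, not taken from the client's codebase:

```python
# Bronze layer: raw ingested records, kept exactly as they arrived
# (including bad values and duplicates).
bronze = [
    {"shipment_id": "S1", "weight_kg": "120.5", "status": " delivered "},
    {"shipment_id": "S2", "weight_kg": "bad",   "status": "IN TRANSIT"},
    {"shipment_id": "S1", "weight_kg": "120.5", "status": " delivered "},  # duplicate
]

def to_silver(rows):
    """Silver layer: validate, type-cast, normalize, and deduplicate."""
    seen, out = set(), []
    for r in rows:
        try:
            weight = float(r["weight_kg"])
        except ValueError:
            continue  # drop rows that fail validation
        key = r["shipment_id"]
        if key in seen:
            continue  # drop duplicate shipment IDs
        seen.add(key)
        out.append({"shipment_id": key, "weight_kg": weight,
                    "status": r["status"].strip().lower()})
    return out

def to_gold(rows):
    """Gold layer: business-level aggregate (total weight per status)."""
    totals = {}
    for r in rows:
        totals[r["status"]] = totals.get(r["status"], 0.0) + r["weight_kg"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'delivered': 120.5}
```

The point of the layering is the impact analysis the posting asks for: because each layer only reads from the one before it, changing a silver-layer rule (say, the dedup key) has a traceable blast radius in the gold aggregates downstream.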

Job Description

Trinity Industries is at the forefront of expanding data infrastructure and analytics capabilities. In the role of Senior Data Architect, you will be instrumental in architecting, implementing, and optimizing our data processing and analytics platform based on Databricks for our customer facing Trinity Logistics Platform (TLP). This role requires a collaborative mindset to work with cross-functional teams, understanding business requirements, and ensuring the seamless integration of Databricks within our technology stack.

Responsibilities

  • Develop and oversee a comprehensive data architecture, aligning with business goals and integrating technologies such as Azure, Databricks, and Palantir to craft a forward-looking data management and analytics landscape
  • Lead the design of enterprise-grade data platforms addressing needs across Data Engineering, Data Science, and Data Analysis, capitalizing on the capabilities of Azure Databricks
  • Architect, develop, and document scalable data architecture patterns, ETL frameworks, and governance policies, adhering to Databricks’ best practices to support future and unknown use cases with minimal rework
  • Define cloud data standards and DevOps and Continuous Integration / Continuous Delivery (CI/CD) processes, and participate in the proliferation of the corporate metadata repository
  • Offer hands-on technical guidance and leadership across teams, driving the development of KPIs for effective platform cost management and the creation of repeatable data patterns for data integrity and governance
  • Direct the strategic implementation of Databricks-based solutions, aligning them with business objectives and data governance standards while optimizing performance and efficiency
  • Promote a culture of teamwork by leading evaluations of design, code, data assets, and security features, and by working with key external data providers such as Databricks and Microsoft to follow best practices. Create and deliver training materials, such as data flow diagrams, conceptual diagrams, UML diagrams, and ER diagrams, to explain data model meaning and usage clearly to a diverse audience of technical and non-technical users
  • Communicate complex technical concepts effectively to both technical and non-technical stakeholders, ensuring clear understanding and alignment across the organization
  • Implement robust audit and monitoring solutions, design effective security controls, and collaborate closely with operations teams to ensure data platform stability and reliability

Requirements

  • Bachelor’s or master’s degree in Computer Science, Information Technology, or a related field
  • 8+ years of experience in technical roles with expertise in Software/Data Engineering, Development Tools, Data Applications Engineering—must have Data Architect experience
  • Must have Azure Databricks experience
  • Proficiency in SQL, Python, Scala, or Java. Experience with big data technologies (e.g., Spark, Hadoop, Kafka), MPP databases, and cloud infrastructure
  • Strong background in data modeling, ETL/ELT workloads, and enterprise data architecture on Azure Databricks
  • Experience with data governance tools, BI tools (Tableau, Power BI), version control systems, and CI/CD tools
  • Relevant certifications in Databricks, cloud technologies (AWS or Azure), or related fields are a plus
  • Seniority level: Mid-Senior level
  • Employment type: Contract
  • Job function: Engineering and Information Technology
  • Industries: Software Development
