Job Title: Big Data Architect



NextHealth Technologies is seeking a Big Data Architect to partner with the Data Science and Engineering teams on requirements analysis, platform selection, technical architecture design, application design and development, testing, and deployment of the proposed data pipeline solution.

Successful candidates will need to thrive in a dynamic, fast-paced environment. They must be innovative and passionate about finding creative solutions to challenges, and should enjoy collaborating with the team while also being self-starters.

Job Responsibilities

  • Perform data modeling, administration, configuration management, monitoring, debugging, and performance tuning
  • Handle structured, semi-structured, and unstructured datasets
  • Develop a practical roadmap for an enterprise-wide BI reporting and analytics platform
  • Define the overall BI data architecture, including ETL processes, data marts and data lakes
  • Help program and project managers with the design, planning, and governance of project implementation
  • Perform detailed analysis of business problems and technical environments, and use this analysis to design solutions
  • Benchmark systems, analyze system bottlenecks and propose solutions to eliminate them

Job Requirements

Skills and Abilities

The role of a Big Data Solutions Architect is a highly technical one, but the successful candidate should also have other skills that are important for designing the right architecture for the right need:

  • Ability to hire and build a strong data pipeline team
  • Experience with at least one of the major cloud-computing infrastructure solutions, such as Azure or Amazon Web Services
  • Experience with the major big data solutions such as Hadoop, Spark, MapReduce, Pig, Hive, HBase, MongoDB, Cassandra, or similar NoSQL data stores (exposure to a Hadoop distribution such as Hortonworks or Cloudera will be a big plus)
  • Thorough understanding of major programming/scripting languages such as C# and Java, as well as experience working with ETL tools such as Talend, Informatica, and/or Pentaho
  • Demonstrable experience in data collection, cleansing, and predictive modeling, including real-world experience deploying models to production
  • Strong grasp of data pipelines, the difference between batch and real-time data processing, and familiarity with the Lambda Architecture's batch, speed, and serving layers
  • Ability to clearly articulate pros and cons of various technologies and platforms
  • Ability to document use cases, solutions and recommendations
  • Adept at working creatively and analytically in a problem-solving environment
  • Ability to work in a fast-paced agile development environment, both as a self-driven individual and as part of a cross-functional team
  • Excellent written and verbal communication skills, including ability to explain work in plain language


  • Bachelor’s degree in Computer Science or related field AND eight (8) years of experience in the area(s) of data architecture or database management, OR any equivalent combination of education and experience
  • Denver-based; this is not a remote position