Hiring a Data Warehouse Engineer/Architect (Cloud Native & AI/ML Focus), with extensive travel across the MENA region. The role requires a start-up mentality: you will build the team and the technology from scratch.

Our client, a leading FMCG distribution company in the MENA region, is looking to hire a Data Warehouse Engineer/Architect (Cloud Native & AI/ML Focus) with 7+ years of experience. As a Data Warehouse Engineer/Architect, you will play a crucial role in designing, implementing, and maintaining data warehouse solutions. Your primary responsibility will be to create scalable and efficient data architectures that support complex data analysis and reporting needs. You will work closely with business stakeholders to ensure the data infrastructure aligns with AI/ML initiatives and business objectives.

Responsibilities:
- Data Architecture Design: Develop and implement data warehouse architectures, including data modeling, ETL processes, and data integration strategies. Design and optimize data pipelines to handle large volumes of structured and unstructured data in cloud-native environments.
- Cloud Integration: Deploy and manage data warehouse solutions on cloud platforms such as AWS, Google Cloud, or Azure. Utilize cloud-native tools and services (e.g., Amazon Redshift, Google BigQuery, Azure Synapse) to enhance data processing and storage.
- AI/ML Integration: Collaborate with data scientists to integrate data pipelines with machine learning models and analytics tools. Ensure data is available and optimized for AI/ML workflows, including feature engineering, training, and deployment.
- Performance Optimization: Monitor and tune data warehouse performance to ensure fast query responses and efficient data processing. Implement data partitioning, indexing, and other techniques to enhance performance and scalability (a minimal partitioning sketch follows this post).
- Data Governance & Security: Ensure data integrity, privacy, and security by implementing best practices and compliance standards. Develop and maintain documentation covering data architecture, processes, and policies.
- Collaboration & Communication: Work closely with cross-functional teams, including data engineers, data scientists, and business analysts, to understand data requirements and deliver solutions.

Technical Skills:
- Proficiency in cloud platforms (AWS, Google Cloud, Azure) and their data services.
- Experience with data warehousing solutions (e.g., Snowflake, Amazon Redshift, Google BigQuery, Azure Synapse).
- Strong knowledge of SQL, ETL processes, and data modeling techniques.
- Familiarity with AI/ML frameworks and tools (e.g., TensorFlow, PyTorch, scikit-learn) and their integration with data pipelines.
- Experience with data visualization tools (e.g., Tableau, Looker, Power BI) is a plus.

Please send your resume to jobs@jobstronaut.com. You can also apply here: https://lnkd.in/giAyVhCw

#Hiring #Jobs #MENAjobs #ITjobs #Datawarehousingjobs #AIjobs #MLjobs
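For a concrete picture of the partitioning work described under Performance Optimization above, here is a minimal PySpark sketch that lands events as date-partitioned Parquet so warehouse engines can prune partitions at query time. The bucket paths and column names (event_ts, event_date) are hypothetical assumptions, not details from the posting.

```python
# Minimal sketch: date-partitioned writes for warehouse-friendly layouts.
# All paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("partitioned-sales").getOrCreate()

# Raw events assumed to carry an event_ts timestamp column.
events = spark.read.json("s3://example-bucket/raw/sales/")

# Derive a low-cardinality date column to partition on.
events = events.withColumn("event_date", F.to_date("event_ts"))

# Partitioned Parquet lets engines such as Redshift Spectrum, BigQuery
# external tables, or Synapse serverless skip whole directories at query time.
(events.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://example-bucket/curated/sales/"))
```

Partitioning on a date column is usually the first lever; indexing, clustering, and sort-key choices are engine-specific and would come after that.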
Big Data Architect Job Opportunity

Rate: $130-140/hr, C2C/1099
Duration: 3-month contract-to-hire
100% remote; must be located in the U.S.
We cannot consider consulting companies representing candidates.
Please email your resume to al@eQuestSolutions.com

This is an outstanding opportunity for a Big Data Architect to design and build out a Big Data architecture from the ground up, with their preferred models, technologies, and tools, that will determine the success of our DaaS (Data as a Service) startup company.

We are a DaaS (Data as a Service) company utilizing AI concepts and data science to create optimization solutions, and we are looking for a Big Data Architect. We believe in a growth mindset, lead with inclusion, and value diversity of experience and perspectives to create more robust products and a more beautiful world. We are VC funded for a total of $24.5M to date and have been in business for 5+ years.

The Big Data Architect will be responsible for gaining a thorough understanding of our business model, revenue streams, and products. Based on this knowledge, the Big Data Architect will design, develop, and implement robust, scalable, and high-performance modern Big Data solutions based on their preferred technologies. This role requires a deep understanding of complex data sets, Big Data technologies, data modeling, data integration, and data warehousing.

Responsibilities:
- Architecture Design: Develop the overall Big Data architecture, including data ingestion, processing, storage, and retrieval for complex data sets.
- Data Modeling: Design and implement data models, schemas, workflows, mining, and metadata standards to support business requirements, ensure data quality and consistency, create insights, and generate new intellectual property.
- Technology Selection: Evaluate and select appropriate Big Data technologies and tools based on personal preferences, project needs, and performance requirements. Create a road map and short- and long-term plans for presentation to executive management.
- Data Integration: Design and implement data integration processes to consolidate data from several external and internal sources into a centralized repository (a minimal consolidation sketch follows this post).
- Performance Optimization: Identify and implement Big Data optimization strategies for performance, scalability, and cost-efficiency.
- Cloud Integration: Integrate Big Data solutions with cloud platforms (e.g., AWS, Azure, GCP) to leverage cloud-based services and infrastructure.
- Team Collaboration: Lead and mentor an elite team of data engineers, data scientists, and product managers to understand requirements, build out data pipelines, and deliver effective solutions.
- Documentation: Create and maintain detailed design documents, technical specifications, operational procedures, and SLAs.
- Stay Updated: Keep abreast of the latest advancements in emerging Big Data technologies and industry trends to maximize data insights for commercialization and development of new products.
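As a rough illustration of the Data Integration responsibility above (consolidating several external and internal sources into a centralized repository), here is a hedged PySpark sketch. The source locations, formats, and the common schema are assumptions for the example only.

```python
# Minimal sketch: consolidate two heterogeneous sources into one repository.
# Paths, formats, and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("consolidate-sources").getOrCreate()

# Internal source: CSV export from an operational database.
internal = (spark.read.option("header", True)
            .csv("s3://example-bucket/internal/orders/"))

# External source: JSON dump from a partner feed.
external = spark.read.json("s3://example-bucket/external/orders/")

# Normalize both sides to one schema and consistent types before combining.
common = ["order_id", "customer_id", "amount", "source"]
internal = (internal
            .withColumn("amount", F.col("amount").cast("double"))
            .withColumn("source", F.lit("internal"))
            .select(*common))
external = (external
            .withColumn("amount", F.col("amount").cast("double"))
            .withColumn("source", F.lit("external"))
            .select(*common))

# Append into the centralized repository, partitioned by source for lineage.
(internal.unionByName(external)
    .write.mode("append")
    .partitionBy("source")
    .parquet("s3://example-bucket/warehouse/orders/"))
```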
We are currently requesting resumes for the following position: Data Architect

Resume Due Date: Friday, June 14, 2024 (5:00 PM EST)
Number of Vacancies: 2
Job ID: 24-079
Level: MP6
Duration: 4 months
Hours of Work: 40
Location: 700 University Ave (100% remote)
Recruiter: Valerie Dziawa

*For the complete job description, please click the link below.*

Job Overview
- Data Architecture and Design: Lead the design and implementation of modular and scalable data pipelines (ELT/ETL) and data infrastructure. This includes creating curated common data models that provide a single source of truth for analytics and downstream systems.
- Data Security and Standards: Collaborate with security teams to ensure data security during transfer and storage. Define data modeling and database system field standards for optimal join performance and data extensibility.
- Data Pipeline Development: Create code templates for data pipelines and transformations for various data types (structured, semi-structured, unstructured).
- Data Modeling Expertise: Develop modeling guidelines for building facts, dimensions, and other data model components following industry best practices (a star-schema sketch follows this post).
- Data Transformation: Transform data into a more consumable semantic layer for business users, translating technical language into business-centric terms.
- Technology Adoption: Research and introduce new technologies into the environment through proofs of concept (POCs). Prepare POC code designs for development and production.
- Toolset: Proficiency in Microsoft data tools such as Azure Data Factory, Data Lake, SQL Databases, Data Warehouse, Synapse Analytics, Databricks, Purview, and Power BI.
- Data Pipeline Orchestration: Design and advise on data pipeline execution to meet customer latency expectations, manage dependencies, and ensure data freshness with minimal disruption.
- Data Governance: Ensure data security, access management, and data cataloging requirements are met.
- Mentorship: Guide data modelers, analysts, and scientists in building models for KPI delivery, operational system interaction, and enhanced machine learning predictability.

Qualifications:
- University degree in computer science, software engineering, or related data fields (data engineering, analysis, AI, machine learning).
- 6-8 years of experience in data modeling, data warehouse design, and data solution architecture in a Big Data environment.
- Experience with cloud-based data lake ingestion and data modeling projects.
- Experience with relational and in-memory data models (star/snowflake schemas).
- Experience designing and implementing event-driven, near-real-time, or streaming data solutions for varied data types across platforms.
- Strong knowledge of data model design for problem solving, data pipeline design patterns, data structure optimization, low-latency processing, and common data model creation.

To apply, please send your resume to careers@cpus.ca or through the following link: https://lnkd.in/efDvK69G
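To make the facts-and-dimensions guideline above concrete, here is a minimal PySpark sketch that derives a customer dimension with surrogate keys and a sales fact table from a single flat extract. The lake paths, table names, and columns are hypothetical; they are not taken from the posting.

```python
# Minimal star-schema sketch: one dimension plus a fact table.
# Lake paths, table names, and columns are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("star-schema").getOrCreate()

# Flat, denormalized sales extract (assumed columns).
sales = spark.read.parquet(
    "abfss://lake@example.dfs.core.windows.net/raw/sales/")

# dim_customer: one row per customer, keyed by a surrogate independent of the
# operational customer_id. Note monotonically_increasing_id() is not stable
# across runs; a production design would persist and look up surrogate keys.
dim_customer = (sales
    .select("customer_id", "customer_name", "customer_region")
    .dropDuplicates(["customer_id"])
    .withColumn("customer_sk", F.monotonically_increasing_id()))

# fact_sales: measures plus the foreign key into dim_customer.
fact_sales = (sales
    .join(dim_customer.select("customer_id", "customer_sk"), "customer_id")
    .select("order_id", "customer_sk", "order_date", "quantity", "amount"))

dim_customer.write.mode("overwrite").parquet(
    "abfss://lake@example.dfs.core.windows.net/curated/dim_customer/")
fact_sales.write.mode("overwrite").parquet(
    "abfss://lake@example.dfs.core.windows.net/curated/fact_sales/")
```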
Big Data Architect Job Opportunity

THIS IS NOT OPEN TO CONSULTING COMPANIES AND THEIR CONSULTANTS ON C2C.

Rate: $140-145/hr, C2C/1099
Duration: 3-month contract-to-hire
100% remote
Please email your resume to: al@eQuestSolutions.com

This is an outstanding opportunity for a Big Data Architect to design and build out a Big Data architecture from the ground up, with their preferred models, technologies, and tools, that will determine the success of our DaaS (Data as a Service) startup company.

We are a DaaS (Data as a Service) company utilizing AI concepts and data science to create optimization solutions, and we are looking for a Big Data Architect. We believe in a growth mindset, lead with inclusion, and value diversity of experience and perspectives to create more robust products and a more beautiful world. We are VC funded for a total of $24.5M to date and have been in business for 5+ years.

The Big Data Architect will be responsible for gaining a thorough understanding of our business model, revenue streams, and products. Based on this knowledge, the Big Data Architect will design, develop, and implement robust, scalable, and high-performance modern Big Data solutions based on their preferred technologies. This role requires a deep understanding of complex data sets, Big Data technologies, data modeling, data integration, and data warehousing.

Responsibilities:
- Architecture Design: Develop the overall Big Data architecture, including data ingestion, processing, storage, and retrieval for complex data sets (a streaming-ingestion sketch follows this post).
- Data Modeling: Design and implement data models, schemas, workflows, mining, and metadata standards to support business requirements, ensure data quality and consistency, create insights, and generate new intellectual property.
- Technology Selection: Evaluate and select appropriate Big Data technologies and tools based on personal preferences, project needs, and performance requirements. Create a road map and short- and long-term plans for presentation to executive management.
- Data Integration: Design and implement data integration processes to consolidate data from several external and internal sources into a centralized repository.
- Performance Optimization: Identify and implement Big Data optimization strategies for performance, scalability, and cost-efficiency.
- Cloud Integration: Integrate Big Data solutions with cloud platforms (e.g., AWS, Azure, GCP) to leverage cloud-based services and infrastructure.

Requirements:
- 5+ years of experience in Big Data architecture and design.
- Strong proficiency in Big Data technologies (e.g., Apache Hadoop, Spark, Kafka, Airflow, Hive, Elasticsearch, Databricks, KNIME, Presto, NiFi, NoSQL).
- Expertise in data modeling, data warehousing concepts, and ETL processes and pipelines.
- Experience with cloud platforms (e.g., AWS, Azure, GCP) is preferred.
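Since the posting lists Spark and Kafka among its core technologies, here is a hedged Structured Streaming sketch of an ingestion layer: read a Kafka topic, parse the JSON payload, and land it as Parquet. The broker address, topic name, payload schema, and paths are all assumptions, and the job needs the spark-sql-kafka connector on the classpath.

```python
# Minimal sketch: Kafka -> Parquet ingestion with Spark Structured Streaming.
# Broker, topic, schema, and paths are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

# Assumed JSON payload schema for the topic.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("value", DoubleType()),
])

raw = (spark.readStream
    .format("kafka")  # requires the spark-sql-kafka connector package
    .option("kafka.bootstrap.servers", "broker.example.com:9092")
    .option("subscribe", "events")
    .option("startingOffsets", "latest")
    .load())

# Kafka delivers raw bytes; decode the value and parse the JSON payload.
events = (raw
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*"))

# Land to object storage; the checkpoint enables recovery on restart.
query = (events.writeStream
    .format("parquet")
    .option("path", "s3://example-bucket/landing/events/")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/events/")
    .start())

query.awaitTermination()
```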