We're #hiring a new AWS Data Engineer in Chennai, Tamil Nadu. Apply today or share this post with your network.
TVS Next’s Post
-
Solution Architect @Microsoft, Microsoft Intern FY'22, Judge/Speaker/Mentor, 4 ⭐ CodeChef, Pupil at Codeforces, SIH Winner, SOH Winner, Postman Leader CKA, CKAD, CKS, LCFT Mentor at Coding Ninjas, Kubestronaut Program
Open Opportunities
📣 Calling all Data Engineers! 📣 We're excited to announce that we're hiring for Data Engineers to join our dynamic team in Bangalore, Hyderabad and Noida. As a Data Engineer, you'll play a vital role in designing, developing, and maintaining our data infrastructure while optimizing data flow and collection for cross-functional teams. If you're passionate about data and have a strong background in data engineering, we want to hear from you! Our ideal candidate will have experience with Azure Data Factory, Azure Databricks, and Data Lakes. Apply now through the link below and take the next step in your career as a Data Engineer! https://lnkd.in/gfdkfwXT #DataEngineer #BangaloreJobs #HyderabadJobs #noidajobs #azuredatafactory #azuredatabricks #datalakes Veena Akella Vishnudas Prabhu Kunjibettu Satyajit Sur Jyothi Gopalan Rohit Kumar Mamta Sharma Jaideep Avasarala Ramya H R Sireesh Govindan
-
We are hiring! AWS Data Engineer | 3+ years | Mumbai. Interested candidates, share your CV at aishna.hr@gmail.com or Aishna@serendipityservices.in. Mandatory skills: AWS services, Python, Apache Spark, Redshift, S3, Athena.
Responsibilities:
• Create and manage cloud resources in AWS
• Ingest data from sources that expose it through different technologies (RDBMS, REST HTTP APIs, flat files, streams, and time-series data from various proprietary systems); implement ingestion and processing with big-data technologies
• Process and transform data with technologies such as Spark and cloud services; understand your part of the business logic and implement it in the language supported by the base data platform
• Develop automated data quality checks to ensure the right data enters the platform and to verify calculation results
• Develop infrastructure to collect, transform, combine, and publish/distribute customer data
• Identify process-improvement opportunities to optimize data collection, insights, and displays
• Ensure data and results are accessible, scalable, efficient, accurate, complete, and flexible
• Identify and interpret trends and patterns in complex data sets
• Build a framework using data-visualization tools and techniques to present consolidated, actionable analytical results to the relevant stakeholders
• Participate in regular Scrum ceremonies with the agile teams
• Develop queries, write reports, and present findings
• Mentor junior members and bring in industry best practices
#awsdataengineer #dataengineer #mumbai #apachespark #awsservices
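The "automated data quality checks" responsibility above can be sketched in a few lines. This is a hypothetical illustration, not code from the role: the function name and rules are invented, and a real pipeline would apply equivalent checks in Spark before loading data into Redshift.

```python
# Minimal sketch of an automated data quality gate: split incoming rows
# into valid records and rejects, with a reason attached to each reject.
# Names (check_quality, required_fields, not_null) are illustrative only.

def check_quality(rows, required_fields, not_null=()):
    """Partition rows into (valid, rejects); rejects carry failure reasons."""
    valid, rejects = [], []
    for row in rows:
        missing = [f for f in required_fields if f not in row]      # field absent
        nulls = [f for f in not_null if row.get(f) is None]         # field present but null
        if missing or nulls:
            rejects.append({"row": row, "missing": missing, "nulls": nulls})
        else:
            valid.append(row)
    return valid, rejects

rows = [
    {"id": 1, "amount": 9.5},
    {"id": 2, "amount": None},   # fails the not-null rule
    {"amount": 3.0},             # missing the "id" field
]
valid, rejects = check_quality(rows, required_fields=("id", "amount"),
                               not_null=("amount",))
print(len(valid), len(rejects))  # → 1 2
```

Keeping rejects (rather than silently dropping them) is what lets the platform "verify the results of the calculations": rejected rows can be routed to a quarantine table for review.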
-
Dear connections, I'm Sai Babu, Talent Acquisition Specialist. I'm reaching out as you might be interested in the opportunity below; please review the job description and share your updated resume to the email address given to take it forward.
Job Title: Azure Data Factory Support Engineer
Experience Needed: 3-5 years
Location: Hyderabad
Notice Period: Immediate joiners
Job Description:
· Experience designing and hands-on developing cloud-based analytics solutions
· 1-2 years of experience with Azure cloud services
· Experience with Azure resources such as Azure Data Factory, Azure Synapse Analytics, Azure Blob Storage, Azure Data Lake Storage Gen1 and Gen2 (ADLS), Logic Apps, Key Vault, Azure SQL DB, and Synapse
· Hands-on experience with Azure Data Factory and its core concepts: linked services, datasets, data flows, pipelines, activities and triggers, integration runtime, and self-hosted integration runtime
· Designed and developed data ingestion pipelines from on-premises sources into the different layers in Azure; built Data Flow transformations
· Worked with Copy Data, Get Metadata, Lookup, Filter, Stored Procedure, ForEach, If Condition, and Execute Pipeline activities
· Implemented dynamic pipelines that extract multiple files into multiple targets with a single pipeline
· Strong knowledge of parameterizing linked services, datasets, pipelines, and activities
· Pipeline execution methods (debug runs vs. triggers)
· Experience scheduling pipelines and monitoring them through the Monitor tab
· Experience creating alerts at both the pipeline and activity level
· Azure stack (including Compute, Function Apps, Blobs, resource groups, Azure SQL, Cloud Services, and ARM), with a focus on high availability, fault tolerance, and auto-scaling
· Implemented custom activities in Python and C#/.NET for tasks not natively supported in Azure Data Factory V2
If interested, please share your updated resume to sai.a@s3staff.com #immediatehiring #azuredataengineer #mncjobs #azuredatafactory #datalake
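The "dynamic pipeline" item in the JD (many files into many targets via one parameterized pipeline) is a common ADF pattern: a ForEach activity loops over a file list and invokes a parameterized Copy Data activity per item. The sketch below imitates that control flow in plain Python purely for illustration; `copy_activity` and the container names are hypothetical stand-ins, not the ADF API.

```python
# Illustration of ADF's dynamic-pipeline pattern: one parameterized copy
# routine driven by a ForEach-style loop, instead of a hard-coded pipeline
# per file. copy_activity stands in for ADF's Copy Data activity.

def copy_activity(source_path, sink_path, log):
    """Stand-in for a Copy Data activity: record one source→sink transfer."""
    log.append((source_path, sink_path))

def run_dynamic_pipeline(file_names, source_container, sink_container):
    """One pipeline run: iterate the file list (ForEach) and invoke the
    parameterized copy for each item, like @item() in an ADF expression."""
    log = []
    for name in file_names:
        copy_activity(f"{source_container}/{name}",
                      f"{sink_container}/{name}", log)
    return log

runs = run_dynamic_pipeline(["sales.csv", "orders.csv"], "landing", "curated")
print(runs)
```

In real ADF the file list would itself come from a Get Metadata or Lookup activity, and the dataset paths would be pipeline parameters rather than Python f-strings.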
-
DevOps Engineer ♾️ | Linux 🐧 | AWS ☁️ | Docker 🐳 | Kubernetes ☸️ | CI/CD 🚀 | Terraform 🏗️ | Ansible ⚙️ | Jenkins 🧑🔧 | Shell Scripting 💠 | Grafana ⛄ | Git & GitHub 🐙 | Prometheus ♨️
Latest Opportunity🔥🔥🔥🔥🔔🔔🔔🚀 #devopsjob #devops #devopscommunity #trainwithshubham #90daysofdevops #90daysofdevopschallenge #awsdevops #azure #azurecloud #hiring #azure #devopsengineer #awscommunity
🌟 Exciting opportunity alert! 🌟 SPEC INDIA is #hiring 👨💻 a Sr. Azure Data Engineer. #Exp: 4+ years #Location: Ahmedabad #ApplyNow ⬇ 📧 soham.barot@spec-india.com #hiringalert #azure #onsite #applynow
-
Dear #Techies, HIRING! We are hiring for an MNC client. Let's connect: share your resume to the email ID below, or comment for better reach. Saibabu Alla 📧: sai.a@s3staff.com. Kindly share this post with your techie friends and help me connect with more people.
Job Title: Azure Data Factory Support Engineer
Experience Needed: 3-5 years
Location: Hyderabad
Notice Period: Immediate joiners
Note: the first 8 months could involve night shifts.
Description (data movement) — roles and responsibilities: the same Azure Data Factory Support Engineer job description as in the earlier post, plus the following.
Customer support:
· Strong customer-engagement skills to fully understand customer needs for analytics solutions
· Experience leading small and large teams delivering analytics solutions for customers
· Demonstrated ability to define, develop, and implement data models and the supporting policies, standards, and guidelines
· Strong problem-solving and troubleshooting skills with Azure cloud services and resources such as ADF, Azure Synapse, Azure Blob Storage, or ADLS; Azure SQL DB is a mandatory skill
#Immediatehiring #azuredatafactory #mncjobs #microsoft
-
Databricks India: we're scaling like never before! Software Engineering + Presales + Post Sales + Data Engineering + Partner Sales + Security, and many more open positions at #Databricks India! #databricksindia
Exciting news! Databricks India is expanding and looking to fill numerous positions across various verticals. This is a great opportunity for job seekers to join a dynamic, growing company and contribute to its success.
1. Field Engineering
- Partner Solutions Architect (Big Data) - Bengaluru: https://lnkd.in/gBhu7HNV (*first PSA in India)
2. Security Engineering (Remote)
- Sr Security Engineer (Incident Response): https://lnkd.in/g22MJUCV
3. Data Engineering (Remote)
- Architect (Big Data): https://lnkd.in/gwT2bkr6
- Sr Data Engineer: https://lnkd.in/gCwphAgY
4. Engineering (R&D) - Bangalore
- Sr Software Engineer: https://lnkd.in/gWCDQh2f
- Staff Software Engineer: https://lnkd.in/gptJrizu
EXCITING updates:
- https://lnkd.in/g-FQtWMJ
- https://lnkd.in/gUv7SBqp
- https://lnkd.in/gxbEaW33
- https://lnkd.in/dbTBEfDx
It's a fantastic time to get on board and be part of the exciting journey ahead. Apply away! Jeffry Issac | Akhila Aradhya (She/Her) | Debarshi Ghosh | Shivani Kanodia (She/Her/Hers)
#databricksindia #apachespark #partnersolutionsarchitect #solutionsarchitect #dataandai #dataengineer #dataarchitect #seniorsoftwareengineer #staffsoftwareengineer #presales #incidentresponse #securityengineer #bigdatamanager #indiahiring #customerengineering #dbrx
-
Dear connections, HIRING! We are hiring for an #AzureDataFactory support role for an MNC client. Let's connect: share your resume to the email ID below, or comment for better reach. Saibabu Alla 📩: sai.a@s3staff.com
Job Title: Azure Data Factory Support Engineer | Experience Needed: 3-5 years | Location: Hyderabad | Notice Period: Immediate joiners | Job Type: Full time | Note: the first 8 months could involve night shifts.
(The job description and customer-support requirements repeat the earlier Azure Data Factory Support Engineer posting verbatim.)
#microsoftazure #azuredatafactory #databricks #adfsupport
-
Dear connections, HIRING! We are hiring for an #AzureDataFactory support role for an MNC client. Let's connect: share your resume to the email ID below, or comment for better reach. Mukesh Sunchu 📩: mukesh.s@s3staff.com
Job Title: Azure Data Factory Support Engineer | Experience Needed: 3-5 years | Location: Hyderabad | Notice Period: Immediate joiners | Job Type: Full time | Note: the first 8 months could involve night shifts.
(The job description and customer-support requirements repeat the earlier Azure Data Factory Support Engineer posting verbatim.)
#microsoftazure #azuredatafactory #databricks #adfsupport