We're #hiring a new Cloud Database Lead in Bengaluru, Karnataka. Apply today or share this post with your network.
LanceSoft India's Post
More Relevant Posts
-
ACC HR - Assistant Manager Talent Acquisition, Hiring Specialist - Technical & Non-Technical Hiring/SAP Technical Hiring/Hiring for Multi-Cloud, DevOps & Digitalization/Lateral Hiring/Team Building/Bulk Hiring
Hi Connections, greetings! We are looking for an "AWS Solutions Architect" for an MNC client at the Mumbai location. Please find the detailed job description below.
Location: Andheri, Mumbai
Experience: 3 to 6 years
Work Mode: General shift, work from office only
Cloud Architecture Design: Design and implement scalable, secure, and resilient cloud infrastructure solutions that align with business requirements. Provide expertise in selecting appropriate cloud services, technologies, and architectural patterns. Design and implement a secure, scalable landing zone, and design and implement innovative cloud solutions spanning architecture, data lakes, ML, migration, etc.
Infrastructure as Code (IaC): Develop and maintain Infrastructure as Code scripts using tools like Terraform or CloudFormation for automated provisioning and configuration. Build full-stack automated quality into IaC, deployment, and monitoring, and promote best practices for building and operating highly reliable systems across our clusters/functions.
Networking: Design and implement virtual networks, subnets, and routing to facilitate secure and efficient communication between cloud resources. Implement load balancing and content delivery strategies to optimize network performance.
Storage and Data Management: Architect storage solutions, including object storage, block storage, and file storage, to meet performance and scalability requirements. Implement data backup and recovery strategies.
Security and Compliance: Define and enforce security best practices for cloud environments. Collaborate with the security team to ensure compliance with industry standards and regulatory requirements.
Identity and Access Management (IAM): Design IAM policies and roles to control access to cloud resources. Implement multi-factor authentication and other security measures.
Automation, Orchestration, Collaboration, and Communication: Analyze new and existing public cloud services, including networking, provisioning, managed services, etc. Extensive hands-on knowledge of AWS and various IaaS, PaaS, and other cloud services such as IoT/Event Hub, streaming ETL, UI-based as well as code-based ETL tools, storage, data lake/warehouse, ML, visualization, FinOps, DevOps, MLOps, governance, IaC, etc. Develop and implement a cloud governance framework, including best engineering practices, cloud policies, standards, quality, procedures, and guidelines for the management and use of the cloud. Troubleshoot cloud-related issues and provide technical support to developers and users.
If interested, contact me at Shubhangi.sarode@acc.ltd or 8459146002 immediately. #hiringimmediately #awssolutionsarchitect #awscloudengineer
Thanks & Regards,
Shubhangi Sarode
ACC HR - Sr. Talent Acquisition
Shubhangi.sarode@acc.ltd | 8459146002
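An editorial aside on the IaC responsibility above: the posting names Terraform or CloudFormation, and since the examples in this piece are in Python, the sketch below uses the AWS CDK for Python, which synthesizes CloudFormation under the hood. This is a minimal illustration, not the client's actual setup; the stack, bucket, and role names are all hypothetical.

```python
from aws_cdk import App, Stack
from aws_cdk import aws_iam as iam
from aws_cdk import aws_s3 as s3
from constructs import Construct


class DataPlatformStack(Stack):
    """Hypothetical stack: a versioned, encrypted landing-zone bucket plus a read-only role."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Landing-zone bucket with versioning, SSE-S3 encryption, and no public access.
        bucket = s3.Bucket(
            self,
            "LandingZoneBucket",
            versioned=True,
            encryption=s3.BucketEncryption.S3_MANAGED,
            block_public_access=s3.BlockPublicAccess.BLOCK_ALL,
        )

        # Role assumable by Lambda, granted read-only access to the bucket.
        ingest_role = iam.Role(
            self, "IngestRole", assumed_by=iam.ServicePrincipal("lambda.amazonaws.com")
        )
        bucket.grant_read(ingest_role)


app = App()
DataPlatformStack(app, "DataPlatformStack")
app.synth()  # emits a CloudFormation template, deployable with `cdk deploy`
```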
-
Top Data Analytics Jobs in Bangalore
Explore the top data analytics job opportunities in Bangalore! Dive into roles at leading tech companies like Indium Software, Amazon, and more. Whether you're an experienced data analyst or just starting out, discover how you can leverage your skills in a thriving tech hub. Check out these exciting openings and advance your career today! Read more: https://zurl.co/6fp9 #DataAnalyticsJobs #BangaloreJobs #DataDriven #CareerGrowth #TechJobs #AnalyticsCareers #AINews #AnalyticsInsight #AnalyticsInsightMagazine
-
Dear #Techies, HIRING! We are hiring for an MNC client. Let's connect and share your resume at the email ID below; comment for better reach. Saibabu Alla 📧: sai.a@s3staff.com. Kindly share this post with your techie friends and help me connect with more people.
Job Title: Azure Data Factory Support Engineer
Experience Needed: 3 to 5 years
Location: Hyderabad
Notice Period: Immediate joiners
* The first 8 months may involve night shifts.
* Data movement.
Roles and Responsibilities:
* Experience in designing and hands-on development of cloud-based analytics solutions.
* 1-2 years of experience with Azure cloud services.
* Experience with Azure resources such as Azure Data Factory, Azure Synapse Analytics, Azure Blob, ADLS (Azure Data Lake Storage Gen1 and Gen2), Logic Apps, Key Vault, Azure SQL DB, and Synapse.
* Hands-on experience with Azure Data Factory and its core concepts: Linked Services, Datasets, Data Flows, Pipelines, Activities and Triggers, Integration Runtime, and Self-Hosted Integration Runtime.
* Designed and developed data ingestion pipelines from on-premises sources into different layers in Azure; built Data Flow transformations.
* Worked with Copy Data, Get Metadata, Lookup, Filter, Stored Procedure, ForEach, If Condition, and Execute Pipeline activities.
* Implemented dynamic pipelines that extract multiple files into multiple targets with a single pipeline.
* Strong knowledge of parameterization of Linked Services, Datasets, Pipelines, and Activities, and of pipeline execution methods (Debug vs. Triggers).
* Experience scheduling pipelines and monitoring them through the Monitor tab; experience creating alerts at the pipeline and activity levels.
* Azure stack (including Compute, Function App, Blobs, Resource Groups, Azure SQL, Cloud Services, and ARM), focusing on high availability, fault tolerance, and auto-scaling.
* Performed custom activities using Python and C#/.NET for tasks unsupported in Azure Data Factory V2.
* Customer support: strong customer engagement skills to fully understand customer needs for analytics solutions; experience leading small and large teams in delivering analytics solutions; demonstrated ability to define, develop, and implement data models and supporting policies, standards, and guidelines; strong problem-solving and troubleshooting skills.
* Mandatory skills: Azure cloud services with Azure resources such as ADF, Azure Synapse, Azure Blob, or ADLS; Azure SQL DB.
#Immediatehiring #azuredatafactory #mncjobs #microsoft
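A side note on the "Debug vs. Triggers" and Monitor-tab points above: pipeline runs can also be triggered and polled programmatically with the azure-mgmt-datafactory Python SDK, which is handy in a support role. This is a minimal sketch; the subscription, resource group, factory, pipeline, and parameter names are all hypothetical.

```python
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# All names below are hypothetical placeholders.
SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"
RESOURCE_GROUP = "rg-analytics"
FACTORY_NAME = "adf-demo"
PIPELINE_NAME = "copy_orders_pipeline"

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Kick off a run, passing pipeline parameters (cf. the parameterization bullet above).
run = adf_client.pipelines.create_run(
    RESOURCE_GROUP,
    FACTORY_NAME,
    PIPELINE_NAME,
    parameters={"fileName": "orders.csv"},
)

# Poll until the run finishes; the programmatic equivalent of the Monitor tab.
while True:
    status = adf_client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id).status
    if status not in ("Queued", "InProgress"):
        break
    time.sleep(15)
print(f"Pipeline finished with status: {status}")
```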
-
Solution Architect @Microsoft, Microsoft Intern FY'22, Judge/Speaker/Mentor, 4⭐ CodeChef, Pupil at Codeforces, SIH Winner, SOH Winner, Postman Leader, CKA, CKAD, CKS, LCFT, IEEE Leadership Summit, Kubestronaut Program
Open Opportunities
📣 Calling all Data Engineers! 📣 We're excited to announce that we're hiring for Data Engineers to join our dynamic team in Bangalore, Hyderabad and Noida. As a Data Engineer, you'll play a vital role in designing, developing, and maintaining our data infrastructure while optimizing data flow and collection for cross-functional teams. If you're passionate about data and have a strong background in data engineering, we want to hear from you! Our ideal candidate will have experience with Azure Data Factory, Azure Databricks, and Data Lakes. Apply now through the link below and take the next step in your career as a Data Engineer! https://lnkd.in/gfdkfwXT #DataEngineer #BangaloreJobs #HyderabadJobs #noidajobs #azuredatafactory #azuredatabricks #datalakes Veena Akella Vishnudas Prabhu Kunjibettu Satyajit Sur Jyothi Gopalan Rohit Kumar Mamta Sharma Jaideep Avasarala Ramya H R Sireesh Govindan
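Given that the ideal candidate here works with Azure Data Factory, Azure Databricks, and Data Lakes, a minimal PySpark sketch of reading a Delta table from ADLS Gen2 is shown below. The storage account, container, and path are hypothetical, and the snippet assumes a Databricks (or Delta-enabled Spark) runtime where ADLS credentials are already configured.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("bronze-read-demo").getOrCreate()

# Hypothetical ADLS Gen2 path; on Databricks, authentication is typically
# configured at the workspace or cluster level (e.g., via a service principal).
path = "abfss://bronze@examplelake.dfs.core.windows.net/orders"

# Delta is the usual table format on Databricks-backed data lakes.
df = spark.read.format("delta").load(path)

# Simple sanity aggregation over the ingested data.
df.groupBy("order_date").agg(F.count("*").alias("rows")).show()
```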
-
If you are searching for new opportunities in Data Analytics, Data Engineering, or any other role, here's something you should know 👨‍💻
➡️ The current location mentioned in your resume, Naukri profile, or any other portal directly influences the number of calls you get. For instance, when I was looking for a job change I hardly received any calls from Bangalore, Hyderabad, etc.; almost 98% of the calls were from Delhi NCR.
➡️ So, if your hometown does not have many opportunities and you are willing to relocate to a specific city, you can list that city as your current location in your resume. This can improve your chances of receiving interview calls, and you can clarify the situation with the recruiter while discussing the opportunity. #dataanalytics #dataengineering #sql #businessintelligence
-
#Hiring #hiring #hiring #hiring
#Principal / #Senior_Data_Engineer
#Experience: 10 to 15 years
#Location: Remote
#Job_Description: We are seeking an experienced Principal/Senior Data Engineer with a strong background in architecting and building enterprise-scale data platforms. The ideal candidate will be responsible for ideating, designing, and developing a state-of-the-art enterprise data platform. You will lead the development of connector frameworks to source data from both on-premise and cloud systems, ensuring efficient data storage, processing, and advanced analytics capabilities. This role requires expertise in Google Cloud Platform (GCP) and a deep understanding of data governance and observability practices.
#Key_Responsibilities:
#Ideation & #Architecture: Lead the ideation, architecture, and design of a new enterprise data platform, ensuring it meets the highest standards of scalability, reliability, and performance.
#Data_Integration: Design and develop connector frameworks and modern connectors to source data from a variety of systems, both on-premise and in the cloud.
#Data_Storage & #Processing: Optimize data storage, processing, and querying to ensure efficient data management and retrieval.
#Advanced_Analytics & #Machine_Learning: Develop and integrate advanced analytics and machine learning capabilities into the data platform.
#Observability & #Data_Governance: Design and implement observability frameworks and data governance practices to ensure data integrity, quality, and transparency.
#Deployment & #Release_Management: Drive the deployment and release cycles, ensuring a robust, scalable, and efficient data platform.
#Required_Skills & #Experience:
#Enterprise_Data_Platforms: Proven experience in architecting and building enterprise-scale data platforms, particularly in greenfield environments.
#Google_Cloud_Platform (GCP): Expertise in building end-to-end data platforms and data services using GCP technologies, including BigQuery, Cloud Functions, Cloud Run, Dataform, Dataflow, Dataproc, SQL, Python, Airflow, and Pub/Sub.
#Metadata_Management: Strong experience in architecting and implementing metadata management solutions, including data catalogs, data lineage, data quality, and data observability for big data workflows.
#Observability_Tools: Proficiency in observability tooling such as Grafana and Datadog, ensuring the data platform is monitored and maintained effectively.
#Data_Mesh & #IoT_Architecture: Experience in Data Mesh architecture, building semantic layers for data platforms, and designing scalable IoT architectures.
Interested candidates can share their resume at shubhita.srivastava@orangeskill.com or on WhatsApp at 9580929238.
#DataEngineer #SeniorDataEngineer #PrincipalDataEngineer #EnterpriseData #DataArchitecture #BigData #GCP #GoogleCloud #DataPlatform #MachineLearning #AdvancedAnalytics #DataGovernance #DataObservability #CloudComputing #DataMesh #IoT #MetadataManagement
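Grounding the BigQuery expertise called out above: a minimal sketch of querying BigQuery from Python with the google-cloud-bigquery client library follows. The project, dataset, and table names are hypothetical, and the client assumes Application Default Credentials are already configured.

```python
from google.cloud import bigquery

# Uses Application Default Credentials (e.g., `gcloud auth application-default login`).
client = bigquery.Client()

# Hypothetical telemetry table; swap in a real project.dataset.table.
query = """
    SELECT device_id, COUNT(*) AS events
    FROM `my-project.telemetry.events`
    WHERE ingest_date = CURRENT_DATE()
    GROUP BY device_id
    ORDER BY events DESC
    LIMIT 10
"""

# client.query() submits the job; .result() blocks until it completes.
for row in client.query(query).result():
    print(row.device_id, row.events)
```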
-
Assistant Manager - Talent Acquisition | Hiring | USA | UAE | Remote | @Lucidspire Private Limited, Bangalore
🚀 Lucidspire Hiring Alert: Thrilling Opportunity with Our Esteemed Client! 🚀
🔍 Position: BigQuery Consultant
📍 Location: Mumbai
🕒 Experience: 5 to 7 years
🚀 Notice Period: Immediate joiners to 30 days
About the Client: Our client is a technology company specializing in cloud consulting and digital transformation services. They are known for their expertise in Google Cloud Platform (GCP) and offer a range of services, including cloud migration, application modernization, data analytics, AI/ML solutions, and managed cloud services. They aim to help businesses leverage cloud technologies to drive innovation, improve operational efficiency, and achieve their strategic goals.
Job Description:
General Duties:
🛠️ Design and maintain data solutions on Google platforms.
📅 Participate in team meetings and sprints.
🏗️ Work on data architecture using Google data tools.
👥 Client-facing role requiring strong communication.
Technical Requirements:
📊 5+ years of experience, with 2+ years in GCP architecture and data warehousing.
🎯 Expertise in Google Cloud, Cloud Composer, Airflow, Dataflow, and Java.
⚙️ Experience with Azure Databricks, Data Factory, Python, and Apache Spark.
☁️ Hands-on experience with GCP data lakes, warehouses, and migrations.
🔄 Familiarity with CI/CD, Delta Lake, and GCP security.
🐍 Strong Python and SQL (MySQL, SQL Server).
🧩 Experience with ETL, complex SQL, and data models.
Other Requirements:
🗣️ Strong communication and problem-solving; works well independently and in teams.
👥 Client-facing experience required.
Good to Have:
🤖 Knowledge of Vertex AI, machine learning, Agile, and large projects.
💼 Please share your resume with reshma.k@lucidspire.com if you're interested.
#Design #DataSolutions #GoogleCloud #TeamMeetings #DataArchitecture #ClientFacing #GCP #CloudComposer #Airflow #Dataflow #Java #Python #PySpark #GoogleCloud #AzureDatabricks #AzureDataFactory #DataBricks #DataFactory #DataPipeline #DataSolutions #MLOps #DatabricksNotebooks #DataLakes #DataSecurity #DataModels #ApacheSpark #DataLake #DataWarehouse #DataMigration #CICD #DeltaLake #SecurityBestPractices #ETL #SQL #ProblemSolving #IndependentWork #TeamWork #VertexAI #MachineLearning #Agile #LargeProjects #Mumbai #ImmediateNotice #Architecture #ArchitecturalDesign #BigQuery #DataFlow #RecommendationSystems #TeamCollaboration
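Since this role centers on Cloud Composer (managed Airflow) with BigQuery, here is a minimal sketch of a daily rollup DAG, assuming Airflow 2.4+ with the Google provider installed. The DAG id, project, and table names are hypothetical.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_sales_rollup",  # hypothetical DAG name
    schedule="@daily",            # `schedule_interval` on Airflow < 2.4
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # One BigQuery job per day that rebuilds a hypothetical mart table.
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_sales",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE `my-project.mart.daily_sales` AS
                    SELECT order_date, SUM(amount) AS total
                    FROM `my-project.raw.orders`
                    GROUP BY order_date
                """,
                "useLegacySql": False,
            }
        },
    )
```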
-
Dear #Techies, we are hiring #ADF engineers for an MNC client. Let's connect and share your resume at the email ID below; comment for better reach. G. Divya Keerthana 📧: keerthana.g@s3staff.com. Kindly share this post with your techie friends and help me connect with more people.
Job Title: ADF (Azure Data Factory)
Experience: 3 to 5 years
Location: Hyderabad
Notice Period: Immediate to 15 days
Job Description:
* Experience in designing and hands-on development of cloud-based analytics solutions.
* 1-2 years of experience with Azure cloud services.
* Experience with Azure resources such as Azure Data Factory, Azure Synapse Analytics, Azure Blob, ADLS (Azure Data Lake Storage Gen1 and Gen2), Logic Apps, Key Vault, Azure SQL DB, and Synapse.
* Hands-on experience with Azure Data Factory and its core concepts: Linked Services, Datasets, Data Flows, Pipelines, Activities and Triggers, Integration Runtime, and Self-Hosted Integration Runtime.
* Designed and developed data ingestion pipelines from on-premises sources into different layers in Azure; built Data Flow transformations.
* Worked with Copy Data, Get Metadata, Lookup, Filter, Stored Procedure, ForEach, If Condition, and Execute Pipeline activities.
* Implemented dynamic pipelines that extract multiple files into multiple targets with a single pipeline (see the sketch after this list).
* Strong knowledge of parameterization of Linked Services, Datasets, Pipelines, and Activities, and of pipeline execution methods (Debug vs. Triggers).
* Experience scheduling pipelines and monitoring them through the Monitor tab; experience creating alerts at the pipeline and activity levels.
* Azure stack (including Compute, Function App, Blobs, Resource Groups, Azure SQL, Cloud Services, and ARM), focusing on high availability, fault tolerance, and auto-scaling.
* Performed custom activities using Python and C#/.NET for tasks unsupported in Azure Data Factory V2.
* Customer support: strong customer engagement skills to fully understand customer needs for analytics solutions; experience leading small and large teams in delivering analytics solutions; demonstrated ability to define, develop, and implement data models and supporting policies, standards, and guidelines; strong problem-solving and troubleshooting skills.
* Mandatory skills: Azure cloud services with Azure resources such as ADF, Azure Synapse, Azure Blob, or ADLS; Azure SQL DB.
#Azuredatafactory #azureblob #azureactivedirectory #synapse #immediatejoiners #mnchiring #permenentjobs #cfbr
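The "dynamic pipeline" bullet above (one parameterized pipeline extracting multiple files into multiple targets) describes a pattern rather than ADF-specific code. The plain-Python sketch below mimics it with local file copies; the routing table and paths are hypothetical stand-ins for what a Lookup or Get Metadata activity would feed into a ForEach activity.

```python
import shutil
from pathlib import Path

# Hypothetical (source, target) routing table; in ADF this would come from a
# Lookup/Get Metadata activity rather than being hard-coded.
ROUTES = [
    ("landing/orders.csv", "bronze/orders/orders.csv"),
    ("landing/customers.csv", "bronze/customers/customers.csv"),
]


def run_copy(source: str, target: str) -> None:
    """Stand-in for a single parameterized Copy Data activity execution."""
    destination = Path(target)
    destination.parent.mkdir(parents=True, exist_ok=True)
    shutil.copyfile(source, target)
    print(f"copied {source} -> {target}")


# A single "pipeline" iterating the routing table, mirroring one ADF pipeline
# whose ForEach activity invokes the same copy step with different
# source/sink parameters for each file.
for source, target in ROUTES:
    run_copy(source, target)
```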